This has come up, so I decided to sit down and go through the possibilities and how we got here. A lot of this is not 100% applicable to our customer base, but some snippets may be relevant. I am putting this out there so folks can see the possibilities beyond just ERP 10. I assume we are not the only app in your data center, whether it be on premises, in the cloud, or both; the industry is a mix and Epicor certainly is as well. A lot of this is off-the-shelf tools from MSFT, from Azure, and from Epicor. Much is custom built for purpose, not built for delivery and reuse, as I am not sure there is a broad need (as will be explained below). I am by no means the one doing all the work, or the only group doing the work in Epicor. It's an industry-wide happening and company-wide. Let's get into it…
First, the Epicor stuff is the most applicable. We've had Command Line Interface (CLI) APIs around since 10.1.500 (really since 10.0, but that's when we put in the effort to productize and document them), converting them from internal dev tools into product for you to use in your daily administration tasks. We keep adding new ones each release. Some of these are for dev, some for admin, and many of the new ones are geared towards migrations. These have been critical to the ongoing evolution to the cloud, and they let our data center folks concentrate on tasks beyond staring at screens and clicking when handling hundreds of customer applications.
These CLIs have been in the help doc since the beginning. They are under System Management in the help; search 'Command Line Tools Guide' and you can find it.
Most of these are written assuming the legacy cmd console in Windows and are written in the batch-file style. We will be introducing our first native PowerShell script in 10.2.300, and assume more will be coming. PowerShell (PS) is 'eating the world' when it comes to automation in the data center. If you have not learned it yet, I'll wait; go look: https://en.wikipedia.org/wiki/PowerShell.
PS is a superset of the classic CLI's abilities. It combines classic CMD capabilities with a .NET style of coding in its scripting language. The blue is a bit rough sometimes on my old green-screen mind, though.
For now, all the Epicor CLI features work in either 'shell'. As mentioned above, PS scripts will start appearing, so getting used to the 'blue shell' from the beginning is a good idea, though today everything works from either.
So how do you use these CLI tools?
The help doc does a good job of giving you the how-to and the details. Some of the CLI tools come with the Client deployment, some as part of the Admin Console, etc., so you do need to find where each is located.
The 'Batch File' style of command lines uses a consistent look and feel, so you don't have to do a ton of guessing about the details. Each tool has different abilities, so the available actions vary, as you will see in a bit.
All these CLIs use the following standard switches:

/Action=[WhatToDo]
/ConfigFile=[PathToConfig.xml]
/LogFile=[PathToLogFile]
/CreateNewLog=[Yes or No]
/LogLevel=[Info, Warning or Error]
Action is what to do. Update this, install that, etc.
ConfigFile is the how to do it. This varies massively from tool to tool. Think of it as the DataSet sent to a server method: you call SalesOrder.Update(), but the meat of the details is in the DataSet. Action and ConfigFile are that for the CLI. Since these vary so much from tool to tool, there is a help of sorts for it, GenerateConfigTemplate. You can generate a 'blank' template file for any CLI by using
[MyAdminTool].exe /GenerateConfigTemplate = "C:\temp\Config.xml"
LogFile is straightforward. Where do you want the details?
CreateNewLog is the same: overwrite the old, append to the existing, or create a new one by appending a datetime to the file name. If you have used ERP 10 tracing, you'll be used to this.
LogLevel simply controls the amount of detail: Info, Warning, or Error. As with the rest of ERP 10, the 'Flight Data Recorder' logs all fatal errors to the Windows Event Log.
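Putting the standard switches together, a typical invocation looks something like this; the tool name, action, and paths here are hypothetical placeholders for whatever CLI you are running:

```powershell
# Hypothetical tool name, action, and paths; substitute your actual CLI tool.
# Works the same from cmd or the 'blue shell' (in cmd, drop the backticks
# and put it all on one line).
& .\MyAdminTool.exe /Action=Update `
    /ConfigFile="C:\temp\Config.xml" `
    /LogFile="C:\temp\MyAdminTool.log" `
    /CreateNewLog=Yes `
    /LogLevel=Info
```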
The intent behind the CLI personality was the IT person scripting and automating different tasks. The logging options are standardized up front, while the task details are buried in the ConfigFile. This came from working with folks automating internal tasks, and it minimizes the rather verbose batch files that get created when a ton of detail lives on the command line instead of in standardized files. I'll get into that later…
What can you do?
Quite a variety of tasks. I'll point out a feature for regenerating the Data Model first. Administrators do this in their maintenance window all the time: go to the Admin Console, regen the model, wait 10-15 minutes, recycle the app pool to refresh the cache of all those new User Defined columns and tables, and voilà, you see the new fields in BAQ / BPM / Configurator, etc. Developers do this a TON when creating new services, and one got tired of the UI and wrote this. It was quickly productized for other devs and then for customers.
I use this example because it's a common scenario bridging an Epicor CLI and an OS-level activity: in this case, restarting the IIS app pool.
This is a common theme running throughout: do some Epicor stuff alongside lots of OS (or Azure) activities. That's what I hope everyone notices, and then wonders what other activities beyond Epicor they can automate. Hopefully you will blend Epicor and other actions to give you back some time in your day.
As an example, IIS recycling has a couple of options (probably more, but these are the two I use the most):
- Appcmd.exe is a 'legacy' IIS admin CLI:
  appcmd recycle apppool /apppool.name:"MyAppPool"
- Restart-WebAppPool is a PS 'cmdlet' in the WebAdministration module:
  Import-Module WebAdministration
  Restart-WebAppPool (Get-Website -Name <YourSiteName>).applicationPool
Depending on which tooling approach you use, you can create a little script to regen and recycle, then kick it off at midnight (or whenever) via Windows Task Scheduler or a similar timer tool.
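To make that concrete, here is a minimal PowerShell sketch of the regen-and-recycle cycle. The tool path, config file, and site name are assumptions for illustration; adjust them for your installation:

```powershell
# Assumed locations; adjust for your install.
$tool   = 'C:\Epicor\ERP10\Server\Bin\Ice.Tool.DataModelGenerator.CL.exe'
$config = 'C:\temp\DataModelRegen.xml'   # built from /GenerateConfigTemplate

# Regenerate the data model via the Epicor CLI
& $tool /ConfigFile="$config" /LogFile='C:\temp\Regen.log' /CreateNewLog=Yes /LogLevel=Info

# Recycle the app pool only if the regen reported success
if ($LASTEXITCODE -eq 0) {
    Import-Module WebAdministration
    Restart-WebAppPool (Get-Website -Name 'ERP10').applicationPool
}
```

Save it as a .ps1 and point Windows Task Scheduler at it during your maintenance window, and the cycle runs unattended.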
How to use it?
The help walks you through this. Generate an empty configuration with something like:
Ice.Tool.DataModelGenerator.CL.exe /GenerateConfigTemplate = "Example.xml"
When you open it, you will see the template with a ton of help comments for filling in the blanks. The help for each setting describes the options and their use.
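To give a feel for the shape only (the element names below are invented for illustration; the real schema comes from the generated template), a config file looks roughly like:

```xml
<!-- Illustrative sketch only: these element names are made up.
     Always start from the file produced by /GenerateConfigTemplate. -->
<ConfigSettings>
  <!-- Each real setting ships with a help comment like this one,
       describing the allowed values and what they control. -->
  <ServerInstallation>C:\Epicor\ERP10\Server</ServerInstallation>
  <DatabaseName>ERP10Live</DatabaseName>
</ConfigSettings>
```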
The help covers what all the values mean, the possible values of interest, and examples of how to use them.
Review the various CLI tools and see what you find of use. For installs, Database Server and Setup Environment are very useful for managing the DB and App Server. Task Agent is also useful.
For development, the Solution Workbench tooling is useful for scripting the deployment of 'change control': adding a new BPM, BAQ, UD fields, etc., as a solution and transitioning it between dev, test, and live environments.
What about all the REST Work?
Oh, it's still there and used. Using curl from a command line or Invoke-RestMethod in PS is extremely powerful. Basically, all of ERP 10 is CLI-drivable now just by calling one of the REST endpoints with the relevant data. I'll refer folks to the REST Overview and docs for those details.
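As a quick sketch (the server name, instance path, and service endpoint below are placeholders; see the REST Overview for what your install actually exposes):

```powershell
# Placeholders throughout; substitute your server, instance, and credentials.
$baseUrl = 'https://myserver/ERP10/api/v1'   # assumed ERP 10 REST base URL
$cred    = Get-Credential                    # prompts for an ERP 10 user

# Query a business-object service; the service name here is an example.
$orders = Invoke-RestMethod -Uri "$baseUrl/Erp.BO.SalesOrderSvc/SalesOrders" `
    -Credential $cred -Method Get
$orders.value | Select-Object -First 5
```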
Epicor Uses all this?
Oh yeah. I don't think we talk very much about life in the Dev or Ops areas of ERP 10. For example, we use a mix of TFS and VSTS/Git for our version control. When a developer checks in code changes, a build system kicks off and rebuilds everything relevant to the change and all its dependencies. So if service A relies upon service B and B changes, both are rebuilt. At the end of our builds, we run unit tests the developers create to ensure nothing 'breaks the build', and email / alert folks when that happens.
Every night, we bundle up a 'good' build and deploy an 'install' around the world and into our QA labs. Manual testing of the just-changed code occurs, and we also kick off automated test suites. I think we are in the mid-twenties for the number of servers just for automated testing. Add to that QA test boxes for the various teams, etc. Automating all those deployments is critical.
Next, we also have an Upgrade Services hosted offering (aka Cirrus). This gets patch releases all the time. There are always a lot of customers at various stages of a move from release x to release y, and we need to have the latest available for them to choose from, along with all kinds of version targets. So as patches are made available, they are automatically added to the selection once they pass their testing. The process of moving from one release to a new one is also automated using many of the CLI tools described above.
Third, we have a Partner Program where select partners are provided a 'sandbox': logically, a Dedicated Tenancy instance where they can test their integrations and verticalizations, on both the current release and vNext when it's made available in preview. This is all hosted in Azure, and we have built a ton of automation scripts around everything in Azure: standing up new VMs, automating E10 app server and companion product installs, databases, DNS names, virtual networks, CDN deployments; the list goes on. This has driven a ton of automation into the Ops of the product.
(Not) Lastly, SaaS. SaaS Ops has been moving to Azure over the past months since the big partnership announcement with MSFT earlier this year. Managing the number of servers, especially during a migration, is a huge undertaking. Dev and SaaS Ops have been working together a ton to feed improvements into the pipeline and ease the admin burden. Most of the CLI tools you have seen appearing over the last few years are driven by them. The parts that are relevant to others are what we productize by putting docs in place, such as the guide above.
Where can I get all the scripts?
This is the sticky issue we weigh daily. The majority of these scripts are 99% geared to our own domain. Standing up 40 databases on a box to see if it can handle the load? I'm not sure any customers need or want that. Sure, seeing 40 'test customers' on a new system in thirty minutes is cool, and MSFT has demoed that kind of 'DevOps' in every keynote for the past couple of years. It's just not of enough value to the majority of the customer base for us to put more effort into it. It's a huge struggle to balance when I put on my Product Owner hat.
Instead, we are looking at what we build and giving out nuggets of value that can be woven together to fit your environment. Most of our scripts consist of some mix of:
- Some Epicor CLI driving
- Some Azure environment driving
- Some parallelization of execution
The parallelization is an interesting geek-out topic for me, but again, I'm not sure of its applicability in the real world. I made a comment about migrating 40 DBs in parallel. It was a script I had in front of me that we were playing with: how many can you run in parallel? Could a certain platform handle 2? 30? Without affecting performance in the other DBs? What if we bumped the CPU or memory available for a bit? How much would that cost us in Azure?
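For the curious, the parallel piece is typically a Start-Job / Wait-Job pattern in PowerShell; the migration work below is a stand-in for whatever CLI you are actually driving:

```powershell
# Kick off one background job per database; the work inside the
# script block is a stand-in for a real migration CLI call.
$databases = 1..40 | ForEach-Object { "ErpDb$_" }   # hypothetical DB names

$jobs = foreach ($db in $databases) {
    Start-Job -ScriptBlock {
        param($dbName)
        # Replace with the actual migration command for $dbName
        "migrated $dbName"
    } -ArgumentList $db
}

# Block until everything finishes, gather output, clean up
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job
```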
FYI: Financial Development (FinDev) is the new dev thing. Just because you have 'infinite' hardware to rent from MSFT or Amazon or whomever doesn't mean you can afford it! Striking the balance of power versus value is critical!
All of this is driven by automation and a ton of PowerShell. Keep an eye on the Command Line Tools section of the help, and assume more PS scripts will start trickling out in the examples.