Automation Possibilities in E10

This has come up, so I decided to sit down and go through the possibilities and how we got here. A lot of this is not 100% applicable to our customer base, but some snippets may be relevant. I am putting this out there so folks can see the possibilities beyond just ERP 10. I assume we are not the only app in your data center, whether it be on premises, in the cloud, or both. The industry is a mix of both and Epicor certainly is as well. A lot of this uses off-the-shelf tools from MSFT, from Azure, and from Epicor. Much is custom built for purpose, not built for delivery and reuse, since I am not sure there is an appropriate need, as will be explained below. I am by no means the one doing all the work or the only group doing the work in Epicor. It’s happening industry-wide and company-wide. Let’s get into it…

First, the Epicor stuff is the most applicable. We’ve had Command Line Interface (CLI) APIs around since 10.1.500, really since 10.0, but that is when we put in the effort to productize and document them, converting them from internal dev tools into product for you to use in your daily administration tasks. We keep adding new ones each release. Some of these are for dev, some for admin, and many of the new ones are geared towards migrations. These have been critical to the ongoing evolution to the cloud and enable our data center folks to concentrate on tasks beyond staring at screens and clicking when handling hundreds of customer applications.

These CLIs have been in the help doc since the beginning. They are under System Management in the help – search ‘Command Line Tools Guide’ and you will find it.

Most of these are written assuming the legacy cmd console in Windows and are written in the batch-file style. We will be introducing our first native PowerShell script in 10.2.300, and assume more will be coming. PowerShell (PS) is ‘eating the world’ when it comes to automation in the data center. If you have not learned it yet, I’ll wait; go look: PowerShell - Wikipedia.

PS is a superset of CLI abilities. It combines classic CMD abilities with a .NET style of coding in its scripting language. The blue is a bit rough sometimes on my old greenscreen mind though :wink:
For now, all the Epicor CLI features work in either ‘shell’. As mentioned above, PS scripts will start appearing, so getting used to the ‘blue shell’ from the beginning is a good idea, though everything today works from either.
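For instance, the same Epicor CLI call runs unchanged from either shell. A quick sketch in PowerShell (the tool name and path are made up for illustration):

# In cmd.exe you would type:  Ice.Tool.SomeAdminTool.CL.exe /Help
# In PowerShell the & call operator runs the same exe with the same switches:
& "C:\Epicor\Deployment\Ice.Tool.SomeAdminTool.CL.exe" /Help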

So how do you use these CLI tools?

The help doc does a good job giving you the how-to and details. Some of the CLI tools come in the Client deployment, some as part of the Admin Console, etc., so you do need to find where each is located.

The ‘Batch File’ style of command line uses a consistent look and feel, so you don’t have to do a ton of guessing about the details. Each tool will have different abilities, and that will vary the actions available, as you will see in a bit.

All these CLIs use the following standard:

[MyAdminTool].exe
/Action=[ActionIdentifier]
/ConfigFile="[ConfigurationFilePathAndName]"
/LogFile="[LogFilePathAndName]"
/CreateNewLog=[Yes or No]
/LogLevel=[LogLevelOption]

And of course

[MyAdminTool].exe /Help

Action is what to do. Update this, install that, etc.

ConfigFile is the how to do it. This will vary massively from tool to tool. Think of it as the DataSet sent to a server method: you call SalesOrder.Update(), but the meat of the details is in the DataSet. Action and ConfigFile play those roles for the CLI. Since these vary so much from tool to tool, there is a help of sorts for it – GenerateConfigTemplate. You can generate a ‘blank’ template file for any CLI by using

[MyAdminTool].exe /GenerateConfigTemplate="C:\temp\Config.xml"

LogFile is straightforward. Where do you want the details?

CreateNewLog is the same. Overwrite the old, append to the existing or create a new one by appending a datetime to the file name. If you have used the ERP 10 tracing, you’ll be used to this.

LogLevel simply controls the amount of detail: Info, Warning, or Error. As with the rest of ERP 10, the ‘Flight Data Recorder’ logs all fatal errors to the Windows Event Log.
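Putting those standard switches together, a typical call ends up looking something like this (the tool name, action, and paths here are hypothetical; each tool’s help lists the real action identifiers):

MyAdminTool.exe /Action=Update /ConfigFile="C:\Automation\Update.xml" /LogFile="C:\Automation\Logs\Update.log" /CreateNewLog=Yes /LogLevel=Info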

The intent for the CLI personality was to think of the IT person doing scripting and automation of different tasks. The command-line surface is mostly logging switches, with the real details buried in the ConfigFile. This came from working with folks automating internal tasks, and it minimizes the rather verbose batch files that get created when a ton of details sit in the command line instead of in standardized files. I’ll get into that later…

What can you do?

Quite a variety of tasks. I’ll point out a feature for regenerating the Data Model first. Administrators do this in their maintenance window all the time: go to the Admin Console, regen the model, 10-15 minutes later recycle the app pool to refresh the cache of all those new User Defined columns and tables, and voila, you see the new fields in BAQ / BPM / Configurator, etc. Developers do this a TON when creating new services, and one got tired of the UI and wrote this. It was quickly productized for other devs and then for customers.

I use this example as it’s a common scenario bridging an Epicor CLI and an OS-level activity, in this case restarting the IIS App Pool.

This is a common theme running throughout this post: doing some Epicor stuff plus lots of OS (or Azure) activities. That’s what I hope everyone notices, and then starts wondering how they can automate other activities beyond Epicor. Hopefully you will blend Epicor and other actions to give yourself back some time in your day.

As an example, IIS recycling has a couple of options (probably more, but I use these two the most):

  • Appcmd.exe is an IIS admin tool and is a ‘legacy’ CLI
    • appcmd recycle apppool /apppool.name:"MyAppPool"
  • Restart-WebAppPool is a PS ‘cmdlet’ in the WebAdministration module
    • Import-Module WebAdministration
    • Restart-WebAppPool (Get-Website -Name "<YourSiteName>").applicationPool

Depending on which tooling approach you use, you can create a little script to regen and recycle, then kick it off at midnight (or whenever) via Windows Task Scheduler or a similar timer tool; a rough sketch follows.
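A minimal PowerShell sketch of that nightly regen-and-recycle idea is below. The install paths, app pool name, and config contents are placeholders; the real switches and config values come from the tool’s help and its generated template:

# Regenerate the data model via the Epicor CLI (config file generated earlier and filled in)
& "D:\Epicor\ERP10\Tools\Ice.Tool.DataModelGenerator.CL.exe" /ConfigFile="D:\Scripts\RegenDataModel.xml" /LogFile="D:\Scripts\Logs\RegenDataModel.log" /CreateNewLog=Yes /LogLevel=Info

# Recycle the app pool so the server picks up the new UD columns and tables
Import-Module WebAdministration
Restart-WebAppPool -Name "ERP10AppPool"

Save that as RegenAndRecycle.ps1 and point a nightly Windows Task Scheduler task at it, for example:

schtasks /Create /SC DAILY /ST 00:30 /TN "E10 DataModel Regen" /TR "powershell.exe -ExecutionPolicy Bypass -File D:\Scripts\RegenAndRecycle.ps1"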

How to use it?

The help walks you through this. Generate an empty configuration with something like:

Ice.Tool.DataModelGenerator.CL.exe /GenerateConfigTemplate="Example.xml"

When you open it, you will see the template with a ton of help comments to fill in the blanks. The help for each value describes the options and their use.

The Help covers what all the values mean, the possible values of interest, and examples of how to use it.

Review the various CLI tools and see what you find of use. For installs, the Database Server and Setup Environment tools are very useful for managing the DB and App Server. Task Agent is also useful.

For development, the Solution Workbench tooling is useful for scripting the deployment of ‘change control’: adding a new BPM, BAQ, UD fields, etc., as a solution and transitioning it between dev, test, and live environments.

What about all the REST Work?

Oh, it’s still there and used. Using curl from a command line or Invoke-RestMethod in PS is extremely powerful. Basically, all of ERP 10 can be driven from the command line now just by calling one of the REST endpoints with the relevant data; a small sketch is below, and I’ll refer folks to the REST Overview and docs for the details.
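For example, a minimal Invoke-RestMethod sketch that pulls BAQ results over REST; the server, instance, BAQ name, and credentials are placeholders, and the REST Overview has the exact routes for your version:

# Epicor REST uses Basic auth; prompt for an Epicor user with rights to the BAQ
$cred = Get-Credential
$uri  = "https://myserver/ERP10/api/v1/BaqSvc/MyOpenOrders"
$result = Invoke-RestMethod -Uri $uri -Method Get -Credential $cred
$result.value | Format-Table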

Epicor Uses all this?

Oh yea. I don’t think we talk very much about life in the Dev or Ops areas of ERP 10. For example, we use a mix of TFS and VSTS/Git for our version control. When a developer checks in code changes, a build system kicks off and rebuilds everything relevant to the change and all its dependencies. So if service A relies upon service B and B changes, both are rebuilt. At the end of our builds, we run unit tests developers create to ensure nothing ‘breaks the build’, and we email / alert folks when that happens.

Every night, we bundle up a ‘good’ build and deploy an ‘install’ around the world and into our QA labs. Manual testing of the code just changed occurs, and we also kick off automated suites of tests. I think we are in the mid-twenties for the number of servers just for automated testing. Add to that QA test boxes for the various teams, etc. Automating all those deployments is critical.

Next, we also have an Upgrade Services hosted offering (aka Cirrus). This gets patch releases all the time. There are always a lot of customers at various stages of a move from release x to release y. We need to have the latest available for them to choose from, along with all kinds of version targets. So as patches are made available, they are automatically added to the selection once they pass their testing. The process of moving from one release to a new one is also automated using many of the CLI tools described above.

Third, we have a Partner Program where select partners are provided a ‘sandbox’, logically a Dedicated Tenancy instance where they can test their integrations and verticalizations, on both the current release and vNext when it’s made available in preview. This is all hosted in Azure, and we have built a ton of automation scripts around everything in Azure: standing up new VMs, automating E10 app server and companion product installs, databases, DNS names, virtual networks, CDN deployments, the list goes on. This has driven a ton of automation into the Ops side of the product.
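To give a flavor of the Azure side, a rough sketch of standing up a VM with the Az PowerShell module (the resource group, VM name, image, and region are all made-up examples):

# Sign in and create a resource group to hold the sandbox
Connect-AzAccount
New-AzResourceGroup -Name "E10-Sandbox-RG" -Location "eastus"

# Stand up a Windows VM that an E10 app server install could then be scripted onto
$cred = Get-Credential   # local admin credentials for the new VM
New-AzVM -ResourceGroupName "E10-Sandbox-RG" -Name "E10App01" -Location "eastus" -Image "Win2016Datacenter" -Credential $cred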

(Not) Lastly, SaaS. SaaS Ops has been moving to Azure over the past months since the big partnership announcement with MSFT earlier this year. Managing the number of servers, especially during a migration, is a huge undertaking. Dev and SaaS Ops have been working together a ton to feed improvements into the pipeline and ease the admin burden. Most of the CLI tools you have seen appearing over the last few years are driven by them. The parts that are relevant to others are what we are productizing by putting docs in place, such as above.

Where can I get all the scripts?

This is the sticky issue we weigh daily. The majority of these scripts are 99% geared to our own domain. Standing up 40 databases on a box to see if it can handle the load? Not sure any customers need or want that. Sure, it’s cool to see 40 ‘test customers’ on a new system in thirty minutes, and MSFT has demoed that kind of ‘DevOps’ in every keynote for the past couple of years. It’s just not of enough value to the majority of the customer base for us to put more effort into it. It’s a huge struggle to balance when I put on my Product Owner hat.

Instead, we are looking at what we build and giving out nuggets of value that can be woven together to fit your environment. Most of our scripts consist of some mix of:

  • Some Epicor CLI driving
  • Some Azure environment driving
  • Some parallelization of execution

The parallelization is an interesting geek-out topic for me, but again I’m not sure of its applicability in the real world. I made a comment about migrating 40 DBs in parallel. It was a script I had in front of me that we were playing with – how many can you run in parallel? Could a certain platform handle 2? 30? Without affecting performance in the other DBs? What if we bumped the CPU or memory available for a bit? How much is that costing us in Azure?
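For the curious, the fan-out itself is not much code. A rough sketch with PS background jobs (the migration CLI name and config folder are made up; the real tool and switches would come from the help):

# One config file per database to migrate
$configs = Get-ChildItem "D:\Migrations\Configs\*.xml"

# Kick off one background job per database and let them run in parallel
$jobs = foreach ($cfg in $configs) {
    Start-Job -ArgumentList $cfg.FullName -ScriptBlock {
        param($configPath)
        & "D:\Epicor\Tools\Ice.Tool.SomeMigration.CL.exe" /ConfigFile="$configPath" /LogLevel=Info
    }
}

# Wait for all of them and collect the output; this is where you watch CPU, memory, and cost
$jobs | Wait-Job | Receive-Job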

FYI – Financial Development (FinDev) is the new dev thing. Just because you have ‘infinite’ hardware to rent from MSFT or Amazon or whomever, doesn’t mean you can afford it! Striking the balance of power for value is critical!

All of this is driven by automation and a ton of PowerShell. Keep an eye on the Command Line Tools section of the help, and expect more PS scripts to start trickling out as examples.


As always Bart, thank you very much for your instructive and candid insights - all in the name of helping the community. We appreciate You!


this is amazing, we just had discussions the other day in my org about the need to do a full deployment for devops surrounding our salesforce team, where we would do a static restore of our Production Epicor to a QA environment for final testing in a known environment. So aside from scripting the actual DB backup and restore, there is all that activity we want to automate like regen data model, restart app servers… maybe even redeploy dashboard assemblies and regen method sigs? how far could that go?

maybe even redeploy dashboard assemblies and regen method sigs? how far could that go?

Quite far actually. Dig through the help and ask questions about what’s missing.


thank you, I will be doing that. Expect to hear back, this is all new to me but exciting and immediately applicable to our growing devops IQ/OQ/PQ efforts.

It’s been an under-advertised area and most of my life for the past year, so this maaaay be a little self-serving :wink:

I honestly see folks doing things by hand all the time, internally and externally, and had to vent to paper last night and copy-and-paste this morning. Too many errors doing things manually.

NOTE - PLEASE test your scripts carefully. Automation also quickly messes things up at scale if you make mistakes :slight_smile:


Bart.

Thank you for putting this together. Just last night, I was looking to automate some of these things you mentioned in your post.

Just one question:

Can these applet/cmdlet scripts be run from a remote client computer on the same domain network?
A lot of times I have to remote into the AppServer (Windows Server 2012 R2) itself and do these things (AppPool recycle, Regen Data model, TaskAgent Stop/Start etc.) via the EAC. It would be good just to be able to do it from a client computer (running Windows 10)…possible?

PowerShell has that ability to remote execute. I’ll look at stealing a snippet to show that since you are interested (may be a day or two)
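In the meantime, the basic shape is a one-liner with Invoke-Command (the server and app pool names are placeholders, and the app server needs PS remoting enabled, e.g. via Enable-PSRemoting):

# Run the recycle on the remote app server from your Windows 10 client
Invoke-Command -ComputerName "MyAppServer" -ScriptBlock {
    Import-Module WebAdministration
    Restart-WebAppPool -Name "ERP10AppPool"
}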


Awesome! TIA. BTW, this would be for latest version 10.2.200.11 (not sure if that matters).

And @aidacra made a great suggestion on learning PowerShell:

Available as a book

https://www.amazon.com/Learn-Windows-PowerShell-Month-Lunches/dp/1617294160

and on YouTube.


perfect! i need this, thanks for killing my next month lol


@Mark_Wonsil what is this “lunch” break that you speak of? :laughing:


@markdamen - ‘lunch’ is where you type with one hand and consume life-giving proteins with the other. :wink:


Well, here we call it “Potty” training since that’s the only time and place we have to learn things… :thinking:


i never make it past my google news feed…smh


With that in mind… Here is the PowerShell script that I have started but haven’t fully developed to use for our Patch Updates on our App Servers. I haven’t spent a lot of time on it so definitely call out no-no’s if you see anything.

function RunSyncScripts
{
    try {
        Write-Host "$(get-date) Starting" -ForegroundColor Green
        ManageRemoteAppPool "epicordevserver" "Caleb102200" "stop" "D:\Epicor\ERP10\TaskAgentConfigurations\Caleb.xml"
        Start-Sleep -s 10
        ManageRemoteAppPool "epicordevserver" "Caleb102200" "start" "D:\Epicor\ERP10\TaskAgentConfigurations\Caleb.xml"
        Start-Sleep -s 10
        ManageRemoteAppPool "epicordevserver" "Caleb102200" "recycle"

        Write-Host "$(get-date) Complete" -ForegroundColor Green

    } 
    catch {
        Write-Host $_.Exception.Message -ForegroundColor Red
    }

}

function ManageRemoteAppPool
{
param([string]$machine, [string]$AppPoolName, [string]$action, [string[]] $TaskAgentsConfigs)

Invoke-command -ComputerName $machine -ScriptBlock{  param([string]$AppPoolName, [string]$action, [string[]] $TaskAgentsConfigs) 
    # Ensure to import the WebAdministration module 
    Import-Module WebAdministration 

    $path = "IIS:\AppPools\" + $AppPoolName

    try
    {
        if(Test-Path $path )
        {
            $state = Get-WebAppPoolState -Name $AppPoolName
            Write-Host $state.Value
            if($action -eq "stop" -and $state.Value -ne "Stopped")
            {
                Stop-WebAppPool -Name $AppPoolName
                foreach($agent in $TaskAgentsConfigs)
                {
                    # Stop each Task Agent via its CLI; the switch style below assumes the /Action= /ConfigFile= /LogFile= convention described above
                    cd "C:\Program Files (x86)\Epicor Software\Epicor Task Agent Service 3.2.200.0\"
                    Start-Process -FilePath ".\TaskAgentServiceConfiguration.exe" -Wait -ArgumentList "/Action=stopagent", "/ConfigFile=`"$agent`"", "/LogFile=`"C:\cmd\logs\StopTaskAgentLog.txt`""
                }
            }
            if($action -eq  "start" -and $state.Value -eq "Stopped")
            {
                Start-WebAppPool -Name $AppPoolName
                foreach($agent in $TaskAgentsConfigs)
                {
                    # Start each Task Agent back up via its CLI (same assumed switch convention)
                    cd "C:\Program Files (x86)\Epicor Software\Epicor Task Agent Service 3.2.200.0\"
                    Start-Process -FilePath ".\TaskAgentServiceConfiguration.exe" -Wait -ArgumentList "/Action=startagent", "/ConfigFile=`"$agent`"", "/LogFile=`"C:\cmd\logs\StartTaskAgentLog.txt`""
                }

            }
            if($action -eq "recycle" -and $state.Value -ne "Stopped")
            {
                Write-Host "Recycling $AppPoolName on $env:computername"
                Restart-WebAppPool -Name $AppPoolName
            }

        }
        else
        {
            Write-Host "AppPool is not there"
        }
    }
    catch {
        # Surface remote errors in red so they stand out
        Write-Host $_.Exception.Message -ForegroundColor Red
    }


 } -ArgumentList $AppPoolName , $action, $TaskAgentsConfigs

}

RunSyncScripts

@Bart_Elia you can change PS colors.

Yea but if I make it black and green I forget what shell I am in :blush:


I’m going to reiterate Mike’s comments here: as users, we really cannot thank Bart and all the other Epicor staff that contribute here enough.

THANKS*


Well I have an insanely good team and great coworkers so it’s easy to represent their work. I hope I can drag more of them into a more social spotlight so they can get the credit they deserve.
