Is there a better way to reset an environment?

We’ve decided that for testing before go-live, we’re wiping our pilot environment, then re-populating it with static data. Our intention is to do multiple iterations of this.

The time it takes to DMT static data into an environment (tens of thousands of parts, BOOs, BOMs) is severely slowing down testing. Sometimes it takes days to populate functions and custom menus.

Is there a better way to repopulate an environment? I’ve tried automating the process by creating a script for the DMT CLI tool, but it really doesn’t provide much more benefit in terms of speed.

Am I missing a trick here? Copying an environment's database from one environment to another takes 30 mins.

Could I do that partially, i.e. exclude transactions, sales orders, jobs, etc., but keep the bits I want (warehouse bins, parts, solutions, etc.)?

Appreciate any response, thank you.

Are you on-prem or cloud?
Can you back up the database after your initial data upload? Then when you need to refresh, it should be pretty quick since it's just a restore.


Thanks for your response,

Cloud for me.

It's a good point, and we can up to a point: functions, BAQs, and BPMs are fine to be "baked in" because they're easy to change at any point should an issue arise.

However, most of the testing we're doing at the moment is ensuring correctness of the parts, BOOs, BOMs, etc. So they're the bits liable to differ between iterations, and annoyingly the most time-consuming part to DMT into an environment.

Another thing I was considering is a script that spins up 10 DMTs at a time to run in parallel. But I'm not sure if that will bork something.

Yes, you can script and run DMTs in parallel just fine. Most of the time I find that just causes the RPM (rows per minute) to roughly halve. You may get better performance running DMT from separate clients, but I haven't tried that.
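A minimal sketch of that parallel approach, assuming the same -NoUI/-ConfigValue style of invocation as the script later in this thread. Paths, credentials, template name, and the chunked source files are placeholders; double-check the -Import/-Source/-Add/-Update switch names against your DMT version's CLI help.

```powershell
# Hypothetical sketch: kick off one DMT.exe per source file, then wait for all.
# Everything here is a placeholder -- adjust paths, creds and template names.
$DMTPath = "C:\Epicor\KineticPowerTools\PILOT\DMT.exe"
$Common  = "-ConfigValue basic.sysconfig -User xxx -Pass xxx -NoUI"

$procs = Get-ChildItem "T:\DMT\Chunks\Part_*.csv" | ForEach-Object {
    # No -Wait here, so the imports run concurrently; -PassThru returns the process
    Start-Process -FilePath $DMTPath -PassThru `
        -ArgumentList "$Common -Import `"Part`" -Source `"$($_.FullName)`" -Add -Update"
}
$procs | Wait-Process   # block until every DMT instance has finished
```

Splitting one big CSV into evenly sized chunks first keeps each instance's workload balanced.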

Eventually, when some bits are more or less in a 'done' state, you can DMT them into the Live environment and they will be there in Pilot after a DB refresh. Then you're just reloading the diff into Pilot on each test iteration.


Multiple DMTs from different clients are possible; mileage may vary. As @jbooker says, take the iterative approach: move confirmed config items into Live, restore Live to Pilot, and so on.
There is also the diff tool, which you can use to manage importing just the differences.

Open transactions are the last things you load before go-live. Cloud implementations definitely require more planning and prep to get that data-import cadence right. It's not as if you can do a backup and restore at will like you can on-prem.


There's an old post on here somewhere in which someone shared a set of 99+ BAQs, roughly one for each DMT template, for use when onboarding.

Basically, when you get a working subset in Pilot, you can run DMT -Export from Pilot and then DMT it into Live. I found it very helpful for scripting.
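That round trip can be sketched roughly like this. The DMT paths, the BAQ id, and the -Import/-Add/-Update switches are my assumptions, not from the post; only -Export, -BAQ, -Target, and -UseFieldNames appear in the script shared further down.

```powershell
# Hypothetical sketch: export a confirmed dataset from PILOT, load it into LIVE.
$Pilot = "C:\Epicor\KineticPowerTools\PILOT\DMT.exe"
$Live  = "C:\Epicor\KineticPowerTools\LIVE\DMT.exe"
$Creds = "-ConfigValue basic.sysconfig -User xxx -Pass xxx -NoUI"

# Export via a BAQ shaped like the Part DMT template (BAQ id is made up)
Start-Process -Wait -FilePath $Pilot `
    -ArgumentList "$Creds -Export -BAQ=`"DMT-10-Part`" -Target=`"T:\DMT\Part.csv`" -UseFieldNames"

# Import the same file into Live
Start-Process -Wait -FilePath $Live `
    -ArgumentList "$Creds -Import `"Part`" -Source `"T:\DMT\Part.csv`" -Add -Update"
```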

I've looked for the old post a couple of times since and can't find it.


I got 99 problems but a BAQ ain’t one


The SQL-to-BAQ converter supports SELECT * syntax,

So failing other efforts, you can methodically:

SELECT * FROM Erp.Part - etc.

Then run these in DMT with the -UseFieldNames parameter and they'll output in a format that DMT likes.
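For example, a single export of such a BAQ might look like the following. The BAQ id is whatever you named the converted SELECT * query; the path and credentials are placeholders; the switches mirror those in the script shared later in this thread.

```powershell
# Hypothetical sketch: export a SELECT*-converted BAQ with DMT-friendly headers.
& "C:\Epicor\KineticPowerTools\PILOT\DMT.exe" -ConfigValue basic.sysconfig `
    -User xxx -Pass xxx -NoUI `
    -Export -BAQ="Part-All" -Target="T:\DMT\Part.csv" -UseFieldNames
```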

The #optionaltables you'd have to get creative with.


Forgot that one, thanks for the reminder.

@jbooker Was it this one?


Yeah that’s the one. Thanks for finding it. I’ve wanted to say thanks to the OP for a while now. Done.

I extracted his cab files and imported the many BAQs. His excellent work saved me from designing BAQs to match every DMT template.

Then I created a "baq_BAQueries" BAQ to return the list of BAQs and made a script to run them all from that result. Most of my runs were manual from VSCode: with the DMT -NoUI flag, I could just right-click-run the script in the VSCode terminal without dealing with the DMT UI at all.

Here’s the BAQ
baq_BAQueries.baq (6.6 KB)

and here’s the script.


$ERPCompany = "XXXXXX"
$User = "XXXXXXXXXXX"
$Pass = "XXXXXXXXXXXX"

#$DMTEnv = "PILOT"
$DMTEnv = "LIVE"
#$ConfigValue = "C:\Epicor\ERPDT\" + $ERPCompany + "-" + $DMTEnv + "\Client\config\default.sysconfig"
$ConfigValue = "basic.sysconfig"
#$DMTPath = "C:\Epicor\ERPDT\" + $ERPCompany + "-" + $DMTEnv + "\Client\DMT.exe"
$DMTPath = "C:\Epicor\KineticPowerTools\" + $DMTEnv + "\DMT.exe"
$SourcePath = "T:\DMT\EpicorDMTScripts"                                     #uses script location if not supplied

$BAQ = "baq_BAQueries"
$TargetFile = $BAQ + "_" + $DMTEnv + ".csv"

#optionally export BAQ list
Start-Process -Wait -FilePath $DMTPath -ArgumentList "-ConfigValue $ConfigValue -User $User -Pass $Pass -NoUI -Export -BAQ='$BAQ' -Target=$SourcePath\$TargetFile"
Write-Host $SourcePath\$TargetFile

Import-Csv $SourcePath\$TargetFile | Where-Object {
    #optionally filter baq list here
    #$_.QueryID -GT 'DMT-00-Acc'
    $_.QueryID -Like 'DMT-59*'
} | Foreach-Object {
    $BAQ = $_.QueryID
    $TargetFile = $BAQ + "_" + $DMTEnv + ".csv"
    #$ArgumentList = "-User $User -Pass $Pass -NoUI -Export -BAQ='$BAQ' -Target=$SourcePath\BAQ\DMT\$DMTEnv\$TargetFile -UseFieldNames"
    $ArgumentList = "-ConfigValue $ConfigValue -User $User -Pass $Pass -NoUI -Export -BAQ='$BAQ' -Target=$SourcePath\BAQ\DMT\$DMTEnv\$TargetFile"

    Write-Host $SourcePath\BAQ\DMT\$DMTEnv\$TargetFile
    #Write-Host $DMTPath $ArgumentList
    #EXPORT BAQ DATA
    Start-Process -Wait -FilePath $DMTPath -ArgumentList "$ArgumentList -UseFieldNames"
}

What’s the Diff tool you speak of? That sounds like it may be useful

IIRC, it's in the downloads section on EpicWeb. Just shut down my PC for the day…