We’ve decided that for testing before go-live, we’re wiping our pilot environment, then re-populating it with static data. Our intention is to do multiple iterations of this.
Due to how long it takes to DMT static data into an environment (tens of thousands of parts, BOOs, BOMs), it’s severely slowing down testing; sometimes it takes days just to populate functions and custom menus.
Is there a better way to repopulate an environment? I’ve tried automating the process by scripting the DMT CLI tool, but it doesn’t provide much benefit in terms of speed.
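For context, here’s a minimal sketch of the kind of sequential DMT CLI script I mean, in PowerShell. The switch names (-NoUI, -User, -Pass, -Add, -Update, -Import, -Source) are from the classic DMT command-line reference, but exact flags and template names vary by DMT/Kinetic version, so check yours; the install path and credentials are placeholders.

```powershell
# Minimal sequential DMT CLI runner (a sketch; verify switch names
# against your DMT version's command-line reference).
$dmt  = "C:\Epicor\ERP11\ClientDeployment\Client\DMT.exe"  # placeholder path
$user = "manager"                                          # placeholder creds
$pass = "password"

# Each entry pairs a DMT template name with its source file (example data).
$loads = @(
    @{ Import = "Part";              Source = "C:\DMT\Part.csv" },
    @{ Import = "Bill of Materials"; Source = "C:\DMT\BOM.csv"  }
)

foreach ($load in $loads) {
    # Start-Process -Wait keeps the loads strictly sequential.
    Start-Process -FilePath $dmt -Wait -ArgumentList @(
        "-NoUI", "-Add", "-Update",
        "-User", $user, "-Pass", $pass,
        "-Import", "`"$($load.Import)`"",
        "-Source", "`"$($load.Source)`""
    )
}
```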
Am I missing a trick here? Copying an environment’s database from one environment to another takes 30 minutes.
Could I do that partially, copying everything except transactions, sales orders, jobs, etc., while keeping the bits I want (warehouse bins, parts, solutions, etc.)?
Are you on-prem or cloud?
Can you back up the database after your initial data upload? Then, when you need to refresh, it should be pretty quick since it’s just a restore.
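For the on-prem case, a minimal sketch of that “golden backup” cycle, assuming SQL Server and the SqlServer PowerShell module; the instance name, database name, and backup path are all placeholders:

```powershell
# "Golden backup" cycle for an on-prem SQL Server pilot database.
# Requires the SqlServer module: Install-Module SqlServer
Import-Module SqlServer

$instance = "EPICORSQL01"                          # placeholder instance
$database = "EpicorPilot"                          # placeholder DB name
$backup   = "D:\Backups\EpicorPilot_golden.bak"    # placeholder path

# 1) Once, after the initial static-data load: capture the baseline.
Backup-SqlDatabase -ServerInstance $instance -Database $database -BackupFile $backup

# 2) Each test iteration: drop active sessions, then restore the baseline.
Invoke-Sqlcmd -ServerInstance $instance -Query "ALTER DATABASE [$database] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;"
Restore-SqlDatabase -ServerInstance $instance -Database $database -BackupFile $backup -ReplaceDatabase
Invoke-Sqlcmd -ServerInstance $instance -Query "ALTER DATABASE [$database] SET MULTI_USER;"
```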
It’s a good point, and yes, we can up to a point. Functions, BAQs, and BPMs are fine to be “baked in” because they’re easy to change at any point should an issue arise.
However, most of the testing we’re doing at the moment is ensuring the correctness of the parts, BOOs, BOMs, etc., so they’re the bits liable to differ between iterations, and annoyingly the most time-consuming to DMT into an environment.
Another thing I was considering was a script which spins up 10 DMTs at a time to run in parallel, but I’m not sure if that would bork something.
Yes, you can script and run DMTs in parallel just fine. Most of the time I find that just causes the RPM (rows per minute) to roughly divide in two. You may get better performance running DMT from separate clients, but I haven’t tried that.
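A sketch of the throttled parallel approach, reusing the placeholder flags and paths from the sequential example above; the throttle value and the one-CSV-per-template naming convention are assumptions, and loads with dependencies (e.g. Part before BOM) still need to run in order.

```powershell
# Throttled parallel DMT runner (sketch; same placeholder flags and
# paths as the sequential example above).
$dmt     = "C:\Epicor\ERP11\ClientDeployment\Client\DMT.exe"
$maxJobs = 4                                     # tune against server load
$csvs    = Get-ChildItem "C:\DMT\Loads\*.csv"    # assumes one CSV per template,
                                                 # named after the template

$procs = @()
foreach ($csv in $csvs) {
    # Throttle: wait for a free slot before launching another instance.
    while (@($procs | Where-Object { -not $_.HasExited }).Count -ge $maxJobs) {
        Start-Sleep -Seconds 5
    }
    $template = [IO.Path]::GetFileNameWithoutExtension($csv.Name)
    $procs += Start-Process -FilePath $dmt -PassThru -ArgumentList @(
        "-NoUI", "-Add", "-Update",
        "-User", "manager", "-Pass", "password",
        "-Import", "`"$template`"",
        "-Source", "`"$($csv.FullName)`""
    )
}
$procs | Wait-Process   # block until every DMT instance has finished
```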
Eventually, when some bits are more or less in a ‘done’ state, you can DMT them into the Live environment and they’ll be there in Pilot after a DB refresh. Then you’re just reloading the diff into Pilot on each test iteration.
Multiple DMTs from different clients are possible; mileage may vary. As @jbooker says, take the iterative approach: move confirmed config items into Live, restore Live to Pilot, and so on.
There is also the diff tool, which you can use to manage importing just the differences.
Open transactions are the last things you need before you go live. Cloud implementations definitely require more planning and prep to get that data-import cadence right; it’s not as if you can do a backup and restore at will like you can with on-prem.
Yeah that’s the one. Thanks for finding it. I’ve wanted to say thanks to the OP for a while now. Done.
I extracted his .cab files and imported the many BAQs. His excellent work saved me from designing BAQs to match every DMT template.
Then I created a BAQ called “baq_BAQueries” to return the list of BAQs and made a script to run them all using that result. Most of my running was manual from VSCode: with DMT’s -NoUI flag, I could just right-click and run the script in the VSCode terminal without dealing with the DMT UI at all.
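A sketch of that export loop, under the same placeholder paths and credentials as the earlier snippets; the -Export, -BAQ, and -Target switches are from the classic DMT export syntax and the BAQName column is hypothetical, so verify both against your DMT version before relying on this.

```powershell
# BAQ-driven export loop (sketch; -Export/-BAQ/-Target and the BAQName
# column are assumptions to verify against your DMT version).
$dmt    = "C:\Epicor\ERP11\ClientDeployment\Client\DMT.exe"
$outDir = "C:\DMT\Exports"

function Export-Baq([string]$BaqName) {
    # Run one BAQ export headlessly and write the result to CSV.
    Start-Process -FilePath $dmt -Wait -ArgumentList @(
        "-NoUI", "-Export",
        "-User", "manager", "-Pass", "password",
        "-BAQ", $BaqName,
        "-Target", "`"$outDir\$BaqName.csv`""
    )
}

# First pull the list of BAQs, then run every BAQ it returns.
Export-Baq "baq_BAQueries"
Import-Csv "$outDir\baq_BAQueries.csv" | ForEach-Object {
    Export-Baq $_.BAQName
}
```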