Test environment

So, I am looking to enable and improve a company’s Epicor system. However, it appears that the test environment is only updated upon request from the company (maybe once a year). Question: is it possible to have the test environment update on a regular basis — something similar to QuickBooks, where today’s live data is tomorrow’s test environment — updating each day so that you can run true testing of a process without affecting the live data?

There are many people on here who have created scripts and processes that they can run whenever they want to do a full refresh.

I have always wanted to learn the script and procedure, but haven’t taken the time to try and learn it. Don’t even know if I could learn it and feel confident with it.

In my (cloud DT) experience, I could not get support to set up a regular snapshot of Live to Pilot. I expect the process is labor intensive, so they only do it upon request. That said, I often request snapshots multiple times a week, and they almost always get done perfectly on time.

It would be nice if there were a way to set it up to update automatically each day - the same as backing up the live data, but restoring a second copy into the test environment.
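The scheduling half of that wish is the easy part. Here is a minimal sketch using the Windows Task Scheduler cmdlets, assuming a hypothetical refresh script — the path, task name, and run time below are all placeholders:

```powershell
# Run a (hypothetical) refresh script every night at 2am under SYSTEM.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-File C:\Refresh\Refresh-TestEnv.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am

Register-ScheduledTask -TaskName "Refresh Epicor Test" `
    -Action $action -Trigger $trigger -User "SYSTEM"
```

The hard part, as the next post points out, is deciding whether you actually want the refresh to be automatic.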

Just something to think about… we don’t refresh automatically because often we are testing a new process or customization and don’t want to lose that progress. We refresh ‘as needed’, but have a set of tools/scripts to help automate the process (a rough sketch follows the list below):

  • Create backup on Live Server
  • Move backup file to Test Server
  • Stop Application Server
  • Restore backup on Test Server
  • Apply a SQL Script to make changes that make the Test system distinct from Live. (Add [Test] to the company names, etc)
  • Delete temporary backup file
  • Start Application Server

There is more I would like to automate: stopping/starting the Task Agent, running Generate Data Model…
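A minimal PowerShell sketch of those steps, assuming the SqlServer module and hypothetical server, share, database, and app-pool names. Writing the backup straight to a share the Test Server can read folds the “move” step into the backup step:

```powershell
# Minimal sketch of a Live-to-Test refresh. Every name below is a placeholder.
Import-Module SqlServer

$liveSql = "LIVESQL"                          # hypothetical live SQL Server
$testSql = "TESTSQL"                          # hypothetical test SQL Server
$appHost = "TESTAPP"                          # hypothetical test app server
$dbName  = "EpicorERP"                        # hypothetical database name
$backup  = "\\TESTSQL\Refresh\EpicorERP.bak"  # share both servers can reach

# 1. Create backup on Live Server (copy-only, so the normal backup chain is untouched)
Backup-SqlDatabase -ServerInstance $liveSql -Database $dbName `
    -BackupFile $backup -CopyOnly

# 2. Stop Application Server (hypothetical IIS app-pool name)
Invoke-Command -ComputerName $appHost -ScriptBlock {
    Import-Module WebAdministration
    Stop-WebAppPool -Name "EpicorTest"
}

# 3. Restore the backup over the Test database
Restore-SqlDatabase -ServerInstance $testSql -Database $dbName `
    -BackupFile $backup -ReplaceDatabase

# 4. Apply the SQL script that makes Test visibly distinct from Live
Invoke-Sqlcmd -ServerInstance $testSql -Database $dbName `
    -InputFile "C:\Refresh\MakeTestDistinct.sql"

# 5. Delete the temporary backup file
Remove-Item $backup

# 6. Start Application Server again
Invoke-Command -ComputerName $appHost -ScriptBlock {
    Import-Module WebAdministration
    Start-WebAppPool -Name "EpicorTest"
}
```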

2 Likes

Here’s what we do:

https://www.epiusers.help/t/how-to-automatically-replicate-an-entire-environment-for-development-and-testing/84540?u=stevefossey

But it’s only going to work for on-prem. I’d love to see the REST equivalent to pull the whole DB down from the cloud, but then I guess you’d still need a license to run it locally.

5 Likes

Thank you for this information - I will seriously look into this and hopefully it works here as well.

There you go Robert! This community holds all the answers.

@RKromminga Here is a previous thread with an automated SSIS process that @markdamen runs, plus others take on the process.

1 Like

Or use the equivalent REST calls. If you can do it from the client (excluding Admin Console), you can do it with PowerShell and Invoke-RestMethod.
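As a hedged illustration, here is about the smallest possible Invoke-RestMethod call against the Epicor REST layer. The server, instance, credentials, and BAQ name below are all placeholders; check the API help/Swagger page on your own app server for the real endpoints:

```powershell
# Smallest useful Invoke-RestMethod example against the Epicor REST API.
# Server, instance, credentials, and BAQ name are all placeholders.
$user = "manager"
$pass = "Password123"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("${user}:${pass}"))

# v1 BAQ endpoint: returns the rows of a published BAQ as JSON
$uri = "https://epicorapp.example.com/ERP10/api/v1/BaqSvc/MyQuery/"

$result = Invoke-RestMethod -Uri $uri -Method Get `
    -Headers @{ Authorization = "Basic $auth" }

$result.value | Format-Table
```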

2 Likes

I think I could technically pull this off now, except there are some special tables and objects that aren’t actually stored in the database, but are held by a service outside of the instance.

If I’m honest and want to follow best DevOps practices, I would have a test/dev database with demo data to use for development work, upgrades, and regression testing. The database would be smaller and would reduce the risk of accidental data leakage.

Instead of a tool that copies everything, it would be nice to have a tool that selectively exports data, anonymizes it, and can import it into my dev/test instance as needed.

:person_shrugging:
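A rough sketch of that selective-export idea, assuming Invoke-Sqlcmd and the Erp.Customer table — the column names here are from memory and worth verifying:

```powershell
# Rough sketch: pull a slice of Live data, scrub PII, stage it for import.
# Server, database, and column names are assumptions to adapt.
Import-Module SqlServer

# Selectively export a subset of customer data from Live
$rows = Invoke-Sqlcmd -ServerInstance "LIVESQL" -Database "EpicorERP" `
    -Query "SELECT TOP (1000) CustNum, Name, EMailAddress FROM Erp.Customer"

# Anonymize the personally identifiable columns, keeping keys intact
$anon = $rows | ForEach-Object {
    [pscustomobject]@{
        CustNum      = $_.CustNum
        Name         = "Customer {0}" -f $_.CustNum
        EMailAddress = "cust{0}@example.com" -f $_.CustNum
    }
}

# Stage the scrubbed rows for import into the dev/test instance
$anon | Export-Csv -Path "C:\Refresh\Customer_anon.csv" -NoTypeInformation
```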

2 Likes

Oh, great idea, I wonder if the CLI version of Summarize could be used?

I’ve just run into a problem where I haven’t yet been able to deploy this on our new Windows Server 2019 servers. I can run Backup-SQLDatabase against my production SQL Server, but I can’t run it or Restore-SQLDatabase against my dev box.

Both are supposedly identical.

I can remote (PowerShell) into dev and load the SQLSERVER: provider, but can’t actually use those commands.

The error is simply “failed to connect to [server]” and it appears whether I’m remoting or local.

Any PS experts out there?

Is your firewall up on the server? I just had a similar issue with mine, and it was the reason…
Pierre
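
If it turns out to be the firewall, a quick way to confirm and fix from PowerShell — port 1433 is the SQL Server default, and the server name is a placeholder:

```powershell
# Confirm whether the SQL port is reachable from the machine that fails
Test-NetConnection -ComputerName "DEVSQL" -Port 1433

# If blocked, open the default SQL Server port on the dev box
New-NetFirewallRule -DisplayName "SQL Server (TCP 1433)" `
    -Direction Inbound -Protocol TCP -LocalPort 1433 -Action Allow
```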

1 Like