SaaS Deployment Server

Just wondering out loud in case someone might be listening…

…In the SaaS environment, every client goes out to get updates from the cloud, and each change gets downloaded once for each and every user. I’m sure that adds a lot of traffic to the Epicor content (and edge) servers. Any thought of doing something like Windows Server Update Services (WSUS) and having a local machine check for updates (or a sync service) that downloads to a local deployment server? It could save $$$ on Epicor’s Akamai bills, save bandwidth, and improve update performance for Epicor customers.

Just a thought,

Mark W.


Mark -
We have created a local FTP server for updating the clients because when there is a large update (like the one coming up, which will be around 250 MB), multiple clients updating simultaneously would bring our internet connection to its knees. The server synchronizes itself with the Epicor FTP server nightly, and a change to each user’s local hosts file redirects the requests to the local FTP server. This also allows the clients to update much more quickly. This is not a supported update method but has worked well for us so far.
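The hosts-file change is just one line per workstation; something like this (the hostname and address here are made-up examples, not Epicor’s real update host or our actual server):

```text
# C:\Windows\System32\drivers\etc\hosts
# Hypothetical entry: point the Epicor update host at the local FTP mirror
192.168.1.50    epicorftp.example.com
```

When the client asks for its update, DNS resolution short-circuits to the LAN mirror instead of going out over the internet.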



Hi Brad,

I didn’t think about doing a hosts-file redirect for the FTP site. That sure makes it easier than updating the .config’s Deployment URL on each workstation (which could be overwritten on updates). And we could have synced FTP sites in North America, Europe, and Asia as well.

In the longer term, maybe Epicor could add a localDeploymentServer item to the .config and use it if filled; otherwise use the primary DeploymentServer. They could even check the primary for updates, make sure the local has been updated, and warn the user when the local is out of date.
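Something like this in the config file, just to sketch the idea (element names here are illustrative; this is not the actual Epicor schema):

```xml
<appSettings>
  <!-- Illustrative fallback scheme only, not real Epicor config keys -->
  <DeploymentServer value="https://updates.epicor.example/" />
  <!-- If present and reachable, the client pulls from here instead -->
  <LocalDeploymentServer value="ftp://deploy.local.example/" />
</appSettings>
```

The client would try the local entry first and fall back to the primary when the local mirror is missing or stale.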



SaaS infrastructure has added a CDN, so bits will be deployed closer to your geography. This is already active on all pilot instances and is being rolled out to live as we monitor the new infrastructure partner running the CDN. Hopefully this will ease a bunch of pain.

More to come…


The CDN will certainly lower the strain on Epicor’s servers, but we’ll still have all our clients pulling the exact same client release through one pipe around the same time, so any caching we could do locally would certainly lower the strain on the end-users’ side, which is nice.


Mark W.

Understood on the need. The bits should be closer but still a lug to pull.

This is something I’ve been curious about as an OLD IT infrastructure guy from decades ago. Client/server was a means to offload mainframe effort due to performance and scale problems. Now we are going back to the mainframe via the cloud. What kinds of things will the industry look at due to the same pattern problem? The CDN is the first step.

Is a variation on a caching proxy server the next device all companies will start buying to improve client performance? That flies in the face of what a lot of C-levels are looking for: zero footprint of a local datacenter. But as more and more cloud is adopted, companies are noting (and will keep noting) that IT folks are not going away and are still needed; the tools and pains will just evolve. As an industry, do we evolve away from hosting and migrating, and instead improve the local networking infrastructure at the average installation?

My personal mission is to minimize the number of bits flying around so you don’t need the network gymnastics. We wouldn’t have this chat if it were smaller.


I think we’re seeing WAN optimizers already and some of those do some kind of caching.

Something that I’ve been watching is Content Centric Networking. It started at PARC with Van Jacobson, the guy who wrote traceroute. It makes total sense, but that’s not what always ends up in our technology stack…

And we wouldn’t have this chat if everyone used a web client!


Mark W.

Not totally true - browser clients include JavaScript libraries that can be pretty obese as well. And those get downloaded every time you open a form or a page if you don’t do your caching, minifying, versioning, etc. correctly. Browsers are not a panacea, just different issues. And don’t get me started on the security headaches.
In both cases you need to be smart about what you are sending down. If you neglect things, things are bad :wink:
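The usual web-side trick is to put the version in the bundle’s filename so a new release is a new URL, then cache the old one forever. Roughly (filenames and values here are illustrative):

```text
<!-- versioned filename: a new build produces a new URL, so no stale cache -->
<script src="/js/app.3f9c2e.min.js"></script>

HTTP response header for that bundle:
Cache-Control: public, max-age=31536000, immutable
```

Skip that and every form load re-downloads the same megabytes - the same problem we’re describing with the client installer, just in miniature.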

Oh, I agree Bart. We’re a long way from 3270 terminals.

And then there’s JavaScript. JavaScript? How the hell did that become the language of the web? < shrug > I know Google tried to replace it with Dart, Microsoft with TypeScript, and then there are the compile-to-JavaScript languages like CoffeeScript. I guess it’s not the language so much as the sandbox it should run in. Downloading code and running it on a local machine? Doesn’t seem like a good idea. And then you have the idea of React, where you bind the browser and server and run the code where it’s most efficient. Cool idea really.

I do like the idea of a caching proxy to help eliminate those bytes though…

Was more of a VT guy - started in VMS so the progression to NT was pretty easy :wink:
Remember: (DEC) VMS = WNT (Windows NT) - increment each letter by one. And MS did hire two of the VMS architects, so there is definitely some carry-over in personality to the early platform.

I’ve been fascinated by appliances in that arena for some time. I remember Strangeloop Networking having one back in the early 2000s for ASP.NET - Richard Campbell of RunAsRadio and Dot Net Rocks fame founded it with others. There were (are?) a few like that out there. Back when I did IT hardware in the 80s and early 90s I liked them but I’ve been out of that loop for decades. It’s all software based now if you look at all the Azure virtual networking.

Do you see Virtual Desktop Infrastructure (VDI) solving some of these problems? Now you’re moving only clicks and video at the edge and have more control over the desktop. With the amount of virtual storage out there and the power to give a single workstation as needed, it might help get to that zero-footprint goal for some organizations.

Hi Bart,

I’m with a new Epicor customer in Australia, limited experience rolling out a client so far. We have the same issue with roughly 500-600 PCs to deploy the client app across about 30 disparate sites.

Based on the March/April SaaS update to 10.2.300 SLS, we will end up with over 200 GB of downloads on the Monday morning of the next upgrade. There’s a pretty valid concern that this will cripple our network and impact users getting on with business during this time.

I see the client config file has deploymentServer=“URI” option, so my question is:

  1. Is there an Epicor technical guide and download scripts to update the client app?

e.g. an Epicor PowerShell script to download the updates to an internal deployment server (file, web server, or FTP), with the API call in the PS script scheduled overnight. Any updates get downloaded to the local deployment server; the local clients then check for updates and install them from the LAN.

My simplistic view is that the Epicor client app is already making these checks when it first starts: check for updates > download > extract > install > restart the new .exe.
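Just to make my thinking concrete, the version check for a nightly sync job might look like this - a rough sketch only; the version format and file layout are my assumptions, not Epicor’s actual mechanism:

```python
from pathlib import Path


def needs_update(local_version: str, remote_version: str) -> bool:
    """Compare dotted version strings, e.g. '10.2.200' vs '10.2.300'."""
    def to_tuple(v: str):
        return tuple(int(part) for part in v.split("."))
    return to_tuple(remote_version) > to_tuple(local_version)


def lan_server_is_stale(version_file: Path, remote_version: str) -> bool:
    """True when the LAN deployment server should re-sync from Epicor.

    version_file holds the version string of the last package we mirrored;
    if it doesn't exist yet, we have never synced and must download.
    """
    if version_file.exists():
        local = version_file.read_text().strip()
    else:
        local = "0.0.0"
    return needs_update(local, remote_version)
```

The overnight job would run this check, pull the package down once if it is stale, and the 500-odd clients would then hit the LAN copy instead of the internet.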

But as mentioned, I’m a new customer; perhaps this has been solved and I just need a pointer to it? Anyway, apologies for the rambling - just keen to get some options.

Thanks in advance.


Hey John,

We feel your pain. Double whammy: there is no silent install (or wasn’t) for the Cloud installer but there is for on-prem. :expressionless: The good news is the install of the client is VERY lightweight (no PATH changes, no Registry Keys, etc.). @jstephens shared a post with his PowerShell script that might work for you.

This is a real pain point for SaaS adoption and there have been more suggestions regarding this so the more people who express their hurt, the more attention it may get.

Worst-case scenario: create shortcuts for users with the -SKIP option, then train them to manually execute an upgrade script (which replaces their Epicor folder from a central repo) whenever they get a version-mismatch error upon login.
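That upgrade script can stay dead simple since the client install is so lightweight - a sketch of the folder swap (paths and folder names here are placeholders, not Epicor’s actual layout):

```python
import shutil
from pathlib import Path


def mirror_client(central_repo: Path, local_client: Path) -> None:
    """Replace the local Epicor client folder with the central repo's copy.

    Works only because the client has no registry keys or PATH entries to
    fix up - it's just files on disk.
    """
    if local_client.exists():
        shutil.rmtree(local_client)          # wipe the stale client
    shutil.copytree(central_repo, local_client)  # pull the fresh one
```

Wrap that in whatever the users double-click, point `central_repo` at the network share you refresh after each SaaS update, and the mismatch error goes away.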

Mark W.