Mobile CRM and Epicor Web Access - Security/Best Practice

I am reaching out to you guys for help with security.

With the implementation of the REST architecture, mobile CRM, and things like web access, we are now able to utilize these tools outside of the organization (provided there is an SSL certificate).

However, allowing these tools to be accessed in this manner increases exposure to cybersecurity threats.

I opened this thread to get help on how to identify those threats and how to mitigate them. I was hoping that someone could help clarify what the risks are (if any) of opening up port 443 for something like CRM and web access and then what tools there are in Epicor to help deal with those. I understand that some of the ways to mitigate risk are network related, but just assume we are all good there. What is there to maximize our security defenses in Epicor? I am open to hearing about network setups as well though.

I am hoping to hear about recommendations on password policies, integration accounts, Azure AD authentication, and the like. Is that all we should be setting up? Do we need to think about the Epicor server architecture differently? Would it be a good idea to create a separate server and put app servers/task agents on there to handle mobile CRM and Epicor web access?

I know this is not a straight answer, but I was hoping someone could outline what threats there are in terms of security and what we could do in Epicor to best prevent those threats.

As far as I understand it, the most someone could do with these things being exposed is log in and then do whatever that user has access to in Epicor… does that sound correct?

Thanks,

Utah

Two thoughts here:

Azure API Management and Azure Application Gateway

Both will let you have Azure handle the public facing portion of the Internet, take care of authentication, etc. API Management is probably more suitable for exposing your RESTful functions while the Application Gateway acts like a reverse proxy to your on-prem Epicor.


The overall problem is that a hole in the firewall is a HOLE in the firewall - whether it’s yours or one in the cloud somewhere. If you are a target for any reason, you have risk. There are a handful of options that would take all day to explore, as your question is a bit broad - but let me offer these tidbits.

You are correct in thinking there is some combination of old-school (DMZ?) and new-school (Azure) that could be secure enough for you, especially if you combine that with a decent password policy. Given that your technology platform, budget, skillsets, etc. will determine what you can do, and if you can mitigate the potential threat to an internally acceptable level, you can certainly get comfortable allowing external access to your apps.

MANY, many folks will disagree with that statement, but not all can exist in the perfect world of money+skill+time+tech that allows for the perfectly secure system (does that even exist?). I suggest that you seriously consider the options and costs, and determine what you can do.

The concept of a DMZ is still very viable and creates an insulation layer between your LAN and the Internet. It takes some consideration to architect it - and may require additional licenses that can be expensive when facing the internet, but it works very well to create a difficult layer to breach.

If you are just thinking about Epicor, then Azure AD integrated logins with 2FA and a strong complexity/expiration policy is great (although some would argue that forcing strong passwords has the inverse effect - see recent articles by Microsoft on the elimination of passwords). I also agree with @Mark_Wonsil on the use of the Application Gateway - we are looking into that presently as well.

The Mobile CRM is the only thing we allowed to traverse the firewall at present. It’s set up with a public dns and SSL cert, through a pinhole in the firewall. All of our Epicor users have strong passwords, and we’re trying to get to a patch level that will allow Azure AD integration. Once there, we intend to move to 2FA across the enterprise.

We have a few apps that we want to expose or connect to with mobile apps, but they are not capable of Azure AD and two-factor, so we cannot expose them until we have a better plan in place. We’re sort of in the same boat as you :slight_smile: in this project.


Great discussion. I’d like to expand a little bit on this point. But first, I want to review the idea of a Zero Trust Architecture because in the end, it’s not just Epicor that we are worried about. Microsoft summarizes Zero Trust into three points:

  • Verify explicitly
  • Use least privilege
  • Assume Breach

Verify Explicitly
This means use all data points possible to determine if the request is valid. Do I know this device? Is it patched? Is this a normal time to do this kind of work? Does this person regularly use this service? Is the IP from a range of known logins for this account? If not, I can request MFA, issue a biometric challenge, require a hardware token, etc. Azure calls these Conditional Access rules. With a firewall, you generally just need to know the IP and Port and your request is forwarded regardless of who you say you are. Yes, you can create inbound rules for known IP addresses, etc. but that is a lot more work than Conditional Access.

Use Least Privilege
The best non-computer example of this is the safe at the convenience store. “Give me all your money.” “I can’t. The safe only lets me have $50 once an hour. You can have $50 though…” The idea is to give the least amount of access for the least amount of time. Many of us now use different accounts for different purposes (or should anyway). I have an Epicor Admin account but it doesn’t have an email address so it’s more difficult to attack.

Apple does something similar with the iPhone fingerprint scanner (and I assume Face Scan). The biometric hardware is completely separate from the rest of the phone. The phone and scanner communicate through a secure enclave. The system puts a message in the enclave, “Hey, is this Mark?” The scanner picks up the message, does the scan, and puts the reply back into the enclave. There is NO WAY for a remote user to access your biometric info because the phone does not have that privilege. Similarly, we can use the Gateway as an enclave where we exchange messages instead of having the privilege to go through a hole in the firewall.

Assume Breach
Threat actors can fail many times. Companies only have to fail once. Complex passwords and even MFA are great, but this is whack-a-mole. The actors will work around those with supply chain attacks, zero-day bugs, and simple social engineering. The problem with a firewall is that it assumes it is protecting a secured network. Talk to anyone who has experienced a malware attack and they know that there is no secure network. The recommendation is to microsegment the network. Take the ERP servers off the domain. There is no business reason for them to be in there. Moving it to the cloud is not more secure if you have a “not least privileged” VPN connecting it to your local network. Use Web APIs, which can be secured with Conditional Access. Separate your Operational Technology (production machines), your security network, your video network, your HVAC network, and each building from the main network. If any one of those gets breached, you restrict the blast radius.

And if I haven’t made you uncomfortable enough already: Active Directory is dead. You cannot do this stuff in an AD only world. Even Microsoft is not installing any more AD servers. They will retire AD within the Microsoft internal network as existing AD servers age out. Yes, they will still support AD for years to come but the AD Trust Model just doesn’t work well in a Zero Trust world.


Thanks @Mark_Wonsil and @MikeGross I will take both of these statements into consideration.

@Mark_Wonsil’s expounding on my basic ideas is some great info on these subjects (albeit a bit “Nostradamus” :slight_smile: )

There is sooooo much to this topic that it would be impossible to cover it all here, and it’s changing a lot lately with the new attack vectors and their respective defenses. Can I suggest that you find some professional consulting assistance? Someone who can sit with you, ask a lot of questions, and guide you through all of these topics.

Also - in looking at the Azure stuff, I think the Azure Application Proxy might work better for us both and be the simplest to implement if you are already in Azure.


:poop: I thought that’s what I posted. My bad.


Mike if you know someone I would appreciate their contact info. And if they have done work with Epicor that would be helpful too.

Our organization is starting to ask for applications that require exposure and I just want to make sure they (and I) are aware of what we are signing up for. The last thing I would like to do is provide a great application at the expense of our cybersecurity defenses.

Do you know anyone?

I will look at that. I was already watching videos on the Application Gateway. I will now investigate the Application Proxy. We don’t do much with Azure right now, but I think you can always set up on-prem virtual servers to interact with the Azure tools, yeah?

(@Mark_Wonsil )The Application Gateway seems to be a layer 7 load balancer for Azure based apps, but I’ve confirmed that it will also use on-prem endpoints. The Application Proxy is simply a proxy for on-prem endpoints, so simpler and more direct in the end.

@utaylor - I do not have anyone in particular, sorry. If I did, I’d send you their name. What I do have is a really paranoid Network & Systems Engineer who does a lot of “No, that won’t work because…” when I ask for things, and I learn from him. The more I look at the App Proxy, the more I think it is the way to go.


Yes, this is what I meant to post. The names are close and I grabbed that link instead. :man_facepalming:


Separately, today is World Password Day! Here are Google’s thoughts:


Thank you both! I looked into both of these as much as I could today.

For either one of these though, how would something like Epicor’s CRM app pull up a prompt for the user to log in with Azure AD? Would it really work?

I have to read and watch some more videos on how this works with an app. I can see it working fine with a URL to a website hosted on an on-prem server, but I can’t picture how it would work when an app on a phone reaches out to an endpoint.
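For what it’s worth, the usual pattern for a native/mobile app is the OAuth 2.0 authorization code flow: the app opens a browser or webview at the Azure AD authorize endpoint, the user signs in (with MFA if Conditional Access demands it), and Azure AD redirects back to the app with a code it exchanges for tokens. A minimal sketch of building that authorize URL in Python - the tenant, client ID, and redirect URI below are placeholders, and whether Epicor’s CRM app supports this flow is exactly the open question here:

```python
from urllib.parse import urlencode

# Placeholder identifiers for illustration only
TENANT = "contoso.onmicrosoft.com"
CLIENT_ID = "00000000-0000-0000-0000-000000000000"

def build_authorize_url(redirect_uri: str, scopes: list, state: str) -> str:
    """Build the Azure AD v2.0 authorize URL a client app opens in a
    browser/webview to start the authorization code flow."""
    base = f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/authorize"
    params = {
        "client_id": CLIENT_ID,
        "response_type": "code",       # ask for an authorization code
        "redirect_uri": redirect_uri,  # where Azure AD sends the code
        "scope": " ".join(scopes),
        "state": state,                # opaque value echoed back, anti-CSRF
    }
    return f"{base}?{urlencode(params)}"

url = build_authorize_url("myapp://auth", ["openid", "profile"], "xyz123")
```

In practice a real app would use a library like MSAL to handle this plus the token exchange, rather than assembling URLs by hand.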

Thank you both again for giving me these options.

E-mails with dummied-up fake O365 invoices/login scrapers are the biggest current scourge we’ve seen.

Pass-phrases with M$ 2FA, and our VPN users get a bonus token generator. The rest is as you correctly observed: whack-a-mole!


Hey all, I just want to chip in my thoughts on this as I just had a customer tell me that they needed their Epicor server on the internet with an SSL cert. I told them no, most definitely not.

These are my thoughts, but I am curious to hear others responses to my opinions:

1) You should never, EVER, expose a live production ERP server to the internet. That is security 101 and should not need any explanation.
2) Epicor is running on IIS. Historically, IIS has had an absolutely TERRIBLE track record for being a secure web server. So even if you set up everything according to best practice, there is still a good chance that the bad guys can walk right in, if not this month then sometime over the next 6 months.
3) Let’s say you open port 443 but open it only to your website server so that it can query parts, for example. Well, then when your website gets hacked they can walk right into your production server again.
4) SSL has had its share of major vulnerabilities over the years too, with HeartBleed being one of the worst. Yes, I know HeartBleed didn’t affect IIS’s implementation of SSL, but my point is just because it’s SSL doesn’t mean it will not have issues.

My point is that the risks of ransomware, data exfiltration, and even DoS attacks are just way too high to put a production ERP server on the internet. No system is perfect and at the end of the day, the risks FAR outweigh the benefits. So what did I suggest to them?

When we consider how to get data out of Epicor and to a webserver safely which doesn’t require us to externally expose our Epicor server, two ways come to mind but I’m sure there are more
1) The first is to create an automated DMT export which dumps all required data into a .CSV file and then ships it off to where it is needed via SFTP. The website can query the CSV file on its server (or a script can update SQL tables on the web server), and we have total containment with zero risk to our production server.
2) The second way, a LOT more time consuming to set up and manage, is to set up a separate Epicor server which is isolated from everything else on our network, then expose that server to the internet, and then schedule a copy of the database from LIVE to this outside one on regular intervals. This is still not a good idea though as something like the Heartbleed attack would have still allowed a total compromise of the server and exfiltration of the entire database.

A beneficial side-effect of the .CSV approach is that it makes integration with the external website totally independent of the Epicor version. The REST API can change, Epicor can be updated, etc., all with zero concern for how it will affect the external website. Not having to engage website developers to test every single Epicor upgrade is a huge time and cost saver.
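The export-and-ship pattern above could be sketched roughly as follows - this is an illustration only, not Epicor tooling; the host, user, field names, and paths are all hypothetical, and the SFTP step uses the third-party paramiko library:

```python
import csv

def export_parts_csv(rows, path: str) -> str:
    """Dump extracted rows (e.g. from a scheduled DMT/BAQ export)
    into a CSV file the web server can consume."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["PartNum", "Description", "OnHandQty"])
        writer.writeheader()
        writer.writerows(rows)
    return path

def ship_via_sftp(local_path, host, user, key_file, remote_path):
    """Push the CSV to the web server over SFTP using key-based auth.
    All connection details here are placeholders."""
    import paramiko  # third-party: pip install paramiko
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.RejectPolicy())  # refuse unknown hosts
    client.connect(host, username=user, key_filename=key_file)
    try:
        sftp = client.open_sftp()
        sftp.put(local_path, remote_path)
        sftp.close()
    finally:
        client.close()

rows = [{"PartNum": "P-100", "Description": "Widget", "OnHandQty": 42}]
export_parts_csv(rows, "parts.csv")
# ship_via_sftp("parts.csv", "web01.example.com", "feeduser",
#               "/home/feeduser/.ssh/id_ed25519", "/feeds/parts.csv")
```

The upload call is commented out because it needs a reachable host; the key design point is that only the flat file ever leaves the network, never a connection into the ERP server itself.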

Well, you might need to explain “Expose … to the internet” to some people, because all of us that use a cloud hosted Epicor have our ERP exposed to the internet in some fashion or we wouldn’t be able to access it. :joy:

Exactly. We would never consider cloud hosted Epicor for all of the reasons that I mentioned. Best of luck to those who do, it’s definitely not a path we would consider with any vendor, not just Epicor.

Can you answer the below questions? The numbers correspond with your numbers.

2.) Can you give me an example of what “walk right in” means? What types of attacks are these, and do you have any Ars Technica or other articles about these attacks? Anything that can support your statement of a “terrible track record”? I am not asking this to refute your argument, I would just like to read more on it.

3.) Our website getting hacked would just be the IIS website for epicor, right? What do you mean, “walk right into the production server?” What kind of access would they have, are they able to do anything and everything or only what the website services are capable of? Again, do you have any articles about this type of attack?

4.) Thank you for the example on this one.

5.) When you consider how to get data out of the epicor server…. This is a great thought-provoking question…. How do I get the data out, what type of data is it? Is it tactical data, strategic data? Do I need it to be real-time? What if your users need the data real time? Do you sync your external server every second?

Epicor made a mobile CRM app that integrates directly with the Epicor app server via port 443 and without linking it to an app server, it does not work. So, I take it you would never use that app so then do you build your own app that outputs the correct files?

While DMTs do not change that often, I have had my fair share of templates that change from version to version.

Thank you for sharing your approach, it mitigates the risk of exposing your Epicor server, but man it sounds like a ton of work to keep up with the syncing and DMT research that goes into how to integrate it. I would appreciate examples of the topics above.

Hey @aaronssh, I’m gonna take issue with some of your points, not because they’re necessarily invalid, but because they’re not absolutes.

1) You should never, EVER, expose a live production ERP server to the internet.

I’m not sure why this is a thing that one should “NEVER” do; Epicor ERP is just another web app, and while yes, it contains a lot of your business data, it is no different than SAP, Salesforce, or Dynamics CRM - those are all cloud hosted and “on the internet.” Is there risk? Sure, there’s always risk, but as long as you keep your system up to date, use a robust authentication mechanism (second factor), and protect yourself with intrusion detection, lockout policies, etc., you should be fine.

2) Epicor is running on IIS. Historically, IIS has had an absolutely TERRIBLE track record for being a secure web server

IIS is no more or less secure than NGINX or Apache or any other web server. As long as you keep your system up to date, then barring a zero-day flaw you should be okay. The same goes, by the way, for Apache or NGINX: there is no bug- or flaw-free software, and at least with Microsoft you have a large (liable) corporation in front of it. There are millions of websites hosted on IIS, and sure, there have been issues, but that’s life in a connected world. As long as you put the proper precautions in place and keep your systems up to date, you will be fine.

4) SSL has had its share of major vulnerabilities

If SSL is good enough for every major bank and financial powerhouse in the world… it is certainly good enough for anyone else. The issues with SSL are library specific and have been addressed as soon as they were found. SSL is meant to protect your transactions on the wire, and it is bulletproof encryption (as long as you are using the most recent ciphers and software).
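To illustrate the “most recent ciphers and software” caveat from the client side, here is a minimal Python sketch (an illustration, not anything Epicor ships) of building a TLS context that refuses anything older than TLS 1.2, so a downgraded or misconfigured connection fails fast instead of silently negotiating a weak protocol:

```python
import ssl

def modern_tls_context() -> ssl.SSLContext:
    """Client-side TLS context that enforces TLS 1.2 or newer.
    create_default_context() already verifies certificates and
    hostnames by default; we additionally pin a protocol floor."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

ctx = modern_tls_context()
# Pass this context to http.client / urllib / requests-style calls
# that accept an SSLContext when talking to an exposed endpoint.
```

The point is simply that “uses SSL” is not a binary: the protocol floor and cipher configuration on both ends determine how much protection you actually get.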

DDoS attacks can be mitigated by services like Cloudflare and others, and 99% of ransomware comes in via email or your users clicking the wrong thing. Although not impossible, it is highly improbable that an IIS flaw would be the vector of attack for ransomware, though it could happen. The threat of ransomware is huge and looming, and there is only one way to be protected: bulletproof, immutable backups. It is not a question of if, but when, you’ll get tagged.

APIs are generally the best way to get data out of the system; you can generate API keys and scopes which narrow down the attack surface, and if you are paranoid you can even put all of that behind an Azure reverse proxy for an additional level of protection. You can rate limit and throttle to your heart’s content.
IMO, putting a CSV file on an FTP/SFTP site somewhere is fine enough, but it makes real-time integration nearly impossible and it’s error prone. And by the way, giving an outside entity access to that SFTP/FTP server is akin to giving them access to a web server; there are flaws in all software, and FTP/SFTP is no exception.
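The rate limiting and throttling mentioned above is usually configured at the gateway (e.g. Azure API Management policies) rather than hand-rolled, but the idea behind most of those knobs is a token bucket. A minimal sketch, with illustrative rate/capacity numbers:

```python
import time

class TokenBucket:
    """Allow at most `rate` requests per second, with bursts up to
    `capacity` - the same model a gateway applies per API key."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens refilled per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill tokens for the time elapsed since the last call,
        # capped at capacity, then spend one token if available.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=5.0, capacity=10)
results = [bucket.allow() for _ in range(12)]  # burst of 12 rapid requests
```

In a rapid burst, the first `capacity` requests pass and the rest are rejected until the refill rate catches up; a gateway would return HTTP 429 for the rejected ones.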

All that said, security is a complex topic and there are a thousand ways to skin a cat; however, I’d recommend that instead of dealing in absolutes, you weigh the pros and cons of each approach and take the one you are comfortable with.

If you are a small shop with little to no IT staff and not enough know-how to manage all this, by all means go with a cloud offering. Not only is it secure… it’s convenient.


Jose, it sounds like you’re a little more comfortable with ransomware risk than me. For us, we can’t afford 21 days of downtime. My approach to security is that we don’t fill our fortress with windows and say “well, it’s ok, there are locks that usually work” or “these windows are just as secure as those windows.” We remove all windows from the fortress, commonly called “reducing the attack surface” in the security space. Exposing your live ERP to the internet is one such window that we do not want in our fortress wall. Not everyone wants to spend the time and money to do it that way; I guess it depends on how much one hour of downtime costs each individual company. But that is our approach, and it has kept us out of trouble many times. We watch as Azure and others have extended downtime while we keep chugging along with zero downtime.

Source: https://securityandtechnology.org/wp-content/uploads/2021/04/IST-Ransomware-Task-Force-Report.pdf