Epicor REST outside the domain

Epicor documentation contains the following:

There are several ways of enabling users on the public internet to connect to services and protocols you have
configured on servers on your network. The requirement for the ERP REST endpoints is that the Secure Socket
Layer (SSL) https channel (typically port 443) be enabled and open for inbound/outbound traffic on your Epicor
application server. The SSL certificate used to enable the SSL binding must be issued from a recognized certificate
authority vendor. With SSL in place and the SSL port open to Internet traffic, any application designed to use the
ERP REST endpoints (and that provides authentication into Epicor ERP) can use an Internet connection to get and
update ERP data.
Describing the network configuration required to securely host a server on the public internet is beyond the scope
of this guide. Consult with your IT network engineering staff to establish the strategy you are using to securely
host the ERP REST endpoints on the public internet. Your strategy may include installing an additional ERP
application server that is dedicated for this purpose.

We have avoided having to do anything that exposes any part of our Epicor installation so far, and our general IT support and maintenance is out-sourced. What’s the best way of dealing with this? Any good tips?

1 Like

We use a load balancer in a DMZ with an IP whitelist of allowed machine IPs to expose REST outside of the network. This could also be done with a reverse web proxy server, but I opted for the load balancer instead.
Basically, if the calling machine's IP is on the whitelist, the load balancer routes HTTPS traffic through the firewall to port 443 on your designated app server.
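The routing rule described above boils down to a simple check. Here's a minimal Python sketch of the decision a DMZ load balancer or reverse proxy makes before forwarding traffic; the IP addresses, networks, and internal hostname are illustrative assumptions, not real values:

```python
# Sketch of the whitelist check a DMZ load balancer / reverse proxy
# performs before forwarding HTTPS traffic to the internal app server.
# All addresses and the internal hostname below are hypothetical.
import ipaddress

# Allowed caller IPs/networks (hypothetical values)
WHITELIST = [
    ipaddress.ip_network("203.0.113.10/32"),   # a single partner machine
    ipaddress.ip_network("198.51.100.0/24"),   # a partner's office range
]

INTERNAL_APP_SERVER = ("epicor-app.internal.example", 443)  # assumed host

def route_for(caller_ip: str):
    """Return the internal (host, port) to forward to, or None to reject."""
    addr = ipaddress.ip_address(caller_ip)
    if any(addr in net for net in WHITELIST):
        return INTERNAL_APP_SERVER
    return None

print(route_for("203.0.113.10"))   # on the whitelist -> forwarded
print(route_for("192.0.2.77"))     # not listed -> None (rejected)
```

In a real deployment this logic lives in the load balancer or proxy configuration rather than in application code, but the effect is the same: unlisted callers never reach the app server at all.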

I think there are lots of ways to skin this cat, but just wanted to offer up what we’ve done.

3 Likes

I noticed that starting in 10.2.400 they added Access Scope. I have only briefly tested this, but it’s what I expected it to be. You can put access scope on a user, a REST API key, or supposedly an entire app server. You say explicitly what BAQs and services are authorized and that’s all the connection is good for. And it takes precedence over user security (as in, the most restrictive wins).

Moreover, you can shrink the number of exposed methods for an individual service too.
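To make the scoped-key idea concrete, here's a hedged sketch of building (not sending) a REST v2 call to a BAQ using an API key that carries an access scope. The server name, company ID, BAQ name, and the `X-API-Key` header name are assumptions to verify against your own installation's REST help pages, and your URL may also include the app server instance name (e.g. `/EpicorERP`):

```python
# Hedged sketch: constructing an authenticated GET against a BAQ via
# the REST v2 endpoint, passing an access-scoped API key in a header.
# Server, company, BAQ, and header names are illustrative assumptions.
import base64
import urllib.request

def build_baq_request(server, company, baq_name, user, password, api_key):
    """Build (but do not send) an authenticated GET request for a BAQ."""
    url = f"https://{server}/api/v2/odata/{company}/BaqSvc/{baq_name}/Data"
    credentials = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url, method="GET")
    req.add_header("Authorization", f"Basic {credentials}")
    req.add_header("X-API-Key", api_key)   # assumed header name
    req.add_header("Accept", "application/json")
    return req

req = build_baq_request(
    "epicor.example.com", "EPIC06", "MyExternalBAQ",
    "restuser", "secret", "scoped-api-key")
print(req.full_url)
```

If the key's access scope doesn't list the BAQ or service being called, the server refuses the request even when the user credentials are otherwise valid, which is the "most restrictive wins" behaviour described above.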

That sounds promising, and was the kind of thing I hoped might be practical and acceptable. I’ll see if something along those lines can work for us.

I think we’d apply as much of this as we can. But I notice it’s fairly common for knowledgeable people to throw up their hands in horror at the very idea of the ERP servers being accessible externally AT ALL, whatever security you apply to the Epicor system itself, so I’m reluctant to rely too heavily on it.

Yes @dhewi,
I found what you post in the Epicor EVA installation guide

The only advice that many IT specialists emphasised is: do not open this port and/or allow these calls to point at your SQL server. That is, if your IIS/Epicor app server is installed on the same server as SQL (as per the Epicor recommendation for companies below 50 users), then install an extra one in a different location (server) and connect it to the SQL server. In other words, the servers you open port 443 to should be away from your DB server, and preferably on a server away from your day-to-day activities.

1 Like

We have two app servers already, and each has its own VM separate from the one that hosts SQL Server. I'm open-minded about a third app server, having noted the Epicor hint, but I can't see it would achieve very much in our case - adding external-facing duties to the secondary one already in place would do much the same job.

Thinking rationally about the security issues, there is no way to open access to the ERP without external systems being able to affect our internal ones, because that’s the point of doing so, and putting extra pieces in the chain between external and internal doesn’t in itself change that.

So my cautious conclusion is that it’s wise to let the Epicor system handle the “clever” parts of the security rather than try to re-invent that part of the wheel, and try to stop it having to deal with crude problems by means of some relatively dumb device in a bouncer role. I’m interested to note that the replies above are along those lines. But I’m keen to hear anyone else’s experience having actually implemented any of this stuff.

We will implement it soon, but as I explained, as long as your SQL server is protected, I agree with what you are saying. This layer of security is designed as part of a general IT risk and recovery plan to avoid any possible hacking of the SQL database. Besides, in my environment I am allowing REST read calls only, i.e. ONLY the GET command; all other commands only work from the internal domain.
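That "GET only from outside" rule can be enforced in front of the app server with very little logic. Here's a small Python sketch written as WSGI middleware; the internal network range and demo app are assumptions for illustration, and in practice the same rule would usually live in the proxy or gateway configuration:

```python
# Sketch of a "read-only from outside" gate: requests from outside the
# internal network may only use GET; everything else is rejected before
# reaching the app. The internal range below is an assumed example.
import ipaddress

INTERNAL_NET = ipaddress.ip_network("10.0.0.0/8")  # assumed internal range

def read_only_gate(app):
    def gate(environ, start_response):
        caller = ipaddress.ip_address(environ.get("REMOTE_ADDR", "0.0.0.0"))
        method = environ.get("REQUEST_METHOD", "GET")
        if caller not in INTERNAL_NET and method != "GET":
            start_response("405 Method Not Allowed",
                           [("Content-Type", "text/plain")])
            return [b"Only GET is allowed from outside the domain.\n"]
        return app(environ, start_response)
    return gate

# Tiny demo app and helper to exercise the gate without a web server
def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

gated = read_only_gate(app)

def call(ip, method):
    status = []
    body = gated({"REMOTE_ADDR": ip, "REQUEST_METHOD": method},
                 lambda s, h: status.append(s))
    return status[0], b"".join(body)

print(call("203.0.113.5", "POST"))  # external POST -> 405
print(call("10.1.2.3", "POST"))     # internal POST -> 200
print(call("203.0.113.5", "GET"))   # external GET  -> 200
```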

1 Like

Hello Daryl,

You probably know what I'm going to propose, but because our ERP10 is in Azure, it would make sense to put the public portion of the interface you want to expose in Azure or SharePoint Online and let Microsoft handle the front-end issues for you. You can use Azure Active Directory to create accounts separate from your local AD. You would then create a gateway in Azure to your on-prem server, and now your ERP system can listen on that one IP in Azure, filtering out undesirable traffic.

This would also give you the added benefit of encapsulating your business logic away from Epicor, so if you were to buy a new company, upgrade versions, or even switch ERP vendors, you would just have to rewrite the code behind the interface to your customers/suppliers.

Mark W.

2 Likes

Interesting thoughts, and I have been looking, slightly naively at this point, at something along those lines. It’s a steep learning curve, but probably close to the definitive solution for the future. We are actively looking more at Azure, even though we have already concluded it isn’t the answer we need for Epicor directly at least for the next year or two, so hybrid solutions are looking increasingly attractive.

We’re still exploring right now, though, and the immediate need is only to expose dev to the outside world. So I think we’ll be able to be a bit simpler than this at first.

1 Like

@Mark_Wonsil You don’t get to manage any of the Azure settings with a hosted account, correct?

My understanding is you want to put everything behind an Application Gateway first, and then you can have your IIS servers either hosted or on-prem.

So the way we chose to do it was to expose a middleware service which tightly restricts which data / service endpoints are accessible and how.
Granted, this was before Access Scope was available; we also chose to ONLY expose the BAQ/UBAQ and (now available) Functions endpoints.

The regular BO endpoint we chose to keep private. (IMO Too much RISK)

This way we can tightly regulate exactly what is available on the DMZ and to whom. We whitelist the BAQs we want available and match them up with assigned keys for each vendor/resource (à la Access Scope).

Again, using regular Access Scope will probably solve most of these issues, but we really didn’t want to expose the BOs at all (a bit paranoid, perhaps).

This is re-inventing the wheel and I’m not saying my wheel is better than Epicor’s but we wanted to be able to tightly restrict the data access and in the event of someone being suuuper determined we didn’t want to expose our app server directly in the DMZ.

With this facade exposed, even via brute forcing they shouldn’t be able to access anything beyond the very limited whitelisted BAQs/UBAQs we expose.
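The core of a facade like the one described above is a per-key whitelist check. Here's a minimal Python sketch of that authorization step; the key names and BAQ IDs are made-up examples, and a real implementation would load them from configuration or a key store:

```python
# Minimal sketch of a facade's authorization check: each issued API key
# maps to an explicit set of allowed BAQ IDs, and anything not listed
# is refused, so even a guessed request only reaches whitelisted
# queries. Keys and BAQ names below are illustrative.
WHITELIST = {
    "key-vendor-a": {"ExtOpenPOs", "ExtPartPrices"},
    "key-vendor-b": {"ExtShipments"},
}

def authorize(api_key: str, baq_id: str) -> bool:
    """True only if this key explicitly lists this BAQ."""
    return baq_id in WHITELIST.get(api_key, set())

print(authorize("key-vendor-a", "ExtOpenPOs"))    # True
print(authorize("key-vendor-a", "ExtShipments"))  # False: not on this key
print(authorize("unknown-key", "ExtOpenPOs"))     # False: unknown key
```

Because the check is a deny-by-default lookup, the full BO surface of the app server is simply unreachable through the facade, whatever credentials an attacker tries.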

I shared some of this here before

4 Likes

Eventually we will be doing this too via this middleware API. I'm still trying to figure out the function libraries in 10.2.500, but it seems like a good fit for this roadmap.

2 Likes

Yeah @Aaron_Moreng, we added Functions into the middleware once 500 came out. We are not live in 500 yet, but once we are it will support BAQs/UBAQs and Functions.

I’ll try to remember to update git with it.

2 Likes

That’s useful, thanks. I thought I’d seen something like that post somewhere but couldn’t find it when I went looking.

In general terms I’d always want to go through BAQs and uBAQs. It’s saved us a lot of headaches working that way within the domain and I wouldn’t want to change for external access. I’m not sure how feasible it is for us to mirror what you’ve done, though it would be nice to. Best of all worlds would be that same functionality but off-premise before the calls ever reach our server, but we’re nowhere close to that yet.

We use Azure Hybrid Connection. You can install the Azure Hybrid Connection Manager on your own on-prem server.

1 Like

We are on the GOVT Cloud. Epicor 10.2.700.12
I am hoping someone reading this can identify with our predicament and empathize.
We have discovered there are some personnel accessing the Epicor database to mine Engineering data (BOMs, primarily). I MUST put a stop to this, and I am getting little assistance from Epicor.
They do not have any two-factor authentication available, and I am not so sure that would solve the issue.
I also started pursuing the “Access Scope” feature in User Account Maintenance and found little regarding it.
I want to prevent access to the software via the API.
Any recommendations? Please note, I know just enough to be dangerous in some areas in this regard.
Any kind attention is appreciated.

Turn on the APIKey requirement in the sysconfig for V1 and V2 of Epicor.
That will probably break the Active Home Page, though.

How are they getting in? Weak passwords?

Service Security manages access per service or per service method, regardless of whether the access is via REST or not.

1 Like