Direct access to Microsoft SQL Database?

I still want my TEMP tables…hurry up and add temp tables to BAQs. LOL

1 Like

Thanks everyone so much for your feedback. We’ve decided not to allow write access to the MS SQL db, although based on our understanding of the risks it looks like we will allow read-only access for developers. My understanding of the reasons not to allow read-only access is:

  1. Security is bypassed (anyone with MS SQL access has access to all tables)
  2. Data may be stale (cached data may not have been flushed to the DB)

Please note we have a good number of system-level software engineers (C and Assembly devs). Some of these developers would like access to the DB so they can generate custom reports with tools they are familiar with and prefer. Ownership trusts these engineers with access to all data within Epicor and feels that giving them the access they want, so they can build reports to help the business, is the best path forward. I also understand there are other benefits to direct access to the DB, such as the ability to create views, which I understand can bring a pretty big performance improvement. I get where these developers are coming from, as I personally think it would be nice to be able to run a quick SQL query via Sequel Pro.

Assuming we’re not missing any other reasons not to give read-only DB access, we’re curious to better understand how caching works in Epicor. Is anyone familiar with it, or is there any documentation on this topic? As an example, I can’t imagine data would remain in an object in RAM and not be flushed to disk/db for more than a couple of seconds.

In giving direct read-only DB access, are there any recommendations on how to set it up? That is, should we create one MS SQL user with read-only access and give the credentials to all developers, create one user per developer, or something else? Are there any other best practices we should be following, given that we plan to give developers read-only access to MS SQL?

2 Likes

Any views should be created in a different db, to avoid potential upgrade issues and naming overlap.
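Something like this, with made-up names (EpicorReports for the reporting db, EpicorERP for the Epicor db, plus a couple of standard Erp tables):

CREATE DATABASE EpicorReports;
GO
USE EpicorReports;
GO
-- The view lives in the reporting db and reaches back into the Epicor db
-- with three-part names, so an Epicor upgrade never touches your objects.
CREATE VIEW dbo.OpenOrders AS
SELECT oh.OrderNum, oh.OrderDate, c.Name AS CustomerName
FROM EpicorERP.Erp.OrderHed oh
JOIN EpicorERP.Erp.Customer c
  ON c.Company = oh.Company AND c.CustNum = oh.CustNum
WHERE oh.OpenOrder = 1;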

2 Likes

Something else: use a Trusted Connection. Create a SQL user based on a Windows Security Group and put each developer’s login ID into the group. That way, if a developer leaves, you just disable the account. Shared passwords are generally a bad idea because they are difficult to change.
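The setup is only a few lines of T-SQL (the group and database names here are just examples):

-- Create a login from the AD group, then give it read-only access to the Epicor db.
USE master;
CREATE LOGIN [DOMAIN\EpicorReportReaders] FROM WINDOWS;
GO
USE EpicorERP;
CREATE USER [DOMAIN\EpicorReportReaders] FOR LOGIN [DOMAIN\EpicorReportReaders];
ALTER ROLE db_datareader ADD MEMBER [DOMAIN\EpicorReportReaders];  -- read-only, db-wide
GO

After that, membership changes happen in AD without touching SQL at all.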

“If there is no sys-struggle, there is no sysprogress”

  • Frederick Douglass…or Bart Elia, I don’t remember…
1 Like

I ditto @Mark_Wonsil on the common-sense MS / AD auth access group approach. You can also use that group to lock down payroll, which a lot of VPs get sensitive about for some reason.

The third item in the list @embedded mentioned - calculated fields. There are some fields that do not exist in the db and are calculated on the fly. Sure, you can emulate the calculations in some scenarios (until the next feature enhancement changes the math or the db structure).

Really, a lot of this is about future-proofing. If you enjoy constantly updating your processing and slowing your adoption of new releases, you are welcome to do so. I would not like explaining to my boss how we cannot absorb new features because we built complex integrations that went against the vendor’s best practices, but if you think you can handle it, you are the domain expert for your company and IT group.

To your specific question on the caching: there are two levels. I’ll use User as the example because I know that one off the top of my head and am in a hotel without source code in front of me.

In the beginning there was the UserFile table (V8 or before). It had the name, email address, password, and a few other columns. When the user sent up the password, it went through an encryption process to convert ‘Epicor123’ into some blob of characters. The blob of characters was compared against the db column, and if they were the same, the user passed and was let into the server. On each server call. Many times per form. Needless to say, the db (and dba) was not happy.

Then a cache was created. A static cache in the app server memory:

//Horrid over simplification
static Dictionary<string, string> Users; //username, password name value pairs

When the user logged in, the same encryption occurred and the result was compared against the cache. If a user changed their password, the static cache was refreshed with the new data. Performance improved, the dbas put down their pitchforks, and all rejoiced.

Then a second (or more) app server was added (or a second garden on a single server). Now a user would log into app server 1 to change a password, and when the load balancing gods routed them to a new app server, the user had an ‘incorrect’ password because the cached encrypted password differed from app server to app server. To solve this, a 5 minute refresh was created. On every app server, every 5 minutes, the cache would be cleared and refreshed from the db’s user table. That lovely dialog in User Maintenance would tell you to please wait 5 minutes for the password to refresh. It SORT of worked for User - with a horrid UX - but other caches could not wait like this. A new BPM, for example - anything based on the customization framework, really (Posting Engine, BPM, BAQ, UBaq, Configurator…) - could not handle a 5 minute delay.
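Conceptually the refresh was no more than this (a sketch of the idea, not the actual code):

// Each app server dumps its static cache on a timer, so a password change
// becomes visible everywhere within 5 minutes at worst.
using System;
using System.Collections.Generic;
using System.Threading;

static class CachedUsers
{
    static readonly object Gate = new object();
    static Dictionary<string, string> _users = new Dictionary<string, string>();

    // Clear everything every 5 minutes; the next lookup reloads from the db.
    static readonly Timer Refresh = new Timer(
        _ => { lock (Gate) _users = new Dictionary<string, string>(); },
        null, TimeSpan.FromMinutes(5), TimeSpan.FromMinutes(5));
}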

Finally, in 10.1 we tackled the farm issue for real; it is the first version I would want to use in a cluster.

There is a pluggable notification system for app server to app server messaging, and the cache is hooked into the update pipeline. When an update to user ‘embedded’ is detected, a note is sent to the cache to clear user ‘embedded’. How we notify the other app servers is the interesting part.

We support the default of db-based notification, but we also have a UDP version of direct app server notification and have played with a few other out-of-process caching systems whose names folks would probably recognize. These have never shipped because, honestly, the db one has performed so well and has not affected scale to any significant degree. I assume one day in SaaS I will have to add in another cache server, but for now, keeping administration simple was a higher priority.

Next, we built a base class for the cache that knows about the notification engine and sends it a note -

Psst - User Cache, Embedded updated their password

On their next db access, the other app servers would get a note that the entry was invalidated, and the remote servers removed / refreshed the data as appropriate to the cache usage.
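Stitched together, the pattern is roughly this (illustrative names only - a sketch of the concept, not the actual Epicor classes):

using System;
using System.Collections.Concurrent;

// Carries tiny "invalidate this key" notes between app servers
// (through the db by default, optionally over UDP).
interface ICacheNotifier
{
    void Publish(string cacheName, string key);   // tell the other app servers
    event Action<string, string> Received;        // a note arrived from a peer
}

abstract class FarmCache<TValue>
{
    readonly ConcurrentDictionary<string, TValue> _items = new ConcurrentDictionary<string, TValue>();
    readonly ICacheNotifier _notifier;
    readonly string _name;

    protected FarmCache(string name, ICacheNotifier notifier)
    {
        _name = name;
        _notifier = notifier;
        // A peer invalidated an entry: drop our copy so the next Get reloads it.
        _notifier.Received += (cache, key) => { if (cache == _name) _items.TryRemove(key, out _); };
    }

    protected abstract TValue Load(string key);   // e.g. read the User row from the db

    public TValue Get(string key) => _items.GetOrAdd(key, Load);

    // Called from the update pipeline: drop locally, then tell everyone else.
    public void Invalidate(string key)
    {
        _items.TryRemove(key, out _);
        _notifier.Publish(_name, key);            // "Psst - User Cache, Embedded updated their password"
    }
}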

Note here - if you go spelunking in JustDecompile or the like over a few releases, you’ll find we have actually built three versions of that cache base class. It was private, so we optimized algorithms and improved the API over time as more areas picked up the ‘app farm friendly’ cache. I suggested that as a class name, and for some reason the naming czars shot me down.

That last note is the critical one. We changed the API as other service owners wanted to leverage the cache. Who will start using it next release? No idea. Probably a service that needs more performance. Like one used a lot. Like a common one to modify and tweak in BPMs and integrate with by customers.

In other words, one most likely to hurt you if you go against a db table directly.

Hopefully that more fully explains the reasons I am rather against the idea. We spent a lot of time on REST and OData to make consuming data externally easier and less prone to breakage. We will continue to make more tweaks in the coming releases - for our ‘integrations’ like Active Home Page and Mobile CRM and a few more projects in the pipeline. So if you have external feedback, please raise it here or in the EUG. It probably aligns with the internal complaints, but it’s also nice to tie a customer request to an internal request to bump up priorities.

I don’t know what service will be the next to leverage the latest cache classes. I just hope it is not one you choose to integrate against.

So, just thinking out loud here, it sure would be kind of nice if Azure users could connect to their nearest region and leverage the Azure network interconnects to speed up the overall access times…especially in a global install scenario but even if staying on the same continent. Hmmm. :thinking:

2 Likes

Thanks so much everyone. If helpful, I’m not the decision maker; I’m just trying to stay informed so I can inform the decision maker (ownership). IT / design engineering wants direct DB access (the most technical group, which carries the most clout in the company), although it’s pretty clear from this thread and forum that you all are more knowledgeable / experienced on this topic than we are and are advocating not to give direct DB access. This leads to the question of why - what are the risks / concerns - so ownership can evaluate whether the benefits outweigh the risks. Based on my understanding and explanation to ownership, they seem ready to conclude read-only access is OK. I think the only possible concern ownership has at this point is access to payroll information, but I understand that the way we use Epicor you can’t tell anyone’s salary from the data (payroll is a lump-sum payment, not specific to any one person).

I’ll pass along the suggestions to IT regarding setting up access, and pass along the information you provided about caching. We really appreciate all of your help - thanks so much!

1 Like

This has been a very interesting thread for me. I’m more of a developer than an admin myself, so I frequently find myself chafing against Epicor’s standard tools.

For DB access though, what I’ve settled on within my little team is to allow SQL access for ad-hoc things. If we want to know something (for ourselves or for someone else), then a query of some kind is fine. If it’s going to be a regular one, for whatever purpose, then again it’s fine to get what you’re after with a SQL query, but then go and replicate it as a BAQ before using it for anything or passing it on to other users in any form.

There still remain some cases where a BAQ doesn’t do the job, but for us those are almost always very complicated queries, and they’re never time-sensitive, so caching isn’t an issue. We write those as views in a separate database calling back to the Epicor one and expose them via External BAQs as needed. In some cases they’re heavy enough that we write the content of the views to tables within the separate database overnight; the data is then fast for users to access, but they know it’s only accurate to the previous day.
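For the overnight ones the job is essentially just this (table and view names made up), run from a SQL Agent job after hours:

-- Rebuild the snapshot table from the heavy view; users query the snapshot.
TRUNCATE TABLE dbo.SalesHistorySnapshot;
INSERT INTO dbo.SalesHistorySnapshot
SELECT * FROM dbo.SalesHistory;  -- accurate as of last night's run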

I think it’s a fair point in this discussion, though, that the annoyances that come from ignoring Epicor’s tools outweigh the annoyances of using them in the long run. I’m hoping the choice of tools doesn’t keep changing too often, that’s all.

Eddie,

I just jumped in on this thread so I might have missed something. Why not set up a replicated database for them to bang on? We do all our reporting, development of external (read-only) apps, and BI analysis off our replicated db. It is current within a minute (worst case).

As long as you are not writing to the database (we don’t use bidirectional replication, if there is such a thing), it is a safe alternative. If you are going to write to the db, then you need to access the production db using the business objects, as mentioned.

1 Like

I would just say to know what you are trying to prevent. Running SQL against a replicated DB may prevent overwriting data, but it doesn’t prevent users from accessing data they aren’t supposed to see, or from crashing the server if the replica is on a shared DB server.

Access to data can be controlled with Row Level Security in MSSQL 2016 and higher.
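A sketch of what that looks like, assuming payroll lives in a table like Erp.PRCheck and readers belong to a Windows group DOMAIN\PayrollReaders (both names are examples):

CREATE SCHEMA Sec;
GO
-- Rows are visible only to members of the payroll group.
CREATE FUNCTION Sec.fn_PayrollPredicate(@Company nvarchar(8))
RETURNS TABLE WITH SCHEMABINDING
AS RETURN
    SELECT 1 AS allowed WHERE IS_MEMBER(N'DOMAIN\PayrollReaders') = 1;
GO
CREATE SECURITY POLICY Sec.PayrollPolicy
    ADD FILTER PREDICATE Sec.fn_PayrollPredicate(Company) ON Erp.PRCheck
    WITH (STATE = ON);
GO

Everyone else simply sees zero rows - no query changes needed on their side.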

Hi Eddie,

First, I’ve reread my posts and I apologize if I came across as a d*ck. I may have been a little rude.

Second, I’m approaching this from a staying-current point of view. Yes, SQL Server is great. Yes, it has a lot of great tools that work with it. But I’m taking a longer view because of the new upgrade cadence. If people choose not to stay current and lose the ability to get hot fixes, that is a management choice. Epicor has warned users of this for a while now.

Every modification we make here, we ask ourselves: How hard will this be to upgrade? How difficult is it to test? Will it work with changes to Epicor’s architecture in the future? Occasionally Epicor makes breaking changes, like the Product Configurator, but generally, if I stay within the tool set, I can reasonably expect upgrades to go more smoothly. If they don’t, I have some recourse with Epicor that I don’t have if I go against their recommendations. To us, this is our biggest risk surface. If we stray outside this guidance, we’ve given up our right to say “upgrades take too long,” because we chose a short-term gain over long-term stability.

Mark W.

1 Like

@Mark_Wonsil this kind of deserves a separate thread and possibly a corner sofa at Insights to chat about.
Obviously, now that we are in Azure, more knobs and sliders are available. I’d be lying if I said we are not looking at different opportunities for customers. Adding something of value that no other vendor gives you? Of course.
At the same time, we also want to improve migrations, quality in general, new business domain features, and performance. It’s a balance, as every IT person up here knows (if you have an infinite budget and always get to the bottom of your IT wishlist, let me know).
The replicated read-only db was just one ask that came up before (probably you on the feature request list).
It will be an interesting year or two, and I honestly appreciate so many of you being open with your struggles and helping each other - and us - build a better product and community.

3 Likes

@mark_wonsil I don’t think you’ve been rude; I really appreciate your feedback and the chance to understand your experiences and thoughts. Open dialog like this is super helpful and, I think, a really healthy way to communicate. From my perspective I’m trying to take a humble approach, in that I realize I don’t have the knowledge and experience that many of you do in this domain.

In my experience, saying this is right and this is wrong without understanding why (blindly following rules) can get us into trouble. If we understand the reasoning why something is right or wrong, then we can evaluate each unique situation to figure out the best way to proceed. Essentially that’s what we’re trying to do: gain an understanding of the risks - what can go wrong, why, and in what situations - so we can evaluate the best path forward for our unique situation.

Again, I genuinely value all of your input. I’m working with an Epicor consultant as we transition to E10 who follows this forum and has worked with several of you, and he told me how impressive you all were to work with. I’m amazed to be getting help from you all as I’m just a stranger on the internet :slight_smile:

1 Like

@dhewl welcome to the community. They can be as hard on fellow members as they are on us at Epicor, and just as helpful to both of us.
A lot of what you hear is decades of experience from having suffered through bad decisions - or easy, short-sighted ones. It’s the equivalent of yelling at your child before they run into the street. You don’t want to yell at them, but the PTSD and fear make you jump first and think later :sunglasses:
IT has been around as a discipline for a few decades. ERP less. Cloud even less. We don’t have the luxury of learning from the Egyptians building pyramids or the Romans building roads and aqueducts - that kind of engineering and process has had things figured out for hundreds of years. Everyone has their preferences for accomplishing the same work and identifying trade-offs, and what is important for you and others may differ. I think everyone up here is just trying to gain and share knowledge. Hopefully you gain and contribute, and I think all up here hope you dodge a few speeding cars.

4 Likes

I’ve just read through this thread. I’m just starting to get an understanding of REST and how it can be used as an additional tool for the business.

Would this be the best tool to use if I wanted to see which suppliers have which attributes (read only)? If I use REST, does the user calling the REST service also have to be an Epicor user with certain permissions?

Yes. And with REST v2, you can get really specific control with Access Scope.

So what is the best (Epicor-approved) way to access Epicor data outside of Epicor without an Epicor user account? (i.e., engineers need to see which suppliers have a particular attribute)

You could follow the MES model where those users all share a single logon. I would do a simple web app that prompts for a supplier and then returns the attributes. The credentials can be put into your web app (via Azure Key Vault, for example), and you can lock down that user with Access Scope.
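A rough sketch of the shape of that app - the vault URL, secret name, account, and server are all placeholders, and the v2 OData route shown is just the general pattern, so check the REST docs for your install:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

class SupplierLookup
{
    static async Task Main()
    {
        // Pull the shared account's password from Key Vault instead of config files.
        var vault = new SecretClient(new Uri("https://myvault.vault.azure.net/"),
                                     new DefaultAzureCredential());
        string password = vault.GetSecret("epicor-rest-password").Value.Value;

        var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.UTF8.GetBytes("restuser:" + password)));
        http.DefaultRequestHeaders.Add("x-api-key", "<API key limited by Access Scope>");

        // Ask only for the supplier fields the engineers need.
        string json = await http.GetStringAsync(
            "https://myserver/EpicorERP/api/v2/odata/MYCOMPANY/Erp.BO.VendorSvc/Vendors?$select=VendorID,Name");
        Console.WriteLine(json);
    }
}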

Mark W.

3 Likes

A very easy way is to use Excel and MSQuery to directly query the database. A simple query can be written with a parameter (tied to a cell). The SQL database will have to be set up as read-only for the user, and one would also have to be a Windows user on the network to utilize it.
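For example, with the ? as the parameter MSQuery ties to a worksheet cell (assuming the E10 supplier master, Erp.Vendor):

-- Look up one supplier; MSQuery substitutes the cell value for the ?.
SELECT VendorID, Name, City, State
FROM Erp.Vendor
WHERE VendorID = ?;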