This may be more of a discussion question rather than an “I need an answer” question, but it is something I need a little information on… with that preamble out of the way, let's continue:
If an SQL Admin creates a new table in the Epicor database (such as EpicorDB.Erp.MyCustomTable), what happens to it during an upgrade? Does Epicor just carry it over and call it a day, or would it be ignored and potentially nuked during the upgrade? Same question with regard to custom views (EpicorDB.dbo.MyCustomView) as well.
Would those custom objects be accessible using the standard Epicor tools, or would one need to use direct SQL Connections to access them?
Is creating custom objects in the EpicorDB even considered a ‘smart practice’?
What is the preferred way to go about something like this? This is what External Datasources are for, right?
Backstory: I recently came into a company that has roughly 100 custom tables in the EpicorDB, along with over 1,300 custom views. Those are not the _UD tables and their associated views either. These were all created by someone with SQL Admin privileges using SSMS. I am trying to figure out how much of a poop-show it is going to be around here when we upgrade out of 10.2.400 to [Insert whatever version we go to here] in the near future.
All of the poop. Not necessarily because Epicor is going to nuke all of your poorly considered DB edits (it could, they warn you as such), but because given the nature and scale of poor judgement demonstrated thus far, I’d imagine there’s even more awfulness in other areas.
I like how this sentence implies “person with DB admin access, but not actually a DB admin”. And by “like” I mean I just did the sign of the cross.
That is what I thought as well. When I came into this company and saw the things I saw, I was in shock. I cannot say I have ever had a need to create any objects in the database using anything other than the tools provided by Epicor (i.e., the _UD tables and such).
The sheer number of objects in our environment concerns me. The “Epicor” person they were contracting with prior to my arrival made so much of how this company works dependent on those custom objects. I don't see a path where an upgrade is feasible without recreating the same dumpster fire in the new database as well (which I am obviously not going to do).
@klincecum - I think I mentioned this during coffee after Insights earlier this year and your reaction was pretty spot on with what has been mentioned here so far. Part of me wants to say “Run Jason, Run!”, but at the same time I am thinking “…ehhhh…Job Security? ”
Good catch. I worded it that way for a reason.
I know I am not a DB Administrator. I know enough to get the job done, but there are probably better ways to go about whatever I am doing. If I am completely unsure about something, I don’t do it. I research, talk to experts, get general advice on the subject, etc. I stop, collaborate, and listen.
Fun times, right?
Anytime! I am always happy to lend out my misery to help others
We create a secondary DB to house extraneous views and tables that connect to other DBs and servers. They need to be brought in as an external BAQ, and you can't have them be updatable (well, at least not without a ton of BPM C# magic on a custom action).
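As a rough sketch of that layout (the database name, view name, and columns below are illustrative, not from a real environment; verify column names against your own schema before using anything like this):

```sql
-- Keep custom objects in a separate database, not inside the Epicor DB.
CREATE DATABASE CompanyCustomDB;
GO
USE CompanyCustomDB;
GO
-- A read-only view that reaches into the Epicor database with a
-- three-part name. Epicor would consume it via an External Datasource
-- / external BAQ; it is not updatable from the client.
CREATE VIEW dbo.OpenOrderSummary
AS
SELECT oh.Company,
       oh.OrderNum,
       oh.CustNum,
       oh.OrderDate
FROM EpicorDB.Erp.OrderHed AS oh
WHERE oh.OpenOrder = 1;
```

The point of the separate database is that nothing Epicor's upgrade tooling manages lives there, so the custom objects survive upgrades untouched.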
Another issue you'll run into: you can't mix and match external and regular data sources in the BAQ query designer.
Creating objects directly in the EpicorDB is bad, and probably against the maintenance agreement, so you'd risk losing support, if I read ours correctly. Not only that, but those objects wouldn't just automatically show up in the Epicor tools: everything in the base product is registered in the data dictionary tables and accessed through the supported, safe-to-use business objects.
That is kind of what my thought process was. “Job Security”. And there is a lot going on here that needs to be unscrewed, so I could stretch it out long enough to retire, heh.
It’s not ideal, but as long as every custom table and view is in a schema of its own (NOT Erp or Ice, or any of the “default” schemas in the default database), the migration and update tools should leave them alone.
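A minimal T-SQL sketch of that approach, using the example names from the original post (the `Cust` schema name is made up; test on a copy of the database first):

```sql
-- Create a dedicated schema for custom objects, separate from the
-- Erp/Ice schemas that Epicor's migration tools manage.
CREATE SCHEMA Cust AUTHORIZATION dbo;
GO
-- Move an existing custom table out of the Erp schema.
-- ALTER SCHEMA ... TRANSFER keeps the data and indexes intact;
-- only the object's schema changes.
ALTER SCHEMA Cust TRANSFER Erp.MyCustomTable;
-- Same idea for a custom view created in dbo:
ALTER SCHEMA Cust TRANSFER dbo.MyCustomView;
```

Note that any BAQs, reports, or other views referencing the old two-part names would need to be updated to the new schema afterward.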
It is a big task, but not impossible. Depending on the number of rows in these tables, they could fit into one UD table with the old table name as Key1. Then all of the routines are basically the same, with some repetitive editing.
I inherited something like this in the '90s in another package: a configurator spread across hundreds of mini tables. I had to upgrade them to current, and I pushed all of it into one keyed table.
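The consolidation idea above might look roughly like this (a hypothetical example: the source table, its columns, and the UD column mapping are all made up, and the UD01 column list is from memory, so verify it against your own schema; the supported path for loading UD data is the UD business object or DMT rather than direct inserts):

```sql
-- Fold rows from one old custom table into the generic UD01 table,
-- using the old table name as Key1 and the old primary key as Key2.
INSERT INTO Erp.UD01
    (Company, Key1, Key2, Key3, Key4, Key5, Character01, Number01)
SELECT t.Company,
       'MyCustomTable',                    -- old table name as Key1
       CAST(t.SomeKey AS nvarchar(50)),   -- old primary key as Key2
       '', '', '',                        -- unused key columns
       t.SomeValue,                       -- map old columns onto the
       t.SomeNumber                       -- generic UD columns
FROM Erp.MyCustomTable AS t;
```

Repeat per table with a different Key1 value, keeping a mapping document of which old column landed in which UD column.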