I have been working on a proof of concept to import some log file data from our CNC machines. I have a process working using a combination of PowerShell script and SQL / BCP to import the data and it all works fine. The next stage is to make this accessible in EPICOR so I can match it to jobs etc.
The really simple solution appears to be a direct import into a UD table (using said BCP command). I have tried this on a test database and it appears to work, but it seems like this has “bad practice” written all over it.
So really my question here is: is it OK to import data directly into a UD table such as UD01, UD02, etc.?
I know that I have other options (automated DMT, external data sources, etc.), but I like simple, so I just want to see if this approach is OK first.
Personally, I would look into using REST calls to import your data into the UD tables. There are built-in business methods that will make sure you did everything correctly if you go through the REST service.
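As a rough sketch of that approach, the snippet below builds a UD01 row as JSON and POSTs it to the UD01 business-object service that Epicor exposes over REST. The server URL, company ID, and the mapping of log fields onto Key1/Character01/Number01 are all assumptions to adapt to your own environment:

```powershell
# Sketch: push one row into UD01 through Epicor's built-in REST service
# instead of writing to SQL directly. Server URL, company, and the
# field mapping below are placeholders -- adjust for your site.

function New-UD01Payload {
    param(
        [string]$Company,
        [string]$MachineId,   # stored in Key1
        [string]$JobNum,      # stored in Character01
        [double]$CycleTime    # stored in Number01
    )
    # Key1..Key5 form the unique key of a UD01 row; a timestamp in Key2
    # keeps repeated readings from the same machine distinct.
    @{
        Company     = $Company
        Key1        = $MachineId
        Key2        = (Get-Date -Format "yyyyMMddHHmmss")
        Key3        = ""
        Key4        = ""
        Key5        = ""
        Character01 = $JobNum
        Number01    = $CycleTime
    } | ConvertTo-Json
}

function Send-UD01Row {
    param(
        [string]$BaseUri,          # e.g. https://your-server/EpicorERP
        [pscredential]$Cred,       # Epicor user with rights to UD01
        [string]$Json
    )
    # POSTing to the UD01Svc endpoint runs the normal business-object
    # validation that a raw BCP insert would bypass.
    Invoke-RestMethod -Uri "$BaseUri/api/v1/Erp.BO.UD01Svc/UD01s" `
        -Method Post -Credential $Cred `
        -ContentType "application/json" -Body $Json
}
```

Usage would look something like `Send-UD01Row -BaseUri "https://your-server/EpicorERP" -Cred (Get-Credential) -Json (New-UD01Payload -Company "EPIC06" -MachineId "CNC-01" -JobNum "JOB12345" -CycleTime 42.5)`, looping over your parsed log lines.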
I think just don’t go there. Arguably UD tables are the lowest risk, but direct SQL writes are just not OK for Epicor.
You could use PowerShell with REST to get the data in there, but wherever possible I’ve moved to storing this kind of data outside Epicor altogether and accessing it via an external BAQ when needed. It feels so much cleaner.
PowerShell has the Invoke-RestMethod cmdlet, which makes it easy to use instead of BCP. When you get to 10.2.500, you can use Epicor Functions to do the hard work for you: pass in a JSON string and have the function load the UD table.
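To make the Invoke-RestMethod idea concrete, here is a minimal sketch that parses a CNC log file and posts each line as JSON to an Epicor Function. The library name (`CncLogs`), function name (`ImportCncLog`), CSV layout, and API key are all hypothetical — substitute whatever your function and log format actually look like:

```powershell
# Sketch: replace the BCP step with Invoke-RestMethod calls to a
# hypothetical Epicor Function "ImportCncLog" in a "CncLogs" library.
# The log layout (machineId,jobNum,cycleMinutes) is an assumption.

function ConvertFrom-CncLogLine {
    param([string]$Line)
    # Assumed log format: machineId,jobNum,cycleMinutes
    $parts = $Line -split ','
    @{
        MachineId = $parts[0].Trim()
        JobNum    = $parts[1].Trim()
        Cycle     = [double]$parts[2]
    }
}

function Invoke-CncImport {
    param(
        [string]$Server,   # e.g. https://your-server/EpicorERP
        [string]$ApiKey,
        [string]$LogPath
    )
    foreach ($line in Get-Content $LogPath) {
        $body = ConvertFrom-CncLogLine $line | ConvertTo-Json
        # Epicor Functions are exposed under
        # /api/v2/efx/<company>/<library>/<function>
        Invoke-RestMethod -Method Post `
            -Uri "$Server/api/v2/efx/EPIC06/CncLogs/ImportCncLog" `
            -Headers @{ "x-api-key" = $ApiKey } `
            -ContentType "application/json" -Body $body
    }
}
```

The nice part of this split is that all the UD-table logic (key generation, duplicate handling, matching to jobs) lives inside the function on the server, so the PowerShell side stays a dumb file-reader.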
And if you want to get really fancy, you could send the machine data into Azure (IoT) and fire off events to update the jobs in Epicor. The added benefit is that you can then do monitoring and analysis on the aggregate data.