I need some advice on importing a large number of rows into a UD table so we can do further processing. The data is coming from an Android app (using REST) that we are using to collect stock-take data. The collected data is standalone, so while there shouldn’t be, there is the potential for data to be duplicated. That’s why we are attempting to use a UD table to gather all the data together before performing the final processing. I would rather avoid going down the Service Connect route if I can.
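For the duplicate concern, here's a minimal sketch (in TypeScript, since the app is React Native) of de-duplicating collected scans by a composite key before the final processing step. The record shape and field names are assumptions for illustration, not anything from the actual app:

```typescript
// Hypothetical shape of a collected stock-take scan; field names are assumptions.
interface ScanRecord {
  deviceId: string;
  partNum: string;
  binNum: string;
  qty: number;
  scannedAt: string; // ISO timestamp
}

// Keep only the last scan seen for each (device, part, bin) combination,
// so a re-sent batch doesn't double-count.
function dedupeScans(scans: ScanRecord[]): ScanRecord[] {
  const byKey = new Map<string, ScanRecord>();
  for (const scan of scans) {
    byKey.set(`${scan.deviceId}|${scan.partNum}|${scan.binNum}`, scan);
  }
  return [...byKey.values()];
}
```

The same idea works server-side once the rows land in the UD table, if you'd rather keep the raw uploads intact and dedupe during final processing.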
Ideally, once the user returns the handheld to the stock controller, the controller performs the finalize stock-take process on each handheld and it uploads the collected data.
Has anyone done this sort of thing before, be it just for fun (uploading bulk data to a UD table with REST) or otherwise?
Any suggestions appreciated.
If you’re in Office 365, check out the PowerApps/Flow stuff.
Sure @Hally, I do this all the time.
What are you looking for?
A platform? Or just wondering if it works?
Works great and it’s pretty fast for UD tables
Definitely a platform. I’ve been messing around with it using Swagger and Postman, but not really getting it. If you have an example, that would be very helpful indeed.
The current app is built using React Native. I’ve been working with the developer, creating the odd BAQ and testing other methods for them to use.
And even easier JSON processing with Epicor Functions. < cough, cough >.
I know, you’ll get there soon.
Jose may know some pre-10.2.500 tricks for reducing the round-trips…
Why not send each record as the transaction happens? Then there’s no bulk upload. Once they are done, either send an additional command from the handheld to finalize it, or process it in an Epicor screen.
Otherwise, if you really want to send it in bulk, send it as embedded JSON to a single field in a UD table. Then you can parse the JSON when it’s received and do whatever you want with it.
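As a rough sketch of that "one big JSON field" idea: pack the whole batch into a single string on the client, then parse it back out once it's in the UD table. Which character field you use, and its length limit, depend on your environment; everything here is illustrative:

```typescript
// Hypothetical per-line stock count collected on the handheld.
interface StockCount {
  partNum: string;
  qty: number;
}

// Pack an entire handheld's batch into one string, suitable for a single
// UD table character field (watch the field's length limit in practice).
function packBatch(handheldId: string, counts: StockCount[]): string {
  return JSON.stringify({ handheldId, counts });
}

// Server-side (or in a BPM/Function), parse the field back into records
// for further processing.
function unpackBatch(field: string): { handheldId: string; counts: StockCount[] } {
  return JSON.parse(field);
}
```

One upload per handheld, one row per batch, and the raw payload is preserved exactly as sent.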
Ideally, I’d love to reliably have scans hit Epicor straight up, but it is difficult to guarantee reliable network connectivity in our yards. Our app was built specifically to avoid this situation.
Your idea of using one field does have some merit though. I’ll take a look at that. Thanks.
I would think about improving your coverage. You are opening a bit of a Pandora’s box by making a custom app; the week it goes out you will get a request for it to have some new feature.
What our app does (a similar situation, I suppose, where connectivity is an issue) is collect the data and push it from the app to a CMS. Then the CMS batch-sends the data to the ERP via the REST API (to a UD table via a uBAQ).
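The batch-sending part of that store-and-forward setup can be sketched as a generic chunker plus a sender loop. The actual sender (endpoint, auth, uBAQ name) depends entirely on your environment, so it's stubbed as a callback here:

```typescript
// Split a list of records into fixed-size batches.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Send every batch in order; `send` would POST to your uBAQ's REST
// endpoint in practice (details omitted, environment-specific).
async function sendAll<T>(
  items: T[],
  send: (batch: T[]) => Promise<void>,
  batchSize = 100,
): Promise<void> {
  for (const batch of chunk(items, batchSize)) {
    await send(batch);
  }
}
```

Sending in order, one batch at a time, also keeps it easy to resume from the last successful batch if a call fails mid-run.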
Unfortunately, timber does not play nice with Wi-Fi, and constantly shifting large stacks of it make it difficult to have full coverage. You could argue that, to eliminate the issue, why not go 4G/5G, and that could be an option now, but we need the infrastructure to last at least until it’s been fully depreciated. Sometimes you’ve just got to work with the tools you have.
@Aaron_Moreng after @Carson 's initial post, I thought about it a bit and headed straight for the posts on JSON deserializing, and thought specifically about a uBAQ to do it. Thanks for confirming my thought process.
I prefer writes to UD tables over uBAQ. It’s great to have a record of the message sent and makes resubmitting records super easy. It’s so nice when you get that email that says “Yesterday something didn’t work quite right when I submitted.” to be able to have an exact record of the message that was sent.
Testing when using a UD table is super easy too. Say I’ve got an enhancement and I want to test it out. I simply go into live, grab the most recent 1000 commands from the UD table, copy them to Excel, and paste-insert them into test.
The only problem with UD table writes is adding a method to clean up the records so the table doesn’t fill up.
Maybe a good candidate for a Progressive Web App?
Also, timber logic can be very confusing. There are so many (k)nots involved.
Gosh, you and Dad jokes …