I have an updatable dashboard that works very well. The only challenge is that it takes too long to save. What could be the problem?
Have you thought about using DMT to mass-update this data instead?
@Tomas We are afraid of giving users DMT because they can easily misuse it, like deleting transactions.
When creating journals, Paste Update is very fast, which I thought would be the same on my updatable dashboard.
What update method is your BAQ using? Have you tried doing a trace to see what methods and BPMs are being triggered?
OK, and it looks like it's not custom code since it's named "##BASE##". I'd still suggest running a trace log, as updating sales orders typically hits many methods and tables. That could be why it's slower than journal entries.
@JaneToo A few things could help. Updating pricing causes Epicor to do a lot of validations, so you can try setting Override Price List.
If the Ready to Process flag is set, you are also creating a booking detail record for every change.
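If you want the price list override to happen automatically instead of relying on users to tick it, one option is a Pre-Processing BAQ Directive on the uBAQ's Update method that flips the flag on every dirty row. This is just a sketch, and it assumes your BAQ exposes OrderDtl.OverridePriceList; the column name in the directive temp table is my guess, so check the ttResults schema in your own directive:

```cs
// Sketch only: custom code in a Pre-Processing BAQ Directive on the uBAQ Update method.
// Assumes the BAQ includes OrderDtl.OverridePriceList, which would show up in the
// directive temp table as OrderDtl_OverridePriceList -- verify against your query.
foreach (var row in ttResults.Where(r => r.RowMod == "U"))
{
    // Skip the price list lookup/validation Epicor would otherwise run for each row.
    row.OrderDtl_OverridePriceList = true;
}
```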
UBAQs have a Multi-threaded save under actions. I have never seen a speed difference in any of my dashboards, but it might help.
I have disabled the Ready to Process check box and the time has come down from 1 hour to 25 minutes. What else should I check to bring it down to less than 10 minutes?
Did you try Lock Unit Price or Override Price list?
Try setting Allow Multiple Row Update and then Actions > Multi-threaded save.
I’m waiting on the outcome of this one. This might be an opportunity where it could make a difference.
Curious - are you saying 25-60 minutes per row? Or for the whole list? Assuming the whole list.
So, I ran into a similar-ish scenario when trying to update the MfgComment field on JobMtl.
In that case I was making an Epicor Function, but still, I was using a business object to update a row, just like you are. It would take 30 seconds per row, and DMT was the same way. The problem was that the system’s methodology was to pull in the entire Job (the JobEntry BO) and then update a single item in it. I learned there how to pare it down to the single row I needed and now it works in milliseconds.
So… I don’t really know how to apply the same logic here to a uBAQ, but maybe someone else does…?
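For reference, the shape of what I started with in the Function was roughly this. It's from memory, so treat the method names and signatures as approximate, and jobNum/asmSeq/mtlSeq/newComment are just stand-ins for the function's inputs:

```cs
// Rough sketch of the slow version (Epicor Function custom code).
// GetByID drags the entire job -- every assembly, material, and operation -- into the
// tableset just to touch one JobMtl row.
this.CallService<Erp.Contracts.JobEntrySvcContract>(svc =>
{
    var ds = svc.GetByID(jobNum);                        // whole JobEntry tableset
    var mtl = ds.JobMtl.First(m => m.AssemblySeq == asmSeq
                                && m.MtlSeq == mtlSeq);  // the one row I care about
    mtl.MfgComment = newComment;
    mtl.RowMod = "U";
    svc.Update(ds);                                      // one dirty row, but you paid for the whole job
});
```

The fix was to retrieve only that row (a GetRows-style call with the JobMtl whereClause narrowed to the job/assembly/material and the other tables filtered out) before flipping RowMod and calling Update. That took it from ~30 seconds per row to milliseconds.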
I don’t think it really applies here, as she’s just using a plain UBAQ, and UpdateEx.
@klincecum It is possible a stripped-down custom update would be faster, since it could get around some of the order updating that gets triggered, like the price lookup, which I suspect is the drag on this. But then the problem becomes that you have to make some of those updates yourself so the header summary still matches the order detail after processing.
It does on this BO @klincecum (it shouldn’t, but it does). This is an instance where Epicor “tested” with a tiny job, said “works for me”, and pushed it. But they have bad logic in the BO that should be filtering and isn’t. In a tiny job it’s not noticeable, but as soon as your job starts getting large, it’s pulling a ton of data it doesn’t need to, which really slows that BO down unnecessarily. I had some pretty simple UD fields that I wanted to change, but ended up having to do DB updates because the Job BO, while it would do it, was unusably slow.
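For completeness, this is roughly what that “straight to the DB” route looks like. Sketch only: the field and variable names are placeholders, and it bypasses every JobEntry validation and BPM, which is exactly why it’s fast and exactly why I’d only use it for simple fields the BO doesn’t need to recalculate.

```cs
// Sketch: direct table update from server-side custom code (Function/BPM) where Db is in scope.
// MyUDField_c is a placeholder for whatever UD column you added; jobNum/asmSeq/mtlSeq are inputs.
using (var tx = IceDataContext.CreateDefaultTransactionScope())
{
    var mtl = Db.JobMtl.With(LockHint.UpdLock)
                .FirstOrDefault(m => m.Company == callContextClient.CurrentCompany
                                  && m.JobNum == jobNum
                                  && m.AssemblySeq == asmSeq
                                  && m.MtlSeq == mtlSeq);
    if (mtl != null)
    {
        mtl.MyUDField_c = "new value";   // no BO validation runs, no BPMs fire
        Db.Validate();
    }
    tx.Complete();
}
```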
Right, and to clarify here, my scenario was not hijacking the update, but hijacking the GetRows (or whatever) that was used to get the original data from the BO.
But again, a uBAQ doesn’t have all that overhead, which is great of course, but I don’t know where that leaves you if the overhead is what is killing you.
And my suggestion may not even help, as the GetRows (or whatever) may not even be a problem.
Just throwing it out there.
I have not locked unit price or set override price list. This is what I have after Allow Multiple Row Update and multi-threaded save, indicating it will take around 20 minutes:
@JaneToo I can’t read the snip, but 25 to 20 is not much of a savings. It is 1:40 AM here, so I will not be online much longer. Can you send a clearer snapshot?