I’m trying to track down an issue with a customisation we are working on. The PDT (Performance Diagnostic Tool) is showing a very large execution time for a DynamicQuery: 260,000 ms, over four minutes! I am working with CSG on this, but thought it might be a good opportunity to use the e10Help collective mind to help other users (and myself) better understand this and learn something.
I am wondering if a difference in security models between the two environments may have an impact. Our Pilot environment uses single sign-on, whilst the Dev environment uses the Education Database with standard Epicor security.
Is it possible that single sign-on could cause delays like the ones we have seen? Or am I just totally barking mad? Based on our investigations so far, we appear to be using the same code at both ends, but the systems behave very differently.
The updatable BAQ in our Pilot environment has no value in the security ID field.
I can’t help directly, but I’ll be interested to know what comes of this.
We use Dynamic Query extensively. Some months ago we simultaneously upgraded from 10.1 to 10.2, changed server, and split the various components of our Epicor system to run on separate VMs. Once it was all configured right, everything ran much faster and smoother … except most of the dynamic query calls, which were 3-5 times slower.
A patient Epicor support person went through possibilities with me, but there were no security differences in our case. I don’t think Epicor ever found any single cause. I added SQL indexes and rebuilt BAQs until the problem went away. My overall conclusion was that BAQs don’t much like being ported between environments, and need surgery to force them to re-optimise.
We have also had some cases where even on a running piece of the system, a BAQ will suddenly slow right down (most memorably on a BAQ report being emailed to thousands of customers, which mid-session went from seconds to minutes per report), and again I can only conclude that sometimes the optimisation kicks in and gets it wrong.
So if anyone can give more clue why these things happen then I’ll be very glad to know.
Possibly, but I couldn’t find any trace of anything in the database itself, and naturally at that point I was in crisis mode.
In that case, as we can’t afford to have variability in that particular business process, I recreated the core query in SQL and rebuilt the report around an external BAQ, since which time it’s behaved itself. I’m not proposing anything based on that one anecdote, but it did highlight to me that you can’t assume BAQs in particular will remain static in their behaviour and performance.
Can I just ask how you used the DynamicQuery in your UI customization? The reason I ask is that one of the important performance factors is the number of executions. It may not apply in your case, since the difference in performance comes from running the same customization against different databases, but it is something to consider when trying to improve the performance of this technique. Check some good links in this thread.
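For anyone following along, the usual client-side pattern looks roughly like this. I’m writing it from memory, so treat the query ID, parameter name, and exact signatures as assumptions to verify against your release; the point relevant to performance is that the adapter is connected and the query executed once per user action, not once per generated row:

```csharp
// Rough sketch of standard DynamicQueryAdapter usage in a UI
// customization. "MyUBAQ" and "PartNum" are placeholders, not the
// actual BAQ from this thread.
using System;
using Ice.Adapters;
using Ice.BO;

private System.Data.DataTable RunBaq(string partNum)
{
    DynamicQueryAdapter dqa = new DynamicQueryAdapter(this.oTrans);
    try
    {
        dqa.BOConnect();

        // Fetch the parameter set once, fill it in, and execute once.
        QueryExecutionDataSet qeds = dqa.GetQueryExecutionParametersByID("MyUBAQ");
        qeds.ExecutionParameter.Clear();
        qeds.ExecutionParameter.AddExecutionParameterRow(
            "PartNum", partNum, "nvarchar", false, Guid.NewGuid(), "A");

        // One round trip to the server; calling this inside a per-row
        // loop multiplies the cost by the number of rows.
        dqa.ExecuteByID("MyUBAQ", qeds);
        return dqa.QueryResults.Tables["Results"];
    }
    finally
    {
        dqa.Dispose();
    }
}
```

If the customization executes the BAQ per line instead, the PDT will show one DynamicQuery entry per row, which is one of the first things worth checking in a trace.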
Thanks for your reply. I didn’t write the code, but the explanation I was given is that the Updatable BAQ adaptor is used to let the UI generate PO lines from information entered into another form. The form in question allows the user to enter many lines of the same part with different UOMs and quantities, and then generates the individual POLines.

It gets a bit more complicated: the data entry form lets the user select whether the quantity is an each or a pack. The conversion factor between pack and each is stored as an attribute on the part, or alternatively on the Supplier Price List, since different suppliers can supply the same part with a different piece count per pack. In short, it gives us the ability to enter two-dimensional UOMs. When using the form in Pack mode it generates one line per pack; for example, entering a quantity of 10 in the form produces 10 rows on the PO.
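To make the Pack-mode behaviour concrete, here is a minimal standalone sketch of the expansion logic as I understand it from the description above. All names are hypothetical, and the real customization drives this through the updatable BAQ rather than building rows directly:

```csharp
using System;
using System.Collections.Generic;

public static class PoLineExpansion
{
    // piecesPerPack comes from the part attribute, or from the Supplier
    // Price List when a supplier ships a different piece count per pack.
    public static List<(string PartNum, decimal Qty)> Expand(
        string partNum, int enteredQty, decimal piecesPerPack, bool packMode)
    {
        var lines = new List<(string, decimal)>();
        if (packMode)
        {
            // Pack mode: one PO line per pack, so a quantity of 10
            // entered on the form produces 10 rows on the PO.
            for (int i = 0; i < enteredQty; i++)
                lines.Add((partNum, piecesPerPack));
        }
        else
        {
            // Each mode: a single line for the entered quantity.
            lines.Add((partNum, enteredQty));
        }
        return lines;
    }
}
```

One thing this makes visible: if each generated line triggers its own BAQ update call, the server round trips scale with the entered quantity, which could matter for the timing question at the top of the thread.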