BAQ timeoutException Error - TranGLC table

Any time I test this query, it shows a TimeoutException error; this started happening after the last two Epicor updates. Epicor's suggested solution was that the query needs to be improved, so here is my query:

So, what I did was add a criterion (the FIRST one) that only returns data for the past year, to narrow the range of the data and make the query faster instead of bringing in all the information from every period in the table. I also took all the GLAccount AND's and combined them into one 'in list' to simplify the SQL.
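To illustrate (the generated SQL and the account values here are just placeholders, not the real query), the criteria amount to something like:

```sql
-- Hypothetical sketch of the simplified criteria; values are placeholders
SELECT t.GLAccount, t.RelatedToFile, t.FiscalYear
FROM Erp.TranGLC t
WHERE t.FiscalYear >= YEAR(GETDATE()) - 1            -- only the past year, not every period
  AND t.GLAccount IN ('6000-00-000', '6100-00-000')  -- the GLAccount AND's combined into one in-list
  AND t.RelatedToFile <> 'FATran'                    -- the NOT criterion that triggers the timeout
```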

But this error message still appears:

Even if I create an exception where I change the queryTimeOut from 120 to 720, the error still occurs. BUT if I uncheck the NOT criterion on RelatedToFile = FATran, the query runs properly; the problem is that I need the info with the NOT criterion checked.

It works well in Pilot but not in Live, and I'm on the Cloud version.

Any idea?

Thank you,
Luis P.

Hi Luis,

What I try to do, especially with PartTran and TranGLC, is to find the most limiting condition and put that in a subquery. What limits your records the most? The account number? The JournalCode? The RelatedToFile? Pick the most restrictive condition in that subquery BEFORE linking to the other table.

Then take the subquery and link it to the other file and make sure you link them with the BEST indexes for that file. Include all fields from left to right, starting with Company. You can see the available indexes in the Data Dictionary Viewer.
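As a rough sketch of that pattern (the company value is a placeholder, and since Key1–Key3 are stored as strings, the exact conversions to PartTran's typed fields will depend on your version):

```sql
-- Filter TranGLC down in a subquery first, then join PartTran on indexed fields
SELECT pt.PartNum, pt.TranType, sub.GLAccount
FROM (
    SELECT t.Company, t.Key1, t.Key2, t.Key3, t.GLAccount
    FROM Erp.TranGLC t
    WHERE t.Company = 'MYCO'               -- placeholder company
      AND t.RelatedToFile = 'PartTran'     -- the most restrictive conditions go here
      AND t.JournalCode = 'IJ'
) sub
INNER JOIN Erp.PartTran pt
    ON  pt.Company  = sub.Company          -- start with Company, left to right on the index
    AND pt.SysDate  = sub.Key1             -- Key1..Key3 are character fields, so
    AND pt.SysTime  = sub.Key2             -- conversions may be needed on these joins
    AND pt.TranNum  = sub.Key3
```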


What are the relationships on the join?

Why do you need that? RelatedToFile can't be FATran if it already equals "PartTran".

This is the relationship in the join:

Thank you,
LP

Try with Key1 = SysDate, Key2 = SysTime, Key3 = TranNum.

Same error message:

In TranGLC, add as the first conditions: Company = current company, BookID = primary book, FiscalYear, JournalCode = "IJ", GLAccount = the required GL account, and RelatedToFile = "PartTran". TranGLC is used for all GL postings and is the first or second most-used table.
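In raw SQL terms, those leading criteria would look roughly like this (the company, book, year, and account values are placeholders, not real data):

```sql
-- Sketch of the suggested leading conditions on TranGLC
SELECT t.GLAccount, t.Key1, t.Key2, t.Key3
FROM Erp.TranGLC t
WHERE t.Company       = 'MYCO'        -- current company
  AND t.BookID        = 'MAIN'        -- primary book
  AND t.FiscalYear    = 2023          -- placeholder fiscal year
  AND t.JournalCode   = 'IJ'
  AND t.GLAccount     = '6000-00-000' -- the required GL account
  AND t.RelatedToFile = 'PartTran'
```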

Based on posts by @Jason_Woods, @Mark_Wonsil, and others in another thread, I did something like this. The advice was to make a subquery of the largest tables and then use that sub to filter the original table. Here is the diagram for finding every GL expense from either PO or AP. It would never complete with either side alone, but with the subs it runs for 100K entries. It will not do 500K. :frowning: I think the final query that is actually run is optimized to execute the subs first and then filter based on them.

image


Right. I believe when you link everything in one query, it has to build all those relations first and then apply the WHERE clause. With subqueries, I think the SQL optimizer can make some improvements by building fewer joins. I got a query written by someone else down from 90 seconds to just under two seconds using subqueries.


The TranGLC design for PartTran, LaborDtl, and InvcDtl is really bad. Vantage's use of the GLJrnSrc table to store the link to GLJrnDtl was better. I avoid using Key1 as a date to link to PartTran, because between versions it has changed a lot, storing the US date format, the local date format, a short date, etc.


Since going to 600, I have found lots of issues with BAQs and cached plans. I typically alter the joining keys and things will work better, or try to find a way to use OPTION(RECOMPILE) with the query. I had one yesterday: it was OrderRel -> Dtl -> Hed and it worked fine. I added a where clause to Dtl and it would no longer run.
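For reference, in raw SQL the hint goes at the end of the statement. The OrderRel -> Dtl -> Hed chain below is a simplified sketch, and the added where condition is a placeholder:

```sql
-- Sketch: ask SQL Server to build a fresh plan instead of reusing a cached one
SELECT oh.OrderNum, od.OrderLine, orel.OrderRelNum
FROM Erp.OrderRel orel
INNER JOIN Erp.OrderDtl od
    ON  od.Company   = orel.Company
    AND od.OrderNum  = orel.OrderNum
    AND od.OrderLine = orel.OrderLine
INNER JOIN Erp.OrderHed oh
    ON  oh.Company  = od.Company
    AND oh.OrderNum = od.OrderNum
WHERE od.OpenLine = 1                 -- placeholder for the added where clause
OPTION (RECOMPILE)                    -- bypass the stale cached plan
```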