BAQ Timeout

We have a custom BAQ that times out when running. It's a custom pick list report, filtered to just one job. However, when I include all the operations in the filter, the report finishes with no error.

So far I have tried adding an execution setting (query time) and setting it to 10000. No luck.

I also noticed that when analyzing the BAQ, I get the following errors:

Any clue on how to fix this?

Check your joins for the tables referenced. It looks like they aren't set to use an index, which would improve BAQ performance.

This is familiar. I’d suggest that chasing indexes won’t be much help here.

Unless you're self-hosted and have accumulated deferred index/statistics maintenance, haven't provisioned enough RAM on the server, or haven't set SQL Server's max server memory to at least 8 GB less than the total available RAM; but that's another problem. Worth checking, though. These will inevitably cause serious problems that only the host can solve.
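For anyone self-hosted who wants to check that last point, the max server memory setting can be inspected and adjusted with `sp_configure`. This is just a sketch; the 57344 MB figure is an example for a 64 GB box, following the "8 GB less than total RAM" rule of thumb above:

```sql
-- Show the current max server memory setting (in MB)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';

-- Example: on a 64 GB server, leave ~8 GB for the OS and other processes
EXEC sp_configure 'max server memory (MB)', 57344;  -- 56 GB
RECONFIGURE;
```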

Otherwise, verify first. Copy that BAQ and start deleting things. Delete one joined table, run it, and note the performance. Don't save the copied query after deleting that one thing. Reload it, delete another thing, and so on. Odds are good you'll turn up one or two that make a huge difference.

JobMtl (for example) has design issues that lead the optimization engine to choose poorly in joins. Sometimes, if you have a variable receiving the job number you're looking for, adding an explicit WHERE condition like `AND JobMtl.JobNum = @jobnum` can help even though it's applied across a join. Or, select the fields you need from JobMtl off in its own little subquery jail with a WHERE condition on the job number, and join the limited results. JobProd can also be a boat anchor of a joined data source, but check whether that's true in your case before making big changes.
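As a rough SQL sketch of both ideas (the table names are real Epicor ones, but the selected fields and join shape are illustrative, not the poster's actual query):

```sql
-- Idea 1: a redundant-looking explicit condition on JobMtl can help the
-- optimizer, even though JobNum already flows through the join:
--   WHERE JobHead.JobNum = @jobnum
--     AND JobMtl.JobNum  = @jobnum

-- Idea 2: pre-filter JobMtl in its own little subquery "jail",
-- then join only the limited results:
SELECT jh.JobNum, jm.MtlSeq, jm.PartNum, jm.RequiredQty
FROM JobHead jh
JOIN (
    SELECT JobNum, AssemblySeq, MtlSeq, PartNum, RequiredQty
    FROM JobMtl
    WHERE JobNum = @jobnum        -- filter applied before the join
) jm ON jm.JobNum = jh.JobNum
WHERE jh.JobNum = @jobnum;
```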

2 Likes

Just to add, we were once on-prem but have now moved to the cloud.
As for the indexes, could this be done in Epicor (BAQ Designer)?

Yes. For joins, if there are existing indexes, they'd be under the Dictionary button. However, there isn't one between JobOper and JobMtl, but the relations I'd suggest are what I have here: Company, JobNum, AssemblySeq, and OprSeq - RelatedOperation.

In these situations (it was super quick, then I changed one thing and now it takes hundreds of seconds), I think I have found 100% of the time that just slapping a company criteria on a table will jolt it back into obedience.

Like so:
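In plain SQL terms, that added table criteria boils down to something like this (a sketch; `'MYCOMPANY'` is a placeholder, and in the BAQ designer you'd typically use the CurrentCompany constant instead of a literal):

```sql
SELECT jo.JobNum, jo.OprSeq, jo.OpCode
FROM JobOper jo
WHERE jo.Company = 'MYCOMPANY'   -- the added company criteria
  AND jo.JobNum  = @jobnum;
```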

Or if there’s a subquery, maybe do this there, too. I don’t think it would ever hurt.

Well, unless you need it to be cross-company. Eek, I don’t know what I’d do without this trick.

Maybe this:

Also, if you ever need to get into the weeds and you are on prem:

4 Likes

Well, here's something: when changing the output to Word, the report generates fine, lol. Output to PDF gives the error.

2 Likes

That might not be a BAQ timeout, then. Maybe SSRS is doing something intensive and times out? Or some other server-side issue is triggering a different timeout? How long does the SQL take to run in the BAQ? Not your perceived time; the execution time returned in the "problems" window.

Also, I tried the table criteria recommended by JasonMcD, and it worked using PDF output.

2 Likes

That's not an index; that's just a pre-programmed dictionary to make BAQs easier.

To see the indexes that are present, you look in the data dictionary.

Basically, if you can use all of the fields in one of those indexes in your join, you should. For example, use Company even if you’re only one company. This will help the system use the index and help the query run faster.
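For example, a join that uses all of the fields in that relation might look like this in SQL (a sketch; the OprSeq - RelatedOperation pairing is from the earlier post, and the selected fields are illustrative):

```sql
SELECT jo.JobNum, jo.OprSeq, jm.PartNum, jm.RequiredQty
FROM JobOper jo
JOIN JobMtl jm
  ON  jm.Company          = jo.Company   -- include Company even in a single-company site
  AND jm.JobNum           = jo.JobNum
  AND jm.AssemblySeq      = jo.AssemblySeq
  AND jm.RelatedOperation = jo.OprSeq    -- the OprSeq - RelatedOperation relation
WHERE jo.Company = @company
  AND jo.JobNum  = @jobnum;
```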

Yes, you’re correct. It’s not the actual indexes of the tables.

Looks like the issue is back! BAQ timeout error again.

There was an update (hotfix and instance update) pushed out to our environment yesterday to fix an issue (a classic menu error in certain modules). Not sure if that has anything to do with it.

When testing the BAQ in the BAQ Designer, it runs fine despite the index errors.

Maybe a model regen is needed?

Have you tried adding the recompile option? It can help some BAQs and has been referenced in multiple threads here on EpiUsers; I've linked one below.

1 Like

No luck with the recompile option.

These are the settings in place

2 Likes

Remove the Company criteria!

Ha, no I don’t know. It does feel like whack-a-mole. (No, I’m not adding the GIF.)

Seriously, though, this isn’t total shooting in the dark. As I have learned, the SQL engine (I made that up - is that a thing?) is not rigid or predictable.

It does what's called parameter sniffing, which is to say (I think) that the first time it runs, if it needs to process 3 rows of data, it optimizes the execution plan for that small amount of data. Then the next time, you give it 30,000 rows to chew on, and it does NOT scale well.

The experts will now rip apart that explanation, but the basic idea is that it can be just flat out unpredictable. So a way to trick SQL is to change the query so that it has to develop a new plan. That’s also the idea behind RECOMPILE. But that’s not always a slam dunk either, as you have found.
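For what it's worth, here is what the RECOMPILE trick looks like as raw SQL (the BAQ execution setting does the equivalent; the query itself is just an illustrative sketch):

```sql
-- Without a hint, the plan is compiled for the first @jobnum it sees
-- (great for 3 rows, terrible when the next run chews on 30,000).
SELECT jm.PartNum, jm.RequiredQty
FROM JobMtl jm
WHERE jm.JobNum = @jobnum
OPTION (RECOMPILE);   -- force a fresh plan on every execution

-- Alternative hint: optimize for "average" statistics rather than
-- the first sniffed parameter value:
-- OPTION (OPTIMIZE FOR UNKNOWN);
```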

3 Likes

No, you pretty much nailed it.

3 Likes