I'm after thoughts on how to speed up the database. When creating a job, the bigger the job (i.e. 100+ lines in the JobMtl table), the slower the insert from the BOM becomes. We have jobs with over 1,000 material entries. The same thing happens when inserting large amounts of data into a PO or SO: the more information, the slower it goes.
When running an update on a single field for a large job, it can take 2 minutes to update that one field. We see the same effect when running the SQL query from within MSSQL, so I believe the issue is the database design, not the SQL server/hardware.
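For anyone who wants to reproduce the timing from the SQL side, here is a minimal sketch of how you might measure the raw cost of the update in SSMS, outside the Epicor client and BO layer. It assumes an Epicor-style JobMtl table keyed by Company/JobNum; the company ID, job number, and column are placeholders, and you should only run something like this against a test copy of the database.

```sql
-- Hypothetical timing test: measure how long a single-field update takes
-- on a large job, independent of the Epicor client/BO layer.
-- Company, job number, and column are placeholders; adjust to your schema.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

BEGIN TRANSACTION;

UPDATE JobMtl
SET    RequiredQty = RequiredQty     -- no-op change, just to exercise the write path
WHERE  Company = 'MYCOMPANY'         -- placeholder company ID
  AND  JobNum  = 'JOB-001234';       -- placeholder job with 1,000+ material lines

-- Roll back so the test leaves no trace.
ROLLBACK TRANSACTION;

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```

If the elapsed time reported by SSMS is small but the same change takes minutes through the client, that points at the screen/BO layer rather than the database itself.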
Hi Richard,
Welcome!
Are you using the Kinetic client or the smart client?
Kinetic can be faster in a lot of cases.
Have you run the PDT to check your database performance?
I would log a case with support if your Performance and Diagnostic Tool is coming back clean.
Cheers
Patrick
Welcome aboard @RStute. It may be an indexing issue. Do you do any form of database maintenance?
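If you want a quick way to check whether maintenance is the problem, something like the sketch below (standard SQL Server DMVs, not anything Epicor-specific) will list the most fragmented indexes in the database. The thresholds are common rules of thumb, not official recommendations.

```sql
-- Rough index-fragmentation check using the standard SQL Server DMVs.
-- Thresholds here are common rules of thumb, not Epicor recommendations.
SELECT  OBJECT_NAME(ips.object_id)        AS TableName,
        i.name                            AS IndexName,
        ips.avg_fragmentation_in_percent  AS FragPct,
        ips.page_count                    AS Pages
FROM    sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN    sys.indexes AS i
        ON i.object_id = ips.object_id
       AND i.index_id  = ips.index_id
WHERE   ips.avg_fragmentation_in_percent > 30   -- usually worth a rebuild above ~30%
  AND   ips.page_count > 1000                   -- ignore tiny indexes
ORDER BY ips.avg_fragmentation_in_percent DESC;
```

If JobMtl and its related tables show heavy fragmentation or stale statistics, a rebuild/update-stats pass is worth trying before blaming the schema.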
I second @PatL's comment; both the PO and SO classic screens are not too performant with lots of lines.
I'd bet $1 it's not a database issue, but a screen or BO issue.
I would second this. If @RStute is on version 22.2, then the grid is most likely the issue. We are experiencing the same thing on Job Builds with over 100 parts - the Job Material grid becomes almost unusable.
This is one of many tests in 25.1 that I need to follow up on.
In the console, you can see that there are a bunch of network calls when using the grid.
If it works faster in Kinetic vs Classic, I would agree. There is one other possible culprit: if you're running Classic, it could be the network…
I've seen bad Epicor performance where a person's computer was connected to the pass-through port on their IP phone. We gave them a straight-through connection and everything improved; not to Kinetic speeds, but there was an improvement.