BPM likely causing performance issues on intercompany Customer Shipment Entry

We had a custom BPM made when we first implemented Epicor, and I believe it is causing issues when processing customer shipment entries. It only happens when saving intercompany shipment entries, and it isn’t consistent: we haven’t been able to narrow it down to one part, a certain quantity threshold, or anything really. Epicor will hang for up to 3 hours when saving. The lines seem to be entered within 1 hour of clicking save, but the program still hangs. Sometimes it will process within a few minutes with no issues. A suggestion from Epicor is shown below. I made these changes, but there was no difference in performance.

When joining any two tables in a BPM, one should always include the Company column, as it is part of the primary key of every table within Epicor.

So instead of:

```cs
from Part_Row in Db.Part where Part_Row.PartNum == ttShipDtl_XRow.PartNum
```

change it to:

```cs
from Part_Row in Db.Part where Part_Row.Company == ttShipDtl_XRow.Company && Part_Row.PartNum == ttShipDtl_XRow.PartNum
```

That should improve performance of the BPM

After that suggestion they informed me to post here.
I will gladly upload more images if anyone has a suggestion and needs more information.


You are not using the entire index in your query. You’d use Company, PartNum.


That’s what Epicor Support had suggested with the code snippet posted above the pictures, correct? I attempted that with a test scenario and Epicor still hung. If that’s the only problem with the code then maybe there’s a setup issue as well.

I would add the Company to the query, as well as filter by the RowMod in your foreach.
If Epicor is hanging on this, it may mean you have other issues, like your DB indexes needing to be rebuilt, stats updated, etc. Do you guys do maintenance on your DB?


I’m not 100% sure as I don’t have access to the server, but I believe we are performing maintenance on the DB on a regular basis. Although I will ask our IT Admin about that.

Also, I’m not sure what you mean by adding the company to the query. Do you just mean adding it to the “Where” clause, like so:

```cs
where Part_Row.Company == ttShipDtl_xRow.Company && Part_Row.PartNum == ttShipDtl_xRow.PartNum
```

And when you say filter by RowMod, do you mean that I should include the PartTran table in the code as well? I don’t believe there is a RowMod field in the Part table. Correct me if I’m wrong, though.

Yes, add the Company in the where clause, and in your foreach add a RowMod filter for the ttShipDtl.


Sorry for so many questions, but I’m new to coding in BPMs. Below is my new code. Is this what you were suggesting?

```cs
Erp.Tables.Part Part;

foreach (var ttShipDtl_xRow in (from ttShipDtl_Row in ttShipDtl
                                where ttShipDtl_Row.RowMod == IceRow.ROWSTATE_ADDED
                                   || ttShipDtl_Row.RowMod == IceRow.ROWSTATE_UPDATED
                                select ttShipDtl_Row))
{
    // Look up the Part record using the full Company + PartNum index.
    Part = (from Part_Row in Db.Part
            where Part_Row.Company == ttShipDtl_xRow.Company
               && Part_Row.PartNum == ttShipDtl_xRow.PartNum
            select Part_Row).FirstOrDefault();

    // Braces are needed here: without them, only the Part assignment runs
    // inside the loop, and the if statement executes once after the loop ends.
    if (Part != null && Part.PartsPerContainer > 0)
    {
        ttShipDtl_xRow.Packages = (int)Math.Ceiling((ttShipDtl_xRow.OurInventoryShipQty + ttShipDtl_xRow.OurJobShipQty) / Part.PartsPerContainer);
    }
}
```

If this is correct then I’m going to run some tests. I have to wait for the next time shipping has that error though.
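As a side note for anyone reading later, the package calculation itself is easy to sanity-check outside Epicor. This is a minimal standalone sketch of the same `Math.Ceiling` arithmetic with made-up quantities (the variable names here are illustrative, not Epicor fields):

```csharp
using System;

class PackagesDemo
{
    static void Main()
    {
        // Made-up numbers: 20 shipped from inventory + 5 from a job,
        // with 10 parts per container.
        decimal inventoryQty = 20m;
        decimal jobQty = 5m;
        int partsPerContainer = 10;

        // 25 / 10 = 2.5; Math.Ceiling rounds up, so a partial container
        // still counts as a whole package.
        int packages = (int)Math.Ceiling((inventoryQty + jobQty) / partsPerContainer);
        Console.WriteLine(packages); // prints 3
    }
}
```

Using `decimal` quantities matters: if both operands were `int`, the division would truncate to 2 before `Math.Ceiling` ever ran.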

Yes, this is what I was suggesting.


Can I just ask why you are using a foreach loop while dealing with one ttShipDtl record?
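For what it’s worth, if the BPM only ever receives a single modified ShipDtl row, the loop can be dropped entirely. This is a hedged sketch, not a drop-in replacement: it assumes the usual `ttShipDtl` temp table and `Db` context available in Epicor BPM custom code, and the same fields used in the code posted above.

```csharp
// Sketch: fetch just the one added/updated row instead of looping.
var ttRow = ttShipDtl.FirstOrDefault(r =>
    r.RowMod == IceRow.ROWSTATE_ADDED || r.RowMod == IceRow.ROWSTATE_UPDATED);

if (ttRow != null)
{
    // Use the full Company + PartNum index on the Part table.
    var part = Db.Part.FirstOrDefault(p =>
        p.Company == ttRow.Company && p.PartNum == ttRow.PartNum);

    if (part != null && part.PartsPerContainer > 0)
    {
        ttRow.Packages = (int)Math.Ceiling(
            (ttRow.OurInventoryShipQty + ttRow.OurJobShipQty) / part.PartsPerContainer);
    }
}
```

If multiple lines can arrive in one save (e.g. mass updates), the foreach is still the safer choice.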

We are having similar slowdowns with CSG code for shipment entry. If you are not using Packout, try going into Site Configuration and, under Modules > Shipping/Receiving, check the Disable Pack Out checkbox.
We are using the pack out screen, so disabling is not an option for us, but in our testing, when pack out is disabled, shipping speed increased dramatically. We are on version 10.1.600.14.


One comment to the original post… you may have done this but did not call it out so making a note for others reading this thread today or six months from now…

Anytime, anyone states they are not sure where a performance issue comes from - Measure!
All of these trace flags in appserver.config are relevant:

```xml
<add uri="trace://system/db/hits" />
<add uri="trace://ice/fw/BPM" />
<add uri="trace://ice/fw/perf" />
```

Disable BPM, run, establish baseline.
Enable BPM, run establish test.
Do a diff (PDT to excel?).

Now you have something definitive: you can say exactly where the time is going, no guessing 🙂


This code was written before I was hired here and I’m not exactly a professional programmer so I’m not 100% sure. I need to trace back the purpose of this, I just wanted to post this here to get a head start in case anything was obviously affecting performance.

Thanks for the suggestion. I’ll check with our shipping/scheduling people and see if this is a viable option for us. Also, we are on 10.1.500.32.

This is exactly what my next step was going to be. I definitely need to trace this back as to what purpose it is serving and log its performance. Thanks for the specific instructions.


I also suggest moving this BPM to the Ship method instead of MasterUpdate, and starting from the tt table in ShipHead. In addition to tracing, logging, and comparing, you can publish an info message after each code block to check whether your IF condition is actually working as you want.
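As a hedged illustration of the info-message idea: in Epicor 10 BPM custom code, a message can be published with `PublishInfoMessage`. The exact signature may vary by version, so treat this as a sketch and verify against your release:

```csharp
// Hedged sketch: surface a debug message from BPM custom code so you can
// see which branch executed. Signature is from Epicor 10 custom code BPMs;
// verify on your version before relying on it.
if (Part != null && Part.PartsPerContainer > 0)
{
    this.PublishInfoMessage(
        "Packages recalculated for part " + Part.PartNum,
        Ice.Common.BusinessObjectMessageType.Information,
        Ice.Bpm.InfoMessageDisplayMode.Individual,
        "", "");
}
```

Remember to remove or comment these out once you are done diagnosing, since each message interrupts the user.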




I appreciate everyone’s help and I’ll use your advice in the future, but after running some test cases it looks like this performance issue might stem from some code in a customization. It looks like our shipping module was customized quite a bit before my time here. We’re probably going to work around it by instructing our shippers to just open 2 instances of Epicor so they can continue their work. I’ll have to look into the customization at a later date.

When in doubt, measure 🙂
Then check the logs 😛

We have a joke internally: no logs, no error. It’s to encourage folks to do some diagnostics themselves before escalating internally. It’s one of the planks in our internal mentoring.
