Data directives and DMT

The docs for data directives say:

In-Transaction - Affects data before it is saved to the database. This directive type can only process one row at a time; it cannot process multiple dirty rows - rows that contain data not yet saved to the database. Multiple dirty rows are explained in the BPM Workflow Designer > General Principles section within the application help.

I’ve created an in-transaction data directive on ShipDtl to validate data coming in from the DMT Customer Shipment Combined template. My BPM is triggered once for every row in the DMT spreadsheet, but on every invocation ttShipDtl contains a record for every row in the spreadsheet, with Added() or Updated() true for all of them. This seems to contradict the documentation, and it makes my BPM O(n²) in the number of spreadsheet rows when it ought to be O(n). Why would this be?
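For reference, here is roughly what the directive's custom code is doing. This is a minimal sketch assuming standard Epicor BPM custom-code conventions; the specific validation and the fields it uses (OurInventoryShipQty, PackNum, PackLine) are just stand-ins for my real checks:

```cs
// In-transaction data directive on ShipDtl (Epicor BPM custom code sketch).
// Per the docs, ttShipDtl should contain only the single dirty row being saved,
// but when driven by DMT it contains a record for every spreadsheet row.
var dirtyRows = ttShipDtl.Where(r => r.Added() || r.Updated()).ToList();

foreach (var row in dirtyRows)
{
    // Placeholder validation; the real check compares against the DMT source data.
    if (row.OurInventoryShipQty < 0)
    {
        // Raising an exception here is what triggers the DMT retry behavior
        // described later in this thread.
        throw new Ice.BLException(string.Format(
            "Invalid ship quantity on pack {0}, line {1}.", row.PackNum, row.PackLine));
    }
}
```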

It could be that, when you use the combined DMT, the ShipDtl table gets hit more than once per row of DMT data.

For example, the combined Order DMT may create the first release when a new order line is created, and then update that release based on the OrderRel# fields.

You can use the trace in DMT to see which methods it is actually calling. I have had issues with data directives on DMT imports in the past.

Oh, nice. The trace revealed that what’s actually going on is a little weirder than I thought.

If the input rows all pass the data directive, DMT calls CustShip.Update once for each row in the input. But if Update raises an exception for any row, then DMT retries that same row as many times as there are rows remaining in the input. For example, if there are four rows and Update raises an error on the second row, then DMT will call Update with the data from the second row two more times, instead of calling Update with the third and fourth rows as expected.
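To make that call pattern concrete, here is a sketch of what the trace shows. This is not DMT’s actual code, just an illustration of the observed behavior: rows stands for the spreadsheet rows in order, and update stands in for the CustShip.Update call.

```cs
// Illustration of the call pattern observed in the DMT trace; NOT DMT's real code.
using System;
using System.Collections.Generic;

static class DmtRetrySketch
{
    // rows: the spreadsheet rows in order; update: stands in for CustShip.Update.
    public static void Run(IList<string> rows, Action<string> update)
    {
        int failedIndex = -1;
        for (int i = 0; i < rows.Count; i++)
        {
            // After the first failure, every remaining iteration re-sends the
            // failed row instead of advancing to the next spreadsheet row.
            var rowToSend = failedIndex >= 0 ? rows[failedIndex] : rows[i];
            try
            {
                update(rowToSend);
            }
            catch (Exception)
            {
                if (failedIndex < 0) failedIndex = i;
            }
        }
    }
}
// With four rows {r1, r2, r3, r4} and update throwing on r2, the calls made are
// r1, r2 (fails), r2, r2 -- rows r3 and r4 are never sent to Update.
```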