Functions - Delay in DB reflecting changes after IssueReturnImpl.PerformMaterialMovement

Has anyone successfully encapsulated the various material issuance BO methods into a function? I’ve wrapped up all the methods found from the tracelog and everything works… sometimes. If I run my function and pull the job/assembly/mtl up in the Issue Material form, the Previously Issued field reflects what I ran through my function, but if I check the part’s bin stock level, it hasn’t gone down. Sometimes (not sure if this is reliable or not) if I wait 5, 10, 15 minutes, the stock level will magically update.

If I use the Issue Material form as normal, with the same parameters, the stock level is reflected immediately. I feel like I’ve run into this issue with other BO methods. I run all of the necessary methods and craft my table set, but ultimately can’t get something to commit itself to the DB.

Anyone have any tips or pointers?

Thanks!

How are you checking? Part Tracker? SSMS?

Some material transactions are deferred

Search the forum for "deferred updates" — there are several posts about it.


I never knew this…

Yup


Thanks for digging those links up.


Thanks for the links!

I tried placing the following code snippet

```cs
var context = Ice.Services.ContextFactory.CreateContext<ErpContext>();

Erp.Internal.Lib.DeferredUpdate libDeferredUpdate = new Erp.Internal.Lib.DeferredUpdate(context);
libDeferredUpdate.UpdPQDemand();
```

in a custom code block after my final call to Erp.IssueReturn.PerformMaterialMovement, then called the function via my REST client and received the following error:

```json
{
  "HttpStatus": 500,
  "ReasonPhrase": "REST API Exception",
  "ErrorMessage": "The underlying provider failed on Open.",
  "ErrorType": "System.Data.Entity.Core.EntityException",
  "CorrelationId": "bec8a4ad-adea-4b84-880e-6232a97f11f9",
  "InnerExceptionMessage": "Network access for Distributed Transaction Manager (MSDTC) has been disabled. Please enable DTC for network access in the security configuration for MSDTC using the Component Services Administrative tool."
}
```

Is this a side effect of calling the function remotely? Or am I not providing the correct context to the `Erp.Internal.Lib.DeferredUpdate` constructor? Passing the available `Db` object produces a type error: `Argument 1: cannot convert from 'EFx.CustomJobSet.Implementation.ILibraryContext' to 'Erp.ErpContext'`.

Edit: Or possibly a Cloud/SaaS issue?

We have (and do) use it successfully in Functions, though we reflected the Db object instead of using the context:

```cs
Epicor.Functions.IFunctionHost host = (Epicor.Functions.IFunctionHost)this.GetType()
    .BaseType.BaseType
    .GetField("host", BindingFlags.NonPublic | BindingFlags.Instance)
    .GetValue(this);
var Dbl = (Erp.ErpContext)host.GetIceContext();
Erp.Internal.Lib.DeferredUpdate libDeferredUpdate = new Erp.Internal.Lib.DeferredUpdate(Dbl);
libDeferredUpdate.UpdPQDemand();
```

Well, I’m not getting that error anymore, but my on-hand quantity still isn’t reflecting the change. The PartTran table does appear to be getting updated, since the Previously Issued field on the Issue Material screen shows the correct value.

So, similar to the response from @mikelyndersOKCC in the Automating Fulfillment Workbench Via Epicor Function - #4 by adaniell thread you referenced, I just needed a bit of patience while the previously deferred entries worked their way through. Calls to my function now reduce the on-hand quantity immediately.

This potentially explains other instances with similarly odd behavior. I’m sure this isn’t a panacea, but hopefully another arrow in the quiver! Thanks for your help! Epiusers is truly one of the best things about Epicor 🙂
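For anyone landing here later, here is a sketch that pulls the working pieces of this thread together. It assumes an Epicor Function custom code block with `System.Reflection` in the function's usings; the reflection trick and the `UpdPQDemand()` call are exactly as posted above, and the variable names are mine.

```cs
// Runs in the same custom code block, immediately after the final
// Erp.IssueReturn.PerformMaterialMovement call.

// The Db object exposed to Functions is an EFx ILibraryContext and cannot be
// cast to Erp.ErpContext, so reflect the function host to get a usable context.
var host = (Epicor.Functions.IFunctionHost)this.GetType()
    .BaseType.BaseType
    .GetField("host", BindingFlags.NonPublic | BindingFlags.Instance)
    .GetValue(this);
var erpContext = (Erp.ErpContext)host.GetIceContext();

// Flush the deferred part-quantity/demand updates so on-hand stock reflects
// the issue immediately instead of minutes later.
var deferredUpdate = new Erp.Internal.Lib.DeferredUpdate(erpContext);
deferredUpdate.UpdPQDemand();
```

Reusing the host's existing context also avoids the MSDTC error earlier in the thread, which presumably came from `Ice.Services.ContextFactory.CreateContext<ErpContext>()` opening a second connection and escalating to a distributed transaction.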
