Error creating a contract in an external DLL

Hello everyone,

This is the first time I am developing a DLL to add to Epicor and call from a Function. In this DLL, I am trying to create the ‘CashHeader’, but I am getting an error that I haven’t been able to solve yet.

The error occurs where the CashRecSvcContract contract is created. The context being passed in is created correctly (I verified this by accessing its tables), but for some reason the contract is not created and I get the error ‘Value cannot be null’.

using (var context = (Erp.ErpContext)Ice.Services.ContextFactory.CreateContext())
{
    // Request the CashRec service contract from the renderer; this is the line
    // that throws "Value cannot be null".
    Erp.Contracts.CashRecSvcContract cashRecSvcContract =
        Ice.Assemblies.ServiceRenderer.GetService<Erp.Contracts.CashRecSvcContract>(context);
}

The DLLs I have added to my project are the following, and I’m not sure if I’m missing any:

Epicor.Ice.dll
Epicor.ServiceModel.dll
Epicor.System.dll
Erp.Data.910100.dll
Ice.Data.Model.dll
Erp.Contracts.BO.dll
Erp.Services.BO.CashRec.dll
Erp.Contracts.BO.CashRec.dll
Epicor.Customization.Core.dll
Epicor.Customization.dll
Epicor.Functions.Runtime.dll
EntityFramework.dll


Hello Alison,

What is the business problem that you’re trying to solve? Is this a Lockbox interface of some kind? Using Kinetic DLLs is not recommended after 2025.2. Tell us what you’re trying to do and let’s see if we can find a smoother, longer-lasting solution for you.


Hi Mark,

The core problem we are trying to solve is the high volume of HTTP requests hitting our application pool.

Currently, we have a dedicated Windows service that finalizes documents such as payments, sales, transfer orders, etc. The volume of generated documents produces more requests than the pool can handle, saturating it.

To mitigate this, our proposal is to develop a set of DLLs that perform the finalization processes in separate threads. These DLLs would be invoked from a Function called by a scheduler.

This architectural change aims to cut the flood of HTTP requests, preventing the application pool’s saturation and keeping the system stable.


Thanks, Alison. I can see that calling individual REST endpoints that emulate the client would certainly exhaust the application pool!

Which of the services listed (payments, sales, transfer orders, etc.) has the most volume? How many transactions/min?

For now, let’s take payments. You can still get to the architecture you want using Functions to reduce the number of HTTP requests. You could post all payment information to a UD table row in a single call. You could then launch a separate Function that processes that batch, recording errors in a separate field of the same record (sketched below). Everything would run at the server inside a Function (a DLL), reducing the number of HTTP calls.
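
Here’s a minimal sketch of that pattern as Function custom code. The UD01 field usage (Character01 holding the payment JSON, Character02 the error text, CheckBox01 a processed flag), the batch size, and the transaction helper are all assumptions for illustration; adjust them to your own schema and Kinetic version:

// Sketch only: UD01 field usage is an assumption, and the transaction helper
// follows common BPM/Function custom-code samples. Company filtering omitted.
using (var txScope = IceContext.CreateDefaultTransactionScope())
{
    // Pull a slice of unprocessed payment rows from the UD table.
    var pending = Db.UD01
        .Where(r => r.CheckBox01 == false)
        .Take(100)
        .ToList();

    foreach (var row in pending)
    {
        try
        {
            // Post one payment through the CashRec contract; the Functions
            // runtime supplies CallService, so no manual context handling.
            CallService<Erp.Contracts.CashRecSvcContract>(cashRec =>
            {
                // ...deserialize row.Character01 and drive GetNew/Update here...
            });
            row.CheckBox01 = true;          // mark the row processed
        }
        catch (Exception ex)
        {
            row.Character02 = ex.Message;   // record the error on the same record
        }
    }

    Db.Validate();      // persist the flag/error updates
    txScope.Complete();
}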


Hi Mark,

The entire document creation logic is encapsulated within its respective Function, so we do not make multiple individual REST calls (GetNew, Update, etc.); the BOs are called directly inside the Function.

Currently, the JSON request is stored in a UD table. The initial idea was to invoke the Function via a scheduler, but that would require creating N schedulers to simulate threading, which is infeasible given the data volume. To illustrate the scale, we generate an average of 8,000 sales order records daily, not counting other document types.

Given this constraint, the proposed solution is to develop DLLs that invoke the BOs as if they were native and process the documents on multiple threads. This would prevent the massive number of requests to the application pool and avoid saturating it.


@Ali45, are you getting the same error described in this thread?
Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached - Kinetic ERP - Epicor User Help Forum

I have run into this a few times now and am struggling to get to the bottom of it. I hope external DLLs are not the only solution…

Hey Alison!

Excellent. To improve throughput, you could expand that function to process more than one order per call, reducing the number of calls.

Great!

You could also invoke the function directly from another function rather than via HTTP.
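
For example, something like this; note that the InvokeFunction helper and the library/function names here are hypothetical, so check the Functions custom-code reference for your Kinetic version:

// Hypothetical helper and names: verify the exact signature for your version.
// One Function hands a queued batch to another, all in-process, with no HTTP hop.
InvokeFunction("PaymentsLib", "ProcessPaymentBatch", batchId);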

Looking back at the Task Agent, you can have up to three named task agents running at the same time. Some people have one agent for printing and another for everything else. You could have your input queue and set the maximum number of concurrent tasks to some reasonable number.

Now, if it is the Task Agent that’s eating up your pool, which is not clear from the posts above, there is another architectural idea: write a program that controls the flow from a queue external to Kinetic (not a UD table). This is basically what Deepak suggested in Tom’s post. It’s similar to the external DLLs without the baggage*. In this scenario, you would write a program that queues the incoming orders, maybe by company/plant, reads from the queue, and sends each request to the next available HttpClient (see the sketch below). If you’re thinking of external DLLs, then you probably have the programming talent in house for this kind of solution. Here are some links that may generate some ideas:

And with modern tools available in the .NET stack, like Aspire, you could have a dashboard that would monitor the queue length, etc.
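
To make the queue idea concrete, here’s a rough sketch of such a worker in plain .NET, using System.Threading.Channels and HttpClient. The endpoint path, API-key header, company/library/function IDs, and concurrency level are all illustrative; swap in your own values and add retry/dead-letter handling before using it for real:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Channels;
using System.Threading.Tasks;

class OrderQueueWorker
{
    // Bounded so a burst of incoming orders applies back-pressure instead of
    // piling unbounded work onto the Kinetic app pool.
    private static readonly Channel<string> Queue =
        Channel.CreateBounded<string>(10_000);

    static async Task Main()
    {
        using var http = new HttpClient
        {
            BaseAddress = new Uri("https://kinetic.example.com/server/")
        };
        // Illustrative auth: an API key header plus whatever Authorization
        // scheme your instance requires.
        http.DefaultRequestHeaders.Add("x-api-key", "<functions-api-key>");

        // A fixed number of consumers caps concurrent calls into the app pool
        // no matter how fast orders arrive on the producer side.
        var consumers = new Task[4];
        for (int i = 0; i < consumers.Length; i++)
            consumers[i] = ConsumeAsync(http);

        // A real producer would enqueue incoming order JSON here, e.g. from
        // RabbitMQ, MSMQ, or a folder watcher.
        await Queue.Writer.WriteAsync("{ \"orderPayload\": \"...\" }");
        Queue.Writer.Complete();

        await Task.WhenAll(consumers);
    }

    private static async Task ConsumeAsync(HttpClient http)
    {
        await foreach (var json in Queue.Reader.ReadAllAsync())
        {
            // POST to an Epicor Function: api/v2/efx/{company}/{library}/{function}.
            using var content = new StringContent(json, Encoding.UTF8, "application/json");
            var response = await http.PostAsync(
                "api/v2/efx/MYCO/OrdersLib/CreateSalesOrder", content);
            if (!response.IsSuccessStatusCode)
                Console.WriteLine($"Order failed: {response.StatusCode}"); // re-queue or dead-letter here
        }
    }
}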

* By baggage I mean it is a tightly coupled solution: with every Kinetic upgrade, you’ll have to recompile your program.