I have a new requirement: I need to develop an API that can quickly add 40,000 part records. It needs to accept a JSON payload containing 40,000 parts and insert them.
And how often do you plan to add 40,000 parts?
The exact timing wasn't specified, but the customer wants it as soon as possible. I'm looking for a better approach, because as far as I can tell the REST API can only iterate with GetNewPart to add parts one at a time.
Function taking a dataset as parameter
You’ll have to integrate on the back end; Epicor doesn’t have bulk-load BOs.
Not going through the API (directly or via DMT) would bypass business-object validation and could potentially corrupt the database.
It is possible to split the load into five batches of 8,000 and run them in parallel.
Thanks. I was thinking of enabling multithreading in a Function.
Thank you very much! I now plan to receive the parameters as a DataSet and call the GetNew API in batches.
I have these methods in our codebase that keep all of our part information in sync between companies. While not exactly what you are trying to do, you may find bits of this useful for your use case.
These methods break all of our parts into chunks of 50 and run 25 UpdateExt tasks in parallel. It still takes a little while to process 70,000 parts across 3 companies.
private async Task SyncPartsToCompanyAsync(string fromCompany, string toCompany, IEnumerable<string> companyParts, CancellationToken cancellationToken)
{
    // Get list of all product groups in the target company
    var prodGroups = (await mRestClientFactory.BuildRequest()
        .WithCompany(toCompany)
        .GetService<ErpBOProdGrupSvc>()
        .Get_GetListAsync("", 0, 0, cancellationToken)).ReturnObj.ProdGrupList
        .Select(o => o.ProdCode)
        .ToList();

    // Get list of all part classes in the target company
    var partClasses = (await mRestClientFactory.BuildRequest()
        .WithCompany(toCompany)
        .GetService<ErpBOPartClassSvc>()
        .Get_GetListAsync("", 0, 0, cancellationToken)).ReturnObj.PartClassList
        .Select(o => o.ClassID)
        .ToList();

    // List of update tasks, one per chunk
    var tasks = new List<Task>();

    // Throttle to 25 concurrent UpdateExt calls
    var throttler = new SemaphoreSlim(25);

    foreach (var parts in companyParts.Chunk(50).ToList())
    {
        // Wait for an available slot
        await throttler.WaitAsync(cancellationToken);

        // Add task
        tasks.Add(Task.Run(async () =>
        {
            try
            {
                // Update this chunk of parts
                await UpdatePartsAsync(fromCompany, toCompany, parts, prodGroups, partClasses, cancellationToken);
            }
            finally
            {
                // Release throttle
                throttler.Release();
            }
        }, cancellationToken));
    }

    // Wait for all tasks to complete
    await Task.WhenAll(tasks);
}

private async Task UpdatePartsAsync(string fromCompany, string toCompany, IEnumerable<string> parts, IEnumerable<string> prodGroups, IEnumerable<string> partClasses, CancellationToken cancellationToken)
{
    // Get rows for this chunk of parts from the source company
    var whereClausePart = $"PartNum in ({string.Join(",", parts.Select(o => $"'{o}'"))})";
    var rows = (await mRestClientFactory.BuildRequest()
        .WithCompany(fromCompany)
        .GetService<ErpBOPartSvc>()
        .Get_GetRowsAsync(whereClausePart, "", "", "", "", "1=0", "", "", "", "", "", "", "", "", "", "", "", "", "1=0", "1=0", "", "", "", "", 0, 0, cancellationToken)).ReturnObj;

    // Set the company
    SetCompany(rows, toCompany);

    // Reset conversion flags on part-specific UOMs
    rows.PartUOM.Where(o => o.PartSpecific == true).ToList().ForEach(o =>
    {
        o.ConvOverride = false;
        o.HasBeenUsed = false;
        o.EnableConvFactor = true;
    });

    // Add/update in the target company
    var result = await mRestClientFactory.BuildRequest()
        .WithCompany(toCompany)
        .WithUser("user")
        .GetService<ErpBOPartSvc>()
        .UpdateExtAsync(new IAI.Common.Epicor.Erp.ErpBOPartSvc.UpdateExt_input
        {
            Ds = GetUpdateDataset(rows, prodGroups, partClasses),
            ContinueProcessingOnError = false,
            RollbackParentOnChildError = true
        }, cancellationToken);

    // Throw if errors occurred
    if (result.Parameters.ErrorsOccurred == true)
    {
        throw new Exception(string.Join('\n', result.ReturnObj.BOUpdError.Select(o => o.ErrorText)));
    }
}
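For completeness, a hypothetical caller for the two methods above might look like this. The company codes, the timeout, and the source of the part list are placeholders, not from the original post:

```csharp
// Hypothetical usage of SyncPartsToCompanyAsync above.
// "MFG01"/"MFG02" and the part list are placeholder values.
var partNums = new List<string> { "PART-001", "PART-002" /* …the remaining parts… */ };
using var cts = new CancellationTokenSource(TimeSpan.FromHours(2));
await SyncPartsToCompanyAsync("MFG01", "MFG02", partNums, cts.Token);
```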
Is there a reason this NEEDS to be done with REST? I would think it would be easier to do with the DMT tool.
Post a small sample of your input data. (Just 2 or 3 parts worth.)
Thank you for the valuable example, and thanks to everyone. I will take a closer look at how to apply it, and then post the result so that people who run into similar problems can find it.
Yeah, if this is a one time load, DMT is the way to go.
I assume you have a use case where this isn’t possible.
I wonder if there is a way to leverage SysRev to reduce the sync time? If you kept track of the last SysRev pushed out, you could skip unchanged records altogether.
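A rough sketch of that watermark idea, assuming the Part rows expose a numeric SysRevID column through the generated REST client, and that LoadLastSysRevAsync/SaveLastSysRevAsync are your own persistence helpers (both names are invented, not Epicor APIs):

```csharp
// Hypothetical sketch: only pull parts whose SysRevID is newer than the
// last value we synced. SysRevID is assumed to be numeric here.
var lastSysRev = await LoadLastSysRevAsync(fromCompany, cancellationToken);
var whereClausePart = $"SysRevID > {lastSysRev}";
var rows = (await mRestClientFactory.BuildRequest()
    .WithCompany(fromCompany)
    .GetService<ErpBOPartSvc>()
    .Get_GetRowsAsync(whereClausePart, "", "", "", "", "1=0", "", "", "", "", "", "", "", "", "", "", "", "", "1=0", "1=0", "", "", "", "", 0, 0, cancellationToken)).ReturnObj;

// Sync only the changed parts, then advance the watermark.
var changedParts = rows.Part.Select(o => o.PartNum).ToList();
await SyncPartsToCompanyAsync(fromCompany, toCompany, changedParts, cancellationToken);
await SaveLastSysRevAsync(fromCompany, rows.Part.Max(o => o.SysRevID), cancellationToken);
```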
Great idea. I will try that once the sync time becomes unreasonable.
And just one more idea: you could also apply a tag to a part when it changes (via a Data Directive), and then select just those for syncing. Use your update routine as-is and clear the tag once synced.
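A sketch of that tag-and-clear approach, assuming a Data Directive sets a UD checkbox on Part whenever the row changes. The field name SyncPending_c is an invented example:

```csharp
// Hypothetical sketch: a Data Directive flags changed parts with a UD
// checkbox (SyncPending_c is a made-up field name). Select only the
// flagged rows and run the existing update routine on them.
var rows = (await mRestClientFactory.BuildRequest()
    .WithCompany(fromCompany)
    .GetService<ErpBOPartSvc>()
    .Get_GetRowsAsync("SyncPending_c = true", "", "", "", "", "1=0", "", "", "", "", "", "", "", "", "", "", "", "", "1=0", "1=0", "", "", "", "", 0, 0, cancellationToken)).ReturnObj;

var taggedParts = rows.Part.Select(o => o.PartNum).ToList();
await SyncPartsToCompanyAsync(fromCompany, toCompany, taggedParts, cancellationToken);
// After a successful sync, clear SyncPending_c on the source rows
// (e.g. via another UpdateExt call) so they are skipped next time.
```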
The part information is generated by the old system, and it will be synchronized to Epicor after it is generated.
The data sample hasn't been finalized yet; we're currently discussing possible options, as we don't have any experience with this.
You can still run DMT in an “Upsert” mode (Add or Update), so it adds new records or updates existing ones.
Are there any use cases for this feature?
Plenty.