BPM Automation - Help Needed

Hey All,

We are getting ready to integrate a program that will bring in our customers' POs (our sales demand) from our customers' portal and upload them into our system. The problem with the program is that it will overwrite the promise dates on our sales orders. I have been tasked with coming up with a solution to the problem.

My thoughts so far:

  1. Require the program to run at a certain part of the day, thus limiting the chances of error.
  2. Before the scheduled time for the program, have a scheduled BPM pick up all promise dates in the system and copy them to a UD field in the OrderRel table.
  3. After the program runs, have a second scheduled BPM fill the promise dates back in.

Foreseen issues:

  1. Timing for the program is going to be finicky because releases can vary from 2k lines to nearly 30k lines depending on which site is performing the update.
  2. Knowing when the upload is going to be done.

What I need:

  1. To know how to schedule a BPM to run.
  2. To know when the upload program is done.

Question:

Has anyone done something like this before, or do you know bits and pieces of the puzzle?
Thanks a ton for any insight on this one!

Instead of going that route, can you segregate your integration transactions with a condition in the BPM, and then just have the BPM flip the date back to where it started before the transaction finishes processing?

@Mike,
How would I segregate those transactions? I've never tried something like that before.

The specifics really depend on your circumstances, but at a high level: if you can find some data that tells you the transaction came from the integration and not from some other source, you can make the BPM run conditionally based on that data. The BPM designer has condition widgets for things like "this field changed from this value to this value" or "the method is called by this user."
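
For instance (a minimal sketch, assuming the integration logs in with a dedicated service account — the account name here is hypothetical), a custom-code condition inside a method directive might look like:

```csharp
// BPM custom-code condition sketch. This only compiles inside an Epicor
// method directive, where the Session object is injected by the server.
// "PORTALSVC" is a hypothetical service account used by the integration.
bool isIntegrationCall = Session.UserID == "PORTALSVC";

// Returning true routes the call down the "integration" branch of the
// directive; everything else follows the normal path.
return isIntegrationCall;
```

The same test can be done with the stock "the call is made by specified user" condition widget, with no code at all.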

1 Like

Hi Dylan,

Can you just use a UD field to store duplicate info on every original promise date and then use this duplicate info to either put it back to where you wanted it via a bpm or to analyze against it?

Nancy

1 Like

@Mike,

I'm going to have to reach out to them, as I have been supplied with very little documentation on their means of import. Thanks for helping with the brainstorm!

@Nancy_Hoyt,

I believe this is the way I'm going to take it currently. The plan would be to use an existing field called PrevReqDate in the OrderRel table and create two BPMs. The first BPM will push the current OrderRel.ReqDate into PrevReqDate; this would run before every upload. Then, after the upload, we would run the second BPM, which would port OrderRel.PrevReqDate back into OrderRel.ReqDate as long as a PrevReqDate is available. This should ensure that we always have a req date that can be restored. Thanks for the help!
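
A rough sketch of the stash step, assuming it runs as Epicor BPM/Function custom code (it will only compile server-side, where Db and Session are injected; filtering on OpenRelease here is an assumption to limit the scope):

```csharp
// Stash sketch: before the upload, copy each open release's current
// ReqDate into PrevReqDate so it can be restored afterward.
foreach (var rel in Db.OrderRel.Where(r =>
             r.Company == Session.CompanyID && r.OpenRelease))
{
    rel.PrevReqDate = rel.ReqDate;
}
Db.Validate();  // commit the changes

// The restore step is the mirror image: where PrevReqDate has a value,
// write it back into ReqDate.
```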

@ckrusen,

Have you built a BPM that ports from one field in a table back into another on the same table?

1 Like

Use a Function instead. These can be scheduled easily and you can have the code run specifically in the order you want (or use widgets to accomplish the logic in order).

4 Likes

@Mike,
Just heard back from the integration team. Apparently, they are using REST to upload the data. I didn't know REST could put things into a system; I've only ever used it to pull data down. The only way I see that working is if they are calling against an updatable BAQ in the system.

Functions can do it too. You have the full capability of BPM at your disposal, so you can use Business Objects to create, update, etc.

1 Like

@Mark_Wonsil,
So when they say REST, does that mean they are calling Functions? And if so, would those function calls stand out from normal entry methods in the transactions?

In functions, you have the choice to expose them via REST. Other objects are exposed as REST as well, as you know (DynamicQuery, SalesOrder, etc.).

“Normal entry methods” are just calling business objects too. So there’s no difference to the server. :man_shrugging: DMT calls business objects. Custom C# programs call business objects too. IMHO, EFx are more efficient since you can reduce the chatter on the wire and encapsulate the logic at the server instead of embedding it in the client - which would have to be tested on every upgrade.

The “API” you create can also be secured more easily and you can leverage Azure API Management to add throttling protection too.

1 Like

The only way they would look different from normal transactions is if you flagged them differently. Remember that with REST you are calling base Epicor methods, the same methods that are called by the application. You can build up a data record and call the Part.Update method to update a part record if you like.
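
As an illustration (a hypothetical sketch only — server URL, instance, company, API key, and credentials are placeholders, and exact endpoint paths and headers vary by Epicor version), updating a part over the v2 OData REST surface might look something like:

```csharp
// Hypothetical sketch of updating a Part record over Epicor's REST API.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

var client = new HttpClient
{
    BaseAddress = new Uri("https://yourserver/YourInstance/api/v2/odata/YOURCOMPANY/")
};
client.DefaultRequestHeaders.Add("x-api-key", "YOUR-API-KEY");  // placeholder
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
    "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes("user:pass")));

// PATCH the entity endpoint; server-side, this drives the same business
// object logic the client application would use to save a part.
var body = new StringContent(
    "{\"PartDescription\": \"Updated via REST\"}", Encoding.UTF8, "application/json");
var resp = await client.PatchAsync("Erp.BO.PartSvc/Parts('YOURCOMPANY','PART-123')", body);
resp.EnsureSuccessStatusCode();
```

The point being: to the server this is just another business-object call, which is why flagging (user account, call context, etc.) is the only reliable way to tell it apart.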

2 Likes

All,

I'm in the process of trying to figure out which specific methods they would be using to update sales order information. I assume it would be a mix of ERP.BO.SalesOrdHedDtlSvc and ERP.BO.SalesOrderSvc, but I would like to be 100% sure that is the case. To figure out which methods they are using, I would like to run a trace while creating a new sales order. The issue is that when I run a trace with all settings enabled, I get a lot of noise and not the specific information I'm after. Could someone help me figure out which specific trace settings to enable?

image

Thanks!

Well, the BEST way to trace (in my opinion) is to trace one thing at a time. ie…

  1. get into the Process (Order Entry).
  2. Turn on Trace Logging (no datasets yet) and clear the log
  3. press the New Order in order entry
  4. Look at the log file to see what happened… document it… then clear the log
  5. enter your data into the order header and save
  6. Do step 4 again. Only this time you will see it calls multiple things. Why? Because when you enter a customer, several things happen behind the scenes. Sometimes you need to do step 4 after every field, just in case you want to capture that step.
  7. Press new line, then step 4 again
  8. enter part number, check step 4
  9. enter Qty, check step 4
  10. enter price, check step 4
  11. Save, check step 4.

I think when doing it this way, you can narrow down the exact method you are trying to find.
All that said, you are probably looking for the method SalesOrder.Update, which is the method used to save any changes to the order.

1 Like

One more hint for cleaning up the Trace Log: Before you start the Trace, right-click on System Monitor in your system tray and choose Exit, to eliminate the repetitive ReportMonitor calls.

I only select Track Changes Only, and that usually has enough info to figure out what is happening. Then add more checkboxes if needed.

New problem with the integration: it appears that their program is going to delete all of my sales order releases and create new ones every time it runs. This means there is no way for me to pull my promise dates back in from the releases themselves. Does anyone have a good idea for how to work around this?

Action Plan:

  1. Create a BPM to store all relevant data getting changed into UD Table before update.
  2. Create a BPM to write promise dates back in after call has been made.

The first BPM is planned to be a pre-process that grabs all OrderRel rows related to the sales orders being updated. It will need to purge all release records in the UD table before it writes the new ones.

The second BPM will need to be a post-process that runs a C# custom code iterator for filling the promise dates back in.
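
A sketch of that iterator (assuming, hypothetically, that the first BPM stored the order number in UD01.Key1, the line number in Key2, and the promise date in Date01 — and matching on order and line rather than release number, since the releases are deleted and recreated and the release numbers may not line up):

```csharp
// Post-processing sketch: write stashed promise dates back onto the
// freshly created releases. Only compiles inside a method directive
// where ttOrderRel and Db are injected. The Key/Date column usage is a
// hypothetical mapping chosen for this example.
foreach (var tt in ttOrderRel.Where(r => r.RowMod == "A" || r.RowMod == "U"))
{
    var stash = Db.UD01.FirstOrDefault(u =>
        u.Company == tt.Company &&
        u.Key1 == tt.OrderNum.ToString() &&
        u.Key2 == tt.OrderLine.ToString());

    if (stash != null && stash.Date01 != null)
    {
        tt.ReqDate = stash.Date01;  // restore the stashed promise date
    }
}
```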

What I need:

  1. Help creating a function to purge entries in the UD table using widgets.
  2. Help creating a BO method caller to initiate the transfer of data from the OrderRel table to the UD01 table.

Thanks in advance for the help!
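
For the purge piece, custom code may end up simpler than widgets. A sketch, assuming (hypothetically, as above) that Key1 holds the order number as a string:

```csharp
// Purge sketch: remove previously stashed UD01 rows for a given order
// before writing a fresh snapshot. orderNum is a hypothetical variable
// holding the sales order number being refreshed.
string orderKey = orderNum.ToString();
foreach (var row in Db.UD01
             .Where(u => u.Company == Session.CompanyID && u.Key1 == orderKey)
             .ToList())  // materialize the list before deleting from it
{
    Db.UD01.Delete(row);
}
Db.Validate();
```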

I'm currently in the process of creating the first BPM; this is the layout I have so far:


The two show message sections are just troubleshooting portions and will be removed when I publish it to our live environment.

The Invoked BO Method is set up for Ice.UD01.UpdateExt, and is configured as follows:
image

The Update Table by Query is set up using the ds.OrderRel filtered for open releases and displays the company, ordernum, linenum, relnum, and reqdate. It is mapped to UD04LogRecords.UD01 table with the following configuration mapping:

I am receiving the following error when trying to save the BPM:

After troubleshooting I know that this error is coming from the Update Table by Query widget. Any ideas on what I can change to try and get around this?

The error was in my criteria statement for the query; I had it set to 1 instead of "true". Credit to this thread for the solution:

New problem: what I have built doesn't seem to be populating the UD table at all. The BPM is a pre-process because we need to grab the data before the integration data trickles in. I know there is data to grab, as I am able to get the message boxes mentioned earlier to populate. Am I missing a step?

Turns out I needed to put the update method after the Update Table by Query widget. After moving those two around, I get the following error: