BPM to dynamically add multiple tasks


We are attempting to move our revision process into Epicor, using ECO and tasks. One of our primary goals is to ensure that certain steps, which aren’t always or even often required, get done when they ARE needed. So instead of cluttering the task set with a load of tasks that everyone will get used to ignoring, we only want to generate them when the results of earlier tasks make it appropriate.

When certain tasks are completed, an option is chosen from a drop-down (like “No released jobs” or “Jobs on floor”) that determines which new tasks get created, either immediately or after a milestone is created later in the task set. The options are stored in UD100 (“results”) and the new tasks to generate are stored in UD100A (“outcomes”).

Most of my experiments have caused infinite loops of task-creation, so I’m pleased to report my progression to a BPM-halting error instead. The error is the “another user changed this data” one, and it’s thrown when more than one task has to be created. If I break the loop after one run, everything works fine. Here’s the code that runs in Task.Update post-process after an eligible task is completed:

var svc = Ice.Assemblies.ServiceRenderer.GetService<Erp.Contracts.TaskSvcContract>(Db);

// get the set of new tasks to create, based on the completed task, the chosen result, and the current stage
var newtask = from task in ttTask
              join outcome in Db.UD100A
                on new { task.Company, id = task.TaskID, result = task.UDField<System.String>("Result_c") }
                equals new { outcome.Company, id = outcome.Key1, result = outcome.Key2 }
              join eco in Db.ECOGroup
                on new { outcome.Company, stage = outcome.ShortChar03 }
                equals new { eco.Company, stage = eco.CurrentWFStageID }
              where outcome.CheckBox01 == true && task.Key1 == eco.GroupID
              select new { task.RelatedToFile, task.Key1, task.Key2, task.Key3,
                           outcome.ShortChar01, outcome.ShortChar02, outcome.ShortChar03 };

// this.newds is a TaskTableset variable
// Get a new task row, set required fields, run ChangeTaskID to populate the rest, and update to save
// If I break this loop after a single run, everything is dandy
foreach (var row in newtask)
{
    svc.GetNewTask(ref this.newds, row.RelatedToFile, row.Key1, row.Key2, row.Key3);
    var newRow = this.newds.Task.Last<Erp.Tablesets.TaskRow>();
    newRow.Company = "GI";
    newRow.TaskID = row.ShortChar01;
    newRow.SalesRepCode = row.ShortChar02;
    svc.ChangeTaskID(ref this.newds);
    svc.Update(ref this.newds);
}

I assume the problem has something to do with when the update runs. I’ve tried moving that out of the loop, and moving both ChangeTaskID and Update out of the loop. No go.

I have a handful of message boxes for debugging in this and related directives, but not a single one appears when the changed data error happens – not even messages in the pre-process directive that enables THIS directive.

Can anyone shed some light on what the conflict is? Or maybe how to call that ServiceRenderer in Visual Studio so I can debug properly?

Have you tried doing a screen mod and accessing the Treeview and just hiding the ones that are currently not appropriate?

Why not add a count of the number of rows in “newtask”? Then you can count each new task created and break out when the number of tasks created equals the number of rows in “newtask”.

Mark, we’re interested in using the built-in “task created” alerts as well as the Mandatory option, so hiding them would just lead to confusion.

John, I actually tried several arbitrary counts. It throws an error if the loop runs more than once, regardless of how many rows are in newtask, so I don’t think it’s the counter.

I could be misunderstanding your goals here, but I suspect you need to do this in a pre-processing directive. With a pre-processing directive you should remove the svc.Update() call from the foreach loop and edit the ttTask dataset instead of newds.

By using a post-processing directive you will not be able to change the data going to the database; it’s too late, the data is already there. Also, by calling svc.Update within the directive you will trigger a new BPM to fire that will in turn call svc.Update, and I suspect that is where your infinite loop / recursion is coming from.
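Something like this, perhaps (an untested sketch; svc, newtask, and newds are the same variables as in the snippet above, and I’m assuming GetNewTask/ChangeTaskID behave the same way when called from a different directive). My guess is that calling svc.Update inside the loop re-sends rows that were already written, whose row versions are now stale, which is where the “changed by another user” error comes from:

```csharp
// Build every new task row first; save once, after the loop, so that
// already-saved rows never travel back into a second Update call.
foreach (var row in newtask)
{
    svc.GetNewTask(ref this.newds, row.RelatedToFile, row.Key1, row.Key2, row.Key3);
    var newRow = this.newds.Task.Last<Erp.Tablesets.TaskRow>();
    newRow.Company = "GI";
    newRow.TaskID = row.ShortChar01;
    newRow.SalesRepCode = row.ShortChar02;
    svc.ChangeTaskID(ref this.newds);
}

// One save for all of the generated rows.
svc.Update(ref this.newds);
```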

My understanding of the pre/base/post is as follows:

Pre Processing
Used to add new information to the dataset before it is checked by the BO and written to the database. This is a good place to populate user-defined fields and do other checks before the process can continue.

Base Processing
This is where you shoot yourself in the foot. A base processing directive replaces the BO code with your code; don’t do this unless you really know what you are doing…

Post Processing
Post-processing is where you can update other systems, add audit trails, etc. The code in the post-processing directive is called after the trigger method has completed successfully, i.e. the data is in the Db; now it’s time to update other systems, send notifications, etc.
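And if part of this does have to stay in post-processing, the usual way I’ve seen to stop a directive from re-firing on its own svc.Update call is a re-entrancy flag in the BPM call context (assuming the standard callContextBpmData fields are available in your custom-code block; Checkbox01 is just an arbitrary, otherwise-unused field here):

```csharp
// Re-entrancy guard: skip the task-creation logic when this directive
// was triggered by its own svc.Update call further up the stack.
if (!callContextBpmData.Checkbox01)
{
    callContextBpmData.Checkbox01 = true;
    try
    {
        // ... build the new tasks and call svc.Update(ref this.newds) here ...
    }
    finally
    {
        callContextBpmData.Checkbox01 = false;
    }
}
```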


Hope this is helpful