DMT Job Assembly with BOM Levels

I am trying to DMT Job Headers, Assemblies and Materials ready for our Go Live. I am struggling to bring in Subassemblies.
My DMT file brings in the first level Assembly (0) as the Job Header and then when I import the Assembly, it brings it in as the next level.

I now have some assemblies which need to link to these, but the DMT just brings them in at the bottom. The fields I am bringing in are Company, JobNum, AssemblySeq, Part, PartNum, QtyPer. The DMT file does not fail, but whatever I put into Parent gets overwritten and zero is populated in the table!

Any suggestions, this is causing me grief. Thanks :slight_smile:

What happens if you DMT the changes in? I think you could run the DMT as Update Only with the same file and see if it will change the parent. It’s not grayed out in the job, so I think it’s changeable.

Although I am surprised that it won’t put it in initially. I haven’t run a DMT to put assemblies in personally (only delete), I’ve only put materials in.


Doesn’t seem to matter what I do; it doesn’t populate the Parent field. I can go through and manually type it in on each line once I’ve DMT’d it in, but with 1000+ lines…! I have tried running it as an update, but it still doesn’t seem to populate.

But after looking at the field help, I wonder if there isn’t something extra going on to get those to change.


Yes, I’ve just spent the last hour using the ParentAssemblySeq field in DMT, which does seem to make a difference but doesn’t give the structure that manual entry does. Just trashing my job now to start again.

Would it be better to DMT the BOM into the part master and use Get Details on the job? Or does it have to match exactly?

Also, have you tried a trace when you change it in the Job Entry screen? Maybe you can see what field it’s using.

And are you populating both fields?

Ahh good idea on the trace. I would rather bring it into the Job as this is to bring in our starting (go live) jobs and they may be different…

I noticed this in the DMT documentation. I’ve never seen it before, but then I’ve never needed it. Maybe this needs to be set correctly?

Eeek weird. This would mean I would need to bring in each level individually and then at the same time as bringing in the subassembly, change the parent to have the child…

Yeah, that seems like a pain. It’s supposed to be maintained by the program, but I’m thinking DMT doesn’t get the right treatment. There still should be a way to make this work…

Where is your data coming from?

From a previous SQL database, so I have lots of options with that. I already have one DMT file for the JobHead, one for the first-level assemblies, one for the 2nd level, etc., down to the 7th level, and then one for all of the material!
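Since the source is already in SQL, the per-level split described above could be scripted rather than maintained by hand. A minimal sketch (hypothetical column and file names, and it assumes each extracted row carries its parent’s AssemblySeq, with 0 meaning the assembly hangs off the job header) that computes each assembly’s BOM level by walking the parent links and buckets the rows into one DMT file per level:

```python
import csv
from collections import defaultdict

# Hypothetical rows pulled from the legacy SQL database.
# ParentAssemblySeq of 0 means the assembly reports to the job header.
rows = [
    {"JobNum": "J100", "AssemblySeq": 1, "ParentAssemblySeq": 0, "PartNum": "SUB-A"},
    {"JobNum": "J100", "AssemblySeq": 2, "ParentAssemblySeq": 1, "PartNum": "SUB-B"},
    {"JobNum": "J100", "AssemblySeq": 3, "ParentAssemblySeq": 0, "PartNum": "SUB-C"},
]

def bom_level(row, by_seq):
    """Walk the parent links up to the job header (seq 0) to find the level."""
    level, parent = 1, row["ParentAssemblySeq"]
    while parent != 0:
        level += 1
        parent = by_seq[(row["JobNum"], parent)]["ParentAssemblySeq"]
    return level

# Index assemblies by (job, sequence) so parent lookups are O(1).
by_seq = {(r["JobNum"], r["AssemblySeq"]): r for r in rows}

# Bucket rows by their computed level, then write one DMT file per level.
levels = defaultdict(list)
for r in rows:
    levels[bom_level(r, by_seq)].append(r)

for lvl, lvl_rows in sorted(levels.items()):
    with open(f"JobAsmbl_level{lvl}.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(lvl_rows)
```

With the rows split this way, each level’s file can be loaded in order (level 1 first), so every parent already exists when its children import.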

Just wondering if that child field was something you could pull from the other database instead of having to try and figure it out on your own.

Mm, might give that a go. Jobs seem to be one of the few DMT types which don’t have a combined option. Also not sure how I can get the system to auto-generate the next job number; JobHead only seems to work if you populate the job.


What worked for us was two imports for Job Assembly.

First import had all Job assemblies with these fields


Second import added the “ParentAssemblySeq” field and was filtered to only return Assembly levels higher than “0”


The second import had to be re-run multiple times to clear the reprocess errors. (This was just due to the parent assembly sequence not being right yet.)

AnalysisCode can be left off; it’s just something we use.
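The two-pass approach above could be generated from a single source extract. A sketch, assuming the extract is a list of dicts; the field names beyond those mentioned in the thread (Company, QtyPer) are assumptions, not the exact DMT template:

```python
import csv

# One assembly extract drives both DMT files. Sample/hypothetical data.
assemblies = [
    {"Company": "EPIC01", "JobNum": "J100", "AssemblySeq": 0,
     "ParentAssemblySeq": 0, "PartNum": "TOP",   "QtyPer": 1},
    {"Company": "EPIC01", "JobNum": "J100", "AssemblySeq": 1,
     "ParentAssemblySeq": 0, "PartNum": "SUB-A", "QtyPer": 2},
    {"Company": "EPIC01", "JobNum": "J100", "AssemblySeq": 2,
     "ParentAssemblySeq": 1, "PartNum": "SUB-B", "QtyPer": 4},
]

first_pass_fields = ["Company", "JobNum", "AssemblySeq", "PartNum", "QtyPer"]
second_pass_fields = first_pass_fields + ["ParentAssemblySeq"]

# Pass 1: every assembly, without the parent link.
with open("JobAsmbl_pass1.csv", "w", newline="") as f:
    w = csv.DictWriter(f, fieldnames=first_pass_fields, extrasaction="ignore")
    w.writeheader()
    w.writerows(assemblies)

# Pass 2: only assembly levels above 0, this time adding ParentAssemblySeq.
with open("JobAsmbl_pass2.csv", "w", newline="") as f:
    w = csv.DictWriter(f, fieldnames=second_pass_fields)
    w.writeheader()
    w.writerows(r for r in assemblies if r["AssemblySeq"] > 0)
```

The second file is the one that may need re-running until all the parent sequences resolve, as noted above.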


Thanks - I’ll give that a go.
Did you have to populate the JobNum, or could you get Epicor to give you the next one?

I populated the job number with our old existing number. I think you can get it to populate the next job number if you enter a “0” for the field.

This works in the labor entry screens so am assuming something similar will work with jobs.

I have been messing around with this many different ways.
Found that the sequence
Company, Plant, JobNum, AssemblySeq, ParentAssemblySeq, QtyPer, IUM, PartNum, Description, RevisionNum

worked like a charm.
Note: take one job and test it out several times. Also, delete the job once in a while; there is something left in memory that was causing havoc on my data loads.

Now ready for the 12K load.
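For anyone scripting the file generation, the column sequence reported to work above could be pinned down in code so every export uses it. A minimal sketch with one hypothetical test-job row (the Company/Plant values are placeholders):

```python
import csv

# The column order reported to work for the Job Assembly import.
FIELDS = ["Company", "Plant", "JobNum", "AssemblySeq", "ParentAssemblySeq",
          "QtyPer", "IUM", "PartNum", "Description", "RevisionNum"]

# One hypothetical row for a single test job, per the "test one job
# several times" advice above.
test_rows = [
    {"Company": "EPIC01", "Plant": "MfgSys", "JobNum": "TEST01",
     "AssemblySeq": 1, "ParentAssemblySeq": 0, "QtyPer": 1, "IUM": "EA",
     "PartNum": "SUB-A", "Description": "Test subassembly",
     "RevisionNum": "A"},
]

with open("JobAsmbl_test.csv", "w", newline="") as f:
    w = csv.DictWriter(f, fieldnames=FIELDS)
    w.writeheader()
    w.writerows(test_rows)
```

Using `DictWriter` with a fixed `fieldnames` list guarantees the header row and column order stay identical across the per-level files.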