Improve Generate Data Model Performance

But I sometimes have to do this 3 - 4 times in a day, especially if I don’t plan all my fields out in advance… which happens sometimes: when I’m working on big projects, I realize I need a new field in the middle of the day once or twice. So collectively that’s an hour or so of work “lost” to staring at the progress bar.

We try to plan ahead and add them all at once, but inevitably we forget stuff.


Yep, that is exactly what was happening to me. Add 3 fields, regen, realize after some testing you need a 4th, do another regen.

I know it’s a different beast and the new system has its benefits, but the old E9 fields made it so quick.


Well, you could still do that… add the Char01-50 and Number01-50 etc., but it’s much nicer to have nice names, labels, and properties…


Yes, we have been doing this quite a bit lately while getting all of our customisations and tweaks in place. One of the other gripes I have with regen is the clunky UI.

Right-click on the data model, and it opens the work-in-progress “please be patient” dialog and the Generate Data Model form; then you have to click Generate. Wouldn’t it be better to just open the Generate Data Model form first, and then show the work-in-progress dialog when you click Generate?


I added a new guy to the team a couple of years ago who was very performance conscious, and he swore he would solve it. He tore through it all weekend in his frustration and reported back on the Monday.

@##$$(&@@ that’s a lot of data the Entity Framework needs and our db is HUUGE…

I think every person on the team has gone through the same frustration trying to knock down the perf on it. I wrote the first generator personally with the architect of EF back in the day. His original code took 90 minutes… A couple of others have improved the organization and stability and made minor performance gains, but none have been able to change the volume of data EF needs at design or run time.

We’ve looked at a few approaches, and we are making another run at it, working with MSFT on a few research topics atm, but nothing to advertise or promise. It’s equally frustrating, but the alternative back in the day was dropping you into SQL or fixed stored procs, and the migration would have really been horrendous without LINQ.

That is a topic near and dear to my heart. I wish I had a better answer :confused:


I’m not that technical, but isn’t a delta approach really all that is required? Or is that what is being done now and it just takes that long?

From a simplistic view, on the SQL side (at least), when we add a new column to a table it appears to rebuild or create the UD version of the table and recreate the associated table view. What else does it do?

Likely there is also some relationship with the volume of existing data in the table you are adding the column to, and with whether you are setting a default/initial value.
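The SQL-side behavior described above could be sketched roughly like this — a tiny Python helper that builds the statements a delta-style regen might run for one new UD column. The naming conventions here (`_UD` and `_Base` suffixes, the `SysRowID` join) are illustrative assumptions, not the product’s actual DDL:

```python
# Hypothetical sketch: alter the UD side table, then recreate the combined
# view so the new column shows up alongside the base columns. Names are
# assumptions made for illustration only.

def ud_column_sql(schema, table, column, sql_type, base_columns, ud_columns):
    """Build the statements a delta-style regen might run for one new column."""
    ud_table = f"{schema}.{table}_UD"
    stmts = [f"ALTER TABLE {ud_table} ADD {column} {sql_type} NULL;"]
    # Recreate the associated table view (the "recreates the view" step above),
    # qualifying columns so base and UD columns don't collide.
    cols = [f"b.{c}" for c in base_columns] + \
           [f"u.{c}" for c in ud_columns + [column]]
    stmts.append(f"DROP VIEW IF EXISTS {schema}.{table};")
    stmts.append(
        f"CREATE VIEW {schema}.{table} AS SELECT {', '.join(cols)} "
        f"FROM {schema}.{table}_Base b "
        f"JOIN {ud_table} u ON b.SysRowID = u.ForeignSysRowID;"
    )
    return stmts

for s in ud_column_sql("Erp", "Part", "MyField_c", "nvarchar(50)",
                       ["PartNum", "PartDescription"], []):
    print(s)
```

Note the regen time in this sketch would be independent of row counts except for the view recreation — which is why the question about data volume and default values matters.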

Interested to know more how it works and happy to be told different if I’m just talking through my hat.


I need to see if I have the numbers on where the time is spent. It is an interesting idea to just delta the ‘ud columns’ instead of the whole thing. It would not be 100% accurate of course, so if it isn’t working, always give the user a regen-all option (similar to today’s use).

An interesting thought I’ll add to my research list…
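The delta idea from the posts above could be sketched as follows: compare a cached snapshot of each table’s columns against the live schema and rebuild only the tables that changed, keeping a force flag as the regen-all safety valve. All names here are illustrative — this is not how the actual generator works:

```python
# Sketch of "delta the UD columns": regen only what changed, with a
# force_all fallback equivalent to today's full regen.

def tables_to_regen(cached, live, force_all=False):
    """Return the set of tables whose data-model classes need rebuilding.

    cached/live map table name -> set of column names.
    """
    if force_all:
        return set(live)  # the "regen all" safety valve
    changed = {t for t, cols in live.items() if cached.get(t) != cols}
    removed = set(cached) - set(live)  # dropped tables also need attention
    return changed | removed

cached = {"Part": {"PartNum", "Desc"}, "Order": {"OrderNum"}}
live = {"Part": {"PartNum", "Desc", "MyField_c"}, "Order": {"OrderNum"}}
print(tables_to_regen(cached, live))        # only Part changed
print(tables_to_regen(cached, live, True))  # full regen fallback
```

As the post notes, a delta like this can drift out of sync (hence “not 100% accurate”), which is why the full-regen fallback stays in the picture.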


Can the “OK, I’m finished” dialog be made to stay on top of all windows? If I’ve clicked somewhere else during the regen, the box gets buried, and I think it’s still crunching (but it has been finished for 10 minutes!)…

Not sure if Alt+Tab picks it up or not. I think I could only see it when looking at the task bar icon preview windows. Seems like an easy improvement.


That sounds like a ticket - agreed, it’s annoying.


But if you click the taskbar icon again, it should show up…

But yes, I agree. I even move the window from its original position to make sure the Finished dialog does not show up behind it!

FYI - You know there is a command line for DM Regen? I know our Ops folks use it exclusively as part of a PowerShell script to regen, recycle app pool, etc.
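A regen-then-recycle script like the one described could be sketched in Python along these lines. The executable name and flags below are placeholders, not the product’s real command line — check the command line tools section of the help for the actual names (`Restart-WebAppPool` is a real PowerShell cmdlet from the WebAdministration module):

```python
# Sketch of the Ops flow: run the command-line DM regen, then recycle the
# app pool. The regen executable and its /config flag are PLACEHOLDERS.

import subprocess

def build_steps(config_file, app_pool):
    """Assemble the commands; returning them makes a dry run easy to test."""
    return [
        ["DataModelRegen.exe", "/config", config_file],  # placeholder name
        ["powershell", "-Command", f"Restart-WebAppPool '{app_pool}'"],
    ]

def run(steps, dry_run=True):
    for cmd in steps:
        if dry_run:
            print("would run:", " ".join(cmd))
        else:
            subprocess.run(cmd, check=True)  # stop on the first failure

run(build_steps(r"C:\Epicor\Server\web.config", "ERPAppPool"))
```

Running it unattended (dry_run=False) is what lets Ops schedule regens off-hours instead of staring at the progress bar.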


I just learnt something and it’s only 8am ish. :slight_smile:


If you have not spent any time in the command line tools section of the help, spend 15 minutes to review it… We continue to add a bunch of scripting abilities every release for routine pain points - especially migration-oriented ones.



@aidacra was nice enough to put together a pretty awesome how-to regarding command line data model regen. Thanks Nathan!


I believe it is touched on in the System Admin manual as well. It also mentions adding tables to the schema, which is something I have not even considered; I’ve always worked within the boundaries of what’s there.

We need that functionality so Partners can add tables as part of their efforts. Even our own CSF teams need to add new tables, so the need is there, though it is usually geared to higher-level integrations.


How do upgrades cope with such tables?

Migration does not touch custom tables.
The inclusion of the custom tables in the data model is optional. The data model regen walks through all tables in a db schema. There is an include list for schemas - the standard Ice and Erp schemas are included by default - and an exclude list for tables:

<!-- Comma delimited list of additional schemas to include. By default only standard schemas are included. -->
<!-- Comma delimited list of tables to exclude from the data model. -->
<TablesToExclude></TablesToExclude>
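The walk described above — standard schemas plus any configured includes, minus the excluded tables — could be illustrated like this. The element names mirror the config fragment; the surrounding config structure is simplified and assumed:

```python
# Toy illustration of the schema include / table exclude filtering the
# regen is described as doing. Config structure is simplified/assumed.

import xml.etree.ElementTree as ET

CONFIG = """
<DataModelConfig>
  <SchemasToInclude>MyCompany</SchemasToInclude>
  <TablesToExclude>MyCompany.ScratchTable</TablesToExclude>
</DataModelConfig>
"""

def tables_for_regen(all_tables, config_xml, standard_schemas=("Ice", "Erp")):
    """Filter schema-qualified table names down to what the regen would walk."""
    root = ET.fromstring(config_xml)
    extra = (root.findtext("SchemasToInclude") or "").split(",")
    exclude = {t.strip() for t in (root.findtext("TablesToExclude") or "").split(",")}
    schemas = set(standard_schemas) | {s.strip() for s in extra if s.strip()}
    return [t for t in all_tables
            if t.split(".")[0] in schemas and t not in exclude]

tables = ["Erp.Part", "Ice.SysCompany", "MyCompany.CustomTbl",
          "MyCompany.ScratchTable", "dbo.Temp"]
print(tables_for_regen(tables, CONFIG))
# -> ['Erp.Part', 'Ice.SysCompany', 'MyCompany.CustomTbl']
```

Note the element name for the schema include list is taken from the comment above it; the actual config file may differ.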


Was something improved under the covers? Our 10.2.300.18 test environment regenerated in 3 minutes! Or it could be the new hardware we migrated our testing environment onto last year…
