Need Help - Building Function to Dump Tables before Environment Reset

But I don’t want to make the changes. I’m lazy and believe there is an easier way.

2 Likes

So the way my small solution works, if you were not going to use BAQs, is that you use a separate database and put any views or stored procedures, or dare I say a REST endpoint, in there (using sp_invoke_external_rest_endpoint; disclaimer: I have never used it)… In fact, used that way it may even work for Cloud customers…

1 Like

No, just BAQs and a PowerShell script to call all your DMT executions

No, just functions, one per BO. No maintenance required there, since the function would build the BAQs for you. I guess you could do it all in a function, but I like the artifact of the BAQ so I could compare from version to version in source control. If all the code is in the function, you’d have to add some logic in the function to discover changes between versions. Why write source control when I can use an existing one?

1 Like

What, a Function to build a BAQ??? :exploding_head:

So BAQ as code? Now that kinda makes sense, but still using the DMT to get it into your destination system, right?

1 Like

Do you have a list of the tables? I will compare with what I have…

ZBO*
ZData*

for the tables and BOs and

Query* for BAQs.

2 Likes

Need to know what exact parts of the database need populating from @jkane also… You’re not wanting to dump everything, and we don’t know what they are licensed for… I mean, I guess you could do all of them and, if count = 0, just skip it.

There is also the challenge that the column names may be different, particularly if you are using a DMT template that combines data, where fields are named Tablename#FieldName in the DMT.

EDIT
Getting late, I am rambling…

2 Likes

Amen, brother.

How about an idea to create a “backup” of an environment that does NOT include transaction data… only static data?

When we load a new database, we load the static data (parts, customers, suppliers, GL accounts) and their related information (part classes, shipto addresses, contacts, segments). Why not create a “backup” routine with a default set of static data files?

1 Like

Get out of my head.

Not what I built this time, but I did code up a BAQ generator that isn’t finished. Basically, everything is a BAQ lol

3 Likes

If you don’t want to write the BAQs, maybe you could use the REST API OData endpoints for each table.

For example:

GET https://your-site/production/api/v1/Erp.BO.JCDeptSvc/JCDepts

would return all of the rows in the JCDepts table
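
Something like this minimal sketch, if you wanted to script it (assuming Basic auth against the v1 REST API; the site URL, credentials, and $select columns are placeholders, and your setup might also want an API key header):

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ODataDump
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // placeholder credentials; v1 REST typically takes Basic auth
        var creds = Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password"));
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Basic", creds);

        // $select keeps the payload down to the columns you actually want
        var url = "https://your-site/production/api/v1/Erp.BO.JCDeptSvc/JCDepts" +
                  "?$select=Company,JCDept,Description";

        var json = await client.GetStringAsync(url);
        System.IO.File.WriteAllText("JCDepts.json", json);
    }
}
```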

2 Likes

Still got to get the data in, and I still think the DMT is the best way, and you would not want all those columns.

Only thing is, I feel the DMT is pretty slow.

Bring back BCP speeds!

If you create your DMT templates in advance, you could use REST to retrieve the data, then fill it into your DMTs.
In C#, ClosedXML is a great library for manipulating xlsx files.
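
For example, a minimal sketch of filling a DMT-style workbook with ClosedXML (the header names and rows are just illustrative; the ClosedXML NuGet package is assumed):

```csharp
using ClosedXML.Excel;

// header row matching a hypothetical DMT template's column names
var headers = new[] { "Company", "JCDept", "Description" };
var rows = new[]
{
    new[] { "EPIC06", "ASSY", "Assembly" },
    new[] { "EPIC06", "MACH", "Machining" },
};

using var wb = new XLWorkbook();
var ws = wb.Worksheets.Add("JCDept");

for (var c = 0; c < headers.Length; c++)
    ws.Cell(1, c + 1).Value = headers[c];

for (var r = 0; r < rows.Length; r++)
    for (var c = 0; c < rows[r].Length; c++)
        ws.Cell(r + 2, c + 1).Value = rows[r][c];

wb.SaveAs("JCDept-DMT.xlsx");
```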

Yeah, this is one of the reasons why I created our New Company Setup with REST instead. We recreated our company several times until we went to production with it.

2 Likes

This is turning into a great post.

I think you could extend @mbayley’s tool and have a second part that handles any additional tables.

Here are a couple of gists from my New Company Setup project, if it helps.

AddCurrency.cs (github.com)

Excel Helper (github.com)

For getting the data into your DMTs, you could extend the ExcelHelper to SetTableData generically as well.
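
Something like this, maybe; a hypothetical sketch of that generic SetTableData (not the gist’s actual code), written as a ClosedXML extension that reflects over the row type’s public properties:

```csharp
using System.Collections.Generic;
using ClosedXML.Excel;

public static class ExcelHelperExtensions
{
    // write one worksheet per call: header row from property names,
    // then one row per item in the sequence
    public static void SetTableData<T>(this XLWorkbook wb, string sheetName, IEnumerable<T> rows)
    {
        var props = typeof(T).GetProperties();
        var ws = wb.Worksheets.Add(sheetName);

        for (var c = 0; c < props.Length; c++)
            ws.Cell(1, c + 1).Value = props[c].Name;

        var r = 2;
        foreach (var row in rows)
        {
            for (var c = 0; c < props.Length; c++)
                ws.Cell(r, c + 1).Value = props[c].GetValue(row)?.ToString() ?? "";
            r++;
        }
    }
}
```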

2 Likes

I don’t really care about the upload of the data to the new environment (at this time). Since I am still in implementation mode, I will be manually changing some of the files. It is more the extraction of the existing data that I am after, as I find that non-value-added. Being SaaS, I can’t save a backup copy because I don’t have a database to open it in.

I am sure at some point I am going to want to upload the data in a better way, but right now I am just looking for the easiest way to extract the data without creating a bunch of BAQs, then downloading the BAQs, so I can then upload the BAQs to do it all over again.

This doesn’t remove the need to build the queries, but you can automate their running by using the BAQ Export functionality of the DMT… and to upload them you could create a Playlist (again within the DMT). If we’re talking about a couple of dozen tables, it will be a few hours to set up the first time.
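
Something like this could drive the exports from a script; heads up that the DMT flag names below (-NoUI, -User, -Pass, -Export, -BAQ, -Target) are from memory of community documentation and may differ by DMT version, and all paths and the query name are placeholders:

```csharp
using System.Diagnostics;

var psi = new ProcessStartInfo
{
    // placeholder client path; adjust for your install
    FileName = @"C:\Epicor\ERP11\LocalClients\Production\DMT.exe",
    Arguments = "-NoUI -User dmtuser -Pass secret " +
                "-Export -BAQ \"zExport-PartClass\" " +
                "-Target \"C:\\Dump\\PartClass.csv\"",
    UseShellExecute = false,
};

using var proc = Process.Start(psi);
proc.WaitForExit();   // run the BAQ exports one at a time, in order
```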

1 Like

I honestly don’t know a faster way to extract data without BAQs, but always open to new ideas.

1 Like

That was why I was trying to do it in a Function. Create a Function Library with individual Functions to get certain tables. At first I thought that doing a LINQ query would be the way, but then I realized it would probably be better to use the BOs to get a Tableset. For example, if you get the PartClass Tableset, all the data is in it for PartClass, PartClassPlt, and GLControlCodes. I just have no idea how to turn a Tableset into a csv to save to the server folder.
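
For the CSV part, one idea (a sketch, assuming each table in the tableset is an IEnumerable of typed rows, which Epicor tablesets are; the usage comment at the bottom uses made-up names):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;

public static class TablesetCsv
{
    // flatten any tableset table to CSV by reflecting over the row type;
    // quoting is deliberately naive, so harden it before trusting real data
    public static void WriteCsv<TRow>(IEnumerable<TRow> rows, string path)
    {
        var props = typeof(TRow).GetProperties();

        using var writer = new StreamWriter(path);
        writer.WriteLine(string.Join(",", props.Select(p => p.Name)));

        foreach (var row in rows)
        {
            var values = props.Select(p =>
            {
                var v = p.GetValue(row)?.ToString() ?? "";
                // quote fields that would break a simple comma split
                return v.Contains(",") || v.Contains("\"")
                    ? "\"" + v.Replace("\"", "\"\"") + "\""
                    : v;
            });
            writer.WriteLine(string.Join(",", values));
        }
    }
}

// hypothetical usage from a Function, with made-up tableset and UNC path:
// TablesetCsv.WriteCsv(ts.PartClass, @"\\AppServer\Dump\PartClass.csv");
```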

And using the BO means that no matter what Epicor changes, you always have the correct table structure.

1 Like

Definitely a lot more work than a BAQ

I think only GetByID returns all fields for an object. One would have to run GetRows first and then iterate over each row and do a GetByID to get all of the fields.
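
Roughly this pattern inside a Function (resolving the service contract is standard, but the number of whereClause parameters on GetRows differs per BO, so the exact call below is an assumption):

```csharp
// Inside a Function/BPM, where Db is the data context:
var svc = Ice.Assemblies.ServiceRenderer.GetService<Erp.Contracts.PartClassSvcContract>(Db);

// Empty where clauses = all rows; there is one whereClause argument per table
// in the tableset, so the count here is a guess for PartClass.
var list = svc.GetRows("", "", "", 0, 1, out bool morePages);

foreach (var row in list.PartClass)
{
    // GetByID brings back the full tableset for one class:
    // PartClass, PartClassPlt, and the related GL control rows.
    var full = svc.GetByID(row.ClassID);
    // ...flatten each table in "full" to CSV here
}
```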

Same. And if I did, I wouldn’t know how to upload it later. If we could take a tableset and convert it to csv, it would have to look a lot like…a DMT input file, right? To move forward, I would think we need to know how we want to edit and upload it, so we could create an export strategy. :thinking:

1 Like