DMT Batch File


Just so I understand, the blob storage thing is a third-party service we would purchase to hold this data. And this has nothing to do with Epicor, other than that is where the data source is.
Right?

This seems nice, but it also seems like a lot of overhead for a temporary file storage. How is this cheaper? I can see how it would be faster, but right now my storage is free. If I use this service I have to pay a monthly storage fee.

Correct. This is an Azure service. Amazon has a similar service called S3. The PowerShell that runs on a timer in the cloud is a different service called Azure Functions.

It isn't free; it's covered by assets you've already paid for, but that's not the same as free. You still have to maintain the devices, electricity, security, and operating systems, and back up that data. To match the capability of blob storage, you would also have to keep a copy of the data at another facility 100+ miles away, and write a script to delete old files. I'm not sure how you would make immutable copies at all without buying more hardware.
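As an illustration of the "script to delete old files" point: in Azure, that cleanup is a built-in lifecycle management policy rather than something you script yourself. A rough sketch using the Az.Storage PowerShell module (the resource group and account names here are placeholders, not anything from this thread):

```powershell
# Sketch only: assumes the Az.Storage module and an existing storage account.
Connect-AzAccount

# Action: delete base blobs 30 days after their last modification
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction Delete `
    -DaysAfterModificationGreaterThan 30

# Wrap the action in a named rule
$rule = New-AzStorageAccountManagementPolicyRule -Name "delete-old-exports" `
    -Action $action

# Attach the policy to the storage account (placeholder names)
Set-AzStorageAccountManagementPolicy -ResourceGroupName "rg-archive" `
    -StorageAccountName "stdmtarchive" -Rule $rule
```

Once the policy is set, Azure handles the deletion on its own schedule; there is no timer job for you to maintain.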

Roughly, the pricing for Blob storage depends on:

The cost of Azure Functions for an application that runs once a day for each file would be nearly free to you:
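For a sense of how little code a once-a-day function involves, here is a minimal sketch of a timer-triggered PowerShell Azure Function that pulls a BAQ and drops the result into blob storage. The BAQ URL, container name, and the `ARCHIVE_STORAGE_CONNECTION` app setting are placeholder assumptions; a real function would read credentials from app settings or Key Vault rather than prompting:

```powershell
# run.ps1 for a timer-triggered Azure Function (PowerShell worker).
# Assumes Az.Storage is listed in requirements.psd1.
param($Timer)

# Pull the BAQ results (placeholder URL; swap Get-Credential for a
# secret pulled from app settings or Key Vault in a real deployment)
$cred = Get-Credential
$data = Invoke-RestMethod "https://YourEpicorServer.TLD/EpicorProduction/api/v1/BaqSvc/zCustomer01/" -Credential $cred

# Write a dated CSV to the temp area
$file = Join-Path $env:TEMP ("export-{0:yyyy-MM-dd}.csv" -f (Get-Date))
$data.value | Export-Csv -Path $file -NoTypeInformation

# Upload the CSV as a blob (connection string comes from an app setting)
$ctx = New-AzStorageContext -ConnectionString $env:ARCHIVE_STORAGE_CONNECTION
Set-AzStorageBlobContent -File $file -Container "dmt-archive" `
    -Blob (Split-Path $file -Leaf) -Context $ctx
```

On the Consumption plan, a daily run of a few seconds falls well inside the free grant, which is why the compute side rounds to zero.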


I finally tried pulling data into Excel using OData and REST. This process seems super clunky and slow. Normally I can only get 100 records. If I set top N to 10,000 records, it seems to run forever and never spits out any data. So far I have only been able to take small bits of data out. If I tried to use this to export all the fields in the tables I want, I expect it would never load.

Am I missing something big in using OData/REST in Excel?

No. You’re still downloading all of those records over the WAN and adding the overhead of Excel parsing each record into cells. Are you using the BAQSvc or individual BOs?

Have you tried PowerShell’s Invoke-RestMethod?

In my testing I just used the SalesOrderSvc, using the example from that REST novel:
REST Overview Novel - Experts’ Corner - Epicor User Help Forum (epiusers.help)
I haven’t tried using PowerShell yet.

I HIGHLY recommend using the BAQ Service. It’s well-documented in the Technical Reference Guides.


Thank you for that! I have now created a new access scope containing only the BAQs I want. I added a new API key limited to that access scope, and I am trying to use the REST help to generate the URL I will paste into Excel. In the tech ref for REST V2 I found this talking about connecting to Excel:

Important To be able to get data from a Service/BAQ feed into Excel using REST API v.2, you must disable the default API Key requirement for REST requests by setting the value of the
EnforceApiKeyForRestApiV2 property in the appSettings section of the web.config file to false
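For context, the change the guide describes would look roughly like this in the server's web.config `appSettings` section (a sketch for on-prem servers only; the key name comes from the quoted guide, the surrounding markup is standard .NET config):

```xml
<!-- Server web.config, appSettings section (on-prem only) -->
<appSettings>
  <add key="EnforceApiKeyForRestApiV2" value="false" />
</appSettings>
```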

I haven’t messed with any web.config files before. Is this an on prem thing? Can I access the web.config as a cloud user?


Personally, I would not reduce my company’s security profile just so I could use Excel. :person_shrugging:

It is only for on-prem users and the cloud team will not do that for you.

If you’re just after CSV, consider what @josecgomez did here:

REST to email - ERP 10 - Epicor User Help Forum (epiusers.help)

I thought REST was supposed to be the way to go!? Why bother having all this API crap if you consider it not secure enough to even use it?

The insecure part of this equation is the desktop app written in the '90s, not REST. And by the way, Excel is the largest and most expensive JSON-to-CSV converter one can use. Why is it needed for this archiving project? :thinking:

If you wanted to do some scripting yourself, you could use v2. It’s just that Excel out of the box doesn’t work with v2. Excel isn’t about security, it’s about convenience.


:rofl:
I see. I will look into some scripting approaches. I hadn’t realized that Excel doesn’t work with v2 out of the box. I will probably end up just skipping this export attempt and trust that my snapshot of live into my pilot database is good enough in the event that my DMT commands go awry.
Thanks for all your help and suggestions! I really would like to learn more about REST and OData. It looks like I will need to know that stuff for working in Kinetic.

Hi Nate,
If this is something you plan on doing regularly, one other option to improve overall speed may be to output the files to a shared location on your cloud host server, perhaps a well-crafted folder under EpicorData. If possible, have a server-side compression script, written with security reach and other access considerations taken into account, that could be triggered by you or your host admin to compress and wrap them up. Then you could access the share once the zip archive is complete for a smaller transfer. Very much a traditional approach, I know. We’re an on-prem installation, so we face different considerations, but we do have lots of sharing and transferring of data all over the place.
Just ideas, not sure what access and services you have with your host. One last thought: do you have access to Automation Studio? That might unlock additional options, but I’m not sure what can be done in a hosted environment.
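The server-side "compress and wrap up" step above can be sketched in a few lines of PowerShell, assuming the exports land in a folder the script can read (the paths here are placeholders):

```powershell
# Sketch of the server-side compression step. Paths are placeholders;
# point $source at wherever the export files actually land.
$source  = "D:\EpicorData\Exports\PartRevBackup"
$archive = Join-Path $source ("backup-{0:yyyy-MM-dd}.zip" -f (Get-Date))

# Compress-Archive ships with PowerShell 5+; -Force overwrites an existing zip
Compress-Archive -Path (Join-Path $source "*.csv") -DestinationPath $archive -Force
```

Pulling one zip across the WAN instead of dozens of CSVs is where the speedup comes from.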


I don’t plan on doing this ever again, ideally. We are working on updating all of our part rev masters. This involves using DMT to delete all the part operations and reimport them with corrected opcodes. I also will be doing this to open jobs. My DMT processes are now working perfectly without any errors. I just wanted to have an easy way to snapshot some critical tables in case something goes wrong and I need to reference a specific part or field. I thought that getting an export would be a lot easier, but clearly this is not a trivial task. I think my pilot database will be enough of a security blanket for me to move forward and try this out in Live!

I don’t think I have Automation Studio, but I haven’t really looked into it.

Don’t underestimate this idea! When we think of system reliability, it’s a good strategy to know what the most recent work was in case of an incident of some kind. It fills the gaps between snapshots and you can get some interesting insights into the current work if you wanted to.


Why not use the REST v1 API instead of v2? I use that all the time to pull data directly from Epicor BAQs into Excel. No API keys required.


He’s on the cloud, and the cloud team thinks harder about security than on-prem users do.


I was able to get v1 working in Excel, but it is soooooo slow as to be virtually unusable. At least for this task.

Copying Jose’s code here. It’s going to be way faster without all of that Excel overhead.

#Epicor user name
$user = "epicor"
#Epicor Password
$pass= "epicor"

#BAQ End point URL
$uri = "https://YourEpicorServer.TLD/EpicorProduction/api/v1/BaqSvc/zCustomer01/"

#Create encoded password
$secpasswd = ConvertTo-SecureString $pass -AsPlainText -Force

#Create Credential
$cred = New-Object System.Management.Automation.PSCredential ($user, $secpasswd)

#Rest Call
$data = Invoke-RestMethod $uri -Method Get -Credential $cred -ContentType "application/json"

#Dump results as CSV
$data.value | Export-Csv -Path './test.csv' -NoTypeInformation

For v2, you would pass in the api-key.
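Continuing from the block above, the v2 variant would look roughly like this. The `/api/v2/odata/{Company}/BaqSvc/{BAQ}/Data` URL shape and the `X-API-Key` header name follow Epicor's v2 conventions, but check your server's REST help page, since the company ID and paths vary:

```powershell
# Sketch of the same call against REST v2, which requires an API key.
# Reuses $cred from the block above; company ID and key are placeholders.
$uri = "https://YourEpicorServer.TLD/EpicorProduction/api/v2/odata/YourCompany/BaqSvc/zCustomer01/Data"
$headers = @{ "X-API-Key" = "your-api-key-here" }

$data = Invoke-RestMethod $uri -Method Get -Credential $cred -Headers $headers -ContentType "application/json"
$data.value | Export-Csv -Path './test.csv' -NoTypeInformation
```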


Thanks Mark, nice and clean!