Correct. This is an Azure service. Amazon has a similar service called S3. The PowerShell that runs on a timer in the cloud is a different service called Azure Functions.
It is not free; it is covered by other current assets that are already paid for, but that is not the same as free. You still have to maintain the devices, electricity, security, operating systems, and backups of that data. To match the capability of blob storage, you would also have to keep a copy of the data at another facility 100+ miles away. You would have to write a script to delete old files, and I'm not sure how you would make immutable copies without buying more hardware.
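Even the "delete old files" piece is its own small project. As a rough illustration only (nothing Epicor-specific), a retention sweep might look like this in Python; the retention window and directory are hypothetical:

```python
import time
from pathlib import Path

# Hypothetical retention window; a real policy would come from the business.
RETENTION_DAYS = 7 * 365

def delete_old_files(archive_dir: str, retention_days: int = RETENTION_DAYS) -> list[str]:
    """Delete files older than the retention window; return the paths removed."""
    cutoff = time.time() - retention_days * 86400
    removed = []
    for path in Path(archive_dir).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(str(path))
    return removed
```

Blob storage handles this for you with lifecycle rules; on your own hardware you also have to schedule, log, and monitor a script like this.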
I finally tried pulling data into Excel using OData and REST. The process seems clunky and slow. By default I only get 100 records. If I set the top N records parameter to something like 10,000, it seems to run forever and never returns any data. So far I have only been able to pull out small bits of data. If I tried to use this to export all the fields in the tables I want, I expect it would never load.
Am I missing something big in using OData/REST in Excel?
No. You’re still downloading all of those records over the WAN and adding the overhead of Excel parsing each record into cells. Are you using the BAQSvc or individual BOs?
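If you do end up scripting it instead, the usual workaround for that 100-record default is to page through the feed with $top and $skip rather than asking for everything at once. A minimal sketch, with the HTTP call abstracted into a fetch function so the paging logic is the focus (the page size and parameter handling are assumptions about a typical OData feed):

```python
from typing import Callable, Iterator

def page_odata(fetch: Callable[[int, int], list[dict]],
               page_size: int = 100) -> Iterator[dict]:
    """Yield records by repeatedly requesting $top=page_size&$skip=offset.

    `fetch(top, skip)` should issue the OData GET and return the list from
    the response's `value` array; paging stops on a short or empty page.
    """
    skip = 0
    while True:
        page = fetch(page_size, skip)
        yield from page
        if len(page) < page_size:
            return
        skip += page_size
```

With the `requests` library, `fetch` could be something like `lambda top, skip: requests.get(url, params={"$top": top, "$skip": skip}, auth=auth).json()["value"]` — hypothetical `url` and `auth`, of course.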
Thank you for that! I have now created a new access scope containing only the BAQs I want. I added a new API key limited to that access scope, and I am trying to use the REST help to generate the URL I will paste into Excel. In the tech ref for REST V2 I found this talking about connecting to Excel:
Important: To be able to get data from a Service/BAQ feed into Excel using REST API v.2, you must disable the default API Key requirement for REST requests by setting the value of the EnforceApiKeyForRestApiV2 property in the appSettings section of the web.config file to false.
I haven’t messed with any web.config files before. Is this an on prem thing? Can I access the web.config as a cloud user?
The insecure part of this equation is the desktop app written in the '90s, not REST. By the way, Excel is the largest and most expensive JSON-to-CSV converter one can use. Why is it needed for this archiving project?
If you wanted to do some scripting yourself, you could use v2. It’s just that Excel out of the box doesn’t work with v2. Excel isn’t about security, it’s about convenience.
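As a sketch of that scripting route: pull the BAQ feed yourself and write CSV directly, skipping Excel entirely. This is only an illustration under assumptions — the x-api-key header name, the URL shape, and the auth details vary, so generate the exact URL from the REST help page:

```python
import base64
import csv
import io
import json
import urllib.request

def records_to_csv(records: list[dict]) -> str:
    """Flatten a list of JSON records (the OData `value` array) into CSV text."""
    if not records:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def fetch_baq(url: str, api_key: str, user: str, password: str) -> list[dict]:
    """GET a BAQ feed with basic auth plus an API key header (names assumed)."""
    req = urllib.request.Request(url)
    cred = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {cred}")
    req.add_header("x-api-key", api_key)  # header name is an assumption; check your REST help
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```

Something like `records_to_csv(fetch_baq(url, key, user, pwd))` written to a file gives the same export without Excel parsing every record into cells.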
I see. I will look into some scripting approaches. I didn't realize that Excel doesn't work with v2 out of the box. I will probably end up just skipping this export attempt and trusting that my snapshot of live into my pilot database is good enough in the event that my DMT commands go awry.
Thanks for all your help and suggestions! I really would like to learn more about REST and OData. It looks like I will need to know that stuff for working in Kinetic.
Hi Nate,
If this is something you plan on doing regularly, one other option to improve overall speed may be to output the files to a shared location on your cloud host server, perhaps a well-crafted folder under EpicorData. If possible, have a server-side compression script written (with security reach and other access considerations taken into account) that you or your host admin could trigger to compress and wrap them up. Then, once the zip archive is complete, you could access the share for a smaller transfer. It's very much a traditional approach, I know; we're an on-prem installation, so we face different considerations, but we do have lots of sharing and transferring of data all over the place.
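The compression step itself is simple enough. A minimal server-side sketch in Python (the folder layout and paths are hypothetical, and your host would have to approve how and where it runs):

```python
import zipfile
from pathlib import Path

def zip_folder(folder: str, zip_path: str) -> int:
    """Compress every file under `folder` into one archive; return the file count."""
    count = 0
    root = Path(folder)
    with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(root.rglob("*")):
            if path.is_file():
                # Store paths relative to the export folder, not absolute server paths.
                zf.write(path, arcname=str(path.relative_to(root)))
                count += 1
    return count
```

One compressed archive crossing the wire once is usually much faster than pulling many small files individually over the WAN.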
Just ideas; I'm not sure what access and services you have with your host. One last thought: do you have access to Automation Studio? That might unlock additional options, but I'm not sure what can be done in a hosted environment.
I don't plan on doing this ever again, ideally. We are working on updating all of our part rev masters. This involves using DMT to delete all the part operations and reimport them with corrected opcodes. I will also be doing this to open jobs. My DMT processes are now working perfectly without any errors. I just wanted an easy way to snapshot some critical tables in case something goes wrong and I need to reference a specific part or field. I thought that getting an export would be a lot easier, but clearly this is not a trivial task. I think my pilot database will be enough of a security blanket for me to move forward and try this out in Live!
I don’t think I have Automation Studio, but I haven’t really looked into it.
Don’t underestimate this idea! When we think of system reliability, it’s a good strategy to know what the most recent work was in case of an incident of some kind. It fills the gaps between snapshots and you can get some interesting insights into the current work if you wanted to.