Exporting data from Epicor Public Cloud

I would like to set up a daily export of data from Epicor Cloud to a local drive. Any suggestions?
When running the BAQ Export process, the file is sent to a location from which you have to download it manually.

I'm thinking about using a report and setting up Advanced Print Routing to email the file out, but that still doesn’t fully automate the process.

REST + PowerShell :slight_smile:

#Epicor user name
$user = "epicor"
#Epicor Password
$pass= "epicor"

#BAQ End point URL
$uri = "https://<YourServer>/<YourInstance>/api/v1/BaqSvc/<YourBAQ>/"

#Convert the password to a SecureString
$secpasswd = ConvertTo-SecureString $pass -AsPlainText -Force

#Create Credential
$cred = New-Object System.Management.Automation.PSCredential ($pass, $secpasswd)

#Rest Call
$data = Invoke-RestMethod $uri -Method Get -Credential $cred -ContentType "application/json"

#Dump results as CSV
$data.value | Export-Csv -Path './test.csv' -NoTypeInformation
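
To make this a true daily export with no manual step, you can register the script with Windows Task Scheduler. A minimal sketch, assuming the script above is saved as C:\Scripts\BaqExport.ps1 (the path, task name, and time are placeholders, not from this thread):

#Register a daily scheduled task that runs the export script
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File "C:\Scripts\BaqExport.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName 'Epicor BAQ Export' -Action $action -Trigger $trigger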
7 Likes

If you have Office 365, look at Power BI Dataflows. It’s a completely hosted ETL option.

Will REST work with Government Cloud? We will want to run an export on a daily basis.

As long as the Epicor REST API is available… it should. I believe Public Cloud does expose REST if you request it.

Yes, REST is available in all of the cloud offerings; otherwise most new features built by Epicor would not work :wink:

1 Like

Microsoft Flow works great for doing this, and you can create a flow that allows a user to select a BAQ (the source for the export) and the target destination for the file. Flows can also be run on schedules. Once a flow is created, you can delegate permission to execute it to as many users as you would like (no local install required). Below is a sample flow that shows a bit of this: a simple replacement for the BAQ Export Process.
Steps:

  1. Manual trigger, but this could be scheduled, email triggered, or started by a number of other triggers. What is kind of cool is that it can be kicked off from the Flow mobile app, which could then run the flow and email you the file.
  2. This is the REST connection (simple setup). You can add a selection for which BAQ the user would like.
  3. Parse JSON (formats the response for easy consumption by the CSV table step).
  4. Formats the CSV table for file creation.
  5. Creates the file on a local file server, OneDrive, Google Drive, SharePoint, or many others. You could prompt the user for the destination.
1 Like

Thanks for the code! I had our IT person look into trying this. He said the following:

In the Credential Create section, it fails to reference the username:

#Create Credential
$cred = New-Object System.Management.Automation.PSCredential ($pass, $secpasswd)

should be

#Create Credential
$cred = New-Object System.Management.Automation.PSCredential ($user, $secpasswd)

When the script ran, it output a tab-delimited CSV, not a ~-delimited CSV. How can we define the delimiter for the BAQ in the REST call?

I believe you can change the delimiter in the step below. The Export-Csv cmdlet is what puts the delimiter in the output stream, not the BAQ.

$data.value | Export-Csv -Path $CsvFile -Delimiter '~' -NoTypeInformation
1 Like

What @Mark_Wonsil said; look at the documentation for that cmdlet.

1 Like

@danbedwards I have spent a few hours trying to wrap my head around this, but I’m still hitting a wall.

I keep hitting this error: The 'from' property value in the 'table' action inputs is of type 'Object'. The value must be of type 'Array'.

So, in your Parse JSON step, what is inside it?

When I try to get Flow (Power Automate) to generate the schema, it starts with that “object” type. I think it’s trying to do something with the URL ahead of the array. Do you strip that out somehow?

Hopefully this will help.

Here is a simple flow (a trigger, custom action, and parse)

The action is a simple GET using REST to pull some data from a BAQ.

The Parse has a couple of steps.

  1. When you click in the Content field, you will see the Body token appear on the right-hand side under your “GET”. Add that to the Content area.
  2. Click Generate from Sample and paste in a JSON body from Postman or Swagger. This will create your schema (see the sample body below).
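
For reference, the body a BAQ endpoint returns generally looks something like this (the metadata key and field names here are illustrative, not from this thread); note that the rows live in the value array:

{
  "odata.metadata": "https://<YourServer>/<YourInstance>/api/v1/BaqSvc/<YourBAQ>/$metadata",
  "value": [
    { "Part_PartNum": "100-200", "Part_PartDescription": "Widget" }
  ]
}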
1 Like

@danbedwards Hot dog. I got it.

I realized what it was - in the table step I used “body.” WRONG. I should be using “value” - that’s the array!

You answered what I asked, and your generated schema was slightly different from mine, but similar enough to let me know I was on the right track. Reevaluating the parse step (and a night’s rest) made me see I was grabbing the wrong object (or whatever).

Thank you, thank you, thank you!

[screenshots of the working flow]

1 Like

You trying to summon @Mark_Wonsil with puns like that? :sweat_smile:

1 Like

@Evan_Purdy Oh believe me, it crossed my mind.

It’s how eye-roll…

1 Like

I just want you all to know, I made a PowerPoint doc to explain all the steps for this. 55 slides.

And you all are like, screenshot, 10011010101010. And everyone is like, “Oh I see what you did there!”.

I am out of my league some days here…

2 Likes

Hi Jose, what if I want to export the tables from Epicor Cloud into Power BI and do all the transformations and data cleanup in Power BI instead of using a BAQ? What’s the best practice? Sorry, I am very new to this. Thank you in advance.

The Epicor cloud now provides a read-only version of your DB; just ask your CAM.
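
For anyone wanting to script against that read-only copy, here is a rough sketch using the SqlServer PowerShell module. The server, database, credentials, and query are all placeholders; Epicor (or your CAM) would supply the real connection details.

#Requires the SqlServer module: Install-Module SqlServer
#All connection details below are placeholders
$rows = Invoke-Sqlcmd -ServerInstance '<YourReadOnlyServer>' -Database '<YourErpDb>' -Username '<user>' -Password '<pass>' -Query 'SELECT PartNum, PartDescription FROM Erp.Part'
$rows | Export-Csv -Path './Part.csv' -NoTypeInformation

Power BI can also connect to that same read-only database directly as a SQL Server source, which skips the CSV step entirely.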