What would you recommend to pull all part data down?

We're building a custom system to manage our part data, since it's easier for us that way: we have so many different data sources and methods for getting the data, plus our online store and POS. We need a way to control it all in one place and push it out to Epicor, the store, and the POS.

The issue I have now is that all the parts are stored in Epicor. I need to merge the data, so I first need to pull it all down.

What method would you recommend to pull all that data down? DMT?

Is there a way via the API? I looked at /Erp.BO.PartSvc/Parts, but that returns only 100 parts at a time.

Looking for suggestions.

There is a 100-row limitation on standard REST API calls, to prevent calls that could return too many records and crash the application.

The recommended workaround is to write a BAQ and use the BaqSvc to return any result set larger than 100 rows. That limitation does not apply to the BaqSvc API.
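To make the BaqSvc route concrete, here's a minimal sketch of calling it from Python with only the standard library. The server URL, BAQ name, and credentials are placeholders, and I'm assuming the v1 BaqSvc route (`/api/v1/BaqSvc/{baqName}`) with basic auth; cloud deployments may also require an API-key header, so adjust for your setup.

```python
# Sketch: pull all rows from a BAQ via the BaqSvc REST endpoint.
# Assumes the v1 route https://server/instance/api/v1/BaqSvc/{baqName};
# base URL, BAQ name, and credentials are placeholders.
import base64
import json
import urllib.request


def baq_endpoint(base_url: str, baq_name: str) -> str:
    # Build the BaqSvc URL; BaqSvc results are not capped at 100 rows.
    return f"{base_url.rstrip('/')}/api/v1/BaqSvc/{baq_name}"


def fetch_baq(base_url: str, baq_name: str, user: str, password: str) -> list:
    # Basic-auth GET; some deployments also need an x-api-key header.
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        baq_endpoint(base_url, baq_name),
        headers={"Authorization": f"Basic {token}",
                 "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    # OData-style responses wrap the result rows in a "value" array.
    return payload["value"]
```

Usage would look something like `rows = fetch_baq("https://myserver/MyInstance", "MyPartsBAQ", "user", "pass")`, where all of those names are hypothetical.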


Wouldn’t that be the same as doing a custom function as well?

I use a server that runs a Python script on a schedule to call the DMT export function from the command line to export a BAQ.

Something like this:

import subprocess

exportProcess = subprocess.call([
    r'C:\Epicor\ERPDT\Client\DMT.exe',  # raw string so the backslashes aren't treated as escapes
    '-User=ENTER USERNAME HERE',
    '-Pass=ENTER PASSWORD HERE',
    '-Export',
    '-BAQ="ENTER BAQ NAME HERE"',
    '-ConfigValue=XXXXX.sysconfig',
    '-NoUI',
    '-DisableUpdateService',
    '-Target="ENTER EXPORT FILE PATH HERE.csv"',
])

You can use REST to call the BAQ service, as some others have said, but I find this faster than iterating through/importing JSON.


What are your limitations with respect to being SaaS MT?

Do you use Net drive?

I believe we’re on DT, not MT. I wasn’t sure what to put for that when I set up my account. We’re on public cloud.

Anyway, net drive as in Azure?

May need to look into this. JSON might be the better use case for what we’re doing.


In the application help, look up Cloud Settings for Bartender. You should be able to request an FTP site to be created, and you can then use net drive to expose the FTP site as a drive on your server. That may help, but both the Python and BAQ service approaches are good solutions. Don’t forget the BAQ export process may be another option; the output location may be an issue, but it should be accessible by using Server File Download.

Makes sense. We are not currently using that. I haven’t had a chance to dig into this yet, but I will today or tomorrow.