BAQ UI & BAQSvc REST API Limitations

Sorry, I don’t understand the question. What limitations are concerning you? What are you trying to do that you fear you won’t be able to do? The BAQ ecosystem is foundational to much of Kinetic and is used by Epicor in its own development.

How many rows can we retrieve with a BAQ query?

How many rows can we retrieve, including child records, through the REST API using BAQSvc?

Are there other limitations I should know about when using BAQs with the REST API?

@Mark_Wonsil Thank you!

I have never personally run into a row limit with BAQSvc. I believe the OData methods do have row limits. There is one surefire way to find out: try it! Create a BAQ on the table with the most records, go to Postman, and run it.
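For reference, a v1 BAQ call takes the form `/api/v1/BaqSvc/{BAQName}/`, and you can cap the rows returned with the standard OData `$top` parameter. A minimal sketch of building such a request URL (the server, instance, and BAQ names below are placeholders, not real endpoints):

```python
from urllib.parse import urlencode

def baq_url(host, instance, baq_name, top=None):
    """Build a v1 BAQSvc request URL; pass top to cap the rows returned."""
    url = f"https://{host}/{instance}/api/v1/BaqSvc/{baq_name}/"
    if top is not None:
        # $top is a standard OData system query option
        url += "?" + urlencode({"$top": top})
    return url

# Placeholder server/instance/BAQ names; GET this URL with basic auth
# or a bearer token, e.g. in Postman.
print(baq_url("myserver", "KineticInstance", "MyTestBAQ", top=1000))
```

From there it is just a matter of raising `$top` (or dropping it) until you find where, or whether, the response stops growing.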

@Mark_Wonsil Thank you for your response!

BAQSvc is fine. Do you know the limit on how many records a BAQ query can return in a single execution?

The designer has a limit, and that can be overridden.

The service itself, as far as I know, has no limit.

The app/web server does have a max, which I believe can be overridden unless you are on cloud,
but you can probably ask for it to be removed if there is a good enough reason.

@Olga, could you elaborate? I believe you had a post on it.

This is the post I was referring to:

@klincecum I know this one

That one is the normal REST API endpoint limit.

But my question is very specific: what is the BAQ query limit, and when I run a BAQ through the REST API, what is the maximum number of records I can get?

NOTE: Yes, we are on a cloud environment.

I just checked my cloud web.config, and I see no limit set except for the max allowed content length,
which for mine is 4294967295.

So, as far as I can tell, there is no limit, except for when you go over the max allowed content length.
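That figure is the maximum value of an unsigned 32-bit integer, which works out to just under 4 GiB. A quick arithmetic check:

```python
max_allowed_content_length = 4294967295  # value from the cloud web.config above

# It's the 32-bit unsigned maximum...
assert max_allowed_content_length == 2**32 - 1

# ...which is just under 4 GiB (1 GiB = 1024**3 bytes)
gib = max_allowed_content_length / 1024**3
print(f"{gib:.2f} GiB")  # → 4.00 GiB
```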

My answer is clear: I have never run into a limit with BAQSvc. I have downloaded thousands of records without issue, though I try not to, since it has generally been to support old-school Excel '97 users. (Coming from me, that's saying something.) There may be a time limit on an API call, however.

As was I.

What is stopping you from testing it out? In the end, I wouldn’t believe what people tell me on a forum. It works or it doesn’t and there’s only one way to know for sure. :person_shrugging:

BTW, as a cloud user I might take a different approach for downloading larger files. I would create the file in the EpicorData folder (company or user) and use the Server File Download endpoint to retrieve the file. It’s a cloudier approach.
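A sketch of that pattern, assuming your server-side process has already written the file under EpicorData. The service and method names below are assumptions for illustration only; check the REST help page on your own instance for the actual download service:

```python
from urllib.parse import quote

def server_file_download_url(host, instance, relative_path):
    """Build a URL for fetching a file previously written under EpicorData.

    NOTE: 'Ice.BO.ServerFileDownloadSvc' and 'DownloadFile' are assumed
    names for illustration; verify the real service and method in your
    instance's REST help before using this.
    """
    return (f"https://{host}/{instance}/api/v1/Ice.BO.ServerFileDownloadSvc/"
            f"DownloadFile?fileName={quote(relative_path, safe='')}")

# Placeholder host/instance/path:
print(server_file_download_url("myserver", "KineticInstance",
                               "company/ExportedRows.csv"))
```

The point of the pattern is that the heavy lifting (building the file) happens server-side, and the REST call only streams a finished file back down.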


@Mark_Wonsil Thank you for your responses!

We are trying to do some new implementations and new projects.

We would like to understand all the possible limitations up front, which is why I am asking all these questions.

I can't really test this myself yet because I don't have much data in my cloud environment, so I'm asking the community what they have run into in the past.

I hope it is clear now why I'm asking these questions.

No, I understand. Sometimes we want answers to solutions that may not be appropriate for the scenario. Give this group your business problem and scenario, and you will find some amazingly smart people (way smarter than I am) with fantastic, clever, and creative ideas.


I was able to pull down 2 million rows, but I couldn't get it to pull 2.5 million.

These were one-column rows containing just a number.

Now I wonder what the filesize limit is for ServerFileDownload? Still 4.29GB? :thinking:

4294967295 is 4GB, which is probably the limit. Go test :slight_smile:

2GB is the limit for Server File Download.


That’s a limit of the System.IO method they are using.

Could be worked around.

Lol, use your brain to figure out how to work with paged data; I don't think that trying to trigger an out-of-memory exception is a good use of it :sunglasses:
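Paging is the standard answer here: rather than one huge pull, request the BAQ in fixed-size chunks with the OData `$top` and `$skip` parameters until a page comes back short. A minimal sketch of that loop; `fetch_page` stands in for whatever HTTP client you use to call BaqSvc:

```python
def fetch_all_pages(fetch_page, page_size=10000):
    """Pull a BAQ in chunks.

    fetch_page(top, skip) -> list of row dicts (i.e. the 'value' array
    from a BaqSvc response, requested with $top=top and $skip=skip).
    Stops when a page comes back shorter than page_size.
    """
    rows, skip = [], 0
    while True:
        page = fetch_page(page_size, skip)
        rows.extend(page)
        if len(page) < page_size:
            return rows
        skip += page_size

# Usage with a fake in-memory "service" of 25 rows and a page size of 10:
data = [{"n": i} for i in range(25)]
fake_fetch = lambda top, skip: data[skip:skip + top]
print(len(fetch_all_pages(fake_fetch, page_size=10)))  # → 25
```

Each page stays well under any content-length cap, and memory use on the client is bounded by how much you choose to keep.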

I was referring to being able to download a larger file.

Sooner or later it all ends with OOM


I’m forgetting :poop: already!