I have a client that would like to work with a very large BAQ (over 200K rows) and perform filters and summaries ad hoc. If I run the BAQ without the test row limit, I can get results in less than 8 seconds. If I use a dashboard and filter those results, it times out.
I was thinking I could use REST to download the raw data to Excel and let Excel do the heavy lifting. But…
I can recreate this in SQL and use a direct SQL connection, but that bypasses a lot of security (and possibly features).
Does anyone have any good ideas here? I am thinking about:
- Compression of the REST data
- Utilizing Paging
- Pulling out my hair
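The first two ideas can be combined in one pull. Here is a minimal Python sketch of paging a BAQ through the REST endpoint while asking the server for gzip compression; the server URL, BAQ name, credentials, and page size are all placeholder assumptions, not values from this thread:

```python
# Sketch: page a large BAQ through Epicor REST with gzip compression.
# BASE, user/pwd, and the page size are hypothetical placeholders.
import base64
import gzip
import json
import urllib.parse
import urllib.request

BASE = "https://your-server/api/v1/BaqSvc/MyLargeBAQ"  # assumed endpoint shape

def page_url(base, skip, top):
    """Build an OData-style paged URL using $skip/$top."""
    return f"{base}?{urllib.parse.urlencode({'$skip': skip, '$top': top})}"

def fetch_all(base, user, pwd, page_size=10000):
    """Pull every page, decompressing gzip responses when the server sends them."""
    auth = base64.b64encode(f"{user}:{pwd}".encode()).decode()
    rows, skip = [], 0
    while True:
        req = urllib.request.Request(page_url(base, skip, page_size), headers={
            "Authorization": f"Basic {auth}",
            "Accept": "application/json",
            "Accept-Encoding": "gzip",  # ask for a compressed payload
        })
        with urllib.request.urlopen(req) as resp:
            raw = resp.read()
            if resp.headers.get("Content-Encoding") == "gzip":
                raw = gzip.decompress(raw)
            batch = json.loads(raw).get("value", [])
        rows.extend(batch)
        if len(batch) < page_size:  # short page means we reached the end
            return rows
        skip += page_size
```

Paging keeps each response small enough to dodge the timeout, and gzip typically shrinks repetitive tabular JSON considerably.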
Maybe use Service Connect to drop the results as a file via FTP and then consume it?
You have a good idea, although Service Connect would not be needed. I could save it as a CSV instead using BAQ Export, and then use that file as a data source…
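Once the BAQ Export drops a CSV, the filter/summary work can happen entirely client-side. A minimal sketch, assuming hypothetical column names (`Part`, `Qty`) that would depend on the actual export:

```python
# Sketch: ad-hoc group-and-sum over a BAQ Export CSV.
# Column names and the sample data are illustrative assumptions.
import csv
from collections import defaultdict
from io import StringIO

def summarize(csv_text, group_col, value_col):
    """Sum value_col per distinct group_col value across all rows."""
    totals = defaultdict(float)
    for row in csv.DictReader(StringIO(csv_text)):
        totals[row[group_col]] += float(row[value_col])
    return dict(totals)

sample = "Part,Qty\nA,1\nA,2\nB,5\n"
# summarize(sample, "Part", "Qty") -> {"A": 3.0, "B": 5.0}
```

For real use you would open the exported file instead of a string, and Excel's Power Query can do the same thing against the CSV with no code at all.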
Is it an internal application? That certainly opens up a lot of doors for moving the data around.
Yes. They simply need to be able to see the data easily and quickly.
You can always create a normal SSRS report that just uses a SQL connection to the DB. You can then use the OData feed to pull the data from the SSRS report. This is the same method that XL Connect 5 used.
I think it’s really time that they add additional destinations for BAQ exports as well, like Azure Data Lake.
The SSRS idea could work. I can save that output as Excel, which works better than CSV.
Jason Woods
http://LinkedIn.com/in/jasoncwoods