I’ve been creating some reports with Power BI for a little while now, and it’s all been good. I’ve been using the SQL data source for this, and it works very well.
With the recent announcement of the cloud transition coming in a few years, I thought it would be a good idea to build any new reports with the appropriate data retrieval method, in case it needs to be different.
For those of you already on cloud, how are you handling your connections to Power BI for data retrieval? Are you using SQL data sources, or REST web or OData feeds?
I’ve been having problems testing REST web and OData feeds to get the BAQs I’ve been using. I’m not sure if I’m doing something wrong in the setup, or what specifics I might be missing when making the request for the data. If you have documentation or an example on this, that would be helpful.
I’m currently getting “access to the resource is forbidden”.
My current method of data retrieval (SQL stored-procedure query) likely won’t work in the cloud, so I most likely need to move forward with REST unless someone can tell me otherwise.
REST and OData all the way. Make sure you create an API key in Epicor, make sure your API key has a scope, and make sure the data you need is in that scope. Most of my data comes from BAQs, so I just add the BAQs to the scope list. Make sure you use API v2, and test your calls with the REST helper: [REST API Help](https://YOURSERVER.epicorsaas.com/YOURINSTANCE/apps/resthelp/#/home).
I’ve always had bad results with the v1 API, and v2 requires both an API key and credentials. I set up a new Epicor user and enabled “integration account” for that user.
Of course, this leaves my API key and creds kind of hard-coded into Power BI, but I’m no security expert, so this is the best I can do.
Getting the credential incantations sorted is the most annoying, clunky part. The rest is just normal tedium: moving SQL to BAQ and so on. Here’s a snippet I keep around for getting started, with a parameter example, because those trip folks up sometimes:
```m
_fromDate = Date.ToText(Date.<do some date stuff>, [Format = "MM-dd-yyyy"]),
_throughDate = Date.ToText(Date.<do some date stuff>, [Format = "MM-dd-yyyy"]),
Source = Json.Document(Web.Contents(
    "https://centralusdtapp<server id>.epicorsaas.com/SaaS<instance id>/api/v2/odata/<company number>/BaqSvc",
    [
        RelativePath = "<baq name>/Data",
        Query = [
            FromDate = _fromDate,
            ThroughDate = _throughDate
        ],
        Headers = [Authorization = "<auth string>", #"X-API-Key" = "<key>"]
    ]
))
```
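For anyone newer to M: the snippet above is the body of a `let` expression, not a complete query on its own. A full query might look like this sketch; the server/instance/company IDs, BAQ name, and the trailing-30-day date logic are placeholders I made up, so substitute your own:

```m
let
    // Hypothetical date range: last 30 days through today.
    _fromDate = Date.ToText(Date.AddDays(Date.From(DateTime.LocalNow()), -30), [Format = "MM-dd-yyyy"]),
    _throughDate = Date.ToText(Date.From(DateTime.LocalNow()), [Format = "MM-dd-yyyy"]),
    Source = Json.Document(Web.Contents(
        "https://centralusdtapp00.epicorsaas.com/SaaS000/api/v2/odata/100000/BaqSvc",
        [
            RelativePath = "MyBaq/Data",
            Query = [FromDate = _fromDate, ThroughDate = _throughDate],
            Headers = [Authorization = "<auth string>", #"X-API-Key" = "<key>"]
        ]
    ))
in
    Source
```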
I’ve seen examples with clear-text credentials too. This example uses an auth string encoded from an account’s credentials, still just basic auth with a dash of obscurity to steer honest colleagues away from fooling around. Handle credentials in whatever way is best for you.
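If you want to build that auth string yourself rather than grab it from the REST helper, basic auth is just `user:password` Base64-encoded with a `Basic ` prefix. A minimal M sketch (the user and password here are made up, and hard-coding them in query text carries the same caveats mentioned above):

```m
let
    user = "integration-user",   // hypothetical integration account
    pass = "its-password",
    // Base64-encode "user:pass" and prefix with "Basic " for the Authorization header.
    authString = "Basic " & Binary.ToText(Text.ToBinary(user & ":" & pass), BinaryEncoding.Base64)
in
    authString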
In any case, keep those access scopes tight. Integration accounts don’t need maintenance except when reporting changes, so there’s very little cost to having a whole fleet of them, each able to access only one PBI report’s or workspace’s BAQs.
This worked for me, FYI. And when the prompt for authentication comes up, you need to choose Anonymous, since the auth is already included in the headers.
After seeing some other comments, I’m still wondering if this REST approach will let me easily switch the URL when moving to the cloud. I’m not sure if the other options mentioned (Fabric, etc.) are right for our company size; we only have a handful of reports so far. Maybe some more things for me to look into. Appreciate the tips.
Switching the URL is easy; that’s actually why I split off RelativePath. The URL root is static, mostly. A BAQ only requires updating one bit, unless there are parameters, and those aren’t bad. Whatever you get working, copy it off to a snippet somewhere with the important bits arranged somewhat obviously, and it’s just a matter of cut and paste from here on.
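One way to make that cut-over even easier: keep the URL root in its own standalone query (or a Power BI parameter) and reference it from every report query. Sketch, with a made-up server root:

```m
// In a standalone query named BaseUrl (made-up server/instance/company IDs):
// "https://centralusdtapp00.epicorsaas.com/SaaS000/api/v2/odata/100000/BaqSvc"
let
    // Auth headers omitted for brevity; same Headers record as the earlier snippet.
    Source = Json.Document(Web.Contents(BaseUrl, [RelativePath = "MyBaq/Data"]))
in
    Source
```

Then a cloud move (or a test-vs-production switch) means editing BaseUrl once instead of touching every query.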
And don’t discount @NateS’s note! Power Query in Excel uses M just like Power BI. You can literally paste the same M between Excel and PBI, and Excel is super quick and powerful for drilling into the outputs of your M.
/edit, random thought…
YAGNI doesn’t get the time in the spotlight it deserves. It’s good to know the options, and a virtue to hold off on implementation until you’re actually gonna need it.
Yes, we’ve all heard the “how do I get this Power BI data into Excel” question. It’s a bit of a swear word in the Power BI community, I think. I thought @NateS was being sarcastic.
Fair point! I use it (Excel Power Query) a ton too for taking tabular data, sucking it up into Power Query, and visualizing or pivoting from there, etc.
My post kinda got overtaken by different conversations here.
Are you on Epicor on-prem or cloud currently? How are you planning to use Power BI the same way with the cloud, or how are you currently managing this? Do you publish your reports to the Power BI service, or just use the desktop client?
Currently, for us on-prem, we have the on-premises data gateway as the middleman that gets our data to published Power BI reports. I’m confused about how that would still work with Epicor cloud. Would we still host the gateway on-prem to move everything from Epicor cloud so the published reports continue to work?
I split out the other mumbo jumbo as best I could into another thread. Sorry for being part of the distraction, and thanks for the callout, @passenger88!