Spin-off of Power BI vs. other shtuff

I know that “AI Slop” is frowned upon, but I’m going to be transparent here and say I got this from Copilot because I didn’t have the time to go find academic articles and whatnot. I just want to get you an answer.

For cloud, you can have a live copy of your database that you can query directly, etc. At that point, I’d consider using Fabric if you are already happy with Power BI. Below is the AI answer to a prompt about Fabric + BI.

Here’s a clear, practical explanation of why Microsoft Fabric makes more sense than querying directly into Power BI, especially from the perspective of someone managing data pipelines, governance, and performance.


:white_check_mark: Why Microsoft Fabric Instead of Querying Directly into Power BI?

Microsoft Fabric solves several long‑standing limitations of “just connecting Power BI to data sources” and brings major advantages across performance, governance, scalability, and cost efficiency.

Below is a breakdown in plain language.


:small_blue_diamond: 1. Fabric centralizes data into a single, governed lake (OneLake)

Power BI can query data directly from your ERP or databases, but:

  • You end up with data silos (Power BI datasets vs data warehouse vs data lake).
  • Each dataset becomes its own little island.
  • No unified governance or security model.

With Fabric, all compute workloads (BI, lakehouse, warehouse, real-time, ML) operate over OneLake, meaning:

  • One copy of the data
  • Reused by multiple reports and workspaces
  • Consistent security & lineage

In short: Fabric makes Power BI part of a unified analytics stack instead of a standalone reporting tool.


:small_blue_diamond: 2. Better performance than DirectQuery

DirectQuery connects Power BI straight to the transactional source (D365, SQL, etc.). Common issues:

  • Slow visuals due to live queries
  • Heavy load on the source system (bad for ERP performance)
  • Complex DAX often breaks or runs unpredictably
  • Limited modeling capabilities

Fabric improves this by:

  • Storing optimized, compressed parquet files in OneLake
  • Allowing Hybrid Tables, where hot data is real‑time but historical data is cached
  • Offloading query load from transactional systems

Result: Faster reports + less risk to production systems.


:small_blue_diamond: 3. ETL/ELT is easier, more scalable, and reusable

Power BI isn’t an ETL tool.
Fabric is.

It gives you built‑in:

  • Dataflows Gen2
  • Notebooks (Python, Spark)
  • Pipelines
  • Data Warehouse SQL engine
  • Lakehouse SQL & Delta tables

Instead of pushing transformation logic into Power BI (Power Query), you centralize it in Fabric and reuse it everywhere—BI, ML, APIs, other apps.
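To make the "centralize it and reuse it" point concrete, here is a minimal sketch of the kind of transformation logic you might move out of Power Query and into a Fabric notebook. The table shape and column names (`Account`, `Amount`) are made up for illustration; in a real Fabric notebook you would typically run this with PySpark or pandas against Lakehouse/Delta tables rather than plain Python lists.

```python
# Hypothetical sketch: transformation logic kept in one place (a notebook)
# instead of being duplicated inside each report's Power Query steps.
from collections import defaultdict

def summarize_by_account(rows):
    """Aggregate raw transaction rows into per-account totals."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["Account"]] += row["Amount"]
    # Emit one clean, report-ready row per account.
    return [{"Account": acct, "Total": total}
            for acct, total in sorted(totals.items())]

raw = [
    {"Account": "4000", "Amount": 120.0},
    {"Account": "4000", "Amount": 80.0},
    {"Account": "5000", "Amount": 35.5},
]
curated = summarize_by_account(raw)
```

Every report, ML job, or API that needs the per-account totals then reads the one curated table instead of re-implementing the aggregation.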


:small_blue_diamond: 4. Storage separation reduces costs

Power BI’s Import mode stores data inside each dataset.

Problems:

  • Duplicate storage across workspaces
  • Large semantic models = expensive capacity
  • Hard to manage retention & lifecycle

Fabric separates storage from compute, so:

  • Storage = cheap OneLake files
  • Compute = pay only when processing
  • The same data can feed multiple models without duplication

This is especially useful for ERP (Dynamics, SAP, etc.) where tables are massive.


:small_blue_diamond: 5. Better governance, security, and lifecycle management

Power BI alone has:

  • Limited data lineage
  • Limited versioning
  • Hard‑to-control governance
  • Dataset sprawl

Fabric provides:

  • End-to-end lineage
  • Workspace & item-level governance
  • Security inheritance from OneLake
  • Git integration for version control
  • Data sharing without copying

This translates into cleaner administration and fewer headaches as the environment grows.


:small_blue_diamond: 6. Fabric turns Power BI into a fully enterprise-grade analytics platform

Think of Power BI as the visualization layer.

Fabric adds:

  • Data engineering
  • Data science
  • Real-time analytics
  • Data warehousing
  • Lakehouse architecture

All in the same tenant, UI, security model, and compute engine.

Instead of bolting Power BI onto various external data platforms…
Fabric unifies them into one Microsoft-native solution.


:blue_square: Summary: When Fabric Makes More Sense

Microsoft Fabric is better than querying directly in Power BI when:

:check_mark: You have large datasets
:check_mark: You need performance & scalable pipelines
:check_mark: You want to avoid overloading ERP/OLTP systems
:check_mark: You need governance, lineage, and repeatability
:check_mark: You want to use the same data for BI, ML, and analytics
:check_mark: You want to reduce duplication and capacity costs

For small/simple scenarios, direct Power BI queries can still be fine.
But for anything enterprise‑scale, Fabric is the intended solution.

2 Likes

Let me add to Utah’s thoughts.

For immutable records (Any G/L ledger or subledger), why would I requery those again and again? :person_shrugging:

Moving this data to some storage in a BI-friendly format (Iceberg, Parquet, Avro, etc.) would make your queries faster and reduce the time to extract the data. It never changes. You would only extract the new records as needed.
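The "only extract the new records" idea is just a high-water-mark extract. Here is a sketch using an in-memory SQLite table as a stand-in for the ERP source; the table and column names (`GLJrnDtl`, `JournalNum`) are illustrative, not the real schema, and in practice the extracted rows would be appended to Parquet/Iceberg files rather than held in a Python list.

```python
# Watermark-based extract: immutable ledger rows are pulled once,
# and later runs only fetch rows past the last-seen watermark.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE GLJrnDtl (JournalNum INTEGER, Amount REAL)")
conn.executemany("INSERT INTO GLJrnDtl VALUES (?, ?)",
                 [(1, 100.0), (2, 250.0), (3, -40.0)])

def extract_new_rows(conn, watermark):
    """Pull only ledger rows above the last-seen journal number."""
    cur = conn.execute(
        "SELECT JournalNum, Amount FROM GLJrnDtl "
        "WHERE JournalNum > ? ORDER BY JournalNum",
        (watermark,),
    )
    return cur.fetchall()

# First run: nothing extracted yet, so everything is "new".
batch = extract_new_rows(conn, watermark=0)
watermark = max(j for j, _ in batch)   # remember the high-water mark

# Later run: only rows added since the last extract come back.
conn.execute("INSERT INTO GLJrnDtl VALUES (4, 75.0)")
new_batch = extract_new_rows(conn, watermark)
```

Because the ledger rows never change, you never have to re-read the old ones.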

Also, moving the data to another place makes it easier to combine with other sources of immutable data from other systems: MES, CRM, etc.

6 Likes

Thanks Mark, glad that you added those important points. So many good reasons to build out Fabric or something of the sort.

I think Fabric is going to be amazing…someday. All the podcasts/blogs/etc. I hear say Fabric is a great concept, but it is not as well developed as Snowflake or other BI systems. They love the potential but are waiting for it to come together.

2 Likes

Excel all the way, baby!
:rofl:

9 Likes

Thanks for your feedback, Mark. Should we hold off on adopting for now?

I have no experience with Snowflake or Databricks right now.

You can start small? You have the option of creating a Warehouse or a Data Lake, and nobody seems to have guidance on when to do which, how hard it is to change later, etc. :person_shrugging:

1 Like

Thanks Mark! I am working with a firm that specializes in data strategy and will share whatever I learn about creating a warehouse or data lake and guidance, etc. Feel free to throw any other questions my way so I can keep them in mind.

2 Likes

We have been using Azure Data Factory with Power BI and some SQL views that pull the data in from Epicor. Seems to work OK.

3 Likes

Thanks, that was really well explained.
I’ve had a good run with Power BI in my previous companies using on-prem iScala and Epicor, so it was honestly a bit sad to see how Power BI works with Epicor SaaS. There’s always some delay, which just doesn’t work when you need real-time dashboards and visuals.

We also don’t have an EDA server in our region, and we’re not really interested in setting one up.

Epicor Data Fabric sounds like it might be the piece I’ve been missing. Curious to hear from you — how much delay does it actually have, and how good is it in real-world use?

1 Like

It’s Microsoft Fabric; it is not an Epicor product. Real-time data may be hard to get from Epicor itself… may I ask what activity in the ERP system needs real-time data for your operations?

2 Likes

We’ve played with Fabric. It’s a powerful tool, but it has a steep learning curve. Maybe not as steep as a purely code-based solution like Snowflake, but it’s still mostly code connected with widgets. Get your Python on! That’s really the only way we found to process the data. But the advantage is you can make tables that make more sense and are much more efficient for reports dealing with large data sets, so you can do analysis without having to wait 2 minutes to re-run a query every time you want to change one parameter, like you would with the raw tables in SQL.

Basically, you can schedule “pipelines,” which are widget-based tasks that run things sequentially: get data from a SQL server, copy data into tables, process that data, and then, once you have those processed tables, use Power BI with them to serve reports. How often it refreshes is up to you and how you schedule it. What you are doing with the data, and how efficiently you load and process it, will dictate how close to real time you can get that data.
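The sequential pipeline idea above can be sketched in plain Python. The step names and the in-memory "lakehouse" dict are illustrative only; in Fabric these would be pipeline activities (copy data, notebook, etc.) running against real storage.

```python
# Minimal sketch of a pipeline: ordered steps that extract and land raw
# data, then transform it into a report-ready table for Power BI.
def extract(lakehouse):
    # Stand-in for "copy data from SQL server" -- hardcoded sample rows.
    lakehouse["raw"] = [("WIDGET", 2), ("WIDGET", 3), ("GADGET", 1)]

def transform(lakehouse):
    # Stand-in for a notebook step that aggregates the raw rows.
    totals = {}
    for part, qty in lakehouse["raw"]:
        totals[part] = totals.get(part, 0) + qty
    lakehouse["curated"] = totals          # the table Power BI would read

lakehouse = {}
for step in (extract, transform):          # steps run in order, like a pipeline
    step(lakehouse)
```

The refresh cadence is just how often you re-run the loop; the freshness of `curated` can never beat that schedule.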

The biggest problem with it is that it’s not very intuitive to use, and it can be quite buggy. (For example, we still can’t figure out a great way to do incremental loads of data with Epicor, because it won’t recognize sysRevID. It will only use a date column for that, but that doesn’t help when a row can change.) When someone like Brent Ozar blasts it over and over on LinkedIn, you know it has fundamental problems. But like @Mark_Wonsil said, it has huge potential if they can make it more usable.
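If the built-in incremental refresh only watermarks on a date column, one workaround is to hand-roll the watermark on sysRevID yourself. Here is a sketch in plain Python; the source rows, the `OrderNum` key, and the column names are illustrative, not the real Epicor schema.

```python
# Hand-rolled incremental merge keyed on sysRevID: rows whose revision is
# past the watermark are upserted, which also catches changed rows whose
# date column would not have moved.
def incremental_merge(target, source_rows, last_rev):
    """Upsert rows newer than last_rev; return the new high-water mark."""
    high = last_rev
    for row in source_rows:
        if row["SysRevID"] > last_rev:
            target[row["OrderNum"]] = row    # insert, or overwrite changed row
            high = max(high, row["SysRevID"])
    return high

target = {}
source = [
    {"OrderNum": 1, "Qty": 5, "SysRevID": 100},
    {"OrderNum": 2, "Qty": 3, "SysRevID": 101},
]
wm = incremental_merge(target, source, last_rev=0)

# Row 1 changes later: its SysRevID advances even if its date does not.
source.append({"OrderNum": 1, "Qty": 9, "SysRevID": 102})
wm = incremental_merge(target, source, last_rev=wm)
```

The trade-off is that you own the watermark bookkeeping instead of letting the platform manage it.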

4 Likes

Thanks for sharing, @Banderson. Glad to know that I am not the only one looking at it/leveraging it. I put a few different BI tools in front of our users, including Grow, Blue Sky, Fabric, Power BI, etc.

Yeah. I think the biggest takeaway for someone who doesn’t know what it is: Fabric is an extension of Power BI. Fabric is a tool to gather data from ERP and any other data source you want, then asynchronously transform that data into something Power BI can digest more easily and quickly, and from that point forward you are just using Power BI. It doesn’t replace Power BI.

3 Likes

Exactly, well said.

It’s when you’ve created 10 reports, all doing different levels of ETL from 10 different BAQs, and each BAQ has calculated fields named different things but sharing the same formula, and someone asks you to update the formula for X. Now you need to backtrack to figure out which BAQ has that custom field, what the change is going to do, etc.

I couldn’t agree more about holding off on stuff until you need it, but in BI, I didn’t realize I needed it until it was too late. And the idea of having to redo all of our pipelines, queries, etc. when I could have just done it right at the start… that’s the hard part.

Just my two cents with BI. I’m not an expert, still learning more, but it only gets more complex the more your data needs grow. That’s also why I posted the thing from AI saying “For small/simple scenarios, direct Power BI queries can still be fine.”

1 Like

A hint of sarcasm, but really, I use Power Query for a lot. Power BI was too complicated. Then when I went to share stuff, I couldn’t share anything without an extra license. Excel does it all. And it’s good ole Excel, what’s not to like!?

2 Likes

M$ licensing is the WORST

3 Likes

:100:

1 Like

This is why I haven’t looked hard into Fabric. I fear its cost is going to be high, and we are only small, so we will likely stick to Power BI with BAQs.