
Making Power BI Refresh Faster: Folding, Profiling, and Incremental Refresh

  • Writer: jacob matuzevicius
  • Nov 9
  • 3 min read

If you’ve worked with Power BI for a while, you eventually hit the same issue every developer does: dataset refreshes that take forever. When that happens, most people start tweaking DAX measures, but more often than not the real slowdown happens before any DAX runs - it’s in how Power BI queries your data source.

Three techniques usually make all the difference: query folding, SQL Profiler, and incremental refresh.


Query Folding


This is where performance tuning begins. When you build steps in Power Query, Power BI tries to push those transformations down to the database so they’re executed there - this process is called query folding. In other words, instead of pulling all the data into Power BI to filter and join locally, it sends optimised SQL statements for the database to handle directly.
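As a minimal sketch (server, database, and table names here are hypothetical), a query built from only foldable steps like these collapses into a single SQL statement - roughly a SELECT with the column list and WHERE clause applied at the source:

```m
let
    // Hypothetical source - folding works the same way for most relational connectors
    Source   = Sql.Database("ServerName", "SalesDB"),
    Orders   = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Both steps fold: the database does the filtering and column trimming
    Filtered = Table.SelectRows(Orders, each [Amount] > 100),
    Selected = Table.SelectColumns(Filtered, {"OrderID", "OrderDate", "Amount"})
in
    Selected
```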


But folding isn’t guaranteed to persist. A single extra column, a custom transformation, or a merge done at the wrong point can silently stop folding. When that happens, Power BI downloads more data than it needs, filters it itself, and your refresh time explodes.


You can check for folding by:


  • Right-click a step in Power Query → View Native Query

  • If the option is greyed out, folding stopped at or before that step


A practical approach is to keep your initial transformations “foldable” - filters, joins, and column selections - only then introduce non-foldable operations once you’ve narrowed the data down to a small subset.
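That ordering can be sketched in M like this (names are hypothetical; Table.AddIndexColumn stands in for any non-foldable operation, since it typically stops folding against SQL sources):

```m
let
    Source  = Sql.Database("ServerName", "SalesDB"),
    Orders  = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Foldable first - the database filters and trims the columns
    Recent  = Table.SelectRows(Orders, each [OrderDate] >= #date(2024, 1, 1)),
    Narrow  = Table.SelectColumns(Recent, {"OrderID", "Amount"}),
    // Non-foldable last - this step runs locally in the mashup engine,
    // but only over the already-reduced subset of rows
    Indexed = Table.AddIndexColumn(Narrow, "RowIndex", 1, 1)
in
    Indexed
```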


Using SQL Profiler


If you’ve never looked at how Power BI communicates with your database, SQL Server Profiler is a great window into that world. It lets you watch every query Power BI sends, how long each takes, and how often they’re repeated.


Spending a few minutes with Profiler can uncover all kinds of hidden inefficiencies - unnecessary queries, repeated lookups, or tables missing indexes. It’s especially valuable when you’re using DirectQuery or working with large imports, where query behaviour directly impacts user experience.


Incremental Refresh


Incremental refresh doesn’t sound glamorous, but it’s one of the biggest quality-of-life improvements in Power BI. Rather than reloading an entire fact table during each refresh, you can refresh only the newest data - say, the past week - while keeping historical data static. That alone can cut refresh times by orders of magnitude.


Setting it up is straightforward: parameterise your date range, configure the incremental refresh policy in table settings, and Power BI takes care of partitioning the data behind the scenes.
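The parameterisation step looks like this in M. RangeStart and RangeEnd are the exact DateTime parameter names Power BI’s incremental refresh policy requires; the server and table names are hypothetical, and the sketch assumes OrderDate is a datetime column:

```m
let
    Source   = Sql.Database("ServerName", "SalesDB"),
    Sales    = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Use >= on one bound and < on the other so no row can ever
    // land in two partitions
    Filtered = Table.SelectRows(
        Sales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

Because Table.SelectRows folds, each partition’s refresh sends the database a query for only its own date slice.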


If you’re using Microsoft Fabric, incremental refresh also integrates neatly with real-time data features - you can cache historical data and stream recent transactions in parallel, giving you a hybrid model with both speed and freshness.

For those who want to dig deeper, you can use SQL Profiler or SQL Server Management Studio (SSMS) to inspect the partitions created by incremental refresh. This lets you verify exactly how Power BI divides your data and confirm that only the correct partitions are processed during each refresh. It’s a powerful way to understand what’s happening under the hood.
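One way to do this, once you’re connected to the dataset’s XMLA endpoint in SSMS, is to query the Analysis Services TMSCHEMA_PARTITIONS DMV (a sketch - column availability can vary by server version):

```sql
-- Lists each partition, the date-sliced query behind it,
-- and when it was last processed
SELECT [Name], [QueryDefinition], [RefreshedTime]
FROM $SYSTEM.TMSCHEMA_PARTITIONS
```

After a refresh, the RefreshedTime values show that only the recent partitions were reprocessed while the historical ones were left alone.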


Memory Management and Fabric SKUs


All these optimisations feed into one critical factor: memory consumption.

Every merge, filter, and non-foldable step consumes capacity memory during refresh. As your model grows, those inefficiencies add up - and before long, you’ll need to scale up to a higher Fabric SKU just to keep refreshes from failing.

Often, though, you can avoid that cost. A model that folds queries properly, minimises history reloads, and keeps only the necessary columns can comfortably run on a smaller SKU. That can translate to thousands in saved capacity costs each year without sacrificing any analytical value.


You can monitor your model’s resource use through the Fabric Capacity Metrics app or Activity Logs. Watching those metrics gives you the evidence to decide whether you truly need more capacity — or just a cleaner model design.


Final Thoughts


Improving Power BI refresh performance isn’t about one single trick. It’s about combining folding awareness, profiling insight, and incremental refresh strategy to keep your data pipeline efficient end-to-end. Done right, you not only get faster refreshes but also a more cost-effective, scalable analytics environment.

 
 
 
