
Learning Series: Maximizing Cost Efficiency in Fabric Premium Capacity

  • RedCloud
  • 5 days ago
  • 6 min read

Note: While this post focuses on Fabric Premium Capacity, the principles also apply to legacy Power BI Premium Capacity. If your organization uses only Premium-per-User licenses, this guide may not be relevant for you.

Many organizations turn to Power BI for self-service Business Intelligence — and for good reason. Power BI offers a powerful, intuitive platform to transform raw data into insightful reports quickly. With its ease of connecting data, creating metrics, and building compelling visuals, it empowers users to tell their data stories efficiently.


But here’s the catch: managing Premium Capacity can become a costly endeavor. Without careful optimization, your organization might find itself shelling out an extra $5,000 every six months just to keep up with the growing demand for Premium Capacity.



Why Is Fabric Premium Capacity So Expensive?

Fabric Premium Capacity is a valuable but pricey resource. Unless everyone in your organization has a Premium-per-User license, Premium Capacity is necessary. This blog will walk you through practical ways developers can optimize their use of Fabric Premium Capacity and potentially save your organization thousands on its Microsoft bill each month.


What Is Fabric Premium Capacity?

When you interact with a Power BI report — opening, refreshing, or filtering it — compute power is required to process those actions. If you’re working locally in Power BI Desktop, your computer handles the workload, along with some assistance from connected data sources. However, once you publish a report to an online Workspace, your local machine can no longer provide this processing power.


This is where Fabric Premium Capacity steps in. Purchasing Premium Capacity allows your organization’s Workspaces to tap into Microsoft’s cloud computing resources, hosted in data centers around the globe (often in remote or secure locations).


With Fabric Premium Capacity, your reports can be refreshed, queried, and accessed anytime, anywhere — but just like your laptop can only handle so many Excel files before slowing down or crashing, Premium Capacity has its limits. These limits depend on the amount of capacity your organization purchases.

For detailed pricing information, see Microsoft’s official Fabric Capacity Pricing page.


Why Are You Probably Buying More Capacity Than You Need?

Every activity in the Power BI Service consumes capacity, but two operations in particular, Refreshes and DAX Queries, tend to consume far more capacity than necessary when left unoptimized.


In the sections below, we’ll explore simple strategies to identify the root causes of Premium Capacity overuse and offer easy fixes. These methods can help your developers reclaim wasted capacity, improving performance and reducing costs, and freeing up resources to support more reports or users without upgrading your plan.


Refreshes: Key Areas to Optimize for Premium Capacity Savings

When it comes to optimizing Power BI Semantic Model refreshes, there are three easy but impactful targets to focus on:

  • Unused Columns

  • Unused Tables

  • Non-Aggregated Tables


At first glance, tackling unused columns and tables might seem overwhelming. Many organizations have reports and data models that were built years ago, have passed through multiple owners, and aren’t fully understood by current developers. Sound familiar? You’re not alone!


The good news is that you don’t need to painstakingly review every single table and column in your model to identify the problem areas.


Instead of manually digging through every detail, you can analyze the model’s metadata within the VertiPaq engine file to pinpoint the largest elements consuming the most capacity during refreshes. This approach helps you target the biggest resource hogs without getting lost in unnecessary complexity.


Let’s put this into context with an example. Suppose an organization has a report that shows how much Premium Capacity its other Power BI reports consume. Without even opening the report itself, we can quickly scan the model in DAX Studio to understand its metrics.
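If you prefer to query this metadata directly, the same information that powers DAX Studio’s VertiPaq Analyzer view is exposed by the engine itself. Here’s a minimal sketch using the DAX INFO functions (available in recent engine versions; the column names here follow the underlying DISCOVER_STORAGE_TABLE_COLUMNS DMV, so treat them as assumptions to verify against your engine):

    // Sketch: run as a DAX query (e.g., in DAX Studio) against the model.
    // Lists columns with their dictionary sizes, largest first.
    EVALUATE
    SELECTCOLUMNS (
        INFO.STORAGETABLECOLUMNS (),
        "Table", [DIMENSION_NAME],
        "Column", [COLUMN_ID],
        "DictionarySize", [DICTIONARY_SIZE]
    )
    ORDER BY [DictionarySize] DESC

Sorting by dictionary size surfaces the heaviest columns first, which is essentially what we’re about to read off the VertiPaq Analyzer screenshot.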



There’s a lot to take in from this screenshot, but let’s focus on two key columns: Name and %DB. The Name column lists each table (with a dropdown to see its columns), while %DB shows the size of the table or column relative to the entire model.


Two tables stand out as particularly large: CUDetail_Primay and TimePoints_Generated. Digging into TimePoints_Generated, the TimePoint column alone accounts for 23.45% of the model’s size. This quick analysis took just minutes and already highlights major contributors to Premium Capacity use during refreshes.


From here, we can target only these large tables and columns instead of combing through everything. If we identify even one or two large, unused elements to remove without impacting the report, we shrink the model and reduce the Premium Capacity needed for refreshes.


Now, suppose TimePoint is essential, for example, for a visual showing Premium Capacity usage by the hour. Let’s examine the table itself:



The ‘TimePoint’ column has a value for every 30-second interval throughout the day, resulting in many more rows than needed. Since we only need hourly data, that’s 120 rows for every one we actually use. This is a perfect example of a non-aggregated table causing high capacity consumption during our Refresh. To fix this, we can aggregate the data by hour at the source or use the existing ‘Start of Hour’ field instead. Either way, the table shrinks to a small fraction of its current size.
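To see how much is at stake before changing anything, a quick DAX query can compare the row counts at each grain. A minimal sketch, assuming only the table and the ‘Start of Hour’ field mentioned above:

    // Sketch: run as a DAX query (e.g., in DAX Studio).
    // Compares row counts at the 30-second grain vs. the hourly grain.
    EVALUATE
    ROW (
        "Rows at 30-second grain", COUNTROWS ( TimePoints_Generated ),
        "Rows at hourly grain",
            COUNTROWS ( DISTINCT ( TimePoints_Generated[Start of Hour] ) )
    )

If the first number is roughly 120 times the second, aggregating to the hour removes that multiple from every refresh.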


These three optimization targets also apply when optimizing queries, which we’ll cover next.


DAX Queries: Why Slow Reports Drain Premium Capacity

If you’ve used Power BI, you’ve probably encountered slow-loading reports. These are usually caused by heavy DAX queries. Every visual element (bars, points, maps, matrices) is populated by a DAX query that takes the visual’s fields and filters as inputs and fetches the results from the Semantic Model.


Sounds straightforward, but when you layer in multiple measures, relationships, and filters, these queries can balloon in complexity. Power BI does some optimization automatically, but there’s still more you can do to reduce Capacity use.


The main focus here is on measures—the calculations developers create to display metrics like “total sales” or “most recent purchase date.” While many developers write measures to get the right result quickly, they often overlook performance.
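To make “overlooking performance” concrete, here’s one small hypothetical illustration (the Sales table and Amount column are invented for this example, not taken from the report). Both measures return the same number, but the first forces the engine to iterate every row of the table, while the second lets the storage engine apply a simple column filter:

    // Hypothetical: slower pattern, iterates the whole Sales table row by row.
    Total Positive Sales (slow) =
    SUMX ( FILTER ( Sales, Sales[Amount] > 0 ), Sales[Amount] )

    // Hypothetical: faster pattern, the filter is applied to a single column.
    Total Positive Sales (fast) =
    CALCULATE ( SUM ( Sales[Amount] ), KEEPFILTERS ( Sales[Amount] > 0 ) )

Patterns like this are worth knowing, but you don’t need them memorized to find the expensive measures in the first place.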


This section won’t dive into DAX best practices, but will provide a high-level way to spot inefficient measures and optimize your Semantic Models to save Premium Capacity.


Let’s say we have a stacked column chart in the same report, showing Fabric Premium Capacity usage broken down by day:



This visual delivers the needed info—Capacity Consumption by day, broken down by operation type. But users report it takes over 2 minutes to load. Checking the visual’s setup, we find it pulls data from just one table, using a few columns and a couple of measures:
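For context, a visual like this sends the model a query shaped roughly like the sketch below. Except for DayOverDayPercentChange, the table, column, and measure names are assumptions, not the report’s actual fields:

    // Sketch: approximate shape of the chart's auto-generated query.
    EVALUATE
    SUMMARIZECOLUMNS (
        MetricsByItemandOperationandDay[Date],
        MetricsByItemandOperationandDay[Operation],
        "Capacity Units", [Total CU],              // hypothetical measure
        "DoD % Change", [DayOverDayPercentChange]  // the tooltip measure
    )
    ORDER BY MetricsByItemandOperationandDay[Date]

You can see any visual’s real query via the “Copy query” option in Performance Analyzer.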


From our earlier VertiPaq analysis, we know the table MetricsByItemandOperationandDay isn’t very large, so the columns likely aren’t causing the slow load. That shifts our focus to the measures.


We spot that the DayOverDayPercentChange measure is a time calculation, which can be resource-intensive. Plus, it’s only used in the tooltip—meaning it’s not essential. To test its impact, we compare the visual’s performance with and without this measure.
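We can’t see the measure’s definition from the visual alone, but a day-over-day calculation typically looks something like this sketch (the [Total CU] measure and the ‘Date’ table are assumptions, not the report’s actual code):

    // Sketch of a typical day-over-day measure; not the report's actual code.
    DayOverDayPercentChange =
    VAR CurrentDay = [Total CU]
    VAR PriorDay =
        CALCULATE ( [Total CU], DATEADD ( 'Date'[Date], -1, DAY ) )
    RETURN
        DIVIDE ( CurrentDay - PriorDay, PriorDay )

Even written cleanly with variables, the prior-day lookup can trigger extra storage engine work for every point in the visual, which is why time calculations like this deserve scrutiny.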


To run that comparison, we duplicate the .PBIX file, limit the data volume, and use Power BI’s Performance Analyzer to measure the difference.


*Note: limiting the amount of data your model queries is useful for saving time during testing, but the results may not be fully representative. It is recommended to retest against the full model before deploying.


With ‘DayOverDayPercentChange’

Without ‘DayOverDayPercentChange’


Removing the measure improves visual load time by about 7%. While this might seem minor in Power BI Desktop, once the full data is published, every user benefits from faster load times. Faster loading means less Capacity used. Although it’s not a perfect one-to-one link, for our purposes, we can treat it as such.


Sometimes, you might not want to remove the measure entirely; perhaps an executive still needs it. But there’s still room to optimize. By reviewing and refining the DAX code, you can test different versions against each other with the same process until you find a more efficient solution. Unpacking DAX optimization is a big topic; in fact, it’s a whole book on its own.


For more learning on optimizing DAX, please refer to the masters of DAX, Marco Russo and Alberto Ferrari: The Definitive Guide to DAX, 2nd Edition (SQLBI).


If you’ve made it to this point and are thinking, “Hey, I want my organization to optimize our use of Premium Capacity, but I need more help,” that’s where we come in! Redcloud Consulting’s expert Engineers and Analysts can lead your optimization efforts—saving you time, enhancing your report users’ experience, and keeping those Microsoft bills from skyrocketing.


Contact our Data & AI Practice lead today to get started.


