tm85
Frequent Visitor

Improve Performance - Measure heavy report

Edit: The switch is definitely the main culprit. By simply nesting the SWITCH, performance has improved drastically: visuals now update almost instantly when slicing or changing the switch measure table inputs.

For example, instead of switching across 25 items in one SWITCH, I first switch between 5 main categories (costs, volumes, ratios, etc.), and each branch then handles only its own items. Nothing else changed, and the report is now actually usable.
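Roughly, the nested pattern looks like this. A minimal sketch only: the category column and the per-category branch measures are renamed for illustration and are not my actual model objects.

Value =
VAR category = SELECTEDVALUE ( _KPIList[Category] )    -- hypothetical grouping column on the disconnected KPI table
RETURN
    SWITCH (
        TRUE (),
        category = "Costs", [Cost Value],          -- each branch is its own small SWITCH over ~5 KPIs
        category = "Volumes", [Volume Value],
        category = "Ratios", [Ratio Value],
        category = "Unit Costs", [Unit Cost Value],
        category = "Impacts", [Impact Value],
        BLANK ()
    )

This way the engine only has to consider the handful of branches inside the selected category instead of all 25 at once.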

Hi all, 

 

My company has a financial report with a lot of calculations that involve multiple scenarios/versions (Actuals, Last Year, Budget, Forecast, vs Bud, vs Fcst, vs LY).

 

The basic gist is that I have costs and volumes... let's say a, b, c, d and $a, $b, $c, $d. There are more dimensions like countries and products, and those hierarchies I've separated into dimension tables.

My data is currently normalized, meaning it looks like this:

 

KPI   Scenario    Value
a     Actuals     1
b     Actuals     2
c     Actuals     3
d     Actuals     4
$a    Actuals     5
$b    Actuals     6
$c    Actuals     7
$d    Actuals     8
a     Last Year   9
b     Last Year   10
c     Last Year   11
d     Last Year   12
$a    Last Year   13
$b    Last Year   14
$c    Last Year   15
$d    Last Year   16
a     Forecast    17
b     Forecast    18
c     Forecast    19
d     Forecast    20
$a    Forecast    21
$b    Forecast    22
$c    Forecast    23
$d    Forecast    24

 

My approach was to use Tabular Editor to create a series of base measures:

a (Actuals) =
CALCULATE (
    SUM ( Fact[Value] ),    -- 'Fact' stands in here for the actual fact table name
    Fact[Scenario] = "Actuals",
    Fact[KPI] = "a"
)

 

Then I need the ratios between them, which would be unit costs, as well as versus budget, versus forecast, etc. These are all just measures of measures. For example, $a/a (Actuals) = [$a (Actuals)] / [a (Actuals)], or for vs Budget we'd simply have [a (Actuals)] / [a (Budget)].
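Concretely, these measure-of-measure definitions look roughly like the sketch below; DIVIDE is shown instead of a raw "/" since it returns BLANK on division by zero (names follow the convention above, not my exact model):

$a/a (Actuals) = DIVIDE ( [$a (Actuals)], [a (Actuals)] )
a (vs Budget) = DIVIDE ( [a (Actuals)], [a (Budget)] )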

 

The main reason for this is that later on we need to calculate how these KPIs affect overall cost. Some are fairly standard, like a price impact: (delta price) * actual quantity. Others try to isolate the unit costs above and can get slightly more complex, like

IFERROR ( [a/b (vs Budget)] * [$a/a (LY)] * [b (CY)], 0 )

 

The point is, I constantly need to calculate across scenarios, so I see no way to use my data in a simple matrix. Once all the formulas were built, I wrote a series of measures with SWITCHes that react to an auxiliary table.

Something like
a =
VAR scenario = SELECTEDVALUE ( _PointOfView[Point of View], "CY" )
RETURN
    SWITCH (
        TRUE (),
        scenario = "CY", [a (CY)],
        scenario = "LY", [a (LY)],
        scenario = "FCST", [a (FCST)],
        scenario = "BUD", [a (BUD)],
        scenario = "vs LY", [a (vs LY)],
        scenario = "vs FCST", [a (vs FCST)],
        scenario = "vs BUD", [a (vs BUD)],
        BLANK ()
    )

 

and then the final version, called Value, simply does a SWITCH based on which KPI it is. So, if it's "a", choose [a]. In the end I get each calculation exactly where I need it, and I can add slicers that let the user choose whether they want to see values or versus comparisons, based on certain scenarios. Kind of a DIY pivot table, since they love tables. For the waterfall charts I can limit the options to just "vs Fcst", "vs Bud", and "vs Last Year", since the impacts mentioned above always compare actuals to something else.
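The outer measure is roughly this shape (a sketch; the _KPIList table name is illustrative):

Value =
VAR kpi = SELECTEDVALUE ( _KPIList[KPI] )
RETURN
    SWITCH (
        TRUE (),
        kpi = "a", [a],
        kpi = "b", [b],
        kpi = "$a", [$a],
        kpi = "$a/a", [$a/a],
        BLANK ()    -- ...and so on, one branch per KPI
    )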

 

Well, the report works and the values come out as expected. My matrix takes about 5 seconds to refresh, not too bad, and the waterfall chart about the same. But on another page I tried to show 3 visuals: By Period as a line chart, plus By Country and By Product as bar charts. That page takes around 15 seconds.

 

I tried changing my data by pivoting the scenario into 4 columns (Actuals, Last Year, Bud, Forecast), and I saw some improvement.
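For reference, with the scenario pivoted into columns, each base measure drops one filter. A sketch with placeholder table/column names:

a (Actuals) = CALCULATE ( SUM ( Fact[Actuals] ), Fact[KPI] = "a" )
a (Budget) = CALCULATE ( SUM ( Fact[Budget] ), Fact[KPI] = "a" )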

 

Am I taking the wrong approach from the beginning? Do you think pivoting the KPIs would work better? There are 20 base KPIs coming from SQL (10 costs, 10 associated volumes), so that would mean 20 columns... Should I break my scenarios into separate fact tables? I'm not sure what I could possibly do. In the past this was in Excel with a lot of GETPIVOTDATA-type formulas, and while a PITA, it's so far faster than Power BI. The data is around 500k rows; because it's supply chain, it has flows like one country to another, and around 15 different products. This, along with 12 fiscal periods, gets the row count pretty high.

 

Any tips or advice would be appreciated. Thanks in advance. 

The end result in matrix format is something like this:

              Actuals   Last Year   Bud   Fcst   vs Last Year   vs Bud
Volumes
  a              #          #        #      #         #            #
  b              #          #        #      #         #            #
  c              #          #        #      #         #            #
Costs
  $a             #          #        #      #         #            #
  $b             #          #        #      #         #            #
  $c             #          #        #      #         #            #
Ratios
  a/b            #          #        #      #         #            #
  $a/$b          #          #        #      #         #            #
Unit Costs
  $a/a           #          #        #      #         #            #
  $b/b           #          #        #      #         #            #
Impacts
  price          #          #        #      #         #            #
  volume         #          #        #      #         #            #
  etc            #          #        #      #         #            #
  etc            #          #        #      #         #            #
1 ACCEPTED SOLUTION
Anonymous
Not applicable

Without full information about the model and data, it's not possible to give you meaningful advice. One would have to use some tools to figure out where the bottleneck is and then address it. For instance, we don't know anything about the relationships, the structures of your tables, the cardinalities of attributes... Well, almost nothing. One thing I can tell you, though, is that using IFERROR does have a negative impact. Also, SWITCHes can be a huge bottleneck depending on how the engine chooses to calculate them. If it calculates eagerly, then you're out of luck here. But I don't think this SWITCH is even needed; slicing and dicing by the right attributes should be more than enough.
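For instance, instead of wrapping a whole expression in IFERROR to guard against division by zero, DIVIDE handles that natively. A sketch using the measure names from your post:

-- instead of
Impact = IFERROR ( [a/b (vs Budget)] * [$a/a (LY)] * [b (CY)], 0 )

-- make the ratios safe at the source and drop the IFERROR entirely
a/b (vs Budget) = DIVIDE ( [a (vs Budget)], [b (vs Budget)] )    -- returns BLANK instead of an error
Impact = [a/b (vs Budget)] * [$a/a (LY)] * [b (CY)]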

 

But to troubleshoot one would have to be able to work directly with the model. Also, you should bear in mind that your hardware does have a role to play as well.

 

Best
D


7 REPLIES

Hi, 

 

Appreciate your response. I would have to create a replica linked to Excel with some fake data to show you, as this contains confidential information. I'm also going to try the solution of using 4 fact tables instead of 1 to see if that improves performance.

It will take some time for me to prepare since I'm also doing my day job; bear with me and I'll send something your way. Thanks!

 

Anonymous
Not applicable

By the way... 4 separate fact tables could in fact speed up calculations.

Best
D
Anonymous
Not applicable

Hi there.

Would you not be able to create something in a PBI file, please? Excel, as far as my experience with it goes, is much worse to work with than PBI. It's sluggish and does not have all the functions that PBI does. I only work with it when absolutely no other way is possible.

Sorry about it.

Best
D

PBIX sample sent in a private message, linked to Google Drive.

AllisonKennedy
Super User

DAX is evaluated at query time, so heavy DAX slows down filtering and visual load, whereas anything you can push into Power Query M slows down the refresh instead but keeps slicer selections and interactivity fast. Most data models I work with have a forecast table, a budget table, and an actuals table, with separate columns for $ and volume, which typically works pretty well. So while it will be a massive effort to re-set up your calculations, I think it will be worth it in the long run.
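As a rough sketch of that shape (table and column names are just placeholders, not a prescription):

a (Actuals) = CALCULATE ( SUM ( Actuals[Volume] ), Actuals[KPI] = "a" )
$a (Actuals) = CALCULATE ( SUM ( Actuals[Cost] ), Actuals[KPI] = "a" )
a (Budget) = CALCULATE ( SUM ( Budget[Volume] ), Budget[KPI] = "a" )
a (vs Budget) = DIVIDE ( [a (Actuals)], [a (Budget)] )

with shared Country/Product/Period dimension tables relating to all of the fact tables, so one slicer filters every scenario at once.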



I work as a Microsoft trainer and consultant, specialising in Power BI and Power Query.
www.excelwithallison.com

Ok. So, with your approach I would end up with 4 fact tables, or 3 if I want to keep actuals together.

 

Right now the cardinality (I believe that's the term) is around 80, once you account for the 20 KPIs times 4 scenarios that need to be isolated in the data. With your approach it would be around 10, spread across 2 value columns and 4 tables.

 

I created all the formulas in Excel and bulk-loaded them with a script in Tabular Editor, so tomorrow I can probably dedicate 3-4 hours to switching it up.

 

So, we can agree that creating 20 data columns would definitely be ridiculous, correct? I just assumed that one value column would be quickest, as is the norm in SQL/Access, but Power BI seems to be a different beast. I figured that since I have the same granularity, one table would be faster than four, but I truly don't know.

 

In the end, it seems I can't really get away from the DAX, as I need to combine what in Excel would be calculated items and calculated fields, and there's no way of doing that inside one common pivot without playing with switches, etc., correct?

 

I much prefer a longer load time in Power Query to having my users held up for 10 seconds on each slicer selection. The bulk of our ELT is done in on-prem SQL, and I then run Power Query against a "final table"... if that takes 5 minutes but my slicers take 2 seconds and my directors are happy, it's well worth the extra load time.
