Hello everyone,
I have read several posts and some articles about optimizing a DAX measure, but I am having some trouble putting this advice into practice. I am trying to add a measure that flags Inventory Items (direct SSAS connection) in a table. Here is the measure I came up with:
With a large number of items, it takes ages to show the values. Could you give me some tips on how I could improve my measure formula?
Best regards,
Marc
Hi @Anonymous
It's worth using variables for any expression that is computed multiple times; that way, it's only calculated once.
I'd also recommend using the SWITCH function in place of nested IF statements. It doesn't improve performance, but it is easier to read.
See if this helps at all:
Cluster =
VAR SumDSI = SUM ( INVENTORY[DSI] )
VAR Result =
    SWITCH (
        TRUE (),
        SUM ( INVENTORY[Annual_Usage_(Current_Year)] ) = 0, "Obsolete",
        DATEDIFF ( SUM ( 'LAST INVOICE DATE'[Date] ), TODAY (), DAY ) >= 270, "Obsolete Warning",
        SumDSI >= 361 && SumDSI <= 720, "Excess Warning",
        SumDSI >= 721, "Excess",
        "Working"
    )
RETURN
    Result
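For comparison, a minimal sketch of the same bucketing logic written as nested IFs, the pattern the SWITCH tip advises against, might look like this (column names taken from the measure above; the measure name here is hypothetical):

```dax
-- Hypothetical nested-IF version: same result, but harder to read
-- and to extend as more buckets are added
Cluster (nested IF) =
VAR SumDSI = SUM ( INVENTORY[DSI] )
RETURN
    IF (
        SUM ( INVENTORY[Annual_Usage_(Current_Year)] ) = 0,
        "Obsolete",
        IF (
            DATEDIFF ( SUM ( 'LAST INVOICE DATE'[Date] ), TODAY (), DAY ) >= 270,
            "Obsolete Warning",
            IF (
                SumDSI >= 361 && SumDSI <= 720,
                "Excess Warning",
                IF ( SumDSI >= 721, "Excess", "Working" )
            )
        )
    )
```

Each extra bucket adds another level of nesting here, whereas SWITCH ( TRUE (), ... ) keeps all the conditions at the same indentation level.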
Best regards,
Martyn
If I answered your question, please help others by accepting it as a solution.
I'm currently having an issue with VAR data. When I try to run this with all the DAX measures, I run out of memory. As you can see, I only have 7 measures here because I couldn't figure out which one was causing the issue. Unfortunately, it's this one. Does anyone know of a different way of achieving this? Basically, I want to subtract the previous dimension from the current one. These are based on timestamps, so in Excel it would be =ABS ( A2-A3 ). Something like that, but because Power BI is not set up like this, I had to make another equation to get the dates in order, which is the first picture.
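One hedged sketch of that Excel-style =ABS ( A2-A3 ) pattern is a calculated column that finds the latest earlier timestamp in the same table. The table name 'Events' and column [Timestamp] below are assumptions, not names from the original post; substitute your own:

```dax
-- Assumed table 'Events' with a [Timestamp] column (hypothetical names)
Seconds Since Previous =
VAR CurrentTS = 'Events'[Timestamp]
VAR PrevTS =
    CALCULATE (
        MAX ( 'Events'[Timestamp] ),
        FILTER ( ALL ( 'Events' ), 'Events'[Timestamp] < CurrentTS )
    )
RETURN
    -- Blank for the first row, which has no predecessor
    IF ( NOT ISBLANK ( PrevTS ), ABS ( DATEDIFF ( PrevTS, CurrentTS, SECOND ) ) )
```

Because a calculated column is evaluated once at refresh rather than per visual query, it may sidestep the memory pressure of computing the previous-row lookup inside a measure.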
Any help would be appreciated
Yes, it helps. Thank you very much.
Marc
I am having a similar problem, and I've read many of the posts here on optimization, but I must be missing something.
The only big fact table I have is a Date Table - everything else is based on measures and dates. It was working just fine until I created this measure:
which was working fine until I added the [Reserve requirement] measure that provides the total.
Is there a way of optimizing this? It is taking about 2 minutes to load the card visual and almost 3 minutes for the table visual.
Any tips would be so appreciated. I'm finally catching on, but now that other people can see what is possible, I'm going to get more and more complicated problems to solve!
Hi,
I think you should open a new discussion as this one is marked completed.
Marc
Thanks. I've sort of solved this myself by creating a separate table with only January 1 of each year in the dates column, and using this table instead of the central dates table in the SUMMARIZE statement. Now the refresh is down to about 30 seconds.
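A hedged sketch of that workaround, a calculated table holding only January 1 of each year, assuming the central date table is called 'Date' with a [Date] column (names not confirmed in the post):

```dax
-- Assumed 'Date'[Date] column; keeps one row per year (January 1)
YearStarts =
FILTER (
    ALL ( 'Date'[Date] ),
    MONTH ( 'Date'[Date] ) = 1
        && DAY ( 'Date'[Date] ) = 1
)
```

Iterating SUMMARIZE over one row per year instead of ~365 rows per year cuts the work roughly proportionally, which is consistent with the refresh dropping from minutes to about 30 seconds.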