Hi,
I have 2 fact tables:
Fact Table 1: Participations
Fact Table 2: Financial Transactions
There is a many to many relationship between Participations and Financial Transactions. This was resolved by creating a bridge table between the two fact tables.
I need to create a measure that performs a row-by-row calculation: it multiplies an amount from Fact 2 by a column called Participation from Fact 1.
I have a bunch of measures that are all written like this; the only thing that changes is the Amount column from Financial Transactions. They produce the correct result.
Measure =
SUMX (
    ADDCOLUMNS (
        'Financial Transaction Base',
        "Participations", CALCULATE ( SUM ( 'Participation Base'[Participation] ) )
    ),
    [Participations] * 'Financial Transaction Base'[Amount]
)
The measures are used in another dax query that is basically a SUMMARIZECOLUMNS and some filters.
In DAX Studio I'm noticing that the DAX engine performs the calculation of the measures one at a time, which is painfully slow because calculating one measure takes about a minute. If I request 20 measures... you get the idea.
Is there any optimization possible here?
Thanks!
Hi @Gilles_DR,
I'd like to suggest you take a look at the following blog about optimizing iteration functions:
Optimizing nested iterators in DAX
Regards,
Xiaoxin Sheng
Hi,
That didn't get me any further, I'm afraid.
I'd like to know why the DAX engine doesn't perform the calculations on the same set of materialised data. In DAX Studio I can clearly see that for each measure (like the one I posted above), data is materialised and then the calculation is performed. The materialisation should only happen once.
Kind regards,
Gilles
Hi @Gilles_DR,
In fact, some DAX functions contain an internal iteration that loops over table records. If you nest multiple of them, or use an iterator that calculates across multiple tables through relationships, the amount of calculation multiplies and can cause performance issues.
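To illustrate the shape of the problem: in a nested-iterator pattern the inner aggregation is re-evaluated once for every row of the outer table, so the cost is roughly (outer rows) × (inner scan). This is a purely hypothetical sketch (TableA, TableB, and the columns are illustrative, not from your model):

Slow Measure =
SUMX (
    TableA,
    -- the inner FILTER + SUMX below is re-evaluated for every row of TableA
    TableA[Amount]
        * SUMX (
            FILTER ( TableB, TableB[Key] = TableA[Key] ),
            TableB[Qty]
        )
)

When the engine cannot fuse the inner loop into a single storage-engine scan, flattening the pattern (grouping once, then aggregating per group) usually performs much better.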
BTW, you can also take a look at the following links to trace DAX queries and their performance:
Introducing the Power BI Performance Analyzer
Capturing Power BI queries using DAX Studio
Regards,
Xiaoxin Sheng
Hi,
Thanks!
I eventually found out that ADDCOLUMNS was the biggest problem, since it takes an entire table into memory and is not fast at all. So I ended up rewriting my measure using SUMMARIZE. This gives me the result in 10 seconds rather than 1 minute 30 seconds.
It's still not super fast but it is a lot better than it was previously.
Kind regards,
Gilles
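The final measure was not posted in the thread, but a SUMMARIZE-based rewrite along these lines matches the description: group the transactions first, then aggregate once per group instead of extending every row with ADDCOLUMNS. The grouping key 'Financial Transaction Base'[TransactionID] is an assumption; substitute whatever column identifies a transaction in the actual model:

Measure (SUMMARIZE sketch) =
SUMX (
    SUMMARIZE (
        'Financial Transaction Base',
        'Financial Transaction Base'[TransactionID]  -- hypothetical key column
    ),
    CALCULATE (
        SUM ( 'Financial Transaction Base'[Amount] )
            * SUM ( 'Participation Base'[Participation] )
    )
)

Because SUMMARIZE returns only the distinct grouping values, the outer SUMX iterates far fewer rows than the full extended table that ADDCOLUMNS materialises.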