Anonymous
Not applicable

DAX optimization

Hi Experts

 

I am trying to optimize the following DAX measure. I'm not sure if there is a better way to rewrite it, as the visual associated with the measure is close to timing out at 225 seconds (the "not enough resources available" error).

 

Cumulative Complaint Rate_ =
VAR Cumulative_Complaints =
    CALCULATE (
        [TotalComplaintswithTrends],
        FILTER (
            ALLSELECTED ( PMS_FINANCIAL_PDS ),
            PMS_FINANCIAL_PDS[Month Start] <= MAX ( PMS_FINANCIAL_PDS[Month Start] )
        )
    )
VAR Cumulative_Sales =
    CALCULATE (
        [TotalSalesTrend],
        FILTER (
            ALLSELECTED ( PMS_TM1_SALES_VOLUME ),
            [Month Start] <= MAX ( PMS_FINANCIAL_PDS[Month Start] )
        )
    )
RETURN
    IF (
        DIVIDE ( Cumulative_Complaints, Cumulative_Sales, 0 ) <> 0,
        DIVIDE ( Cumulative_Complaints, Cumulative_Sales, 0 )
    )

 

1 ACCEPTED SOLUTION
v-alq-msft
Community Support

Hi, @Anonymous 

 

Here are some suggestions for optimizing your model.

  • Remove unused tables or columns, where possible. 
  • Avoid distinct counts on fields with high cardinality – that is, millions of distinct values.  
  • Take steps to avoid fields with unnecessary precision and high cardinality. For example, you could split highly unique datetime values into separate columns – for example, month, year, date, and so on. Or, where possible, use rounding on high-precision fields to lower cardinality – (for example, 13.29889 -> 13.3).
  • Use integers instead of strings, where possible.
  • Be wary of DAX functions that need to test every row in a table – for example, RANKX. In the worst case, these functions can exponentially increase run-time and memory requirements given linear increases in table size.
  • When connecting to data sources via DirectQuery, consider indexing columns that are commonly filtered or sliced. Indexing greatly improves report responsiveness.
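
As a sketch of the cardinality point above: a datetime column with high precision can be collapsed into a month-start column via a calculated column. The table and column names here are hypothetical, not from the original post:

Month Start =
// Hypothetical: reduce a high-precision Sales[OrderDateTime] column to
// at most 12 distinct values per year by truncating to the first of the month.
DATE ( YEAR ( Sales[OrderDateTime] ), MONTH ( Sales[OrderDateTime] ), 1 )

Filtering and relating on a low-cardinality column like this is usually much cheaper than on the raw datetime.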

 

For further information, please refer to the official documentation.

 

Best Regards

Allan

 

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

View solution in original post

6 REPLIES
v-alq-msft
Community Support

Hi, @Anonymous 

 

If someone's answer solved your problem, please mark it as the solution to help the other members who have the same problem find it more quickly. If not, let me know and I'll try to help you further. Thanks.

 

Best Regards

Allan

 


Pragati11
Super User

Hi @Anonymous ,

 

One thing I can note in your DAX expression is the RETURN part. Do you really need that IF condition around the DIVIDE function?

Without the IF condition, it should do the same thing. Try replacing the RETURN part of the expression with:

 

RETURN
DIVIDE ( Cumulative_Complaints, Cumulative_Sales, 0 )
 
Let me know if this change gives you the similar output as before or not; also if it improves any performance.
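For reference, the full measure from the original post would then read as follows. This is a sketch; I have also table-qualified the [Month Start] reference in the sales filter, which the original left unqualified:

Cumulative Complaint Rate_ =
VAR Cumulative_Complaints =
    CALCULATE (
        [TotalComplaintswithTrends],
        FILTER (
            ALLSELECTED ( PMS_FINANCIAL_PDS ),
            PMS_FINANCIAL_PDS[Month Start] <= MAX ( PMS_FINANCIAL_PDS[Month Start] )
        )
    )
VAR Cumulative_Sales =
    CALCULATE (
        [TotalSalesTrend],
        FILTER (
            ALLSELECTED ( PMS_TM1_SALES_VOLUME ),
            PMS_TM1_SALES_VOLUME[Month Start] <= MAX ( PMS_FINANCIAL_PDS[Month Start] )
        )
    )
RETURN
    DIVIDE ( Cumulative_Complaints, Cumulative_Sales, 0 )

One subtle difference to verify: an IF with no else branch returns BLANK when the ratio is 0, whereas plain DIVIDE with an alternate result of 0 returns 0, so rows the blank previously hid may now appear in the visual.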
 
If this helps please give Kudos and mark it as a Solution! 🙂
 
Thanks,
Pragati

Best Regards,

Pragati Jain




LinkedIn | Twitter | Blog | YouTube

Did I answer your question? Mark my post as a solution! This will help others on the forum!

Appreciate your Kudos!!

Proud to be a Super User!!

Anonymous
Not applicable

Thanks Pragati11. Just trying out your suggestion.

amitchandak
Super User

If this Month Start is a date, create a date table and move the calculation there.

Example

 

Cumm Sales =
CALCULATE (
    SUM ( Sales[Sales Amount] ),
    FILTER ( ALL ( 'Date' ), 'Date'[Date] <= MAX ( 'Date'[Date] ) )
)

See if it improves performance

 

To get the best out of the time-intelligence functions, make sure you have a date calendar and that it has been marked as the date table in model view. Also, join it to the date column of your fact table(s). Refer to:
https://radacad.com/creating-calendar-table-in-power-bi-using-dax-functions
https://www.archerpoint.com/blog/Posts/creating-date-table-power-bi
https://www.sqlbi.com/articles/creating-a-simple-date-table-in-dax/
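
A minimal date table along the lines of those articles might look like this. The date range and column names are illustrative; mark the table as a date table in model view and relate it to the Month Start columns of the fact tables:

Date =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2018, 1, 1 ), DATE ( 2021, 12, 31 ) ),
    "Year", YEAR ( [Date] ),
    "Month Start", DATE ( YEAR ( [Date] ), MONTH ( [Date] ), 1 )
)

The cumulative variables can then filter this single table instead of the two fact tables, for example:

Cumulative Complaints =
CALCULATE (
    [TotalComplaintswithTrends],
    FILTER ( ALLSELECTED ( 'Date' ), 'Date'[Date] <= MAX ( 'Date'[Date] ) )
)

Filtering one small date table, rather than each large fact table, is usually the main performance win here.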

 

Anonymous
Not applicable

Hi Amit,



Here is my date table. I have shown where I want the cumulative formula to start and end for each of the 3 periods, and I have included the Excel formula showing how the result is to be calculated. Could you use the Rolling Month column? That splits the periods up too.

[Image: Canadapvalue.PNG]
