Hello,
I have an Import SQL query containing manufacturing data for multiple locations. I would like to display when a location reached a specific goal (for example, return the date when Plant A hits 40 million cumulative units). I have a measure that returns a date (see below); however, it takes ~3 minutes for the DAX query to complete. Can anyone suggest a more efficient method?
My data is as below.
| Year | Month | Day | Location | # Units - Cumulative |
|---|---|---|---|---|
| 2021 | February | 21 | Plant A | 39,980,291 |
| 2021 | February | 22 | Plant A | 39,985,482 |
| 2021 | February | 23 | Plant A | 39,991,594 |
| 2021 | February | 24 | Plant A | 40,002,899 |
| 2021 | February | 25 | Plant A | 40,008,331 |
I have a measure that returns a cumulative total (seen above as [# Units - Cumulative]):
# Units - Cumulative =
CALCULATE (
    SUM ( 'Query1'[Units] ),
    FILTER (
        ALL ( 'key_DateTable' ),
        'key_DateTable'[Date] <= MAX ( 'key_DateTable'[Date] )
    )
)
I have used the following measure, and it works, but it also takes ~3 minutes per goal and per manufacturing location. That's simply too long to wait for the data.
# Milestone - 40 mill =
MINX (
    DISTINCT ( 'Query1'[PlantID] ),
    CALCULATE (
        VAR FirstDateOverTarget =
            FIRSTNONBLANK (
                'key_DateTable'[Date],
                IF ( [# Units - Cumulative] >= 40000000, 1 )
            )
        RETURN
            FirstDateOverTarget
    )
)
Ideally, I would like to (quickly) be able to generate a table such as this one:
| Location | 25 million units | 40 million units | 70 million units |
|---|---|---|---|
| Plant A | October 20, 2016 | February 4, 2021 | TBD (not yet achieved) |
| Plant B | December 5, 2017 | March 12, 2020 | June 4, 2022 |
| Plant C | August 19, 2019 | January 19, 2022 | TBD (not yet achieved) |
I would rather not create a calculated column (my understanding is that this will hurt performance), but I'm open to suggestions. For what it's worth, I have roughly 80-100 different manufacturing locations, and my daily data goes back to the mid-1990s.
Any feedback is greatly appreciated. Thank you!
Hi,
Here's another way. See if this works faster. I have taken the threshold as 7 units. Hope this helps. Download the file from here.
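For readers who cannot download the file, a commonly suggested pattern for this kind of milestone search, sketched here with the original poster's table and column names (the measure in the linked file may differ), is to materialize the running total once per date and then take the earliest qualifying date. Since the cumulative total only ever grows, the minimum date at or above the threshold is the milestone date:

```dax
# Milestone - 40 mill (sketch) =
VAR Threshold = 40000000
VAR DatesWithCumulative =
    -- Evaluate the running-total measure once for each visible date
    ADDCOLUMNS (
        ALLSELECTED ( 'key_DateTable'[Date] ),
        "@Cumulative", [# Units - Cumulative]
    )
VAR DatesOverTarget =
    FILTER ( DatesWithCumulative, [@Cumulative] >= Threshold )
RETURN
    -- The cumulative total is monotonically increasing, so the earliest
    -- qualifying date is the milestone date for the current filter context
    MINX ( DatesOverTarget, 'key_DateTable'[Date] )
```

Placed in a matrix with Location on rows, this avoids the outer MINX over PlantID entirely; note that the dominant cost may still be the running-total measure itself, so this may or may not be dramatically faster on 3.5 million rows.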
@Ashish_Mathur Hi Ashish! Thank you for the response.
Your solution works, but it seems to take roughly the same amount of time as my previous attempts.
My query has 3.5 million rows (dating back to 1997) and the thresholds I am trying to find are in the 25/50/100 million unit range. I know you can't see my data itself, but, in general, is a lengthy calculation simply an inevitability with a table that large?
Hi @bigchippah ,
As far as I know, larger data models take more time to calculate in Power BI. You may optimize your data model to reduce the calculation time.
For reference:
Optimization guide for Power BI
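One concrete form of that optimization, offered here as a sketch rather than something from the linked guide: precompute the milestone dates once at refresh time in a calculated table, so visuals read stored values instead of re-running the slow measure per cell. The table name `Milestone Dates` is invented for illustration; the rest follows the original poster's names and assumes the existing relationship between 'Query1' and 'key_DateTable':

```dax
Milestone Dates =
ADDCOLUMNS (
    -- One row per plant; context transition from this row context
    -- filters 'Query1' to the current plant inside the measure
    DISTINCT ( 'Query1'[PlantID] ),
    "25M Date",
        VAR CumulByDate =
            ADDCOLUMNS (
                DISTINCT ( 'key_DateTable'[Date] ),
                "@Cumulative", [# Units - Cumulative]
            )
        RETURN
            MINX ( FILTER ( CumulByDate, [@Cumulative] >= 25000000 ), 'key_DateTable'[Date] ),
    "40M Date",
        VAR CumulByDate =
            ADDCOLUMNS (
                DISTINCT ( 'key_DateTable'[Date] ),
                "@Cumulative", [# Units - Cumulative]
            )
        RETURN
            MINX ( FILTER ( CumulByDate, [@Cumulative] >= 40000000 ), 'key_DateTable'[Date] )
)
```

The cost is paid once per refresh instead of once per visual query, which fits the "Location | milestone" summary table the original poster described; a blank in a column would correspond to the "TBD (not yet achieved)" case.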
Best Regards,
Rico Zhou
If this post helps, then please consider Accepting it as the solution to help the other members find it more quickly.
You are welcome. Sorry, but I do not know how to reduce the processing time.