
bigchippah
Helper I

(Quickly) Return date when goal value was achieved

Hello,

 

I have an Import SQL query containing manufacturing data for multiple locations. I would like to display when a location reached a specific goal (for example, return the date when Plant A hits 40 million cumulative units). I have a measure that returns a date (see below), but the DAX query takes about 3 minutes to complete. Can anyone suggest a more efficient method?

 

My data is as below.

| Year | Month    | Day | Location | # Units - Cumulative |
|------|----------|-----|----------|----------------------|
| 2021 | February | 21  | Plant A  | 39,980,291           |
| 2021 | February | 22  | Plant A  | 39,985,482           |
| 2021 | February | 23  | Plant A  | 39,991,594           |
| 2021 | February | 24  | Plant A  | 40,002,899           |
| 2021 | February | 25  | Plant A  | 40,008,331           |

 

I have a Measure to return a Cumulative Total (seen above as [# Units - Cumulative]):

 

 

# Units - Cumulative = CALCULATE (
    SUM ( 'Query1'[Units] ),
    FILTER (
        ALL ( 'key_DateTable' ), 'key_DateTable'[Date] <= MAX ( 'key_DateTable'[Date] )
    )
)
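For readers unfamiliar with this DAX pattern: because the filter keeps every date up to and including the current row's date, the measure is just a running total when rows are ordered by date. A minimal sketch of the same computation in Python, using hypothetical sample data:

```python
from datetime import date

# Hypothetical daily production rows for one plant: (date, units that day)
rows = [
    (date(2021, 2, 21), 5000),
    (date(2021, 2, 22), 5191),
    (date(2021, 2, 23), 6112),
]

# Summing Units over all dates <= the current row's date is equivalent
# to a single running total over date-sorted rows.
rows.sort(key=lambda r: r[0])
cumulative = []
total = 0
for d, units in rows:
    total += units
    cumulative.append((d, total))

for d, c in cumulative:
    print(d, c)
```

The point is that the cumulative value at any date is recoverable from one ordered pass, which matters for the performance discussion below.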

 

 

 

I have used the following measure, and it works, but it takes ~3 minutes per goal, per manufacturing location. That's simply too long to wait for the data.

 

 

# Milestone - 40 mill = MINX (
    DISTINCT ( Query1[PlantID] ),
    CALCULATE (
        VAR FirstDateOverTarget =
            FIRSTNONBLANK ( key_DateTable[Date], IF ( Query1[# Units - Cumulative] >= 40000000, 1 ) )
        RETURN
            FirstDateOverTarget
    )
)

 

 

 

Ideally, I would like to (quickly) be able to generate a table such as this one:

| Location | 25 million units | 40 million units | 70 million units       |
|----------|------------------|------------------|------------------------|
| Plant A  | October 20, 2016 | February 4, 2021 | TBD (not yet achieved) |
| Plant B  | December 5, 2017 | March 12, 2020   | June 4, 2022           |
| Plant C  | August 19, 2019  | January 19, 2022 | TBD (not yet achieved) |

 

I would rather not create a Calculated Column (my understanding is that this would hurt performance), but I'm open to suggestions. For what it's worth, I have roughly 80-100 different manufacturing locations, and my daily data goes back to the mid-1990s.

 

Any feedback is greatly appreciated.  Thank you!
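[Editor's note: outside of DAX, the underlying computation here is cheap. One date-ordered pass per plant finds the first date each threshold is crossed, so precomputing the milestones once per refresh (for example, as a calculated table or upstream in the SQL source) avoids re-evaluating a cumulative measure for every candidate date. A sketch in Python with hypothetical data and small stand-in thresholds:]

```python
from collections import defaultdict
from datetime import date

# Hypothetical daily production rows: (plant, date, units produced that day)
rows = [
    ("Plant A", date(2021, 2, 21), 10),
    ("Plant B", date(2021, 2, 21), 4),
    ("Plant A", date(2021, 2, 22), 15),
    ("Plant B", date(2021, 2, 22), 9),
    ("Plant A", date(2021, 2, 23), 20),
]
thresholds = [20, 40]  # stand-ins for the 25/40/70 million goals

# One pass in (plant, date) order: keep a running total per plant and
# record the first date each threshold is reached. Roughly
# O(rows * thresholds) overall, versus re-filtering the whole date
# table for every candidate date.
rows.sort(key=lambda r: (r[0], r[1]))
running = defaultdict(int)
milestones = {}  # (plant, threshold) -> first date achieved
for plant, d, units in rows:
    running[plant] += units
    for t in thresholds:
        if running[plant] >= t and (plant, t) not in milestones:
            milestones[(plant, t)] = d
```

A plant that never reaches a threshold simply has no entry, which maps naturally to the "TBD (not yet achieved)" cells in the desired table.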

4 REPLIES
Ashish_Mathur
Super User

Hi,

Here's another way.  See if this works faster.  I have taken the threshold as 7 units.  Hope this helps.  Download the file from here.



Regards,
Ashish Mathur
http://www.ashishmathur.com
https://www.linkedin.com/in/excelenthusiasts/

@Ashish_Mathur Hi Ashish! Thank you for the response.

 

Your solution works, but it seems to take roughly the same amount of time as my previous attempts.

 

My query has 3.5 million rows (dating back to 1997), and the thresholds I am trying to find are in the 25/50/100 million unit range. I know you can't see my data itself, but, in general, is a lengthy calculation simply inevitable with a table that large?

Hi @bigchippah ,

 

As far as I know, a large data model will take more time to calculate in Power BI. You may optimize your data model to reduce the calculation time.

For reference:

Optimization guide for Power BI

 

Best Regards,
Rico Zhou

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

You are welcome. Sorry, but I do not know how to reduce the processing time.


Regards,
Ashish Mathur
http://www.ashishmathur.com
https://www.linkedin.com/in/excelenthusiasts/
