TrivialSin
Regular Visitor

Count number of price changes for a product over time period

Hi. I am trying to create a measure of volatility for product prices over whatever time period is shown in the visual, constrained by a slicer. I thought the measure below would do it, but instead the result is something mystical. The price table has about 6000 products with about 750 dated records each.

 

Volatility =
COUNTROWS (
    FILTER (
        FILTER (
            'Price History',
            'Price History'[Item ID]
        ),
        'Price History'[Price] <> CALCULATE ( MIN ( 'Price History'[Price] ), PREVIOUSDAY ( 'Price History'[Date] ) )
    )
)
    / ( COUNTROWS (
            FILTER (
                'Price History',
                'Price History'[Item ID]
            )
        ) - 1 )

 

 

Sample data looks like this 

Date          Item ID   Price
14/09/2021    2         1079087
14/09/2021    2         1425434
15/09/2021    2         732740.3
16/09/2021    2         386393.3
17/09/2021    2         386393.3
14/09/2021    6         1194536
14/09/2021    6         1540883
15/09/2021    6         848189.3
16/09/2021    6         501842.3
17/09/2021    6         234734
13/09/2021    8         1656332
14/09/2021    8         1656332
15/09/2021    8         1656332
16/09/2021    8         617291.3
17/09/2021    8         231325


I am trying to get a result for volatility like this


ID   Changes   Records   Calculated    Volatility
2    3         5         = 3/(5-1)     0.75
6    4         5         = 4/(5-1)     1.00
8    2         5         = 2/(5-1)     0.5

 

Help as to what I am actually calculating with my formula, and what I should do to get the right result, would be greatly appreciated.


13 REPLIES
Greg_Deckler
Super User

@TrivialSin Thought about this a little more and came up with the following, which seems to return the correct number of changes, 11. It makes the assumption that if any price found on the current day matches any price from yesterday, then the price did not change between days, only during the day. If that is not a correct assumption, let me know. So there is a potential to undercount in the case where a price changes from one day to the next and then, on that same day, changes back to yesterday's price. That would technically be 2 changes, but this would only count it as 1. Otherwise, I think it is accurate; it returns 11 from your sample data, which is what I calculated by hand:

Measure 3 = 
    VAR __Table = 
        SUMMARIZE(
            'Table14',
            [Date],
            "__DailyChanges",COUNTROWS(DISTINCT('Table14'[Price]))-1           // Get number of changes in same day
        )
    VAR __Table1 = 
        ADDCOLUMNS(
            'Table14',
            "__ChangedSinceYesterday",
                    VAR __CurrentPrices = SELECTCOLUMNS(FILTER(ALL('Table14'),[Date]=EARLIER([Date])),"__Price",[Price])
                    VAR __YesterdayPrices = SELECTCOLUMNS(FILTER(ALL('Table14'),[Date]=EARLIER([Date])-1),"__Price",[Price])
                    VAR __MatchTable = INTERSECT(__CurrentPrices,__YesterdayPrices)
                    VAR __CountYesterday = COUNTROWS(__YesterdayPrices)
                    VAR __Matches = COUNTROWS(__MatchTable)
                RETURN
                    SWITCH(TRUE(),
                        ISBLANK(__CountYesterday),0,
                        ISBLANK(__Matches),1,
                        0
                    )
        )
RETURN
    SUMX(__Table,[__DailyChanges]) + SUMX(SUMMARIZE(__Table1,[Date],[__ChangedSinceYesterday]),[__ChangedSinceYesterday])

 



Hi Greg. I will be spending several hours going through your last response to see how it works. I think this solution only accounts for having one product, though. The formatting of the table in my original post was poor: I have many products in the price table, so there was actually an "Item ID" column between Date and Price.

I am using data from a game as a source of motivation to learn how I can utilise Power BI in my workplace environment. This is in combination with The Definitive Guide to DAX and the wealth of internet material (just added your book to my digital shelf on Kindle as well). I have made good progress but I just can't get my head round how to achieve the result for these price changes. This is the first time I have had to reach out for help, and your time in responding is greatly appreciated. I won't just cut and paste an answer and then forget about it. I will dissect the solution to see how it was achieved.

My current reading makes me think RANKX needs to be involved somewhere. My original thought was that I would need to create a filtered table containing the records for the product (Item ID) — this could be up to 750 rows spanning the last 18 months — and then compare the current row against the previous row for each row in the filtered table, counting the rows where the price is not equal.
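A rough sketch of that train of thought as a measure (using the 'Price History' names from the first post, assuming a single price per Item ID per day, and not tested against the real table) might look like this:

Price Changes (sketch) =
VAR __Base =
    SELECTCOLUMNS (
        'Price History',
        "__Item", 'Price History'[Item ID],
        "__Date", 'Price History'[Date],
        "__Price", 'Price History'[Price]
    )
RETURN
    COUNTROWS (
        FILTER (
            ADDCOLUMNS (
                __Base,
                "__PrevPrice",
                    VAR __CurrItem = [__Item]
                    VAR __CurrDate = [__Date]
                    VAR __PrevDate =
                        MAXX ( FILTER ( __Base, [__Item] = __CurrItem && [__Date] < __CurrDate ), [__Date] )
                    RETURN
                        MAXX ( FILTER ( __Base, [__Item] = __CurrItem && [__Date] = __PrevDate ), [__Price] )
            ),
            [__Price] <> [__PrevPrice] && NOT ISBLANK ( [__PrevPrice] )    // count only rows that have a previous dated row
        )
    )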

 

To give some context, the volatility figure is to end up as a column in my current learning experiment below where Item Name maps to an Item ID.

(screenshot: test4.png)

daxer-almighty
Solution Sage

@TrivialSin 

 

Your data does not allow the number of changes to be calculated, for the simple reason that one day can have multiple prices and you don't indicate the temporal order of those prices. If there are 3 prices in a day for a product, then you can have the following cases (just showing some relevant ones):

1) A, A, B -> 1 change

2) A, B, A -> 2 changes

 

Even if you have 2 prices in a day, it's still not enough to say what the number of changes is. That's because of such cases:

1) DayBefore: A, CurrentDay: A, B  -> 1 change

2) DayBefore:A, CurrentDay: B, A   -> 2 changes

 

So, this problem is not well defined.

 

On top of that, your calculations seem to be incorrect for the table you show.

I have no aversion to critique, but could you also point me toward something positive to aid my learning?

VahidDM
Super User

@TrivialSin 
Does your result table show the correct value under the Changes column? For Item 6, the change is 1; how did you calculate that?

 

Appreciate your Kudos✌️!!

It didn't show the right value. Amended. Late night. Thank you.

@TrivialSin I updated the measure to reflect the correct values when broken down by ID:

Measure 3 = 
    VAR __ItemID = MAX('Table14'[Item ID])
    VAR __Table = 
        SUMMARIZE(
            FILTER(ALL('Table14'),[Item ID]=__ItemID),
            [Date],
            "__DailyChanges",COUNTROWS(DISTINCT('Table14'[Price]))-1              // Get number of changes in same day
        )
    VAR __Table1 = 
        ADDCOLUMNS(
            FILTER(ALL('Table14'),[Item ID]=__ItemID),
            "__ChangedSinceYesterday",
                    VAR __CurrentPrices = SELECTCOLUMNS(FILTER(ALL('Table14'),[Item ID]=__ItemID && [Date]=EARLIER([Date])),"__Price",[Price])
                    VAR __YesterdayPrices = SELECTCOLUMNS(FILTER(ALL('Table14'),[Item ID]=__ItemID && [Date]=EARLIER([Date])-1),"__Price",[Price])
                    VAR __MatchTable = INTERSECT(__CurrentPrices,__YesterdayPrices)
                    VAR __CountYesterday = COUNTROWS(__YesterdayPrices)
                    VAR __Matches = COUNTROWS(__MatchTable)
                RETURN
                    SWITCH(TRUE(),
                        ISBLANK(__CountYesterday),0,
                        ISBLANK(__Matches),1,
                        0
                    )
        )
RETURN
    SUMX(__Table,[__DailyChanges]) + SUMX(SUMMARIZE(__Table1,[Date],[__ChangedSinceYesterday]),[__ChangedSinceYesterday])
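To turn that change count into the Volatility figure from the original post (changes divided by records minus one), one possible follow-up measure, sketched with the same 'Table14' names and untested, would be:

Volatility (sketch) = 
    VAR __ItemID = MAX ( 'Table14'[Item ID] )
    VAR __Records = COUNTROWS ( FILTER ( ALL ( 'Table14' ), [Item ID] = __ItemID ) )    // one row per date/price pair, as in the sample data
RETURN
    DIVIDE ( [Measure 3], __Records - 1 )

DIVIDE avoids a divide-by-zero when an item has only one record; on the sample data this would line up with the 0.75, 1.00 and 0.5 in the expected table above.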


Amazing. That gives exactly what I was looking for. I have removed the "ALL" from the various places, as I wanted to be able to see the volatility over the filtered period based on a date slicer on the page. As the "live" table has 6000 Item IDs and 4.6M records, I didn't fancy my chances on how many calculations a full run would take either. I will wait a few more weeks, until I have absorbed significantly more understanding, before tackling optimisation. Thank you so much for this. Hope your book is a good read. I will make sure I leave a review.

Greg_Deckler
Super User

@TrivialSin Can you just do a COUNTROWS of the DISTINCT prices? Like:

Number of Price Changes = COUNTROWS(DISTINCT(SELECTCOLUMNS('Table',"__Price",[Price])))


Unfortunately not, as many items have a sine-wave pattern, so the result from DISTINCT gives a much lower figure than expected.
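For example, with a made-up item that just alternates between two prices, DISTINCT sees only 2 values even though there are 4 day-to-day changes. A quick illustration that can be run as a DAX query (hypothetical figures, not from the real table):

EVALUATE
VAR __Prices =
    DATATABLE (
        "Date", DATETIME,
        "Price", DOUBLE,
        {
            { "2021-09-13", 10 },
            { "2021-09-14", 20 },
            { "2021-09-15", 10 },
            { "2021-09-16", 20 },
            { "2021-09-17", 10 }
        }
    )
VAR __Changes =
    COUNTROWS (
        FILTER (
            ADDCOLUMNS (
                __Prices,
                "__PrevPrice",
                    VAR __CurrDate = [Date]
                    VAR __PrevDate = MAXX ( FILTER ( __Prices, [Date] < __CurrDate ), [Date] )
                    RETURN MAXX ( FILTER ( __Prices, [Date] = __PrevDate ), [Price] )
            ),
            [Price] <> [__PrevPrice] && NOT ISBLANK ( [__PrevPrice] )
        )
    )
RETURN
    ROW (
        "Distinct prices", COUNTROWS ( DISTINCT ( SELECTCOLUMNS ( __Prices, "__Price", [Price] ) ) ),    // returns 2
        "Day-to-day changes", __Changes                                                                  // returns 4
    )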

@TrivialSin So what about a DISTINCT of the date and price columns together?

Number of Price Changes = COUNTROWS(DISTINCT(SELECTCOLUMNS('Table',"__Price",[Price],"__Date",[Date])))

 



I know you have solved my original puzzle, but just wondering: what if I only ever had one price per ID per day? Could I follow my original train of thought and create a filtered table with only the entries for the current Item ID, and then use your formula above to isolate the changes?

@TrivialSin If you only had 1 price per ID per day then that becomes more of an MTBF sort of thing where you just need to check the "previous" row. See my article on Mean Time Between Failure (MTBF): http://community.powerbi.com/t5/Community-Blog/Mean-Time-Between-Failure-MTBF-and-Power-BI/ba-p/3395....
The basic pattern is:
Column = 
    VAR __Current = [Value]
    VAR __PreviousDate = MAXX ( FILTER ( 'Table', 'Table'[Date] < EARLIER ( 'Table'[Date] ) ), [Date] )
    VAR __Previous = MAXX ( FILTER ( 'Table', [Date] = __PreviousDate ), [Value] )
RETURN
    __Current - __Previous
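Adapting that pattern to the price-change question (a sketch only, using the 'Price History' names from earlier in the thread and assuming one price per Item ID per day), a 1/0 flag column would make the change count a simple sum:

Price Changed = 
    VAR __CurrentPrice = 'Price History'[Price]
    VAR __PreviousDate =
        MAXX (
            FILTER (
                'Price History',
                'Price History'[Item ID] = EARLIER ( 'Price History'[Item ID] )
                    && 'Price History'[Date] < EARLIER ( 'Price History'[Date] )
            ),
            'Price History'[Date]
        )
    VAR __PreviousPrice =
        MAXX (
            FILTER (
                'Price History',
                'Price History'[Item ID] = EARLIER ( 'Price History'[Item ID] )
                    && 'Price History'[Date] = __PreviousDate
            ),
            'Price History'[Price]
        )
RETURN
    IF ( __CurrentPrice <> __PreviousPrice && NOT ISBLANK ( __PreviousPrice ), 1, 0 )

A volatility measure could then divide the sum of this flag by the item's row count minus one, matching the ratio in the original post.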


