Dear Power BI community,
I have been trying to solve the following problem for a few days now, but unfortunately I can't manage it. 😕
So: I have a table "Table1" with stock values, but a new row is only recorded when an article's stock value changes.
E.g.
Date | Article | Stockvalue |
---|---|---|
18.6.21 | Product A | 5 |
17.6.21 | Product A | 4 |
15.6.21 | Product A | 3 |
18.6.21 | Product B | 4 |
15.6.21 | Product B | 2 |
Now I would like to have a measure "Stockvalue_Measure" that gives me the following end results in a matrix:
Date | Stockvalue_Measure |
---|---|
18.6.21 | 5 + 4 = 9 |
17.6.21 | 4 + 2 (Product B, 15.6.21) = 6 |
16.6.21 | 3 (Product A, 15.6.21) + 2 (Product B, 15.6.21) = 5 |
15.6.21 | 2 + 3 = 5 |
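To make the intended semantics concrete, here is a small sketch outside DAX (plain Python, with the sample rows hard-coded; `stock_on` is a hypothetical helper name, not part of any solution in this thread). For a given date it sums, over all articles, the most recent stock value recorded on or before that date:

```python
from datetime import date

# Stock snapshots: a row exists only when an article's stock value changed.
rows = [
    (date(2021, 6, 18), "Product A", 5),
    (date(2021, 6, 17), "Product A", 4),
    (date(2021, 6, 15), "Product A", 3),
    (date(2021, 6, 18), "Product B", 4),
    (date(2021, 6, 15), "Product B", 2),
]

def stock_on(day):
    """Sum, over all articles, the last known stock value on or before `day`."""
    total = 0
    for article in {a for _, a, _ in rows}:
        # All snapshots for this article up to and including `day`.
        history = [(d, v) for d, a, v in rows if a == article and d <= day]
        if history:
            total += max(history)[1]  # value belonging to the latest date
    return total

print(stock_on(date(2021, 6, 16)))  # 3 (Product A) + 2 (Product B) = 5
```

This is exactly the "carry the last known value forward" behaviour the matrix above expects for dates with no stock entry, such as 16.6.21.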
I think I first have to do a precalculation with variables and then, in a second step, sum the results (SUMX):
@Stewwe interesting. Is it possible to share the large dataset? Remove any other information from the file and just keep the relevant data. I would like to test and improve the DAX on the larger set; otherwise it is hard to optimize the queries.
Hi @Stewwe,
have you found a faster solution yet? I'm currently struggling with the same problem.
I have used the
Thanks for your help. Unfortunately it will take until Monday, 28.6, to prepare something.
Bye
Stewwe
@Stewwe here is the measure; the solution is attached too. You need to create a star schema with two dimensions, Article and Date.
Stock from recent date =
VAR __table =
SUMMARIZE (
CROSSJOIN (
FILTER (
ALL ( 'Calendar'[Date] ),
'Calendar'[Date] <= MAX ( 'Calendar'[Date] )
),
ALL ( Article[Article] )
),
Article[Article],
"@Stock", MAX ( Stock[Stock] )
)
RETURN
SUMX ( __table, [@Stock] )
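The measure above rebuilds the filtered date/article table for every cell of the matrix, which is one reason it struggles at 800,000 rows. A different idea, sketched here in Python rather than DAX purely as an illustration (hypothetical names, a minimal sketch and not the posted solution), is to forward-fill each article's last known value in a single pass over the calendar, so every date's total is computed once:

```python
from datetime import date, timedelta

# Same change-only snapshot rows as in the question.
rows = [
    (date(2021, 6, 18), "Product A", 5),
    (date(2021, 6, 17), "Product A", 4),
    (date(2021, 6, 15), "Product A", 3),
    (date(2021, 6, 18), "Product B", 4),
    (date(2021, 6, 15), "Product B", 2),
]

def build_totals(rows, start, end):
    """One pass over the calendar: carry each article's last value forward
    and record the running total per date."""
    changes = {}
    for d, a, v in rows:
        changes.setdefault(d, []).append((a, v))
    last, totals = {}, {}
    day = start
    while day <= end:
        for a, v in changes.get(day, []):
            last[a] = v           # overwrite with the value effective today
        totals[day] = sum(last.values())
        day += timedelta(days=1)
    return totals

totals = build_totals(rows, date(2021, 6, 15), date(2021, 6, 18))
```

In model terms this corresponds to materializing the forward-filled snapshot once (for example in Power Query or a calculated table) instead of recomputing the last known value per cell at query time.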
Hello parry2k,
Thank you very much for your solution. The sample data is exactly what I need as a result.
Unfortunately, the solution is clearly too slow with the size of the real data; Power BI aborts the evaluation.
My tables have the following sizes:
Article = 38,000 rows
Table1 = 800,000 rows
Date = 4,300 rows
If you have a more performant solution, I would be very happy to hear from you!
Bye stewwe
Wow @Jihwan_Kim, the solution looks very good, and such fast feedback! Many thanks already 🙂
Now I only have one problem:
The calculation takes a very long time: 160,000 ms.
My tables have the following sizes:
Article = 38,000 rows
Table1 = 800,000 rows
Date = 4,300 rows
Do you have an approach on how we can speed this up?
Thank you in advance!
Thanks for your feedback.
I am not sure if it will work, but please try replacing the [lastnonblankvalue measure =] with the below.
Hello @Jihwan_Kim ,
It works, too. I was able to speed it up a bit, but at 50,000 ms it is still too slow for productive use.