As a relatively new user of Power BI dataflows I'm stuck with this problem:
As source there is a table with IDs, order IDs, events, the date of each event and, if needed, a plan date.
Now I want to convert this table into a table in which only the OnHold, OffHold and Plandate events are shown per order.
I have made an example in Excel for the source and the wanted result.
I tried to make a copy of the source with only the needed events and then merge it with the source, but that does not work: filtering results in a loop.
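The intended reshape can be sketched in plain Python (the event and column names here are just my illustration, not the real table's):

```python
# Keep only the hold/plan events per order; everything else
# (communication events etc.) is dropped.
KEEP = {"OnHold", "OffHold", "Plandate", "Plan1"}

source = [
    {"OrderID": 1, "Event": "Start"},
    {"OrderID": 1, "Event": "OnHold"},
    {"OrderID": 1, "Event": "Mail sent"},
    {"OrderID": 1, "Event": "OffHold"},
]

wanted = [row for row in source if row["Event"] in KEEP]
print(wanted)  # only the OnHold and OffHold rows remain
```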
Is there anyone who can help?
Could you explain in more detail the logic by which you get from the main table to the wanted one? (Keep in mind that I don't know the meaning of, or the relations between, the various acronyms used.)
As usual, @Smauro's solutions are elegant and instructive, but it seems that his truly admirable effort of interpretation has produced a slightly different result from what you described as wanted. To judge whether a solution will also be effective at the size of your case, you should also say how many rows, columns (and possibly sub-groups) it has.
@Rocco_sprmnt21 The data I have is a table from an application for following orders.
Every step an order makes in the process is logged in this table.
The orders are projects built by different contractors, and in the building process there are roughly two steps where something can get stuck.
The first part is between order intake (Start) and communicating the first promised date (Plan1). The second part is between communicating the first promised date and order ready.
In both parts there can be an on-hold situation, for either a legitimate or an illegitimate reason. The on-hold time for a legitimate reason will be compensated. An OnHold situation can occur more than once per order, and the time lost is different in the two parts. Before communicating the plan date, the total time between Start and communicating the first promised date, minus the time OnHold, is the total runtime; that is one of the KPIs.
The second part is more difficult. An on-hold situation frustrates the contractor's planning, so the time between the plan date in force at OnHold and the plan date given directly after the OffHold moment is the time to compensate. The application asks for a new plan date at the OffHold action.
KPI: realize project on 1st promised date (including compensation)
KPI: number of replans <=2
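The part-1 rule described above (runtime = days from Start to Plan1, minus the days spent legitimately on hold) can be sketched in plain Python; the event names and dates are my own illustration, not the real log:

```python
from datetime import date

# Hypothetical part-1 event log for one order: (event, date) pairs.
events = [
    ("Start", date(2021, 1, 4)),
    ("OnHold legitimate", date(2021, 1, 8)),
    ("OffHold", date(2021, 1, 11)),
    ("Plan1 given", date(2021, 1, 18)),
]

def part1_runtime(events):
    """Runtime = (Plan1 - Start) minus legitimate on-hold days."""
    start = next(d for e, d in events if e == "Start")
    plan1 = next(d for e, d in events if e.startswith("Plan1"))
    hold_days = 0
    hold_start = None
    for e, d in events:
        if e.startswith("OnHold legitimate"):
            hold_start = d
        elif e.startswith("OffHold") and hold_start is not None:
            hold_days += (d - hold_start).days
            hold_start = None
    return (plan1 - start).days - hold_days

print(part1_runtime(events))  # 14 calendar days minus 3 on-hold days = 11
```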
2 plan1 given
2 OnHold legitimate (date = 03/22/2021)
3 OffHold (date = 01/14/2021)
4 Plan given (this plandate is not relevant, but after OffHold there must be a plandate)
5 Plan1 given (plandate 03/19/2021)
6 OnHold (legitimate)
8 Plandate (plandate 03/26/2021)
9 OnHold (not legitimate)
11 Plandate (plandate 04/02/2021)
12 OnHold (legitimate)
14 Plandate (plandate 04/09/2021)
15 ready (date = 04/07/2021)
In the second example the compensation before giving the first plandate is 1 day.
The compensation after giving the first plandate is 7 days (steps 5 and 8) + 7 days (steps 11 and 14) = 14 days.
The runtime of the order is the total runtime minus 15 days.
The compensated plandate is the original plandate + 14 days (and in this case no score on the 1st plandate).
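The part-2 compensation rule from the example above (only legitimate holds count, and each is compensated by the gap between the plandate in force at the hold and the new plandate after OffHold) can be sketched like this; the tuple layout is my own simplification of the log:

```python
from datetime import date

# Part-2 events of the second example: (event, legitimate?, plandate).
events = [
    ("Plan1", True, date(2021, 3, 19)),   # step 5
    ("OnHold", True, None),               # step 6, legitimate
    ("Plandate", True, date(2021, 3, 26)),  # step 8
    ("OnHold", False, None),              # step 9, not legitimate
    ("Plandate", True, date(2021, 4, 2)),   # step 11
    ("OnHold", True, None),               # step 12, legitimate
    ("Plandate", True, date(2021, 4, 9)),   # step 14
]

def compensation_days(events):
    """Sum (new plandate - plandate at hold) over legitimate holds only."""
    current_plan = None
    pending = None  # plandate in force when a legitimate hold started
    total = 0
    for name, legit, plan in events:
        if name in ("Plan1", "Plandate"):
            if pending is not None:
                total += (plan - pending).days
                pending = None
            current_plan = plan
        elif name == "OnHold" and legit:
            pending = current_plan
    return total

print(compensation_days(events))  # 7 (steps 5->8) + 7 (steps 11->14) = 14
```

Note that the step-9 hold contributes nothing, because it is not legitimate, even though a new plandate follows it.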
In between the events mentioned there are different other events for communication about the order.
In total there are about 150K orders with 3M events
The remodelling of the table makes it possible for me to determine in which part an event fell and how to compensate per order.
Thanks @Rocco_sprmnt21 , didn't catch the error there with missing HoldDates.
The issue was that they want to use the same value on different lines. For example, a plan's ID could be relevant to more than one hold. Moreover, there could be no Plan1 after a hold, so the previous one should be used, and so on. That would all be solved if we looked at the whole table every time, as I did in my first solution, but since we now know that there are 3 million records, I tried to look at every order's records only once. Of course, some values will not be there (like PdtPlan1).
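The fill-down idea (Power Query's Table.FillDown) replaces missing values with the last non-null value above, within each order. A pure-Python equivalent, with hypothetical column names, looks like this:

```python
from datetime import date

def fill_down(rows, column, order_col="OrderID"):
    """Forward-fill `column` per order; rows are assumed sorted by order."""
    last = {}  # last seen non-null value per order
    for row in rows:
        if row[column] is None:
            row[column] = last.get(row[order_col])
        else:
            last[row[order_col]] = row[column]
    return rows

rows = [
    {"OrderID": 1, "Plan1": date(2021, 3, 19)},
    {"OrderID": 1, "Plan1": None},  # filled from the row above
    {"OrderID": 2, "Plan1": None},  # stays None: no earlier value for order 2
]
filled = fill_down(rows, "Plan1")
```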
Anyways, here the table for anyone that wants to play around:
And here's a solution that uses fill down to fix those issues:
Sorry for the late reaction, and thanks for the possible solution. The data in my table seems to be too big for this rebuild: I have over 3 million records, and the solution you gave runs out of time. So I tried to select a part of the records, but it seems the dataflow still uses the original table, and it keeps running out of time. Then I tried to get fewer records from the database using a SQL statement instead of choosing a table, but that doesn't work in the dataflow (the same SQL works in a Power BI dataset). So for now I'm out of options. I will come back if my data problem is solved.
Hi there, yes, this solution would be very heavy in your dataset.
The best I could hastily do is this:
Instead of buffering and filtering every time, it sorts the table once and then groups locally. Give it a try and let me know 🙂
(YourTable is your table's query name)
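The "sort once, then group locally" idea (GroupKind.Local in Power Query's Table.Group) can be illustrated in plain Python with itertools.groupby, which likewise only groups adjacent rows: after one global sort by OrderID, each order's rows are contiguous, so a single linear pass groups them instead of re-scanning the whole table per order.

```python
from itertools import groupby
from operator import itemgetter

rows = [
    {"OrderID": 2, "Event": "Start"},
    {"OrderID": 1, "Event": "Start"},
    {"OrderID": 1, "Event": "OnHold"},
]

rows.sort(key=itemgetter("OrderID"))  # one O(n log n) sort up front

# groupby only merges consecutive equal keys, so the sort above is required.
grouped = {k: [r["Event"] for r in g]
           for k, g in groupby(rows, key=itemgetter("OrderID"))}
print(grouped)  # {1: ['Start', 'OnHold'], 2: ['Start']}
```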