dujhe
Regular Visitor

Create a table with loops

Hi,

 

I have a dataset of completed responses to a web form filled in by our providers. The dataset contains the provider's unique ID, the date the form was completed, and the value I need to report on. A provider can complete the form as many times as they like, on whatever date they like.
[Screenshot Capture1.PNG: sample of the response data (Unique ID, Date, Value)]

 

I have been asked to report on the trend of this value. What I am trying to do is produce a table with a date column that starts from the earliest date in the table above, lists every day up to today, and then repeats this for each Unique ID.

When there isn't a response from a provider on a given day, I would like to carry forward the most recent value for that Unique ID. To explain this a bit more clearly, please see the table below, which is what I would ideally like to get to. You can see that Unique ID 1 has a value of 1 until 26/05/2020, when they submitted a new value of 2; that value is then reported until 30/05/2020, when they submitted another return.
[Screenshot Capture2.PNG: desired output table with one row per Unique ID per date]

 

I have created a separate date table, joined my dataset to it, and then created the following measure:
[Screenshot Capture.PNG: the measure definition]

 

I then created a matrix with the date from the date table and used this measure as the value, but it just gave me a single total rather than a value for each date.
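To illustrate what I am trying to achieve (this is not the exact measure in my screenshot), I am after a "latest value as of each date" pattern, roughly along these lines, where 'DateTable', Table1 and the column names are placeholders for my actual tables:

// sketch only - table and column names are placeholders
Latest Value =
VAR currentDate_ = MAX ( 'DateTable'[Date] )    // date on the current matrix row
VAR latestDate_ =
    CALCULATE (
        MAX ( Table1[Date] ),                   // most recent submission on or before that date
        Table1[Date] <= currentDate_,
        REMOVEFILTERS ( 'DateTable' )
    )
RETURN
    CALCULATE (
        MAX ( Table1[Value] ),                  // value submitted on that date
        Table1[Date] = latestDate_,
        REMOVEFILTERS ( 'DateTable' )
    )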

Any help anyone can provide would be greatly appreciated.


1 ACCEPTED SOLUTION

Hi @dujhe 

I agree that resorting to M is an option worth considering. In this case, however, the required DAX code is not excessively complex. Create a calculated table, where Table1 is the initial table you show:

NewTable = 
// earliest submission date in the data and today bound the calendar
VAR startDate_ = MIN(Table1[Date])
VAR endDate_ = TODAY()
// one row per ID per day between startDate_ and endDate_
VAR tab0_ = CROSSJOIN(DISTINCT(Table1[ID]), CALENDAR(startDate_, endDate_))
RETURN 
ADDCOLUMNS(tab0_, "Value", 
            // most recent submission date for this ID on or before the current date
            VAR latestDate_ = CALCULATE(MAX(Table1[Date]), Table1[Date] <= EARLIER([Date]), Table1[ID] = EARLIER([ID]))
            // value submitted on that date, carried forward until the next submission
            RETURN CALCULATE(MAX(Table1[Value]), Table1[Date] = latestDate_, Table1[ID] = EARLIER([ID])))
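Once NewTable is loaded, the trend can be reported straight off it: put NewTable[Date] on the axis of a line chart (or in a matrix together with NewTable[ID]) and use a simple measure as the value. A minimal sketch, assuming you want the average of the carried-forward values across providers (the measure name and the aggregation are just an example):

Average Reported Value = AVERAGE ( NewTable[Value] )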

 

Please mark the question solved when done and consider giving kudos if posts are helpful.

Contact me privately for support with any larger-scale BI needs, tutoring, etc.

Cheers 


3 REPLIES
mahoneypat
Employee

Making tables like this is easier in M/Query Editor IMO, so that's how I approached it. I used similar data, and I believe this gets your desired result. Please paste this into a blank query to see one approach to doing this.

 

let
    // sample data equivalent to the table in the original post
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUTLVN9Q3MjAyADINlWJ1YIImMEEjJEEzVJVGYEEjbIKmSNpjAQ==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Unique ID" = _t, Date = _t, Value = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Unique ID", Int64.Type}, {"Date", type date}, {"Value", Int64.Type}}),
    // every date from the earliest to the latest date in the data
    Custom1 = List.Dates(List.Min(#"Changed Type"[Date]), Duration.Days(List.Max(#"Changed Type"[Date])-List.Min(#"Changed Type"[Date]))+1, #duration(1,0,0,0)),
    #"Converted to Table" = Table.FromList(Custom1, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Renamed Columns" = Table.RenameColumns(#"Converted to Table",{{"Column1", "Date2"}}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Renamed Columns",{{"Date2", type date}}),
    // cross join: one row per date per Unique ID
    #"Added Custom" = Table.AddColumn(#"Changed Type1", "Custom", each List.Distinct(#"Changed Type"[Unique ID])),
    #"Expanded Custom" = Table.ExpandListColumn(#"Added Custom", "Custom"),
    #"Renamed Columns1" = Table.RenameColumns(#"Expanded Custom",{{"Custom", "Unique ID"}}),
    #"Changed Type2" = Table.TransformColumnTypes(#"Renamed Columns1",{{"Unique ID", Int64.Type}}),
    // bring in the submitted Value where a response exists for that date/ID
    #"Merged Queries" = Table.NestedJoin(#"Changed Type2", {"Date2", "Unique ID"}, #"Changed Type", {"Date", "Unique ID"}, "Merged", JoinKind.LeftOuter),
    #"Expanded Merged" = Table.ExpandTableColumn(#"Merged Queries", "Merged", {"Value"}, {"Value"}),
    // fill the Value down within each Unique ID so it is carried forward until the next submission
    #"Grouped Rows" = Table.Group(#"Expanded Merged", {"Unique ID"}, {{"AllRows", each _, type table [Date2=date, #"Unique ID"=number, Value=number]}}),
    #"Added Custom1" = Table.AddColumn(#"Grouped Rows", "Custom", each Table.FillDown([AllRows],{"Value"})),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"AllRows"}),
    #"Expanded Custom1" = Table.ExpandTableColumn(#"Removed Columns", "Custom", {"Date2", "Value"}, {"Date2", "Value"}),
    #"Changed Type3" = Table.TransformColumnTypes(#"Expanded Custom1",{{"Value", Int64.Type}}),
    #"Renamed Columns2" = Table.RenameColumns(#"Changed Type3",{{"Date2", "Date"}}),
    #"Changed Type4" = Table.TransformColumnTypes(#"Renamed Columns2",{{"Date", type date}})
in
    #"Changed Type4"

 

If this works for you, please mark it as the solution. Kudos are appreciated too. Please let me know if not.

Regards,

Pat

Did I answer your question? Mark my post as a solution! Kudos are also appreciated!

To learn more about Power BI, follow me on Twitter or subscribe on YouTube.


@mahoneypa HoosierBI on YouTube



@dujhe 

Here is another version in M. I think the DAX version will likely be faster if your tables are large. BaseTable1 is the first table you showed:

 

let
    // sample data with the same shape as BaseTable1 (ID, Date, Value)
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUTIy0Dcw1TcyMDIAcgyVYnUgwsaYwkYg1cZIwkZgYWMQywhJ2BgsbAISNsVqtpEZqiGxAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [ID = _t, Date = _t, Value = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"ID", Int64.Type}, {"Date", type date}, {"Value", Int64.Type}}),
    // one nested table of submitted dates/values per ID
    #"Grouped Rows" = Table.Group(#"Changed Type", {"ID"}, {{"AvailableDates", each Table.FromColumns({[Date],[Value]}, {"Date", "Value"}) }}),
    // every date from the earliest date in BaseTable1 up to today
    allDates_ = List.Dates(List.Min(BaseTable1[Date]), Number.From(Date.From(DateTime.LocalNow()) - List.Min(BaseTable1[Date])) +1,#duration(1,0,0,0)),
    // dates with no submission for the ID, paired with null values
    #"Added Column" = Table.AddColumn(#"Grouped Rows", "Missingdates", each let missingdatesL_= List.Difference(allDates_,[AvailableDates][Date]) in Table.FromColumns({missingdatesL_,List.Repeat({null},List.Count(missingdatesL_))}, {"Date", "Value"}) ),
    // combine submitted and missing dates, sort, and fill the value down to carry it forward
    #"Added Custom" = Table.AddColumn(#"Added Column", "AllDates", each Table.Sort(Table.Combine({[AvailableDates],[Missingdates]}),{{"Date", Order.Ascending}})),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Filldown", each Table.FillDown([AllDates],{"Value"})),
    #"Removed Columns1" = Table.RemoveColumns(#"Added Custom1",{"AvailableDates", "Missingdates", "AllDates"}),
    #"Expanded Filldown" = Table.ExpandTableColumn(#"Removed Columns1", "Filldown", {"Date", "Value"}, {"Date", "Value"})
in
    #"Expanded Filldown"

 

Please mark the question solved when done and consider giving kudos if posts are helpful.

Contact me privately for support with any larger-scale BI needs, tutoring, etc.

Cheers 

