
ElliotP
Post Prodigy

Query about how to do this in Power Query

Afternoon,

 

I'm trying to calculate the amount of stock of an item sold per hour from my data table. I've got this query at the moment, and it works well with one limitation:

let
    Source = #"itemdetailsdogfood$",
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type date}, {"Time - Copy", Int64.Type}, {"Item", type text}}),
    #"Grouped Rows1" = Table.Group(#"Changed Type", {"Date"}, {{"Count", each Table.RowCount(_), type number}, {"AllData", each _, type table}}),
    #"Expanded AllData1" = Table.ExpandTableColumn(#"Grouped Rows1", "AllData", {"Hour", "Item"}, {"Hour", "Item"}),
    #"Added Custom" = Table.AddColumn(#"Expanded AllData1", "Smoothie", (CurrentRecord) => (Table.RowCount(Table.SelectRows(#"Expanded AllData1", each [Item] = "Smoothie" and [Hour] <= CurrentRecord[Hour])))/Table.RowCount(Table.Distinct( #"Changed Type",{"Date"})) ),
    #"Added Custom2" = Table.AddColumn(#"Added Custom", "Fritters", (CurrentRecord) => (Table.RowCount(Table.SelectRows(#"Added Custom", each  [Item] = "Fritters" and [Hour] <= CurrentRecord[Hour])))/Table.RowCount(Table.Distinct( #"Changed Type",{"Date"})) ),
    #"Removed Duplicates" = Table.Distinct(#"Added Custom2", {"Date", "Hour"}),
    #"Removed Columns" = Table.RemoveColumns(#"Removed Duplicates",{"Date", "Count", "Item"}),
    #"Removed Duplicates1" = Table.Distinct(#"Removed Columns", {"Hour"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Removed Duplicates1",{{"Hour", Int64.Type}}),
    #"Sorted Rows" = Table.Sort(#"Changed Type1",{{"Hour", Order.Ascending}}),
    #"Changed Type2" = Table.TransformColumnTypes(#"Sorted Rows",{{"Fritters", type number}, {"Smoothie", type number}})
in
    #"Changed Type2"

The limitation (you can see it in Added Custom and Added Custom2) is that to calculate the average sales per day of the items I have to create a custom column for each item, which is time consuming. I tried putting a * in the query in place of the item name and received a function output (never seen that before). Does anyone know an easy way to write the custom column so that it isn't solely for Item = "Smoothie" but covers all items, so I don't have to add 50 custom columns, one per item?

 

Attached pbix: https://1drv.ms/u/s!At8Q-ZbRnAj8hk25Rwb3njrRokHF

 

 


I can't tell from the query what is going on - even with the sample workbook (thanks for posting, btw) - because the query will only work (step by step) if the data source is accessible, which of course it is not. But it doesn't sound to me like a Power Query problem; this sounds like a Power Pivot problem - exactly the opposite of your other post.

This article I wrote may help: http://exceleratorbi.com.au/shaping-modelling-power-bi/

 



* Matt is an 8-time Microsoft MVP (Power BI) and author of the Power BI book Supercharge Power BI.

Agree with Matt that this might be better done in DAX. But if you want to stay within the Query Editor, this is the dynamic code:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("3ZTLCoJAFIZfZXAd4hw1dRlSL+CihbgwG1RyGtDx/ZMEu3o8kEm0nfPxn+s/cWxwz7S4CRb3jJURZXUqD5U4sm2eN90Dd41k9QSFbaOVZBup2rMeYXZVqtm+KLX4RKQutRb1WB2TORYAlmgjkkrpohwNU3Y2wyjwLkhVTHTy/X3fEvho1KNEAbsD58pY6K38OeMiTPB7iIMgwOdheH92lo0xAYGh6KwJjE1gYDADUNw+CYFDUfIfjK3auhJNw8JCZaqzuWBhehKDFwHGvoF+mGDfpUTUgtcC36lxrIGe4e4Ek1wA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Date = _t, Item = _t, Hour = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type date}, {"Item", type text}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type",{"Date", "Item", "Hour"}),
    #"Sorted Rows1" = Table.Sort(#"Removed Other Columns",{{"Hour", Order.Ascending}}),
    #"Grouped Rows" = Table.Group(#"Sorted Rows1", {"Item"}, {{"All", each Table.AddIndexColumn(_, "Index",1,1), type table}}),
    #"Added Custom1" = Table.AddColumn(#"Grouped Rows", "Custom", each Table.Group([All], {"Hour"}, {{"Prev", each List.Max(_[Index])}})),
    #"Removed Columns1" = Table.RemoveColumns(#"Added Custom1",{"All"}),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Removed Columns1", "Custom", {"Hour", "Prev"}, {"Hour", "Prev"}),
    #"Added Custom3" = Table.AddColumn(#"Expanded Custom", "Share", each [Prev]/List.Count(List.Distinct(Source[Date]))),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom3",{"Prev"}),
    Pivot = Table.Pivot(#"Removed Columns", List.Distinct(#"Removed Columns"[Item]), "Item", "Share"),
    #"Filled Down" = Table.FillDown(Pivot, Table.ColumnNames(Pivot))
in
    #"Filled Down"

If you want to load it into the datamodel, you'd better skip the last 2 steps (pivot).
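For readers following along, the core of this dynamic pattern can be sketched with inline sample data (the table and column names below are illustrative, not from the original source): group by item, add an index within each group to get a running count, keep the highest index per hour, then divide by the number of distinct dates.

```m
let
    // Hypothetical sample in long/narrow format: one row per unit sold
    Sales = #table(
        type table [Date = date, Item = text, Hour = Int64.Type],
        {
            {#date(2024, 4, 1), "Smoothie", 9},
            {#date(2024, 4, 1), "Smoothie", 10},
            {#date(2024, 4, 1), "Fritters", 9},
            {#date(2024, 4, 2), "Smoothie", 9}
        }),
    Days   = List.Count(List.Distinct(Sales[Date])),
    Sorted = Table.Sort(Sales, {{"Hour", Order.Ascending}}),
    // An index within each item group = running count of units sold
    Grouped = Table.Group(Sorted, {"Item"},
        {{"All", each Table.AddIndexColumn(_, "Index", 1, 1), type table}}),
    // The highest index per hour is the cumulative count at that hour
    PerHour = Table.AddColumn(Grouped, "Custom",
        each Table.Group([All], {"Hour"}, {{"Cum", each List.Max(_[Index])}})),
    Expanded = Table.ExpandTableColumn(
        Table.RemoveColumns(PerHour, {"All"}), "Custom", {"Hour", "Cum"}),
    // Average units sold per day up to each hour
    Share = Table.AddColumn(Expanded, "Share", each [Cum] / Days, type number)
in
    Share
```

The long/narrow result can then be pivoted per item, as in the full query above, or left unpivoted for the data model.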

 

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

That's amazing thank you.

 

I've been trying to replicate the second table 'Stock Turnover' from my pbix with my newfound ability.

 

So Query3 becomes the new StocksSold Query;

let
    Source = #"itemdetailsdogfood$",
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type date}, {"Time - Copy", Int64.Type}, {"Item", type text}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type",{"Date", "Item", "Time - Copy"}),
    #"Sorted Rows1" = Table.Sort(#"Removed Other Columns",{{"Time - Copy", Order.Ascending}}),
    #"Grouped Rows" = Table.Group(#"Sorted Rows1", {"Item"}, {{"All", each Table.AddIndexColumn(_, "Index",1,1), type table}}),
    #"Added Custom1" = Table.AddColumn(#"Grouped Rows", "Custom", each Table.Group([All], {"Time - Copy"}, {{"Prev", each List.Max(_[Index])}})),
    #"Removed Columns1" = Table.RemoveColumns(#"Added Custom1",{"All"}),
    #"Expanded Custom" = Table.ExpandTableColumn(#"Removed Columns1", "Custom", {"Time - Copy", "Prev"}, {"Time - Copy", "Prev"}),
    #"Added Custom3" = Table.AddColumn(#"Expanded Custom", "Share", each [Prev]/List.Count(List.Distinct(Source[Date]))),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom3",{"Prev"})
in
    #"Removed Columns"

Produces this: https://gyazo.com/845668570b2ed9b7a674af83a3925775

Instead of this: https://gyazo.com/74a4e9364bf89fe3ea62dfdc651608cb

 

My first instinct is to pivot the table.

 

Pivoting produces this: https://gyazo.com/da68b8bef4ddac201137fb78f384d453

 

But then when I do the Power Query for the final table:

let
    Source = Csv.Document(File.Contents("C:\Users\Elliot Parker\Desktop\itemlist.csv"),[Delimiter=",", Columns=15, Encoding=1252, QuoteStyle=QuoteStyle.None]),
    #"Promoted Headers" = Table.PromoteHeaders(Source),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"Token", type text}, {"Item Name", type text}, {"Description", type text}, {"Category", type text}, {"SKU", type text}, {"Variation Name", type text}, {"Price", type number}, {"Current Quantity Neoscona", type text}, {"New Quantity Neoscona", type text}, {"Stock Alert Enabled Neoscona", type text}, {"Stock Alert Count Neoscona", type text}, {"Modifier Set - Bacon", type text}, {"Modifier Set - Coffee", type text}, {"Modifier Set - Banana", type text}, {"Tax - GST (10%)", type text}}),
    #"Added Conditional Column" = Table.AddColumn(#"Changed Type", "Custom", each if Text.Contains([#"Modifier Set - Bacon"], "Y") then "Bacon" else if Text.Contains([#"Modifier Set - Coffee"], "Y") then "Coffee" else if Text.Contains([#"Modifier Set - Banana"], "Y") then "Banana" else null ),
    #"Added Conditional Column1" = Table.AddColumn(#"Added Conditional Column", "Stock Level", each if Text.Contains([Item Name], "Flat White") then "10" else if Text.Contains([Item Name], "Fritters") then "15" else if Text.Contains([Item Name], "Scrambled Eggs") then "15" else if Text.Contains([Item Name], "Smoothie") then "10" else null ),
    #"Added Conditional Column2" = Table.AddColumn(#"Added Conditional Column1", "Reorder Quantity", each if Text.Contains([Item Name], "Flat White") then "25" else if Text.Contains([Item Name], "Fritters") then "16" else if Text.Contains([Item Name], "Smoothie") then "11" else if Text.Contains([Item Name], "Scrambled Eggs") then "10" else null ),
    #"Changed Type1" = Table.TransformColumnTypes(#"Added Conditional Column2",{{"Reorder Quantity", Int64.Type}, {"Stock Level", Int64.Type}}),
    #"Removed Other Columns" = Table.SelectColumns(#"Changed Type1",{"Item Name", "Stock Level", "Reorder Quantity"}),
    StocksSold = #"Query3",
     #"Stocks Empty At" = Table.AddColumn(#"Changed Type","Stocks Empty At", (x) => if List.PositionOf(Table.Column(Query3,x[Item]),List.Min(List.Select(Table.Column(Query3,x[Item]),each _> x[Stock Level]))) > -1 then Query3[Time - Copy]{List.PositionOf(Table.Column(Query3,x[Item]),List.Min(List.Select(Table.Column(Query3,x[Item]),each _> x[Stock Level])))} else -1)
in
    #"Stocks Empty At"

It tells me that [Time - Copy] is an invalid identifier on the final line. It's the exact same name that is present in the Query3 table; any thoughts?
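A likely cause (offered as a guess from the error message alone): inside brackets M accepts spaces in field names, e.g. [Item Name], but a hyphen is read as the minus operator, so [Time - Copy] parses as the expression Time minus Copy. Quoting the name with #"…" fixes it:

```m
let
    // Minimal illustration with a hypothetical one-column table
    t = #table(type table [#"Time - Copy" = Int64.Type], {{9}, {10}}),
    // t[Time - Copy] would fail: it parses as Time minus Copy
    firstValue = t[#"Time - Copy"]{0}
in
    firstValue  // 9
```

The same quoting would be needed anywhere the column is accessed, e.g. `Query3[#"Time - Copy"]{...}` in the final step of the posted query.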

I cannot see what the last query has to do with the other issues (are they issues?) you've raised.

But the last step of this last query ("Stocks Empty At") refers to the 2nd step of the query ("Changed Type") - so the 5 steps in between will be ignored.
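To illustrate why (a minimal sketch, not from the thread): in an M let expression, only the expression after `in` - and the steps it transitively references - is evaluated; steps that nothing points to are silently skipped:

```m
let
    Step1 = 1,
    Step2 = Step1 + 10,    // orphaned: nothing below refers to it
    Step3 = Step1 + 100    // jumps back to Step1, so Step2 is ignored
in
    Step3  // 101, not 111
```

So the fix is to make "Stocks Empty At" build on the step immediately before it (here, #"Removed Other Columns") rather than on #"Changed Type".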

 

Re all the pictures you've sent: I believe that when you take my first query and leave the last 2 steps in as they were, you get what you want: A pivoted table with all empty cells containing the values from the "last non-empty".

 

It seems that you didn't understand why I recommended using a long, narrow table instead of short, wide ones. This is considered best practice when working with data models (in Power BI). I believe you will find some good information on this on @MattAllington's site: http://exceleratorbi.com.au/

 

If you come from the Excel world, it might help to imagine that in the model you will work with commands that work the SUMIF way, instead of having one column for each possible "if-value" (your items).

(Actually, "aggregate"-if, because you are not limited to sums, but can do counts, averages and other things as well 🙂)

Imke Feldmann (The BIccountant)


It's building upon what we built. https://gyazo.com/24d2bd69a12681a074c26d1dcd70c355

 

It's taking this and aiming to apply this:

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCs7Nzy/JyExV0lGyAGJDI6VYnWglt6LMkpLUomKoqJGJUmwsAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Item = _t, #"Stock Level" = _t, #"Reorder Quantity" = _t]),
    StocksSold = #"Query3",
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Item", type text}, {"Stock Level", type number}, {"Reorder Quantity", type number}}),
    #"Stocks Empty At" = Table.AddColumn(#"Changed Type","Stocks Empty At", (x) => if List.PositionOf(Table.Column(StocksSold,x[Item]),List.Min(List.Select(Table.Column(StocksSold,x[Item]),each _> x[Stock Level]))) > -1 then StocksSold[Hour]{List.PositionOf(Table.Column(StocksSold,x[Item]),List.Min(List.Select(Table.Column(StocksSold,x[Item]),each _> x[Stock Level])))} else -1)
in
    #"Stocks Empty At"

I'm stuck on how to pull the Stock Level from the itemlist query and not my entered data.

 

So;

 

This is what it looks like now, where the Stock Level and Reorder Quantity come from the Entered Data query (first line in the Power Query above). https://gyazo.com/d396f40e9f7ad3388052ce6484bbd576

 

I would like to be able to grab these two columns; https://gyazo.com/b5f1b05ab3a669b8d991e2743cfa6b13

 

Add them in the first line of the Power Query above, so I can apply this custom column to them:

 

Stocks Empty At = 
(x) => if List.PositionOf(Table.Column(Query3,x[Item]),List.Min(List.Select(Table.Column(Query3,x[Item]),each _> x[Stock Level]))) > -1 then Query3[Hour]{List.PositionOf(Table.Column(Query3,x[Item]),List.Min(List.Select(Table.Column(Query3,x[Item]),each _> x[Stock Level])))} else -1

I've edited it so that I think it should show the amount of time in minutes (it might actually be hours, but that's the next problem to be honest; one issue at a time).

 

In essence: how do I pull the Stock Level and Reorder Quantity in Power Query from my Itemlist query, instead of just entering the data in the Power Query code at the very top?
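One way to do that (a sketch assuming queries named Itemlist and Query3 with the column names shown earlier in the thread): reference the Itemlist query as the source and select the needed columns, then add the custom column as before:

```m
let
    // Pull the columns from the existing Itemlist query
    // instead of an entered-data table (names assumed from the thread)
    Source  = Table.SelectColumns(Itemlist,
        {"Item Name", "Stock Level", "Reorder Quantity"}),
    Renamed = Table.RenameColumns(Source, {{"Item Name", "Item"}}),
    // Same lookup idea as the posted custom column: position of the first
    // cumulative value in Query3's column x[Item] that exceeds the stock level
    Result  = Table.AddColumn(Renamed, "Stocks Empty At", (x) =>
        let
            col = Table.Column(Query3, x[Item]),
            pos = List.PositionOf(col,
                      List.Min(List.Select(col, each _ > x[Stock Level])))
        in
            if pos > -1 then Query3[Hour]{pos} else -1)
in
    Result
```

The item names in the Itemlist query would need to match the pivoted column names in Query3 for `Table.Column(Query3, x[Item])` to resolve.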

 

 

 

I recommend watching this on-demand webinar.

It will teach you the M basics that will enable you to write your own code or adjust someone else's.

 

I cannot follow your last post, but maybe the video will help you to solve it by yourself then.

Imke Feldmann (The BIccountant)

