Hi All,
Looking for recommendations on how to best manage importing a CSV that contains different transaction types. Here is a quick sample of how the CSV looks.
Standard Orders
Name, OrderNumber, OrderDate
Bob, 123456, 23/09/2021
Jane, 987654, 24/09/2021
Premium Orders
Name, OrderNumber, OrderDate
Jack, 234567, 22/09/2021
Sally, 875903, 23/09/2021
As you can see, the CSV starts with an identifier for the transaction type, then lists the column headers, and then the records. I am comfortable removing the blank lines and the repeated headers, but I am not sure how to go about creating a new column to hold the transaction type.
Ideally I want to get the data set looking like this:
Name, OrderNumber, OrderDate, TransactionType
Bob, 123456, 23/09/2021, Standard Orders
Jane, 987654, 24/09/2021, Standard Orders
Jack, 234567, 22/09/2021, Premium Orders
Sally, 875903, 23/09/2021, Premium Orders
I can find the transaction type using something like
if [Name] = "Standard Orders" then "Standard Orders" else ""
I tried creating a parameter and then changing its value using the if statement above, but it doesn't look like that's allowed. I then thought I might need to create a function, set a variable, put the if statement inside the function and return the value, but it doesn't seem you can update variables either.
I think I am pretty close but just need someone to point me in the right direction. Appreciate any tips or pointers.
Cheers,
Fly Kick
Hi @FlyKick,
Not sure I fully understand the format of your file but paste the below into the advanced editor of a blank query and let me know if I'm on the right track.
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCi5JzEtJLEpR8C9KSS0qVorViVbyS8xN1YEI+JXmJqUWQTkuiSWpYAVO+Uk6CoZGxiamZjoKRsb6Bpb6RgZGhmA5r8Q8oGZLC3MzUxOgpAmKZEBRam5maS7UMgXibPNKTM4GWQO0zRxIG6GYGJyYk1Opo2BhbmppYIzsGKDhsQA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t]),
    #"Split Column by Delimiter" = Table.SplitColumn(Source, "Column1", Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv), {"Column1.1", "Column1.2", "Column1.3"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Column1.1", type text}, {"Column1.2", type text}, {"Column1.3", type text}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type1", "Custom", each if [Column1.2] = null then [Column1.1] else null),
    #"Filled Down" = Table.FillDown(#"Added Custom",{"Custom"})
in
    #"Filled Down"
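To finish tidying the result, the remaining cleanup steps (dropping the section-title and repeated header rows, renaming the columns, and typing the date) might look something like the sketch below. These steps would slot in after #"Filled Down" and before the final in, with the query's output changed to the last step. Step and column names here are illustrative, not from Kim's query, and the "en-GB" culture is an assumption based on the dd/MM/yyyy dates in the sample:

```
    // Section titles split with a null second column; header rows start with "Name".
    #"Filtered Rows" = Table.SelectRows(#"Filled Down", each [Column1.2] <> null and [Column1.1] <> "Name"),
    // Rename the split columns and the filled-down transaction type.
    #"Renamed Columns" = Table.RenameColumns(#"Filtered Rows", {{"Column1.1", "Name"}, {"Column1.2", "OrderNumber"}, {"Column1.3", "OrderDate"}, {"Custom", "TransactionType"}}),
    // Parse dd/MM/yyyy dates using a UK culture (assumed from the sample data).
    #"Changed Type2" = Table.TransformColumnTypes(#"Renamed Columns", {{"OrderDate", type date}}, "en-GB")
```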
Regards,
Kim
Have I solved your problem? Please click Accept as Solution so I don't keep coming back to this post, oh yeah, others may find it useful also ;). |
If you found this post helpful, please give Kudos. It gives me a sense of instant gratification and, if you give me Kudos enough times, magical unicorns will appear on your screen. If you find my signature vaguely amusing, please give Kudos. | Proud to be a Super User! |
You can add a column (using the custom column dialog) that retrieves the first item in the column like this:
#"Changed Type"[Column1]{0}
where #"Changed Type" is the name of the step (or table)
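For reference, a quick standalone illustration of that positional lookup syntax. The table and values here are made up for the example:

```
let
    // A throwaway table standing in for the #"Changed Type" step.
    #"Changed Type" = Table.FromRecords({[Column1 = "Standard Orders"], [Column1 = "Bob"]}),
    // [Column1] projects the column as a list; {0} takes its first item.
    FirstItem = #"Changed Type"[Column1]{0}
in
    FirstItem
```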
Hi Hot Chilli,
Thanks for the response. Unfortunately that will fill all the rows with "Standard Orders", even the Premium rows. I am looking for a way to fill each row with its transaction type until the next transaction type header is reached.
That's why I thought it needs a variable or parameter. Something that can be updated based on the value of the last row.
In pseudo code it would be:
if [Name]{[Index]-1} = "Standard Orders" then set columnNameVar to "Standard Orders" else
if [Name]{[Index]-1} = "Premium Orders" then set columnNameVar to "Premium Orders" else
don't update the variable
Then I would add a conditional column based on the variable name. Not sure if this possible in Power Query though?
I might end up using a Logic App to process the CSV, add the records to Azure Table Storage, and then connect Power BI to that.
Thank You Kim! That is such an elegant and simple solution! I was way over complicating it! Thank you, thank you, thank you......