Frequent Visitor

Value and Identifier in 2 Columns

I'm pretty sure this is rather easy, but I can't figure out the best terms to search for the solution.

 

I have numerical values that I want to sum, but each value's "identifier" sits next to it in another column. It's an exported report and, as you can see, the categories change.

 

| Model Material 1 | Model Material 1 Usage | Model Material 2 | Model Material 2 Usage |
| VeroPureWhite | 56.992 | Agilus30Clear | 33.619 |
| VeroMagenta | 0 | VeroPureWhite | 28.605 |
| VeroPureWhite | 52.155 | Agilus30Clear | 34.051 |
| VeroPureWhite | 39.022 | Agilus30Clear | 26.847 |
| VeroPureWhite | 44.657 | Agilus30Clear | 26.145 |

 

What's the best way to properly categorize the data to get the sum of each material type?

Frequent Visitor

So it looks like a lot of people had different approaches, and I'm still trying to figure out the grouping.

 

I grouped the data and unpivoted the table. I plan to use the date/time started as the unique identifier column.

 

Since the material usage falls on the line below the material type used, I need to get these onto the same line. The Group By function is what's confusing me.
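To make the pairing idea concrete, here is a rough Python sketch (not Power Query M; the column and material names mirror the sample data below, and the two-at-a-time pairing is the assumption being illustrated): after unpivoting, each "Model Material N" row is immediately followed by its "Model Material N Usage" row, so stepping through the rows in pairs puts name and usage back on one line.

```python
# Hypothetical sketch (not Power Query M): pair each "Model Material N"
# row with its matching "Model Material N Usage" row, keyed by print job.
rows = [
    ("9/18/20 15:06", "Model Material 1", "VeroPureWhite"),
    ("9/18/20 15:06", "Model Material 1 Usage", "56.992"),
    ("9/18/20 15:06", "Model Material 2", "Agilus30Clear"),
    ("9/18/20 15:06", "Model Material 2 Usage", "33.619"),
]

pairs = []
# Consecutive rows come in (material, usage) pairs after unpivoting,
# so stepping two at a time puts them back on one line.
for (started, attr, material), (_, _, usage) in zip(rows[::2], rows[1::2]):
    pairs.append((started, attr, material, float(usage)))

print(pairs[0])  # ('9/18/20 15:06', 'Model Material 1', 'VeroPureWhite', 56.992)
```

The same two-at-a-time walk is what the Group By / index-pairing answers below do in M.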

 

 

| Printer | Started | Completed | Total Job Time | Total Print Time | Status | Attribute | Value |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 1 | VeroPureWhite |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 1 Usage | 56.992 |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 2 | Agilus30Clear |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 2 Usage | 33.619 |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 3 | TissueMatrix |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 3 Usage | 18.039 |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 4 | VeroMagenta-V |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 4 Usage | 24.561 |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 5 | BoneMatrix |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 5 Usage | 23.18 |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 6 | GelMatrix |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Model Material 6 Usage | 26.864 |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Support Material 1 | SUP706 |
| 00E0F434FC6C | 9/18/20 15:06 | 9/18/20 19:14 | 0.04:08:31 | 0.04:04:32 | Finish | Support Material 1 Usage | 202.76 |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 1 | VeroMagenta |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 1 Usage | 0 |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 2 | VeroPureWhite |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 2 Usage | 28.605 |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 3 | Agilus30Clear |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 3 Usage | 22.325 |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 4 | TissueMatrix |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 4 Usage | 9.522 |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 5 | BoneMatrix |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 5 Usage | 9.956 |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 6 | GelMatrix |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Model Material 6 Usage | 13.025 |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Support Material 1 | SUP706 |
| 00E0F434FC6C | 9/16/20 7:18 | 9/16/20 9:26 | 0.02:08:29 | 0.01:57:46 | Finish | Support Material 1 Usage | 95.823 |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 1 | VeroPureWhite |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 1 Usage | 52.155 |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 2 | Agilus30Clear |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 2 Usage | 34.051 |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 3 | TissueMatrix |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 3 Usage | 15.961 |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 4 | VeroMagenta-V |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 4 Usage | 17.285 |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 5 | BoneMatrix |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 5 Usage | 18.246 |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 6 | GelMatrix |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Model Material 6 Usage | 23.779 |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Support Material 1 | SUP706 |
| 00E0F434FC6C | 9/11/20 7:52 | 9/11/20 11:29 | 0.03:37:48 | 0.03:32:18 | Finish | Support Material 1 Usage | 169.415 |

 

 

Hello @cwileyrrt 

 

I don't know what this dataset has to do with the original request; it seems you didn't show us the real dataset in your initial post. Please show us what your original dataset looks like and what exactly you need.

Here's my best guess 🙂

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("vZdNa8MwDIb/Sul5c235I7ZvW1l3Kgy6dofSQ2BmC4R2JC3s58+hJc4YAY+qukXOx5M3UqTX2+2U8ye+UFIt5mY+vZu6mbAz4BOhPTfD2HmhYswZV55bL0UfKC8hBotqX7Wf8WB5eA/1ZFkeQ1OV9aS7cBOaw8upCW+f1TFMd3cU2Mm6LT9CPKENcw5oqN25h4+qPrWSz+tQNkTYXqyUzAhHQ5Vx6bVq21OIS031TUTttQrLuCTSqi5VvIzo/bG83xBhe7GgmDaChqrj0uNhT5pWnZRKJiwNtHvKc6gpdZqk0zBr1AjVdJTCxw+RQufBnDHQMcGdA+F14ZXJaMGX4iVB9jI5BQ4yhwwuNKXSMsM1BVNmDhhcaBIKTAKJUJU3XHCZvU7HNIxZBlRkVqvFJQ5EOm0okDltFheYPIKMN9+mYFenr69Dc/zdZlfrl4Lf5qP+5aVMamZBjlDFmaphEArRk6SXkWT7AM7vd62dR6YmNw9M6LGE4kJzzTwyNXl5xbges3y40EwrjwxNf6lmbtTc4kJzjTwyNUktGFia+s2aLcjI4eYM1FgjxIXmjBdk4nCzUhRje9Arof8fMOjAlE7jmBKxbnc/", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Printer = _t, Started = _t, Completed = _t, #"Total Job Time" = _t, #"Total Print Time" = _t, Status = _t, Attribute = _t, Value = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Printer", type text}, {"Started", type text}, {"Completed", type text}, {"Total Job Time", type duration}, {"Total Print Time", type duration}, {"Status", type text}, {"Attribute", type text}, {"Value", type text}}),
    MaterialType = {"Model Material 1", "Model Material 2","Model Material 3", "Model Material 4", "Model Material 5", "Model Material 6", "Support Material 1"},
    #"Pivoted Column" = Table.Pivot(#"Changed Type", List.Distinct(#"Changed Type"[Attribute]), "Attribute", "Value"),
    AddRecordList = Table.AddColumn
    (
        #"Pivoted Column",
        "RecordList",
        (rec)=> List.Transform(MaterialType,  (trans)=> Table.Column(Record.ToTable(Record.SelectFields(rec, List.Select(Table.ColumnNames(#"Pivoted Column"), (sel)=> Text.Start(sel,Text.Length(trans))= trans))), "Value"))
        
    ),
    DeleteColumns = Table.RemoveColumns
    (
        AddRecordList,
        List.Select(Table.ColumnNames(AddRecordList),(sel)=> List.AnyTrue(List.Transform(MaterialType, (trans)=> Text.Start(sel,Text.Length(trans))= trans)))
    ),
    #"Expanded RecordList" = Table.ExpandListColumn(DeleteColumns, "RecordList"),
    #"Extracted Values" = Table.TransformColumns(#"Expanded RecordList", {"RecordList", each Text.Combine(List.Transform(_, Text.From), "&&"), type text}),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Extracted Values", "RecordList", Splitter.SplitTextByDelimiter("&&", QuoteStyle.Csv), {"RecordList.1", "RecordList.2"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"RecordList.1", type text}, {"RecordList.2", Int64.Type}}),
    #"Renamed Columns" = Table.RenameColumns(#"Changed Type1",{{"RecordList.1", "Material"}, {"RecordList.2", "Material Usage"}})
in
    #"Renamed Columns"

 

Copy paste this code to the advanced editor in a new blank query to see how the solution works.

If this post helps or solves your problem, please mark it as solution (to help other users find useful content and to acknowledge the work of users that helped you)
Kudos are nice too

Have fun

Jimmy

 

Super User III

@cwileyrrt , I came up with a more generic conversion in case your dataset contains more Material/Usage pairs:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCkstyg8oLUoNz8gsSVXSUTI107O0NAIyHNMzc0qLjQ2cc1ITi4B8Y2M9M0NLpVgdiB7fxPTUvJJEoIQBEKObYmShZ2ZgCleMYoGRnqGpKTYLTPQMTA2x6jG21DMwwuYoIzM9CxNzrHpMTPTMTM2x6zE0AbotFgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Model Material 1" = _t, #"Model Material 1 Usage" = _t, #"Model Material 2" = _t, #"Model Material 2 Usage" = _t]),

    #"Unpivoted Columns" = Table.UnpivotOtherColumns(Source, {}, "Attribute", "Value"),
    #"Added Custom" = Table.AddColumn(#"Unpivoted Columns", "Model", each Text.Select([Attribute], {"0".."9"})),
    #"Added Index" = Table.AddIndexColumn(#"Added Custom", "Index", 0, 1, Int64.Type),
    #"Integer-Divided Column" = Table.TransformColumns(#"Added Index", {{"Index", each Number.IntegerDivide(_, 2), Int64.Type}}),
    #"Grouped Rows" = Table.Group(#"Integer-Divided Column", {"Model", "Index"}, {{"Count", each Record.FromList([Value], {"Material", "Usage"})}}),
    #"Removed Columns" = Table.RemoveColumns(#"Grouped Rows",{"Index"}),
    #"Expanded Count" = Table.ExpandRecordColumn(#"Removed Columns", "Count", {"Material", "Usage"}, {"Material", "Usage"})
in
    #"Expanded Count"

I don't think this shows the final output I'm looking for.

 

Essentially, this data is a report generated by our 3D printer showing how much material is used of each type. There could be the same material in different bays of the machine. Each row is an individual project/print.

 

I want to be able to take all this data and essentially add up the "Model Material # Usage" of each material. As shown below, VeroPureWhite was in multiple material bays.
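The goal can be sketched outside Power Query as well. This is a hypothetical Python illustration (not one of the M solutions in this thread): walk each row's (Material, Usage) cell pairs and accumulate a running total per material name, so the same material appearing in different bays is summed together. The sample values are taken from the table below.

```python
# Hypothetical sketch (not Power Query M): sum "Usage" per material across
# the paired (Material, Usage) columns, regardless of which bay it was in.
from collections import defaultdict

rows = [
    ["VeroPureWhite", 56.992, "Agilus30Clear", 33.619],
    ["VeroMagenta", 0.0, "VeroPureWhite", 28.605],
]

totals = defaultdict(float)
for row in rows:
    # Walk the row two cells at a time: material name, then its usage.
    for material, usage in zip(row[::2], row[1::2]):
        totals[material] += usage

print(dict(totals))  # VeroPureWhite accumulates 56.992 + 28.605
```

The M answers below do the same thing with unpivot/split-columns followed by Table.Group with List.Sum.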

 

| Model Material 1 | Model Material 1 Usage | Model Material 2 | Model Material 2 Usage | Model Material 3 | Model Material 3 Usage | Model Material 4 | Model Material 4 Usage |
| VeroPureWhite | 56.992 | Agilus30Clear | 33.619 | TissueMatrix | 18.039 | VeroMagenta-V | 24.561 |
| VeroMagenta | 0 | VeroPureWhite | 28.605 | Agilus30Clear | 22.325 | TissueMatrix | 9.522 |
| VeroPureWhite | 52.155 | Agilus30Clear | 34.051 | TissueMatrix | 15.961 | VeroMagenta-V | 17.285 |
| VeroPureWhite | 39.022 | Agilus30Clear | 26.847 | TissueMatrix | 8.506 | VeroMagenta-V | 12.31 |
| VeroPureWhite | 44.657 | Agilus30Clear | 26.145 | TissueMatrix | 8.535 | VeroMagenta-V | 9.665 |
| VeroPureWhite | 53.209 | Agilus30Clear | 35.149 | TissueMatrix | 16.944 | VeroMagenta-V | 18.458 |
| VeroPureWhite | 0.223 | Agilus30Clear | 0 | TissueMatrix | 0 | VeroMagenta-V | 0 |
| VeroPureWhite | 42.624 | Agilus30Clear | 31.298 | TissueMatrix | 11.424 | VeroMagenta-V | 12.515 |
| VeroPureWhite | 17.799 | Agilus30Clear | 2.116 | TissueMatrix | 2.116 | VeroMagenta-V | 2.116 |
| VeroPureWhite | 0.132 | Agilus30Clear | 0 | TissueMatrix | 0 | VeroMagenta-V | 0 |
| VeroPureWhite | 1.374 | Agilus30Clear | 1.058 | TissueMatrix | 1.058 | VeroMagenta-V | 1.058 |
| VeroPureWhite | 0.008 | Agilus30Clear | 0 | TissueMatrix | 0 | VeroMagenta-V | 0 |
| VeroPureWhite | 0.062 | Agilus30Clear | 0 | TissueMatrix | 0 | VeroMagenta-V | 0 |
| VeroPureWhite | 0 | Agilus30Clear | 0 | TissueMatrix | 0 | VeroMagenta-V | 0 |
| VeroPureWhite | 40.188 | Agilus30Clear | 24.96 | TissueMatrix | 10.583 | VeroMagenta-V | 10.658 |
| VeroPureWhite | 49.989 | Agilus30Clear | 49.728 | TissueMatrix | 49.728 | VeroMagenta-V | 49.728 |
Impactful Individual

Hi, @cwileyrrt 

Or try this:

let
    Source = Table.FromRecords(Json.Document(Binary.Decompress(Binary.FromText("5Zc9a8MwEIb/SvCcHrqTTpaylc6BDm06lA6GitZgGrATKJT+9yqQZDk1ibgpZPXHg/T4fc/260+zXL+nYbbsNmnsu2GGzaJZpXH9uB3Ty2e/Sc1cXDJ7nrqP1CzYQ4wkzlNG3H/0w3ay5mFI3SgRdEBYCx6jOG8z4qmfpm3Kh8b+WxLsgYABjJUEt9/HMl/0tenuVhLhDghywB5/5//K2ENOqDBFC2dEHi1QAG+4aOGMyKMGIrAkEe6syKOFCEx0QsJFiSBAlquoSoQDw6hKBEP0klCTCGyBAitl2AiGdPUgD8G1GhkB2Hidi5ysU+W4RIVz4Fnuo1IFunJHLldhyw25VEUE77WpYAtk5LyqqghnFbqhmSe3c7pYBHAclDIMEFmVCzl4KzTIm2sMGG0pCDzJh1CVBASKQZUEBFdYROWAYNTWIk/cNupqkd89KEddhYsyoOpLYkdQdwKt7q1xxZ1AsK2uEpg/H3SNKAKqCrEjqFNgjFzFjaQgb97fbAXKvxJXsfG3Pw==", BinaryEncoding.Base64),Compression.Deflate))),
    trsp = Table.Transpose(Table.DemoteHeaders(Source)),
    rmv_nm = Table.TransformColumns(trsp, {"Column1", each Text.Remove(_, {"0".."9"})}),
    trans = List.Transform(Table.Split(rmv_nm, 2), each Table.PromoteHeaders(Table.Transpose(_))),
    cmbTbls = Table.Combine(trans),
    grp = Table.Group(cmbTbls, {"Model Material "}, {{"Model Material  Usage", each List.Sum([Model Material  Usage]), type number}})
in
    grp

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("rZQxa8NADIX/i+dUSLqTfDeWzIEOJR2CBw8mNYQWnAT686sEN6T125rtTjL69J503u2a7TB9vpyn4e19PA3NqjGnWjUOz/vxcD4mXh+Gfop7SuRS4/A6Ho/nYdOfpvErrlKI0yV+qbTp98PHqX/axl0zmUvTrXb3qUjw/PE9Vgs5G8CqUlJbYiuZ6q34LwVKYqhUysQmQIFRdQEKpCUtBiGpEiuySZ1KbpeQQsaOGKFPICJncmsxQjKwJBDJAKKSO1ZhiZQrssoCgYYd25EzklEoW4EQJtUEGLwsz6AyY3OUXDPqXEhrAZ0LZYWdK5lge2L+bUX2xIaJLxk/4cVLuMaxN5LQFv3LG6HUImsk1h85M4cXxlzjuGvm8uiuo6g/3Ap++N7FxArSHn+7ClZCmKwk5C7H68b25kq1oLWLRKtggrf4X8ic6Lpv", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Model Material 1" = _t, #"Model Material 1 Usage" = _t, #"Model Material 2" = _t, #"Model Material 2 Usage" = _t, #"Model Material 3" = _t, #"Model Material 3 Usage" = _t, #"Model Material 4" = _t, #"Model Material 4 Usage" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Model Material 1", type text}, {"Model Material 1 Usage", Int64.Type}, {"Model Material 2", type text}, {"Model Material 2 Usage", Int64.Type}, {"Model Material 3", type text}, {"Model Material 3 Usage", Int64.Type}, {"Model Material 4", type text}, {"Model Material 4 Usage", Int64.Type}}),

    cols=Table.FromColumns({List.Combine(List.Alternate(Table.ToColumns(#"Changed Type"),1,1,1)),List.Combine(List.Alternate(Table.ToColumns(#"Changed Type"),1,1,0))},{"material","qty"}),
    #"Grouped Rows" = Table.Group(cols, {"material"}, {{"sum", each List.Sum([qty]), type nullable number}})
in
    #"Grouped Rows"

 

 

 

image.png

 

In your local environment the result should look like:

 

 

image.png

 

I tried to use the "type table" expression to set both the column names and the value types, but it doesn't seem to work: in the following expression the values of the qty column are treated as text rather than numbers, and an error is raised.

 

 

image.png

 

 

 

image.png

 

 

PS

Does anyone have any idea where this error message comes from, i.e., in what part of the function it is generated?

Why does it refer to a difference operation between two values?

 

PPS

I believe the reason lies in the way the Group By function works internally, and this error message reveals that internal aspect.

Hi, @Rocco_sprmnt21 

 

The type table expression in Power Query doesn't really set the data to a particular type, and even if it could, it wouldn't be easy to write as a dynamic expression.

hi @ziying35 ,

it was just an attempt to shorten the code by removing the line needed to change the "usage" column from text to number.

 

other solutions similar to yours:

 

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("rZQxa8NADIX/i+dUSLqTfDeWzIEOJR2CBw8mNYQWnAT686sEN6T125rtTjL69J503u2a7TB9vpyn4e19PA3NqjGnWjUOz/vxcD4mXh+Gfop7SuRS4/A6Ho/nYdOfpvErrlKI0yV+qbTp98PHqX/axl0zmUvTrXb3qUjw/PE9Vgs5G8CqUlJbYiuZ6q34LwVKYqhUysQmQIFRdQEKpCUtBiGpEiuySZ1KbpeQQsaOGKFPICJncmsxQjKwJBDJAKKSO1ZhiZQrssoCgYYd25EzklEoW4EQJtUEGLwsz6AyY3OUXDPqXEhrAZ0LZYWdK5lge2L+bUX2xIaJLxk/4cVLuMaxN5LQFv3LG6HUImsk1h85M4cXxlzjuGvm8uiuo6g/3Ap++N7FxArSHn+7ClZCmKwk5C7H68b25kq1oLWLRKtggrf4X8ic6Lpv", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Model Material 1" = _t, #"Model Material 1 Usage" = _t, #"Model Material 2" = _t, #"Model Material 2 Usage" = _t, #"Model Material 3" = _t, #"Model Material 3 Usage" = _t, #"Model Material 4" = _t, #"Model Material 4 Usage" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Model Material 1", type text}, {"Model Material 1 Usage", Int64.Type}, {"Model Material 2", type text}, {"Model Material 2 Usage", Int64.Type}, {"Model Material 3", type text}, {"Model Material 3 Usage", Int64.Type}, {"Model Material 4", type text}, {"Model Material 4 Usage", Int64.Type}}),
    #"Transposed Table" = Table.Combine(List.Transform(Table.Split(Table.Transpose(#"Changed Type"),2),each Table.Transpose(_)))
in
    #"Transposed Table"

 

 

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("rZQxa8NADIX/i+dUSLqTfDeWzIEOJR2CBw8mNYQWnAT686sEN6T125rtTjL69J503u2a7TB9vpyn4e19PA3NqjGnWjUOz/vxcD4mXh+Gfop7SuRS4/A6Ho/nYdOfpvErrlKI0yV+qbTp98PHqX/axl0zmUvTrXb3qUjw/PE9Vgs5G8CqUlJbYiuZ6q34LwVKYqhUysQmQIFRdQEKpCUtBiGpEiuySZ1KbpeQQsaOGKFPICJncmsxQjKwJBDJAKKSO1ZhiZQrssoCgYYd25EzklEoW4EQJtUEGLwsz6AyY3OUXDPqXEhrAZ0LZYWdK5lge2L+bUX2xIaJLxk/4cVLuMaxN5LQFv3LG6HUImsk1h85M4cXxlzjuGvm8uiuo6g/3Ap++N7FxArSHn+7ClZCmKwk5C7H68b25kq1oLWLRKtggrf4X8ic6Lpv", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Model Material 1" = _t, #"Model Material 1 Usage" = _t, #"Model Material 2" = _t, #"Model Material 2 Usage" = _t, #"Model Material 3" = _t, #"Model Material 3 Usage" = _t, #"Model Material 4" = _t, #"Model Material 4 Usage" = _t]),

        tabSplitColumns=Table.Combine(List.Transform(List.Split (Table.ColumnNames(Source),2), each Table.RenameColumns(Table.SelectColumns(Source,_),{{_{0},"material"},{_{1},"usage"}}))) 
in
    tabSplitColumns

 

 

 

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("rZQxa8NADIX/i+dUSLqTfDeWzIEOJR2CBw8mNYQWnAT686sEN6T125rtTjL69J503u2a7TB9vpyn4e19PA3NqjGnWjUOz/vxcD4mXh+Gfop7SuRS4/A6Ho/nYdOfpvErrlKI0yV+qbTp98PHqX/axl0zmUvTrXb3qUjw/PE9Vgs5G8CqUlJbYiuZ6q34LwVKYqhUysQmQIFRdQEKpCUtBiGpEiuySZ1KbpeQQsaOGKFPICJncmsxQjKwJBDJAKKSO1ZhiZQrssoCgYYd25EzklEoW4EQJtUEGLwsz6AyY3OUXDPqXEhrAZ0LZYWdK5lge2L+bUX2xIaJLxk/4cVLuMaxN5LQFv3LG6HUImsk1h85M4cXxlzjuGvm8uiuo6g/3Ap++N7FxArSHn+7ClZCmKwk5C7H68b25kq1oLWLRKtggrf4X8ic6Lpv", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Model Material 1" = _t, #"Model Material 1 Usage" = _t, #"Model Material 2" = _t, #"Model Material 2 Usage" = _t, #"Model Material 3" = _t, #"Model Material 3 Usage" = _t, #"Model Material 4" = _t, #"Model Material 4 Usage" = _t])


in Table.FromColumns( List.Accumulate({0..Table.ColumnCount(Source)/2-1},{{},{}},(s,c)=>let ttc=Table.ToColumns(Source) in {s{0}&ttc{2*c} , s{1}& ttc{2*c+1}} ))
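The List.Accumulate one-liner above is dense: it folds over the column-pair indices, concatenating the even-indexed columns (materials) into one list and the odd-indexed columns (usages) into another, then rebuilds a two-column table. A rough Python illustration of the same fold (hypothetical, with sample values from the data above):

```python
# Hypothetical illustration of the List.Accumulate fold: even-indexed
# columns (materials) are concatenated into one list, odd-indexed columns
# (usages) into another, yielding a two-column table.
columns = [
    ["VeroPureWhite", "VeroMagenta"],    # Model Material 1
    [56.992, 0.0],                       # Model Material 1 Usage
    ["Agilus30Clear", "VeroPureWhite"],  # Model Material 2
    [33.619, 28.605],                    # Model Material 2 Usage
]

materials, usages = [], []
for i in range(len(columns) // 2):
    materials += columns[2 * i]      # corresponds to s{0} & ttc{2*c}
    usages += columns[2 * i + 1]     # corresponds to s{1} & ttc{2*c+1}

table = list(zip(materials, usages))
print(table)
```

From there a single group-and-sum over the material column gives the totals.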

 

Hello @cwileyrrt 

 

This solution should work for you, and it's dynamic as well. The condition is that all columns with "Usage" in the name are stacked into one column, and all other columns into another (this logic can also be changed). After the final table is created, I applied a group function to sum all rows with the same material.

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("rZQxa8NADIX/i+dUSLqTfDeWzIEOJR2CBw8mNYQWnAT686sEN6T125rtTjL69J503u2a7TB9vpyn4e19PA3NqjGnWjUOz/vxcD4mXh+Gfop7SuRS4/A6Ho/nYdOfpvErrlKI0yV+qbTp98PHqX/axl0zmUvTrXb3qUjw/PE9Vgs5G8CqUlJbYiuZ6q34LwVKYqhUysQmQIFRdQEKpCUtBiGpEiuySZ1KbpeQQsaOGKFPICJncmsxQjKwJBDJAKKSO1ZhiZQrssoCgYYd25EzklEoW4EQJtUEGLwsz6AyY3OUXDPqXEhrAZ0LZYWdK5lge2L+bUX2xIaJLxk/4cVLuMaxN5LQFv3LG6HUImsk1h85M4cXxlzjuGvm8uiuo6g/3Ap++N7FxArSHn+7ClZCmKwk5C7H68b25kq1oLWLRKtggrf4X8ic6Lpv", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Model Material 1" = _t, #"Model Material 1 Usage" = _t, #"Model Material 2" = _t, #"Model Material 2 Usage" = _t, #"Model Material 3" = _t, #"Model Material 3 Usage" = _t, #"Model Material 4" = _t, #"Model Material 4 Usage" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Model Material 1", type text}, {"Model Material 1 Usage", Int64.Type}, {"Model Material 2", type text}, {"Model Material 2 Usage", Int64.Type}, {"Model Material 3", type text}, {"Model Material 3 Usage", Int64.Type}, {"Model Material 4", type text}, {"Model Material 4 Usage", Int64.Type}}),
    ColumnsMaterial = List.Select(Table.ColumnNames(#"Changed Type"), each not Text.Contains(_, "Usage")),
    ColumnsUsage = List.Select(Table.ColumnNames(#"Changed Type"), each Text.Contains(_, "Usage")),
    CreateListMaterial = List.Combine(List.Transform(ColumnsMaterial, (column)=> Table.Column(#"Changed Type", column))),
    CreateListUsage = List.Combine(List.Transform(ColumnsUsage, (column)=> Table.Column(#"Changed Type", column))), 
    CombineFinalTable = Table.FromColumns({CreateListMaterial, CreateListUsage}, {"Material", "Usage"}),
    #"Grouped Rows" = Table.Group(CombineFinalTable, {"Material"}, {{"Usage Sum", each List.Sum([Usage]), type number}})
in
    #"Grouped Rows"

 

Jimmy801_0-1601107435328.png

 

Copy paste this code to the advanced editor in a new blank query to see how the solution works.

 

If you need any help with this, come back to me.

If this post helps or solves your problem, please mark it as solution (to help other users find useful content and to acknowledge the work of users that helped you)
Kudos are nice too

Have fun

Jimmy

Super User III

Hi, @cwileyrrt , maybe you intend to convert the dataset like this?

transform.png

If so, try the code below:

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WCkstyg8oLUoNz8gsSVXSUTI107O0NAIyHNMzc0qLjQ2cc1ITi4B8Y2M9M0NLpVgdiB7fxPTUvJJEoIQBEKObYmShZ2ZgCleMYoGRnqGpKTYLTPQMTA2x6jG21DMwwuYoIzM9CxNzrHpMTPTMTM2x6zE0AbotFgA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Model Material 1" = _t, #"Model Material 1 Usage" = _t, #"Model Material 2" = _t, #"Model Material 2 Usage" = _t]),

    #"Merged Columns" = Table.CombineColumns(Source,{"Model Material 1", "Model Material 1 Usage"},Combiner.CombineTextByDelimiter(";", QuoteStyle.None),"1"),
    #"Merged Columns1" = Table.CombineColumns(#"Merged Columns",{"Model Material 2", "Model Material 2 Usage"},Combiner.CombineTextByDelimiter(";", QuoteStyle.None),"2"),
    #"Unpivoted Columns" = Table.UnpivotOtherColumns(#"Merged Columns1", {}, "Model", "Value"),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Unpivoted Columns", "Value", Splitter.SplitTextByDelimiter(";", QuoteStyle.Csv), {"Material", "Usage"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Material", type text}, {"Usage", type number}})
in
    #"Changed Type"
Super User IV

@cwileyrrt What is the expected output or is that the expected output? Sorry, having trouble following, can you post sample data as text and expected output?


Not really enough information to go on, please first check if your issue is a common issue listed here: https://community.powerbi.com/t5/Community-Blog/Before-You-Post-Read-This/ba-p/1116882

Also, please see this post regarding How to Get Your Question Answered Quickly: https://community.powerbi.com/t5/Community-Blog/How-to-Get-Your-Question-Answered-Quickly/ba-p/38490

The most important parts are:
1. Sample data as text, use the table tool in the editing bar
2. Expected output from sample data
3. Explanation in words of how to get from 1. to 2.



