binayjethwa
Helper IV

Calculate a sum across columns for each row.

Hi,

 

I want to create a new column called Total that calculates the sum of all the values in the current row.

Below is the sample table.

Total is the sum of all the values in that particular row; for example, for employee ABC it is the sum of all these numbers, which comes to 145.

 

123 | ABC | 10 | 20 | 30 | 10 | 10 | 30 | 11 | 12 | 145

 

Basically I want to eliminate the rows whose subtotals sum to zero, which here is EMP 125, and filter that data out of the table. Is there any way we can do this?

 

EMP ID | Name | 06-06-2023 | 07-06-2023 | 08-06-2023 | 09-06-2023 | 10-06-2023 | 11-06-2023 | 12-06-2023 | 13-06-2023 | Total
123    | ABC  | 10 | 20 | 30 | 10 | 10 | 30 | 11 | 12 | 145
124    | asd  | 11 | 21 | 31 | 11 | 11 | 31 | 12 | 13 | 154
125    | wed  | 0  | 0  | 0  | 0  | 0  | 0  | 0  | 0  | 0
126    | adr  | 13 | 23 | 33 | 13 | 13 | 33 | 14 | 15 | 172
127    | ass  | 14 | 24 | 34 | 14 | 14 | 34 | 15 | 16 | 181
128    | wer  | 15 | 25 | 35 | 15 | 15 | 35 | 16 | 17 | 190
129    | www  | 16 | 26 | 36 | 16 | 16 | 36 | 17 | 18 | 199
130    | qqq  | 17 | 27 | 37 | 17 | 17 | 37 | 18 | 19 | 208

Thanks,

Binay

 

1 ACCEPTED SOLUTION
PhilipTreacy
Super User

Hi @binayjethwa 

 

Download example PBIX file

 

Here's the query 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlLSUTIGYhMgNoXSxlC2OZQGYTMoNkdSC+JbQMUsgRjdLHOoGmMkdUPLvlid6GHtQ0rsNAHj2FgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Resource GPN" = _t, #"*Resource name" = _t, #"*Hire date" = _t, #"Employee type" = _t, #"Current/Future Rank" = _t, #"Current/Future Grade" = _t, #"*Home city" = _t, #"*Managerial Location" = _t, #"*Sub-service line" = _t, #"*Sub-management unit" = _t, #"*Client name" = _t, #"Client Number" = _t, #"*Engagement name" = _t, #"Engagement number" = _t, #"*Business unit" = _t, #"Booking type" = _t, #"Notes - Public" = _t, #"Time (Hours) 19 Jun 2023" = _t, #"Time (Hours) 26 Jun 2023" = _t, #"Time (Hours) 3 Jul 2023" = _t, #"Time (Hours) 10 Jul 2023" = _t, #"Time (Hours) 17 Jul 2023" = _t, #"Time (Hours) 24 Jul 2023" = _t, #"Time (Hours) 31 Jul 2023" = _t, #"Time (Hours) 7 Aug 2023" = _t, #"Time (Hours) 14 Aug 2023" = _t, #"Time (Hours) 21 Aug 2023" = _t, #"Time (Hours) 28 Aug 2023" = _t, #"Time (Hours) 4 Sep 2023" = _t, #"Time (Hours) 11 Sep 2023" = _t, #"Time (Hours) 18 Sep 2023" = _t, #"Time (Hours) 25 Sep 2023" = _t, #"Time (Hours) 2 Oct 2023" = _t, #"Time (Hours) 9 Oct 2023" = _t, #"Time (Hours) 16 Oct 2023" = _t, #"Time (Hours) 23 Oct 2023" = _t, #"Time (Hours) 30 Oct 2023" = _t, #"Time (Hours) 6 Nov 2023" = _t, #"Time (Hours) 13 Nov 2023" = _t, #"Time (Hours) 20 Nov 2023" = _t, #"Time (Hours) 27 Nov 2023" = _t, #"Time (Hours) 4 Dec 2023" = _t, #"Time (Hours) 11 Dec 2023" = _t, #"Time (Hours) 18 Dec 2023" = _t, #"Time (Hours) 25 Dec 2023" = _t, #"Time (Hours) 1 Jan 2024" = _t, #"Time (Hours) 8 Jan 2024" = _t, #"Time (Hours) 15 Jan 2024" = _t, #"Time (Hours) 22 Jan 2024" = _t, #"Time (Hours) 29 Jan 2024" = _t, #"Time (Hours) 5 Feb 2024" = _t, #"Time (Hours) 12 Feb 2024" = _t, #"Time (Hours) 19 Feb 2024" = _t, #"Time (Hours) 26 Feb 2024" = _t, #"Time (Hours) 4 Mar 2024" = _t, #"Time (Hours) 11 Mar 2024" = _t, #"Time (Hours) 18 Mar 2024" = _t, #"Time (Hours) 25 Mar 2024" = _t, #"Time (Hours) 1 Apr 2024" = _t, #"Time (Hours) 8 Apr 2024" = _t, #"Time (Hours) 15 Apr 2024" = _t, #"Time (Hours) 22 Apr 2024" = _t, #"Time (Hours) 29 Apr 2024" = _t, #"Time (Hours) 6 May 2024" = _t, #"Time (Hours) 13 May 2024" = _t, #"Time (Hours) 20 May 2024" = _t, #"Time (Hours) 27 May 2024" = _t, #"Time (Hours) 3 Jun 2024" = _t, #"Time (Hours) 10 Jun 2024" = _t, #"Time (Hours) 17 Jun 2024" = _t, #"Time (Hours) 24 Jun 2024" = _t, Source = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Resource GPN", Int64.Type}, {"*Resource name", Int64.Type}, {"*Hire date", Int64.Type}, {"Employee type", Int64.Type}, {"Current/Future Rank", Int64.Type}, {"Current/Future Grade", Int64.Type}, {"*Home city", Int64.Type}, {"*Managerial Location", Int64.Type}, {"*Sub-service line", Int64.Type}, {"*Sub-management unit", Int64.Type}, {"*Client name", Int64.Type}, {"Client Number", Int64.Type}, {"*Engagement name", Int64.Type}, {"Engagement number", Int64.Type}, {"*Business unit", Int64.Type}, {"Booking type", Int64.Type}, {"Notes - Public", Int64.Type}, {"Time (Hours) 19 Jun 2023", Int64.Type}, {"Time (Hours) 26 Jun 2023", Int64.Type}, {"Time (Hours) 3 Jul 2023", Int64.Type}, {"Time (Hours) 10 Jul 2023", Int64.Type}, {"Time (Hours) 17 Jul 2023", Int64.Type}, {"Time (Hours) 24 Jul 2023", Int64.Type}, {"Time (Hours) 31 Jul 2023", Int64.Type}, {"Time (Hours) 7 Aug 2023", Int64.Type}, {"Time (Hours) 14 Aug 2023", Int64.Type}, {"Time (Hours) 21 Aug 2023", Int64.Type}, {"Time (Hours) 28 Aug 2023", Int64.Type}, {"Time (Hours) 4 Sep 2023", Int64.Type}, {"Time (Hours) 11 Sep 2023", Int64.Type}, {"Time (Hours) 18 Sep 2023", Int64.Type}, {"Time (Hours) 25 Sep 2023", Int64.Type}, {"Time (Hours) 2 Oct 2023", Int64.Type}, {"Time (Hours) 9 Oct 2023", Int64.Type}, {"Time (Hours) 16 Oct 2023", Int64.Type}, {"Time (Hours) 23 Oct 2023", Int64.Type}, {"Time (Hours) 30 Oct 2023", Int64.Type}, {"Time (Hours) 6 Nov 2023", Int64.Type}, {"Time (Hours) 13 Nov 2023", Int64.Type}, {"Time (Hours) 20 Nov 2023", Int64.Type}, {"Time (Hours) 27 Nov 2023", Int64.Type}, {"Time (Hours) 4 Dec 2023", Int64.Type}, {"Time (Hours) 11 Dec 2023", Int64.Type}, {"Time (Hours) 18 Dec 2023", Int64.Type}, {"Time (Hours) 25 Dec 2023", Int64.Type}, {"Time (Hours) 1 Jan 2024", Int64.Type}, {"Time (Hours) 8 Jan 2024", Int64.Type}, {"Time (Hours) 15 Jan 2024", Int64.Type}, {"Time (Hours) 22 Jan 2024", Int64.Type}, {"Time (Hours) 29 Jan 2024", Int64.Type}, {"Time (Hours) 5 Feb 2024", Int64.Type}, {"Time (Hours) 12 Feb 2024", Int64.Type}, {"Time (Hours) 19 Feb 2024", Int64.Type}, {"Time (Hours) 26 Feb 2024", Int64.Type}, {"Time (Hours) 4 Mar 2024", Int64.Type}, {"Time (Hours) 11 Mar 2024", Int64.Type}, {"Time (Hours) 18 Mar 2024", Int64.Type}, {"Time (Hours) 25 Mar 2024", Int64.Type}, {"Time (Hours) 1 Apr 2024", Int64.Type}, {"Time (Hours) 8 Apr 2024", Int64.Type}, {"Time (Hours) 15 Apr 2024", Int64.Type}, {"Time (Hours) 22 Apr 2024", Int64.Type}, {"Time (Hours) 29 Apr 2024", Int64.Type}, {"Time (Hours) 6 May 2024", Int64.Type}, {"Time (Hours) 13 May 2024", Int64.Type}, {"Time (Hours) 20 May 2024", Int64.Type}, {"Time (Hours) 27 May 2024", Int64.Type}, {"Time (Hours) 3 Jun 2024", Int64.Type}, {"Time (Hours) 10 Jun 2024", Int64.Type}, {"Time (Hours) 17 Jun 2024", Int64.Type}, {"Time (Hours) 24 Jun 2024", Int64.Type}, {"Source", Int64.Type}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Custom", each Table.ToRows(Table.RemoveColumns(#"Changed Type", {"Resource GPN" , "*Resource name", "*Hire date", "Employee type", "Current/Future Rank", "Current/Future Grade", "*Home city", "*Managerial Location", "*Sub-service line", "*Sub-management unit", "*Client name", "Client Number", "*Engagement name", "Engagement number", "*Business unit", "Booking type", "Notes - Public", "Source"}))),
    #"Added Index" = Table.AddIndexColumn(#"Added Custom", "Index", 0, 1, Int64.Type),
    #"Added Custom1" = Table.AddColumn(#"Added Index", "Total", each List.Sum([Custom]{[Index]})),
    #"Filtered Rows" = Table.SelectRows(#"Added Custom1", each ([Total] <> 0)),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"Custom", "Index"})
in
    #"Removed Columns"

 

which operates on this data

 

x1.png

 

The important part of the code for you is the steps from #"Added Custom" onwards.

 

 

The #"Added Custom" step removes the columns you don't want to sum from the source table, and stores in each row the lists of values you do want to sum.

= Table.ToRows(Table.RemoveColumns(#"Changed Type", {"Resource GPN" , "*Resource name", "*Hire date", "Employee type", "Current/Future Rank", "Current/Future Grade", "*Home city", "*Managerial Location", "*Sub-service line", "*Sub-management unit", "*Client name", "Client Number", "*Engagement name", "Engagement number", "*Business unit", "Booking type", "Notes - Public", "Source"}))

 

Then I add an Index column.

 

The next step sums the values from each row (stored in the aforementioned lists); that is, it sums the values across the row, but only in the columns you want:

= List.Sum([Custom]{[Index]})
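To see what that expression is doing, here is a tiny self-contained illustration (a toy #table, not the real data): Table.ToRows turns a table into a list of lists, one list per row, and {[Index]} then picks out the current row's list by position.

= Table.ToRows(#table({"a", "b"}, {{1, 2}, {3, 4}}))    // returns {{1, 2}, {3, 4}}
= List.Sum({{1, 2}, {3, 4}}{0})    // {0} picks the list {1, 2}, which sums to 3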

 

You then filter out any 0 totals and delete the Index and Custom columns created during the query that are no longer needed.

 

Regards

 

Phil

 

 

 



Did I answer your question? Then please mark my post as the solution.
If I helped you, click on the Thumbs Up to give Kudos.


Blog :: YouTube Channel :: Connect on Linkedin


Proud to be a Super User!


View solution in original post

15 REPLIES
Ashish_Mathur
Super User

Hi,

This M code works

let
    Source = Excel.CurrentWorkbook(){[Name="Data"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"EMP ID", Int64.Type}, {"Name", type text}, {"06-06-2023", Int64.Type}, {"07-06-2023", Int64.Type}, {"08-06-2023", Int64.Type}, {"09-06-2023", Int64.Type}, {"10-06-2023", Int64.Type}, {"11-06-2023", Int64.Type}, {"12-06-2023", Int64.Type}, {"13-06-2023", Int64.Type}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Custom", each List.Sum(List.Transform(List.Skip(Record.ToList(_),2), each Number.From(_)))),
    #"Filtered Rows" = Table.SelectRows(#"Added Custom", each [Custom] > 0)
in
    #"Filtered Rows"
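To unpack the key expression (using a toy record, not the real table): Record.ToList turns the current row into a list of its field values, List.Skip drops the first 2 (EMP ID and Name), Number.From coerces any text values to numbers, and List.Sum totals what is left.

= Record.ToList([#"EMP ID" = 123, Name = "ABC", #"06-06-2023" = 10, #"07-06-2023" = 20])    // {123, "ABC", 10, 20}
= List.Sum(List.Transform(List.Skip({123, "ABC", 10, 20}, 2), each Number.From(_)))    // 10 + 20 = 30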

Hope this helps.

Untitled.png


Regards,
Ashish Mathur
http://www.ashishmathur.com
https://www.linkedin.com/in/excelenthusiasts/

Hi @Ashish_Mathur 

 

After the initial post the OP clarified that there are lots more than just the first 2 columns that need to be omitted.

 

It took a while but my latest reply solves it.

 

Regards

 

Phil

 






binayjethwa
Helper IV

Hi @PhilipTreacy, any update?

binayjethwa
Helper IV

Hi @PhilipTreacy,

 

Below is the list of column names:

 

Resource GPN
*Resource name
*Hire date
Employee type
Current/Future Rank
Current/Future Grade
*Home city
*Managerial Location
*Sub-service line
*Sub-management unit
*Client name
Client Number
*Engagement name
Engagement number
*Business unit
Booking type
Notes - Public
Time (Hours) 19 Jun 2023
Time (Hours) 26 Jun 2023
Time (Hours) 3 Jul 2023
Time (Hours) 10 Jul 2023
Time (Hours) 17 Jul 2023
Time (Hours) 24 Jul 2023
Time (Hours) 31 Jul 2023
Time (Hours) 7 Aug 2023
Time (Hours) 14 Aug 2023
Time (Hours) 21 Aug 2023
Time (Hours) 28 Aug 2023
Time (Hours) 4 Sep 2023
Time (Hours) 11 Sep 2023
Time (Hours) 18 Sep 2023
Time (Hours) 25 Sep 2023
Time (Hours) 2 Oct 2023
Time (Hours) 9 Oct 2023
Time (Hours) 16 Oct 2023
Time (Hours) 23 Oct 2023
Time (Hours) 30 Oct 2023
Time (Hours) 6 Nov 2023
Time (Hours) 13 Nov 2023
Time (Hours) 20 Nov 2023
Time (Hours) 27 Nov 2023
Time (Hours) 4 Dec 2023
Time (Hours) 11 Dec 2023
Time (Hours) 18 Dec 2023
Time (Hours) 25 Dec 2023
Time (Hours) 1 Jan 2024
Time (Hours) 8 Jan 2024
Time (Hours) 15 Jan 2024
Time (Hours) 22 Jan 2024
Time (Hours) 29 Jan 2024
Time (Hours) 5 Feb 2024
Time (Hours) 12 Feb 2024
Time (Hours) 19 Feb 2024
Time (Hours) 26 Feb 2024
Time (Hours) 4 Mar 2024
Time (Hours) 11 Mar 2024
Time (Hours) 18 Mar 2024
Time (Hours) 25 Mar 2024
Time (Hours) 1 Apr 2024
Time (Hours) 8 Apr 2024
Time (Hours) 15 Apr 2024
Time (Hours) 22 Apr 2024
Time (Hours) 29 Apr 2024
Time (Hours) 6 May 2024
Time (Hours) 13 May 2024
Time (Hours) 20 May 2024
Time (Hours) 27 May 2024
Time (Hours) 3 Jun 2024
Time (Hours) 10 Jun 2024
Time (Hours) 17 Jun 2024
Time (Hours) 24 Jun 2024
Source
binayjethwa
Helper IV

Hi @PhilipTreacy ,

 

Below are my column names which need to be summed up:

 

Time (Hours) 17 Jun 2023 - 23 Jun 2023
Time (Hours) 24 Jun 2023 - 30 Jun 2023
Time (Hours) 1 Jul 2023 - 7 Jul 2023
Time (Hours) 8 Jul 2023 - 14 Jul 2023
Time (Hours) 15 Jul 2023 - 21 Jul 2023
Time (Hours) 22 Jul 2023 - 28 Jul 2023
Time (Hours) 29 Jul 2023 - 4 Aug 2023
Time (Hours) 5 Aug 2023 - 11 Aug 2023
Time (Hours) 12 Aug 2023 - 18 Aug 2023
Time (Hours) 19 Aug 2023 - 25 Aug 2023
Time (Hours) 26 Aug 2023 - 1 Sep 2023
Time (Hours) 2 Sep 2023 - 8 Sep 2023
Time (Hours) 9 Sep 2023 - 15 Sep 2023
Time (Hours) 16 Sep 2023 - 22 Sep 2023
Time (Hours) 23 Sep 2023 - 29 Sep 2023
Time (Hours) 30 Sep 2023 - 6 Oct 2023
Time (Hours) 7 Oct 2023 - 13 Oct 2023
Time (Hours) 14 Oct 2023 - 20 Oct 2023
Time (Hours) 21 Oct 2023 - 27 Oct 2023
Time (Hours) 28 Oct 2023 - 3 Nov 2023
Time (Hours) 4 Nov 2023 - 10 Nov 2023
Time (Hours) 11 Nov 2023 - 17 Nov 2023
Time (Hours) 18 Nov 2023 - 24 Nov 2023
Time (Hours) 25 Nov 2023 - 1 Dec 2023
Time (Hours) 2 Dec 2023 - 8 Dec 2023
Time (Hours) 9 Dec 2023 - 15 Dec 2023
Time (Hours) 16 Dec 2023 - 22 Dec 2023
Time (Hours) 23 Dec 2023 - 29 Dec 2023
Time (Hours) 30 Dec 2023 - 5 Jan 2024
Time (Hours) 6 Jan 2024 - 12 Jan 2024
Time (Hours) 13 Jan 2024 - 19 Jan 2024
Time (Hours) 20 Jan 2024 - 26 Jan 2024
Time (Hours) 27 Jan 2024 - 2 Feb 2024
Time (Hours) 3 Feb 2024 - 9 Feb 2024
Time (Hours) 10 Feb 2024 - 16 Feb 2024
Time (Hours) 17 Feb 2024 - 23 Feb 2024
Time (Hours) 24 Feb 2024 - 1 Mar 2024
Time (Hours) 2 Mar 2024 - 8 Mar 2024
Time (Hours) 9 Mar 2024 - 15 Mar 2024
Time (Hours) 16 Mar 2024 - 22 Mar 2024
Time (Hours) 23 Mar 2024 - 29 Mar 2024
Time (Hours) 30 Mar 2024 - 5 Apr 2024
Time (Hours) 6 Apr 2024 - 12 Apr 2024
Time (Hours) 13 Apr 2024 - 19 Apr 2024
Time (Hours) 20 Apr 2024 - 26 Apr 2024
Time (Hours) 27 Apr 2024 - 3 May 2024
Time (Hours) 4 May 2024 - 10 May 2024
Time (Hours) 11 May 2024 - 17 May 2024
Time (Hours) 18 May 2024 - 24 May 2024
Time (Hours) 25 May 2024 - 31 May 2024
Time (Hours) 1 Jun 2024 - 7 Jun 2024
Time (Hours) 8 Jun 2024 - 14 Jun 2024
Time (Hours) 15 Jun 2024 - 21 Jun 2024
Time (Hours) 22 Jun 2024 - 28 Jun 2024

@binayjethwa 

 

All the columns please.

 

You said the column names might change if the year changes, so these column names are no good in a query if explicitly named.

 

I need to know the names of the other columns that do not change so that they can be excluded from the SUM operation.

 

Whatever is left can be summed.  This way the names of the columns being summed are irrelevant.
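A minimal sketch of that idea (the step and column names here are hypothetical, assuming a Source step that loads the table): list the fixed columns that never change, use List.Difference against Table.ColumnNames to find everything else, and sum only those fields, so the weekly date columns never need to be named explicitly.

let
    fixedCols = {"EMP ID", "Name"},    // hypothetical: the columns that never change
    sumCols = List.Difference(Table.ColumnNames(Source), fixedCols),    // whatever is left gets summed
    AddTotal = Table.AddColumn(Source, "Total", each List.Sum(Record.ToList(Record.SelectFields(_, sumCols))))
in
    AddTotal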

 

Phil

 





binayjethwa
Helper IV

I have many columns in my data, around 50, but I only want the columns with date names (which are actually hours), as mentioned above, to be summed up.

@binayjethwa 

 

How will I know what the date names are?

 

Please help me to help you, post your data in a file.  Just the column headers will be fine.

 

Phil

 






binayjethwa
Helper IV

Hi @PhilipTreacy ,

 

I see the below error for the Totals column; I have other fields in my table as well.

 

Any suggestion ?

 

binayjethwa_0-1687337507269.png

 

Hi @binayjethwa 

 

Can you please show all the columns in the table so I can leave out the ones that don't need to be summed up?

 

Regards

 

Phil

 





PhilipTreacy
Super User

Hi @binayjethwa 

 

Have you tried my code?  It creates a total from the numbers in the row.

 

Have you had problems adapting it to your situation?

 

Regards

 

Phil

 





binayjethwa
Helper IV

Hi @PhilipTreacy ,

 

I do not have a Totals column in my table; I created it manually for the example.

 

I tried to do a SUM in an added column using standard Power Query functions, but my column names might change if the year changes.

I have also tried unpivoting columns, but my data bloats so much that it's not a good option for me.

 

Is there any other way we can achieve this?

 

Thanks,

 Binay

PhilipTreacy
Super User

Hi @binayjethwa 

 

Download example PBIX file

 

If you use Table.ToRows you can isolate the values for each row and then sum them.  The row with total 0 can then be filtered out.

 

Here's the query

 

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("fZAxDoQwDAT/kpoijkkC5d09A1GcxD0gUPD9yzpx3FGw0li7MGLbHAV2k3u9PzXJ1wgI9opkSIjg9gmrucL3OvQeEEyjZhgQ3Fexwv3Dyj8+rZ3wjeNsL6jfQDArkiFkKPZVFrNL72LK86gZQoZSXy1idupdTDmOmiGkKPfVitV9612MOY2aIaRoaSv5m6UUvYsx51EzhBStbt//", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"EMP ID" = _t, Name = _t, #"06-06-2023" = _t, #"07-06-2023" = _t, #"08-06-2023" = _t, #"09-06-2023" = _t, #"10-06-2023" = _t, #"11-06-2023" = _t, #"12-06-2023" = _t, #"13-06-2023" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"EMP ID", Int64.Type}, {"Name", type text}, {"06-06-2023", Int64.Type}, {"07-06-2023", Int64.Type}, {"08-06-2023", Int64.Type}, {"09-06-2023", Int64.Type}, {"10-06-2023", Int64.Type}, {"11-06-2023", Int64.Type}, {"12-06-2023", Int64.Type}, {"13-06-2023", Int64.Type}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Custom", each Table.ToRows(#"Changed Type")),
    #"Added Index" = Table.AddIndexColumn(#"Added Custom", "Index", 0, 1, Int64.Type),
    #"Added Custom1" = Table.AddColumn(#"Added Index", "Total", each List.Sum(List.RemoveFirstN([Custom]{[Index]}, 2))),
    #"Filtered Rows" = Table.SelectRows(#"Added Custom1", each ([Total] <> 0)),
    #"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"Custom", "Index"})
in
    #"Removed Columns"

 

suymrow.png

 

Regards

 

Phil

 




