Hello all,
Could you please help me resolve an issue with adding an additional row in Power Query?
For every article, I need to add an extra row with a start date of 1.12.2020 and an end date equal to the first start date in the existing data table minus 1 day. I have already created a custom column with the formula {Number.From([Start_date])..Number.From([EndDATE])}, because I need to list all days between the two dates.
The product information is part of my data set and I would like to keep it, meaning the new rows should contain all of that information.
All articles have different start and end dates.
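For context, the list-of-days custom column mentioned above would typically be expanded into rows like this. This is only a minimal sketch, assuming a source table named Prices with the columns shown below; the table name is hypothetical:

```
let
    // hypothetical source table with the five columns shown below
    Source = Prices,
    // one list of day numbers per row, spanning Start_date..End_date
    #"Added Days" = Table.AddColumn(Source, "Day",
        each {Number.From([Start_date])..Number.From([End_date])}),
    // expand each list into rows, then convert the numbers back to dates
    #"Expanded Days" = Table.ExpandListColumn(#"Added Days", "Day"),
    #"To Dates" = Table.TransformColumns(#"Expanded Days", {{"Day", Date.From, type date}})
in
    #"To Dates"
```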
Now I have for example this:
Article number | Old price | Start_date | New price | End_date |
123456 | 14.00 | 1.1.2021 | 15.00 | 2.5.2022 |
And this should be the result:
Article number | Old price | Start_date | New price | End_date |
123456 | 14.00 | 1.12.2020 | 14.00 | 31.12.2020 |
123456 | 14.00 | 1.1.2021 | 15.00 | 2.5.2022 |
The price history starts on different dates for each article, but we need every article to start from the same date (1.12.2020). Since we don't know the prices before the first known date, we set each article's first old price starting from 1.12.2020.
Thank you in advance! 🙂
I think I understand what you want, and I believe the following will provide it, based only on the five columns you show.
I believe I have commented the code enough so you can understand it, but ask questions if you have them.
Main Code
let
    //edit to reflect your actual data source
    Source = Excel.CurrentWorkbook(){[Name="Table3"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{
        {"Article number", Int64.Type}, {"End_date", type date}, {"Old buying price", type number},
        {"Start_date", type date}, {"New buying price", type number}}),
    //Group by article number,
    //then run the custom function on each subtable
    #"Grouped Rows" = Table.Group(#"Changed Type", {"Article number"}, {
        {"Count", each fnSemiMonthly(_)}}),
    //remove the unneeded column and expand the subtables, renaming the columns
    #"Removed Columns" = Table.RemoveColumns(#"Grouped Rows",{"Article number"}),
    #"Expanded Count" = Table.ExpandTableColumn(#"Removed Columns", "Count",
        {"End_date", "Article number", "Old buying price"},
        {"Date", "Article number", "Price"}),
    //set the data types
    #"Changed Type1" = Table.TransformColumnTypes(#"Expanded Count",{
        {"Date", type date}, {"Article number", Int64.Type}, {"Price", type number}}),
    //set the desired column order
    #"Reordered Columns" = Table.ReorderColumns(#"Changed Type1",{"Article number", "Date", "Price"})
in
    #"Reordered Columns"
Custom Function
//Name this query fnSemiMonthly
(tbl as table)=>
let
    Source = tbl,
    //create a list (then make it a table) of the 1st and 15th of each month,
    //from 2020-12-01 to the later of today and the last Start_date in the table
    #"Semi Monthly Dates" = Table.FromList(List.Generate(
        ()=>[d=#date(2020,12,1)],
        each [d] <= List.Max({List.Max(Source[Start_date]), Date.From(DateTime.LocalNow())}),
        each [d = if Date.Day([d]) = 1 then Date.AddDays([d],14) else Date.StartOfMonth(Date.AddMonths([d],1))],
        each [d]), Splitter.SplitByNothing(), type table[End_date=date]),
    //Combine with the original table
    #"Combine with Source" = Table.Combine({#"Semi Monthly Dates", Source}),
    //Sort by date,
    //then fill up to replace the nulls with the first filled-in entry
    #"Sorted Rows" = Table.Sort(#"Combine with Source",{{"End_date", Order.Ascending}}),
    #"Filled" = Table.FillUp(#"Sorted Rows",{"Article number", "Old buying price"}),
    //Remove duplicates, then fill down the new buying price to
    //replace prices in the old price column if needed
    #"Removed Duplicates" = Table.Distinct(#"Filled", {"End_date"}),
    #"Filled Down" = Table.FillDown(#"Removed Duplicates",{"New buying price"}),
    #"Replaced Value" = Table.ReplaceValue(#"Filled Down", null, each [New buying price], Replacer.ReplaceValue, {"Old buying price"}),
    #"Removed Columns" = Table.RemoveColumns(#"Replaced Value",{"Start_date", "New buying price"}),
    //Fill down to get the last article numbers filled in
    #"Filled Down1" = Table.FillDown(#"Removed Columns",{"Article number"}),
    //Remove dates that are not the 1st or the 15th
    #"Remove Rows" = Table.SelectRows(#"Filled Down1", each Date.Day([End_date]) = 1 or Date.Day([End_date]) = 15)
in
    #"Remove Rows"
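If you want to test the semi-monthly date generation on its own, the List.Generate pattern above can be run in a blank query. This sketch hard-codes the cutoff date purely for illustration:

```
let
    // generate 1st/15th dates from 2020-12-01 up to a hard-coded cutoff
    Cutoff = #date(2021,3,1),
    Dates = List.Generate(
        ()=>#date(2020,12,1),
        each _ <= Cutoff,
        each if Date.Day(_) = 1
             then Date.AddDays(_, 14)                       // 1st -> 15th of same month
             else Date.StartOfMonth(Date.AddMonths(_, 1)))  // 15th -> 1st of next month
in
    Dates
```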
Results
Article number | Date | Price |
222 | 12/1/2020 | 0.359 |
222 | 12/15/2020 | 0.359 |
222 | 1/1/2021 | 0.359 |
222 | 1/15/2021 | 0.359 |
222 | 2/1/2021 | 0.359 |
222 | 2/15/2021 | 0.359 |
222 | 3/1/2021 | 0.359 |
222 | 3/15/2021 | 0.366 |
222 | 4/1/2021 | 0.366 |
222 | 4/15/2021 | 0.366 |
222 | 5/1/2021 | 0.384 |
222 | 5/15/2021 | 0.384 |
222 | 6/1/2021 | 0.384 |
222 | 6/15/2021 | 0.384 |
222 | 7/1/2021 | 0.384 |
222 | 7/15/2021 | 0.384 |
222 | 8/1/2021 | 0.384 |
222 | 8/15/2021 | 0.384 |
222 | 9/1/2021 | 0.384 |
222 | 9/15/2021 | 0.384 |
222 | 10/1/2021 | 0.384 |
222 | 10/15/2021 | 0.384 |
222 | 11/1/2021 | 0.384 |
222 | 11/15/2021 | 0.384 |
222 | 12/1/2021 | 0.384 |
222 | 12/15/2021 | 0.384 |
222 | 1/1/2022 | 0.422 |
222 | 1/15/2022 | 0.422 |
222 | 2/1/2022 | 0.422 |
222 | 2/15/2022 | 0.422 |
222 | 3/1/2022 | 0.422 |
222 | 3/15/2022 | 0.422 |
222 | 4/1/2022 | 0.422 |
222 | 4/15/2022 | 0.422 |
222 | 5/1/2022 | 0.422 |
222 | 5/15/2022 | 0.422 |
222 | 6/1/2022 | 0.422 |
222 | 6/15/2022 | 0.422 |
222 | 7/1/2022 | 0.422 |
222 | 7/15/2022 | 0.422 |
111 | 12/1/2020 | 0.95 |
111 | 12/15/2020 | 0.95 |
111 | 1/1/2021 | 0.95 |
111 | 1/15/2021 | 0.95 |
111 | 2/1/2021 | 0.95 |
111 | 2/15/2021 | 0.95 |
111 | 3/1/2021 | 0.95 |
111 | 3/15/2021 | 0.95 |
111 | 4/1/2021 | 0.95 |
111 | 4/15/2021 | 0.95 |
111 | 5/1/2021 | 0.95 |
111 | 5/15/2021 | 0.95 |
111 | 6/1/2021 | 0.95 |
111 | 6/15/2021 | 0.95 |
111 | 7/1/2021 | 0.95 |
111 | 7/15/2021 | 0.95 |
111 | 8/1/2021 | 0.95 |
111 | 8/15/2021 | 0.95 |
111 | 9/1/2021 | 0.95 |
111 | 9/15/2021 | 0.95 |
111 | 10/1/2021 | 0.95 |
111 | 10/15/2021 | 0.95 |
111 | 11/1/2021 | 0.95 |
111 | 11/15/2021 | 0.95 |
111 | 12/1/2021 | 0.95 |
111 | 12/15/2021 | 0.95 |
111 | 1/1/2022 | 0.95 |
111 | 1/15/2022 | 0.95 |
111 | 2/1/2022 | 0.95 |
111 | 2/15/2022 | 0.95 |
111 | 3/1/2022 | 0.95 |
111 | 3/15/2022 | 0.95 |
111 | 4/1/2022 | 0.95 |
111 | 4/15/2022 | 0.95 |
111 | 5/1/2022 | 0.95 |
111 | 5/15/2022 | 0.95 |
111 | 6/1/2022 | 0.95 |
111 | 6/15/2022 | 0.95 |
111 | 7/1/2022 | 0.99 |
111 | 7/15/2022 | 0.99 |
111 | 8/1/2022 | 0.99 |
111 | 8/15/2022 | 0.99 |
111 | 9/1/2022 | 0.99 |
Hello, I'm sending you an example of the data in the screenshot below, as I was not able to send it as an attachment.
INPUT
I chose just a few columns as an example, the most important ones:
OUTPUT
*If the newest price list starts after today's date, use that date and price as-is. If the newest price list starts before today's date, use today's date instead.
Every article should have its first record from 1.12.2020, with the price taken from the oldest price list. That means, if an article's oldest price list is from 5.3.2021, we need to add the dates 1.12.2020, 15.12.2020, 1.1.2021, 15.1.2021, 1.2.2021, 15.2.2021 and 1.3.2021 (always the 1st and the 15th day of the month).
This is a slightly different approach, but it is exactly what I need.
Thank you.
I cannot copy/paste your data sample screenshot. You can either post it as text, or upload the workbook to a sharing site (e.g. OneDrive, DropBox, etc.) and post a link here.
Extract from original data ->
POM | Article number | art_no | var_no | tunit_no | Article name | KS/MU | End_date | Old buying price | Start_date | New buying price | status_article | subsys_art_name | division | suppl_no | suppl_name | Difference price SUM | Difference price % |
12345_1_1 | 1 | 12345 | 1 | 1 | xxx | 1 | 2021-01-31 | 11 | 2021-02-01 | 12 | 1_4_7 | xxx | BF43 MEAT & POULTRY | 21212 | xxx | 1.71 | 10% |
123456_1_1 | 2 | 123456 | 1 | 1 | xxxxx | 1 | 2021-01-31 | 14 | 2021-02-01 | 10 | 1_4_7 | xxxxx | BF43 MEAT & POULTRY | 12345 | bbb | 0.9 | 10% |
1234566_1_1 | 12 | 1234566 | 1 | 1 | hhhhhh | 1 | 2021-01-31 | 9.9 | 2021-02-01 | 10.89 | 1_4_7 | hhhhhh | BF43 MEAT & POULTRY | 38031 | ddd | 0.99 | 10% |
2790120_1_1 | 26 | 2790120 | 1 | 1 | dddddddd | 1 | 2021-03-31 | 7 | 2021-04-01 | 8 | 1_3_0 | dddddddd | BF43 MEAT & POULTRY | 35709 | bbb | 0 | 0% |
2901231_1_1 | 27 | 2901231 | 1 | 1 | dddddddd | 1 | 2021-04-30 | 7 | 2021-05-01 | 8 | 1_3_0 | dddddddd | BF18 DAIRY | 35709 | ccc | 0 | 0% |
3012342_1_1 | 28 | 3012342 | 1 | 1 | bb | 1 | 2021-05-31 | 7 | 2021-06-01 | 8 | 1_3_0 | bb | BF18 DAIRY | 35709 | ddd | 0 | 0% |
3123453_1_1 | 29 | 3123453 | 1 | 1 | bb | 1 | 2021-08-31 | 7 | 2021-09-01 | 8 | 1_3_0 | bb | BF18 DAIRY | 35709 | xxx | 0 | 0% |
3234564_1_1 | 30 | 3234564 | 1 | 1 | bbbbb | 1 | 2021-08-02 | 7 | 2021-08-03 | 8 | 1_3_0 | bbbbb | BF18 DAIRY | 21212 | bbb | 0 | 0% |
3345675_2_1 | 31 | 3345675 | 2 | 1 | bbb | 1 | 2021-01-24 | 7 | 2021-01-25 | 8 | 1_3_0 | bbb | BF18 DAIRY | 12345 | ccc | -665.65 | -100% |
3456786_2_1 | 32 | 3456786 | 2 | 1 | bbbbbb | 1 | 2021-01-28 | 7 | 2021-01-29 | 8 | 1_3_0 | bbbbbb | BF18 DAIRY | 33333 | ddd | 0 | 0% |
3567897_1_1 | 33 | 3567897 | 1 | 1 | bbbb | 1 | 2022-06-14 | 7 | 2022-06-15 | 8 | 1_4_2 | bbbb | BF01 WINES | 33333 | xxx | 0.212 | 14% |
3679008_1_1 | 34 | 3679008 | 1 | 1 | bbbbbbbb | 1 | 2022-07-14 | 7 | 2022-07-15 | 8 | 1_4_2 | bbbbbbbb | BF16 FROZEN FOOD | 45454 | bbb | 0 | 0% |
Article number | End_date | Old buying price | Start_date | New buying price |
222 | 2021-03-04 | 0.359 | 2021-03-05 | 0.366 |
222 | 2021-04-30 | 0.366 | 2021-05-01 | 0.384 |
222 | 2021-08-25 | 0.384 | 2021-08-26 | 0.384 |
222 | 2021-12-31 | 0.384 | 2022-01-01 | 0.422 |
222 | 2022-02-08 | 0.422 | 2022-02-09 | 0.422 |
111 | 2022-06-30 | 0.95 | 2022-07-01 | 0.99 |
111 | 2022-08-31 | 0.99 | 2022-09-01 | 0.99 |
Article number | date | price |
222 | 1.12.2020 | 0.359 |
222 | 15.12.2020 | 0.359 |
222 | 1.1.2021 | 0.359 |
222 | 15.1.2021 | 0.359 |
222 | 1.2.2021 | 0.359 |
222 | 15.2.2021 | 0.359 |
222 | 1.3.2021 | 0.359 |
222 | 15.3.2021 | 0.366 |
222 | 1.4.2021 | 0.366 |
222 | 15.4.2021 | 0.366 |
222 | 1.5.2021 | 0.384 |
222 | 15.5.2021 | 0.384 |
222 | 1.6.2021 | 0.384 |
222 | 15.6.2021 | 0.384 |
222 | 1.7.2021 | 0.384 |
222 | 15.7.2021 | 0.384 |
222 | 1.8.2021 | 0.384 |
222 | 15.8.2021 | 0.384 |
222 | 1.9.2021 | 0.384 |
222 | 15.9.2021 | 0.384 |
222 | 1.10.2021 | 0.384 |
222 | 15.10.2021 | 0.384 |
222 | 1.11.2021 | 0.384 |
222 | 15.11.2021 | 0.384 |
222 | 1.12.2021 | 0.384 |
222 | 15.12.2021 | 0.384 |
222 | 1.1.2022 | 0.422 |
222 | 15.1.2022 | 0.422 |
222 | 1.2.2022 | 0.422 |
222 | 15.2.2022 | 0.422 |
222 | 1.3.2022 | 0.422 |
222 | 15.3.2022 | 0.422 |
222 | 1.4.2022 | 0.422 |
222 | 15.4.2022 | 0.422 |
222 | 1.5.2022 | 0.422 |
222 | 15.5.2022 | 0.422 |
222 | 1.6.2022 | 0.422 |
222 | 15.6.2022 | 0.422 |
222 | 1.7.2022 | 0.422 |
222 | 15.7.2022 | 0.422 |
111 | 1.12.2020 | 0.95 |
111 | 15.12.2020 | 0.95 |
111 | 1.1.2021 | 0.95 |
111 | 15.1.2021 | 0.95 |
111 | 1.2.2021 | 0.95 |
111 | 15.2.2021 | 0.95 |
111 | 1.3.2021 | 0.95 |
111 | 15.3.2021 | 0.95 |
111 | 1.4.2021 | 0.95 |
111 | 15.4.2021 | 0.95 |
111 | 1.5.2021 | 0.95 |
111 | 15.5.2021 | 0.95 |
111 | 1.6.2021 | 0.95 |
111 | 15.6.2021 | 0.95 |
111 | 1.7.2021 | 0.95 |
111 | 15.7.2021 | 0.95 |
111 | 1.8.2021 | 0.95 |
111 | 15.8.2021 | 0.95 |
111 | 1.9.2021 | 0.95 |
111 | 15.9.2021 | 0.95 |
111 | 1.10.2021 | 0.95 |
111 | 15.10.2021 | 0.95 |
111 | 1.11.2021 | 0.95 |
111 | 15.11.2021 | 0.95 |
111 | 1.12.2021 | 0.95 |
111 | 15.12.2021 | 0.95 |
111 | 1.1.2022 | 0.95 |
111 | 15.1.2022 | 0.95 |
111 | 1.2.2022 | 0.95 |
111 | 15.2.2022 | 0.95 |
111 | 1.3.2022 | 0.95 |
111 | 15.3.2022 | 0.95 |
111 | 1.4.2022 | 0.95 |
111 | 15.4.2022 | 0.95 |
111 | 1.5.2022 | 0.95 |
111 | 15.5.2022 | 0.95 |
111 | 1.6.2022 | 0.95 |
111 | 15.6.2022 | 0.95 |
111 | 1.7.2022 | 0.99 |
111 | 15.7.2022 | 0.99 |
111 | 1.8.2022 | 0.99 |
111 | 15.8.2022 | 0.99 |
111 | 1.9.2022 | 0.99 |
In your last two postings, you show article numbers which do not appear in your Extract from Original Data. How are these being computed from the original data extract?
Also, in your penultimate posting, you show multiple lines with the same article number, but this does not occur in either your screenshot or extract of your original data. Please clarify what you are showing.
This task is quite different from my very first request, as we would like to apply a different approach.
Description of my previous tables:
Extract – the original data, which should demonstrate how many different columns we have. Previously I sent just an example of one article and you sent me back code for one article, but I have no idea how to apply it to thousands of different articles.
INPUT – 2 articles which are not part of the previous extract. I wanted to demonstrate that what matters for us are Article number, End_date, Old buying price, Start_date and New buying price. From this input I created the output we need.
OUTPUT – the desired result
Description of the result – I will demonstrate it on one article, 111, for better understanding. We can see that its first known price is 0.95, ending 2022-06-30.
We need to add the dates 2020-12-01, 2020-12-15, 2021-01-01, 2021-01-15, 2021-02-01, 2021-02-15 ... up to 2022-06-15, because the end date is 2022-06-30 and we list only the 1st and the 15th day of each month. The report should show us the evolution of prices at the month level (articles can change at any time, which is why the 1st and the 15th are the best indicators for us). Every article needs a start date of 2020-12-01.
Then we move to the oldest Start_date, in this example 2022-07-01; the dates will look like 2022-07-01, 2022-07-15, 2022-08-01, 2022-08-15.
The next period starts from 2022-09-01. At this point we know the date is after today's date, so we use just that single date, 2022-09-01. If this date were instead 2022-07-01, we would know it is before today's date, so we would extend up to today's date; the dates would look like 2022-07-01, 2022-07-15 (as today is 2022-07-28).
Article number | End_date | Old buying price | Start_date | New buying price |
111 | 2022-06-30 | 0.95 | 2022-07-01 | 0.99 |
111 | 2022-08-31 | 0.99 | 2022-09-01 | 0.99 |
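The "extend to today, or stop at a future start date" rule described above amounts to a simple ceiling on the generated dates: the later of the article's newest Start_date and today. A minimal sketch, with the newest Start_date hard-coded as an assumption:

```
let
    // hypothetical newest Start_date for one article
    LastStart = #date(2022,9,1),
    // the date generator should stop at the later of LastStart and today
    Cutoff = List.Max({LastStart, Date.From(DateTime.LocalNow())})
in
    Cutoff
```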
It appears, from your example, that you want the added row's end date to be the first Start_date minus 1. That being the case, see the following code:
let
    Source = Table.FromColumns(
        {{123456}, {14}, {#date(2021,1,1)}, {15}, {#date(2022,2,5)}},
        type table[#"Article number"=Int64.Type, #"Old price"=Currency.Type, Start_date=date, #"New price"=Currency.Type, End_date=date]),
    //add a row with Start_date = 1-Dec-2020 and End_date = first Start_date - 1
    #"Add Row" = let
        artNum = Source[Article number]{0},
        startDt = #date(2020,12,1),
        endDt = Date.AddDays(Source[Start_date]{0}, -1),
        price = Source[Old price]{0}
    in
        Table.InsertRows(Source, 0,
            {[#"Article number"=artNum, #"Old price"=price, Start_date=startDt, #"New price"=price, End_date=endDt]})
in
    #"Add Row"
Note: my dates are in MDY format.
(Screenshots of the Original and Result tables not shown.)
Hi, thank you for the code. Is there any other option? I have a lot of columns and articles, and I think it is impossible to write code like that for more than 20 thousand articles. Thank you 🙂
You should be able to just group by article number and then apply the same algorithm to each subgroup. Please supply a realistic example, either as text or as a downloadable workbook, if you need assistance adapting this. As a side note, for the future: the more realistic your example, the better the solutions that can be provided.
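As a sketch of the group-and-apply pattern just described: assuming the per-article logic has been moved into a function query called fnAddFirstRow (a hypothetical name), grouping lets the same algorithm run once per article, however many articles there are:

```
let
    Source = Excel.CurrentWorkbook(){[Name="Table3"]}[Content],
    // one subtable per article, each passed to the per-article function
    #"Grouped" = Table.Group(Source, {"Article number"},
        {{"Rows", each fnAddFirstRow(_), type table}}),
    // discard the grouping key and bring the transformed rows back out
    #"Expanded" = Table.ExpandTableColumn(
        Table.RemoveColumns(#"Grouped", {"Article number"}),
        "Rows",
        {"Article number", "Old price", "Start_date", "New price", "End_date"})
in
    #"Expanded"
```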