Hi everyone,
I'm struggling with an issue in Power BI where percentages are automatically recognized as decimal values. This is the data table:
FundId | 2023-01 | 2023-02 | 2023-03 | 2023-04 | 2023-05 | 2023-06 | 2023-07 | 2023-08 | 2023-09 | 2023-10 | 2023-11 | 2023-12 | 2024-01 | 2024-02 | 2024-03 |
FS00001 | 65,30% | -10,60% | 18,70% | -24,00% | 56,20% | -5,30% | -8,70% | -20,70% | 0,00% | 0,00% | 0,00% | 0,00% | 0,00% | 0,00% | 0,00% |
FS00002 | 8,80% | -18,50% | 0,30% | 0,30% | -7,70% | -1,90% | 3,10% | -1,10% | -12,80% | -21,10% | 14,40% | 0,00% | 0,00% | 0,00% | 0,00% |
FS00003 | 36,40% | -28,70% | 7,40% | -15,60% | -25,40% | 7,80% | 32,40% | -27,60% | -10,00% | -11,70% | 0,60% | -10,70% | -28,90% | 18,30% | 4,20% |
FS00004 | 30,70% | -37,40% | 12,80% | -31,40% | -30,30% | 17,10% | 55,70% | -29,80% | -13,80% | -16,80% | 16,90% | -11,00% | -37,10% | 23,40% | 7,80% |
FS00005 | 38,80% | -26,30% | -0,20% | 9,60% | -20,10% | 28,70% | 8,40% | 11,10% | -28,30% | -3,00% | -35,80% | 0,00% | 0,00% | 0,00% | 0,00% |
FS00006 | 14,80% | -14,20% | 20,40% | -0,10% | -2,00% | -9,70% | 4,70% | -3,30% | -12,40% | 22,10% | 3,20% | 1,20% | -3,80% | 0,50% | 26,90% |
The original data is in decimal format (e.g., 0.008 = 0.8%), but I convert it to percentages. However, when I import the Excel file into Power BI, the values are automatically converted back to decimal numbers.
The problem is that when I try to transform the decimal numbers to percentages using Power Query, it works initially (in the Power Query view), but the format is lost when I load the file. The only way I've found to maintain the percentage format is Table view > Column tools > Format > Percentage, but this is not a scalable solution since I have many columns to convert.
I've tried various methods to fix this issue, but none have worked so far. I'm looking for a solution that addresses the root cause of the problem, rather than just applying a workaround.
I've attached screenshots of the process:
1st: uploading the file (the data type already changes at this point)
2nd: the file in Power Query
3rd: after transforming all the columns to percentages in Power Query, everything looks fine
4th: when I save and close Power Query, everything goes wrong again
Can anyone help me resolve this issue?
Thank you very much in advance.
Hi @MAIK2705 ,
This can be resolved by following a couple of best-practice guidelines:
1) Structure your data correctly for reporting (normalisation):
After your navigation step, select your [FundId] column and go to the Transform tab > Unpivot Column (dropdown) > Unpivot Other Columns.
You will now have a normalised fact table structure with an [Attribute] and [Value] column.
You can change the names of these if you want by amending the code that has been created in the formula bar:
At this point, you can just change the Data Type of the new [Attribute] and [Value] columns. This will always work as you're no longer hard-coding specific column names into the transformation code.
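The steps above generate a query along these lines. This is a minimal sketch: the file path, the sheet name "Funds", and the renamed columns "Month" and "Return" are assumptions for illustration (the UI generates "Attribute" and "Value" by default); adjust them to your own source.

```
let
    // Assumed file path and sheet name – replace with your own source
    Source = Excel.Workbook(File.Contents("C:\Data\FundReturns.xlsx"), null, true),
    FundsSheet = Source{[Item = "Funds", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(FundsSheet, [PromoteAllScalars = true]),
    // Unpivot every column except FundId; new month columns are picked up automatically
    Unpivoted = Table.UnpivotOtherColumns(Promoted, {"FundId"}, "Month", "Return"),
    // Type the value column as plain Decimal Number, not Percentage
    Typed = Table.TransformColumnTypes(Unpivoted, {{"Month", type text}, {"Return", type number}})
in
    Typed
```

Because `Table.UnpivotOtherColumns` only names the column to keep ([FundId]), the query keeps working when new month columns appear in the source.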
2) Type your data correctly (standard data types):
Do not use the percentage data type in Power Query. It's just a mask over a Decimal type anyway. Change your new [Value] column to Decimal type and adjust the format under Measure Tools in the Report view:
Pete
Proud to be a Datanaut!
What I'm saying is that if this month you load and format a column named 'March 24' in Excel, and then next month you load 'Apr 24' in place of the 'March 24' column, that new column will not be formatted even though it's in the same position, because Excel formatting is based on the column name, not the cell location.
--Nate
The root cause is that your column names change, so Excel doesn't keep the formatting because the names are new. Before loading to Excel, demote the headers, then change your column types. Yes, your columns will be named Column1, Column2, etc., but your data will remain typed, and you can always hide the first row in Excel.
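One way to sketch this in M, with types applied by position rather than by the (changing) column names. The file path and sheet name are assumptions for illustration; with headers left unpromoted, the columns arrive as Column1, Column2, and so on:

```
let
    Source = Excel.Workbook(File.Contents("C:\Data\FundReturns.xlsx"), null, true),
    // Headers are not promoted, so columns arrive as Column1, Column2, ...
    Sheet = Source{[Item = "Funds", Kind = "Sheet"]}[Data],
    ValueCols = List.Skip(Table.ColumnNames(Sheet), 1),
    // The old header row is still row 1; drop it here (or keep it and hide it after loading)
    NoHeader = Table.Skip(Sheet, 1),
    // Type by position: first column as text, every other column as number
    Typed = Table.TransformColumnTypes(NoHeader,
        {{"Column1", type text}} & List.Transform(ValueCols, each {_, type number}))
in
    Typed
```

Since the typing step only ever references Column1, Column2, etc., swapping 'March 24' for 'Apr 24' in the source no longer breaks it.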
--Nate
Hi watkinnc and thanks for the help,
However, this is not the cause, because it's not only Excel: I tried loading the data from CSV, via a gateway, and from MySQL, and I always have the same problem. After formatting in Power Query, it gets converted back.
Thanks
Hi BA_Pete and thanks for the answer,
1 - I know this may not be the most appropriate structure and that maybe I should normalise. Nevertheless, this table was just a sample; usually I have fewer columns: FundID, Total Ret YTD, Total Ret 1Y, Total Ret 3Y, Total Ret 5Y, management fees, etc.
That's why I thought normalising this type of data wasn't a priority: this performance table doesn't use dates, and unpivoting around 40k different rows could affect the performance of the report. Am I wrong? Should I normalise it?
2 - I knew this approach, and following it for unpivoted tables makes sense, but imagine that one day I have lots of columns with % types and have to change them one by one. I don't think that's optimal.
However, I think you've identified the reason: "Do not use the percentage data type in Power Query. It's just a mask over a Decimal type anyway." Still, I think this is something that should be corrected.
I think I'll go with your second approach and hope I don't end up with lots of columns to change one day.
Thank you very much for the help.
Hi @MAIK2705 ,
My options 1) and 2) weren't really either/or, I recommend you do them BOTH.
1) Yes, unpivoting can be resource-intensive (as the whole table has to be loaded into memory) and may affect refresh speed but, as you get into using Power BI more, you'll see that normalised data makes everything else much easier once done. Power BI can compress data orders of magnitude more efficiently once it's normalised (so the PBIX file is much smaller and faster to publish etc.), relationships can be made between normalised tables far more intuitively, and FAR fewer measures have to be written.
2) This is another reason why you normalise the structure first where possible: so you only need to change the Type of your [Value] column. However, if you do have columns that logically need to stay as their own columns, then you can multi-select them (Ctrl+click or Shift+click), right-click one of the selected column headers, go to Change Type, and select a type to change ALL the currently selected columns. This makes bulk type changes very fast; just be mindful that the column names will now be hard-coded into the transformation.
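For reference, multi-selecting columns in the UI generates a single `Table.TransformColumnTypes` step like the sketch below. The column names are taken from the earlier description of the real table (Total Ret YTD, Total Ret 1Y, etc.) and are illustrative; they must match the source exactly, which is the hard-coding caveat mentioned above:

```
// One generated step: Change Type > Decimal Number on four multi-selected columns.
// "Source" stands for whatever the previous step in your query is.
Typed = Table.TransformColumnTypes(Source, {
    {"Total Ret YTD", type number},
    {"Total Ret 1Y",  type number},
    {"Total Ret 3Y",  type number},
    {"Total Ret 5Y",  type number}
})
```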
Pete
Proud to be a Datanaut!