Eric7
Resolver I

Adding 1 to a large number gives unexpected result

I have a simple power query table with one column and 3 rows:

Column1
10000000000000000
1000000000000000
100000000000000

 

The first cell has 17 digits, the second has 16 digits and the third has 15 digits.

 

First I change the column type to Whole number:

#"Changed column type" = Table.TransformColumnTypes(Source, {{"Column1", Int64.Type}}),


Then I call Table.ReplaceValue, which is supposed to add 1 to each value in Column1 and replace the original values with the result:

#"Add number" = Table.ReplaceValue(#"Changed column type", each [Column1], each ([Column1] + 1), Replacer.ReplaceValue, {"Column1"})
 
The result is:
Column1
10000000000000000
  1000000000000001
    100000000000001
 
Note that 1 was NOT added to the first cell having 17 digits in it!
If I try to add the number 3 instead of 1 in Table.ReplaceValue, I get this result:
 
Column1
10000000000000004
  1000000000000003
    100000000000003
 
Now the first cell has had 4 added instead of 3!
Is there a limit on the number of digits when adding?

However, if I add the large number 10000000000000 like this:
#"Add number" = Table.ReplaceValue(#"Changed column type", each [Column1], each ([Column1] + 10000000000000), Replacer.ReplaceValue, {"Column1"})
I get this result:
Column1
10010000000000000
  1010000000000000
    110000000000000
Which is as expected.
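For reference, all three results above can be reproduced with plain IEEE 754 double arithmetic, sketched here in Python (whose floats are 64-bit doubles):

```python
n = 10**16  # the 17-digit value from the first row

print(float(n) + 1 == float(n))    # True: adding 1 changes nothing
print(int(float(n) + 3))           # 10000000000000004: +3 shows up as +4
print(int(float(n) + 10**13))      # 10010000000000000: works as expected
```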

 

Does anyone know why adding 1 or 3 to the 17-digit number gives this strange result?

Perhaps it has to do with Int64 values having at most 19 digits.

 

 

BTW, this may be a duplicate post, as I registered my profile after posting the first one (which I then could not find)...

 

-----

Here is the complete M-code for the table:

 
let
  Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjRAA0qxOpii2ASVYmMB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t]),
  #"Changed column type" = Table.TransformColumnTypes(Source, {{"Column1", Int64.Type}}),
  res = Table.ReplaceValue(#"Changed column type", each [Column1], each ([Column1] + 1), Replacer.ReplaceValue, {"Column1"})
in
  res

 

1 ACCEPTED SOLUTION

This is a place where unfortunately the people coming up with text for the UI aren't in sync with the people responsible for the language specification. What the UI calls "Decimal Number" is not related to "decimal precision" for numeric operations. The "15 digits long" refers to "double precision". "Decimal precision" has considerably more digits of precision -- I want to say 29? -- but has a far smaller range for the scale. Internally, they use the .NET "decimal" type which I seem to remember is also the same as the OLEDB decimal type.
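The 28-29 digit figure matches .NET's System.Decimal (a 96-bit integer plus a scale). As a rough illustration only, not the engine's actual implementation, Python's decimal module with its default 28-digit context behaves similarly:

```python
from decimal import Decimal, getcontext

getcontext().prec = 28  # the default; comparable to .NET decimal's ~28-29 digits

# 28 significant digits: the addition is exact.
print(Decimal(10**27) + 1)                      # 1000000000000000000000000001

# 29 significant digits would be needed, so the +1 is rounded away.
print(Decimal(10**28) + 1 == Decimal(10**28))   # True
```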


12 REPLIES

idp6134
New Member

This causes a problem in Power BI and SSAS because the numbers are "not unique" according to the engine.  This is because only the first fifteen digits are kept, and my IDs are 18 digits long, with the last three digits making the string unique.  Is there a fix for this?

Use type text or Int64.Type. A few months ago someone here was wondering why Power Query couldn't handle PostgreSQL Int128 types 🙂
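The collision is easy to demonstrate: two 18-digit IDs that differ only in their last digits can map to the same double, which is exactly what makes them "not unique" to the engine. A Python sketch with made-up IDs:

```python
# Two hypothetical 18-digit IDs differing only in the final digit.
id_a = 123456789012345671
id_b = 123456789012345672

# As exact integers they are distinct...
print(id_a == id_b)                # False

# ...but near 1.2e17 the gap between adjacent doubles is 16, so both
# round to the same value and the engine sees a duplicate.
print(float(id_a) == float(id_b))  # True

# Keeping them as text preserves uniqueness.
print(str(id_a) == str(id_b))      # False
```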

curth
Power BI Team

By default, arithmetic is done using double precision. You can get the desired result in this case by specifying decimal precision instead of double precision; that is, "Value.Add([Column1], 1, Precision.Decimal)" instead of "[Column1] + 1".

Thanks. With Precision.Decimal, the addition works at least up to 10^28. But since the doc says

"The largest precision that can be represented in a Decimal Number type is 15 digits long", I would expect less than 10^28. Is it safe with this large a number of digits? How are they stored?


Confirmed. Thank you @curth !

 

let
Source = {1..19},
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Added Custom2" = Table.AddColumn(#"Converted to Table", "Base", each "1" & Text.Repeat("0",[Column1])),
#"Changed Type" = Table.TransformColumnTypes(#"Added Custom2",{{"Base", Int64.Type}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "PlusOne", each [Base]+1),
#"Added Custom3" = Table.AddColumn(#"Added Custom", "ValueAdd", each Value.Add([Base], 1, Precision.Decimal)),
#"Added Custom1" = Table.AddColumn(#"Added Custom3", "Result.PlusOne", each [PlusOne]-[Base]),
#"Added Custom4" = Table.AddColumn(#"Added Custom1", "Result.ValueAdd", each Value.Subtract([ValueAdd],[Base],Precision.Decimal))
in
#"Added Custom4"

Smauro
Solution Sage

Hi @Eric7 ,

 

While we do "assign types" to numbers in Power Query, everything internally is done using float64, which means we get exact integer precision up to 2^53-1, which is a bit less than 10^16. Beyond that, results like the one you've found tend to happen.

Check this out https://en.wikipedia.org/wiki/Double-precision_floating-point_format#Precision_limitations_on_intege...
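A quick check of that 2^53-1 boundary, sketched in Python (whose floats are also IEEE 754 doubles):

```python
LIMIT = 2**53 - 1  # 9007199254740991, the largest exactly safe integer

# Every integer up to the limit survives a round trip through a double.
print(float(LIMIT) == LIMIT)             # True
print(float(LIMIT - 1) + 1 == LIMIT)     # True: arithmetic still exact

# Past the limit, the gap between representable doubles becomes 2,
# so adding 1 is silently absorbed.
print(float(2**53) + 1 == float(2**53))  # True
```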





That may be why the bug is happening, but it is still a bug, or bad marketing, or bad documentation.

The Float64 reason should be included in this documentation:

https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-data-types

 

Currently, the doc only says 

 

"Note

The Power BI Desktop data model supports 64 bit integer values, but the largest number the visuals can safely express is 9,007,199,254,740,991 (2^53-1) due to JavaScript limitations. If you work with numbers in your data model above this, you can reduce the size through calculations before adding them to a visual".

 

One could argue that as long as Power Query M does offer the datatype Int64.Type, it should really be Int64 and not Float64.
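The distinction is easy to see in a language that has both true integers and doubles. A Python sketch of what genuine Int64 arithmetic would give versus what float64 actually gives:

```python
n = 10**16

# True integer arithmetic (what an honest Int64 would do; n + 1 is
# well within Int64's range).
print(n + 1)                      # 10000000000000001
print(n + 1 > n)                  # True

# Float64 arithmetic (what Power Query appears to do under the hood).
print(float(n) + 1 > float(n))    # False: the increment vanishes

# Int64 can hold every integer up to 2**63 - 1 (19 digits) exactly;
# float64 only guarantees integers up to 2**53 - 1 (about 16 digits).
print(2**63 - 1)                  # 9223372036854775807
```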

v-yingjl
Community Support

Hi @Eric7 ,

I have reproduced it on my side and submitted it internally. If there is any progress, I will update here as soon as possible. ICM: 226860443

 

Best Regards,
Community Support Team _ Yingjie Li
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

lbendlin
Super User

Yay, bug!

 

let
Source = {1..20},
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Added Custom2" = Table.AddColumn(#"Converted to Table", "Custom0", each "1" & Text.Repeat("0",[Column1])),
#"Changed Type" = Table.TransformColumnTypes(#"Added Custom2",{{"Custom0", Int64.Type}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "Custom", each [Custom0]+1),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom.1", each [Custom]-[Custom0])
in
#"Added Custom1"

 


 
