Dataflows sets data types incorrectly

In one of my entities, I have a column of type Date and a column of type Fixed Decimal Number. I have explicitly set the data types, and I can see the correct data type icons in column headers.

 

When I save and refresh my dataflow, then export the JSON file, the data types are set incorrectly to Date/Time (dateTime) and Decimal Number (double), respectively.

 

As a result, I have to set the data types again in Power BI Desktop.
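For reference, this is a minimal M sketch of the kind of step I have in the entity, with hypothetical column names (OrderDate, Amount) and inline sample rows standing in for the real source; Currency.Type is the M name for Fixed Decimal Number:

let
    // Hypothetical sample rows standing in for the real source
    Source = #table(
        type table [OrderDate = text, Amount = text],
        {{"2018-11-30", "12.34"}, {"2018-12-01", "56.78"}}
    ),
    // Explicitly set Date and Fixed Decimal Number on the two columns
    ChangedTypes = Table.TransformColumnTypes(
        Source,
        {
            {"OrderDate", type date},    // shows the Date icon in the editor
            {"Amount", Currency.Type}    // shows the Fixed Decimal Number icon
        }
    )
in
    ChangedTypes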

Status: Accepted
Comments
Daniil
Kudo Kingpin

Upon further inspection, it appears that Power BI Desktop also does not understand what the dataType double is, though the dateTime and string data types are okay, so it assigns the default numeric type (Decimal Number) to columns of type double.

 

I came to this conclusion because when I right-click on a column header in the Power Query Editor and click Change Type, nothing is pre-selected, but when I do the same for the Date/Time and Text columns, the data type is ticked:

[Screenshots: Change Type menu with no type pre-selected for the double column, compared with the ticked type for the Date/Time and Text columns]
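For anyone who wants to check this without the right-click menu, Table.Schema reports the ascribed type of each column; below is a minimal sketch with a stand-in table (in practice the previous step would be the entity loaded via PowerBI.Dataflows(null)):

let
    // Stand-in rows; in practice this would be the entity loaded from the dataflow
    Entity = #table(
        type table [Amount = number, Loaded = datetime],
        {{1.5, #datetime(2018, 11, 30, 0, 0, 0)}}
    ),
    // Table.Schema lists Name, TypeName and Kind for every column, which
    // shows what type Desktop has actually assigned
    Schema = Table.Schema(Entity)
in
    Schema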

v-qiuyu-msft
Community Support

Hi @Daniil,

 

I have reported this issue internally: CRI 94252427. I will keep you updated once I get any information.

 

Best Regards,
Qiuyun Yu 

v-qiuyu-msft
Community Support
Status changed to: Accepted
 
ansa_naz
Continued Contributor

I have the same issue. I have a DateTime column in my dataflow, which I convert to Date (no time component) in the Power Query Editor in the Power BI Service. When I add this dataflow to Power BI Desktop to create a report, the Date column shows as DateTime in Desktop and throws an error:

 

DataFormat.Error: We couldn't parse the input provided as a DateTime value.

 

Changing the column to the Date type in the Power BI Desktop Query Editor does not resolve the error.
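One possible workaround to sketch (hypothetical column name Created, stand-in rows, not a confirmed fix): strip the time at the value level but keep the stored type as DateTime, so the dataflow writes something Desktop can still parse, and only change the type to Date once the data is in Desktop:

let
    // Stand-in for the DateTime column coming from the source
    Source = #table(
        type table [Created = datetime],
        {{#datetime(2019, 3, 14, 9, 30, 0)}}
    ),
    // Strip the time component at the value level...
    DateValues = Table.TransformColumns(Source, {{"Created", DateTime.Date, type date}}),
    // ...then store the column as DateTime (midnight) in the dataflow,
    // since the Date type itself is not preserved anyway
    AsDateTime = Table.TransformColumnTypes(DateValues, {{"Created", type datetime}})
in
    AsDateTime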

Anonymous
Not applicable

Same issue for me as well.

 

In fact, I believe it is also affecting a merge between two tables. The merge works fine in Desktop, but it does not work in a dataflow (i.e. it produces no matches when a date is used as one of the key columns).
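One way to take the date handling out of the equation, sketched below with hypothetical table and column names, is to build an ISO text key (yyyy-MM-dd) on both sides and merge on that instead of on the date columns directly:

let
    // Hypothetical tables; the real queries would be dataflow entities
    Sales = #table(
        type table [SaleDate = date, Amount = number],
        {{#date(2018, 11, 30), 100}}
    ),
    Calendar = #table(
        type table [CalendarDate = date, MonthName = text],
        {{#date(2018, 11, 30), "Nov 2018"}}
    ),
    // Normalise both key columns to ISO-8601 text so the match does not
    // depend on how dates are typed or serialised on either side
    SalesKeyed = Table.AddColumn(Sales, "DateKey",
        each Date.ToText([SaleDate], "yyyy-MM-dd"), type text),
    CalendarKeyed = Table.AddColumn(Calendar, "DateKey",
        each Date.ToText([CalendarDate], "yyyy-MM-dd"), type text),
    Merged = Table.NestedJoin(SalesKeyed, {"DateKey"}, CalendarKeyed, {"DateKey"},
        "Calendar", JoinKind.LeftOuter)
in
    Merged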

nimrodg
Regular Visitor

@Daniil,

 

The current implementation of the dataflows model does not support the types Date and Fixed Decimal Number. Date is automatically converted to DateTime, and Fixed Decimal is converted to Double. If you think these types are important, we encourage you to request adding them via Power BI User Voice.
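In the meantime, one option is to make those conversions explicit in the query so the dataflow and Desktop at least agree on the resulting types; a minimal sketch with hypothetical columns:

let
    // Hypothetical columns that would otherwise use the unsupported types
    Source = #table(
        type table [OrderDate = date, Amount = number],
        {{#date(2018, 11, 30), 12.34}}
    ),
    // Apply the conversions the service would make anyway:
    // Date -> Date/Time, Fixed Decimal Number -> Decimal Number
    Supported = Table.TransformColumnTypes(
        Source,
        {{"OrderDate", type datetime}, {"Amount", type number}}
    )
in
    Supported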

 

Best Regards,

 

Nimrod Gal-Oz

Daniil
Kudo Kingpin

@nimrodg, I've posted an idea: Dataflows data types parity with Power Query

 

Please note that Time does not work at all -- it does not even get converted to DateTime.

 

Also, please note that Power BI Desktop does not understand type Double -- you have to set the type manually, as described above with a screenshot.
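For completeness, this is roughly what the manual fix in Desktop looks like in M, with a stand-in table in place of the entity returned by PowerBI.Dataflows(null) and hypothetical column names:

let
    // Stand-in for the entity returned by PowerBI.Dataflows(null) after
    // navigating to the workspace, dataflow and entity
    Entity = #table(
        type table [OrderDate = datetime, Amount = number],
        {{#datetime(2018, 11, 30, 0, 0, 0), 12.34}}
    ),
    // Re-apply the types the dataflow could not preserve
    Retyped = Table.TransformColumnTypes(
        Entity,
        {{"OrderDate", type date}, {"Amount", Currency.Type}}
    )
in
    Retyped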

Daniil
Kudo Kingpin

More problems due to dataflows not supporting all data types:

 

I have a table in an Excel file with a single cell, like so:

 

[Screenshot: Excel table with a single date cell]

 

Then I connect to this table from Dataflows. I set the data type of the column to Date/Time, since Date is not supported yet:

[Screenshot: dataflow column set to Date/Time]

 

As you can see, there is no error, and the dataflow refreshes all right.

 

When I try to connect to this entity from Power BI Desktop, I can't get past an error, no matter what data type I set:

[Screenshot: data format error in Power BI Desktop]

 

I tried setting the data type to Text, Date, and Date/Time, including by using locale, and I also tried setting the data type to Date in the dataflow itself (just in case -- I know this isn't supported).

 

The only way I can solve this problem is by storing the date as 20181130, as an integer or text -- this is definitely far from ideal, and I should not have to do it.
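For anyone following along, here is a sketch of that workaround with a hypothetical DateKey column: the dataflow stores the unambiguous text key, and Desktop rebuilds the Date from it:

let
    // Dataflow side: the date stored as an unambiguous YYYYMMDD text key
    Stored = #table(type table [DateKey = text], {{"20181130"}}),
    // Desktop side: rebuild a proper Date column from the key
    Parsed = Table.AddColumn(Stored, "Date",
        each #date(
            Number.FromText(Text.Start([DateKey], 4)),
            Number.FromText(Text.Middle([DateKey], 4, 2)),
            Number.FromText(Text.End([DateKey], 2))
        ),
        type date)
in
    Parsed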

 

Not supporting all Power Query data types becomes a product defect, not a mere inconvenience.

 

Also, wouldn't supporting the Date data type make the files smaller? I assume the time component is currently stored even when the actual data type is Date. Since the entities are stored as CSVs, I can see a lot of resources being wasted.

 

It would be great if all data types were supported (idea), and it would also be great if we didn't have to deal with locales, because dataflows apparently store date/times in US format, while much of the rest of the world uses other date formats.

Daniil
Kudo Kingpin

The situation with dates is getting a bit ridiculous and must be addressed properly, not by asking me to submit an idea request.

 

My current setup:

  1. Excel file with dates in "mmm yyyy" format
  2. Laptop with Australian locale (DMY)
  3. Virtual machine running the gateway, with US locale (MDY)

I am parsing the dates in the Excel file, and every date is set to the first of the month, which is perfectly fine for me. This is what I see when I author a dataflow from my laptop:

[Screenshot: dates in DMY format, as expected]

The data type is set to Date, and, according to @nimrodg, it is converted to DateTime, which is acceptable in this case.

 

Later, when I read the dataflow, all dates are read incorrectly and range from 1 January to 12 January for every year -- the day and month are evidently being swapped, so 1 November, for example, comes back as 11 January.

 

This makes very little sense to me, and I can't see how this is by design.

 

If someone tells me that I should align the locales of my development machine and my gateway machine, I'll disagree -- there are some perfectly valid reasons for having different locales.

 

At this stage, the only workaround I know of is to save dates as YYYYMMDD text strings.
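A related sketch, assuming the dates can be brought into Desktop as US-formatted text (hypothetical column name OrderDate): passing an explicit culture to Table.TransformColumnTypes makes the conversion independent of the locale of whichever machine runs the refresh:

let
    // Stand-in for date text serialised in US month-first order
    Entity = #table(type table [OrderDate = text], {{"11/1/2018 12:00:00 AM"}}),
    // Convert with an explicit en-US culture so the result is the same on
    // a DMY laptop and an MDY gateway machine
    Typed = Table.TransformColumnTypes(Entity, {{"OrderDate", type datetime}}, "en-US")
in
    Typed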

 

Please fix this -- this cannot be by design, I am sure.

v-qiuyu-msft
Community Support

Hi @Daniil,

 

Please post a new thread; we will investigate this issue and reply there.

 

Best Regards,
Qiuyun Yu