
Error when loading data in Datamart


Hi,

 

I am trying to add dataflows to a datamart; however, when loading the data into the datamart, I get an error that my date column is not a valid date/time field. Please see the error below.

 

Wrapping things up

InternalError
ErrorMessage: String was not recognized as a valid DateTime.
HttpStatusCode: 500

 

I suspect it is because the locale seems to reset itself (to English (United States)) after you load data in the dataflow.
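(For the parsing side of this, a minimal sketch, assuming a UK-formatted text column named "StartedDate" and the "en-GB" culture, neither of which is from the thread: passing an explicit culture to Table.TransformColumnTypes makes the conversion independent of the project locale setting. Per the replies below, the datamart-side locale reset may still interfere.)

let
    // Sample data (assumption): a UK-formatted date/time stored as text
    Source = #table({"StartedDate"}, {{"28/06/2022 14:30:00"}}),
    // The optional third argument forces the "en-GB" culture for the conversion,
    // so parsing does not rely on the dataflow's locale setting
    Typed = Table.TransformColumnTypes(Source, {{"StartedDate", type datetime}}, "en-GB")
in
    Typed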

 

Could you please have a look?

 

thanks,

 

Stephan

Status: Accepted

Thanks to @Anonymous for sharing. Users who have faced the same issue can try this workaround:

 

It takes a table and a list of column names (as strings) as parameters and assumes those columns are of type text.

 

Feel free to use and adapt it (e.g. making it more generic for different time zones; see the sketch after the implementation snippet below).

 

Function TransformSQLDTZ (create as a separate query named TransformSQLDTZ):

let
    // Takes a table and a list of column names (the columns must already be of type text);
    // strips the "+01:00"/"+02:00" offsets and replaces the ISO 8601 "T" separator with a space
    Quelle = (Tab as table, cols as list) =>
        let
            Quelle1 = Table.ReplaceValue(
                Table.ReplaceValue(
                    Table.ReplaceValue(Tab, "+01:00", "", Replacer.ReplaceText, cols),
                    "T", " ", Replacer.ReplaceText, cols),
                "+02:00", "", Replacer.ReplaceText, cols)
        in
            Quelle1
in
    Quelle


Implementation (query code snippet; copy it into your query and adapt "StepBeforeDateConversion" to the name of your previous step):


// Note: the step is named AppliedTransformSQLDTZ rather than TransformSQLDTZ so it does not
// shadow the shared function inside the let expression (which would cause a cyclic reference)
ChangeDatestoText = Table.TransformColumnTypes(StepBeforeDateConversion, {{"AnalyticsUpdatedDate", type text}, {"CompletedDate", type text}, {"StartedDate", type text}}),
AppliedTransformSQLDTZ = TransformSQLDTZ(ChangeDatestoText, {"AnalyticsUpdatedDate", "CompletedDate", "StartedDate"}),
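For context, a minimal sketch of how those two steps might sit inside a complete query; the server, database, and table names here are placeholders, not from the thread:

let
    // Placeholder source and navigation steps; replace with your own connection
    Source = Sql.Database("yourserver.database.windows.net", "yourdatabase"),
    StepBeforeDateConversion = Source{[Schema = "dbo", Item = "YourTable"]}[Data],
    // Convert the date columns to text, then strip the offsets via the shared function
    ChangeDatestoText = Table.TransformColumnTypes(StepBeforeDateConversion, {{"AnalyticsUpdatedDate", type text}, {"CompletedDate", type text}, {"StartedDate", type text}}),
    AppliedTransformSQLDTZ = TransformSQLDTZ(ChangeDatestoText, {"AnalyticsUpdatedDate", "CompletedDate", "StartedDate"})
in
    AppliedTransformSQLDTZ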


After those amendments you should be able to import datetime/datetimezone columns successfully into the datamart (as text values).
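One possible way to make the workaround more generic across time zones (a sketch under assumptions, not part of the original reply; the name TransformSQLDTZGeneric and the output format are mine) is to parse each value as a datetimezone, drop the zone without shifting the clock time, and re-serialize it as text so no "+HH:MM" offset reaches the datamart:

let
    // Generic variant (sketch): handles any offset, not just +01:00/+02:00
    TransformSQLDTZGeneric = (Tab as table, cols as list) as table =>
        Table.TransformColumns(
            Tab,
            List.Transform(
                cols,
                (c) => {c,
                        each DateTime.ToText(
                            DateTimeZone.RemoveZone(DateTimeZone.From(_)),
                            [Format = "yyyy-MM-dd HH:mm:ss"]),
                        type text}
            )
        )
in
    TransformSQLDTZGeneric

It can then be called in place of the original, e.g. TransformSQLDTZGeneric(ChangeDatestoText, {"AnalyticsUpdatedDate", "CompletedDate", "StartedDate"}).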

 

 

Best Regards,
Community Support Team _ Yingjie Li

Comments
JP8991
Advocate II

@Anonymous 

Great information and a really nice way to fix the issue. Good to understand why it is happening and it makes sense.

However, as a long-term solution, I come down on the side that this shouldn't be an issue in the first place and we shouldn't have to rely on workarounds.

Anonymous
Not applicable

Fully agree.

 

I stumbled upon something similar the first time I used my own Azure SQL Server as temporary storage to improve performance: I exported data in the first report ETL stage (via the R library RODBC) and reimported it into Power BI in the second stage. SQL Server could only store the values correctly when I transformed them into datetimezone and then text. That already looked strange, but I thought it was an R library issue.

 

In fact, it seems that here (and possibly in the R issue above too) it is a problem of the Azure SQL Server connector between Power BI and SQL Server on export to SQL Server. Microsoft technology to Microsoft technology 😂.
Hope they fix that soon.

Dilshod81
Regular Visitor

Same error

v-yingjl
Community Support
Status changed to: Accepted


JP8991
Advocate II

@v-yingjl  I hope you are well, is there any update on this one?

Anonymous
Not applicable

I'll soon publish a more generic solution. Hold on.

However, the original issue can only be solved by Microsoft.

gmoneyrb
Regular Visitor

I had the same issue but have managed to get it to work, albeit stuck in English (United States) within project options.

 

My dataflow was set to English (United Kingdom) in project options. However, the datamart always reverts back to English (United States) despite my attempts to change it. Once I changed the dataflow to English (United States) and ran it through the datamart again, the load completed and refreshes work too.

 

Ideally, I would have both set to English (United Kingdom), but I'm guessing the project options issue is a bug in the datamart preview.

JP8991
Advocate II

@gmoneyrb I can confirm that this is the issue; it appears Datamarts are not saving the locale you select.

It always defaults back to United States.

onkelphill
Frequent Visitor

@gmoneyrb thanks for sharing.

 

Let's hope this is fixed soon.

onkelphill
Frequent Visitor

Brief update:

 

Without changing anything on my end, I tried again to load the above-mentioned date (dimension) table into the datamart and succeeded. Not sure whether a fix was applied by MS?

 

I immediately added another (fact) table to the datamart so that I could create a relationship between the fact and dimension tables. It did not work. :( Unfortunately, I am now stuck at this very stage, and the error message is not very helpful either.

 

(Screenshot attachment: onkelphill_0-1656408535644.png)