Error when loading data in Datamart


Hi,

 

I am trying to add dataflows to a datamart; however, when loading the data into the datamart, I get an error that my date column is not a valid date/time field. Please see the error below.

 

Wrapping things up

InternalError
ErrorMessage: String was not recognized as a valid DateTime.
HttpStatusCode: 500

 

I suspect it is because the locale seems to reset itself (to United States) after you load data in the dataflow.
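
For reference, the kind of failure a locale mismatch causes can be reproduced directly in Power Query M. A minimal sketch (the sample value and cultures are only illustrative, not taken from my dataflow):

let
    Sample = "31/05/2022 14:30:00",
    // Parses fine when a day-first culture is applied explicitly.
    ParsedGb = DateTime.FromText(Sample, "en-GB"),
    // Fails when the United States culture is applied to the same text.
    ParsedUs = try DateTime.FromText(Sample, "en-US")
in
    [ParsedGb = ParsedGb, ParsedUs = ParsedUs]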

 

Could you please have a look?

 

thanks,

 

Stephan

Status: Accepted

Thanks to @Anonymous for sharing; users who have run into the same issue can try this workaround:

 

It takes a table and a list of column names as parameters and assumes those columns are of type text.

 

Feel free to use and adapt it (e.g. making it more generic for different time zone offsets); a sketch of one such adaptation follows the function below.

 

Function TransformSQLDTZ (define it as its own query named TransformSQLDTZ):

 

let
    Quelle = (Tab as table, cols as list) =>
        let
            Quelle1 = Table.ReplaceValue(
                Table.ReplaceValue(
                    Table.ReplaceValue(Tab, "+01:00", "", Replacer.ReplaceText, cols),
                    "T", " ", Replacer.ReplaceText, cols),
                "+02:00", "", Replacer.ReplaceText, cols)
        in
            Quelle1
in
    Quelle
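
One possible adaptation along those lines (a sketch only, not part of the original workaround): rather than replacing the fixed "+01:00" / "+02:00" offsets, parse each value as a datetimezone and drop whatever offset it carries.

let
    TransformAnyTZ = (Tab as table, cols as list) =>
        Table.TransformColumns(
            Tab,
            List.Transform(cols, (c) => {c,
                // Parse the ISO 8601 text, strip the zone, and write it back as plain text.
                each DateTime.ToText(DateTimeZone.RemoveZone(DateTimeZone.FromText(_)), "yyyy-MM-dd HH:mm:ss"),
                type text}))
in
    TransformAnyTZ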


Implementation (query code snippet; copy it into your query and adapt "StepBeforeDateConversion" to the name of your previous step):


ChangeDatestoText = Table.TransformColumnTypes(StepBeforeDateConversion, {{"AnalyticsUpdatedDate", type text}, {"CompletedDate", type text}, {"StartedDate", type text}}),
// Step name kept different from the function name to avoid a cyclic reference.
CleanedDates = TransformSQLDTZ(ChangeDatestoText, {"AnalyticsUpdatedDate", "CompletedDate", "StartedDate"}),


After those amendments you should be able to import datetime/datetimezone columns into the datamart successfully (as text values).
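
For illustration, here is a self-contained sketch of how the pieces fit together. The sample rows and the inline copy of the function are only for demonstration; in a real dataflow the function lives in its own query and StepBeforeDateConversion is whatever your actual previous step is named:

let
    // Inline copy of the workaround function, so the sketch runs on its own.
    TransformSQLDTZ = (Tab as table, cols as list) =>
        Table.ReplaceValue(
            Table.ReplaceValue(
                Table.ReplaceValue(Tab, "+01:00", "", Replacer.ReplaceText, cols),
                "T", " ", Replacer.ReplaceText, cols),
            "+02:00", "", Replacer.ReplaceText, cols),

    // Hypothetical sample data standing in for the step before the date conversion.
    StepBeforeDateConversion = #table(
        type table [AnalyticsUpdatedDate = text, CompletedDate = text, StartedDate = text],
        {{"2022-05-31T10:15:00+02:00", "2022-05-31T11:00:00+02:00", "2022-05-31T09:30:00+01:00"}}),

    // Force the columns to text so the string replacements can run.
    ChangeDatestoText = Table.TransformColumnTypes(
        StepBeforeDateConversion,
        {{"AnalyticsUpdatedDate", type text}, {"CompletedDate", type text}, {"StartedDate", type text}}),

    // Strip the "T" separator and the timezone offsets.
    CleanedDates = TransformSQLDTZ(
        ChangeDatestoText,
        {"AnalyticsUpdatedDate", "CompletedDate", "StartedDate"})
in
    CleanedDates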

 

 

Best Regards,
Community Support Team _ Yingjie Li

Comments
onkelphill
Frequent Visitor

same issue here! 

ACPCI
Regular Visitor

Same issue; in my case it has paralyzed all of my dataflows that are stored in Azure, and I get the errors described in the attached photos (Diapositiva1.JPG, Diapositiva2.JPG).

bx5a
New Member

Same thing. In my case the culprit is the extended date table from EDNA which we have as a dataflow.

 

Simply can't ingest it into the datamart.

 

Tried recreating it as a new blank query in the datamart itself. Doesn't work either.

 

Anyone figured out a workaround?

v-yingjl
Community Support
Status changed to: Investigating

Although I could not reproduce this issue, I ran into other problems when testing the datamart.

 

I connected a dataflow to an Excel file that has a date column and a value column with 3 rows. After creating a datamart and connecting it to this dataflow, the error did not appear, but no rows showed up in the table.

 

I will continue to investigate datamart issues. If you have more details about this problem, such as the data source of the dataflow, the workspace type, etc., please share them here; it would help in collecting the related information.

 

Best Regards,
Community Support Team _ Yingjie Li

JP8991
Advocate II

Same issue for me too.

Hope this gets sorted soon, as I currently cannot use the feature; we rely heavily on dataflows in our business.

StephanB12
Advocate I

Hi,

 

I managed to load my data this morning, so the issue seems to be resolved.

 

Best wishes,

 

Stephan

onkelphill
Frequent Visitor

@StephanB12 I cannot confirm. I just tried again to load a dataflow and did not succeed!

StephanB12
Advocate I

Hi @onkelphill,

Please note my issue was related to loading data into the new datamart feature, not into a dataflow.

Not sure if this makes a difference?

Best wishes,

Stephan

onkelphill
Frequent Visitor

@StephanB12 thanks.

 

Just to add some detail, this is the error message. The dataflow holds just 1 table. There is 1 column of data type date, correctly formatted. 

 

[Screenshot: onkelphill_0-1653988955538.png]

 

TototheK
Regular Visitor

I have the exact same issue and same error message.

In my dataflow the datetime columns are created via Power Query M's datetime functions, so they really should be valid datetime values. I also tried applying a regional setting (locale) to the column in the dataflow, but the 500 error persists.
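
In case it helps anyone trying the regional-settings route: the culture can also be passed straight to Table.TransformColumnTypes (roughly what the "Using Locale" option in the editor generates). A minimal sketch with a hypothetical text column of day-first dates:

let
    Source = #table(type table [StartedDate = text],
        {{"31/05/2022 14:30:00"}, {"01/06/2022 09:00:00"}}),
    // Convert to datetime using an explicit culture instead of the environment default.
    Typed = Table.TransformColumnTypes(Source, {{"StartedDate", type datetime}}, "en-GB")
in
    Typed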