Mogura27
Frequent Visitor

The specified culture is not supported

I'm doing similar operations to the case reported here - https://community.powerbi.com/t5/Service/Specified-culture-is-not-supported-error/td-p/8730 - but the solution there is not working for me currently.

 

When I try to refresh the published dataset at the online service, I get these messages:
 
Underlying error message: The specified culture is not supported.
Microsoft.Data.Mashup.ValueError.Reason: DataFormat.Error
 
The table in question is a collection of dates parsed from two columns of the master table:
 
let
   Dates = {List.Distinct(SupportAllCases[CaseCreatedDate]),List.Distinct(SupportAllCases[CaseClosedDate])},
   #"Converted to Table" = Table.FromList(Dates, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
   #"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"),
   #"Removed Duplicates" = Table.Distinct(#"Expanded Column1"),
   #"Renamed Columns" = Table.RenameColumns(#"Removed Duplicates",{{"Column1", "Dates"}}),
   #"Changed Type with Locale" = Table.TransformColumnTypes(#"Renamed Columns", {{"Dates", type date}}, "en-150")
in
   #"Changed Type with Locale"
 
In PBI Desktop, 5 additional custom columns are defined in DAX to aggregate data from SupportAllCases based on each unique Dates row value. I've also tried it without those derived columns, with no change in the outcome.
 
I've also tried deleting this table (publish and refresh works) and regenerating the table (refresh then fails with the same error). I've tried dropping the 'Changed Type with Locale' line (and updating the In value accordingly), and still get the same error.
 
I've published aggregate date columns before, so I'm a little perplexed as to why it's failing now. 
 
Product Version:
2.56.5023.1021 (PBIDesktop) (x64)
3 Replies
v-piga-msft
Resident Rockstar

Hi @Mogura27,

 

I made a test with the information you provided by entering the data and editing it in the Query Editor, then publishing the report to the Power BI Service and refreshing it; everything works well.

 


 

As another test, I configured an on-premises gateway and refreshed the dataset; it also refreshed successfully without any error.

 

Do you get this error when you configure a gateway? If so, could you share more details of the error?

 

Could you share a dummy pbix file that reproduces this issue, so that we can investigate further? You can upload it to OneDrive or Dropbox and post the link here. Please mask sensitive data before uploading.

 

Best Regards,

Cherry

Community Support Team _ Cherry Gao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Hi,

 

Here is a sanitized test that reproduces the problem.

 

https://www.dropbox.com/sh/vnc06qs9p6ku4cn/AAAwobNwCIyj0tCIBLCkdytga?dl=0

 

The source dataset is a JSON file, with a subset of actual data from the production system as returned by the provider system and intermediary proxy. The date column strings are in dd/mm/yyyy format.
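For context, the source query has roughly this shape (the file path here is just a placeholder and the JSON layout is simplified); the date columns arrive as dd/mm/yyyy text values at this point:

let
   // Placeholder path; the actual source is the JSON payload returned by the provider/proxy.
   Source = Json.Document(File.Contents("C:\data\support-cases.json")),
   // Assumes the payload is a list of records, one per support case.
   SupportAllCases = Table.FromRecords(Source)
in
   SupportAllCases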

 

The pbix will initially publish this dataset correctly without errors, but a refresh with a gateway configured for the JSON file will throw the error. Removing the Changed Type with Locale steps from the 'locale-bug' table will let the dataset refresh online correctly.

 

Refresh results for the combinations I tested:

  1. Locale-bug (source) and Query2 (target) both have the Changed Type with Locale step = failure
  2. Only Query2 (target) has the Changed Type with Locale step = success
  3. Only Locale-bug (source) has the Changed Type with Locale step = failure
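
Since the failure tracks the Changed Type with Locale step, one variant worth noting (a sketch only, not something I've verified) is the same step with a more widely supported dd/mm/yyyy culture such as "en-GB" in place of "en-150", in case "en-150" (English - Europe) is itself the culture the mashup engine on the gateway machine doesn't recognize:

   // Sketch: identical to the step in the Dates query above, but with "en-GB"
   // instead of "en-150"; the culture choice here is an assumption, not a confirmed fix.
   #"Changed Type with Locale" = Table.TransformColumnTypes(#"Renamed Columns", {{"Dates", type date}}, "en-GB")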

I need to be able to do date-based analysis of the rows in the primary table. I do control the source for the intermediary proxy, so I can transform the date strings from dd/mm/yyyy to mm/dd/yyyy to make them PBI friendly, but I would prefer not to have to do that, and instead use the pass-through data as-is without any mid-stream modification.
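
Alternatively (again just a sketch, using my column names and an assumed "en-GB" culture), the text values could be parsed explicitly per column with Date.FromText, which would avoid both the proxy-side reformatting and the locale-based change-type step:

   // Sketch: explicit per-column parse of dd/mm/yyyy text; assumes the two
   // columns are still text at this point and that "en-GB" parses them correctly.
   ParsedDates = Table.TransformColumns(
       SupportAllCases,
       {
           {"CaseCreatedDate", each Date.FromText(_, "en-GB"), type date},
           {"CaseClosedDate",  each Date.FromText(_, "en-GB"), type date}
       }
   )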

 

For reference, I have the following columns in the primary table from the provider:

 

  1. Created date/time
  2. Closed date/time
  3. First activity date/time

I need to do filters and aggregation on those values, and derive additional values from the differences (3 - 1) and (2 - 1). I could potentially derive the string columns from the source table, convert the types in derived tables, then link them back to the primary table in a relationship on each row's unique ID, but that's also messy and introduces its own risk of problems.
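
For the derived duration values, what I have in mind is roughly the following (a sketch; the datetime column names are placeholders for the actual provider columns and assume they are already typed as datetime):

   let
       // Placeholder column names; (3 - 1) = time from creation to first activity.
       WithFirstActivity = Table.AddColumn(SupportAllCases, "HoursToFirstActivity",
           each Duration.TotalHours([FirstActivityDateTime] - [CreatedDateTime]), type number),
       // (2 - 1) = time from creation to close; null while the case is still open.
       WithClose = Table.AddColumn(WithFirstActivity, "HoursToClose",
           each Duration.TotalHours([ClosedDateTime] - [CreatedDateTime]), type number)
   in
       WithClose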

 

Thank you.

 
