Picci
Helper I

Cannot attach an external CDM folder to a new dataflow

Hi everybody,

As Power BI admin, I configured Power BI dataflows to save their data in my ADLS Gen2 storage account.

When I create a Power BI dataflow starting from Power Query in the Power BI service, the corresponding folders and files are generated in ADLS Gen2.

But when I try to attach an external CDM folder to a new dataflow (as explained here: https://docs.microsoft.com/en-us/power-bi/service-dataflows-add-cdm-folder), I receive the following error:

 

Something went wrong
There was a problem creating a new dataflow.
Please try again later or contact support. If you contact support, please provide these details.
Activity ID: d6173bad-b724-4604-a586-18f7506586f6
Request ID: f9bce339-1e57-5796-7b8e-d4600a035fcb
Correlation ID: 9da79dac-7eea-24c1-d7af-73d4c963a4f6
Status code: 500
Time: Tue Feb 19 2019 19:22:00 GMT+0100 (Central European Standard Time)
Version: 13.0.8383.201
Cluster URI: https://wabi-north-europe-redirect.analysis.windows.net

 

 

Thanks for helping,

Elisa

1 ACCEPTED SOLUTION
Picci
Helper I

After some more trials, I managed to solve the issue by following these steps:

  1. Create the ADLS Gen2 folder in Azure Storage Explorer.
  2. BEFORE populating the folder, assign Read and Execute permissions on that folder to the user who is going to attach the CDM folder to the dataflow in the Power BI service (see the sketch after this list).
  3. Execute the Databricks notebook so that it writes inside that folder (model.json and the other subfolders inherit the user's permissions).
  4. In the Power BI service: Create > Dataflow > Attach an external CDM folder.
  5. Paste the CDM folder path, ending with <…..>/model.json.
  6. It works!

 

Steps 2 and 5 are not so clear from the documentation (https://docs.microsoft.com/en-us/power-bi/service-dataflows-add-cdm-folder), IMHO.

 

Thanks,

Elisa

2 REPLIES

Anonymous
Not applicable

I'm exploring using the attach-CDM capability to transition our ETL to ADFv2 and then produce output in the CDM format so that it can be consumed in PBI. I'm trying to create my own model.json and folder structure to test this out before going the Databricks route.
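Something like the minimal, hand-written skeleton below is what I have in mind; entity and file names are placeholders and I have not tested it yet, it is just to illustrate the shape of a CDM folder's model.json (entities with attributes, plus partitions pointing at the CSV files).

  # Sketch: compose and write a minimal model.json for a hand-built CDM folder.
  # Assumed layout (placeholders):
  #   <cdm-folder>/model.json
  #   <cdm-folder>/Sales/Sales.csv
  import json

  model = {
      "name": "HandBuiltModel",
      "version": "1.0",
      "entities": [
          {
              "$type": "LocalEntity",
              "name": "Sales",
              "attributes": [
                  {"name": "OrderId", "dataType": "string"},
                  {"name": "Amount", "dataType": "double"},
              ],
              "partitions": [
                  {
                      "name": "Sales-1",
                      "location": "https://<storage-account>.dfs.core.windows.net/<filesystem>/<cdm-folder>/Sales/Sales.csv",
                  }
              ],
          }
      ],
  }

  # Write it next to the data; this is the file whose path gets pasted when
  # attaching the external CDM folder in the Power BI service.
  with open("model.json", "w") as f:
      json.dump(model, f, indent=2)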

 

Has using Databricks and then attaching the CDM folder for PBI consumption worked well for you?
