admin_xlsior
Post Prodigy

Get Data from Azure Data Lake Store Gen2 (Not Dataflow)

Hello,

 

When using the new Get Data feature for Azure Data Lake Storage Gen2, I realized I have to enter the same URL path that we use when connecting from a Power BI Dataflow (on the Power BI Service side).

 

This is the (only) input after we choose the Get Data feature:

[screenshot: the URL input in the Get Data dialog]

 

And this is the input when we use Add CDM folder within a Power BI Dataflow in the Power BI Service:

[screenshot: the Add CDM folder dialog, with a hint about the model.json path]

 

In the screen above, they give us a clue about what the path should be: we need to provide the path to the model.json file.

When we look at our data in ADLS Gen2 in the Azure portal, the folders created by the Power BI Service Dataflow process do have that file, so I can open its properties and just copy the path.

 

My question is: if we have our own data in its own separate CDM folder in the same Data Lake Store, but it was not created by a Power BI Service Dataflow (say it came from an Azure Data Factory process), how do we get this model.json file?
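One option, if the folder was produced by something other than a Power BI Dataflow (e.g. Azure Data Factory), is to write the model.json yourself: it is a small JSON document that names each entity, its columns, and the location of its data files. Here is a rough sketch in Python; the account name, folder, file, entity, and column names below are all hypothetical, and the schema shown is only the minimal shape of the CDM model.json format:

```python
import json

def build_model(name, entities):
    """Build a minimal CDM model.json document.

    `entities` maps an entity name to a (csv_url, columns) pair, where
    columns is a list of (attribute_name, data_type) tuples using CDM
    data type names such as "string", "int64", or "dateTime".
    """
    return {
        "name": name,
        "version": "1.0",
        "entities": [
            {
                "$type": "LocalEntity",
                "name": entity_name,
                "attributes": [
                    {"name": col, "dataType": dtype} for col, dtype in columns
                ],
                "partitions": [
                    {"name": f"{entity_name}-partition", "location": csv_url}
                ],
            }
            for entity_name, (csv_url, columns) in entities.items()
        ],
    }

# Hypothetical example: one entity backed by a customer.csv file that an
# Azure Data Factory pipeline wrote into the ExternalDB CDM folder.
model = build_model(
    "ExternalDB",
    {
        "Customer": (
            "https://mydatalake.dfs.core.windows.net/powerbi/ExternalDB/customer.csv",
            [("CustomerId", "int64"), ("Name", "string")],
        )
    },
)

# Drop the resulting model.json into the root of the CDM folder.
with open("model.json", "w") as f:
    json.dump(model, f, indent=2)
```

The file then sits next to the data in the CDM folder, the same way the Dataflow-generated one does.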

 

I hope I have described this scenario clearly enough.

So basically I want to access my CDM folder (ADLS Gen2 data) using the Get Data feature in Power BI Desktop, but for now I don't have a model.json file in my CDM folder.

 

Any advice on this case?

 

Thanks in advance.
1 ACCEPTED SOLUTION

Hi, thanks,

 

I think I found the solution already. It turns out we don't need a "model.json" file when using this new feature to connect to ADLS Gen2.

 

Just put the full URL, all the way down to the file name. For example, if I have a subfolder ExternalDB containing the file customer.csv, I enter the URL like this:

https://mydatalake.dfs.core.windows.net/powerbi/ExternalDB/customer.csv
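The URL pattern above can be sketched as a small helper; the account name mydatalake and filesystem powerbi are hypothetical placeholders:

```python
def adls_gen2_file_url(account: str, filesystem: str, *path_parts: str) -> str:
    """Build the DFS endpoint URL for a file in ADLS Gen2.

    Note: URL segments always use forward slashes, even though Windows
    tools may display the folder path with backslashes.
    """
    return (
        f"https://{account}.dfs.core.windows.net/{filesystem}/"
        + "/".join(path_parts)
    )

url = adls_gen2_file_url("mydatalake", "powerbi", "ExternalDB", "customer.csv")
# -> https://mydatalake.dfs.core.windows.net/powerbi/ExternalDB/customer.csv
```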

 

The only problem is that I need to grant access file by file in Azure Storage Explorer. It is quite tedious.
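Rather than granting access per file, the azure-storage-file-datalake Python SDK can apply a POSIX ACL entry recursively to a directory and everything under it in one call. A sketch under the assumption of a hypothetical account, filesystem, folder, and object ID (the azure-identity and azure-storage-file-datalake packages must be installed for the second function to run):

```python
def make_acl_entry(object_id: str, perms: str = "r-x") -> str:
    """Build a POSIX ACL entry string for a user or service principal."""
    return f"user:{object_id}:{perms}"

def grant_read_recursive(account: str, filesystem: str,
                         directory: str, object_id: str) -> None:
    """Grant read+execute on a directory tree in one recursive call."""
    # Imported here so the helper above stays usable without the SDK installed.
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url=f"https://{account}.dfs.core.windows.net",
        credential=DefaultAzureCredential(),
    )
    directory_client = (
        service.get_file_system_client(filesystem)
               .get_directory_client(directory)
    )
    # Applies the ACL to the directory and every file and folder under it,
    # replacing the per-file grants in Azure Storage Explorer.
    directory_client.set_access_control_recursive(acl=make_acl_entry(object_id))

# Hypothetical usage (object ID of the user or app that needs access):
# grant_read_recursive("mydatalake", "powerbi", "ExternalDB",
#                      "00000000-0000-0000-0000-000000000000")
```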

 

Thanks,

 


4 REPLIES
v-piga-msft
Resident Rockstar

Hi @admin_xlsior ,

I have not tried connecting with the Azure Data Lake Storage Gen2 connector (without a Dataflow) in Power BI Desktop.

You could try connecting to it with the Azure Data Lake Storage Gen2 REST API.

In addition, you could refer to the comment by Kanad Chatterjee, which should be helpful.
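If you go the REST API route, the "Path - List" operation on the DFS endpoint enumerates the files in a filesystem. A minimal sketch of building that request URL (the account and filesystem names are hypothetical, and authentication is omitted):

```python
from urllib.parse import urlencode

def list_paths_url(account: str, filesystem: str,
                   directory: str = "", recursive: bool = True) -> str:
    """Build the ADLS Gen2 'Path - List' REST endpoint URL."""
    query = {"resource": "filesystem", "recursive": str(recursive).lower()}
    if directory:
        query["directory"] = directory
    return f"https://{account}.dfs.core.windows.net/{filesystem}?" + urlencode(query)

url = list_paths_url("mydatalake", "powerbi", directory="ExternalDB")
```

The GET request against this URL (with a bearer token or shared key) returns a JSON list of the paths under the folder.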

Best Regards,

Cherry

Community Support Team _ Cherry Gao
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.


Hi @admin_xlsior ,

Glad to hear the issue is resolved. You can accept your reply as the solution so that other community members can benefit from it.

Best Regards,

Cherry


Hi, actually it is not really solved: I still need to grant access to each file one by one. What if I have thousands of files?

 

Thanks,
