
Anonymous
Not applicable

Configuring dataflow storage to use Azure Data Lake Gen 2 when there are existing dataflows

I hope everyone is keeping healthy. This is my first post here, and I was hoping to discuss one of my client's requirements and see if there's an easy solution. Hopefully someone here has encountered this before.

My client has 3 Power BI Premium capacities. Business users with access to workspaces on these capacities have created 136 dataflows so far, across about 28 workspaces. These dataflows connect to various sources and ultimately feed reports.

As you know, dataflows are stored by default in a Microsoft-provided data lake that no one can access directly - this is a problem for us because of data sensitivity. Hence, we want to start using a new Azure Data Lake Storage Gen2 account in my client's Azure subscription for dataflow storage.
As of now, a data lake account can be attached either at the workspace level or at the tenant level. My client wants to attach it at the tenant level so that whenever a new Premium workspace starts using dataflows, its data is automatically saved in my client's data lake.

I know that if a workspace has any existing dataflows, we can't attach the storage to that workspace. To do so, we have to delete all dataflows in the workspace, attach the storage, and then recreate the dataflows. Deleting the dataflows and recreating them comes with its own challenges.
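For what it's worth, the delete-and-recreate step can at least be scripted rather than done by hand in the portal. Below is a minimal sketch using the documented Power BI REST API dataflow endpoints (Get Dataflow returns the model.json definition, which you'd save as a backup before deleting). The token acquisition and the group/dataflow IDs are placeholders, and error handling is omitted - this is an illustration of the approach, not a tested migration tool.

```python
# Sketch: back up a dataflow's definition (model.json) and delete it,
# so the workspace can then be attached to ADLS Gen2 storage.
# Assumes a valid AAD access token with Dataflow.ReadWrite.All scope;
# token/IDs below are hypothetical placeholders.
import json
import urllib.request
from typing import Optional

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def dataflows_url(group_id: str, dataflow_id: Optional[str] = None) -> str:
    """Build the REST URL for a workspace's dataflows (or one dataflow)."""
    url = f"{API_BASE}/groups/{group_id}/dataflows"
    return url if dataflow_id is None else f"{url}/{dataflow_id}"

def export_dataflow(token: str, group_id: str, dataflow_id: str) -> dict:
    """GET the dataflow definition (model.json) so it can be recreated later."""
    req = urllib.request.Request(
        dataflows_url(group_id, dataflow_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def delete_dataflow(token: str, group_id: str, dataflow_id: str) -> None:
    """DELETE the dataflow, clearing the way to attach ADLS Gen2 storage."""
    req = urllib.request.Request(
        dataflows_url(group_id, dataflow_id),
        headers={"Authorization": f"Bearer {token}"},
        method="DELETE",
    )
    urllib.request.urlopen(req).close()
```

Recreating the dataflows afterwards would go through the Imports API (posting the saved model.json back into the workspace), which has its own quirks; refresh schedules and permissions are not carried in model.json and would need to be reapplied separately.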

Hence, my questions are:

  1. What would happen if we attached my client's data lake to their Power BI tenant?
    a. Would all existing dataflows stop working?
    b. Would existing dataflows keep working on the Microsoft-provided data lake, while new workspaces use my client's data lake for dataflow storage?
    c. Would existing dataflows move to my client's data lake automatically during the next refresh?
  2. Given the situation and the requirement, is there a better way to make this change without impacting the dataflows and reports?

I hope that paints a clear picture. Please let me know if anything is unclear.

2 REPLIES
Anonymous
Not applicable

Hi Jay,

 

Thanks for the reply. By "added my client's data lake to their Power BI tenant", I mean configuring my client's ADLS Gen2 account with Power BI at the tenant level for dataflows. Please refer to the first bullet point here: https://docs.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-azure-data-lake-storag...

 

Regards,

v-jayw-msft
Community Support

Hi @Anonymous ,

 

I don't quite understand what you mean by "added my client's data lake to their Power BI tenant".

You may take a look at the Considerations and limitations section:

https://docs.microsoft.com/en-us/power-query/dataflows/connect-azure-data-lake-storage-for-dataflow#considerations-and-limitations 

 

Best Regards,

Jay

Community Support Team _ Jay
If this post helps, then please consider accepting it as the solution
to help the other members find it.
