Anonymous
Not applicable

How to get additional resources (memory) for dataflows to run faster?

Hi Community,

 

We have been struggling to manage dataset and dataflow refreshes in one Premium capacity, a P3 (32 cores). Many times, datasets and dataflows (backend operations) cannot find resources because of the many user interactions with reports in the Service.

 

We are planning to scale up our capacity and dedicate separate memory so that resources are available for dataflows and datasets to refresh faster. Is there a way to do this? Please assist!

 

G Venkatesh

 


8 REPLIES
Mariusz
Community Champion

Hi @Anonymous 

 

You can optimize your dataflows and allocate the maximum resources from your P3 (if you haven't already, the Dataflows workload can be given up to 20% of the capacity's memory):

https://docs.microsoft.com/en-us/power-bi/service-dataflows-overview

https://docs.microsoft.com/en-us/power-bi/service-admin-premium-workloads
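If you prefer to script this rather than click through the admin portal, below is a minimal sketch using the Power BI REST API's Capacities PatchWorkload call. The token and capacity ID are placeholders you would supply, and the call needs capacity admin rights; treat this as an illustration, not a tested script.

```python
import requests

# Both values below are placeholders (assumptions), not real IDs.
ACCESS_TOKEN = "<azure-ad-access-token>"   # token with capacity admin rights
CAPACITY_ID = "<premium-capacity-id>"

url = (f"https://api.powerbi.com/v1.0/myorg/capacities/"
       f"{CAPACITY_ID}/Workloads/Dataflows")
headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json",
}

# Raise the Dataflows workload memory cap toward the 20% maximum.
body = {"state": "Enabled", "maxMemoryPercentageSetByUser": 20}

resp = requests.patch(url, headers=headers, json=body)
resp.raise_for_status()
print("Dataflows workload updated, HTTP", resp.status_code)
```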

 

Alternatively, get a separate instance of Azure Data Lake Storage Gen2 and do your transformations there (a lot cheaper than Premium nodes, and not restricted by the 20% cap).

https://powerbi.microsoft.com/fr-fr/blog/power-bi-dataflows-and-azure-data-lake-storage-gen2-integration-preview/

 

 

Best Regards,
Mariusz

If this post helps, then please consider Accepting it as the solution.

Please feel free to connect with me.
LinkedIn

 

Anonymous
Not applicable

Hi Mariusz,

 

Thanks for the reply. Unfortunately, we do not have our own ADLS Gen2 in my organization. As of now, below is the workload configuration we have set for dataflows. Do you recommend any changes?

 

[Screenshot: Dataflows workload settings, Max Memory (%) set to 10]

 

Kindly let me know your inputs. 

 

G Venkatesh

Hi @Anonymous 

 

Your Max Memory is set to 10%; you can go as far as 20%, so try that and see if it makes a difference.

 

Best Regards,
Mariusz

If this post helps, then please consider Accepting it as the solution.

Please feel free to connect with me.
LinkedIn

 

Mariusz
Community Champion

Hi @Anonymous 

 

Also, read this article.

https://powerbi.microsoft.com/en-us/blog/power-bi-dataflows-june-2019-feature-summary/

 

When designing dataflows on Premium capacity, make sure you stage your data by referencing entities, and use incremental refresh where possible.
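To see whether incremental refresh is actually paying off, you can pull a dataflow's refresh history with the REST API's Get Dataflow Transactions call. A minimal sketch, assuming a valid Azure AD token and the workspace/dataflow IDs (all placeholders):

```python
import requests

# Placeholders (assumptions) -- substitute your own values.
ACCESS_TOKEN = "<azure-ad-access-token>"
GROUP_ID = "<workspace-id>"
DATAFLOW_ID = "<dataflow-id>"

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/dataflows/{DATAFLOW_ID}/transactions")
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

resp = requests.get(url, headers=headers)
resp.raise_for_status()

# Each transaction records the refresh type, start/end times, and status,
# which lets you compare incremental vs. full refresh durations.
for t in resp.json().get("value", []):
    print(t.get("refreshType"), t.get("startTime"),
          t.get("endTime"), t.get("status"))
```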

 

 

Best Regards,
Mariusz

If this post helps, then please consider Accepting it as the solution.

Please feel free to connect with me.
LinkedIn

 

Anonymous
Not applicable

Hey Mariusz,

 

That's a great suggestion. I think we are close to getting this done. However, I do have some additional questions to make sure I am not breaking any basic rules in this process.

 

- When you say stage the data by referencing, do you suggest creating a reference to the main dataflow and keeping the referenced dataflow in the same workspace?

- I did set up incremental refresh for a couple of entities in dataflows, and they refreshed. However, I do not see the reports being updated until I refresh my dataflows manually or on a schedule (and those refreshes take a lot of time).

 

Thanks,

G Venkatesh

Hi @Anonymous 

 

For the linked entities to refresh automatically, you will need to keep them in the same workspace; please refer to the article below.

https://docs.microsoft.com/en-us/power-bi/service-dataflows-linked-entities

 

Your first entity should be simple, with minor transformations, just pulling the data from your data source (ideally with incremental refresh); the second (and third, if applicable) should be where you do most of your transformations, and it is what you later use to build reports.
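If you ever need the downstream dataset to pick up a finished dataflow refresh without waiting for its own schedule, one option is to chain the two refreshes yourself via the REST API (Refresh Dataflow, then Refresh Dataset). A rough sketch only, with placeholder IDs and the assumption that the most recent transaction comes first in the response:

```python
import time
import requests

# Placeholders (assumptions) -- substitute your own values.
ACCESS_TOKEN = "<azure-ad-access-token>"
GROUP_ID = "<workspace-id>"
DATAFLOW_ID = "<staging-dataflow-id>"
DATASET_ID = "<report-dataset-id>"

base = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json",
}

# 1. Kick off the dataflow refresh.
r = requests.post(f"{base}/dataflows/{DATAFLOW_ID}/refreshes",
                  headers=headers, json={"notifyOption": "MailOnFailure"})
r.raise_for_status()

# 2. Poll the refresh history until the latest transaction finishes.
#    Assumption: the most recent transaction is first in the response.
while True:
    tx = requests.get(f"{base}/dataflows/{DATAFLOW_ID}/transactions",
                      headers=headers)
    tx.raise_for_status()
    latest = tx.json()["value"][0]
    if latest.get("status") != "InProgress":
        break
    time.sleep(60)

# 3. Refresh the dataset that the reports are built on.
r = requests.post(f"{base}/datasets/{DATASET_ID}/refreshes",
                  headers=headers, json={"notifyOption": "MailOnFailure"})
r.raise_for_status()
```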

 

Best Regards,
Mariusz

If this post helps, then please consider Accepting it as the solution.

Please feel free to connect with me.
LinkedIn

 

Anonymous
Not applicable

Hey Mariusz,

 

Hope you are having a fantastic day so far. Thanks for your reply. I think I have enough input from you on using dataflows in a more optimized manner.

 

With this being said, I will keep working on them and get back to you when I have any further questions.

Thanks for all your help! Kudos!

Hi @Anonymous 

 

No problem, have a nice day and have fun with dataflows.

 

 

Best Regards,
Mariusz

If this post helps, then please consider Accepting it as the solution.

Please feel free to connect with me.
LinkedIn

 
