buildbod Regular Visitor

Dataflow size limit

I'm considering breaking one of my larger models into a series of Dataflows to improve the overall refresh performance of the model. I was wondering whether an individual dataflow had a size limit? Also if several dataflows are combined into a single model, will refreshing the model force a refresh of the associated dataflows or will they refresh independently based on their individual refresh frequencies?

10 REPLIES
Super User

Re: Dataflow size limit

Hi there

From what I understand, there is a storage size limit depending on whether you have Power BI Pro (10GB) or Power BI Premium (100TB).

If you do break them up, each would sit in its own separate data store within dataflows.

As it currently stands, you would then manage the refresh of the dataflows yourself to ensure the data stays up to date. If you are using Power BI Premium, you could use Linked Entities or Computed Entities, depending on your requirements.
You would then import the dataflows into your Power BI Desktop file (PBIX) and upload that to the Power BI Service.
Once in the Power BI Service, you would schedule the refresh for your PBIX file.
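For reference, importing a dataflow entity into Power BI Desktop generates a Power Query (M) expression roughly like the sketch below. The workspace, dataflow, and entity names are placeholders, and the exact navigation field names may differ slightly from what the connector generates for you:

```
// Hypothetical sketch: importing a dataflow entity via the built-in
// Power BI dataflows connector. Names are placeholders.
let
    // Connect to the dataflows the signed-in user can access
    Source = PowerBI.Dataflows(null),
    // Navigate down to a specific workspace, dataflow, and entity
    Workspace = Source{[workspaceName = "Sales Workspace"]}[Data],
    Dataflow = Workspace{[dataflowName = "Sales Dataflow"]}[Data],
    Entity = Dataflow{[entity = "FactSales"]}[Data]
in
    Entity
```

In practice you would build this through the Get Data > Power BI dataflows dialog rather than typing it by hand; the generated query can then be refreshed like any other source.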

Did I answer your question? Mark my post as a solution!

"Proud to be a Datanaut!"


Power BI Blog
aldupler Frequent Visitor

Re: Dataflow size limit

The biggest limit I've run into is the refresh timeout (2 hours). I'm running a lot of native queries against AAS. These run slowly as it is, but my feeling is that dataflows are slower than a regular service refresh. As a result, I've broken a big model up into entities across 5-6 different dataflows in the same workspace, then staggered the refresh of each by two hours over the course of the night. Now everything runs fine.

Super User

Re: Dataflow size limit

Hi there

Yes, extracting data out of AAS could potentially be very slow if you are using it as a data-extraction tool for row-by-row data.

Staggering the dataflow refreshes as you describe should work well for that.

buildbod Regular Visitor

Re: Dataflow size limit

Thank you. That confirms my thinking.

 

Does the refresh of the PBIX model force a refresh of the associated dataflows or will they refresh independently based on their individual refresh frequencies?

aldupler Frequent Visitor

Re: Dataflow size limit

No, the schedules are separate. You can even have multiple dataflows with separate schedules feeding the same report, which gives you an extremely crude approximation of incremental refresh without Premium capacity.
Super User

Re: Dataflow size limit

Hi there

The only way they would be refreshed together is if you use Power BI Premium and Linked Entities.

aldupler Frequent Visitor

Re: Dataflow size limit

As long as you don't append the "partitions" until you ingest the data into a model, you're golden.
buildbod Regular Visitor

Re: Dataflow size limit

I've successfully broken my large model out into a number of Dataflows, and they are all refreshing on their own schedules :-) I am now facing a new challenge: any Dataflow over about 300MB will refresh successfully in Dataflows and display in the Power BI Desktop Query Editor, but fails to load when I apply the query in Power BI Desktop. The error is:

 

Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error: [DataSource.Error] Received an unexpected EOF or 0 bytes from the transport stream..'.

 

When applying, it will happily load the data up to a point, normally close to the full size, as it pauses for a long time at the same value and then crashes with the error. I'm going to break the larger items out into smaller Dataflows and then combine them as a new table in Power BI Desktop. That should reduce the size being transferred over the wire.
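Recombining the smaller dataflows inside Power BI Desktop can be done with a single append query. A minimal Power Query (M) sketch, assuming three placeholder queries (Sales2018, Sales2019, Sales2020) that have each already been imported from their own dataflow entity:

```
// Hypothetical sketch: appending dataflow "partitions" into one table
// inside Power BI Desktop. Sales2018/Sales2019/Sales2020 are placeholder
// queries, each loaded from its own dataflow entity.
let
    // Table.Combine appends tables with matching column names;
    // missing columns are filled with nulls
    Combined = Table.Combine({Sales2018, Sales2019, Sales2020})
in
    Combined
```

As mentioned earlier in the thread, keeping the append inside the model (rather than in a dataflow) is what keeps each individual transfer over the wire small.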

aldupler Frequent Visitor

Re: Dataflow size limit

I have larger tables than that, so you might want to put in a support ticket.