SamMiddleton
Frequent Visitor

Resultset of a query exceeded 1m rows on measures unrelated to a DirectQuery table

Hi all

I have a model with mixed storage mode: my larger fact tables are set to Import and my smaller tables to DirectQuery (to reduce the overall PBIX size without causing slow visual refreshes on the larger tables when users interact with the attached reports).

My customer dimension is a large table (14m rows) and my PBIX was becoming bloated (nearly 4 GB in size), so I switched it to DirectQuery, as it's not currently being used a whole lot in modelling (yet). When I switched the customer table to DirectQuery, a number of measures on my sales fact (Import) table returned a 'resultset of a query exceeded 1m rows' error.

None of the affected measures directly reference the customer dimension. They are simple 'distinct count of sales orders' type measures, relating only to the sales fact table. When I switched the customer dimension back to Dual storage mode, the measures all returned correctly. So my question is: why would the storage mode of the customer table (which is unrelated to the measures being calculated) affect the resultset of those measures, and how can I avoid the error message?
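For reference, the affected measures are of this shape (a sketch — the table and column names here are illustrative, not my actual model):

```dax
Sales Order Count = DISTINCTCOUNT ( Sales[SalesOrderNumber] )
```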

1 ACCEPTED SOLUTION

I add new data sources all the time in Tabular Editor. (I use the paid Tabular Editor 3, but I think it would also work in the free version 2.)

 

I can send more detailed instructions when I get to a computer (away camping at the moment). Roughly though...

1) Right click and add a table.

2) Expand the partitions tree and there will be one with the same name as the table.

3) Click on it and the M expression window should open.

4) In Power BI Desktop, add the data source as normal and do any transforms you need.

5) Copy the M code from the Advanced Editor over into Tabular Editor.

6) Right click the table and click "update table schema".

7) Save back to the service (I save to a dev workspace and then deploy along from there).
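The M expression in steps 3 and 5 is just the query Power BI Desktop shows in the Advanced Editor. A minimal sketch of what gets pasted into the partition (server, database, and view names here are placeholders):

```
let
    Source = Sql.Database("myserver", "mydb"),
    dbo_vw_Sales = Source{[Schema = "dbo", Item = "vw_Sales"]}[Data]
in
    dbo_vw_Sales
```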



Ben Dobbs

LinkedIn | Twitter | Blog

Did I answer your question? Mark my post as a solution! This will help others on the forum!
Appreciate your Kudos!!


7 REPLIES
SamMiddleton
Frequent Visitor

Thanks so much! Legend.

memepu
Resolver II

Hi @SamMiddleton ,

Based on your description, the limit can occur in cases where Power BI is not fully optimizing the queries sent, and there is some intermediate result being requested that exceeds the limit. It can also occur whilst building a visual, on the path to a more reasonable final state. For example, including Customer and TotalSalesQuantity would hit this limit if there were more than 1 million customers, until some filter were applied.
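As an illustration of the kind of intermediate query that can hit the limit (the table, column, and measure names here are hypothetical), a visual that groups by customer makes the engine run something like:

```dax
EVALUATE
SUMMARIZECOLUMNS (
    Customer[CustomerKey],
    "Total Sales Quantity", [TotalSalesQuantity]
)
```

With more than 1 million customers on a DirectQuery source, the resultset returned for that grouping exceeds the 1m-row cap until a filter reduces it.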

 

You may refer to the following links.

https://community.powerbi.com/t5/Desktop/DirectQuery-Data-source-has-exceeded-the-maximum-allowed-si...

https://community.powerbi.com/t5/Desktop/Error-Resultset-of-a-query-to-external-data-source-exceeded...

 

Best regards.

amitchandak
Super User

@SamMiddleton , in the case of Import, you can configure incremental refresh and load only selective data into the PBIX file, then load the rest of the data in the Power BI service.
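A minimal sketch of the Power Query filter that incremental refresh relies on — RangeStart and RangeEnd are the standard reserved parameter names, while the table and date column are illustrative:

```
= Table.SelectRows(Source, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
```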

 

If you are on Premium, you can load the rest of the data into the test instance using a deployment pipeline:

Premium- Deployment Pipeline, Load More Data on Test/Prod : https://youtu.be/l69cnWkoGX0

 

In DirectQuery mode, you can use an aggregation table to minimize the rows queried:

https://docs.microsoft.com/en-us/power-bi/desktop-aggregations
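As a sketch of the idea (names illustrative): an aggregation table rolls the fact up to a coarser grain, and is then mapped to the detail table via Manage aggregations — in practice it is usually defined as a separate source query or view rather than a calculated table, but conceptually it is:

```dax
Sales Agg =
SUMMARIZECOLUMNS (
    Sales[OrderDateKey],
    "TotalQuantity", SUM ( Sales[Quantity] ),
    "SalesRows", COUNTROWS ( Sales )
)
```

Note that distinct counts generally cannot be pre-aggregated this way; sums, counts, min, and max are the aggregations that map cleanly.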

 

Also try to simplify the measure expressions to reduce the rows queried.

Hi @amitchandak , unfortunately we are still at the development stage of our data model, where further data sources are being added, which means we need a local copy of the PBIX at all times. As I understand it, with incremental refresh you cannot download the PBIX once it's loaded to the service. I don't believe there's currently any functionality (happy to be corrected) that allows you to add further data sources to a model as you would via the Power Query transform functionality in Desktop?

Are you using Premium capacity or Premium Per User?


If so, I'd look to publish my dataset once and from then on use Tabular Editor to develop it while connected to the service.

 

@amitchandak 's recommendation to make use of incremental refresh is a really good one.

 

Another thing to consider is a Dual mode setup, with your customer dimension running Dual (both Import and DirectQuery) and your fact table staying in DirectQuery mode. That should allow the engine to make best use of processing on the SQL end.

 

Lastly, look at building an aggregation table that also runs in Dual mode.

 

I'd highly recommend SQLBI's Mastering Tabular video course, which covers a lot of this stuff.



Ben Dobbs

LinkedIn | Twitter | Blog


Thanks Ben, I'd love to be able to use TE, but what happens when I need to add additional data sources into the Power Query editor (e.g. views/tables)? As mentioned, incremental refresh is the ideal, but currently you can't download the PBIX when incremental refresh is set up (and if I need to add a data source in, I need a local PBIX) UNLESS there's a way to do it in the service or in TE. I've not found a way... yet! 😄 Happy to hear any suggestions.

I have my sales fact on Import as it's currently 56m rows and the reports using it have many measures that make it super slow to load on DirectQuery. Our customer table is also 30m rows (I've cut it down to the bare minimum columns, but we have a large customer base), so I've had to switch that back onto Dual mode too; when it was on DirectQuery, that's when I was getting the initial error. Unfortunately, it's the curse of the BI developer when your customer requires lots of KPI metrics to report on 😞

