sunil-al
Frequent Visitor

The request limit for the database is X and has been reached. Tried all possible solutions

We have an Azure SQL database with 200+ tables, which all come together into a massive Power BI report. Refreshing works perfectly in Power BI Desktop after disabling parallel loading of queries under File -> Options and settings -> Options -> Data Load -> uncheck "Enable parallel loading of tables".

The problem started when we decided to automate everything and create a scheduled refresh. It looks like the Power BI web service does not have a similar setting.

 

The usual accepted answer is to connect to our DB via an on-premises gateway and reduce the "MashupDefaultPoolContainerMaxCount" setting to 1. But this does not serve my purpose at all. Even though I changed the setting and configured our dataset to refresh via the gateway, it still immediately hits the limit (300 connections on our Azure account).

Hope someone here has something that will help us configure our setup correctly.

5 REPLIES
Icey
Community Support

Hi @sunil-al ,

 

Hope these posts can help you:

The session limit for the database is 640 and has been reached;

Azure Sql request limit reached although the number of connections is well below the resource limit.
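If it helps, a quick way to see how close the database is to its concurrent request and session limits is to check the standard DMVs while a refresh is running (a rough diagnostic sketch only):

-- Count the requests and user sessions currently open, to compare against
-- the Azure SQL tier's concurrent request/session limits.
SELECT COUNT(*) AS active_requests
FROM sys.dm_exec_requests AS r
JOIN sys.dm_exec_sessions AS s ON s.session_id = r.session_id
WHERE s.is_user_process = 1;

SELECT COUNT(*) AS user_sessions
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;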

 

 

Best Regards,

Icey

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

sunil-al
Frequent Visitor

Hi @Icey!

The Stack Overflow link had a suggestion in one of the comments to add "MAXDOP = 1" as an option at the end of my queries.
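In case it helps anyone later, this is roughly what that looks like appended to one of the source queries (just a sketch; the table and column names are placeholders, and the exact T-SQL hint syntax is OPTION (MAXDOP 1)):

-- Sketch: append a MAXDOP query hint so the query runs single-threaded
-- instead of fanning out into extra parallel worker requests.
-- dbo.Sales and its columns are hypothetical, for illustration only.
SELECT OrderId, BranchId, Amount
FROM dbo.Sales
OPTION (MAXDOP 1);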

It looks like while there were approximately 200 queries hitting my SQL server, the number of connections was higher than that because Power BI was trying to optimize using some parallelism (I don't really understand how this works, but I think that's what it is).

 

When I added that option at the end of my queries, the problem went away.

 

I'm accepting your reply as the answer. Could you please edit your post to note that the second link and "MAXDOP = 1" at the end of the queries is what helped, so people can find it easily in the future? Thank you so much!

Icey
Community Support

Hi @sunil-al ,

 

Glad to hear the issue is gone. I have accepted your reply as a solution, too.😜 

 

 

Best Regards,

Icey

GilbertQ
Super User

Hi there

What if you had to put some of the tables into views?





sunil-al
Frequent Visitor

Hi, thank you for your response.

I'm not sure if you meant I should convert the tables in my Azure SQL database to views, and I'm not sure how that would solve the problem.

 

These tables get refreshed every 4 hours via Azure Data Factory pipelines pulling information from our branches across the country. 

I could write a procedure that would expose all the tables as views; I just want to confirm that's what you're suggesting: create 200 views for all the tables and fetch those into Power BI instead? Maybe I'm completely misunderstanding what you meant.
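If that is what you mean, I imagine something along these lines could generate the view definitions (a sketch only; the vw_ naming convention is made up and I'd review the output before running it):

-- Sketch: emit a pass-through CREATE VIEW statement for every user table.
SELECT 'CREATE VIEW ' + QUOTENAME(s.name) + '.' + QUOTENAME('vw_' + t.name) +
       ' AS SELECT * FROM ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ';' AS create_view_sql
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id;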
