We have an Azure SQL database with 200+ tables that all feed into one large Power BI report. Refreshing works perfectly in Power BI Desktop after disabling parallel loading of queries under File > Options and settings > Options > Data Load, by unchecking "Enable parallel loading of tables".
The problem started when we decided to automate everything and set up a scheduled refresh: the Power BI web service does not appear to have an equivalent setting.
The commonly accepted answer is to connect to our database via an on-premises data gateway and reduce the "MashupDefaultPoolContainerMaxCount" setting to 1, but that does not solve it for me. Even after changing the setting and configuring our dataset to refresh via the gateway, it still immediately hits the limit (300 connections on our Azure account).
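For context, here is roughly what that gateway change looks like. This is a sketch assuming a default gateway installation; the exact file path and surrounding XML can vary by gateway version:

```xml
<!-- In Microsoft.PowerBI.DataMovement.Pipeline.GatewayCore.dll.config,
     typically under C:\Program Files\On-premises data gateway\ -->
<setting name="MashupDefaultPoolContainerMaxCount" serializeAs="String">
    <value>1</value>
</setting>
```

The on-premises data gateway service has to be restarted after saving the file for the new value to take effect.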
Hope someone here has something that will help us configure our setup correctly.
Hi @sunil-al ,
Hope these posts can help you:
The session limit for the database is 640 and has been reached;
Azure Sql request limit reached although the number of connections is well below the resource limit.
Best Regards,
Icey
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @Icey !
The Stack Overflow link had a suggestion in one of the comments to add "MAXDOP = 1" as an option at the end of my queries.
It looks like while approximately 200 queries were hitting my SQL Server, the number of connections was higher than that, because the work was being parallelized across extra sessions (I don't really understand how this works, but I think that's what it is).
When I added that option at the end of my queries, the problem went away.
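For anyone finding this later, the change looks like this. The table and column names below are just placeholders, not our real schema; the point is the OPTION clause appended to the end of each SELECT statement:

```sql
-- Placeholder table/columns for illustration.
-- OPTION (MAXDOP 1) caps this statement's degree of parallelism at 1,
-- so SQL Server does not spin up extra parallel workers for it.
SELECT BranchId, SaleDate, Amount
FROM dbo.Sales
OPTION (MAXDOP 1);
```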
I'm accepting your reply as the answer. Can you please edit your post to note that it was the second link, with MAXDOP = 1 at the end of the queries, that helped, so people can find it easily in the future? Thank you so much!
Hi @sunil-al ,
Glad to hear the issue is gone. I have accepted your reply as a solution, too.😜
Best Regards,
Icey
Hi, thank you for your response.
I'm not sure if you meant I should convert the tables in my Azure SQL database to views; I'm also not sure how that would solve the problem.
These tables are refreshed every 4 hours by Azure Data Factory pipelines pulling information from our branches across the country.
I could write a procedure that converts all the tables into views; I just want to confirm that's what you're suggesting: create 200 views over all the tables and fetch those into Power BI instead? Maybe I'm completely misunderstanding what you meant.