
supton
Helper I

Poor Performance using Azure SQL Database and Direct Query

Hello! I was just wondering if anyone else has experienced very poor performance using DirectQuery on an Azure SQL Database. I have limited the number of rows in my queries, but that doesn't seem to make much of a difference. Even with more than 100 DTUs allocated to the Azure SQL Database, any query made from Power BI pushes the DTU percentage to 100%, and Power BI still seems to process quite slowly. My largest table has fewer than 500 thousand rows and fewer than 20 columns, which I don't see as an unreasonable amount. Most of the other tables have fewer than 100 thousand rows. How can I improve performance? I realize that importing is the ideal situation for Power BI, but I believe DirectQuery to be more scalable in terms of storage; I previously ran everything in Import mode, but I had too many records to actually schedule a refresh and Power BI would break.


14 REPLIES
v-yuezhe-msft
Employee

@supton,

In addition to the other posts, you can consider using In-Memory technologies in Azure SQL Database to optimize performance.

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-in-memory
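As a rough illustration of that approach (the table and column names below are made up for the example, and this requires a service tier that supports In-Memory OLTP), a memory-optimized table can be declared like this:

```sql
-- Hypothetical sketch: a memory-optimized table for In-Memory OLTP.
-- Requires a tier with In-Memory OLTP support; names are illustrative.
CREATE TABLE dbo.SalesFact
(
    SaleId      INT IDENTITY PRIMARY KEY NONCLUSTERED,
    CustomerId  INT NOT NULL,
    SaleDate    DATETIME2 NOT NULL,
    Amount      DECIMAL(18, 2) NOT NULL,
    -- Hash index sized for the expected number of distinct customers
    INDEX ix_Customer HASH (CustomerId) WITH (BUCKET_COUNT = 100000)
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```

With `DURABILITY = SCHEMA_AND_DATA`, the table is fully recoverable while reads and joins are served from memory, which is what helps the DTU pressure described above.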

Regards,

Community Support Team _ Lydia Zhang
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.

I started looking into this as it seemed like it might help lower my DTU usage. Unfortunately, you don't get any In-Memory OLTP storage on the Standard tier, and, if I understand properly, getting an adequate amount of storage on the Premium tier would cost more than I have to spend.

 

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers#single-database-servi...

Anonymous
Not applicable

Have you checked that your database is properly indexed? I mean, if you have no indexes and it's trying to join 100k records in one table to 500k records in another table... that *is* gonna leave a mark.
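For instance (purely illustrative names), a nonclustered index on the join key, covering the columns the report actually selects, can make that kind of join cheap:

```sql
-- Hypothetical sketch: index the column used in joins/filters,
-- and INCLUDE the columns the report selects to avoid key lookups.
CREATE NONCLUSTERED INDEX ix_SalesFact_CustomerId
    ON dbo.SalesFact (CustomerId)
    INCLUDE (SaleDate, Amount);
```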

Also, maybe a question for a different thread, but does Power BI load the tables per page of a report? I only ask because several pages seem to load just fine for me, usually those that only reference two or three tables. But the page I am referring to, which is extremely slow, references many different tables. I even have some of the exact same visualizations on this and other pages, and they seem to load much more quickly on the other pages.

Anonymous
Not applicable

I'm not well versed in how Power BI makes its magic SQL queries, but at any rate, I would certainly treat this as a straight SQL perf problem. Take a Profiler trace on SQL, or maybe use sp_whoisactive, while the performance is making you sad, and see what is going on with the query plans, etc.

 

It's possible you need to optimize your database a bit for the types of queries Power BI is sending.
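A minimal sketch of that diagnostic step, assuming sp_whoisactive is already installed on the server — run it while the slow report page is loading:

```sql
-- Snapshot currently running requests while the report is rendering.
-- @get_plans = 1 attaches the query plan XML to each active request,
-- so you can see exactly what Power BI sent and how it is executing.
EXEC dbo.sp_whoisactive
    @get_plans = 1,
    @get_additional_info = 1;
```

The `sql_text` column shows the statement Power BI generated, and the attached plan shows whether it is hitting your indexes.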

I would want to agree, but I can run straight SQL queries against the database and they run without a problem, much more quickly than Power BI seems to handle them and with much less effort. The bottleneck here appears to be Power BI.

Anonymous
Not applicable

Did you capture the *actual* query that Power BI is sending to the server?

 

@ImkeF We need you 🙂

Actually, I didn't. I merely queried the views that I am using directly. Thanks for pointing this out. I see now that Power BI adds a lot of extra statements when selecting the data. Is there a way around this, or a way to just send a direct SQL statement to the server and return the columns?
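For what it's worth, the M language does let you hand a literal SQL statement to the connector (server, database, and query text below are made up for the example); note that in DirectQuery mode the visuals still generate their own queries on top of whatever source you define:

```
// Hypothetical M sketch: pass a native SQL statement via the Query option.
let
    Source = Sql.Database(
        "myserver.database.windows.net",  // made-up server name
        "MyDatabase",                     // made-up database name
        [Query = "SELECT CustomerId, SaleDate, Amount FROM dbo.SalesFact"]
    )
in
    Source
```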

@supton,

I am afraid that there is no way around this. Generally speaking, import models will be faster, and 500 thousand rows is not a large dataset, so it is recommended that you use "Import" mode to bring the data into Power BI.

Regards,
Lydia

Community Support Team _ Lydia Zhang

I understand. This was the route I took initially, but I was unable to schedule a refresh because of the largest table (which was 1.5 million rows before it was split). The refresh took too much time, and the scheduled refresh would crash because of it. It would take over an hour to process any refresh of the data on my local machine.
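One pattern that can make a large Import table refreshable (availability depends on your licensing) is filtering the source on `RangeStart`/`RangeEnd` datetime parameters so that Power BI only reloads recent partitions instead of the whole table. A sketch in M, assuming those two parameters are defined and with made-up table and column names:

```
// Hypothetical M sketch for incremental refresh: filter the source
// between the RangeStart and RangeEnd datetime parameters.
let
    Source = Sql.Database("myserver.database.windows.net", "MyDatabase"),
    Sales  = Source{[Schema = "dbo", Item = "SalesFact"]}[Data],
    Filtered = Table.SelectRows(
        Sales,
        each [SaleDate] >= RangeStart and [SaleDate] < RangeEnd
    )
in
    Filtered
```

Because the filter folds back to SQL, each scheduled refresh only pulls the rows in the configured window rather than all 1.5 million.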

@supton,

Do you still encounter the performance issue after you use In-Memory technologies?

Regards,

Community Support Team _ Lydia Zhang

I refactored to use Import from Azure and it seems to work fine now. The Dynamics 365 connector must just be quite slow, then, because I was previously using the same data but it would error out because it took too long. Anyway, all good now. Thanks!

 

Sorry, I have no experience with DirectQuery mode 🙂

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI learning resources here -- Performance Tips for M-queries

Yes, the largest tables are indexed.
