OneWithQuestion
Post Prodigy

How to optimize importing large data from SQL Server tables into Power BI / SSAS Tabular?

I have several data tables whose reads take a rather long time for the data to transfer.

 

The query that runs against the source table is not complex (effectively SELECT (bunch of columns) FROM TableName).

 

The slowdown appears to be that a large amount of data is being moved, and it therefore takes time to pull it off disk.

 

Has anyone worked with this issue, where your Tabular refresh time is heavily bottlenecked by how fast the SQL data source can send you the data?

 

Does setting up a DW in SQL Server using in-memory tables help with this? Is there any way to compress the data being transferred, pre-optimize the data before it is consumed by SSAS Tabular or Power BI, etc.?

 

 

1 REPLY
TeigeGao
Solution Sage

Hi OneWithQuestion,

The process of importing data from SQL Server into Power BI can be divided into three parts: first, SQL Server executes the query to produce the result set; then the result set is transferred to Power BI Desktop; after that, Power BI Desktop generates the data model.

>> The slow down appears to be that the data being moved is a large amount of data and thus takes time to pull off disk.

Actually, only the first part involves moving data from disk to memory. We first need to check how long SQL Server takes to execute the query; we can use SQL Server Profiler to monitor how long a query takes. We can also run the query in SSMS and get the time: http://www.sqlserver.info/management-studio/show-query-execution-time/
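As a minimal sketch of timing the extract query in SSMS (the table and column names are placeholders for your actual source), you can turn on statistics and read the CPU and elapsed times from the Messages tab:

```sql
-- Report CPU time and elapsed time for each statement in this batch.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;   -- also reports logical/physical reads, to see disk pressure

-- Placeholder query: select only the columns the Tabular model actually needs.
SELECT Col1, Col2, Col3
FROM dbo.TableName;

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```

If "elapsed time" is much larger than "CPU time" and physical reads are high, the query is likely I/O-bound on the source, which matches the behavior you describe.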

After SQL Server produces the result, the dataset is held in memory and then transferred to Power BI; this part does not take much time. When Power BI receives the dataset, it generates the data model and compresses the data. This process can take a long time; please refer to the following blog to get better performance: https://www.sqlbi.com/articles/data-import-best-practices-in-power-bi/
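One of the biggest levers for both transfer time and model compression is importing fewer, narrower columns. A sketch of pushing that down into the source query (table and column names are hypothetical):

```sql
-- Import only the columns the model uses, and shrink wide types at the source
-- so less data crosses the wire and VertiPaq compresses better.
SELECT
    OrderID,
    CAST(OrderDate AS date)        AS OrderDate,  -- drop the time component if unused
    CAST(Amount   AS decimal(10,2)) AS Amount     -- avoid float if fixed precision suffices
FROM dbo.FactOrders;                              -- hypothetical source table
```

Lower-cardinality columns (e.g. a date instead of a datetime) compress far better in the Tabular engine, which also speeds up the model-generation step.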

>>Does setting up a DW in SQL using in memory tables help with this, or any way to compress the data being transfered, pre-optimize the data before it is consumed by SSAS Tabular or Power BI, etc?

Setting up an in-memory table in SQL Server only improves query performance on the SQL Server side. If SQL Server takes a long time to produce the result, we can consider using it; if it doesn't take much time, there is no need. To compress the data being transferred, we can use a suitable data type for each column, and we can split the data into fact and dimension tables if all the information is stored in a single table.
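For reference, a minimal sketch of a memory-optimized (In-Memory OLTP) fact table, assuming the database already has a MEMORY_OPTIMIZED_DATA filegroup configured and the names are placeholders:

```sql
-- Sketch of a memory-optimized table; requires SQL Server 2014+ and a
-- MEMORY_OPTIMIZED_DATA filegroup on the database beforehand.
CREATE TABLE dbo.FactSales_InMem
(
    SaleID   int           NOT NULL PRIMARY KEY NONCLUSTERED,
    SaleDate date          NOT NULL,
    Amount   decimal(10,2) NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```

Note that this helps only the query-execution part of the refresh; if SSMS timing shows the query itself is already fast, the bottleneck is elsewhere and an in-memory table won't change the refresh time.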

Best Regards,

Teige
