murugaanbu
Regular Visitor

Huge data volume consumption in Power BI

Hi Experts,

    We have a data volume of more than 10M records in Azure Synapse. What is the most performant approach to push such a large volume of data from Azure to Power BI? After the initial load, we need to perform incremental data refreshes from Azure to Power BI.

 

1) Which connection mode should we use?

2) Is a star schema in Power BI performant when handling such a large volume of data?

 

Please shed some light on this.
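
For context on the incremental refresh requirement: in Import mode it is normally driven by the RangeStart and RangeEnd DateTime parameters, which Power BI substitutes at refresh time so that only changed partitions are loaded. A minimal Power Query (M) sketch follows; the server, database, table, and column names (FactSales, OrderDateTime, and so on) are placeholders, not anything from this thread.

// Sketch of an incremental-refresh-ready query against a Synapse dedicated SQL pool.
// RangeStart and RangeEnd must exist as DateTime parameters in Power Query.
let
    Source = Sql.Database("yourworkspace.sql.azuresynapse.net", "SalesDW"),
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Filter on a datetime column so the predicate folds back to Synapse
    // and only the requested date range is transferred on each refresh.
    Filtered = Table.SelectRows(
        Fact,
        each [OrderDateTime] >= RangeStart and [OrderDateTime] < RangeEnd
    )
in
    Filtered

Because the filter folds to the source, each scheduled refresh only pulls the rows in the affected date range rather than the full table.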

2 REPLIES
murugaanbu
Regular Visitor

We could use DirectQuery, but every query would hit the backend, so we want to avoid that. We have models with more than 100 columns, and the data volume given is just an example; the actual volume is more than 10M records.
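
On the 100+ column models: in Import mode, trimming columns the reports do not actually use keeps the compressed model small even at tens of millions of rows. A minimal sketch, reusing the same placeholder Synapse names as above; the column list is purely illustrative.

// Sketch only: trimming a wide (100+ column) fact table before import.
let
    Source = Sql.Database("yourworkspace.sql.azuresynapse.net", "SalesDW"),
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Keep only the columns the report needs; this step also folds to Synapse.
    Trimmed = Table.SelectColumns(
        Fact,
        {"OrderDateTime", "CustomerKey", "ProductKey", "SalesAmount"}
    )
in
    Trimmed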

lbendlin
Super User

Why do you need to push data from one Microsoft product to another? Can you not leave the data at the source and access it with DirectQuery?

 

10M rows is not a "huge volume"; it's on the small side. Huge starts when the dataset size is larger than half of the capacity's memory.
