
Anonymous
Not applicable

Can a Power BI dashboard with more than 10 million records function properly?

We are migrating a report from Tableau to Power BI. The Tableau report has around 10 million records, and we made some modifications at the server level to handle this volume of data.

 

Post migration, Power BI needs to handle the 10 million records, and we have a daily refresh because the source data is updated daily. Kindly let me know what needs to be done to handle this volume of data. Also, the data source is a single table.

2 ACCEPTED SOLUTIONS
parry2k
Super User

@Anonymous I work with more than 100 million rows. It all comes down to bringing in only the columns that will actually be used in the report, and making sure the data type is correct for each column.
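As a minimal Power Query sketch of that advice (the server, table, and column names here are placeholders for your own single source table): select only the needed columns and set an explicit type on each one.

```m
let
    Source = Sql.Database("ServerName", "DatabaseName"),
    SalesTable = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // keep only the columns the report actually uses
    SelectedColumns = Table.SelectColumns(SalesTable,
        {"OrderDate", "CustomerKey", "Amount"}),
    // set an explicit, correct type for each column
    TypedColumns = Table.TransformColumnTypes(SelectedColumns, {
        {"OrderDate", type date},
        {"CustomerKey", Int64.Type},
        {"Amount", Currency.Type}
    })
in
    TypedColumns
```

Dropping unused columns and using narrow types (date instead of datetime, whole number instead of text keys) is what lets the VertiPaq engine compress tens of millions of rows into a model that fits comfortably in memory.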

 

For data refresh, you can always look at incremental refresh so that you don't have to do a full data load every time. Incremental refresh is a good fit here; there are many posts on it.
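The core of an incremental refresh setup is a query filtered on the two reserved DateTime parameters `RangeStart` and `RangeEnd`, which Power BI fills in per partition at refresh time. A hedged sketch, again with placeholder source and column names:

```m
let
    Source = Sql.Database("ServerName", "DatabaseName"),
    SalesTable = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // keep only rows inside the window Power BI asks for;
    // if this filter folds to the source, each daily refresh
    // loads just the new partition instead of all 10M rows
    FilteredRows = Table.SelectRows(SalesTable,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    FilteredRows
```

After publishing, you configure the incremental refresh policy (how much history to keep, how much to refresh) on the table in Power BI Desktop; the filter must use `>=` on one boundary and `<` on the other so rows are never duplicated across partitions.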

 

At the end of the day I don't see it as an issue, but at some point it is suggested, and best practice, to have a star schema model in Power BI instead of one flat fact table. FYI.

 


v-stephen-msft
Community Support

Hi @Anonymous ,

 

There is no data-volume limit on loading for either DirectQuery or Import. DirectQuery limits the query behind each visual to 1 million rows, but Import has no such limit. Import mode is instead constrained by memory: the model must fit in your machine's RAM. So if you have 8 GB of RAM and your report consumes 9 GB, it will fail. You can check usage via Windows Task Manager -> Resource Monitor (https://www.digitalcitizen.life/how-use-resource-monitor-windows-7).

 

Reference: 1 million rows

 

 

Best Regards,

Stephen Tao

 

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.


