Hello! I am pretty new to Power BI and I am looking for advice on best practices. We are attempting to use Power BI to report on our largest set of data, which is around 60 GB and hosted in Azure SQL, and our database has more than 600 tables. I want to know which option is most suitable for us: DirectQuery or Import mode?
When we use Power BI Desktop, it downloads all the data to the local system. Is there any option to limit that, so we can bring only part of the data into Power BI Desktop to design the report? It currently takes hours to import the data into the .pbix file.
Hi deepakumar,
In my opinion, since your data is voluminous, you could store the data in SSAS cubes (although this will introduce a new product into your solution, i.e. SQL Server Analysis Services).
You could then use Power BI as a visualization tool.
We work extensively with such a solution, dealing with datasets of millions of rows. Another advantage is that all of the modelling can be centralized on the server, so Power BI contains only the visualization layer and the file stays just a couple of MB in size.
Thanks,
Rajeev.
Hi @deepakumar,
There is a 1 GB dataset limitation for Import mode (if this limit is exceeded, the report cannot be published to the Power BI service). However, the 1 GB dataset limitation does not apply to DirectQuery, so you may need to use DirectQuery in your scenario.
In addition, the row-filtering options in Power Query, or importing data from the database using a native database query, can also be used to limit the data brought into Power BI.
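As an illustration, a row filter applied in the Power Query Editor might look like the following M query. (The server, database, table, and column names here are placeholders for illustration, not taken from your database.)

```powerquery
let
    // Connect to the Azure SQL database (server/database names are placeholders)
    Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
    // Navigate to one table instead of loading all 600+
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Keep only recent rows so the Desktop model stays small while you design the report
    Recent = Table.SelectRows(Orders, each [OrderDate] >= #date(2016, 1, 1))
in
    Recent
```

Alternatively, in Get Data > SQL Server you can paste a native SQL statement (for example, `SELECT TOP 100000 * FROM dbo.Orders`) so that only that result set is imported.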
Regards