Hi,
We are in the process of creating a report from a table that contains 70M records. Our report query would pull around 60M of those records.
Our database source is Postgres. We have hosted Power BI on a Windows virtual machine with 10 GB of RAM and 2 CPUs.
The issue is that when we try pulling the data from the DB view, which contains around 60M records, we get a memory allocation error.
I have also tried reducing the dataset to 25M records, but I still get the same memory allocation error. However, when I load the same data from a CSV file, it imports into Power BI without issue.
Can anyone please suggest how I can improve the performance of Power BI so that I can pull data directly from the DB without using a file as the source?
Are you using the 32-bit version of Power BI Desktop? That would cause this, and there is no fix short of moving to the 64-bit version. The 32-bit app is limited to 2GB of RAM, and 70 million records would probably break that barrier, depending on the number of fields, other tables, etc.
If you are using the 64-bit version, I've seen memory errors and crashes when the data contains non-printing ASCII characters, which you can strip out using a CLEAN transformation in Power Query before the data gets imported into the data model.
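As a sketch of that cleaning step, the Power Query equivalent of CLEAN is `Text.Clean`. Something like the following could be pasted as a step in the Advanced Editor; `PreviousStep` is a placeholder for whatever step precedes it in your query:

```
// Power Query (M) sketch: strip non-printing characters from every
// text column before the data reaches the data model.
// "PreviousStep" is a hypothetical name for the prior query step.
let
    Source = PreviousStep,
    // Find all columns typed as (nullable) text
    TextColumns = Table.ColumnsOfType(Source, {type nullable text}),
    // Apply Text.Clean to each of those columns
    Cleaned = Table.TransformColumns(
        Source,
        List.Transform(TextColumns, each {_, Text.Clean, type nullable text})
    )
in
    Cleaned
```

Note that this runs per row during refresh, so on a 60M-row import it adds some load time, but it avoids the crashes caused by stray control characters.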