Quick question. I tried search, and if the answer was in there it was a bit too technical for me to understand.
I have Pro. Our EMR vendor has created a SQL database that is 7 GB in size, with 5.5M rows containing 4 years of data. I know there is a 10 GB limit for a single table refreshed into a dataflow. I query that table and pull in about 2M rows after filtering out the rows I don't want/need.
Question is: is that 10 GB limit on the total table before I filter out rows? Or (hopefully) on the subset of data I want to import?
Hi. The limit applies to the total size of the tables as stored in blob/Data Lake Gen2 storage. This means you can't check the size of the source database and expect the dataflow to be exactly that size; Power BI compresses the original source into smaller files. I think 5 million rows is not a lot of data, and you will be able to work with it normally.
If you want to approximate the size, you can apply all your transformations in Power BI Desktop, save the file, and check the file size.
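As a rough sketch of the filtering step in Power Query (M) — the server, database, table, and column names below are placeholders, not your actual schema:

```m
let
    // Connect to the vendor's SQL database (placeholder server/database names)
    Source = Sql.Database("emr-server", "EmrDb"),
    // Navigate to the placeholder table
    Encounters = Source{[Schema = "dbo", Item = "Encounters"]}[Data],
    // Filter early so the step folds back to SQL Server where possible,
    // meaning only the filtered rows (your ~2M) are transferred and stored
    Filtered = Table.SelectRows(Encounters, each [VisitDate] >= #date(2021, 1, 1))
in
    Filtered
```

When a filter like this folds to the source, it is the filtered result — not the full 7 GB table — that lands in the dataflow's storage, and that stored (compressed) size is what counts toward the limit.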
Regards,
Happy to help!