Knowing the uncompressed dataset size on the gateway

With datasets that have always refreshed overnight as scheduled, I'm now getting the "received uncompressed data on the gateway client has exceeded limit" error - without many changes to the dataset size - so I'd like to know if there's a way to find the actual uncompressed data size.

(And I wonder why something that always used to work doesn't work anymore...)

 

Thanks,

Davide

Status: Accepted
Comments
Frequent Visitor

Apparently, Microsoft has updated their troubleshooting page for dataset refresh problems.

I found this:

 

Error: The received uncompressed data on the gateway client has exceeded limit.

The exact limitation is 10 GB of uncompressed data per table. If you are hitting this issue, there are good options to optimize and avoid the issue. In particular, reducing the use of highly repetitive, long string values and instead using a normalized key or removing the column (if not in use) will help.

 

https://powerbi.microsoft.com/en-us/documentation/powerbi-gateway-onprem-tshoot/
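The "normalized key" advice above can be illustrated with a small sketch (hypothetical data; pandas is used here only to measure in-memory size locally, which is not exactly what the gateway measures, but shows why repeated long strings blow up the uncompressed footprint):

```python
import pandas as pd

# Hypothetical fact table: the same long description string
# repeated on every row, as Microsoft's guidance warns against.
fact = pd.DataFrame({
    "order_id": range(100_000),
    "status_description": ["This order has been shipped and is in transit"] * 100_000,
})
before = int(fact.memory_usage(deep=True).sum())

# Normalize: replace the repeated string with a small integer key
# and keep the distinct strings in a separate lookup table.
codes, uniques = pd.factorize(fact["status_description"])
fact["status_key"] = codes
fact = fact.drop(columns=["status_description"])
lookup = pd.DataFrame({
    "status_key": range(len(uniques)),
    "status_description": uniques,
})

after = int(fact.memory_usage(deep=True).sum()
            + lookup.memory_usage(deep=True).sum())
print(f"before: {before:,} bytes, after: {after:,} bytes")
```

The uncompressed size drops dramatically because the fact table now carries one 32/64-bit key per row instead of a full string per row; only the lookup table stores each distinct string once.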

Helper IV

I am seeing this error as well. The report is only about 8 MB in total size when saved. I can upload it directly from Power BI Desktop to the Power BI service, but a refresh gives me the "uncompressed data on the gateway client has exceeded limit" error. I have reports that are a few hundred megabytes that refresh just fine. I'm not sure how else to gauge how much uncompressed data is being sent up to the Power BI service, but if my report is this small, I don't know how it could remotely reach the 10 GB limit of uncompressed data.
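There doesn't seem to be an official counter for this, but one rough local proxy is to load the same source data (here a hypothetical CSV standing in for your real extract) into pandas and measure its deep in-memory size - that is much closer to the "uncompressed" figure than the PBIX file size, which benefits from heavy compression:

```python
import io
import pandas as pd

# Hypothetical sample standing in for an exported source table;
# in practice you would point read_csv at your real extract file.
csv_data = io.StringIO(
    "id,comment\n"
    + "\n".join(f"{i},a fairly long free-text comment value" for i in range(1000))
)
df = pd.read_csv(csv_data)

# deep=True counts the actual bytes of every string value, which is
# what balloons far past the saved PBIX size once compression is gone.
uncompressed_bytes = int(df.memory_usage(deep=True).sum())
print(f"approx. uncompressed size: {uncompressed_bytes / 1024:.1f} KiB")
```

Summing this per table (per the 10 GB-per-table limit in the quoted guidance) gives at least a ballpark of what the gateway is sending.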

Frequent Visitor

Has anyone found a way to know the actual size of the "uncompressed data"?

I'm still getting the message even with datasets that are way smaller than others where I don't get it...

Helper II

Hi, I never really got an answer on this, but I figured out it's not really about the size of the PBIX file or the number of rows in your tables; the amount of text columns and other dimensional data is what seems to have caused the issue for me.

 

When I pulled all the descriptions and other dimensional data out of a table and added lookup tables for that data, the file refreshed just fine in the service. And I had the same number of rows as before.

 

Not sure if this is the same issue you are dealing with, but hope that helps.