
tommyallitt
New Member

The received uncompressed data on the gateway client has exceeded limit.

Hi,

 

I have been receiving the following message on my scheduled data refresh since 8th June:

The received uncompressed data on the gateway client has exceeded limit.

I have upgraded my gateway, and I have updated the data via Desktop and uploaded it fine, but every time the refresh runs in the morning it fails with the above error. The data size is under 400 MB. Can someone please help me figure out what is going wrong?

 

Kind Regards

 

Tom

13 REPLIES
ascendbi
Advocate II

For those not following a similar and related thread, "Online refresh is failing: The received uncompressed data on the gateway client has exceeded limit":

 

I have copied my update here for reference. 

========================

 

I received the following notice from Microsoft Support and Engineering yesterday. They have confirmed that this is a bug. No ETA other than "soon".

 

Date: Tue, 20 Jun 2017 18:28:39 (UTC -04:00)

Hello David,

 

After extensive research we found out that this is a bug. I have notified the product group and I have created a bug tracking number for them. They are working on this issue and it should be resolved soon. I will get back to you once I receive an update from my product group.

 

 

Regards,

Uday Pendem

Engineer - Microsoft Business Intelligence

 

 

Apparently, Microsoft has updated their troubleshooting page covering dataset refresh problems.

I found this:

 

Error: The received uncompressed data on the gateway client has exceeded limit.

The exact limitation is 10 GB of uncompressed data per table. If you are hitting this issue, there are good options to optimize and avoid the issue. In particular, reducing the use of highly repetitive, long string values and instead using a normalized key or removing the column (if not in use) will help.

 

https://powerbi.microsoft.com/en-us/documentation/powerbi-gateway-onprem-tshoot/
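To illustrate the "normalized key" suggestion in practical terms: a minimal sketch, assuming a hypothetical SQL Server fact table dbo.FactSales with a highly repetitive long string column (none of these names come from the thread). The idea is that the long string gets stored once per distinct value instead of once per row:

-- Hypothetical sketch: replace a highly repetitive long string column
-- with a small surrogate key pointing into a dimension table.
CREATE TABLE dbo.DimProduct (
    ProductKey         INT IDENTITY(1,1) PRIMARY KEY,
    ProductDescription VARCHAR(200) NOT NULL
);

-- Store each distinct string exactly once.
INSERT INTO dbo.DimProduct (ProductDescription)
SELECT DISTINCT ProductDescription
FROM dbo.FactSales;

-- Add the key to the fact table, populate it, then drop the wide column.
ALTER TABLE dbo.FactSales ADD ProductKey INT NULL;

UPDATE f
SET    f.ProductKey = d.ProductKey
FROM   dbo.FactSales AS f
JOIN   dbo.DimProduct AS d
  ON   d.ProductDescription = f.ProductDescription;

ALTER TABLE dbo.FactSales DROP COLUMN ProductDescription;

The uncompressed cost of the string column drops from rows * average string length to 4 bytes per row, plus the one-off dimension table.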

Anonymous
Not applicable

This recommendation on how to reduce size is fine. However, it doesn't make sense: publishing a report with over 10 GB of uncompressed data from Power BI Desktop is not an issue; everything works fine. But when the same dataset is refreshed via the gateway, it is an issue? If so, this inconsistency is most confusing. Either that, or there is more to this bug than the sheer size of the uncompressed data.

This issue is affecting 2 of my data models: one is ~500 MB on disk as a .pbix and the other is ~75 MB.

 

Both refresh and publish to the Power BI Service fine, but fail to refresh via the Data Gateway. MS has told us it is a bug and that a fix is in the works.

 

Thankfully there is a manual workaround, or we would be dead in the water.

Can you please explain how to manually work around this problem, since it is still happening even now? Regards

The response of the support team was:

Hello Juan,
    I received feedback from the Product Group. This issue is by design:
https://powerbi.microsoft.com/en-us/documentation/powerbi-refresh-troubleshooting-refresh-scenarios/#uncompressed-data-limits-for-refresh
 
We are working to add to this documentation that there is a limit of 10 GB of raw (uncompressed) data.

According to the documentation, already updated on 12 June, the limit per table is 10 GB at the time the data is read.

My .pbix file is 320 MB and has 45 million rows in the fact table. To find the actual size of the data at the time of reading, I imported the data into SQL Server and checked the space occupied by the table with 45 million records. It weighed approximately 10.3 GB. With the size of my model in mind, I found a solution.
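For anyone who wants to repeat that measurement, something along these lines should work in SQL Server once the data has been imported (the table name is a hypothetical placeholder, not from the thread):

-- Quick check of one table's footprint; the 'data' column approximates
-- the raw (uncompressed) size that the refresh has to read.
EXEC sp_spaceused N'dbo.FactSales';

-- Or a per-table breakdown across the whole database:
SELECT t.name                          AS TableName,
       SUM(a.total_pages) * 8 / 1024.0 AS TotalMB
FROM sys.tables t
JOIN sys.partitions p       ON p.object_id = t.object_id AND p.index_id IN (0, 1)
JOIN sys.allocation_units a ON a.container_id = p.partition_id
GROUP BY t.name
ORDER BY TotalMB DESC;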

 

I checked the data type of each of the 30 columns of my fact table and changed the data types, for example:

INT -> TINYINT or SMALLINT

DECIMAL -> SMALLMONEY or MONEY

 

With that, I reduced my data from 10.3 GB to 3.9 GB. After that change, my dataset refreshes correctly without any problem.
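For reference, a sketch of what those changes could look like at the table level; the column names are hypothetical, and each ALTER will fail if existing values fall outside the smaller type's range, so check MIN/MAX first:

-- Shrink each column to the smallest type that still covers its values.
ALTER TABLE dbo.FactSales ALTER COLUMN StatusCode  TINYINT    NOT NULL;  -- was INT; OK if values are 0-255
ALTER TABLE dbo.FactSales ALTER COLUMN StoreId     SMALLINT   NOT NULL;  -- was INT; OK if values fit in 2 bytes
ALTER TABLE dbo.FactSales ALTER COLUMN SalesAmount SMALLMONEY NOT NULL;  -- was DECIMAL(14,6); note that
                                                                         -- SMALLMONEY keeps only 4 decimal
                                                                         -- places and tops out near 214,748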

 

 

Hi @iJuancho - nice work - what were the data types before you changed them? 

In my case, the data type changes I made were:

 

- 9 columns of INT (4 bytes) to TINYINT (1 byte) => 45 million rows * 9 columns * 4 bytes = 1.5 GB before vs 386 MB after

- 5 columns of INT to SMALLINT (2 bytes) => 45 million rows * 5 columns * 4 bytes = 858 MB before vs 429 MB after

- 4 columns of DECIMAL(14, 6) (13 bytes) to SMALLMONEY (4 bytes) => 45 million rows * 4 columns * 13 bytes = 2.2 GB before vs 686 MB after

- 1 column of VARCHAR(50) to VARCHAR(30) = 2.1 GB before vs 1.2 GB after

 

Adding up these changes gives us:

Before = 6.6 GB

After = 2.7 GB
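Those figures check out as back-of-the-envelope arithmetic (45 million rows, fixed-width types, 1 MB = 1,048,576 bytes); here it is as a throwaway T-SQL query if you want to plug in your own row counts:

-- Before/after sizes in MB for each change described above.
SELECT CAST(45000000 AS bigint) * 9 * 4  / 1048576.0 AS IntBeforeMB,        -- ~1545 MB (~1.5 GB)
       CAST(45000000 AS bigint) * 9 * 1  / 1048576.0 AS TinyintAfterMB,     -- ~386 MB
       CAST(45000000 AS bigint) * 5 * 4  / 1048576.0 AS Int2BeforeMB,       -- ~858 MB
       CAST(45000000 AS bigint) * 5 * 2  / 1048576.0 AS SmallintAfterMB,    -- ~429 MB
       CAST(45000000 AS bigint) * 4 * 13 / 1048576.0 AS DecimalBeforeMB,    -- ~2232 MB (~2.2 GB)
       CAST(45000000 AS bigint) * 4 * 4  / 1048576.0 AS SmallmoneyAfterMB;  -- ~687 MB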

 

Finally, my fact table is a UNION between SQL Server and Oracle. By defining my SQL Server table as primary, Power BI takes the metadata of the primary table and forces the data coming from Oracle to have those same data types.

 

 

Hi @iJuancho

 

It appears I will need to investigate the same approach you took to reducing the data sizes. See our response from MS here:

http://community.powerbi.com/oxcrx34285/board/message?board.id=power-bi-web-app&message.id=27670#M27...

 

So the data type changes you made below... did you make them at the database table level or did you change their type in a SQL view or Stored Procedure? 

 

Thank you 
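(For what it's worth, the view-level alternative might look like this sketch, with hypothetical names; casting in a view leaves the base table untouched, and you point Power BI at the view instead:)

-- Hypothetical sketch: cast to smaller types in a view rather than
-- altering the base table.
CREATE VIEW dbo.vFactSales
AS
SELECT CAST(StatusCode  AS TINYINT)    AS StatusCode,
       CAST(StoreId     AS SMALLINT)   AS StoreId,
       CAST(SalesAmount AS SMALLMONEY) AS SalesAmount
FROM dbo.FactSales;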

Anonymous
Not applicable

Thanks for sharing your support feedback and solution experience. Optimizations like this are best practice anyway.

But for the sake of the issue that started this thread, it is quite a heavy workaround. There should be a way built into Power BI Desktop or the Power BI Service to determine the actual uncompressed memory footprint. This doesn't seem like rocket science, and it would save a lot of headaches. It would allow analysts to discover how inefficient their data models are, and it would ultimately help the Power BI infrastructure as well, by helping people avoid loading the Power BI servers with unnecessary workload.
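One partial stopgap (my own suggestion, not an official feature): Power BI Desktop hosts a local Analysis Services instance, so connecting a tool like DAX Studio to it and querying the storage DMVs can at least point at the most expensive columns:

-- Rough sketch: list the costliest columns in the in-memory model.
-- DICTIONARY_SIZE is the compressed dictionary size, not the raw
-- uncompressed size, but it usually flags the same offending columns.
SELECT DIMENSION_NAME, ATTRIBUTE_NAME, DICTIONARY_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
WHERE COLUMN_TYPE = 'BASIC_DATA'
ORDER BY DICTIONARY_SIZE DESC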

Anonymous
Not applicable

The same error and behavior happens in my case too. The .pbix file is 325 MB, and a standard publish has no issues. Refresh via the gateway fails with the same error as above:

 

Last refresh failed: Mon Jun 19 2017 20:15:35 GMT+0200 (Romance Daylight Time)
The received uncompressed data on the gateway client has exceeded limit.
Cluster URI: WABI-NORTH-EUROPE-redirect.analysis.windows.net
Activity ID: 9eac2bf0-296e-4cfc-94e9-9002f0002b62
Request ID: 3eb3672f-b9ae-cadd-6dc9-f19bd1b4ec25
Time: 2017-06-19 18:15:35Z

 

Weird. And needs fixing quickly, because this is clearly broken.

CasperAbakion
New Member

We also have this problem with two of our customers' solutions. I found four posts on it on the forums. It would be really nice to get this solved.

 

Last refresh failed: Wed Jun 14 2017 10:40:20 GMT+0200 (Romance Daylight Time)
The received uncompressed data on the gateway client has exceeded limit.

Cluster URI: WABI-NORTH-EUROPE-redirect.analysis.windows.net
Activity ID: d748407e-178a-d40e-e1d4-bca80d794676
Request ID: 5f15b9a9-10ab-fe8d-98be-bfdb9257dd61
Time: 2017-06-14 08:40:20Z

 

 

Last refresh failed: Wed Jun 14 2017 00:25:38 GMT+0200 (Romance Daylight Time)
The received uncompressed data on the gateway client has exceeded limit.

Cluster URI: WABI-NORTH-EUROPE-redirect.analysis.windows.net
Activity ID: be47e442-a91a-49d7-9f9f-5e4e6562c964
Request ID: 07225bac-8945-48b9-8e5a-0171afa8c1fe
Time: 2017-06-13 22:25:38Z

 

This problem must be something that has changed on the Microsoft side of things, as it has been affecting a number of people since sometime last week. It would be nice for someone to tell us what the problem is.
