soptim_vond
Frequent Visitor

Refresh via enterprise gateway fails due to the amount of data

Hi,

 

I'm currently facing the following problem: I'm working on a dataset with approximately 15 million rows and roughly 100 columns (a table in an Oracle database). Importing the data in Power BI Desktop takes a while but works fine. Publishing it to the cloud also works, but as soon as I try to refresh the dataset, it fails after approx. 30 minutes with an error message that returned zero hits when I googled it. Since I'm using a German version, I'm not sure what the exact message looks like in English, but it states something like:

"The limit for uncompressed data received by the gateway was exceeded".

 

Can anyone help me with this?

It would be a huge disappointment if the gateway were indeed unable to handle that amount of data...

 

Regards,

 

Nico

1 REPLY
v-ljerr-msft
Employee

Hi @soptim_vond,

Error: The received uncompressed data on the gateway client has exceeded limit.

The exact limit is 10 GB of uncompressed data per table. If you are hitting this issue, there are good options to optimize and avoid it: in particular, reduce the use of highly repetitive, long string values and instead use a normalized key, or remove the column entirely if it is not in use.
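A quick way to spot such columns is to compare each string column's unique-value count to its row count: a low ratio combined with long values marks a good candidate for a normalized key. A rough sketch using pandas (column names here are made up for illustration, not from the original table):

```python
import pandas as pd

# Hypothetical sample standing in for the Oracle fact table.
df = pd.DataFrame({
    "order_id": range(6),
    "status_text": ["Shipped - awaiting customer confirmation"] * 5 + ["Cancelled"],
})

# A low unique-value ratio plus a long average length suggests the column
# repeats heavily and could be replaced by a small integer key.
for col in df.select_dtypes(include="object"):
    ratio = df[col].nunique() / len(df)
    avg_len = df[col].str.len().mean()
    print(f"{col}: unique ratio {ratio:.2f}, avg length {avg_len:.1f}")
```

On a real table you would run the same check against a representative sample rather than the full 15 million rows.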

 

According to the documentation (updated on 12 June), the limit per table is 10 GB of uncompressed data at the time of reading.

My .pbix file is 320 MB and has 45 million rows in the fact table. To find the actual size of the data at read time, I imported the data into SQL Server and checked how much space the table with 45 million records occupied: approximately 10.3 GB. With the size of my model in mind, I found a solution.
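A lighter-weight way to get a comparable estimate, without a round trip through SQL Server, is to sum the in-memory size of the columns. A rough sketch with pandas on synthetic data (this approximates uncompressed size; it is not Power BI's exact accounting):

```python
import numpy as np
import pandas as pd

# Synthetic data stands in for the real fact table.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "amount": rng.random(100_000),         # float64, 8 bytes per row
    "qty": rng.integers(0, 100, 100_000),  # int64, 8 bytes per row
})

# deep=True makes pandas measure object (string) columns byte by byte,
# which is what matters for the gateway's uncompressed-data limit.
size_bytes = int(df.memory_usage(deep=True).sum())
print(f"approx. uncompressed size: {size_bytes / 1024**2:.1f} MB")
```

Loading a sample of the rows and scaling the result up to the full row count gives a quick check against the 10 GB per-table limit.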

 

I checked the data type of each of the 30 columns of my fact table and changed the data types, for example:

INT -> tinyint or smallint

decimal -> smallmoney or money

 

With that, I reduced my data from 10.3 GB to 3.9 GB, and my dataset now refreshes correctly without any problem.
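The same "shrink the column types" idea can be sketched outside SQL Server as well. A small illustration with pandas downcasting (synthetic columns, not the original model), showing the kind of reduction that narrower integer types buy:

```python
import numpy as np
import pandas as pd

n = 1_000_000
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "qty": rng.integers(0, 200, n),       # values fit in an unsigned 8-bit type
    "price": rng.integers(0, 10_000, n),  # values fit in a 16-bit integer
})

before = int(df.memory_usage(deep=True).sum())

# Let pandas pick the smallest type that still holds every value,
# mirroring the INT -> tinyint/smallint change described above.
df["qty"] = pd.to_numeric(df["qty"], downcast="unsigned")
df["price"] = pd.to_numeric(df["price"], downcast="integer")

after = int(df.memory_usage(deep=True).sum())
print(f"{before / 1e6:.1f} MB -> {after / 1e6:.1f} MB")
```

Both int64 columns start at 8 bytes per row; after downcasting they take 1 and 2 bytes per row respectively, roughly a 5x reduction for these two columns.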



Here is a similar thread in which a workaround is mentioned. Could you check whether it helps in your scenario? 🙂

 

Regards
