Refresh via enterprise gateway fails due to the amount of data

Hi,

 

I'm currently facing the following problem: I'm working on a dataset with approximately 15 million rows and around 100 columns (a table in an Oracle database). Importing the data in Power BI Desktop takes time but works fine. Publishing it to the cloud also works, but as soon as I try to refresh the dataset, it fails after approximately 30 minutes with an error message for which I found no search results. Since I'm using the German version, I'm not sure what the exact message looks like in English, but it says something like:

"The limit for uncompressed data received by the gateway was exceeded".

 

Can anyone help me with this?

It would be a huge disappointment if the gateway were indeed unable to handle that amount of data...

 

Regards,

 

Nico

1 REPLY
Microsoft

Re: Refresh via enterprise gateway fails due to the amount of data

Hi @soptim_vond,

Error: The received uncompressed data on the gateway client has exceeded limit.

The exact limitation is 10 GB of uncompressed data per table. If you are hitting this issue, there are good options to optimize and avoid the issue. In particular, reducing the use of highly repetitive, long string values and instead using a normalized key or removing the column (if not in use) will help.
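To illustrate why long, repetitive string values are so costly against this limit, here is a rough sketch comparing the uncompressed footprint of a wide string column versus a small normalized key. The row count matches the question above, but the per-value byte widths are illustrative assumptions, not Power BI's actual storage format:

```python
# Rough estimate of uncompressed data volume for one column across many rows.
# Byte widths are illustrative assumptions, not Power BI's internal storage model.

ROWS = 15_000_000  # roughly the size of the dataset in the question

def column_bytes(rows: int, bytes_per_value: int) -> float:
    """Uncompressed size of a column, in gigabytes."""
    return rows * bytes_per_value / 1024**3

# A repeated 50-character status string vs. a 2-byte normalized key
# that points into a small lookup table.
string_gb = column_bytes(ROWS, 50)
key_gb = column_bytes(ROWS, 2)

print(f"string column: {string_gb:.2f} GB, key column: {key_gb:.3f} GB")
```

Even one such column eats a noticeable fraction of the 10 GB budget, which is why replacing it with a key (or dropping it if unused) helps.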

 

According to the documentation (updated on 12 June), the limit is 10 GB of uncompressed data per table, measured at the time the data is read.

My .pbix file is 320 MB and has 45 million rows in the fact table. To find the actual uncompressed size of the data at read time, I imported the data into SQL Server and checked how much space the table with 45 million records occupied: approximately 10.3 GB. With the size of my model in mind, I found a solution.

 

I checked the data type of each of the 30 columns in my fact table and changed them to smaller types, for example:

int -> tinyint or smallint

decimal -> smallmoney or money

 

With that, I reduced my data from 10.3 GB to 3.9 GB. After that change, my dataset refreshes correctly without any problem.
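The arithmetic behind a reduction like that can be sketched as follows. The byte widths are SQL Server storage sizes (int = 4 B, tinyint = 1 B, smallint = 2 B, decimal with up to 9 digits of precision = 5 B, with up to 19 digits = 9 B, smallmoney = 4 B), but the 30-column mix is a hypothetical guess, not the poster's actual schema:

```python
# Back-of-the-envelope check: narrowing column types across 45 million rows.
# Byte widths are SQL Server storage sizes; the column mix is hypothetical.

ROWS = 45_000_000

def table_gb(rows: int, bytes_per_row: int) -> float:
    """Approximate raw data size of a table, in gigabytes (row overhead ignored)."""
    return rows * bytes_per_row / 1024**3

# Hypothetical fact table: 10 int columns + 20 decimal(18,2) columns.
before = table_gb(ROWS, 10 * 4 + 20 * 9)  # int = 4 B, decimal(18,2) = 9 B
# After narrowing: tinyint (1 B) and smallmoney (4 B).
after = table_gb(ROWS, 10 * 1 + 20 * 4)

print(f"before: {before:.1f} GB, after: {after:.1f} GB")
```

Under these assumptions the raw data shrinks from roughly 9 GB to under 4 GB, comfortably below the 10 GB per-table limit, which matches the order of magnitude described above.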



Here is a similar thread in which a workaround is mentioned. Could you check whether it helps in your scenario?

 

Regards
