We received a response from Microsoft, and they are now declaring that our issue is not a bug as first suggested, but rather a new limitation on refreshing data via the Gateway with the Power BI Service.
I agree with @vidotom - this is a bait and switch. Adding an additional limitation to the 1 GB .pbix a full year after expanding it from 250 MB leaves a very bad taste in our mouths. We are now left with the choice of 1) eliminating rows, 2) figuring out how to reduce data size with the methods proposed by @iJuancho, or 3) forking over $60,000 and moving to Premium (not knowing what limitations will slowly creep into that approach over time).
The most likely option is #2 but this is a grind and we don't even know how to measure how close to the limit we are.
After extensive investigation we have come to a conclusion: the challenge you are experiencing is a direct result of new changes applied since June 1, 2017. The gateway now only allows 10 GB of uncompressed data through during refresh. As a solution, you will have to modify your query to reduce the data; this behavior is a design limitation. From the documentation:
Uncompressed data limits for refresh
The maximum size for datasets imported into the Power BI service is 1 GB. These datasets are heavily compressed to ensure high performance. In addition, in shared capacity, the service places a limit on the amount of uncompressed data that is processed during refresh to 10 GB. This limit accounts for the compression, and therefore is much higher than 1 GB. Datasets in Power BI Premium are not subject to this limit. If refresh in the Power BI service fails for this reason, please reduce the amount of data being imported to Power BI and try again.
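Taken together, the two limits above imply that a full 1 GB .pbix only survives refresh if its data compresses at roughly 10:1 or better. A minimal back-of-the-envelope sketch (illustrative arithmetic only; actual VertiPaq compression varies widely with cardinality and data types, and the helper names here are my own, not anything in the Power BI service):

```python
# Illustrative arithmetic for the shared-capacity limits quoted above.
PBIX_LIMIT_GB = 1          # max compressed dataset size in shared capacity
REFRESH_LIMIT_GB = 10      # max uncompressed data processed during refresh

def fits_refresh_limit(uncompressed_gb: float) -> bool:
    """True if the estimated uncompressed data size is under the 10 GB cap."""
    return uncompressed_gb <= REFRESH_LIMIT_GB

def implied_min_compression_ratio(uncompressed_gb: float) -> float:
    """Compression ratio needed for the same data to also fit the 1 GB .pbix cap."""
    return uncompressed_gb / PBIX_LIMIT_GB

print(fits_refresh_limit(8.5))              # True: 8.5 GB raw passes refresh
print(implied_min_compression_ratio(8.5))   # 8.5: needs ~8.5:1 to fit the .pbix cap
```

So a model can fail refresh at 10 GB raw even though its compressed size is comfortably under 1 GB, which is exactly why people in this thread were caught off guard.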
This leads us to believe that the problem no longer exists or that you have other, more pressing business priorities. If I do not hear to the contrary, I will close this incident at close of business on 07/06/2017.
Please e-mail me at your earliest convenience if you have additional information and want to continue with the troubleshooting.
Thanks @ascendbi for the update on your situation. It is good to finally have a clear picture.
So it is official now. I wonder if this new limitation will show up in any documentation sometime soon. Not to mention letting the community know in advance, rather than via error messages breaking solutions that had worked for a year...
Regarding option 2) reducing data size being imported:
I opened an idea for a memory consumption indicator, or something like that, exactly for this purpose. It is very difficult and cumbersome to figure out your actual data size, and having a refresh in the Power BI Service as your only feedback is... let's say sub-optimal, and not the most efficient way of figuring out whether you are still over the limit or have managed to slide under the door.
Dear all: feel free to vote for this idea if you agree that it would be useful to have this feature.
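In the absence of an official indicator, one crude workaround is to estimate raw data volume yourself from row counts and column widths before import. A minimal sketch, under loudly stated assumptions: the per-type byte widths and the 2-bytes-per-character string estimate below are rough guesses of my own, not Power BI internals, so treat the result as an order-of-magnitude check only:

```python
# Hypothetical estimator for raw (uncompressed) data volume of a table.
# Byte widths are assumptions for illustration, not Power BI internals.
TYPE_BYTES = {"int64": 8, "float64": 8, "datetime": 8, "bool": 1}

def estimate_uncompressed_bytes(row_count, columns):
    """Estimate raw size. columns: list of (dtype, avg_string_len_or_None)."""
    row_bytes = 0
    for dtype, avg_len in columns:
        if dtype == "string":
            # assume ~2 bytes per character (UTF-16-ish)
            row_bytes += 2 * (avg_len or 0)
        else:
            row_bytes += TYPE_BYTES.get(dtype, 8)
    return row_count * row_bytes

# e.g. 50M rows: an int key, a date, and a 20-character description
size = estimate_uncompressed_bytes(
    50_000_000, [("int64", None), ("datetime", None), ("string", 20)]
)
print(size / 1e9)  # 2.8 (GB raw) for this hypothetical table
```

Summing such estimates across your tables at least tells you whether you are anywhere near the 10 GB refresh cap before you find out via a failed refresh.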
Just adding to the thread the workaround we now have working...
We are now refreshing and publishing the same data models via Power Update. (http://poweronbi.com/power-update-features/) This software, on a schedule, opens Power BI Desktop, refreshes the data, and publishes it with Replace to the workspace. The software runs on a Windows 2008 R2 VM with 48 GB RAM that also runs the SQL Server DB the model sources some of its data from.
Note that Power Update gave timeout errors when the refresh took over 90 minutes, but it works well if the refresh is under an hour. We have a support ticket open with them to resolve the timeout, as it occurs when the data model approaches the 1 GB model size limit accepted by Power BI.
Note also that since they pulled Active Directory support from the new Data Gateway - Personal (the old one will stop working July 31, 2017), we are refreshing models with AD connections via Power Update as well.
For those following this thread, I received a call from Microsoft Support today indicating that there is a bug in the change that implemented the limit of 10 GB of uncompressed data refreshed via the Data Gateway. The ticket is still open because I was pushing back on the fact that they imposed a limit but have provided no method to measure or manage it beyond the phrase "reduce the data size".
The message is that "... RCA revealed a bug. A corrective action is in process and will be completed by the end of August 2017, on the 31st."
It is unclear what the bug is, or whether it directly affects all or only some of my data models. It was also unclear whether the 10 GB limit is being rolled back, or whether how it is measured is being corrected.
Anyway, it is more than I knew yesterday, so I thought I would share.
It is good to know that something is going on; we hope to get more info on this through any channel. I agree that it is not fair to introduce a limitation without any way for people to measure where they stand against it. Not to mention the need to announce such limits in advance, rather than making everyone discover them the hard and painful way, when their reports break.