Anonymous
Not applicable

In Premium Gen2, getting error: database exceeds the maximum size limit on disk

Hi Community,

 

I'm getting an error when trying to refresh my Power BI dataset from the Service on a Premium P1 capacity with Gen2 enabled.

 

The dataset is around 12 GB (the Power BI file itself is only a few MB, as it contains just a subset of the data and the source is parameterized).

 

I have two workspaces (UAT and PROD, both on the same capacity); the refresh works smoothly in one, but in the other I get this error.

 

Any help would be very much appreciated.

 

Thanks

4 REPLIES
selimovd
Super User

Hey @Anonymous ,

 

what exactly is your data source?

Try to use a relational database such as SQL Server. If that is already the case, filter out the big chunks as the very first step of your transformation. Does query folding work so the filter is pushed down to the source?
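For illustration, here is a rough Python sketch of what a folded filter amounts to at a relational source (the server, database, table, and column names are hypothetical placeholders): the WHERE clause is evaluated by the source, so only the reduced rows are ever transferred and imported.

```python
# Illustration only: a server-side filter, the same effect query folding
# gives a first-step filter in Power Query. All names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=SalesDb;"
    "UID=<user>;PWD=<password>;"
)

# The WHERE clause runs on the server, so only the filtered rows leave it.
query = """
    SELECT OrderId, OrderDate, Amount
    FROM dbo.FactSales
    WHERE OrderDate >= '2023-01-01'
"""
rows = conn.cursor().execute(query).fetchall()
print(f"Rows transferred after server-side filtering: {len(rows)}")
```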

 

If you need any help, please let me know.
If I answered your question I would be happy if you could mark my post as a solution ✔️ and give it a thumbs up 👍
 
Best regards
Denis
 
Anonymous
Not applicable

Hi, thanks for your reply.

 

I'm using Parquet files stored in a data lake as the source, so there is no query folding. This works fine in one workspace, but in the other I get the error 'Database xxx exceeds the maximum size limit on disk'.
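For context, even without query folding a Parquet source can be reduced at scan time by pruning columns and row groups. A minimal Python/pyarrow sketch of that idea (the lake path and column names are hypothetical):

```python
import pyarrow.dataset as ds

# Hypothetical lake path and columns, for illustration only.
lake = ds.dataset("/mnt/datalake/sales/", format="parquet")

# Read only the columns the model needs and filter rows at scan time;
# Parquet statistics let the reader skip whole row groups and files.
table = lake.to_table(
    columns=["OrderId", "Year", "Amount"],
    filter=ds.field("Year") >= 2023,
)
print(f"Materialized rows: {table.num_rows}")
```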

 

I saw the 12 GB limitation for Import mode, but Gen2 removes this limitation, right?

Hi @Anonymous ,

 

Has your problem been solved?

 

If the problem is still not resolved, please provide the detailed error information or the result you expect. Let me know, I'm looking forward to your reply.

 

Best Regards,
Winniz

Hi @Anonymous ,

 

Have you enabled "Large dataset storage format" for your dataset? Is the workspace it sits in assigned to a Premium capacity that has the large dataset storage format available?
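If you prefer to do this programmatically, the storage format can also be switched through the Power BI REST API (Update Dataset In Group). Below is a minimal Python sketch under the assumption that you already have an Azure AD access token and the workspace/dataset GUIDs; all three values are placeholders.

```python
import requests

# Placeholders: supply a real AAD access token, workspace (group) id, and dataset id.
TOKEN = "<aad-access-token>"
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# Switch the dataset to the large dataset storage format ("PremiumFiles").
resp = requests.patch(BASE, headers=HEADERS, json={"targetStorageMode": "PremiumFiles"})
resp.raise_for_status()

# Verify: the dataset metadata should now report the new storage mode.
info = requests.get(BASE, headers=HEADERS).json()
print(info.get("targetStorageMode"))
```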

 

Then please check whether your dataset exceeds the maximum offline dataset size configured for the capacity. This is the compressed size on disk; the default value is set by the SKU, and the allowable range is 0.1 – 10 GB.

 

A full refresh requires at least double the current dataset's memory size, so please monitor the capacity's memory metrics to confirm there is enough headroom.
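To confirm whether the refresh is actually failing on the size limit, you can also pull the dataset's refresh history through the Power BI REST API. A minimal Python sketch, again with a placeholder token and IDs:

```python
import requests

TOKEN = "<aad-access-token>"
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes?$top=5")

resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# Each entry reports the refresh status and, on failure, a serviceExceptionJson
# payload that usually contains the size/memory error text.
for refresh in resp.json()["value"]:
    print(refresh["startTime"], refresh["status"],
          refresh.get("serviceExceptionJson", ""))
```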

 

For more about managing and optimizing Premium capacities, please refer to:

Premium capacity scenarios 

Optimizing Premium capacities 

 

 

Best Regards,
Winniz

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 

 
