AndersASorensen
Frequent Visitor

PPU license vs Fabric

Hi,

 

With PPU ($20 per user a month) I can have a 100GB dataset and share it with colleagues in the same tenant who have a Pro or PPU license.

But with an F32 ($4,200 a month) I get 10GB models (40GB with Direct Lake)?

Am I understanding this correctly?

 

 

[Screenshot: capacity comparison table showing a 10GB max model size and a 40GB Direct Lake limit for F32]

 

 

2 ACCEPTED SOLUTIONS
ibarrau
Super User

Hi. It's not exactly that straightforward, because if you turn on the large dataset storage format, memory is managed differently: the model is no longer capped at a fixed 10GB, and data is loaded into memory depending on usage. In practical terms, yes, you are reading it correctly, in the sense that the model size is bounded by the capacity's maximum memory. PPU can help in scenarios with large models and a small number of users who only need Power BI. The main difference is that Fabric lets you create much more than Power BI content: lakehouses, warehouses, notebooks, pipelines, etc. PPU covers Power BI only, and both developers and viewers need a PPU license if the content is developed in a PPU workspace.
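For reference, the large dataset storage format can also be flipped programmatically, not just from the dataset settings page. A minimal Python sketch, assuming you already have an Azure AD access token with dataset write permissions and the dataset GUID (both placeholders here); it uses the targetStorageMode property of the Power BI REST "Update Dataset" call, where "PremiumFiles" corresponds to the large dataset storage format:

```python
import requests

# Placeholders; supply your own token and dataset ID.
TOKEN = "<aad-access-token>"        # needs Dataset.ReadWrite.All
DATASET_ID = "<dataset-guid>"

# PATCH the dataset's storage mode; "PremiumFiles" = large dataset
# (large semantic model) storage format, "Abf" = the 10GB default.
resp = requests.patch(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"targetStorageMode": "PremiumFiles"},
)
resp.raise_for_status()
print("Large dataset storage format enabled.")
```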

If you use F64 or higher, you additionally get sharing with Free users as viewers, plus Copilot.
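To make the small-number-of-users trade-off concrete, here is a back-of-the-envelope sketch. The prices are assumptions taken from the figures mentioned in this thread (PPU at about $20 per user per month, F32 at about $4,200 per month), not official pricing:

```python
# Rough break-even between per-user PPU licensing and a dedicated F32 capacity.
# Both prices are assumptions from this thread; check current pricing.
PPU_PER_USER_MONTHLY = 20.0   # USD per user per month (assumed)
F32_MONTHLY = 4200.0          # USD per month (assumed)

breakeven = F32_MONTHLY / PPU_PER_USER_MONTHLY
print(f"PPU stays cheaper up to roughly {breakeven:.0f} users")  # ~210 users

# Cost is only one axis: below F64, capacity viewers still need Pro licenses
# (Free viewers require F64 or higher), while PPU requires every viewer to
# hold a PPU license but gives each model the 100GB large-model ceiling.
```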

I hope that helps,


If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog


v-nuoc-msft
Community Support

Hi @AndersASorensen 

 

@ibarrau Thank you very much for your prompt reply. Allow me to add some details here.

 

For Power BI Premium Per User (PPU):

 

The default maximum dataset size is 10GB, but with the Large Dataset Storage Format enabled, a dataset can grow beyond that, limited by the available memory.

 

When the Large Dataset Storage Format option is enabled, the model size for PPU is limited to 100GB.

 

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-per-user-faq#using-premium-per...

 

For a Fabric F32 capacity:

 

The screenshot you provided shows that with an F32 capacity you get 10GB of max memory per model and 40GB for Direct Lake models.

 

Also, the documentation states that for a Direct Lake semantic model, the max memory figure represents the upper limit of memory resources, that is, how much data can be paged in at once.

 

In practice, exceeding it does not by itself cause a fallback to DirectQuery.

 

However, if the data is large enough that model data keeps paging in and out from OneLake, performance can suffer. So 40GB is not a hard cap on the data itself.
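If you want to see what is actually paged into memory at a given moment, here is a minimal sketch, assuming you are in a Fabric notebook where the semantic-link (sempy) library is available and using a placeholder dataset name; it queries the column-segment metadata that the Direct Lake documentation points to for residency checks:

```python
# Runs in a Fabric notebook; sempy (semantic-link) ships with the Fabric runtime.
import sempy.fabric as fabric

# Placeholder dataset name; replace with your Direct Lake semantic model.
# INFO.STORAGETABLECOLUMNSEGMENTS() is the DAX counterpart of the
# DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS DMV referenced in the docs.
segments = fabric.evaluate_dax(
    "My Direct Lake Model",
    "EVALUATE INFO.STORAGETABLECOLUMNSEGMENTS()",
)

# Fields such as ISRESIDENT and TEMPERATURE (per the DMV documentation)
# indicate which column segments are currently in memory and how recently
# they were touched, which is exactly the paging behavior described above.
print(segments.head())
```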

 

https://learn.microsoft.com/en-us/fabric/get-started/direct-lake-overview#fallback

 

Regards,

Nono Chen

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 

 

 


