alvino
Frequent Visitor

Volume of data

Hello.

I have 5 terabytes of data and I need to make a dashboard.

If I buy Power BI Premium, can I create it?

 

2 ACCEPTED SOLUTIONS

Power BI has a good compression mechanism, so a 400 GB model can typically hold around 4 TB of data at the source level.

 

Since you mentioned you want 5 TB of data in a single model, I would suggest reaching out to Microsoft about opting for a P4/P5 capacity. At that roughly 10:1 ratio, 5 TB at the source would still come to around 500 GB compressed, which is above the 400 GB limit, so I would also recommend keeping only the data you actually need in your model; that will help you reduce the model size.


If this post helps, then please consider accepting it as the solution. Appreciate your Kudos!
Proud to be a Super User!


Hi @alvino 

The size of a Datamodel depends on a great number of things, not just the number of rows. The Datamodel compresses on a column-by-column basis, based on column repetition, so the more repeated values your data contains, the better it compresses. Equally, with Power Query you can restrict which columns you bring into Power BI. Many things can be done to optimise the Datamodel, both for size and performance.

See: Optimization guide for Power BI - Power BI | Microsoft Docs
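
As a rough sketch of what restricting columns can look like in Power Query (purely illustrative; the server, database, table and column names below are placeholders, not anything from this thread):

    let
        // Connect to the source; "MyServer", "SalesDb" and the Purchases table are placeholder names
        Source = Sql.Database("MyServer", "SalesDb"),
        Purchases = Source{[Schema = "dbo", Item = "Purchases"]}[Data],
        // Keep only the columns the report actually needs, so everything else never reaches the Datamodel
        KeepColumns = Table.SelectColumns(Purchases, {"PurchaseDate", "CustomerKey", "ProductKey", "Amount"})
    in
        KeepColumns

Fewer columns, and especially fewer high-cardinality columns, usually means a noticeably smaller Datamodel.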

Also, if your data server is fast enough, you can have a direct connection to the data rather than importing it into the Datamodel; this can work well for very large databases.

See: Using DirectQuery in Power BI - Power BI | Microsoft Docs

Lastly, you can use incremental refresh to work with very large datasets, so you only keep the portion of data you need fast access to in the Datamodel, with the rest of the data left out on DirectQuery.

See: Incremental refresh for datasets and real-time data in Power BI - Power BI | Microsoft Docs
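
As a sketch of the filter step incremental refresh is built on (again, the source and column names are just placeholders): incremental refresh expects two DateTime parameters named RangeStart and RangeEnd, and a query that filters on them, for example:

    let
        Source = Sql.Database("MyServer", "SalesDb"),
        Purchases = Source{[Schema = "dbo", Item = "Purchases"]}[Data],
        // RangeStart and RangeEnd are the DateTime parameters incremental refresh requires;
        // the service uses this filter to partition the table and refresh only the recent partitions
        FilterByDate = Table.SelectRows(Purchases, each [PurchaseDate] >= RangeStart and [PurchaseDate] < RangeEnd)
    in
        FilterByDate

The refresh policy itself (how much history to keep, how much to refresh) is then configured on the table in Power BI Desktop.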

So it's always difficult to say exactly how large a Datamodel will be. You will certainly need Power BI Premium with large model support, but beyond that there are several options which will govern the size of the resultant Datamodel.

Hope this helps

Stuart


5 REPLIES
alvino
Frequent Visitor

Thanks @arvindsingh802 and @Burningsuit, I understand now. I see that I was confusing the size of the model with the size of the data in general.

arvindsingh802
Super User

Yes, you will need Power BI Premium to have datasets larger than 10 GB; a Premium capacity can hold up to 100 TB of data.


If this post helps, then please consider accepting it as the solution. Appreciate your Kudos!
Proud to be a Super User!

I saw on the Microsoft website that the model size limit on Power BI Premium only goes up to 400 GB, and I want to load all 5 TB into a single model, since it all represents purchase data.

