
Taisuke
New Member

Problem with exceeding storage space when Power BI Pro license is used

Hello all

 

We are considering creating a new workspace for a new report in an environment that already uses Power BI with a Pro license.
However, report development has been put on hold because of the risk of exceeding the storage capacity limit if we use the existing environment.

 

So my question is: what problems would arise if we exceed the storage capacity limit while creating the report?

I am assuming the following problems:

1. Creating a new workspace, report, dataset, etc. will no longer be possible.

2. None of the reports in storage can be displayed, regardless of whether the data is acquired in Import mode or DirectQuery mode.

3. In Import mode, the latest data cannot be retrieved.

 

We would appreciate it if you could share your knowledge.

1 ACCEPTED SOLUTION
StefanoGrimaldi
Resident Rockstar

Hey,

Take a look at this doc on capacity storage management:

https://learn.microsoft.com/en-us/power-bi/admin/service-admin-manage-your-data-storage-in-power-bi

In short, a workspace that exceeds its capacity won't be able to refresh, add new content, etc., and likewise the affected model won't be able to refresh. Using multiple workspaces and splitting the model into separate datasets is one way to spread the load. Also, using dataflows to store the data in Azure Data Lake Storage Gen2 is an efficient way to tackle this: dataflows don't consume workspace storage, and consuming data from a dataflow uses far less space in the data model than importing and transforming directly in the dataset.
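To make the "will this push us over the limit?" check concrete, here is a minimal sketch of a pre-flight calculation before publishing a new dataset to a Pro workspace. The limit constants are assumptions based on the commonly documented Power BI Pro quotas (roughly 1 GB per imported dataset and 10 GB of storage per workspace); verify the current values in the linked doc before relying on them, and note the function name and structure are illustrative, not part of any Power BI API.

```python
# Assumed Power BI Pro quotas (check the linked Microsoft doc for current values):
PRO_DATASET_LIMIT_GB = 1.0     # assumed max size of a single imported dataset
PRO_WORKSPACE_LIMIT_GB = 10.0  # assumed max total storage per Pro workspace

def can_publish(existing_sizes_gb, new_size_gb):
    """Return (ok, reason) for publishing a dataset of new_size_gb gigabytes
    into a workspace whose current dataset sizes are existing_sizes_gb."""
    if new_size_gb > PRO_DATASET_LIMIT_GB:
        return False, "dataset exceeds the per-dataset limit"
    total = sum(existing_sizes_gb) + new_size_gb
    if total > PRO_WORKSPACE_LIMIT_GB:
        return False, "workspace storage limit would be exceeded"
    return True, "ok"

# A nearly empty workspace accepts a 0.8 GB dataset;
# one already holding 9.5 GB does not.
print(can_publish([0.4, 0.3], 0.8))
print(can_publish([9.0, 0.5], 0.8))
```

The same arithmetic explains why moving storage into dataflows helps: data kept in ADLS Gen2 simply doesn't count toward the `existing_sizes_gb` total in this check.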









