Taisuke
New Member

Problem with exceeding storage space when using a Power BI Pro license

Hello all

 

We are considering creating a new workspace to build a report in an environment that already uses Power BI with a Pro license.
However, we have suspended development of the report because of the risk of exceeding the storage capacity limit of the existing environment.

 

So my question is: what problems would arise if we exceed the storage capacity limit while creating the report?

I am assuming the following problems:

1. Creating a new workspace, report, dataset, etc. will no longer be possible.

2. No reports in storage can be displayed, regardless of whether the data is acquired in import mode or DirectQuery mode.

3. In import mode, the latest data cannot be retrieved.

 

We would appreciate it if you could share your knowledge.

1 ACCEPTED SOLUTION
StefanoGrimaldi
Resident Rockstar

Hey,

Take a look at this doc on capacity storage management:

https://learn.microsoft.com/en-us/power-bi/admin/service-admin-manage-your-data-storage-in-power-bi

In short, a workspace that exceeds its capacity won't be able to refresh, add new content, etc., and likewise a model that exceeds the limit won't be able to refresh. Using multiple workspaces and splitting the model into different datasets is one way to divide the load accordingly. Also, using dataflows to store the data in Azure Data Lake Storage Gen2 is an efficient way to attack this topic: dataflows don't use workspace storage, and when a dataset consumes dataflow data it uses far less space in the data model than importing and transforming directly in the dataset.
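To make the headroom concrete, here's a minimal sketch of checking whether a planned set of import-mode datasets fits in a workspace. It assumes the commonly cited 10 GB per-workspace limit on shared (Pro) capacity, and the dataset sizes are made up for illustration; check the linked doc for the limits that apply to your tenant.

```python
# Assumed per-workspace cap on shared (Pro) capacity -- verify against
# the Microsoft docs for your tenant before relying on this number.
PRO_WORKSPACE_LIMIT_GB = 10.0

def storage_headroom_gb(dataset_sizes_gb, limit_gb=PRO_WORKSPACE_LIMIT_GB):
    """Return (used, remaining) storage in GB for a planned workspace.

    `remaining` is negative when the planned datasets would exceed the
    workspace limit -- the point at which refreshes and new content
    start failing, per the doc linked above.
    """
    used = sum(dataset_sizes_gb)
    return used, limit_gb - used

# Example: three hypothetical import-mode datasets planned for one workspace
used, remaining = storage_headroom_gb([4.5, 3.0, 1.2])
print(f"used {used:.1f} GB, {remaining:.1f} GB of headroom")
```

If `remaining` comes out negative, that is the signal to split the model across additional workspaces or move heavy tables into dataflows, as described above.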





Did I answer your question? Mark my post as a solution! / Did it help? Give some Kudos!

Proud to be a Super User!




