Hello all,
We are considering creating a new workspace to build a report in an environment that already uses Power BI with Pro licenses.
However, development of the report has been put on hold because of the risk of exceeding the storage capacity limit in the existing environment.
So my question is: what problems would arise if we exceeded the storage capacity limit while creating the report?
I am assuming the following problems:
1. Creating a new workspace, report, dataset, etc. will no longer be possible.
2. None of the stored reports can be displayed, regardless of whether the data is acquired in import mode or DirectQuery mode.
3. In import mode, the latest data cannot be retrieved on refresh.
We would appreciate it if you could share your knowledge.
Hey,
Take a look at this doc on capacity storage management:
https://learn.microsoft.com/en-us/power-bi/admin/service-admin-manage-your-data-storage-in-power-bi
In short, a workspace that exceeds the capacity won't be able to refresh, add new content, etc., and likewise a model over the limit won't be able to refresh. Using multiple workspaces and dividing the model into different datasets is one way to spread the load accordingly. Also, using dataflows to store the data in Azure Data Lake Storage Gen2 is an efficient way to attack this topic: dataflows basically don't use workspace storage, and consuming data from a dataflow consumes far less in the data model than importing and transforming directly in the dataset.
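If you want to keep an eye on which workspaces are accumulating the most content before you hit the limit, you can enumerate workspaces and their datasets with the documented Power BI REST API ("Groups" and "Datasets" endpoints). Below is a minimal Python sketch, assuming you have already acquired an Azure AD bearer token separately (e.g. via MSAL) with read permissions on workspaces and datasets; the token placeholder and the print format are just illustrations, not an official sample:

import requests

# Assumption: ACCESS_TOKEN is an Azure AD bearer token obtained
# separately (e.g. via MSAL) with workspace/dataset read scopes.
ACCESS_TOKEN = "<your-token-here>"
BASE = "https://api.powerbi.com/v1.0/myorg"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def list_workspaces():
    """Return all workspaces (groups) the caller can see."""
    resp = requests.get(f"{BASE}/groups", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["value"]

def list_datasets(workspace_id):
    """Return the datasets stored in one workspace."""
    resp = requests.get(f"{BASE}/groups/{workspace_id}/datasets",
                        headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["value"]

# Print a per-workspace dataset count so you can spot which
# workspaces are filling up.
for ws in list_workspaces():
    datasets = list_datasets(ws["id"])
    print(f"{ws['name']}: {len(datasets)} dataset(s)")
    for ds in datasets:
        print(f"  - {ds['name']}")

Note that these endpoints return names and IDs, not sizes; for the actual storage consumed per workspace you still need the storage management page described in the doc linked above.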
Proud to be a Super User!