Hi,
We are on the S0 pricing tier for Azure Analysis Services, so the total allowed memory is 10 GB. The lower limit for memory is 2 GB, which is the default.
Assume I have a model of about 3 GB. When I process the data, memory climbs to 9.65 GB and processing then fails with:
Failed to save modifications to the server. Error returned: 'You have reached the maximum allowable memory allocation for your tier. Consider upgrading to a tier with more available memory.'
What could be the reason behind this? Is there any setting that bumps memory so high, even though the total size of the model should be no more than 3 GB (checked from on-premises SSAS)?
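For reference, a rough back-of-envelope sketch of why a full process might peak well above the model's at-rest size. It assumes, as with on-premises SSAS, that the old copy of the model stays online while the new copy is built, plus transient processing structures; the overhead factor used below is purely illustrative, not a documented figure:

```python
# Hedged back-of-envelope estimate of peak memory during ProcessFull.
# Assumptions (not from any official spec):
#   - the old copy of the model keeps serving queries while the new copy
#     is built, so two full copies coexist until the transaction commits
#   - transient structures (dictionaries, uncompressed row buffers) add
#     further overhead, illustrated here with an assumed ~1.2x factor
model_gb = 3.0
tier_limit_gb = 10.0   # S0 memory cap

old_copy = model_gb            # existing model still in memory
new_copy = model_gb            # new copy being built during processing
transient = 1.2 * model_gb     # assumed processing overhead

peak_gb = old_copy + new_copy + transient
print(f"estimated peak: {peak_gb:.1f} GB of {tier_limit_gb:.0f} GB")
```

Under these assumptions a 3 GB model already needs roughly three times its size at peak, which would put it close to the 10 GB cap.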
Regards,
Akash
Hi Akash,
I would suggest posting in the AzureAnalysisServices forum for professional support.
Best Regards,
Dale
Thanks