I feel like I'm missing something regarding the way Power BI Premium dataset refreshes work vs how it is managed in Azure Analysis Services.
A quick bit of background: we're an ISV with a single database per client. We've used Power BI to define the semantic model, and we publish to hundreds of workspaces programmatically via the API.
We're considering moving to Power BI Premium, but the six-concurrent-dataset-refresh limit is a significant obstacle for us from a pricing perspective (we have hundreds of datasets). We're toying with the idea of building a queuing system (run six at a time; when one finishes, kick off the next), but it feels like a hack.
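For what it's worth, here's roughly what that queuing hack would look like. This is just a sketch: the `trigger_refresh` stub stands in for the real Power BI REST API call (`POST .../groups/{groupId}/datasets/{datasetId}/refreshes` plus polling for completion), and the limit of six is simply taken from the refresh cap described above.

```python
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT = 6  # Premium concurrent refresh cap (per the limit above)

def trigger_refresh(dataset_id: str) -> str:
    # Stub: in production this would POST to the Power BI REST API
    #   POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{dataset_id}/refreshes
    # and then poll GET .../refreshes until the refresh completes.
    return f"refreshed:{dataset_id}"

def refresh_all(dataset_ids: list[str]) -> list[str]:
    # ThreadPoolExecutor is the queue: at most MAX_CONCURRENT refreshes
    # run at once, and as each finishes the next dataset starts.
    with ThreadPoolExecutor(max_workers=MAX_CONCURRENT) as pool:
        return list(pool.map(trigger_refresh, dataset_ids))

results = refresh_all([f"ds-{i}" for i in range(20)])
print(len(results))  # 20
```

It works, but it's exactly the kind of orchestration scaffolding we'd rather not own, which is why we're asking.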
I know you can automate partition management for Analysis Services models (https://azure.microsoft.com/en-us/blog/azure-as-automated-partition-management/?v=17.23h), but I also know that Microsoft wants Power BI to be the Analysis Services engine of choice going forward.
So here's my question:
Are there concurrent dataset refresh limits in Azure Analysis Services? Is there something I'm missing here? It seems like Power BI isn't a reasonable solution for a company that needs to process hundreds of datasets at a high frequency.