I have a question for you experts about how to implement a Power BI architecture in the context of the Azure cloud and the Power BI service.
A couple of months ago, we started a POC with Power BI: the Power BI service connecting to an Azure database, with data loaded into Power BI itself (Import mode, not DirectQuery) and published to the Power BI service. All worked fine and dandy while we had little data to start with. Now that we have loaded our full data into the file, the pbix is 180 MB. Since then we have also started to see spinning circles (latency) between the visualizations when a filter is selected, and we are starting to doubt this approach. Additionally, we got a requirement to share this with external users who will be editing the file, so we started a group workspace as well, with a Pro license.
With the data limitations on the Pro account (10 GB), and since overloading the PBIX with data is causing a performance hit, we are starting to look into DirectQuery. With DirectQuery the data resides in the database. Is Azure SQL Database or Azure SQL Data Warehouse capable of resolving the performance problem, or do we need an SSAS Tabular engine? Can you please share your ideas in the context of Azure and the Power BI service if you have already implemented something similar? Thank you in advance!
I have been trying to understand how large-scale datasets are handled.
Here is my scenario: I have a db per tenant (customer), and I plan to create reports per customer, each internally pointing to the corresponding customer db.
I'm unclear about how the datasets are handled internally by the Power BI service, and what levers are available to me to manage concurrent datasets for tens of customers, as well as to ensure the required latency end to end: from the db to the Power BI service to the Power BI Embedded node to the web app dashboard.
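For the db-per-tenant scenario, one pattern is to keep a single shared report/dataset and rebind it per customer through the Power BI REST API's `Default.UpdateParameters` endpoint, assuming the dataset exposes a connection parameter (the parameter name `TenantDb` below is my assumption, not something from your setup). A minimal sketch of building the call:

```python
# Sketch: rebinding one shared dataset to a tenant-specific database via the
# Power BI REST API. The parameter name "TenantDb" is a hypothetical example;
# your dataset would need to define such a parameter in Power Query.

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def update_parameters_url(group_id: str, dataset_id: str) -> str:
    """REST endpoint for updating dataset parameters in a workspace (group)."""
    return f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/Default.UpdateParameters"

def tenant_db_payload(tenant_db_name: str) -> dict:
    """Request body that points the assumed 'TenantDb' parameter at one tenant's db."""
    return {"updateDetails": [{"name": "TenantDb", "newValue": tenant_db_name}]}

# Example usage (the actual POST with an Azure AD bearer token is omitted):
url = update_parameters_url("my-workspace-guid", "my-dataset-guid")
payload = tenant_db_payload("customer42_db")
```

With this shape you could clone the report per customer and update each clone's dataset parameter, rather than maintaining tens of hand-edited PBIX files; for Import-mode datasets you would still need to trigger a refresh after rebinding.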