Hi everyone,
I am working with a customer on a Power BI Embedded implementation.
I created a workspace collection and a workspace, and was able to upload a small pbix file to the workspace.
The PBIX's data source is an Azure SQL Database with several GB of data, refreshed daily, which is why I first wanted to use DirectQuery.
Problem: I had to use restricted DAX measures for some business calculations, and performance was very bad (which I guess is why they are restricted in the first place), even though I am on the P1 performance level, with 125 DTUs and columnstore indexes.
So I tried to go with Import Data, which generates a 450 MB pbix file.
When I try to upload it through the API (using the ProvisionSample project available here: https://github.com/Azure-Samples/power-bi-embedded-integrate-report-into-web-app/), the process does not work as it does for the Direct Query pbix.
I get this error message from the console application, written in red: "Ooops, something broke."
Any idea why it does not go through? Is there a size limit for pbix files uploaded to a Power BI workspace? If so, is it possible to contact Microsoft to raise this limit?
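For reference, here is a minimal sketch of what the ProvisionSample's upload step boils down to, rewritten in Python. The URL shape (`/collections/{collection}/workspaces/{workspaceId}/imports`), the `AppToken` auth scheme, and the raw-bytes content type are assumptions based on the old workspace-collections API; the collection name, workspace id, and token below are placeholders. Check them against the ProvisionSample source before relying on this.

```python
# Hedged sketch: uploading a .pbix to a Power BI Embedded (workspace
# collections) workspace via the imports endpoint. URL shape, auth
# scheme, and content type are assumptions, not a verified contract.
import os
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0"  # assumed base URL


def build_import_url(collection: str, workspace_id: str, dataset_name: str) -> str:
    """Compose the imports URL for a workspace-collections upload."""
    return (f"{API_BASE}/collections/{collection}/workspaces/{workspace_id}"
            f"/imports?datasetDisplayName={dataset_name}")


def upload_pbix(path: str, collection: str, workspace_id: str, app_token: str) -> int:
    """POST the raw .pbix bytes; returns the HTTP status code."""
    with open(path, "rb") as f:
        body = f.read()
    req = urllib.request.Request(
        build_import_url(collection, workspace_id, os.path.basename(path)),
        data=body,
        headers={
            "Authorization": f"AppToken {app_token}",   # assumed auth scheme
            "Content-Type": "application/octet-stream",  # assumed content type
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    # Placeholder values, no network call is made here.
    print(build_import_url("MyCollection", "1234-abcd", "SalesReport"))
```

Watching the actual HTTP status and response body from a call like this (instead of the sample's generic error string) would show whether the 450 MB upload is being rejected by the service or failing client-side.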
Regards,
Nicolas
@Anonymous: If you use the Import Data option in Power BI Embedded, how are you refreshing your data?
That said, 450 MB should not be an issue; as far as I know, the size limit is 1 GB.
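A quick pre-flight check along these lines would confirm the file is under the ceiling before attempting the upload. Note the 1 GB figure is the limit quoted in this thread, not an official number, so treat it as an assumption and verify it against the current documentation.

```python
# Hedged sketch: pre-flight size check before uploading a .pbix.
# The 1 GB ceiling is the figure quoted in this thread (an assumption).
MAX_PBIX_BYTES = 1 * 1024 ** 3  # assumed 1 GB upload limit


def pbix_within_limit(size_bytes: int, limit: int = MAX_PBIX_BYTES) -> bool:
    """True if the pbix fits under the assumed upload ceiling."""
    return size_bytes <= limit


# A 450 MB file is comfortably under a 1 GB ceiling:
print(pbix_within_limit(450 * 1024 ** 2))  # True
```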
@Sunkari thanks for the reply!
I am waiting for that feature to come to the API 🙂
Meanwhile, I use this *dirty* workaround: I schedule a script that opens the pbix in Power BI Desktop, hits refresh, saves it, and uploads it via the API.
@Anonymous:Great sir 😉