Hi,
My organisation desperately needs some oversight of Power BI usage metrics. I've managed to get a PowerShell script working with Task Scheduler to export the daily usage into CSV, as per the solution on this blog: http://angryanalyticsblog.azurewebsites.net/index.php/2018/02/16/power-bi-audit-log-analytics-soluti...
Due to export limits (fewer than 5,000 rows and fewer than 90 days of history per export), we will need to build this up over time from the daily exports.
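Building up history from the daily exports means appending each day's CSV to a cumulative file without duplicating events. A minimal sketch of that step, assuming the export has an `Id` column that uniquely identifies each audit event (adjust the key column to match your actual export schema):

```python
# Hedged sketch: merge one day's audit-log export into a cumulative CSV,
# de-duplicating on a key column. The "Id" column name is an assumption
# based on the Office 365 audit-log schema; adjust to your export.
import csv
import os

def append_daily_export(daily_path, master_path, key_column="Id"):
    """Append rows from daily_path to master_path, skipping duplicate keys.

    Returns the number of new rows written."""
    seen = set()
    master_exists = os.path.exists(master_path)
    if master_exists:
        with open(master_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                seen.add(row[key_column])

    with open(daily_path, newline="", encoding="utf-8") as f:
        new_rows = [r for r in csv.DictReader(f) if r[key_column] not in seen]

    if not new_rows:
        return 0
    with open(master_path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=new_rows[0].keys())
        if not master_exists:
            writer.writeheader()
        writer.writerows(new_rows)
    return len(new_rows)
```

Re-running the same day's export is then harmless: duplicate event IDs are skipped, so the scheduled task can safely overlap its export windows.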
We also have a SQL Server data warehouse hosted on Azure, so I was wondering if anyone has successfully used Azure Automation runbooks to load these CSVs into the warehouse, as I feel that would be a more robust solution. Any ideas or suggestions? I've searched the entire PBI community on audit logs and it seems no one has mentioned this approach.
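For the runbook side, the loading step can be sketched roughly as below: gather rows from the accumulated CSVs and bulk-insert them into a staging table. The table and column names (`dbo.PbiAuditLog`, `Id`, `CreationTime`, `Operation`, `UserId`) and the pyodbc connection are assumptions for illustration, not the thread's actual schema:

```python
# Hedged sketch of a runbook-style load: parse the exported CSVs and
# bulk-insert them into an assumed staging table in the warehouse.
import csv
import glob

COLUMNS = ["Id", "CreationTime", "Operation", "UserId"]  # assumed schema

def rows_from_csvs(folder):
    """Collect rows (as tuples in COLUMNS order) from every CSV in a folder."""
    rows = []
    for path in sorted(glob.glob(f"{folder}/*.csv")):
        with open(path, newline="", encoding="utf-8") as f:
            for record in csv.DictReader(f):
                rows.append(tuple(record[c] for c in COLUMNS))
    return rows

def load_to_warehouse(folder, connection_string):
    """Insert all rows into the staging table via pyodbc (assumed driver)."""
    import pyodbc  # imported here so the parsing above works without a DB
    rows = rows_from_csvs(folder)
    with pyodbc.connect(connection_string) as conn:
        cur = conn.cursor()
        cur.fast_executemany = True  # batch the parameterized inserts
        cur.executemany(
            "INSERT INTO dbo.PbiAuditLog (Id, CreationTime, Operation, UserId) "
            "VALUES (?, ?, ?, ?)",
            rows,
        )
        conn.commit()
    return len(rows)
```

In practice you would add a high-water mark or a MERGE against the staging table so re-running the runbook doesn't insert the same events twice.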
Cheers
Since these are CSV files, it is more appropriate to push them into Dataflows. Putting them into a database just adds complexity.
So do you mean storing the CSVs on a shared drive somewhere and connecting to that drive via a Dataflow?
Yes, pretty much. Or, if you don't care about history, you can do the API calls directly in the Dataflow's Power Query (rather than with PowerShell).