mmossel
Employee

What database to create for storing activity logs >90 days?

I'm aware of the PowerShell commands to retrieve activity logs, but I can't find any best practice on where to store them (as JSON, right?). Because Microsoft only retains them for up to 90 days, I'm wondering what we should advise customers.

1 ACCEPTED SOLUTION
v-cazheng-msft
Community Support

Hi @mmossel 

You can consider Azure Blob Storage. For more details, you can refer to these links:

 

 

Best Regards

Caiyun Zheng

 

Is that the answer you're looking for? If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
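As an illustration (not part of the original reply), here is a minimal PowerShell sketch of landing one day's activity log as JSON in Blob Storage. It assumes the MicrosoftPowerBIMgmt and Az.Storage modules are installed and a Power BI admin account; the storage account name, key variable, and container name are placeholders.

```powershell
# Minimal sketch: pull yesterday's activity events and land them as JSON
# in Azure Blob Storage. Account/container names below are placeholders.
Connect-PowerBIServiceAccount

$day  = (Get-Date).AddDays(-1).Date
$json = Get-PowerBIActivityEvent `
    -StartDateTime $day.ToString("yyyy-MM-ddTHH:mm:ss") `
    -EndDateTime $day.AddDays(1).AddSeconds(-1).ToString("yyyy-MM-ddTHH:mm:ss")

# One file per day preserves history past the 90-day retention window
$file = "ActivityLog_$($day.ToString('yyyyMMdd')).json"
$json | Out-File -FilePath $file -Encoding utf8

$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" `
    -StorageAccountKey $env:STORAGE_KEY
Set-AzStorageBlobContent -File $file -Container "activitylogs" -Blob $file -Context $ctx
```

Scheduled daily (for example via Azure Automation), this keeps a JSON copy of each day before Microsoft's retention window expires.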

 


4 REPLIES

ibarrau
Super User

Hi. Every time I consider this, I think about saving the files as .csv in Azure Data Lake Storage Gen2. That way you can read all the files as one dataset with many tools, because .csv is very friendly in the Microsoft ecosystem. It also has the advantage of being cheaper than a database.

I'm sure PowerShell can save CSV files. Nowadays you can use any programming language, because the activity log is exposed as a REST API.
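For example, a minimal sketch of pulling one day of events with the MicrosoftPowerBIMgmt module and flattening them to .csv for the lake; the date range and file name here are illustrative assumptions, not something prescribed in this thread:

```powershell
# Minimal sketch: export one day of activity events to CSV.
Connect-PowerBIServiceAccount

$day   = (Get-Date).AddDays(-1).Date
$start = $day.ToString("yyyy-MM-ddTHH:mm:ss")
$end   = $day.AddDays(1).AddSeconds(-1).ToString("yyyy-MM-ddTHH:mm:ss")

# The cmdlet returns a JSON string covering at most one UTC day
$events = Get-PowerBIActivityEvent -StartDateTime $start -EndDateTime $end |
    ConvertFrom-Json

# Flatten to CSV so all the daily files can later be read as one dataset
$events | Export-Csv -Path "ActivityLog_$($day.ToString('yyyyMMdd')).csv" -NoTypeInformation
```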

I hope that helps,


If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog

mmossel
Employee

Hi @ibarrau, please don't consider cost in this case; the goal is to automate this in a large organization.

 

I read that one option is to use an orchestration tool like Azure Data Factory to iterate over the historical activity logs, store the log files in Azure Data Lake Storage, and then incrementally update a Power BI data model for reporting and analysis.

 

What would other options be?

ibarrau
Super User

Even setting cost aside, the best approach is a data lake for storing the files. I wouldn't use Data Factory to iterate, though; I would try Azure Functions, because the API is paginated and handling that in Data Factory or Power Query might be an unnecessary pain. If you then want to orchestrate the daily script with Data Factory, that's fine. A sketch of that pagination loop is below.
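For what it's worth, a minimal sketch of the pagination loop, assuming an authenticated MicrosoftPowerBIMgmt session; an Azure Function in another language would follow the same continuationUri pattern, and the dates are placeholders:

```powershell
# Minimal sketch: walk the paginated Activity Events REST API.
Connect-PowerBIServiceAccount

$url = "admin/activityevents?startDateTime='2024-03-01T00:00:00'" +
       "&endDateTime='2024-03-01T23:59:59'"
$all = @()

do {
    # Each call returns one page of events plus a continuationUri
    $page = Invoke-PowerBIRestMethod -Url $url -Method Get | ConvertFrom-Json
    $all += $page.activityEventEntities
    $url  = $page.continuationUri
} while ($null -ne $url)

# $all now holds the full day's events, ready to write to the lake
```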

I hope that helps.


If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog
