NathanMoyle
Regular Visitor

Work around small API limits - 2000 API calls per hour for dotdigital

I am working on getting all of the activity data from dotdigital into Power BI, but I can't fetch much data because of the API limit of 2000 calls per hour. I have been able to get all email campaigns from dotdigital with 11 API calls (1 API call per 1000 records). These records have fields such as the ID, the subject, the address the email was sent from, and whether or not it is active. There is no date field.
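For context, my campaign fetch looks roughly like this minimal sketch (the region host and credentials are placeholders, and the select/skip paging parameters should be checked against dotdigital's v2 REST docs):

```python
import requests

# Placeholder host and API-user credentials -- substitute your own.
BASE_URL = "https://r1-api.dotdigital.com/v2"
AUTH = ("apiuser-xxxx@apiconnector.com", "password")

def fetch_all_campaigns(page_size=1000):
    """Page through /campaigns with select/skip until a short page signals the end."""
    campaigns, skip = [], 0
    while True:
        resp = requests.get(
            f"{BASE_URL}/campaigns",
            params={"select": page_size, "skip": skip},
            auth=AUTH,
        )
        resp.raise_for_status()
        page = resp.json()
        campaigns.extend(page)
        if len(page) < page_size:  # last page reached
            return campaigns
        skip += page_size          # each extra page costs one API call
```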

 

I then need to get the activities related to the campaigns using the campaign ID. There are also thousands of activities for most campaigns, which means that I can only get data that goes back a few days.
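To stay under the cap I throttle the per-campaign activity calls, roughly like this (again a sketch; the activities sub-resource path is an assumption to verify against the docs):

```python
import time
import requests

BASE_URL = "https://r1-api.dotdigital.com/v2"  # placeholder region host
AUTH = ("apiuser-xxxx@apiconnector.com", "password")

CALLS_PER_HOUR = 2000
MIN_INTERVAL = 3600 / CALLS_PER_HOUR  # ~1.8 s between calls stays under the cap

def fetch_campaign_activities(campaign_id, page_size=1000):
    """Page through one campaign's activities without breaching the hourly limit."""
    activities, skip = [], 0
    while True:
        resp = requests.get(
            f"{BASE_URL}/campaigns/{campaign_id}/activities",
            params={"select": page_size, "skip": skip},
            auth=AUTH,
        )
        resp.raise_for_status()
        page = resp.json()
        activities.extend(page)
        if len(page) < page_size:  # short page means we've reached the end
            return activities
        skip += page_size
        time.sleep(MIN_INTERVAL)   # throttle each extra page request
```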

 

We have Power BI Premium; is there any way to collect all of the historical data over time so it can be stored on Microsoft servers?

1 ACCEPTED SOLUTION
ibarrau
Super User

Hi. We can talk about the architecture for accumulating historical data. However, we can't say whether dotdigital will let you do it easily, since that's a third-party tool.

The best way to build up historical API data is to load it into storage. You can create scripts in Azure Functions or Runbooks, or even use Azure Data Factory, to get data from the API and insert it into storage. The choice of storage is up to you: Data Lake Gen2, databases, etc.
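For example, a minimal sketch of the landing step using the azure-storage-file-datalake SDK (the connection string, file system name, and folder layout are all illustrative):

```python
import json
from datetime import date
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder connection details -- replace with your storage account's values.
service = DataLakeServiceClient.from_connection_string("<adls-connection-string>")
fs = service.get_file_system_client("dotdigital")

def write_daily_snapshot(records, entity="activities"):
    """Land one JSON file per day, partitioned by entity and run date."""
    path = f"{entity}/{date.today().isoformat()}.json"
    data = json.dumps(records).encode("utf-8")
    fs.get_file_client(path).upload_data(data, overwrite=True)
```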

That way you store each day's results, and over time you only need to run the minimum number of requests per day to append the most recent data to the store.
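One way to implement that "minimum requests per day" pattern is a watermark: persist the timestamp of the last successful run and, on the next schedule, request only newer activity (dotdigital documents "since date" variants for some activity endpoints, but verify which apply to your account). A sketch, with all names and paths illustrative:

```python
from datetime import datetime, timezone
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient.from_connection_string("<adls-connection-string>")
fs = service.get_file_system_client("dotdigital")

WATERMARK_PATH = "state/last_run.txt"  # tiny state file kept next to the data

def load_watermark():
    """Return the timestamp of the last successful run, or a backfill default."""
    try:
        raw = fs.get_file_client(WATERMARK_PATH).download_file().readall()
        return raw.decode("utf-8").strip()
    except Exception:
        return "2020-01-01T00:00:00"  # first run: start the backfill here

def save_watermark():
    """Advance the watermark after a successful load."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")
    fs.get_file_client(WATERMARK_PATH).upload_data(now.encode("utf-8"), overwrite=True)
```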

I hope that helps,


If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog


