NathanMoyle
Regular Visitor

Work around small API limits - 2000 API calls per hour for dotdigital

I am working on getting all of the activity data from dotdigital into Power BI, but I can't fetch much data because of the API limit of 2000 calls per hour. I have been able to get all email campaigns from dotdigital with 11 API calls (1 API call per 1000 records). These records include fields such as the ID, the subject, the address the email was sent from, and whether or not it is active. There is no date field.

 

I then need to get the activities related to the campaigns using the campaign ID. There are also thousands of activities related to most campaigns, which means that I can only get data that goes back a few days before hitting the limit.

 

We have Power BI Premium. Is there any way to accumulate all of the historical data over time so it can be stored on Microsoft servers?

1 ACCEPTED SOLUTION
ibarrau
Super User

Hi. We can talk about the architecture for keeping historical data. However, we can't say whether dotdigital will let you do it easily, because that's a third-party tool.

The best way to build an API history is to load the data into storage. You can create scripts in Azure Functions, Runbooks, or even use Azure Data Factory to get data from the API and insert it into storage. The storage is up to you: Data Lake Gen2, databases, etc.
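For example, here is a minimal sketch of such a script in Python, the kind you could run on a schedule in an Azure Function or Runbook. It assumes the azure-storage-file-datalake SDK for the storage side; the dotdigital host, endpoint path, paging parameters, and container name are placeholders you would need to confirm against dotdigital's API docs:

```python
import json
import os
from datetime import datetime, timezone

import requests
from azure.storage.filedatalake import DataLakeServiceClient

# Assumptions: region host, endpoint path, and select/skip paging
# are placeholders - check dotdigital's API documentation.
API_BASE = "https://r1-api.dotdigital.com/v2"
API_USER = os.environ["DOTDIGITAL_USER"]          # API user e-mail
API_PASS = os.environ["DOTDIGITAL_PASS"]          # API user password
ADLS_CONN = os.environ["ADLS_CONNECTION_STRING"]  # Data Lake Gen2 connection string

PAGE_SIZE = 1000  # the API returns up to 1000 records per call


def fetch_all(path: str) -> list[dict]:
    """Page through an endpoint until a short page signals the end.

    Assumes the endpoint returns a JSON array per page."""
    records, skip = [], 0
    while True:
        resp = requests.get(
            f"{API_BASE}{path}",
            params={"select": PAGE_SIZE, "skip": skip},
            auth=(API_USER, API_PASS),
            timeout=30,
        )
        resp.raise_for_status()
        page = resp.json()
        records.extend(page)
        if len(page) < PAGE_SIZE:
            return records
        skip += PAGE_SIZE


def save_to_lake(records: list[dict], folder: str) -> None:
    """Write one JSON file per run into a date-partitioned lake folder."""
    service = DataLakeServiceClient.from_connection_string(ADLS_CONN)
    fs = service.get_file_system_client("dotdigital")  # container name (placeholder)
    today = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    file = fs.get_file_client(f"{folder}/{today}.json")
    file.upload_data(json.dumps(records), overwrite=True)


if __name__ == "__main__":
    campaigns = fetch_all("/email-campaigns")  # endpoint path is an assumption
    save_to_lake(campaigns, "campaigns")
```

Run once per day, the campaigns pull above costs the same 11 calls you mention, leaving the rest of the hourly budget for activity data.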

That way you can append each run's results to the store, and once the backfill is done you only need the minimum number of requests per day to pick up the most recent data.
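Because the initial backfill will take many hours at 2000 calls per hour, it helps to throttle the script so it pauses instead of failing when the budget runs out. A small sketch (Python again; the fixed one-hour window is an assumption about how dotdigital resets its limit):

```python
import time

import requests

CALLS_PER_HOUR = 2000  # dotdigital's documented rate limit


class Throttle:
    """Count calls and sleep out the rest of the hour when the budget is spent."""

    def __init__(self, budget: int = CALLS_PER_HOUR):
        self.budget = budget
        self.calls = 0
        self.window_start = time.monotonic()

    def wait(self) -> None:
        if self.calls >= self.budget:
            elapsed = time.monotonic() - self.window_start
            if elapsed < 3600:
                time.sleep(3600 - elapsed)  # pause until the window resets
            self.calls, self.window_start = 0, time.monotonic()
        self.calls += 1


throttle = Throttle()


def get(url: str, **kwargs) -> requests.Response:
    """requests.get wrapper that never exceeds the hourly call budget."""
    throttle.wait()
    resp = requests.get(url, timeout=30, **kwargs)
    resp.raise_for_status()
    return resp
```

With something like this wrapping every request, the backfill can run unattended for as long as it needs, and the daily incremental job stays far below the budget. Power BI on Premium can then import straight from the lake or database instead of the API.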

I hope that helps,


If this post helps, then please consider Accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog
