Anonymous
Not applicable

Incremental loading rolling 24 hours on database update time

Hi, is it possible to keep data for a rolling 24 hours and load new data every two hours (as the relevant old data is removed from the service)?

 

One of our APIs calls one of our internal systems on a roughly 2-hour cycle, extracting the most relevant information for about 10% of our requests (e.g. the next 200 combinations of city pairs out of a ranked total of 2,000) and places the results in an Oracle database (200 city pairs produce around 12 million records).

 

Our Oracle database is emptied every day at midnight, meaning we don't get the vast majority of the city pairs until later in the day when everyone has gone home (because the API starts at city pair 1 and works down the list).

 

I want to keep the data for, say, 22 hours and load the new data as it comes in using incremental loading, dropping off the previous day's city pairs 1-200 when today's pull of city pairs 1-200 is added.

 

How do I go about this?

Sorry if that is a bit confusing; please let me know what more information you need to help me.

 

Michael

10 REPLIES
GilbertQ
Super User

Hi there

With Power BI Pro you only get 8 scheduled data refreshes a day, so refreshing every 2 hours would cover only 16 hours of the day.

I would then suggest that you read the data from your Oracle table into Power BI, which will refresh the entire dataset every time.

Currently, incremental refresh is only available in Power BI Premium, and based on your data volumes I think you could simply refresh the entire dataset on every load.




Did I answer your question? Mark my post as a solution!

Proud to be a Super User!







Power BI Blog

Anonymous
Not applicable

@GilbertQ if I just load the entire table every time, then the first load of the next day will erase all of the previous day's records.

This means the last ~1,800 city-pair records (from the last ~22 hours) will be removed from the Power BI dataset when I load the new data (from the Oracle table containing the first run of 200 city pairs).

Hi @Anonymous,

 

You can create a dynamic filter based on UTC time. You can enter this formula in a blank query:

 

= DateTimeZone.SwitchZone(DateTimeZone.FixedUtcNow(), +8) - #duration(0, 22, 0, 0)

This returns the datetime 22 hours ago. You can name it something like DateTime22HoursAgo and use it as a datetime filter in your query. Don't forget to change +8 to your own timezone offset.
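The M formula runs inside Power Query, but the same cutoff logic can be sketched in Python for clarity. This is purely illustrative; the function name and the +8 offset are placeholders, just as in the M example, and are not part of any Power BI API:

```python
from datetime import datetime, timedelta, timezone

TZ_OFFSET_HOURS = 8  # change to your own timezone offset, as in the M formula

def cutoff_22_hours_ago(now_utc=None):
    """Return the datetime 22 hours before now, expressed in a fixed UTC offset.

    Mirrors: DateTimeZone.SwitchZone(DateTimeZone.FixedUtcNow(), +8)
             - #duration(0, 22, 0, 0)
    """
    if now_utc is None:
        now_utc = datetime.now(timezone.utc)
    # Shift the current UTC instant into the fixed local offset...
    local_now = now_utc.astimezone(timezone(timedelta(hours=TZ_OFFSET_HOURS)))
    # ...then step back 22 hours to get the filter cutoff.
    return local_now - timedelta(hours=22)
```

Rows whose load timestamp is on or after this cutoff are kept; anything older is filtered out of the query.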

"Tell me and I’ll forget; show me and I may remember; involve me and I’ll understand."
Need Power BI consultation, get in touch with me on LinkedIn or hire me on UpWork.
Learn with me on YouTube @DAXJutsu or follow my page on Facebook @DAXJutsuPBI.
Anonymous
Not applicable

@danextian 

That is when importing, correct? If I just load the entire table every time, then the first load of the next day will erase all of the previous day's records.

 

This means the last ~1,800 city-pair records (from the last ~22 hours) will be removed from the Power BI dataset when I load the new data (from the Oracle table containing the first run of 200 city pairs).

That is correct. If your source table only holds the first run of 200 city pairs, the filter above is applied only after the data is loaded, so the net result is that the dataset will still contain only those 200 city pairs.
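That limitation can be sketched as follows (a hypothetical illustration, not Power BI behavior code): with a full refresh, the filter runs against whatever the source currently holds, so it cannot recover rows Oracle has already deleted.

```python
from datetime import datetime

def full_load_then_filter(source_rows, cutoff):
    """Simulate a full refresh: replace everything, then apply the date filter."""
    loaded = list(source_rows)                      # full refresh replaces the dataset
    return [r for r in loaded if r[0] >= cutoff]    # filter applies after the load

# Oracle was emptied at midnight; only today's first 200 city pairs are back.
source = [(datetime(2024, 4, 1, 2), f"pair-{i}") for i in range(200)]

# Even a generous 22-hour cutoff cannot return yesterday's ~1,800 pairs,
# because they no longer exist in the source.
kept = full_load_then_filter(source, datetime(2024, 3, 31, 4))
```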


Anonymous
Not applicable

@GilbertQ right, so not really a solution to the issue, then.

Hi @Anonymous

 

I think @danextian was assuming that there was data for longer than 22 hours in your source table.


@GilbertQ is it possible to call on a variable storing LastRunTime? It could then be stored separately and act as a date filter for the next incremental update.

Hi there,

Yes, you could store LastRunTime as a variable, which could be the MAX datetime from your dataset, so you know where to start next time.
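The high-water-mark idea above can be sketched like this (a hypothetical illustration; in practice LastRunTime would be persisted in Oracle or a parameter, not in Python):

```python
from datetime import datetime

def incremental_rows(rows, last_run_time):
    """rows: list of (load_time, payload) tuples.

    Return only the rows newer than last_run_time, plus the new
    high-water mark to persist for the next incremental run.
    """
    new_rows = [r for r in rows if r[0] > last_run_time]
    new_mark = max((r[0] for r in new_rows), default=last_run_time)
    return new_rows, new_mark
```

Each run filters on the stored mark and then advances it to the MAX load time it just saw, so no batch is loaded twice.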


Hi there, OK, I now understand what you are saying.

Currently Power BI will refresh all the data from a table.

So what I would suggest is keeping a rolling 24 hours of data in another table in Oracle. Power BI can then simply look at this 24-hour rolling table and refresh off that.

That is how I would get it done.
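The rolling-table maintenance would really be done in Oracle (append each ~2-hour batch, then delete rows older than 24 hours), but the logic can be sketched in Python. Table layout and names here are assumptions for illustration only:

```python
from datetime import datetime, timedelta

def apply_batch(table, batch, now, window=timedelta(hours=24)):
    """table/batch: lists of (load_time, city_pair) rows.

    Append the new batch, then drop anything older than the rolling
    window so Power BI always sees roughly one day's worth of data.
    """
    merged = table + batch
    cutoff = now - window
    return [row for row in merged if row[0] >= cutoff]
```

Power BI then does a plain full refresh against this table; the rolling window is enforced upstream, so the midnight purge of the main table no longer matters.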

