
PowerRocky12
Helper I

On-Premises Incremental Refresh with Dataflows

If I apply the following model: Incremental refresh using PowerBI pro and Dataflows - Exceed, how would I account for new/changed records when I append the queries?

The way my data works is that when a vendor has a new purchase, a new record is generated. This is easy if it's a new vendor, but it gets complicated for existing vendors: if an existing vendor has a new purchase, it re-triggers and loads all of that vendor's past and new records into the new data set. Likewise, if a vendor's record changes, all of their records are re-loaded into the new data set.

How can I append my dataflows as in the link above, but also load new records and, where a vendor's record already exists in the old data, have the new records replace the old ones?
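If it helps, here is a rough Power Query (M) sketch of that "new replaces old" append, assuming two queries named OldVendors and NewVendors (illustrative names, not from the linked post) that both contain a VendorID column. Because the new file re-loads every record for a vendor whenever anything about that vendor changes, any vendor that shows up in the new file can simply supersede all of their rows in the old file:

let
    // Vendor IDs present in the new (weekly) file; these supersede the history
    NewVendorIDs = List.Buffer(List.Distinct(NewVendors[VendorID])),
    // Keep only the historical rows for vendors that do NOT reappear in the new file
    OldKept = Table.SelectRows(OldVendors, each not List.Contains(NewVendorIDs, [VendorID])),
    // Append what is left of the history to the latest rows from the new file
    Combined = Table.Combine({OldKept, NewVendors})
in
    Combined

Filtering the old rows out before the append (rather than de-duplicating afterwards) keeps every purchase row for a vendor, since the new file already contains that vendor's full history.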

3 REPLIES
amitchandak
Super User

@PowerRocky12 , one option is simply not to set up incremental refresh. If you do set it up, your date column should be updated for both new and existing vendors whenever the data changes.

Hey! Still trying to understand this. I see how incremental refresh will remove old records and replace them with new ones based on the sales date and last changed date: if the last changed date on the new record is different, it replaces the record with the old sales date. Where I'm getting stuck is that my data is separated into two files, one for old and one for new. So I created two dataflows, one for new and one for old, then set up incremental refresh on the new one and appended both queries in my data set. My new file is loaded weekly, so I have to store that data within Power BI, and the file then gets updated with new/changed data. I need some guidance on what parameters to set up the incremental load with, and on whether I'm setting up the dataflows correctly. Maybe I should create one dataflow by combining both files? I'm really stuck here.
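One detail that may help with the parameter question: when incremental refresh is configured on the data set, the query it applies to has to be filtered by the reserved RangeStart and RangeEnd date/time parameters, which Power BI then uses to build the partitions. A minimal sketch for the "New" query, assuming a SalesDate date/time column (illustrative name):

let
    Source = NewVendors,  // however the weekly file is actually loaded (assumption)
    // RangeStart/RangeEnd are the reserved incremental-refresh parameters;
    // use >= on one end and < on the other so boundary rows are never counted twice
    Filtered = Table.SelectRows(Source, each [SalesDate] >= RangeStart and [SalesDate] < RangeEnd)
in
    Filtered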

Thanks for the reply. The data in the new file is loaded weekly, and I need to get that data from my source (a local folder) and pull it into my "New" dataflow so it gets added to my appended table (New & Old). I don't have a refresh on my old dataflow, since it is historical data, but there is still a potential for records to change, so I will need the new data to replace the old when there is a change. Yes, I will try adding an incremental refresh. What details should I use for it? For detect data changes I am planning to use Last Changed Date; I'm just not sure about the "Store" and "Refresh" rows.
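Purely as an illustration of how those two settings relate to a weekly load (example values only, not from this thread): "Store rows" controls how much history Power BI keeps in the table, while "Refresh rows" controls how far back each scheduled refresh re-queries the source; "Detect data changes" then skips any partition in the refresh window whose maximum Last Changed Date has not moved.

Store rows in the last:    5 years   (whatever history the report needs)
Refresh rows in the last:  1 month   (wide enough to catch late changes to recent records)
Detect data changes:       Last Changed Date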
