nmasimore
Helper I

Streaming/Push Dataset Question

Hello

 

I'm new to streaming/push datasets.

 

I want to build a real-time dashboard displaying insights from an Excel table that is constantly being modified by a data entry person (I can't use a regular import-mode PBI model because the scheduled refreshes take too long).


I created a streaming/push dataset with an API. Historical data analysis is turned on. I have an Excel table that a data entry person adds data to. When they are done adding the data to the table, they refresh the workbook, which triggers a Power Query that pushes the data in the Excel table to the Push URL using Web.Contents.

 

If, for some reason, I go into the Excel table and change one of the values in one of the cells (say, the price of an apple or the price of a banana... screenshots below), something happens that I don't understand. The row appears twice in the push dataset, with one record for the old price and another record for the new price. I don't want the duplicate values.

 

This is what I'm seeing in the push dataset, as displayed in a PBI dashboard:

 

[Screenshot: nmasimore_1-1663359528731.png]

 

This is what the data source Excel table looks like after I made the price changes:

[Screenshot: nmasimore_2-1663359616218.png]

 

This is the Power Query (M) code for pushing data to the Push URL. This code is run from the Excel data source workbook:

 

[Screenshots: nmasimore_3-1663359695871.png, nmasimore_4-1663359719796.png]
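For reference, here is a minimal sketch of the general shape of such a query (not the actual code from the screenshots above), assuming a hypothetical table named Prices and placeholder workspace/dataset IDs and key:

let
    // Hypothetical table name and placeholder Push URL - replace with your own values
    Source  = Excel.CurrentWorkbook(){[Name = "Prices"]}[Content],
    PushUrl = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<key>",
    // A table serializes to a JSON array of row objects, which is the payload the Push URL accepts
    Payload = Json.FromValue(Source),
    // Supplying Content makes Web.Contents send a POST instead of a GET
    Response = Web.Contents(PushUrl,
        [Headers = [#"Content-Type" = "application/json"],
         Content = Payload])
in
    Text.FromBinary(Response)

Note that every time a query like this runs, it appends whatever rows are in the payload to the push dataset; existing rows are never updated or replaced.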

 

 

Any suggestions or insights? Thanks (:

 

 


4 REPLIES
nmasimore
Helper I

Thanks @lbendlin. Can you explain a bit more what API-based regular dataset refreshes are? This may be the right route for me, but a little more information on that would be super helpful.

 

Thank you!

When you have an import mode dataset you can schedule refreshes (up to 48 per day on Premium capacity, and up to 8 per day if not). But these schedules may not coincide with your business process events. The better option is to initiate a refresh when the source data has changed, based on some kind of trigger: once the event happens (source data changed), you issue an API call to request a dataset refresh. Once the dataset has refreshed successfully, your dependent dashboards will update automatically.

For dependent reports you will need to find a different solution (DirectQuery with automatic page refresh, for example).
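As a rough illustration of the shape of that API call, here is a minimal sketch written as an M query with a placeholder dataset ID and Azure AD access token (in practice this is usually issued from Power Automate, PowerShell, or a small script rather than from Power Query). It calls the Power BI REST API's Refresh Dataset endpoint:

let
    // Placeholder values - substitute a real dataset ID and an Azure AD access token
    // that has permission to refresh the dataset
    DatasetId  = "<dataset-id>",
    Token      = "<access-token>",
    RefreshUrl = "https://api.powerbi.com/v1.0/myorg/datasets/" & DatasetId & "/refreshes",
    // Supplying Content makes Web.Contents send a POST, which queues a refresh of the dataset
    Response   = Web.Contents(RefreshUrl,
        [Headers = [#"Authorization" = "Bearer " & Token,
                    #"Content-Type"  = "application/json"],
         Content = Text.ToBinary("{}")])
in
    Text.FromBinary(Response)

The request only queues the refresh and returns quickly; dependent dashboards update once the refresh itself completes successfully.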

Oh I see, thank you @lbendlin!

lbendlin
Super User

Push datasets have that name for a reason.  You can only push into the dataset. You cannot modify data that you have pushed. You cannot (at the moment) remove individual pushed rows.  You can only clear out the entire dataset and start over.

 

Push datasets are not suitable for your scenario. Consider using a SQL Server dataset in DirectQuery mode or API-based regular dataset refreshes.
