
Share your thoughts on streaming dataflows aka SDF (preview feature)

Back at MBAS, our CVP Arun Ulag gave everybody a sneak peek of streaming dataflows, the new experience for real-time data preparation and consumption in Power BI. Later at BUILD, we showed everyone an extended demo of the current bits being tried by several of our customers as part of the private preview. And today, streaming dataflows is finally released to the world for public preview, with even more updates and a new UI.

 

We would love to hear your feedback and opinions to help us decide what comes next and how to improve this functionality in general. Thank you so much in advance for your feedback; we will be on the lookout to answer any questions you might have and to listen to any ideas you might come up with.

 

Cheers

The streaming dataflows team

Data for Everyone! We mean it
44 REPLIES
epresson
Frequent Visitor

Hello. When can we expect streaming dataflows to move out of preview? The feature isn't yet included in "Trusted Microsoft Services", so it won't work for us, since our firewall settings in Azure only allow trusted Microsoft services. I assume it will be "Trusted" once it moves out of preview status?

MrKrukauskas
Helper I

Hello, it doesn't seem that streaming dataflows work with items nested deeper than one level? For example, trying to pull the _env field does not produce an output. Any ideas?

(screenshot attached)
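
For context, "deeper than 1 level" means a payload shaped roughly like this (field names are illustrative), written here as a Power Query M record:

    [
        deviceId = "dev-01",
        body = [
            // _env sits two levels down - this is the kind of field that never comes through
            metadata = [ _env = "prod" ]
        ]
    ]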

 

jrnh123
New Member

Hi,

 

Neither of the time fields will appear in the input preview; you need to add them manually. Is this also true for the deviceId (if there is more than one device connected to the IoT Hub)? And how do you do that?

rhdezm
New Member

Hello, I'm using a streaming dataflow. My data source has a Record field type, but it is not possible to bring it into an output table. Is this normal?

That is by design. You can use 'Manage Fields' and flatten the record with that operator to see it in the output.
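
For intuition, the flatten is the same shape of transformation as expanding a record column in Power Query M; a minimal sketch with made-up names:

    let
        // Hypothetical input with a nested record column called "telemetry"
        Source = #table(
            {"deviceId", "telemetry"},
            {{"dev-01", [temperature = 21.5, humidity = 40]}}
        ),
        // Flattening pulls the record's fields up into top-level columns
        Flattened = Table.ExpandRecordColumn(
            Source, "telemetry",
            {"temperature", "humidity"},
            {"telemetry.temperature", "telemetry.humidity"}
        )
    in
        Flattened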

Oh my gosh, it's true! Many thanks.

Oliver2020
Helper I

Cannot see the cold table in Power BI Desktop, only the hot table (which works perfectly).
Is there any configuration? Retention is set to 7 days; could this be switched off in Power BI Admin?

> Cannot see the cold table in Power BI Desktop, only the hot table (which works perfectly).

Hi @Oliver2020, can you confirm that you're using the "Power Platform dataflows" connector and not "Power BI dataflows"? If yes, is there an error that you see, or does it take a long time to get data? Usually, loading cold entities takes a little while, since it pulls in all the historical data.

(screenshot attached)
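
The M that Power BI Desktop generates for that connector starts roughly like this (a sketch; the navigation keys and all names below are placeholders - let the Desktop navigator generate the real steps for your workspace, dataflow, and entity):

    let
        Source = PowerPlatform.Dataflows(null),
        // First navigation step exposed by the connector
        Workspaces = Source{[Id = "Workspaces"]}[Data],
        // Selection keys/names below are illustrative only
        Workspace = Workspaces{[workspaceName = "My Workspace"]}[Data],
        Dataflow = Workspace{[dataflowName = "My Streaming Dataflow"]}[Data],
        // Streaming dataflows expose both hot and cold entities; choose cold for history
        ColdEntity = Dataflow{[entity = "MyTable cold"]}[Data]
    in
        ColdEntity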

 

> Retention is set to 7 days; could this be switched off in Power BI Admin?

Unfortunately no; the maximum retention policy allowed is 7 days (which only pertains to hot entities). You should still be able to tap into historical data past that 7-day window using cold entities.

Anonymous
Not applicable

@Shivan, @MiguelMartinez, @Eklavya - I do not see the option for Power Platform Dataflows; is there a preview feature I am missing?

 

(screenshot attached)

 

It has now been renamed to "Dataflows"

 

(screenshot attached)

 

Anonymous
Not applicable

Someone needs to update their documentation in that case 😂

 

(screenshot attached)

 

Oliver2020
Helper I

Hi!
I receive all message fields from the IoT Hub, but not internal attributes such as "iothub-enqueuedtime"!
So you do not have any timestamp, and the data is useless (when the timestamp is not in the message)!
With Stream Analytics there is no problem with this, because all the IoT columns are provided.
Oliver

OK, figured it out: you have to manually add these fields:
EventProcessedUtcTime
EventEnqueuedUtcTime

 

The preview does not show their content, but they work in the output table.
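
Once added, those fields come through like any other column downstream; for example, in Power BI Desktop you can type them as datetimes. A minimal sketch (the inline #table stands in for the entity loaded via the dataflows connector):

    let
        // Stand-in for the hot entity; in practice this comes from the dataflows connector
        HotEntity = #table(
            {"deviceId", "EventEnqueuedUtcTime"},
            {{"dev-01", "2021-08-18T15:30:00Z"}}
        ),
        // Type the manually added system field so it can drive a time axis
        Typed = Table.TransformColumnTypes(
            HotEntity,
            {{"EventEnqueuedUtcTime", type datetimezone}}
        )
    in
        Typed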

AndyDDC
Solution Sage

Hi,

 

I'm trying to work out the "hot" and "cold" storage process. I have a streaming dataflow whose Retention Duration, in the dataflow settings tab, is set to 1 day. I have run that streaming dataflow and can see data in the hot table when connected via Power BI Desktop. However, after several days I can certainly see the same data in the cold storage table, but the data also remains in the hot storage table. I have tried stopping/resuming the streaming dataflow, but the data remains in the hot table - I would have thought it would no longer be visible after being moved to cold storage?

Hi Andy,

 

The idea is slightly different:
Hot - Keeps data only for the retention duration. While refreshing, data is actively dropped from Hot entities as it moves out of retention.

 

Cold - Data is always kept in cold, regardless of retention.

 

Data is never moved from hot to cold.

Hi,

 

The process still isn't very clear to me. 

So does a message always exist in Cold storage no matter the retention period?

 

What defines the hot retention period? I would not expect to still see data in the hot table several days after the "1 Day" retention period setting in the Dataflow.

 

thanks

> So does a message always exist in Cold storage no matter the retention period?

Yes.

 

> I would not expect to still see data in the hot table several days after the "1 Day" retention period setting in the Dataflow.

Dataflows enforce the retention policy only while a refresh is running. So if your refresh is not running, you will still see data that is out of retention.
Do you have a use case where you need the data to be dropped out of retention even when a refresh is not running?
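
If anyone does need out-of-retention rows hidden even when no refresh has run, a client-side filter in Power Query can approximate the policy. A sketch, assuming a 1-day window and that EventEnqueuedUtcTime was added to the entity (the inline #table stands in for the hot table):

    let
        // Stand-in for the hot entity loaded via the dataflows connector
        HotEntity = #table(
            type table [deviceId = text, EventEnqueuedUtcTime = datetimezone],
            {{"dev-01", #datetimezone(2021, 8, 18, 15, 30, 0, 0, 0)}}
        ),
        // Keep only rows from the last 24 hours, mimicking a 1-day retention policy
        InRetention = Table.SelectRows(
            HotEntity,
            each [EventEnqueuedUtcTime] >= DateTimeZone.UtcNow() - #duration(1, 0, 0, 0)
        )
    in
        InRetention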

 

Hi,

 

OK, I understand now after testing. I see the same data in both hot and cold storage when I set up a new streaming dataflow.

 

I have started the older streaming dataflow, and after a few minutes I now see significantly fewer rows in the hot table - though there are still around 90 rows (which is strange).

 

I don't have a specific use case, but I do think the documentation could be clearer about the differences between hot and cold and the conditions under which data is retained past the retention point (e.g. when the dataflow is not running).

 

thanks for the responses, much appreciated

markusr
Frequent Visitor

Hi Miguel, I'm just evaluating SDF in a premium per user workspace. I was able to create the dataflow based on IoT Hub, start it, create a report on top of it in Power BI Desktop, and the report refreshes automatically after I configured change detection. Great! Now when I try to publish this report to exactly the same (ppu) workspace, publishing never finishes. Same applies when I try to upload the pbix file to the workspace: "circle of death" forever, and no error messages. I can easily upload to other non-premium workspaces, but of course I lose the streaming functionality. Do you have any hints for me...?

OK, I have to modify my post a little; after all, this is preview cloud software, and it's changing fast! Today, publishing the report works fine. But when I try to access the report via the browser in my PPU workspace, the error message is "The data source Extension is missing credentials and cannot be accessed". I was able to publish there, and I do have a Premium Per User license. What can I do?
