
ebjim
Helper IV

Bizarre and long-running DF refreshing

ebjim_0-1692970435735.png

I'd like to know from other users whether you have experienced anything similar to what the screenshot shows.

 

- The first attempt (success) occurred after I made some changes to the dataflow and lakehouse table.

- The second attempt was triggered by running the DF (no mods) in a data pipeline. 

- The third attempt was manually triggered. DF is unchanged.

- The fourth attempt was manually triggered. DF is unchanged.

 

The situation is downright ridiculous, with no way to terminate refreshes and no way of knowing how long it's going to keep going in an endless spiral.
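For what it's worth, the Power BI REST API does expose a cancel endpoint for Gen1 dataflow refresh transactions ("Dataflows - Cancel Dataflow Transaction"); whether it works against Gen2 dataflows in Fabric is uncertain. A minimal sketch, assuming that endpoint and with placeholder workspace/transaction IDs and token:

```python
# Hedged sketch: cancel a running dataflow refresh via the Power BI REST API.
# This endpoint is documented for Gen1 dataflows; Gen2 coverage is uncertain.
# group_id, transaction_id, and token below are placeholders, not real values.
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def cancel_url(group_id: str, transaction_id: str) -> str:
    # Transaction IDs come from the "Get Dataflow Transactions" endpoint;
    # cancelling a transaction stops the corresponding refresh.
    return f"{API_BASE}/groups/{group_id}/dataflows/transactions/{transaction_id}/cancel"

def cancel_refresh(group_id: str, transaction_id: str, token: str) -> int:
    """POST the cancel request; returns the HTTP status code."""
    req = urllib.request.Request(
        cancel_url(group_id, transaction_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # raises on HTTP errors
        return resp.status
```

If this only covers Gen1, the practical fallback is still a support ticket, but it may be worth a try before waiting out an 8-hour hang.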

7 REPLIES
BryanCarmichael
Advocate I

Hi - we invested a lot of time in porting very stable Gen1 dataflows over to Gen2 dataflows, as the concept is great; however, they are just not very stable at the moment, and the performance is poor at best.

 

We have architected a workaround by using our old Gen1 dataflows to load into datamarts, then using the Copy activity in data pipelines to move the data into our Warehouse.

 

If you are interested in knowing more, let me know.

@BryanCarmichael I appreciate your kind offer. Right now, I have created a plan B using Azure, which I know will work. So, for now, I would say that I can move forward in some way, but I'll let you know if I need to take a look at your Gen1 solution. Thank you!

ebjim
Helper IV

Through a support ticket, I found out a column in a source table contained an unsupported data type. The surprising part is that this table had nothing to do with the dataflow. It got me thinking:

 

1. Does the dataflow refresh look at all source tables in the lakehouse, even those not used?

2. How much data can a dataflow handle? In terms of MB/GB? In terms of the final number of rows and columns?

DataPne
Frequent Visitor

@ebjim I am having the exact same issue, with me being the only one using the capacity and having only one workflow running. Would love to hear any fixes you come up with 🙂

@HimanshuS-msft - could this be a compute issue? Sometimes when I have a flow running, it gets locked for 8 hours as well, and then I am also unable to work with any other resources in Fabric.

@DataPne I have no fixes to offer up. Since we as users cannot even terminate a refresh, complaining to MSFT is the only recourse I can think of. Because we get errors that just seem 'way out there', I suspect there are memory leaks or caching problems behind the scenes that have yet to be addressed. When I was using Azure, nothing of this sort happened.

ebjim
Helper IV

@HimanshuS-msft Thank you for your feedback. As a trial user, I am aware of the limits of the assigned resources. That is why I only refresh one DF at a time. What I find troubling is that there seems to be a point of complexity within a DF beyond which refreshing totally breaks down (like a cliff dropoff). Oftentimes, I am the only one using the trial account, so it's not a high-load situation. I would still encourage the Fabric group at Microsoft to resolve such issues.

HimanshuS-msft
Community Support

Hello @ebjim 
Thanks for using the Fabric community.

I am in agreement with you that it does not look right. The same DF takes 8 hours and then fails, while in the best-case scenario it completes in 4 minutes. I think we need to check whether there are other workloads running that are consuming the capacity units; please do read this.

If it's not related to CU issues, I would suggest you work with MS support, as they can dig deeper into this.
Thanks
Himanshu
