
silcambro
Frequent Visitor

Refresh on Dataflows using your own Azure Data Lake

Hello,
 
I have a very simple question: if I set up a Dataflow using my own Data Lake, do I need to schedule the refresh?
 
To be more specific: if I create a Dataflow with the following option:
 
Attach a Common Data Model folder (preview)
 
is there still a need to refresh the Dataflow? Or can I just change the CDM folder, and the Dataflow will reflect the changes?
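For background (a hedged sketch, not from this thread): a CDM folder attached this way is described by a model.json manifest that lists the entities and the data files (partitions) behind them. A minimal illustrative example, where the model name, entity, attributes, and storage paths are all hypothetical:

```json
{
  "name": "SalesModel",
  "version": "1.0",
  "entities": [
    {
      "$type": "LocalEntity",
      "name": "Customer",
      "attributes": [
        { "name": "customerId", "dataType": "string" },
        { "name": "customerName", "dataType": "string" }
      ],
      "partitions": [
        {
          "name": "Customer-part-1",
          "location": "https://<storage-account>.dfs.core.windows.net/powerbi/SalesModel/Customer/part-1.csv"
        }
      ]
    }
  ]
}
```

Tools like Azure Data Factory can rewrite the partition files and this manifest; the question above is whether Power BI picks those changes up automatically.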
 
Thank you so much

V-pazhen-msft
Community Support

@silcambro 
Yes, you need to set up a scheduled refresh. The scheduled refresh picks up changes from the data sources that are added to the dataflow. Attaching a Common Data Model folder as a dataflow only changes the entities defined in the dataflow; you still need to refresh to load the data.
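Besides the scheduled refresh described above, a refresh can also be triggered programmatically through the Power BI REST API's "Dataflows - Refresh Dataflow" endpoint. A minimal sketch, assuming you already have an Azure AD access token with the Dataflow.ReadWrite.All scope; the workspace ID, dataflow ID, and token below are placeholders:

```python
# Hedged sketch: build (and optionally send) a Power BI dataflow refresh request.
# Group/dataflow IDs and the bearer token are placeholders, not real values.
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def dataflow_refresh_url(group_id: str, dataflow_id: str) -> str:
    """Build the documented refresh endpoint URL for a dataflow in a workspace."""
    return f"{API_BASE}/groups/{group_id}/dataflows/{dataflow_id}/refreshes"

def build_refresh_request(group_id: str, dataflow_id: str, token: str) -> urllib.request.Request:
    """Prepare (but do not send) the POST that starts a dataflow refresh."""
    body = json.dumps({"notifyOption": "MailOnFailure"}).encode()
    return urllib.request.Request(
        dataflow_refresh_url(group_id, dataflow_id),
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# To actually trigger the refresh, send the prepared request:
#   urllib.request.urlopen(build_refresh_request(group_id, dataflow_id, token))
```

This is only an illustration of the API shape; in practice most users rely on the scheduled refresh configured in the Power BI service.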


Paul Zheng _ Community Support Team
If this post helps, please consider accepting it as the solution to help other members find it more quickly.

Thank you for your reply!

 

I will add files to my CDM folders and change the data through Azure Data Factory.

 

Why do I need to refresh? Will the refresh copy my CDM folders' data to the Azure Data Lake managed by Microsoft?

 

Thank you so much
