Hi
I have a SharePoint Folder which stores CSV timesheet data.
I need to store all this data in Fabric at a client's request using a Pipeline/Dataflow.
Is it possible to use a Dataflow to automatically pick up the data in the SharePoint folder, merge the individual sheets from all the CSVs within the folder, and output the merged table in Fabric?
I wanted to use a data pipeline in Fabric, but it cannot connect to SharePoint folder data.
Is the automation process mentioned above possible? Or will I need to use Power Automate in SharePoint to merge the files first, then use a Dataflow to pull the pre-merged CSV into Fabric?
Thanks
I have used dataflows to join SharePoint folder data; the SharePoint folder connector works fine there. One of the columns available to filter on in the query is the path, so you can use that column to limit by specific folder before expanding the file data.
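The steps above (list the folder contents, filter on the path column, then expand and combine the files) can be sketched outside Power Query as well. Here is a minimal Python illustration of the same filter-then-merge logic; the folder paths and CSV contents are hypothetical stand-ins for what the SharePoint folder connector would return:

```python
# Sketch of the dataflow logic: list files, filter on the folder path,
# then append all the matching CSVs into one merged table.
# The "Timesheets" folder and file contents below are made-up examples.
import csv
import io

# Stand-in for the SharePoint folder listing: (folder path, file content).
files = [
    ("Shared Documents/Timesheets/", "name,hours\nAlice,8\nBob,6\n"),
    ("Shared Documents/Timesheets/", "name,hours\nCara,7\n"),
    ("Shared Documents/Archive/",    "name,hours\nOld,1\n"),
]

# Step 1: limit by the path column, as in the dataflow query.
timesheets = [content for path, content in files
              if path.startswith("Shared Documents/Timesheets/")]

# Step 2: expand each CSV and append its rows into one merged table.
merged = []
for content in timesheets:
    merged.extend(csv.DictReader(io.StringIO(content)))

print(merged)  # rows from the Archive folder are excluded
```

In the dataflow itself the same two steps are a filter on the `Folder Path` column followed by the built-in Combine Files operation.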
This worked perfectly thanks!
Okay, so I would filter the lookup to the files in the specified folder, then expand and combine them all in the dataflow, instead of expanding just a single dataset?
I understand what you're explaining; I'll test it and then update you!
Thanks for your time.
Yeah, dataflows work exactly the same as the desktop editor; they're just more reusable. You can expand the files in the dataflow and connect to the resulting table wherever you want to reuse the transformations, instead of connecting to and expanding the files in every pbix that needs them. That way, when you change something, you don't have to go back and update the query in every pbix.
Hi @Housden996,
When you say they want to store it in "Fabric", you need to be a bit clearer about where you are storing it. Technically, Power BI is now part of Fabric, so storing it in a Power BI Dataflow counts as storing it in Fabric, though not in OneLake or a Lakehouse.
I would use the dataflow to pull all the data together, then pick up the result set and land it in whichever Fabric location you need.
Apologies, this is in a Premium workspace.
The plan is to extract all of the data from the SharePoint folder, merge it, then pull it into the reports via the dataflow connector.