
Housden996
Helper I

Look up a SharePoint folder, merge all the CSV data located there, then pull it into Fabric.

Hi,

 

I have a SharePoint Folder which stores CSV timesheet data.

 

I need to store all this data in Fabric at a client's request using a Pipeline/Dataflow.

 

Is it possible to use a Dataflow to automatically pick up the data in the SharePoint folder, merge the individual sheets from all the CSVs within the folder, and output the merged table in Fabric?

 

I wanted to use a data pipeline in Fabric, but pipelines cannot pull in or connect to SharePoint folder data.

 

Is the automation process mentioned above possible? Or will I need to use Power Automate on the SharePoint side and then use a Dataflow to pull a pre-merged CSV into Fabric?

 

Thanks

1 ACCEPTED SOLUTION
christinepayton
Super User

I have used dataflows to join SharePoint folder data; the SharePoint folder connector works fine there. One of the columns available to filter on in the query is the folder path, so you can use that column to limit to a specific folder before expanding the file data.
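The filter-then-expand approach described above can be sketched in Power Query (M) roughly as follows. This is a minimal illustration, not a definitive implementation: the site URL and folder path are placeholders you would replace with your own, and the column names (`Folder Path`, `Name`, `Content`) are the ones the SharePoint files connector normally exposes.

```
let
    // Connect to the SharePoint site (URL is a placeholder)
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Timesheets", [ApiVersion = 15]),
    // Limit to the target folder using the Folder Path column before expanding
    InFolder = Table.SelectRows(Source, each Text.Contains([Folder Path], "/Shared Documents/Timesheets/")),
    // Keep only CSV files
    CsvFiles = Table.SelectRows(InFolder, each Text.EndsWith([Name], ".csv")),
    // Parse each CSV and promote its header row
    Parsed = Table.AddColumn(CsvFiles, "Data", each Table.PromoteHeaders(Csv.Document([Content]))),
    // Combine all the per-file tables into one merged table
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```

Filtering on the path before parsing keeps the query from downloading and parsing files outside the folder you care about; the "Combine files" button in the Power Query UI generates broadly similar steps for you.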


6 REPLIES

This worked perfectly thanks!

Okay, so I would limit the lookup to identify data in the set folder, and then expand and combine the files in the dataflow instead of expanding just a single dataset?


I understand what you're explaining; I will test it and then update you!

 

Thanks for your time.

Yeah, dataflows function exactly the same as the desktop editor; they're just more reusable. So you can expand the files in the dataflow and connect to that table if you want to reuse the transformations, versus connecting and expanding them in every .pbix you want to use them in. That way, when you change something, you don't have to go back and update the query in every .pbix.

collinq
Super User

Hi @Housden996,

 

When you state that they want to store it in "Fabric", you need to be a bit clearer about where you are storing it. Technically, Power BI is now Fabric, so storing it in a Power BI Dataflow is storing it in Fabric. That's not OneLake or a Lakehouse or any of that, though...

 

I would use the dataflow to get all the data and put it together, and then pick up the result set to put into whichever Fabric location you need.




Did I answer your question? Mark my post as a solution!

Proud to be a Datanaut!
Private message me for consulting or training needs.




Apologies, this is in a Premium workspace.

 

The plan is to extract all of the data from the SharePoint folder, merge it, then just pull it into the reports via the dataflow connector.
