Hello Power BI community!
I have a problem/question about importing data from a SharePoint list. I found similar topics, but not quite what I was looking for.
Problem:
I need to import data from a SharePoint list. The import generally runs smoothly, but it has one big disadvantage: the import size grows disproportionately with the number of rows. For example, I have a list with barely 10,000 rows (about 15 columns) that needs a 50 MB import. There are also larger lists to import (100k rows), and this poses a real problem for the scheduled refresh, because SharePoint does not provide the most stable connection. With an import of several hundred MB, the refresh fails quite often. The large import seems to be caused by some relations SharePoint maintains in the background, but I'm not too familiar with SharePoint's internal structure. Either way, I need a smaller import size 🙂
Solution using REST API?:
I read somewhere that importing the data via the REST API might be an option. Could anyone explain to me how this would work?
And apart from that: are there any other suggestions on how I could reduce the refresh size? Incremental refresh does not seem to be an option, since SharePoint does not support query folding.
Any suggestions are highly appreciated!!!
Thanks in advance, Paul
Hi @Anonymous
As for the "Solution using REST API" you are interested in: you could import the data from the SharePoint Online REST (OData) service.
Get SharePoint List/Library Using OData Feed
Import data from an OData feed using Power BI
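In Power Query, connecting through the classic SharePoint OData feed looks roughly like this (a minimal sketch: the site URL and list name are placeholders you would replace with your own, and `listdata.svc` is the classic OData endpoint SharePoint Online exposes):

```m
let
    // listdata.svc is SharePoint's classic OData endpoint;
    // replace the site URL with your own site
    Source = OData.Feed("https://contoso.sharepoint.com/sites/MySite/_vti_bin/listdata.svc"),
    // "MyList" is a placeholder for the name of your list in the feed
    MyList = Source{[Name = "MyList"]}[Data]
in
    MyList
```

You can also paste the `listdata.svc` URL directly into Power BI Desktop's built-in OData feed connector (Get Data > OData feed) instead of writing the query by hand.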
For refresh performance:
here are some references:
Power BI Refresh Time using Sharepoint OData Form Library Source
Slow refresh of SharePoint list data
Power BI performance best practices
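Beyond switching connectors, one common way to shrink the import is to drop unneeded columns as early as possible in the query, since SharePoint lists expose many hidden and expanded lookup columns. A hedged Power Query sketch (the site URL, list name, and column names below are all placeholders):

```m
let
    Source = OData.Feed("https://contoso.sharepoint.com/sites/MySite/_vti_bin/listdata.svc"),
    MyList = Source{[Name = "MyList"]}[Data],
    // keep only the columns the report actually uses;
    // removing columns early reduces the amount of data pulled per refresh
    Trimmed = Table.SelectColumns(MyList, {"Id", "Title", "Status", "Modified"})
in
    Trimmed
```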
Best Regards
Maggie
Community Support Team _ Maggie Li
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Hey v-juanli-msft,
thanks for replying. Although the OData feed solution does not really reduce the loaded size, it is at least a little quicker. I'll switch my queries to that method and see if it improves stability.