Anonymous
Not applicable

SharePoint Online slow refresh

As the title says.

I have seen this issue posted several times but never found anything that may help.

My reports pull from both a SQL db and several SharePoint lists. The SQL dbs import hundreds of thousands of records, typically faster than the SharePoint list pulls in its 25,000 rows. On the data refresh I can see the size that has been downloaded for the SharePoint list, and it seems to be coming in at under 500 KB/s - why is that?

 

On my Excel version of this sheet, I can use the following:

let
    Source = Table.Combine({
        OData.Feed("mysite/_api/web/lists/GetByTitle('mylist')/Items?$filter=Date ge '" & DateSplit(1) & "' and Date le '" & DateSplit(2) & "'"),
        OData.Feed("mysite/_api/web/lists/GetByTitle('mylist')/Items?$filter=Date ge '" & DateSplit(3) & "' and Date le '" & DateSplit(4) & "'"),
        OData.Feed("mysite/_api/web/lists/GetByTitle('mylist')/Items?$filter=Date ge '" & DateSplit(5) & "' and Date le '" & DateSplit(6) & "'")
    }),

Where the user picks a date and a table is filled with the dates for that month. It essentially splits the query into three, because of the 5,000-item list view threshold, but it is faster than it would otherwise be.
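The date-splitting trick above can be sketched outside M. This Python sketch (the helper and site URL are hypothetical, standing in for the poster's DateSplit table) shows the arithmetic of splitting one month into three contiguous sub-ranges and building one filtered OData URL per range:

```python
from datetime import date, timedelta

def date_split(start: date, end: date, parts: int = 3) -> list:
    """Split the inclusive range [start, end] into `parts` contiguous sub-ranges."""
    total_days = (end - start).days + 1
    step = -(-total_days // parts)  # ceiling division
    ranges = []
    cursor = start
    while cursor <= end:
        sub_end = min(cursor + timedelta(days=step - 1), end)
        ranges.append((cursor, sub_end))
        cursor = sub_end + timedelta(days=1)
    return ranges

def filtered_urls(base: str, ranges) -> list:
    """One OData Items URL per sub-range, filtering server-side on the Date column."""
    return [
        f"{base}/Items?$filter=Date ge '{lo.isoformat()}' and Date le '{hi.isoformat()}'"
        for lo, hi in ranges
    ]

ranges = date_split(date(2020, 3, 1), date(2020, 3, 31))
urls = filtered_urls("https://mysite/_api/web/lists/GetByTitle('mylist')", ranges)
```

Each sub-query stays under the 5,000-item threshold, so the three smaller requests complete faster than one large one.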

For Power BI, where the expectation is that the user can pick any date range, I would prefer to have the full data imported.

There is nothing demanding in my query,

let
    Source = SharePoint.Tables("mysite", [ApiVersion = 15]),
    #"GUID" = Source{[Id="myGUID"]}[Items],

So what gives with the SQL db being so much faster than SharePoint Online? Is it because SharePoint Online is given fewer resources, since it is not premium?

24 REPLIES
mahoneypat
Super User

Please see this post on a much faster way to pull SP list data.

https://community.powerbi.com/t5/Desktop/SharePoint-list-query-alternative-or-optimization/m-p/10926...

 

Regards,

Pat





Did I answer your question? Mark my post as a solution! Kudos are also appreciated!

To learn more about Power BI, follow me on Twitter or subscribe on YouTube.


@mahoneypa HoosierBI on YouTube


hello
Frequent Visitor

At a high-level, what are the steps I need to take to implement this?
I'm starting in Power BI Desktop.

In that link, I provided some M code in both query form and function form.  Below is the query form.  Just paste it over the existing text in a blank query in the Advanced Editor in the query editor.  Replace the text in quotes for the site, tenant, and list; look at the URL of your SharePoint list for reference.  You may need to adapt the hard-coded part of the URL too if it doesn't match.  This will return a List, which you can click on to get a list of records; convert that list to a table, then expand the records, choosing the fields of interest.

 

let
    Source =
        let
            site = "NameOfMySite",
            tenant = "NameOfMyTenant",
            list = "NameOfMyList",
            getdata = Json.Document(Web.Contents("https://" & tenant & ".sharepoint.com/sites/" & site & "/_api/web/lists/GetByTitle('" & list & "')/items?$top=5000", [Headers=[Accept="application/json"]]))
        in
            getdata
in
    Source

 

Note the above will return up to 5,000 records.  If you want to use pagination to get many more, you need to use M code and a web call like below (I haven't adapted the friendly form above to work like this yet).  Update the parts in <> (and delete the <>).  This gets the count of items in your list, makes a list of numbers up to that count in steps of 5,000, and makes one web call per number so you can expand all the results together.

 

Note there are three rows called "fieldselect"; two should be commented out at any time.  Choose depending on whether you have lookup/choice columns in your list.  The latter two show example syntax for choosing which fields to expand in the return.

 

I still plan to write a blog post about this one day ...

 

let
    sitename = "<nameofyoursite>", // if a subsite use "Site/SubSite"
    listname = "<nameofyourlist>",
    baseurl = "https://<yourtenantURL>/sites/" & sitename & "/_api/web/lists/GetByTitle('" & listname & "')/",
    itemcount = Json.Document(Web.Contents(baseurl & "ItemCount", [Headers=[Accept="application/json"]]))[value],
    skiplist = List.Numbers(0, Number.RoundUp(itemcount/5000), 5000),
    #"Converted to Table" = Table.FromList(skiplist, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Renamed Columns" = Table.RenameColumns(#"Converted to Table", {{"Column1", "Skip"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"Skip", type text}}),
    fieldselect = "&$top=5000", // all fields with no expansion
    //fieldselect = "&$top=5000&$select=Id,Title,Person,Date", // list desired fields (no expansion)
    //fieldselect = "&$top=5000&$select=Id,Title,Person/LastName,Person/FirstName,Date&$expand=Person", // expand lookup fields
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Items", each Json.Document(Web.Contents(baseurl & "items?$skipToken=Paged=TRUE%26p_ID=" & [Skip] & fieldselect, [Headers=[Accept="application/json"]]))),
    #"Expanded Items" = Table.ExpandRecordColumn(#"Added Custom", "Items", {"value"}, {"value"}),
    #"Expanded value" = Table.ExpandListColumn(#"Expanded Items", "value"),
    #"Expanded value1" = Table.ExpandRecordColumn(#"Expanded value", "value", {"Id", "Title", "Person", "Date"}, {"Id", "Title", "Person", "Date"})
in
    #"Expanded value1"
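The skip-list arithmetic in the query above (item count, offsets in steps of 5,000, one web call per offset) can be illustrated with a small Python sketch; the base URL here is a placeholder, and the page size mirrors the list view threshold:

```python
import math

PAGE_SIZE = 5000

def skip_offsets(item_count: int, page_size: int = PAGE_SIZE) -> list:
    """Mirror of List.Numbers(0, Number.RoundUp(itemcount/5000), 5000)."""
    pages = math.ceil(item_count / page_size)
    return [i * page_size for i in range(pages)]

def page_urls(baseurl: str, item_count: int) -> list:
    """One REST call per page, using the p_ID skip token as in the M query."""
    return [
        f"{baseurl}items?$skipToken=Paged=TRUE%26p_ID={skip}&$top={PAGE_SIZE}"
        for skip in skip_offsets(item_count)
    ]

# Hypothetical list of 11,000 items -> three pages (offsets 0, 5000, 10000).
urls = page_urls("https://tenant.sharepoint.com/sites/mysite/_api/web/lists/GetByTitle('mylist')/", 11000)
```

Expanding the `value` arrays from those calls and appending them is what the `Table.ExpandListColumn` / `Table.ExpandRecordColumn` steps do in the M version.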

 

Regards,

Pat





Did I answer your question? Mark my post as a solution! Kudos are also appreciated!

To learn more about Power BI, follow me on Twitter or subscribe on YouTube.


@mahoneypa HoosierBI on YouTube


hugo50
Frequent Visitor

Same here, it's taking ages to load a CSV file from SharePoint into the Power BI data model. It's terrible; please, Microsoft, do something, it is so annoying. I don't even have complicated data transformations in the query.
Joorge_C
Resolver II

So I used this as a workaround and the refresh takes less than a minute.

 

I'm exporting the couple of SharePoint lists to CSV files every 7 minutes and dumping them into a shared drive, where I set up the Power BI connections.

 

There are a few links out there for reference on this flow. This is what I used:

http://www.sharepointsamples.com/export-sharepoint-list-items-as-csv-using-microsoft-flow/

hello
Frequent Visitor

Have you tried this with a list/library with 100k+ items?  Using PowerShell from my desktop I can quickly export list data, but I'm afraid of using Flow due to the possibility of getting throttled.

Once the report is set up, I'm hoping that an incremental refresh will help bridge the load time gap.

Thanks @Joorge_C , it seems that it's the only solution. Quite disappointing that we must resort to such drastic workarounds to get two Microsoft environments to talk to each other...

Wilhelm
Advocate I
Advocate I

Same here - to the point where it's becoming unusable when my line speed drops, which is often the case, unfortunately. My problem is not so much the slowness of the query, but rather that I don't know how to stop it from re-running the query every single time I press Enter to add a new line. It takes around 10 to 40 minutes (yes, minutes) to refresh and apply the query changes, and it is not a large dataset, merely a few projects' data I'm bringing over from Microsoft Project (online). Surely, since the data is already loaded into memory, there should be a way to use the in-memory copy instead of querying the source, until such time that I actually want to refresh from SharePoint? Perhaps someone from Microsoft could help explain this to us?
ChrisW3
Frequent Visitor

I am having this same issue. Wondering if it only happens when you go over the 5,000-item list view threshold in SharePoint Online? The list I am working with currently has about 11k records. Not that it really matters, just curious. I would love to see this fixed. I don't think the other way of doing it, via an OData query, gives you nearly as much data. For example, I have a lot of items in my list with custom permissions (shared with specific people), and pulling with Power Query via the standard connector allows access to those security role assignments by item.

Artie
Frequent Visitor

Hi everybody,

I am experiencing the same issue, particularly with CSV files. Has anybody found a solution? I already updated Power BI Desktop to the March 2020 version. The refresh of each step can last up to 10 minutes. If I go to SharePoint (cloud) via Internet Explorer, access to the site and files is quite fast. Thanks.

leigh_yh
Frequent Visitor

I have the same issue. My report is connected to 8 SharePoint Online lists which are then appended. Would it make a difference if I connected using the OData feed?

v-yulgu-msft
Microsoft

Hi @Anonymous ,

 

Please check whether any step in Power Query is consuming too much CPU or memory, and try using Table.Buffer to improve the data load performance.

SharePoint Online List- Power BI - slow query update

Slow refresh of SharePoint list data
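To illustrate conceptually what Table.Buffer does (materialize the source once so that later steps reuse the in-memory copy rather than hitting the slow source again), here is a rough Python analogy; the fetch function and call counter are purely illustrative:

```python
call_count = 0

def fetch_rows():
    """Stand-in for a slow SharePoint fetch; counts how often the source is hit."""
    global call_count
    call_count += 1
    return list(range(5))

# Unbuffered: each downstream step re-queries the source.
total = sum(fetch_rows())
doubled = [r * 2 for r in fetch_rows()]  # second hit on the source

# "Buffered" (Table.Buffer analogue): fetch once, reuse the in-memory copy.
buffered = fetch_rows()
total_buffered = sum(buffered)
doubled_buffered = [r * 2 for r in buffered]  # no extra source hit
```

Buffering trades memory for fewer round trips to the source, which is why it can help when the SharePoint connection itself is the bottleneck.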

 

Best regards,

Yuliana Gu

Community Support Team _ Yuliana Gu
If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.
Anonymous
Not applicable

Hi Yuliana,

 

I do not have anything complicated happening on my queries.

This issue is replicated just by: New query -> SharePoint list -> browse to the list. That is all I am doing, and it takes forever.

Another issue may be that SharePoint pulls all the data before filtering, unlike SQL.

Anonymous
Not applicable

Bumping this

Anonymous
Not applicable

Will bump again.

Surely others using SharePoint Online share the concern that the standard Power Query (Excel & BI) download is so large in file size and so slow?

Again, in Excel it can be worked around, but not in BI as far as I can see, so refreshing datasets will just get slower and slower.

 

It seems that SharePoint Online will not filter the data on the standard query until it has all been downloaded. I can see this in preview: when I filter a column, it downloads the same file size again before completing the filter.
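If the standard connector really does download everything before filtering, one mitigation (used in the OData and Web.Contents replies elsewhere in this thread) is to push the filter into the REST URL itself via $filter, so the server only returns matching rows. A hypothetical sketch of that URL construction (the field and value names are made up):

```python
from urllib.parse import quote

def filtered_items_url(baseurl: str, field: str, value: str) -> str:
    """Build a REST call whose $filter runs server-side, so only matching rows come down."""
    flt = quote(f"{field} eq '{value}'")
    return f"{baseurl}items?$filter={flt}&$top=5000"

url = filtered_items_url(
    "https://tenant.sharepoint.com/sites/mysite/_api/web/lists/GetByTitle('mylist')/",
    "Status", "Open",
)
```

The percent-encoded filter travels with the request, so the download size reflects only the filtered rows rather than the whole list.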

Anonymous
Not applicable

Im having the exact same issue,

 

Originally I thought it was an old Power BI file with too much going on (queries etc.), so I started one from fresh with just two lists, and it's unbearably slow; the download from SharePoint to Power BI is going at a rate of 500 KB/s (it isn't my connection, I've tried from two very good internet connections).

 

How the hell am I supposed to do very simple reports on large lists? (It's over 30,000 entries.)

 


@Anonymous wrote:

Will bump again.

Surely others using SharePoint Online have the same concern with how the standard PowerQuery (Excel & BI) downloads so much in file size and so slowly?

Again in Excel it can be worked around but not in BI as far as I can see, so refreshing datasets will just get slower and slower.

 

It seems that SharePoint online will not filter the data on the standard query until it has all been downloaded. I can see this by, in preview, when I filter a column and find it downloads the same file size again before completing the filter


 

Anonymous
Not applicable

Is it possible to have an official response on this?
Is the connector just badly built, is it a fault on our side, or is it something else?

 

And yes @Anonymous I notice the same slow connection speed. It is barely 400 KB/s, which is unacceptable.

 

To be clear: if I write a VBA SQL connection, it pulls the data much faster.

Another thing: go to SharePoint Online -> your list -> Export to Excel. You will notice it pulls the data much faster, so what gives with the slow Power Query connection?

I have the same issue, sometimes the download speed drops to 2 KB/s, so the 10 MB I'm trying to retrieve takes ages...

Anonymous
Not applicable

Having the same issue and never found a workaround. I experience the same speeds that @jvondermans mentioned, with no solution. I tried changing some of the Data Load option settings, but that never made any significant difference.
