Cigo
Frequent Visitor

429 Error SharePoint List Source

Hello

 

I'm relatively new to PBI and queries, and I'm trying to resolve an issue my org currently has with 429 errors on dataflows that access multiple SharePoint lists as their source. The current setup is one dataflow with an entity for each of the 15 lists being accessed, which then feeds into secondary dataflows for data manipulation and formatting before the data is presented in reports.

 

Most of these lists currently have fewer than 1,000 rows of data, with one containing 3,500. On a regular basis we receive 429 errors for the list with 3,500 rows. Unfortunately we are currently restricted to using SP lists as the primary data source, and I was surprised we're hitting these issues on such a small volume of data. The dataflows currently run four times a day, with around 3 hours between each refresh.

 

To get around this I am currently looking into delaying the calls to the SP service to hopefully avoid the throttling issue. I've looked at the various really helpful posts from Chris Webb on using the Function.InvokeAfter function, but I'm struggling to see how to integrate it with the SP table source.

 

I've tried the below, but it doesn't seem to have impacted the refresh times, so I assume the delay is only being applied at the table level and not per row. I imagine I'm inserting it at the wrong step, or need to add an index/custom column to reference. I'm also quite restricted by org policies in what I can do, and ideally I need a solution that can be achieved from within the query. If anyone could offer advice on how this can be done, or an alternative to avoid these errors, it would be greatly appreciated.

 

let
  Source = Function.InvokeAfter(
    () => SharePoint.Tables("https://<org URL>", [ApiVersion = 15]),
    #duration(0, 0, 0, 1)
  ),
  #"Navigation 1" = Source{[Id = "--------------------"]}[Items]
in
  #"Navigation 1"
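For what it's worth, if the goal is one delay per request rather than one delay for the whole query, the InvokeAfter wrapper has to sit inside a function that is invoked once per fetch. A hypothetical sketch (the list IDs and the two-second delay are placeholders, and whether the delays actually serialize depends on how the engine evaluates the list):

```
// Sketch: delay each list fetch rather than the whole query.
let
  Source = SharePoint.Tables("https://<org URL>", [ApiVersion = 15]),
  listIds = {"id-one", "id-two", "id-three"},
  // Function.InvokeAfter returns the inner function's result after the
  // given #duration has elapsed, so each invocation here waits 2 seconds
  // before issuing its request.
  DelayedFetch = (id as text) =>
    Function.InvokeAfter(() => Source{[Id = id]}[Items], #duration(0, 0, 0, 2)),
  AllItems = List.Transform(listIds, DelayedFetch)
in
  AllItems
```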
 
 

3 REPLIES 3
mahoneypat
Super User
Super User

The problem seems to be that each of your SP list calls is taking too long. Please see this article/video for how to get data from SharePoint lists way faster.

Get Data From SharePoint Lists … But Fast – Hoosier BI

 

Also, how are you combining the 15 lists/queries?  Hopefully, you are appending them and not merging (which can slow things down).

 

Regards,

Pat






@mahoneypa HoosierBI on YouTube



Cigo
Frequent Visitor

Hi Pat

 

Thank you for the response. The dataflow in question is not performing any sort of merge or append between the lists; that's being done in separate dataflows that use the linked entities from the dataflow we're discussing. The dataflows performing the merges/appends and similar steps are all set to refresh an hour later, and all average under five minutes for refresh times.

 

The dataflow with the 429 errors is simply set up with one entity connected to each SP list, and most of them refresh relatively quickly. However, the one connected to the SP list with 3,500 records is taking 15-20 minutes (there is also a list with 2,000+ records that is taking around 15 minutes too). The SP list in question consists of 40 fields in single line of text, multiple lines of text, or choice formats.

 

Am I misunderstanding that the 429 error is due to too many requests to the SP service in a short space of time? If not, wouldn't I want to reduce the number of requests by adding a delay, rather than trying to make the requests faster?

 

I've also noticed that when using dataflows rather than PBI desktop there doesn't seem to be an option to select the connector version that pops up in your video. Do you know if there is a way to select the connector version in a dataflow and whether you can swap from v1 to v2 on an existing dataflow?

 

I tried the steps in your video and managed to get the data to appear in the query using the below. However, when I try to save the dataflow I get the error "Can't save dataflow. One or more tables reference a dynamic data source."

 

let
  sitename = "https://<org URL>", // if a subsite, use "Site/SubSite"
  listname = "<list name>",
  baseurl = sitename & "/_api/web/lists/GetByTitle('" & listname & "')/",
  itemcount = Json.Document(
     Web.Contents(baseurl & "ItemCount", [Headers = [Accept = "application/json"]])
   )[value],
  skiplist = List.Numbers(0, Number.RoundUp(itemcount / 5000), 5000),
  #"Converted to Table" = Table.FromList(skiplist, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
  #"Renamed Columns" = Table.RenameColumns(#"Converted to Table", {{"Column1", "Skip"}}),
  #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"Skip", type text}}),
  fieldselect = "&$top=5000",
  // all fields with no expansion
  // fieldselect = "&$top=5000&$select=Id,Title,Person,Date", // list desired fields (no expansion)
  // fieldselect = "&$top=5000&$select=Id,Title,Choice,LookupColumn/Title,LookupColumn/Project,LookupColumn/ProjectStatus,Date,Person/LastName,Person/FirstName,Person/EMail&$expand=LookupColumn,Person",
  Custom1 = Table.AddColumn(#"Changed Type", "Items", each Json.Document(
       Web.Contents(
         baseurl & "items?$skipToken=Paged=TRUE%26p_ID=" & [Skip] & fieldselect,
         [Headers = [Accept = "application/json"]]
       )
     )),
  #"Expanded Items" = Table.ExpandRecordColumn(Custom1, "Items", {"value"}, {"value"}),
  #"Expanded value" = Table.ExpandListColumn(#"Expanded Items", "value")
in
  #"Expanded value"
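On the "dynamic data source" error: the service's refresh engine rejects Web.Contents calls whose full URL is built at runtime. A common workaround (which may or may not be permitted in your dataflow setup) is to keep the first argument static and move the variable parts into the RelativePath and Query options. For the item-count call above, that would look something like this sketch, with the site URL remaining a placeholder:

```
// Same ItemCount request, but with a static base URL so the engine
// can validate the data source; RelativePath is a standard
// Web.Contents option.
itemcount = Json.Document(
    Web.Contents(
      "https://<org URL>",
      [
        RelativePath = "_api/web/lists/GetByTitle('" & listname & "')/ItemCount",
        Headers = [Accept = "application/json"]
      ]
    )
  )[value]
```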
 

Thanks again!

 

Sigh.

 

Unfortunately, you are getting the 429 Too Many Requests because SharePoint feels you are making too many calls at one time and is throttling you. I've got an open ticket on this. The v2 of the SharePoint List connector will eventually correct this but it's currently missing data that I need for production reports.

 

If you can use the v2 Lists connector, try that.
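In a dataflow, where Desktop's connector-version dropdown isn't exposed, the v2 connector can usually be requested directly in M via the Implementation option of SharePoint.Tables. A sketch, with the URL as a placeholder (ViewMode support may vary by tenant):

```
let
  // Implementation = "2.0" selects the newer SharePoint list connector;
  // ViewMode = "All" returns all columns rather than the default view.
  Source = SharePoint.Tables(
      "https://<org URL>",
      [Implementation = "2.0", ViewMode = "All"]
    )
in
  Source
```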

 

Thanks!

--Treb

