Melben
Frequent Visitor

We encountered an error during evaluation. Details: The evaluation with ID is already in progress

Hi all, I'm getting an error in a query that started a few days ago, after it had worked flawlessly for a week or so. I am extracting data from HubSpot via its API and loading it into a table in Dataverse using a Power Query dataflow. I am using an adaptation of some code I found on this forum and on HubSpot's community forums, but again, it worked fine and has just stopped. The code and some comments are below:

let
  headers = [
  #"Content-Type"= "application/json",
  Authorization="Bearer XXXXXXXXXXXXXXXXXXXXXX"
],
  initContent = Text.ToBinary("
      {
        ""filterGroups"": [
          {
            ""filters"": [
              {
                ""propertyName"": ""hs_lastmodifieddate"",
                ""operator"": ""GTE"",
                ""value"": " & Text.From(#"Last Update Deals") // separate query that returns a Unix timestamp of the last updated deal
                  & "
              },
              {
                ""propertyName"": ""pipeline"",
                ""operator"": ""IN"",
                ""values"": [" & Text.From(#"Sales Pipelines") // separate query that returns only the pipelines I'm interested in
                  & "]
              }
            ]
          }
        ],
        ""properties"": [" & Text.From(#"Properties Deals") // separate query that returns all properties I need
          & "],
        ""limit"": 100,
        ""after"": 0
      }
"),
  initReq = Web.Contents(
    baseuri,
    [  
    Headers= headers,
    Content= initContent,
    ManualStatusHandling= {400, 401, 403, 404, 500}, IsRetry = false
    ]
),
  initData = Json.Document(initReq)[results],
  // We want to get data = {lastNPagesData, thisPageData}, where each list has the limit # of Records,
  // then we can List.Combine() the two lists on each iteration to aggregate all the records. We can then
  // create a table from those records
  gather = (data as list, nextContent) =>
    let
      // get the new offset from the active uri
      newOffset = Json.Document(
          Web.Contents(baseuri, [Headers = headers, Content = nextContent])
        )[paging][next][after],

      // build new content from a copy of the initContent so we don't append offsets
      newContent = Text.ToBinary("
      {
        ""filterGroups"": [
          {
            ""filters"": [
              {
                ""propertyName"": ""hs_lastmodifieddate"",
                ""operator"": ""GTE"",
                ""value"": " & Text.From(#"Last Update Deals") & "
              },
              {
                ""propertyName"": ""pipeline"",
                ""operator"": ""IN"",
                ""values"": [" & Text.From(#"Sales Pipelines") & "]
              }
            ]
          }
        ],
        ""properties"": [" & Text.From(#"Properties Deals") & "],
        ""limit"": 100,
        ""after"": 0
      }
"),

      // get the new request & data
      newReq = Json.Document(
          Web.Contents(baseuri, [Headers = headers, Content = newContent])
        ),
      newdata = List.Buffer(newReq[results]),

      // add that data to the rolling aggregate. I suspect this is where I'm hitting
      // performance issues resulting in the error (or a stack overflow error)
      data = List.Combine({data, newdata}),

      // if there's no next page of data, return. If there is, call @gather again to get more data
      check = if Table.Contains(Record.ToTable(newReq), [Name = "paging"])
                then @gather(List.Buffer(data), newContent) // I have read that buffering tables or lists could be helpful, but I have not found it so
                else data
    in
      check,

  // before we call gather(), we want to see if it's even necessary. If the first request returns only one page, just return it.
  output = if Table.Contains(Record.ToTable(Json.Document(initReq)), [Name = "paging"])
             then gather(initData, initContent)
             else initData,
  #"Convert if List" = Table.FromList(output, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
  #"Expanded Column1" = Table.ExpandRecordColumn(#"Convert if List", "Column1", {"id", "properties", "createdAt", "updatedAt"}, {"id", "properties", "createdAt", "updatedAt"})
in
  #"Expanded Column1"

Any help at all would be very much appreciated.

Cheers,
Ben

4 REPLIES
ImkeF
Super User

Hi @Melben ,
what does the error message say?

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

Melben
Frequent Visitor

Apologies @ImkeF, I should have put it in the body of my question; forgive me, it was my first one! The error is:

 

We encountered an error during evaluation. Details: The evaluation with ID "XXXXXXXXXXX" is already in progress

Sometimes (unfortunately), if I refresh, it produces this error instead:

Evaluation resulted in a stack overflow and cannot continue.

I have a strong suspicion that something is being mishandled in the recursive combination of lists, but I can't pinpoint it, as I hadn't changed anything when it stopped working. Any ideas?

No worries, you're welcome 🙂
That code is problematic in several ways:
1) It uses native recursion (@) instead of List.Generate, which is said to be very slow
2) The way the buffers are used will make performance even worse, so it would be better to remove them

3) As far as I understand it, the current code should result in an endless loop (which would explain the error messages): this recursive call doesn't include any of the results returned by the previous call:

//get new req & data
newReq = Json.Document(Web.Contents(
    baseuri,
    [
      Headers = headers,
      Content = newContent
    ]
)),
The headers are fixed, and the so-called newContent is fixed as well, since it doesn't reference any result from the previous call.
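
In other words, for the recursion to terminate, each call has to build a new body that carries the cursor taken from the previous response. A minimal sketch of the idea (hypothetical step names; the paging path follows the code above, and the unchanged filters/properties parts of the body are elided):

```
// sketch only: thread the cursor from the previous response into the next request body
newOffset  = Json.Document(previousReq)[paging][next][after],
newContent = Text.ToBinary("{ ...unchanged filters and properties..., ""limit"": 100, ""after"": " & Text.From(newOffset) & " }"),
newReq     = Web.Contents(baseuri, [Headers = headers, Content = newContent])
```

With that in place, each request asks for the page after the one just received, so the loop eventually reaches a response without a paging record and stops.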

The general "architecture" of a solution using the faster List.Generate would look like this:

let
  headers = [
    #"Content-Type" = "application/json",
    Authorization   = "Bearer XXXXXXXXXXXXXXXXXXXXXX"
  ],
  initContent = Text.ToBinary(
    "
      {
        ""filterGroups"": [
          {
            ""filters"": [
              {
                ""propertyName"": ""hs_lastmodifieddate"",
                ""operator"": ""GTE"",
                ""value"": "
      & Text.From(#"Last Update Deals") // separate query that returns a Unix timestamp of the last updated deal
      & "
              },
              {
                ""propertyName"": ""pipeline"",
                ""operator"": ""IN"",
                ""values"": ["
      & Text.From(#"Sales Pipelines") // separate query that returns only the pipelines I'm interested in
      & "]
              }
            ]
          }
        ],
        ""properties"": ["
      & Text.From(#"Properties Deals") // separate query that returns all properties I need
      & "],
        ""limit"": 100,
        ""after"": 0
      }
"
),
  initReq = Web.Contents(
    baseuri,
    [
      Headers              = headers,
      Content              = initContent,
      ManualStatusHandling = {400, 401, 403, 404, 500},
      IsRetry              = false
    ]
  ),
  initCall = Json.Document(initReq),
  initData = initCall[results],
  // will generate a list with the results of all calls. Can be expanded using the UI afterwards.
  gather = List.Generate(
      () => [request = initCall, result = initData, newOffset = initCall[paging][next][after], Counter = 0],

      // condition under which the next iteration should take place
      each Table.Contains(Record.ToTable([request]), [Name = "paging"])
            and [Counter] < 10 // for debugging purposes; delete once everything is running smoothly. Limits the run to max. 10 iterations
            ,
      each [
          // [result] refers to the result step of the previous call, so that's where the recursive effect comes from here.
          // But I am not sure whether that is actually what needs to go into the Content parameter.
          // Probably the newOffset has to be considered as well? The API documentation will tell you.
          request = Json.Document(Web.Contents(baseuri, [Headers = headers, Content = [result]])),
          result = request[results],
          newOffset = request[paging][next][after],
          Counter = [Counter] + 1
      ]
  )
in
  gather

But it might not work, as I wasn't sure what needs to go into the Content parameter. Is it "only" the result (of the previous step), or does the newOffset have to be integrated there as well?
...sorry for the poor formatting, something's wrong with the code editor here in the forum...
 

Imke Feldmann (The BIccountant)


Melben
Frequent Visitor

Hi @ImkeF, thank you so much; you nailed it by showing there was an endless loop. I'm not sure when, but I had left out the reference to 'newOffset', which pulls through the paging 'after' field from the API response. This change alone has the query working again.

newContent = Text.ToBinary("
      {
        ""filterGroups"": [
          {
            ""filters"": [
              {
                ""propertyName"": ""hs_lastmodifieddate"",
                ""operator"": ""GTE"",
                ""value"": " & Text.From(#"Last Update Deals") & "
              },
              {
                ""propertyName"": ""associations.deal"",
                ""operator"": ""IN"",
                ""values"": [" & Text.From(#"New Deal List") & "]
              }
            ]
          }
        ],
        ""properties"": [" & Text.From(#"Properties Line Items") & "],
        ""limit"": 100,
        ""after"": " & newOffset // RIGHT HERE
          & "
      }
"),
 
I'm marking your answer as the solution, but I may come back for advice, as I'm about to start looking at List.Generate to improve performance.
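
For when you get to that: the List.Generate skeleton from the accepted answer can be combined with this offset fix. The following is only a sketch under the assumptions in this thread (initCall/initData as in the answer; Record.HasFields is swapped in for the Table.Contains check; the body is shortened to its paging-relevant part):

```
// hypothetical sketch: List.Generate pagination that carries the cursor forward
gather = List.Generate(
    // initial state: the first response and its results
    () => [response = initCall, result = initData],
    // keep iterating while the previous response still contains a paging record
    each Record.HasFields([response], "paging"),
    // next state: rebuild the body with the cursor taken from the previous response
    each [
        body = Text.ToBinary("{ ...same filters and properties as initContent..., ""limit"": 100, ""after"": " & Text.From([response][paging][next][after]) & " }"),
        response = Json.Document(Web.Contents(baseuri, [Headers = headers, Content = body])),
        result = response[results]
    ],
    // selector: keep only each page's results
    each [result]
),
allRows = List.Combine(gather)
```

Unlike the recursive version, each iteration here only holds the previous state, so the result list grows without deepening the call stack.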
 
Thanks again,
Ben
 
