Anonymous
Not applicable

Expression.Error: Evaluation resulted in a stack overflow and cannot continue.

I am pulling in two queries from Kusto. I have used filters and sorts to pare down the queries within the Query Editor. These queries pull in data from two 4G controllers we have in the field. Once filtered, the attributes of the queries are listed below:

 

Query 1:

6 columns (3 text, 1 numeric, 2 date/time/timezone)

13 rows

 

Query 2:

8 columns (2 text, 4 numeric, 2 date/time/timezone)

5533 rows 

 

4 of the columns between the two queries match exactly in title and data type. I am attempting to append these queries, but every time I do, it results in this error: "Expression.Error: Evaluation resulted in a stack overflow and cannot continue." I have tried both DirectQuery and importing the data itself. I have also tried deleting columns so that I am only appending 1 column from each query with the same name and data type. I still receive this error. I don't think I'm maxing out the number of rows, because I have appended millions of rows of data before, but I don't know what else could be wrong with it. I have updated my Power BI and tried opening a new project and doing the same thing, in case there was a glitch in the window I had tried it in. No luck. Any ideas on how to append these two queries?
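
For reference, here is roughly the append step I am attempting (this is what the Append Queries button generates, with Query1 and Query2 standing in for my two Kusto queries):

let
    // Append the 13-row query and the 5533-row query into a single table
    Source = Table.Combine({Query1, Query2})
in
    Source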

19 REPLIES
nicpenning
Helper I

I don't use Table.Combine, but I am wondering if this affects Text.Combine. We have a sample set of data that is less than 700 MB, but when we try to query the web source recursively, the mashup processes use over 20 GB of RAM and then we get the crash error.

These are the lines using Text.Combine in my function and in my query:
Json.Document(Web.Contents("https://datasource:9200", [RelativePath=relativepa, Headers=[#"Authorization"=Text.Combine({"ApiKey ",apiKey,""}), #"Content-Type"="application/json"], Content = Text.ToBinary(urlBody)]))

urlBody = Text.Combine({"{""size"":",batchSize,",""query"":{""match_all"":{}}}"}),

v-jingzhang
Community Support

Hi all,

 

Below is the update for this issue. (IcM: 219724212, 224968762)

 

This is caused by a Mashup Engine bug which is currently under investigation. There is no ETA at the moment. Below are two workarounds:

 

- Workaround #1:

Call Table.AddIndexColumn on each source query before using Table.Combine. Or add a new step to each query using the "Index Column" button under the Add Column tab before combining the tables.

 

- Workaround #2:

Call Table.Buffer on each of the tables being combined before calling Table.Combine. A sketch of both workarounds in M follows below.
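
For illustration, here is a minimal sketch of both workarounds, assuming the two source queries are named Query1 and Query2 (replace these with your actual query names):

let
    // Workaround #1: add an index column to each source query before combining
    Query1Indexed = Table.AddIndexColumn(Query1, "Index", 0, 1),
    Query2Indexed = Table.AddIndexColumn(Query2, "Index", 0, 1),
    AppendedWithIndex = Table.Combine({Query1Indexed, Query2Indexed}),

    // Workaround #2: buffer each table in memory before combining
    Query1Buffered = Table.Buffer(Query1),
    Query2Buffered = Table.Buffer(Query2),
    AppendedWithBuffer = Table.Combine({Query1Buffered, Query2Buffered})
in
    // Use whichever of the two appended tables works for your scenario
    AppendedWithBuffer

The added Index column can be removed after the append if you don't need it.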

 
Hope this helps.
 
Best Regards,
Community Support Team _ Jing

Hello! I am using a recursive function to append 10K results at a time. Do you think the following is affected by this bug?
Function:

let
    RecursiveElasticFetch = (relativepa, relativepa2, batchSize, urlBody, scrollBody, scrollID, counter, scrollEND, apiKey) =>

    let
        Results = if (scrollID = null) then
            //Initial Query
            Json.Document(Web.Contents("https://es:9200", [RelativePath=relativepa, Headers=[#"Authorization"=Text.Combine({"ApiKey ",apiKey,""}), #"Content-Type"="application/json"], Content = Text.ToBinary(urlBody)]))
        else
            //All other queries execute this to gather results from the scroll api even if the scroll id changes.
            Json.Document(Web.Contents("https://es:9200", [RelativePath=relativepa2, Headers=[#"Authorization"=Text.Combine({"ApiKey ",apiKey,""}), #"Content-Type"="application/json", MyHeader=Text.From(counter)], Content = Text.ToBinary(scrollBody&scrollID&scrollEND)])),

        //If this is the first time the function runs, the counter should be null, so this will dynamically calculate how many times this function needs to run.
        counter = if (counter = null) then
            //Dynamically get the counter - Note: You can uncomment the next line for testing and then comment out the Number.RoundUp
            //7
            Number.RoundUp(Results[hits][total][value]/Number.FromText(batchSize))
        else
            counter,

        //Store the hits from the ElasticSearch query into ParsedResults and if results already exist, append more results to generate the full table of events.
        ParsedResults = Table.FromList(Results[hits][hits], Splitter.SplitByNothing(), null, null, ExtraValues.Error),

        Return = if (counter > 0) then
            ParsedResults & RecursiveElasticFetch(relativepa, relativepa2, batchSize, urlBody, scrollBody, Results[_scroll_id], counter-1, scrollEND, apiKey)
        else
            ParsedResults
    in
        Return
in
    RecursiveElasticFetch

If so, how would one apply a workaround?

Here is a snippet of the query starting where the function above is called:

Source = RecursiveElasticFetch(relativepa, relativepa2, batchSize, urlBody, scrollBody, scrollID, counter, scrollEND, apiKey),
    #"Renamed Columns" = Table.RenameColumns(Source,{{"Column1", "es"}}),
    #"Expanded es" = Table.ExpandRecordColumn(#"Renamed Columns", "es", {"_index", "_source"}, {"_index", "_source"}),
    #"Expanded _source" = Table.ExpandRecordColumn(#"Expanded es", "_source", {"_label", "_createdAt", "data", "dataType", "tlp", "sighted", "_createdBy", "message", "ioc", "caseId", "relatedId", "tags"}, {"ObjectType", "DateCreated", "data", "dataType", "tlp", "sighted", "CreatedBy", "message", "ioc", "caseId", "relatedId", "tags"}),
    #"Expanded tags" = Table.ExpandListColumn(#"Expanded _source", "tags"),
    #"Duplicated Column" = Table.DuplicateColumn(#"Expanded tags", "DateCreated", "DateCreated - Copy"),
    #"Renamed Columns1" = Table.RenameColumns(#"Duplicated Column",{{"DateCreated - Copy", "TimeCreated"}}),
    #"Parsed Time" = Table.TransformColumns(#"Renamed Columns1",{{"TimeCreated", each Time.From(DateTimeZone.From(_)), type time}}),
    #"Parsed Date" = Table.TransformColumns(#"Parsed Time",{{"DateCreated", each Date.From(DateTimeZone.From(_)), type date}})
in
    #"Parsed Date"

 

Hi @nicpenning , we're using the same Elasticsearch function and are experiencing the same issue with an index of about 1.5 GB. Did you find any solution for the stack overflow issue when calling ES from Power BI? Thanks

So I don't believe I could get the workaround to work. I am running the October 2022 version of Power BI and it will eventually refresh, but it takes a very long time. Are you using my function above?

 

The index we query is over 1 million rows - the index is about 3.8 GB, but totals 7.6 GB with 1 primary and 1 replica. Currently timing the refresh, stay tuned.

It was the index for me. 

 

Thank you

This totally worked for me for SP Lists. I added an Index from 0, thanks a lot.

Hi @v-jingzhang , could you explain the workaround again? I didn't understand it.

@MAAbdullah_47 You can add a new step to each query using the "Index Column" button under the Add Column tab before combining the tables.
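
For example, assuming the last existing step of your query is named Source (use your own step name), the button adds a step roughly like this to each query:

    // Step generated by Add Column > Index Column > From 0
    #"Added Index" = Table.AddIndexColumn(Source, "Index", 0, 1)

After adding this step to both queries, append them as before.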

BZ
Regular Visitor

Please confirm whether a fix is in the works for SharePoint Online and Excel as well. Thank you!

v-jingzhang
Community Support

@BZ Please try the workarounds in my other reply.

BZ
Regular Visitor

I am having the same error appending data from a SharePoint Online list and data imported from an Excel file.

v-jingzhang
Community Support

Hi @Anonymous , can you test appending a query with its copy (duplicate the query)? Will they be combined successfully?

Anonymous
Not applicable

Hello @v-jingzhang, I duplicated Query 1 and tried to append it with the original Query 1, and still got the error: Expression.Error: Evaluation resulted in a stack overflow and cannot continue.

I tried this by duplicating Query 2, and got the same result. 

Anonymous
Not applicable

I just exported the two queries to CSV files and then loaded those CSV files into my Power BI and appended them. They appended correctly in seconds. This leads me to think it is an issue with importing them from Kusto. Do you know of any limitations on appending data from Kusto?
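
For reference, my CSV test looked roughly like this (the file paths here are placeholders, not my real ones):

let
    // Load the two exported CSV files (placeholder paths)
    Csv1 = Table.PromoteHeaders(Csv.Document(File.Contents("C:\Temp\Query1.csv"), [Delimiter=",", Encoding=65001])),
    Csv2 = Table.PromoteHeaders(Csv.Document(File.Contents("C:\Temp\Query2.csv"), [Delimiter=",", Encoding=65001])),
    // The same append that fails against the Kusto sources works here
    Appended = Table.Combine({Csv1, Csv2})
in
    Appended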

There is a known issue about appending data from Kusto tracked in our internal database (IcM: 219724212), but it hasn't been fixed yet. I will update here if there is any progress on it.

 

For now, could you use other data sources as alternatives? Or append the tables on the database side first and then import the result into Power BI?

 

Regards,

Jing

Hi, do you know where we are with the fix?

Hi @gor 

 

There is a workaround in this post: Expression.Error: Evaluation resulted in a stack overflow and cannot continue. You could give it a try.

 

From the information collected by the PG (product group), the best workaround for this is probably to call Table.AddIndexColumn on the Kusto data before using Table.Combine.

The main advantage of Table.AddIndexColumn is that it can be used by someone through the UI without any typing.

 

Best Regards,

Jing

Thanks, the workaround seems to work.
