Anonymous
Not applicable

Not enough memory for a calculated column

Hi,

 

I'm facing the "There's not enough memory to complete this operation" error on a calculated column. I've already tried some workarounds, like reducing the number of columns to the strict minimum, but I'm still getting this error.

 

My company works with funds from donations. This means we send our "customers" donation requests, and the goal here is to calculate their RFM (Recency, Frequency, Monetary value). This should be calculated directly in our CRM, but unfortunately that isn't working properly.

 

Here are the details of my model (much simplified):

2 dimension tables and 1 fact table.

- 1 dimension table = the solicitations we send to our customers (i.e. 1 solicitation every time we send them a donation request) (11 million rows)

- 1 dimension table = accounts (500K rows)

- 1 fact table = Fact_sales (4+ million rows)

 

 

[Screenshot: "exemple pour création Segment R" data model in Power BI Desktop]

 

The goal is to calculate, for each solicitation, the recency of the customer's latest "sale". This enables a historical analysis of each customer based on the solicitations they received.

 

In this simplified model, I have added this calculated column, which works perfectly:

Segment R = 
VAR Datedernierdon =
    CALCULATE(
        MAX(Fact_sales[Sales_date]);
        RELATEDTABLE(Accounts);
        FILTER(
            Fact_sales;
            Fact_sales[Sales_date] <= Sollicitations[Sollicitation_date]
        )
    )
RETURN 
    CALCULATE(
        DATEDIFF(
            Datedernierdon;
            MAX(Sollicitations[Sollicitation_date]);
            MONTH
        );
        RELATEDTABLE(Accounts)
    )

And the results it gives (red rectangle):

[Screenshot: results of the Segment R column in Power BI Desktop]

 

The issue is that in my real business model the fact table has more than 4 million rows and the solicitation table more than 11 million rows. Whenever I try the exact same calculated column, I get the "Not enough memory" error.

 

I already tried moving the variable calculation into a separate calculated column instead of using it directly inside the Segment R column, but I get the "Not enough memory" error even for the simple Datedernierdon calculation.

Datedernierdon = 
CALCULATE(
    MAX(Fact_sales[Sales_date]);
    RELATEDTABLE(Accounts);
    FILTER(
        Fact_sales;
        Fact_sales[Sales_date] <= Sollicitations[Sollicitation_date]
    )
)

Thanks in advance for your help.

2 ACCEPTED SOLUTIONS

6 REPLIES 6
v-lili6-msft
Community Support
Community Support

hi, @Anonymous

This error indicates that you don't have enough memory. Please extend your memory as much as possible. You can refer to this article (though the article is about PowerPivot, it should be similar for Power BI) to optimize your data model in a way that makes better use of the available memory.

 

Also, if you load the data in Import mode, please disable "Allow data preview to download in the background" under Options > Options and settings > Data load, and check whether that helps.

 

Best Regards,

Lin

Community Support Team _ Lin
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Anonymous
Not applicable

Thanks for your answer @v-lili6-msft.

 

Unfortunately, I've tried some of the solutions mentioned in the article (i.e. denormalizing and reducing the number of columns to the strict minimum) as well as the second suggestion ("Allow data preview..."), but none of them made any difference. I also tried to shrink the number of rows with a FILTER, but that had no noticeable effect.

 

If anybody has an idea, it would be much appreciated.

 

Best regards.

Split the data source and report into 2 separate entities?

 

There are a number of ways to split.

 

1. Equal-size (time range) split, so each contains half the data.

 

2. One smaller range (1 month?) with a finer-grained data set, and another covering a longer period with an aggregated data set.

 

This reduces the number of records you need to work with in one instance, reducing the memory requirement.

 

If the above isn't possible, maybe increase your available RAM?
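As a sketch of option 2, the aggregated data set could be a table that keeps only one row per account and month instead of every individual sale. The column names below (Account_ID, Sales_month) are assumptions based on the simplified model shown in the question, not confirmed by the post:

```dax
-- Hypothetical pre-aggregated table: one row per account per month,
-- keeping only the latest sale date in that month.
-- Account_ID and Sales_month are assumed column names.
Fact_sales_agg =
SUMMARIZE(
    Fact_sales;
    Fact_sales[Account_ID];
    Fact_sales[Sales_month];
    "LastSaleDate"; MAX ( Fact_sales[Sales_date] )
)
```

Building this aggregation in the source query (rather than as a calculated table) would reduce the model size further, since the detailed fact rows never need to be loaded at all.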

Anonymous
Not applicable

Thanks for your answer @Chihiro.

 

With a very small amount of data, it seems to work. I'll have to find workarounds to reduce the amount of data being computed as much as possible.

 

Once again thanks for your answers @v-lili6-msft and @Chihiro.

 

 

Chihiro
Solution Sage
Solution Sage

A calculated column (especially in a fact table) is going to impact the performance of the model.

And though you state Sollicitations is a dimension table... it's being used like a fact table.

 

 

For optimized data analysis, you should denormalize your data, meaning a transformation from a schema optimized for data entry to one optimized for reporting. Typically speaking, you should transform the data into a star schema model before loading it into the data model:

1 fact table (all foreign keys) and multiple dimension tables (1 primary key each).

 

Perhaps transforming the data at the query stage will help performance.
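Related to this: much of the memory pressure in the original column comes from materializing a filter over the entire fact table for each of the 11 million solicitation rows, plus the context transition from the nested CALCULATE / RELATEDTABLE. A variable-based rewrite that filters the fact table directly on the current account may be cheaper. This is only a sketch; the Account_ID key column is an assumption, since the post doesn't show the actual relationship columns:

```dax
-- Sketch: capture the current row's account and date in variables,
-- then scan the fact table once with a simple predicate.
-- Account_ID is an assumed key column present in both tables.
Datedernierdon =
VAR CurrentAccount = Sollicitations[Account_ID]
VAR CurrentDate = Sollicitations[Sollicitation_date]
RETURN
    MAXX(
        FILTER(
            Fact_sales;
            Fact_sales[Account_ID] = CurrentAccount
                && Fact_sales[Sales_date] <= CurrentDate
        );
        Fact_sales[Sales_date]
    )
```

Avoiding CALCULATE here sidesteps a context transition per row, which can noticeably reduce memory use on large tables, though with 11 million rows a query-stage or aggregated approach is still likely to be the more robust fix.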

Anonymous
Not applicable

Thanks for your answer.

 

You're right, those solicitations are definitely a fact table. I will try your solution, but it might be quite difficult, as in my real business case there are several dimension tables related to the solicitations table.
