AGo
Post Patron

Performance issue: Microsoft SQL Server Analysis Services

I've got a pbix with some heavy calculated elements (VARs, calculated tables, columns, measures) on 200k rows from a PostgreSQL database.
When I refresh or change something in the DAX formulas, the Microsoft SQL Server Analysis Services process starts using around 18% of my i7 CPU and fills about 80% of my 16 GB of RAM.

Is there a way to speed up the process, or to let it use more of the CPU?

4 REPLIES
AGo
Post Patron

@hugoberry @malagari I checked my pbix and the problem is in these 2 calculated columns (I've added a reworked sketch of both at the end of this post):

Stock = 
VAR MaxDate = [Date]
VAR CurrentProduct = [articoli_art_id]
VAR InitialDate=IF(MAXX(FILTER(ALL(Magazzino);Magazzino[articoli_art_id]=CurrentProduct && Magazzino[Date]<=MaxDate && Magazzino[movmag_mmg_caus]=100);Magazzino[Date])=BLANK();MIN(Magazzino[Data movimento]);MAXX(FILTER(ALL(Magazzino);Magazzino[articoli_art_id]=CurrentProduct && Magazzino[Date]<=MaxDate && Magazzino[movmag_mmg_caus]=100);Magazzino[Date]))
RETURN
   SUMX(FILTER(ALL(Magazzino);Magazzino[Date]<= MaxDate && Magazzino[Date]>=InitialDate && Magazzino[articoli_art_id] = CurrentProduct); Magazzino[Quantità])

And the 2nd one:

Next_variation_date = 
VAR DataCorrente = [Date]
VAR CurrentProduct = [movmag_art_id]
Return
IF(CALCULATE(MIN(Magazzino[Date]);FILTER(Magazzino;Magazzino[Date]>DataCorrente && Magazzino[movmag_art_id]=CurrentProduct))=BLANK();
TODAY();
CALCULATE(MIN(Magazzino[Date]);FILTER(Magazzino;Magazzino[Date]>DataCorrente && Magazzino[movmag_art_id]=CurrentProduct)))

Where possible I tried replacing the VARs with the EARLIER function, with no success. Is there a way to increase the performance of these two DAX columns? Even limiting them to only 20k rows still takes 4 minutes.

When the pbix is published online it performs well; I don't know why I'm having this problem in Power BI Desktop with the SSAS process.

 

I'm trying to avoid doing this calculation during import via SQL (I've already found that to be slow and impractical).
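
For reference, here is a rough, untested rework of both columns (same table and column names as above) that stores the expensive lookup in a variable so it is evaluated only once per row, and uses ISBLANK() instead of comparing to BLANK():

Stock =
VAR MaxDate = [Date]
VAR CurrentProduct = [articoli_art_id]
-- last inventory-count movement (movmag_mmg_caus = 100) on or before the current date, evaluated once
VAR LastCount =
    MAXX(
        FILTER(
            ALL(Magazzino);
            Magazzino[articoli_art_id] = CurrentProduct
                && Magazzino[Date] <= MaxDate
                && Magazzino[movmag_mmg_caus] = 100
        );
        Magazzino[Date]
    )
VAR InitialDate = IF(ISBLANK(LastCount); MIN(Magazzino[Data movimento]); LastCount)
RETURN
    -- sum of movements for the current product between the starting date and the current date
    SUMX(
        FILTER(
            ALL(Magazzino);
            Magazzino[Date] <= MaxDate
                && Magazzino[Date] >= InitialDate
                && Magazzino[articoli_art_id] = CurrentProduct
        );
        Magazzino[Quantità]
    )

Next_variation_date =
VAR DataCorrente = [Date]
VAR CurrentProduct = [movmag_art_id]
-- first movement of the same product after the current date, evaluated once
VAR NextDate =
    CALCULATE(
        MIN(Magazzino[Date]);
        FILTER(
            Magazzino;
            Magazzino[Date] > DataCorrente
                && Magazzino[movmag_art_id] = CurrentProduct
        )
    )
RETURN
    IF(ISBLANK(NextDate); TODAY(); NextDate)

This doesn't change the algorithm (each row still scans the whole table), it only avoids evaluating the same expression twice, so it may still be slow on 200k rows.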

malagari
Responsive Resident

Power BI's DAX engine uses SSAS under the hood to perform all of the in-memory calculations. If you have fairly complex measures (especially ones that make dynamic date comparisons), you'll see significant CPU/RAM usage.

 

There is no way to offload things from RAM to CPU - I won't go into too much detail, but that's not how it works.

 

You'll want to start looking into improving your calculations, flattening your tables into a denormalized view, or offloading to a separate Analysis Services server.
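
As one example on the "improving your calculations" side: if the running stock is only needed in report visuals, you could try expressing it as a measure rather than a calculated column, so it is evaluated only for the cells actually displayed instead of for every one of the 200k rows at refresh time. A minimal, untested sketch using the standard cumulative-total pattern (the measure name is just an example, and it deliberately leaves out your "restart from the last movmag_mmg_caus = 100 count" rule, which would still need to be added):

Stock to date =
VAR MaxDate = MAX(Magazzino[Date])
RETURN
    -- sum all movements up to the last date visible in the current filter context
    CALCULATE(
        SUM(Magazzino[Quantità]);
        FILTER(
            ALL(Magazzino[Date]);
            Magazzino[Date] <= MaxDate
        )
    )

The product filter then comes from whatever articoli_art_id is on the rows or axis of the visual, instead of being recomputed for every row of the table.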

 

Dan Malagari
Consultant at Headspring

So you're telling me that if I replace the dates with an incremental index, the calculations will be faster?

hugoberry
Responsive Resident

Try checking your data types in the Data Model (Tabular). Unless all of the columns you're handling are varchar, I don't see a reason for it to take up so much memory.

 
