Anonymous
Not applicable

Ultimate Power BI/DAX computer build

Hello there,

 

I recently created a number of quite heavy measures over a DirectQuery connection to our transaction table, and my Power BI ground to a halt, stuttered, and crashed.

 

I am wondering what the ultimate PC build for running DAX would be. That is, to manipulate data, apply changes, see how it works out in the visuals, and open the query editor again. I have read that DAX mostly runs on a single core for its calculations. That would mean that CPU clock speed and perhaps RAM size matter most, but I cannot find any PC builds online that focus on data processing.

That could be because the key is simply to write your DAX formulas well, so the processing speed of your PC does not matter much.
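For example (a made-up measure over a hypothetical Transactions table, just to illustrate what I mean by writing formulas "right"):

    -- Slow: FILTER iterates every row of the table in the single-threaded formula engine
    High Value Sales Slow =
    CALCULATE (
        SUM ( Transactions[Amount] ),
        FILTER ( Transactions, Transactions[Amount] > 100 )
    )

    -- Faster: a plain column predicate, which the engine can push down to the
    -- multi-threaded storage engine (roughly equivalent here, though not in every filter context)
    High Value Sales Fast =
    CALCULATE (
        SUM ( Transactions[Amount] ),
        Transactions[Amount] > 100
    )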

 

Please let me know your thoughts!

 

Thomas

1 ACCEPTED SOLUTION
Greg_Deckler
Super User

@Anonymous - I would have to say #1 is memory. The amount of memory in your machine dictates how big your data models can be and which operations DAX will and will not be able to perform on that data. Since everything is done in memory for the most part, CPU speed, disk IOPS, etc. are very much secondary. Jam as much memory as you can into the machine. I once built 4 servers that each had a terabyte of memory for a very similar reason: a 600+ GB SQL database could be loaded entirely into memory, geo-clustered between 2 locations with local redundancy. Nifty. And blazingly fast.
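If you want to see where that memory actually goes, column cardinality is usually the biggest driver of model size. One quick check (assuming a reasonably recent engine, since COLUMNSTATISTICS was only added to DAX in 2021) is to run this as a DAX query, e.g. from DAX Studio:

    // Lists min, max, cardinality, and max length for every column in the model;
    // high-cardinality columns are the usual memory hogs
    EVALUATE COLUMNSTATISTICS()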


@ me in replies or I'll lose your thread!!!
Instead of a Kudo, please vote for this idea
Become an expert!: Enterprise DNA
External Tools: MSHGQM
YouTube Channel!: Microsoft Hates Greg
Latest book!: The Definitive Guide to Power Query (M)

DAX is easy, CALCULATE makes DAX hard...


2 REPLIES
Anonymous
Not applicable

Sick! But if it is memory you need, you could reason that renting a server would solve any problem. I run:

- Processor: Intel(R) Core(TM) i7-8650U CPU @ 1.90GHz, 2112 MHz, 4 core(s)

- RAM: 16.0 GB

 

So if I need more processing power, should I just build my model on an AWS server and run it from there?

Because I guess that once all the DAX is calculated, I can just access the report via the web client, or download the whole file back to my local machine and run the filters on that, correct?

 

Thomas
