KevinM
Frequent Visitor

What is the maximum number of datasets, and how much memory and how many cores can a gateway handle?

I am the admin for an enterprise that is beginning to explode in Power BI usage, and I am upgrading my system to handle the multitude of users and teams we have using Power BI. I am looking to max out my servers to handle the extremely large datasets and the hourly or 30-minute refreshes they want to run.

 

I am currently setting up a cluster of 4 servers with distributed services to load-balance the work. Each server has 4 cores and 8 GB of memory. How many cores can the gateway handle, and what is the maximum memory?
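For context, one way to ground the sizing question is to sample actual CPU and memory use on the current gateway VMs during the busy refresh windows. Below is only a rough sketch of what I mean, assuming Python with the psutil package is available on the server; the output path is a placeholder.

import csv
import time
from datetime import datetime

import psutil  # third-party package: pip install psutil

OUTPUT_FILE = "gateway_utilization.csv"   # hypothetical output path
SAMPLE_EVERY_SECONDS = 60                 # one sample per minute

with open(OUTPUT_FILE, "a", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "cpu_percent", "memory_percent"])
    while True:
        cpu = psutil.cpu_percent(interval=5)       # CPU % averaged over 5 seconds
        mem = psutil.virtual_memory().percent      # % of RAM currently in use
        writer.writerow([datetime.now().isoformat(), cpu, mem])
        f.flush()
        time.sleep(SAMPLE_EVERY_SECONDS)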

 

How large can a cluster be?  

 

Also, is there a maximum number of datasets that can be configured?

 

I want to run these gateways at the max to handle everything our users will throw at them.

 

Is there anywhere online, other than here, where I can find this kind of information?

 

Thanks 

 

2 REPLIES
v-piga-msft
Resident Rockstar

Hi @KevinM ,

 

From my knowledge, I'm afraid there is no document that describes how many cores the gateway can handle or what the maximum memory is.

 

That said, Power BI Premium is available in node configurations with different v-core capacities.

 

Could you be confusing the concepts of gateway and Premium capacity?

 

Do you use Premium capacity?

 

Best Regards,

Cherry

Community Support Team _ Cherry Gao
If this post helps, please consider accepting it as the solution to help other members find it more quickly.

KevinM
Frequent Visitor

I work in the Support/Operations area of our company, am the admin for Power BI, and am researching setting up new gateways.

 

We have many Premium sites and are adding more, and I understand the difference between the gateways and Premium sites. This is all about the gateway application running on the VM servers in a cluster. The groups using Power BI are building many very large datasets, ranging from 6M to 600M in size. Some of them are refreshed once a day while others are refreshed every half hour. In the morning especially, and occasionally during the day, some datasets get timeout errors and fail, and during those times large numbers of reports are updating. When the same reports run later, with fewer datasets refreshing, they never fail.
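One way to see which refreshes overlap and which ones time out is to pull the refresh history for each dataset through the Power BI REST API. The following is only a rough sketch, assuming Python with the requests package and an Azure AD access token obtained separately; the workspace and dataset IDs are placeholders.

import requests  # third-party package: pip install requests

# All three values below are placeholders; the token would come from Azure AD
# (e.g. via MSAL) with at least Dataset.Read.All permission.
ACCESS_TOKEN = "<azure-ad-access-token>"
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes?$top=50"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

for refresh in resp.json()["value"]:
    # Each entry carries start/end times and a status such as Completed,
    # Failed, or Unknown (still in progress), which makes overlapping and
    # failing refreshes easier to spot.
    print(refresh["startTime"], refresh.get("endTime"), refresh["status"])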

 

So again, I am building new VMs and clustering them, and I want to give the VMs as much power as I can without going beyond what the gateway application can handle. I am beginning to understand that there doesn't seem to be any documentation or testing of the gateway's maximum abilities, which surprises me, having been a client/server application developer in my past.

 

This all goes along with the lack of monitoring tools for admins to watch the gateways closely, see which datasets are refreshing and causing issues by using too many resources or running for long periods, and report when too many datasets are refreshing at once.
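From what I can tell, the gateway can write optional performance report files (query execution and system counter CSVs) once the extra logging is switched on in its configuration, which helps a little with the monitoring gap. The sketch below, assuming Python with pandas, just summarizes those files; the report folder path and the file/column names are assumptions that vary by gateway version, so inspect the files first.

import glob
import os

import pandas as pd  # third-party package: pip install pandas

# Default report folder for the gateway service account; treat the path and
# the file/column names as assumptions and adjust to what you actually find.
REPORT_DIR = r"C:\Users\PBIEgwService\AppData\Local\Microsoft\On-premises data gateway\Report"

frames = [
    pd.read_csv(path)
    for path in glob.glob(os.path.join(REPORT_DIR, "QueryExecutionReport*"))
]

if frames:
    report = pd.concat(frames, ignore_index=True)
    print(report.columns.tolist())   # inspect the available columns first
    # If a data source column exists, count queries per source to see where
    # the load is coming from (the column name here is a guess).
    if "DataSource" in report.columns:
        print(report["DataSource"].value_counts())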

 

So my question to the community is: what is the maximum number of CPUs and amount of memory you have given your servers that was efficiently used?
