Frequent Visitor

Run Data Gateway from container

Currently my company has a data gateway installed on a physical server on-prem. This server is being decommissioned, and I need to run the data gateway from Azure, where we have the VPNs configured and access to the on-prem servers. I know I could spin up a VM and install the gateway there, but that isn't cost-effective because the VM would need to be up and running 24x7 just to serve the gateway. I think it's best to prepare a container image with the gateway (or an empty Windows image that installs and configures the gateway on load), but I don't have a clue where to start.


How can I spin up a container to serve the data gateway? Or is there another way to access on-prem SQL servers from the Power BI platform?

Super User II

Are you saying that running an on-prem VM has a physical cost for you?


No, it doesn't. I was saying I need to run the gateway from Azure, as the company is trying to move away from on-prem servers except where absolutely required. So I'd need to create a VM in Azure, but it's really expensive to run a VM 24x7, which is why I was thinking of creating a Docker image to do it.
Super User IV

@PedroC88 - Well, in theory you could use automation to only spin up the VM, say, twice a day and schedule your refreshes to run in that window?




The problem is that gateways by definition have to be on-prem (well, they have to be able to access on-prem data sources, and they have to be able to see the on-prem AD). And gateways are rather pointless if they are not up 24x7.


Of course you can go crazy and run them in Azure with a VPN connection back into your network. But that wholly defeats the purpose of putting the gateway as close as possible to your on-prem data sources for performance reasons.


Think of all the data travel to and fro that would result from placing the gateway in Azure.



The container or the VM would be joined to a VPN in Azure with access to the on-prem servers. As for performance, it wouldn't be worse than it is today, because we already run the gateway from country X and connect it to countries A and B to serve those countries' data.



Well... this would technically work, but a question I had for another day was: how can I allow users to refresh the data sources on demand? Note that the company is only now onboarding Power BI, so this is all kinda new to us. If we have a scenario where users need to run the reports on demand, how could we handle that?


As long as "on demand" means "from 8 AM UTC to 10 AM UTC, because that's when the gateway is spun up" - that would work. It doesn't really matter if the refresh is scheduled, on demand, or OneDrive-triggered. If the gateway is down, the refresh request will fail.
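For the "on demand" part: besides the Refresh Now button in the service, a refresh can be triggered programmatically through the Power BI REST API's dataset-refresh endpoint. A minimal sketch, assuming you already have a workspace (group) ID, a dataset ID, and an Azure AD bearer token (all placeholders here):

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_request(group_id: str, dataset_id: str,
                          token: str) -> urllib.request.Request:
    """Build the POST that asks the Power BI service to refresh a dataset.

    Note: the refresh will still fail if the gateway is down at that moment.
    """
    url = f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    body = json.dumps({"notifyOption": "MailOnFailure"}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Sending it (requires a valid Azure AD access token):
#   urllib.request.urlopen(build_refresh_request(group_id, dataset_id, token))
```

Whatever triggers the refresh, the gateway has to be reachable at that moment, which is the 24x7 constraint being discussed here.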


@lbendlin that's why I'm not fully sold on that "time window" approach. Normally users don't need to see live data; yesterday's data is usually fine. However, during month-end closings, for example, when accounting is reconciling their systems, if they make any entries or adjustments they need to be able to refresh the report on demand, which highlights the need for the gateway to be on 24x7.


Also - this whole discussion assumes that your data sources are accessed in Import mode. As soon as you use DirectQuery, you need your gateway to be on all the time anyway.
