
Power BI – Live connection vs. Import, comparison and limitations

Some data sources offer both options: importing the data and using a live connection. But there is no easy way to change the method once you have already built your report, so you should carefully decide upfront which method to use. To make the right decision, it is good to know the limitations of each method. The purpose of this blog post is to cover them.

First, what kind of license will you need? If you use a live connection, all end users consuming the report/dashboard need a Pro license. If you import data from an on-premises data source through a gateway, a Pro license is required as well. If your data is accessible without a gateway but you want to refresh it more often than once a day, a Pro license is also required. If you want to stay on the free Power BI license, you have to use the import method only, avoid on-premises data sources, and refresh data at most once a day.


Data import

Data import can be used against any data source type. The current published Power BI service limit on file size is 1 GB. Data is loaded into the file using column store compression, so if you import only the data required by your visuals, this is not too limiting. Even with a Power BI Pro license you are limited to 8 refreshes per day. Refresh times can be scheduled on the hour or on the half hour, and the scheduled time is approximate: when the refresh actually happens depends on service utilization. If you need to refresh data more often, consider using a live connection. When using import, data is stored in the Power BI file/service. Therefore there is no need to set up permissions on the data source side (a service account for the load is enough), and you can share data publicly or with people outside your organization. On the other hand, all data is stored in Power BI, so if you have a Row-Level Security requirement, you need to implement it on the Power BI side. If you need the same logic applied to multiple reports, you have to duplicate it in all of them. The same applies to calculations: if you need the same measure in multiple reports, you need to recreate it every time.
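Besides the refresh schedule, a refresh of an imported dataset can also be queued programmatically through the Power BI REST API. Below is a minimal, hypothetical sketch: it assumes you have already obtained an Azure AD access token with dataset-write permission, and the dataset ID is a placeholder.

```python
# Minimal sketch: queue a refresh of an imported dataset via the
# Power BI REST API. ACCESS_TOKEN and the dataset ID are placeholders;
# you must acquire a real Azure AD token separately.
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(dataset_id: str) -> str:
    """Build the endpoint that queues a refresh for a dataset in 'My Workspace'."""
    return f"{API_ROOT}/datasets/{dataset_id}/refreshes"

def trigger_refresh(dataset_id: str, access_token: str) -> int:
    """POST an empty body to the refreshes endpoint; HTTP 202 means the refresh was queued."""
    req = urllib.request.Request(
        refresh_url(dataset_id),
        data=b"{}",
        method="POST",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Note that on a Pro (shared capacity) license, refreshes triggered this way still count toward the daily refresh limit.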


Live connection

There are more limitations in place for live connection. It doesn't work against all data sources; the current list can be seen here

You are also limited to the single data source/database you selected; you can't combine data from multiple data sources anymore. If you are connected to a SQL database, you can still create logical relationships between objects from that database, as well as measures and calculated columns. When you are connected to SQL Server Analysis Services, you are limited to the report layout only and can't even create calculated columns or measures. When connected to SSAS, you can also run into licensing limitations: Power BI issues DAX queries against SSAS, and this isn't supported in the Standard edition of SQL Server before SQL Server 2016. When using a live connection, users have to have access to the underlying data source, which means you can't share the report outside your organization or publicly. On the other hand, once you set up security, all reports using a live connection to the same source have the same security model applied automatically. Similarly, you can carefully prepare your data model with all measures and calculated columns on the data source side, and all reports built on that source benefit from it. Refresh frequency is unlimited: Power BI simply shows data as it is in the database. However, be careful using a live connection against highly transactional OLTP systems. If you suffer from performance problems now, the analytical workload from Power BI won't help; consider enabling features on the data source side such as Operational Analytics in SQL Server 2016.
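As a hypothetical illustration of keeping logic on the source side, here is what a shared measure and a row-level security filter might look like when defined once in an SSAS Tabular model; the table and column names are made up, and every report that live-connects to the model inherits both.

```dax
-- Measure defined once in the Tabular model;
-- available to every live-connected report.
Total Sales := SUM ( FactSales[SalesAmount] )

-- Row filter on a security role (DimRegion table): each user sees
-- only the rows whose manager login matches their own Windows login.
= DimRegion[ManagerLogin] = USERNAME ()
```

With import, the equivalent measure and filter would have to be recreated in every report file.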



Both methods have limitations a developer should be aware of upfront. For import it is mainly refresh frequency, data size, and the necessity to duplicate the same logic (calculations, security) in multiple reports. Templates can help with that, but if you need to change the logic of a calculation, you still need to change it in multiple reports. For live connection, not all data sources are supported, and once you pick one, you're limited to that single data source for the report. If the logic is set up on the source side, all reports can benefit from it. My personal preference is to use a live connection against SSAS models, keeping the model centralized and the logic maintained in one place. For SQL databases where I don't have a model, I prefer import and use Power BI for both modeling and visualization.


In case you do run into a scenario where you have to replace your data source, here's a quick walkthrough showing you how to go about doing it:


Hi Victor, 

yes, I know it is possible, but it's a lot of pain 🙂 Not only do you have to remap all fields, you also have to recreate all measures and calculated columns. And in some scenarios you can't switch at all, for example if you combined multiple data sources via import and then decide to switch to a live connection. So it should be planned carefully upfront.


Good point!

what is your experience with the scalability and performance of reports between the import and direct query approaches?

@Ermin generally speaking, you can hit a lot of limits using the import method: data file size, number of refreshes, and recently we even ran into query limits. So for a scalable solution with larger datasets, live connection is definitely the better approach. Just make sure your enterprise gateway doesn't have any connectivity issues, either internally or externally.


Hello Experts,

I am new to Power BI. I already made a graph in Power BI Desktop and downloaded the app on my phone. My question is: how can I see the graph I made on my desktop in the Power BI app?

Hi @juanglezmex,

publish your report from Power BI Desktop to the Power BI service, then log in to your mobile app and, in "My Workspace", switch to reports. It should be there.



Can we have a Live connection with SQL Server on Azure VM from Power BI ?

Can we have a Live connection with SQL Server on Premise from Power BI ?

@sajid_siddiqui: Yes, you can have a live connection against SQL Server in an Azure VM; you just need to make sure that connectivity and authentication work for you during the development phase. While developing, you'll need to set up a firewall rule for the machine where you develop with Power BI Desktop. As far as I know, no further setup is needed after publishing to the Power BI service. Also, the Azure VM will need to be in the same domain, or you'll need to deal with UPN mapping after publishing.



thanks Jiri,

Is there any way to achieve column-level security in Power BI?

For example, some users can see only some measures from a fact table, while an admin user is able to see all measures from that fact table.

Hi Sajid @sajid_siddiqui

This is not supported as far as I know. Even if it were, I'm not sure how it would work if you referenced a column in a chart and the user didn't have permission on that column.



I have an SSAS cube that I connect to in Power BI Desktop. I'm able to connect to it without any problem using both the Import and the Live Connection methods. However, with the Live Connection, when I try to add some of the measures to a visual, such as a table or a slicer, the visual fills with a grey background and displays the error "Can't display the visual", with a link that says "See details".


When I click on the link, the only information it shows is this:


Error Code:


OData Error Message:
Failed to execute the DAX query.


PBI_DirectQuery_Unrestricted is enabled, so that's not the problem. And this only happens with the Live Connection method. If I use Import instead, I'm able to add these measures to visual elements without any problem. But since I need access to the entire cube, using Import doesn't work; there's just too much data.


Any idea why I might be running into this problem?



Hi @jetboy2k

I guess you are referring to a live connection against a Multidimensional cube. What edition and version of SQL Server are you connected to? If it is by any chance a Standard edition of SQL Server older than 2016, this is not a supported scenario due to licensing restrictions (Tabular and DAX aren't supported in Standard edition prior to SQL Server 2016).



Jiri, thanks.


I am in fact on 2012, so that would appear to be the issue.


Thank you again for the information.

Isn't importing from SSAS Tabular in Power BI similar to how an SSRS report connected to SSAS models would work? SSRS honors the row-level security set up in the SSAS model, so I assume that when Power BI imports the data, it would also honor row-level security if authentication is set to impersonate the logged-in user. Is that correct?

@vickyd - if you mean a live connection to SSAS Tabular, yes, it would read it from the SSAS data model; it works against Multidimensional as well. But it doesn't work for the import method. Import happens under the one account you set up in the data source; data is read into the Power BI service during refresh and stays there. The end user isn't authenticated by the back-end database when using the import method. Jiri

Hello Jiri, I have a problem with refreshing a live Azure AS connection, could you help me please?
In Visual Studio (SSDT), after processing the table I see 42 rows transferred (I deleted one row in the database). After deploying my model to Azure Analysis Services I see 43 rows transferred, and Power BI after refresh sees 43 rows too (it should be 42). When I add a column or a measure, the deploy succeeds. The problem is only with refreshing data. Any idea, please?

Hi @JakubBalaj

check the processing options of your deployment. If you want data refreshed as well when it changes, you can't use the default processing option "Default", because it doesn't check for changes in data. Use "Process Full" instead, or don't process during deployment and reprocess the SSAS database manually afterwards.
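For a manual reprocess, one option is a TMSL script run from SSMS against the Analysis Services server; a sketch, assuming a database named MySalesModel (the name is a placeholder):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "MySalesModel" }
    ]
  }
}
```

The "full" type reloads all data and recalculates, unlike the default option, which can skip data it does not consider changed.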


Thank you Jiri, It was helpful 🙂


Best regards,



Suppose I have connected to a data source via direct query mode. Does it affect the underlying data source if some changes are made to the data in the Power BI Desktop query editor? Please guide....

I'm all new to this. I played with it before, but now I would like to use it at my work. We use Office 365, so our source file (a very basic Excel sheet) is saved in the cloud.

What I have done to begin with is:

Use the version of the file on my desktop

Created a report on Power BI Desktop

Saved file

Published it to the web-based Power BI service

I can now see it on the app on my phone (great)


It is a database of sales per customer.

This grows on a daily basis.

The report has a basic filter to show sales for a specific customer on my phone.

How do I make this "live", so when a marketer visits a customer and he goes to his app, he has the latest data?


As far as I understand from playing around, I will have to go to the desktop version, update the data, then publish again (on a daily basis?).


Any suggestions would be appreciated.



Hi @Chris_Botha

Excel doesn't support a live connection, so you can use import only. If the document is stored in OneDrive, you won't need a gateway for automated refresh, so you can just set up refreshes (up to 8 per day during working hours). This won't make it real time, but at least near real time.



Thank you so much for answering my question.


Two more: 

If my file is in SharePoint on Office 365, can I link it to Power BI? What I mean is: if I can link it to Power BI on the web (not doing it on the desktop first and then publishing), can I then have some sort of automated updating?


Secondly, you mention a refresh setting, where do I find this?




Any reply to the above questions?



refresh options are in the dataset settings in your workspace. If you click on the workspace, you should get to a page with multiple tabs: Dashboards, Reports, Files, Datasets.

Check the Datasets tab.