Share your thoughts on DirectQuery for Power BI datasets and Azure Analysis Services (preview)

Hit Reply and let us know what you think of DirectQuery for Power BI datasets and Azure Analysis Services. To learn more about this feature, please visit this blog post or our documentation.


Here are some areas that we'd like to hear about in particular:

  • Performance
  • Query editor experience: the remote model query doesn't show up in the query editor, only in the data source settings dialog. What are your thoughts?
  • Navigator experience
  • Thoughts around governance and permissions for models that leverage this feature
  • Nesting models, i.e. building a composite model on top of a composite model
  • Automatic page refresh for live connect in composite models

Thanks and we look forward to hearing your feedback!


- The Power BI Modeling Team

524 REPLIES

I understand that this is annoying. We don't actively stop you from doing this in Desktop, because we don't know whether you are going to publish to the Service. So, instead of "breaking" Desktop for those who never publish to the Service, we allow it, and it breaks only when you publish to the Service. If we warned about everything that won't work in the Service, that would be a very jarring experience and would get in the way a lot, while it might not even be relevant to you. Please see for a list of the limitations in this preview; your exact scenario is listed as well.

Frequent Visitor

It would be great if there were an officially sanctioned, clear approach (or more than one) to managing large datasets (30+ GB) using DirectQuery (or not). This is especially the case because DirectQuery currently seems like the best way to deal with huge datasets in Power BI. I haven't been able to load huge tables to dataflows, datamarts, or blob storage without timeout errors, etc. I am fully in the MS ecosystem, so I connect to Azure SQL Database using DirectQuery. The tables in the DB have columnstore indexes, and I'm using auto aggregations. I'm pretty sure that's all (or mostly all) I can do, but I'm not entirely sure, because it's a little bit of witchcraft. I would love a video or post like this: dealing with huge datasets from SQL Server et al. in Power BI.

This thread isn't related to large models (import or DirectQuery). It's about connecting to Analysis Services in DirectQuery mode rather than Live Connection mode.


It looks like there are a number of posts under the Desktop section of the site that may be helpful:


As a consultant myself working with large models (primarily against Snowflake), the advice is to import the data into the model in the first place. With large data, this requires implementing incremental refresh. (Note that a P1 capacity has a model size limit of 25 GB; see Capacity and SKUs in Power BI embedded analytics - Power BI | Microsoft Docs.)


If you can't scale up the capacity, or reduce the data size through optimisation and rationalisation, then use DirectQuery with aggregations: either auto-aggregations (which may help, depending on how repetitive the queries are) or manual aggregations.
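To illustrate the manual-aggregation approach mentioned above, an aggregation table pre-groups the fact data by the keys most queries use, so common queries hit the small imported table instead of the DirectQuery source. A minimal sketch in DAX, using hypothetical 'Sales', 'Date' and 'Product' table/column names (in practice the aggregation table is usually built in the source, imported, and then mapped via the Manage aggregations dialog rather than defined as a calculated table):

```
-- Hypothetical aggregation table: names are assumptions for illustration only.
-- Groups the fact table by date and product keys and pre-sums the amount.
Sales Agg =
SUMMARIZECOLUMNS (
    'Date'[DateKey],
    'Product'[ProductKey],
    "SalesAmount", SUM ( Sales[SalesAmount] )
)
```

Once the aggregation table is mapped, queries at or above its grain are answered from the imported aggregation automatically, and only queries at a finer grain fall through to DirectQuery.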


There is plenty of material on these topics if you search on Bing. Guy in a Cube has some good material as a starting point:


Regular Visitor

Thanks for adding this feature. Really useful; however, we are having some issues at my company.

As a tenant admin, connecting to a Power BI dataset (switched to DirectQuery) plus an Excel file works fine.
But my colleague (not a tenant admin, but a workspace admin) is getting an error:

"Your credentials could not be authenticated: Credentials are missing." — although everything is set up correctly in the source (we tried several kinds). He also enabled the feature in Power BI Desktop.

I checked all the tenant admin settings (mentioned here) too.
Do you have any idea what could cause the problem?

Thanks for your help

Sounds like a permissions problem, or credentials not being configured for the source. Have you checked the source settings in the Service?

Frequent Visitor

I'm having trouble developing composite models.
I connect to a Power BI service dataset and create a model in DirectQuery mode.

I then enabled the "DirectQuery for Power BI datasets and AS" option.

Despite this, it doesn't give me the option to connect to any other file.

Also, I went to External Tools and tried to open Tabular Editor, and it gives me an error: "The remote server returned an error: (401) Unauthorized".

Am I doing something wrong?
Do I need to enable anything else for composite models to work?




Not sure. Did you restart Desktop after enabling the preview?

Yes, I did. But nothing happens even though I am connected live to the published dataset.


Is this dataset published on Pro/Premium/PPU? It's not located in "My Workspace", is it?

The report is published with a free account; I do not have Pro or Premium.
Also, it is published in My Workspace.

from the documentation: "

With another workspace, it just worked, thank you very much

ok great!

Frequent Visitor


This feature now supports SQL Server Analysis Services 2022. It's great!

Composite models on SQL Server Analysis Services | Microsoft Power BI Blog | Microsoft Power BI

Is it possible to support SQL Server Analysis Services 2019 in the future?

Thank you.

No, we are not planning to add support for versions of SSAS older than 2022, since this feature requires changes in the engine itself.

Thank you for the reply.
But it's a pity, because SSAS 2019 is the latest official release for now.

On-premises users will have to wait until at least next year to use this functionality, and there is also a migration to do.

Anyway, it's a very useful functionality. Thank you for the work.

Helper II

So, this one is an oddball one.


When I query a particular DQ dataset from Excel, I get the error "Data seen at different points in time during execution of this query are inconsistent with each other. This could be due to external transactions updating same data while this query was running. Please rerun this query." However, the exact same setup querying the underlying dataset (the source for the DQ dataset, a typical import model) doesn't produce this error. 🤔


Tracing with DAX Studio, I find the underlying MDX query (which works when run directly against the source dataset) is:


MDX Query:

                {DrillDownLevel({[Value Selector].[Value Selector].[All]})}
FROM [Model]
    [Measures].[Total Actuals NZD]


The resultant DAX generated against the source dataset (which works when run in DAX Studio) is:


DAX Query 1:

    "__Agg0", [Total Actuals NZD]


DAX Query 2:

VAR _Var0 = VALUES('Value Selector'[Value Selector])
    'Value Selector'[Value Selector],
    "__Agg0", [Total Actuals NZD]


DAX Query 3:

    VALUES('Value Selector'),
    'Value Selector'[Value Selector],
    'Value Selector'[Order]


The model structure is simple enough, although not the best modelling setup. The measure used is a simple column sum...
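For reference, the "simple column sum" measure referenced in the queries above would look something like this in DAX (the table and column names are assumptions, since the actual model isn't shown in the post):

```
-- Hedged sketch: a plain SUM over a numeric column, with no modifiers.
-- 'Actuals'[Amount NZD] is a hypothetical table/column name.
Total Actuals NZD = SUM ( 'Actuals'[Amount NZD] )
```

A measure this simple shouldn't itself cause the consistency error, which suggests the issue lies in the DirectQuery-over-dataset layer rather than in the measure definition.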




Not sure where to go from here? 🤷‍

Hmm, can you please open an issue on or open a support ticket?

In case you want to track your end: Support request ID 2205310030002149

New Member

When you link to a second dataset, the RLS disappears, which creates a risk of exposing data. Is there any plan to fix that?

I am not sure I follow this - can you please provide more details?
