tessahurr
Employee

Share your thoughts on DirectQuery for Power BI datasets and Azure Analysis Services (preview)

Hit Reply and let us know what you think of DirectQuery for Power BI datasets and Azure Analysis Services. To learn more about this feature, please visit this blog post or our documentation.

 

Here are some areas that we'd like to hear about in particular:

  • Performance
  • Query editor experience: the remote model query doesn't show up in the query editor, only in the data source settings dialog. What are your thoughts?
  • Navigator experience
  • Thoughts around governance and permissions for models that leverage this feature
  • Nesting models, i.e. building a composite model on top of a composite model
  • Automatic page refresh for live connect in composite models

Thanks and we look forward to hearing your feedback!

 

- The Power BI Modeling Team

531 REPLIES
KristianA
Advocate II

Automatic page refresh on a live-connection composite model was probably the first thing I wanted to test after combining a streaming dataset with an existing model and an Excel file. I was disappointed to find it isn't possible (unless I'm doing something wrong), and it would be high up on my wishlist. It would have been a nested composite model, in case that's what imposes the limitation.


We also need some sort of query editor functionality. So far I haven't been able to figure out how to disconnect from one of the sources I've connected to.

Folders disappearing in the measure tables makes it very messy to work with a large number of measures.

Ideally I would like to be able to pick which tables and measures are brought into the model. Unless we get something like that, we'd pretty much be forced to switch over to a large number of smaller models and connect to those that are needed for a use case. Alternatively, the option to connect to a "layout" or something similar from an existing dataset could help.

Configuring refreshes for the individual connections would also be very useful, since some sources change monthly or daily, while others may change every hour or minute, or even be streaming. Some sort of dual-mode setting for the source or tables you want to connect to, with or without a filter, might be useful for increasing performance.

Hi, APR is not working at the moment, but we are working on it. Moreover, you cannot delete sources right now (this is in the list of limitations in the blog post). Same for folders.

 

nicg
Advocate II

This looks great. I think our business users will be very happy not to have to rebuild what we already have in enterprise BI just to add a couple of their own tables to a report.

 

I do have a few usability questions regarding composite usage with an enterprise model.

 

1. Surrogate IDs - If the enterprise model is based on surrogate IDs, and a user wants to add their own data (from Excel, for instance) and relate it to existing dimensions, will they have access to the model in Power Query in order to merge the surrogate key into their data? How else would they be able to make the join in their new composite model?

 

2. Conformed Dimensions - Coming from SSAS MD, we used to have conformed dimensions, a single instance of which could be shared across multiple cubes. Power BI doesn't have anything like that, so dimensions get repeated in many different models. If two models are brought together in a composite, and both have a "Company" dimension, for instance, what's the right way of making sure each model's measures get filtered by its respective "Company" dimension?

 

3. Refresh - We have a large enterprise model where individual tables are refreshed at different times using XMLA executed by SSIS. This works but isn't great: the Power BI Service timestamp doesn't show the user that the model has refreshed, we can't see in the Service when the tables are refreshing, and the different XMLA queries end up waiting on each other to finish instead of executing in parallel. Would composite models be a better solution? In other words, could we split our current model into "marts" that refresh independently and are then joined together in a composite?

 

Great work Power BI team!

Hi, you should be able to use PQ to create your surrogate IDs on the Excel data, just like normal. Re #2: you can import the Company dimension once and relate it to the other tables. Re #3: not sure, sounds like an interesting scenario.

Thanks for responding.

 

1. What I mean is, assume we have a published Power BI shared dataset where the model makes relationships using surrogate keys, and a user pulls that into Power BI Desktop as a composite model. Then they add their own data from some other data source. Presumably, that other source will have no knowledge of the surrogate keys. How can relationships be created in this scenario? I ask because surrogate keys are a common dimensional modeling technique.

 

2. Does that mean we can pull in selected tables from composite models? I was thinking we had to import the entire model, in which case you'd have multiple Company dimensions, making it difficult to "slice" the data.

Sorry for my late reply, I missed your response. 1) You would need to add those surrogate keys to the other data source as well. 2) No, you have to import the entire model, but you don't have to show all of the tables to the user or relate all of them.
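
To make #1 concrete, here is a minimal Power Query (M) sketch of what "adding those surrogate keys to the other data source" could look like, assuming the business-key-to-surrogate-key mapping can be read from the warehouse the enterprise model is built on (the remote dataset itself does not show up in Power Query). The file path, server, table, and column names below are placeholders, not anything from an actual model:

let
    // The user's own data, which only carries the natural/business key (CompanyCode)
    ExcelSource = Excel.Workbook(File.Contents("C:\Data\SalesBudget.xlsx"), true),
    Budget = ExcelSource{[Item = "Budget", Kind = "Table"]}[Data],

    // Business-key -> surrogate-key mapping, pulled from the same warehouse the enterprise model uses
    DimCompany = Sql.Database("warehouse-server", "EDW"){[Schema = "dbo", Item = "DimCompany"]}[Data],
    KeyMap = Table.SelectColumns(DimCompany, {"CompanyCode", "CompanySK"}),

    // Merge so the user's table carries the same surrogate key the shared dataset relates on
    Merged = Table.NestedJoin(Budget, {"CompanyCode"}, KeyMap, {"CompanyCode"}, "Keys", JoinKind.LeftOuter),
    BudgetWithSK = Table.ExpandTableColumn(Merged, "Keys", {"CompanySK"})
in
    BudgetWithSK

The relationship in the composite model would then be created from Budget[CompanySK] to the surrogate key column of the remote Company dimension.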

st_oleg
Frequent Visitor

Am I the first one to see the "The given key was not present in the dictionary." error?

Some searching didn't give any results.

 

thanks for your work

I'm in the same boat, st_oleg. Second!

 

I have a report with a single dataset on PowerBI.com. When I attempt to convert it to a local model, I see the same error: "The given key was not present in the dictionary"

 

[Screenshots: "To make changes, you need a DirectQuery connection" dialog; "UPGRADED_PREVIEW_Multi DS - Power BI Desktop"]

pr2000
Frequent Visitor

Not sure if this is of any help, but this dataset is underpinned by Databricks. I attempted the same activity using a different dataset underpinned by an on-prem SQL DB, and was successfully able to do so.

 

Hi, does the model have a measure-only table by any chance?

Actually, it does.

I'll try adding a column there.

 

thanks
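
For anyone else hitting this, here is a minimal Power Query (M) sketch of the kind of placeholder column the workaround implies, assuming the measure-only table is defined in the query editor; the table and column names are just illustrative:

let
    // Give the measure-only table at least one real column so it isn't column-less;
    // the column can be hidden in the model afterwards.
    MeasureTable = #table(type table [Placeholder = number], {{1}})
in
    MeasureTable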

I've experienced the same issue. See my post above. 

Yeah, now I see it.
Exactly, I've got the same error window and additional info.

 

I'm experiencing the same issue.

Tried it on multiple PBI datasets.

 

'The given key was not present in the dictionary'

Does the model have a measure-only table by any chance?

Moreover, I tried it on multiple PBI accounts.

Anonymous
Not applicable

@tessahurr I tried it this morning and I got the following error message.

 

"cannot load model we couldn't connect to your directquery data source or internal model. Double Check That your server and databasenames are correct, and that you have permissions to access them OLE DB or ODBC error."

 

I continued and added my 2nd source (SQL) to the model. The live connection model converted to a local model, but all my visuals now show an error:

 

"Couldn't load the data for this visual" - OLE DB or ODBC error. 

 

The preview option is activated. Has anyone solved this?

Best Regards
Manuel

 

Can you provide more details on the type of model you chose in the beginning?

Hi @Anonymous , 

 

I am having the same issue. Any help appreciated, @tessahurr . 

 

Regards

Daniel

ChoiJunghoon
Helper III

"Replace Values" function cannot be used in power query after Dec'20 update.

"The step result does not supported in direct query"... 

 

There was no problem in the November version, but apparently the function is not supported once I update to December.

Do you have any plans to have Power Query support the "Replace Values" or "Renamed Columns" functions again?

 

I am not sure what you mean; this topic is about DQ for PBI datasets and AAS. Is this related?
