New Member

Native PowerBI data model vs SSAS Tabular: Change since new size limit announcement

Hi,

 

What are your views on the need for, e.g., SSAS Tabular cubes after the upgrade that will make it possible to load 400 GB of data into Power BI? A lot of our data models span 20-400 GB, and with the current 10 GB limitation we instead build SSAS Tabular cubes and connect Power BI to those. But now I am thinking that we could just use the native Power BI data model instead of going through a Tabular cube server. I have not seen any discussion of the impact of this upgrade.
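As a rough way to check whether an existing model would fit under a size limit, the storage DMVs on the Tabular instance can be queried to see how much of the raw data actually survives VertiPaq compression. This is only a sketch: it assumes SSMS or DAX Studio connected to the server, and DMV queries support only a restricted SQL subset (no ORDER BY or GROUP BY):

```sql
-- Per-column dictionary sizes (bytes) of the in-memory model.
-- Summing these, together with the segment sizes (USED_SIZE) from
-- $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS, gives a rough
-- estimate of the compressed model footprint.
SELECT DIMENSION_NAME, ATTRIBUTE_NAME, DICTIONARY_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
```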

 

This was announced in the keynote earlier this spring, about 27 minutes in.

https://youtu.be/8WhXCwHynEE

 

/Olof

4 REPLIES
Super User II

Re: Native PowerBI data model vs SSAS Tabular: Change since new size limit announcement

@olofda My view is that I would first do a proof of concept and test to see if it works for your particular environment.

 

Are you currently using DirectQuery? The difference would be loading the data into Power BI directly. Who else uses the SSAS models, and do they have multiple users? Do you expect your data to grow? If you are pushing right up to the maximum of the capacity, what buffer do you have for growth?

 

 





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!




New Member

Re: Native PowerBI data model vs SSAS Tabular: Change since new size limit announcement

@vanessafvg thanks for your reply!

 

We have a lot of different setups depending on the use case and audience, spanning from a small group of central users with access to all data to thousands of users worldwide with local access. I am currently looking into how we can optimize our SSAS Tabular data models so that we can leverage live connections in a better way (today there is a lot of importing), but seeing this keynote made me wonder if there could be an upside to using the native data models directly in Power BI instead. For example, we would not need dedicated IT teams developing and maintaining SSAS cubes and models if we loaded extracts from our data lake directly into Power BI, which is something that could be maintained by the data champions and analysts on the business side. It seems to me that such an approach would be a lot neater in terms of "self-service". Is there something important that I am missing here?

 

Capacity and data growth are of course a very important part of the question, but let's assume they are not an issue for the sake of the discussion.


Thanks!

Super User II

Re: Native PowerBI data model vs SSAS Tabular: Change since new size limit announcement

@olofda I think what you are asking is a big question and difficult to answer, as that solution requires pushing those skills to the self-service users. Training and skill levels are vital, so it is much bigger than just shifting technology; one needs to understand all the variables of your environment.

 

The best thing to do, in my mind, is to take a pilot group, build an end-to-end solution using this method, and see what it brings up.

 

The benefit of a cube versus pulling those extracts from the data lake every time is that you have a built-in layer that has already interpreted all the business rules; without it, you have to recreate that layer every time you create a Power BI model (unless you use one model as a source for other models, but scalability might be an issue there).
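As a minimal sketch of what that built-in layer buys you (the table and column names here are hypothetical): a business rule such as net sales is defined once as a measure in the shared Tabular model, rather than being re-created in every report-level model:

```dax
-- Hypothetical business rule, defined once in the shared model;
-- every report that live-connects to the model reuses it unchanged.
Net Sales :=
    SUMX ( Sales, Sales[Quantity] * Sales[Unit Price] )
        - SUM ( Sales[Discount] )
```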

 

What are the issues you are experiencing with the current SSAS setup?









New Member

Re: Native PowerBI data model vs SSAS Tabular: Change since new size limit announcement

Hi,

 

Thank you for your input and sorry for my late reply.

 

Yes, I know it is a big question 🙂 I wouldn't say we have any issues with the current setup; I am more curious about whether it is time to rethink the SSAS Tabular models now that the 400 GB native data models have been introduced.

Right now we have dedicated IT resources who help us create Tabular cubes according to the business needs and requirements, but this is a long process. I completely agree that using the native data models to become more self-serviced would require pushing those skills onto the users, but here we would have a dedicated team on the business side who are already data champions and well connected with our data architecture. The benefit would be that we could create new models and sources of analysis much more quickly.

 

I guess what I am wondering is whether there are clear pros and cons to the two different setups from a technical perspective, rather than from the team/people/resource perspective of knowledge and maintainability. Does that make sense?

 

Thanks again!

 

/Olof

 

 

 

 
