varunaggarwal30
Frequent Visitor

Using Azure Analysis Services for Data Model

Hi All,

 

I have created a data model in Power BI for a large organization. Currently it stays within all data limits and works fine. For scalability, I am thinking of moving the data model out of Power BI. I could create an SSAS tabular model, move it to AAS, and connect Power BI to it from there, but a lot of rework is required to rebuild the data mashup in SSIS.

 

I am using SSDT at the 1400 compatibility level with Data Lake as my source. SSDT crashes at times and asks for login information every time the data is refreshed. Has anyone tried this process, or is there a better way to go about it?

 

 

Thanks,

Varun Aggarwal

3 REPLIES
v-qiuyu-msft
Community Support

Hi @varunaggarwal30,

 

You can follow this article to import a Power BI data model into SSAS 2016 Tabular.

 

By the way, you can vote for this idea: Import PowerBI Desktop model into SSAS Analysis Services Tabular

 

Best Regards,
Qiuyun Yu 

Community Support Team _ Qiuyun Yu
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

@v-qiuyu-msft I am able to load the model in SSAS and deploy it to Azure Analysis Services using SSDT. My question is whether we can automate the data refresh once the model is deployed to AAS, considering the data source is ADLS. I tried to refresh the data, but every time it asks for credentials to connect to the Data Lake.
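
One possible way to automate this, sketched below: store the ADLS credentials in the model's data source definition on the server (for example through SSMS or a TMSL createOrReplace script) so processing does not need an interactive login, and then trigger the refresh on a schedule through the asynchronous refresh REST API for Azure Analysis Services using a service principal. This is only a sketch; the server region, server name, model name, tenant, and app registration values are placeholders, and the service principal is assumed to have administrator rights on the AAS server.

```python
# Sketch: trigger an asynchronous refresh of an Azure Analysis Services model
# via the AAS REST API, authenticating with a service principal (no interactive login).
# All identifiers below are placeholders.
import requests
from azure.identity import ClientSecretCredential

TENANT_ID = "<tenant-guid>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<app-registration-secret>"

SERVER_REGION = "westus"        # region prefix of the AAS server URI
SERVER_NAME = "myaasserver"     # Azure Analysis Services server name
MODEL_NAME = "MyDataLakeModel"  # deployed tabular model (database) name

# Acquire an Azure AD token scoped to Azure Analysis Services.
credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
token = credential.get_token("https://*.asazure.windows.net/.default").token

# POST a refresh request; the service runs it asynchronously and returns a
# Location header that can be polled for status.
refresh_url = (
    f"https://{SERVER_REGION}.asazure.windows.net/servers/{SERVER_NAME}"
    f"/models/{MODEL_NAME}/refreshes"
)
body = {
    "Type": "Full",               # full reprocess; partition-level options also exist
    "CommitMode": "transactional",
    "MaxParallelism": 2,
    "RetryCount": 1,
}
response = requests.post(
    refresh_url,
    json=body,
    headers={"Authorization": f"Bearer {token}"},
    timeout=60,
)
response.raise_for_status()
print("Refresh accepted; poll status at:", response.headers.get("Location"))
```

A call like this could be scheduled from an Azure Function, Azure Automation, or a Logic App. The credential prompts seen in SSDT during development are a separate matter: once the model is deployed, processing should use the credential saved with the data source on the server rather than an interactive login.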

