We've built a main Tabular model in SSAS, but our end users would like the ability to add more ad hoc dimensions as they need them, so we really need to import the Tabular model into Power BI.
However, when we import the model, the tables are merged into one, and all the fields and measures are mixed together as well.
We also run into lots of "out of memory" errors.
I'm wondering if there is any workaround. If there is no alternative solution, we'll have to give up on Power BI.
Based on your requirement, you would not consider using a live connection, correct?
When you import very large data into Power BI Desktop, it may slow performance, but dataset capacity itself is not the limit.
The "out of memory" error occurs because the program consumes a lot of memory while running.
What is your current environment and how much RAM do you have? Would you consider upgrading it for better performance?
Additionally, to improve usability and performance, you should also consider using a proper naming convention and deciding which columns to include before loading.
This article provides a quick list of best practices for Power BI.
I don't want to get into the RAM part at this moment; I already know how to work around that.
My top priority is to understand why "Import" has a different working structure from "Live Connect".
For example, in my Tabular model I have Table A, Table B, and Table C. When I connect live, the field list shows a structured view, just like the model itself: Table A, Table B, and Table C, with all fields marked with a sigma sign being my measures.
However, when I import, only one table is loaded, with all the measures and columns mixed into it, so my structure is completely messed up. Most importantly, I want to see my data in the Relationships view as separate tables with relationship lines between them. I'm not sure whether that's possible.
This seems to be the default behavior of import mode for SSAS Tabular in Power BI currently. An alternative is to import the tables one by one: if you have three tables, run the import three times, and you will end up with three separate tables.
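A minimal Power Query (M) sketch of the one-table-at-a-time approach, using `AnalysisServices.Database` with a DAX query that returns a single table. The server name `myserver` and database name `MyTabularDB` are hypothetical placeholders:

```m
let
    // Connect to the SSAS Tabular database and evaluate a DAX query
    // that returns just one table; repeat this query once per table
    // ("Table A", "Table B", "Table C", ...) to get separate tables.
    Source = AnalysisServices.Database(
        "myserver",                       // hypothetical server name
        "MyTabularDB",                    // hypothetical database name
        [Query = "EVALUATE 'Table A'"]    // DAX query returning one table
    )
in
    Source
```

Each query becomes its own table in the model, after which you would recreate the relationships manually in the Relationships view.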
Loading the tables separately doesn't solve the problem, because after loading I can't recreate the relationships. Power BI keeps telling me it needs unique values to establish the relationship, even though I know the values in the table are unique.
I have noticed that the import adds a null row, which violates the uniqueness check when creating relationships. If you filter out the null/blank row on the primary-key table in the Query Editor, the join will be allowed.
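The null-row filter above can be sketched as a Power Query (M) step. The table literal and the key column name `CustomerKey` are illustrative stand-ins; in practice `Dimension` would be the previous step of your imported query:

```m
let
    // Illustrative stand-in for the imported primary-key table;
    // note the extra null row that import mode adds.
    Dimension = #table(
        {"CustomerKey", "Name"},
        {{1, "Contoso"}, {2, "Fabrikam"}, {null, null}}
    ),
    // Remove the blank row so the key column is unique again,
    // which lets the one-to-many relationship be created.
    FilteredRows = Table.SelectRows(Dimension, each [CustomerKey] <> null)
in
    FilteredRows
```

After applying a step like this to each primary-key (one-side) table, the relationships can be recreated in the Relationships view.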
Despite that workaround, I am still confused as to why Import mode does not use the relationships defined in the model to produce result sets consistent with a live connection.