

danextian
Super User

Cannot add object to new semantic model

Hello,

 

I created a delta table, but I cannot add it to a new semantic model. There are no objects to select from in the New semantic model dialog box. I also tried the SQL connection string, but the delta table is not there either.

danextian_0-1713708517232.png

 










Did I answer your question? Mark my post as a solution!


Proud to be a Super User!









"Tell me and I’ll forget; show me and I may remember; involve me and I’ll understand."
Need Power BI consultation, get in touch with me on LinkedIn or hire me on UpWork.
Learn with me on YouTube @DAXJutsu or follow my page on Facebook @DAXJutsuPBI.
1 ACCEPTED SOLUTION
danextian
Super User

Ok. After several tests, it appears that a delta table created from a sempy DataFrame cannot be added to a SQL endpoint, either because it is not supported or because of a bug.

To give you a brief background: I have a very large pbix file imported from a data source that has since been deprecated. The data is still in the pbix, but it needs further transformation, and connecting via the XMLA endpoint has proven so slow that it triggers a capacity error. The data can be exported to CSV, but materializing it within Power BI requires a lot of computing power, so exporting it all at once using DAX Studio is out of the question, and doing it in chunks is very tedious.

So I loaded the table into a notebook as a sempy DataFrame, saved it back as a delta table via Spark, and voilà: it now appears as a SQL endpoint object and can be queried.

To anyone with the same use case, here's the cleaned notebook code:

 

# Convert a table from a semantic model into a sempy (pandas-like) DataFrame
from sempy import fabric
sempy_df = fabric.read_table(
    workspace="workspace name, not ID",
    dataset="semantic model (dataset) name, not ID",
    table="table name in the model",
)

# Rename columns so the names contain no spaces (Spark/Delta-safe)
column_mappings = {"column name": "column_name"}

# Rename columns using the mapping dictionary
sempy_df.rename(columns=column_mappings, inplace=True)

from pyspark.sql import SparkSession

# Create (or reuse) a SparkSession
spark = SparkSession.builder \
    .appName("Convert DataFrame to PySpark DataFrame") \
    .getOrCreate()

# Convert the sempy DataFrame to a Spark DataFrame
df_spark = spark.createDataFrame(sempy_df)

# Specify the delta table name and path
delta_table_name = "delta table name"
delta_table_path = "Tables/" + delta_table_name

# Write the Spark DataFrame to a Delta table in the Lakehouse
df_spark.write.format("delta").save(delta_table_path)
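If the model has many columns, typing the rename dictionary by hand gets tedious. A small helper (hypothetical, plain Python, not part of sempy) can generate the mapping instead; its result can be passed to the DataFrame's rename() call above:

```python
# Hypothetical helper: build the column_mappings dict programmatically
# instead of typing each rename by hand.
def build_column_mappings(column_names):
    """Map each column name to a Spark/Delta-safe version (spaces -> underscores)."""
    return {name: name.replace(" ", "_") for name in column_names}

print(build_column_mappings(["order id", "unit price"]))
# → {'order id': 'order_id', 'unit price': 'unit_price'}
```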

 

 












6 REPLIES

Hi @danextian ,

Great to hear that you were able to reach a resolution. We hope you keep using this forum and encourage others to do the same. You can always help other community members by answering their queries.

frithjof_v
Continued Contributor

I have never experienced that myself. I guess there must be some technical issue with this Lakehouse. Since you cannot find the table in the SQL analytics endpoint, it sounds like a problem with the metadata sync between the Lakehouse and the SQL analytics endpoint.

 

How did you create the Lakehouse and the table? What method did you use to populate the table with data? Are you able to query the table from a Notebook?

 

But to be honest, I would just try creating a new table, or even a new lakehouse or a new workspace, and see if that works. Hopefully that solves your issue.

 

This may not be a satisfactory answer.

If you want to understand the root cause, I hope someone else can help you with that.

frithjof_v
Continued Contributor

Did you click the refresh icon next to the search bar?

 

If you open the SQL analytics endpoint, is the table there?

(Or in the default Direct Lake semantic model?)

Hi @frithjof_v ,

 

Thanks for the response. I clicked the refresh icon several times and also refreshed the web page. It is not in the default semantic model, as one wasn't created automatically, but there was an option to add an object; I tried it, and the choices were empty. The table is not in the SQL analytics endpoint either, nor available via the Lakehouse connector.











Hi @danextian ,

Thanks for using the Fabric Community.
Sometimes this can happen because of a metadata sync issue or some other technical problem.
Could you please try creating a new delta table or a new lakehouse, as suggested by @frithjof_v ?

Please let us know if that works; otherwise, we will try to debug the scenario.
