Hello,
We are exploring some of the features of the Fabric offering and trying to compare the capabilities of the Spark engine in Fabric Data Engineering against Databricks. Our approach was to configure ETL scripts (PySpark) in JMeter and run them against both platforms. We are struggling to establish a programmatic connection with the Lakehouse artifact. There is plenty of documentation on getting the SQL endpoint of a Lakehouse in order to run SQL queries, but we couldn't find much on connecting to a Lakehouse the way we can connect to a Databricks instance through JDBC. Kindly provide any insight on this. Thank you!
Hi @Shivakumar1,
When you say programmatically, do you mean you want to mount the Fabric Lakehouse into a Spark/Databricks environment that is external to Fabric?
The link below shows how to configure a Lakehouse so that Databricks can read and write its data. It requires credential passthrough to be enabled.
https://learn.microsoft.com/en-us/fabric/onelake/onelake-azure-databricks
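To give a rough idea of what that looks like from the Databricks side: OneLake exposes every Fabric workspace through a single ADLS Gen2-style endpoint, so the Lakehouse is addressed with a predictable ABFSS URI. A minimal sketch of building that URI (workspace, lakehouse, and table names are placeholders, not from your tenant):

```python
# Sketch: constructing the OneLake ABFSS URI that Databricks (or any
# ADLS Gen2 client) uses to reach a table in a Fabric Lakehouse.
# All names below are hypothetical placeholders.
def onelake_table_path(workspace: str, lakehouse: str, table: str) -> str:
    # OneLake serves every workspace from the single 'onelake' account.
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Tables/{table}"
    )

path = onelake_table_path("MyWorkspace", "MyLakehouse", "sales")

# In a Databricks notebook with credential passthrough enabled on the
# cluster, you would then read the Delta table directly:
# df = spark.read.format("delta").load(path)
```

The Spark read itself only works inside a Databricks cluster configured as described in the linked article; the path construction above just shows the addressing scheme.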
Hello,
Yes, we found a way around this by writing a Python script that uses multithreading to trigger all the scripts simultaneously. It establishes an ODBC connection and also logs the runtime of each query to a text file. We weren't able to configure JMeter over JDBC, though.
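For anyone landing here later, a minimal sketch of that pattern: fire queries concurrently with a thread pool and log per-query runtimes to a text file. The `run_query` body here is a stand-in (a sleep instead of a real `pyodbc` connection to the SQL endpoint), so the sketch is self-contained; the connection string and query list are assumptions you would replace with your own.

```python
import concurrent.futures
import time

# Hypothetical stand-in for the real work. In the actual script, each task
# would open an ODBC connection to the Lakehouse SQL endpoint (e.g. with
# pyodbc) and execute one query; here a sleep simulates the query so the
# sketch runs anywhere.
def run_query(name: str, seconds: float) -> str:
    start = time.perf_counter()
    time.sleep(seconds)  # placeholder for cursor.execute(sql); cursor.fetchall()
    elapsed = time.perf_counter() - start
    return f"{name}: {elapsed:.3f}s"

# Placeholder workload: (query name, simulated duration in seconds).
queries = [("q1", 0.05), ("q2", 0.05), ("q3", 0.05)]

# Trigger all queries simultaneously and collect per-query runtimes
# in submission order.
with concurrent.futures.ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda q: run_query(*q), queries))

# Log the runtimes to a text file.
with open("runtimes.txt", "w") as f:
    f.write("\n".join(results))
```

Thread-based concurrency is a reasonable fit here because each task spends its time waiting on the remote engine rather than on local CPU.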
Thank you so much @govindarajan_d for the response. The use case is more about authoring ETL steps that run against a Lakehouse and then executing those steps through an external application like JMeter to evaluate Spark compute performance. Please let me know if this makes sense or if I am missing something.
Hi @Shivakumar1
Following up on the status of your query and checking whether you have found a resolution yet. If you have, please share it with the community, as it may help others.
Otherwise, please reply with more details and we will try to help.
Thanks.
Hi @Shivakumar1
We haven't heard from you since the last response and wanted to check whether you have found a resolution yet. If you have, please share it with the community, as it may help others.
If you have any questions related to the current thread, please let us know and we will try our best to help you.
If you have a question about a different issue, we request that you open a new thread.
Thanks.