I need to create a dataflow in the Power BI service that connects to a view built in Azure Databricks.
There is no Azure Databricks connector for dataflows, so there doesn't appear to be a straightforward way to connect these two products.
Are there any workarounds or other options available?
Found a workaround:
Use the 'Spark (other)' connector.
Set 'Server' to: https://SERVER_HOSTNAME:443/HTTP_PATH, replacing SERVER_HOSTNAME and HTTP_PATH with your cluster's values.
Set 'Protocol' to: HTTP.
Set 'Username' to the literal string 'token'.
Set 'Password' to your personal access token.
A guide to finding your server hostname and HTTP path is here:
https://docs.databricks.com/integrations/bi/jdbc-odbc-bi.html
A guide to creating a token is here:
https://docs.databricks.com/dev-tools/api/latest/authentication.html
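For reference, the steps above roughly correspond to a Power Query M script like the sketch below. This is only an illustration: the workspace hostname, org ID, cluster path, schema, and view name are all placeholders, and the exact arguments that the Spark connector generates can vary by version, so the safest route is to copy the real script from Power BI Desktop's Advanced Editor.

```powerquery
let
    // 'Server' = https://SERVER_HOSTNAME:443/HTTP_PATH (placeholder values shown)
    Source = Spark.Tables(
        "https://adb-0000000000000000.0.azuredatabricks.net:443/sql/protocolv1/o/0000000000000000/0000-000000-example",
        // Protocol argument as generated by the connector for HTTP; verify against
        // the script Desktop produces for your own connection
        2,
        [BatchSize = null]
    ),
    // Navigate to the view; "default" and "my_view" are hypothetical names
    MyView = Source{[Schema = "default", Item = "my_view"]}[Data]
in
    MyView
```

Note that the credentials (Username 'token', Password = your personal access token) are entered in the connection dialog, not in the script itself.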
Hey @StupidUsername,
connect to the view from Azure Databricks inside Power BI Desktop, then copy the M script from the Advanced Editor and paste it into a Blank query in the dataflow editor.
This may work; the absence of a connector dialog doesn't necessarily mean Azure Databricks isn't supported in dataflows.
Hopefully this provides what you are looking for.
Regards,
Tom