
PBI Desktop connection to Apache Spark

Hi. Can someone guide me on how to connect Power BI Desktop to Apache Spark installed on a local Windows machine? What server details should I pass?

I have seen guidance for Databricks and HDInsight, but I need guidance for connecting to Spark installed on a local machine. Any help is appreciated.

Community Support

Hi @jainayush007 ,

You could try the following steps.

1. Add the properties below to /home/user/spark/conf/hive-site.xml and /home/user/apachehive/conf/hive-site.xml:

<property>
  <name>hive.server2.transport.mode</name>
  <value>http</value>
</property>
<property>
  <name>hive.server2.thrift.http.port</name>
  <value>10001</value>
</property>
<property>
  <name>hive.server2.http.endpoint</name>
  <value>cliservice</value>
</property>
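If editing the XML by hand is error-prone, the three properties in step 1 can be generated as well-formed hive-site.xml entries with a short Python sketch (the property names and values come from the snippet above; this is an illustration, not part of the Hive tooling):

```python
import xml.etree.ElementTree as ET

# Properties from step 1; values assume the HTTP endpoint on port 10001.
props = {
    "hive.server2.transport.mode": "http",
    "hive.server2.thrift.http.port": "10001",
    "hive.server2.http.endpoint": "cliservice",
}

def build_hive_site(props):
    """Build a well-formed hive-site.xml <configuration> element."""
    root = ET.Element("configuration")
    for name, value in props.items():
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = value
    return ET.tostring(root, encoding="unicode")

xml_text = build_hive_site(props)
print(xml_text)
```

The output can be pasted inside the `<configuration>` element of both hive-site.xml files.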

2. Start the Hive metastore, then Hive, then the Spark Thrift server.
The commands are below:

cd /home/user/apachehive/bin/   # go to the Hive directory
./hive --service metastore &    # start the metastore
./hive                          # start Hive
cd /home/user/spark/sbin/       # go to the Spark directory
./start-thriftserver.sh         # start the Spark Thrift server
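Before moving on to Power BI, it can help to confirm the Thrift server is actually listening. A minimal Python check (assuming the server runs on localhost and the port 10001 configured in step 1):

```python
import socket

# Quick reachability check before pointing Power BI at the Thrift server.
# Host and port match the hive-site.xml values from step 1; adjust as needed.
def port_is_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print(port_is_open("localhost", 10001))
```

If this prints False, the Thrift server is not up yet and Power BI will fail to connect in step 4.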

3. Open Power BI Desktop, click Get data, select Spark, and click Connect.


4. In the Server field, enter http://<host-ip-address>:10001/cliservice

Select HTTP as the Protocol.

For Data Connectivity mode, choose DirectQuery (query the data in place without importing it) or Import (import the data and then work on it).
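The server string in step 4 is just host, port, and endpoint joined into an HTTP URL. A small Python sketch makes the format explicit (the host below is an illustrative placeholder; the port and endpoint match the hive-site.xml values from step 1):

```python
# Build the server string Power BI expects for the Spark connector over HTTP.
def spark_server_url(host, port=10001, endpoint="cliservice"):
    """Return the http://host:port/endpoint string entered in the Server field."""
    return f"http://{host}:{port}/{endpoint}"

url = spark_server_url("192.168.1.10")
print(url)  # http://192.168.1.10:10001/cliservice
```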


5. Sign in to your account, then choose the tables you want and load them.


For reference: Connect PowerBI with Spark


Best Regards,

Xue Ding

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

Mine is a Windows installation. The given paths seem to be for Linux. I can't see a hive-site.xml file under the /spark/conf folder, and I don't see an /apachehive/conf folder in my Spark installation either. My Spark installation is the latest available release. Please guide.

Any suggestions here, please?
