cassandra - NoHostAvailableException while running Spark with DSE


I am using DataStax Enterprise (DSE) 5.1 Cassandra on my local machine. I started Cassandra with the Spark (Analytics) workload enabled:

dse cassandra -k 
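
To confirm that the node actually came up with the Analytics workload before launching the Spark shell, checking with dsetool is one option (a quick sketch on my part, assuming dsetool is on the PATH):

$ dsetool status    # should list the node as Up/Normal with an Analytics workload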

Cassandra booted fine. Next, I wanted to go into the Spark shell using:

dse spark 

However, it gave me the following errors:

2017-08-21 12:11:25 [main] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to start or submit Spark application because of com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried) - see details in the log file(s): /home/rsahukar/.spark-shell.log
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:75) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.exceptions.NoHostAvailableException.copy(NoHostAvailableException.java:28) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:28) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:236) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:59) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:42) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.dse.DefaultDseSession.execute(DefaultDseSession.java:232) ~[dse-java-driver-core-1.2.2.jar:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.sun.proxy.$Proxy6.execute(Unknown Source) ~[na:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.sun.proxy.$Proxy7.execute(Unknown Source) ~[na:na]
    at com.datastax.bdp.util.rpc.RpcUtil.call(RpcUtil.java:42) ~[dse-core-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:54) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$$anonfun$fetch$1.apply(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2]
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:112) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:145) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.3.jar:2.0.3]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:52) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:81) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:44) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$8.apply(SparkConfigurator.scala:85) ~[dse-spark-5.1.2.jar:5.1.2]
    at scala.util.Try$.apply(Try.scala:192) ~[scala-library-2.11.11.jar:na]
    at com.datastax.bdp.util.Lazy.internal$lzycompute(Lazy.scala:26) ~[dse-spark-5.1.2.jar:5.1.2]
    at com.datastax.bdp.util.Lazy.internal(Lazy.scala:25) ~[dse-spark-5.1.2.jar:5.1.2]
    at com.datastax.bdp.util.Lazy.get(Lazy.scala:31) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:152) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:151) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:124) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:79) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:68) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:106) ~[dse-spark-5.1.2.jar:5.1.2]
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) [dse-spark-5.1.2.jar:5.1.2]
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (no host was tried)
    at com.datastax.driver.core.RequestHandler.reportNoMoreHosts(RequestHandler.java:204) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler.access$1000(RequestHandler.java:40) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.findNextHostAndQuery(RequestHandler.java:268) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler.startNewExecution(RequestHandler.java:108) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.RequestHandler.sendRequest(RequestHandler.java:88) ~[dse-java-driver-core-1.2.2.jar:na]
    at com.datastax.driver.core.SessionManager.executeAsync(SessionManager.java:124) ~[dse-java-driver-core-1.2.2.jar:na]
    ... 43 common frames omitted
2017-08-21 12:11:25 [Thread-1] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to cancel delegation token

Below is the dsetool ring output:

$ dsetool ring
Address          DC                   Rack         Workload             Graph  Status  State    Load             Owns                 Token                                        Health [0,1]
127.0.0.1        Analytics            rack1        Analytics(SM)        no             Normal   189.19 KiB       ?                    5643405743002698980                          0.50

Can anyone help me?

I finally found my mistake. I was running Cassandra in local mode, and this was my Spark conf file (spark-defaults.conf) before the change:

....
spark.cassandra.connection.local_dc     localhost
spark.cassandra.connection.host         localhost
....

Please note the spark.cassandra.connection.local_dc value. Since I was running in local mode, I thought that value should be localhost too. But it should actually be the DC name that dsetool ring returns.
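
Another quick way to confirm the DC name (my own sketch, assuming cqlsh can reach the node on the default port) is to query the system.local table:

$ cqlsh -e "SELECT data_center FROM system.local;"

 data_center
-------------
   Analytics

(1 rows)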

Here is the dsetool ring output again:

$ dsetool ring
Address          DC                   Rack         Workload             Graph  Status  State    Load             Owns                 Token                                        Health [0,1]
127.0.0.1        Analytics            rack1        Analytics(SM)        no             Normal   189.19 KiB       ?                    5643405743002698980                          0.50

As you can see above, the DC value is Analytics, so I had to put that same value in the Spark conf file. Below is the config after the change:

spark.cassandra.connection.local_dc     Analytics
spark.cassandra.connection.host         localhost
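
With that change in place, dse spark started cleanly. As a minimal smoke test (my own sketch, not part of the original fix; test_ks.kv is a hypothetical table, so substitute one of your own), you can read a table through the connector right from the shell:

$ dse spark
scala> import com.datastax.spark.connector._
scala> sc.cassandraTable("test_ks", "kv").count   // hypothetical keyspace/table; any existing table works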
