Exception while starting a Java Spark Streaming application -
The application throws java.lang.NoSuchMethodException.
Stack trace:
DAGScheduler: Failed to run runJob at ReceiverTracker.scala:275
Exception in thread "Thread-33" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 6.0 failed 4 times, most recent failure: Lost task 0.3 in stage 6.0 (TID 77, 172.20.7.60): java.lang.NoSuchMethodException: org.apache.spark.examples.streaming.KafkaKeyDecoder.<init>(kafka.utils.VerifiableProperties)
        at java.lang.Class.getConstructor0(Class.java:2810)
        at java.lang.Class.getConstructor(Class.java:1718)
        at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:106)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
        at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
        at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        at org.apache.spark.scheduler.Task.run(Task.scala:54)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
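The NoSuchMethodException arises because Spark's KafkaReceiver instantiates the decoder class reflectively: it calls Class.getConstructor looking for a public constructor that takes a kafka.utils.VerifiableProperties (the getConstructor0/getConstructor frames in the trace). A minimal, self-contained Java sketch of that lookup; VerifiableProperties, GoodDecoder, and BadDecoder below are local stand-ins for illustration, not the real Kafka or Spark classes:

```java
import java.lang.reflect.Constructor;

public class DecoderLookup {
    // Stand-in for kafka.utils.VerifiableProperties.
    public static class VerifiableProperties {}

    // Decoder compiled WITHOUT the expected one-arg constructor.
    public static class BadDecoder {
        public BadDecoder() {}
    }

    // Decoder exposing the public constructor the reflective lookup expects.
    public static class GoodDecoder {
        public GoodDecoder(VerifiableProperties props) {}
    }

    public static void main(String[] args) {
        try {
            // Same pattern as KafkaReceiver.onStart: look up ctor(VerifiableProperties).
            Constructor<GoodDecoder> ok =
                GoodDecoder.class.getConstructor(VerifiableProperties.class);
            System.out.println("found: " + ok);

            // This lookup fails, mirroring the exception in the stack trace above.
            BadDecoder.class.getConstructor(VerifiableProperties.class);
        } catch (NoSuchMethodException e) {
            System.out.println("NoSuchMethodException: " + e.getMessage());
        }
    }
}
```

If mismatched Scala or kafka jar versions put a differently compiled decoder class on the executor classpath, the class that gets loaded can lack this constructor, producing exactly this exception.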
It seems this issue has been fixed in Spark 1.1.0, per the link.
Spark: 1.1.0, Kafka: 0.8.1.1
In my case, as explained in the comment, after removing the library conflicts I was able to consume data from Kafka and store it in Cassandra, deploying the job on the DataStax Analytics solution. I found that a different open-source streaming_kafka jar and Scala libraries were included in the executor class path.
So I suggest the following:
- Make sure you are using the same version of the Scala compiler as Spark.
- Make sure the kafka and streaming_kafka jars were compiled against the same Scala version.
- Check whether the Scala libraries are already included in the executor classpath; if so, do not include them in your package.
I have assumed you are building an uber jar that you then deploy.
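For the last two points, the usual fix when building an uber jar is to mark Spark (and the Scala runtime it pulls in) as provided, so they are taken from the executor classpath rather than bundled, while still packaging the streaming_kafka artifact, which the cluster does not supply. A sketch of the relevant Maven dependency entries, assuming Scala 2.10 artifacts to match Spark 1.1.0 (adapt the versions to your build):

```xml
<!-- Built against Scala 2.10, the version Spark 1.1.0 ships with -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.1.0</version>
  <!-- already on the executor classpath; keep it out of the uber jar -->
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka_2.10</artifactId>
  <version>1.1.0</version>
  <!-- not provided by the cluster, so bundle it (default compile scope) -->
</dependency>
```

With this layout the uber jar carries the Kafka integration classes but defers to the cluster's own Spark and Scala jars, avoiding the classpath conflict described above.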
java apache-spark apache-kafka spark-streaming