I have tested my Spark Streaming application on a Spark cluster both with and without YARN. My Kafka cluster has 3 nodes and the replication factor is 3.

When running the cluster without YARN, the application works fine even if two Kafka nodes are down.

With the YARN cluster, if two Kafka nodes are down I get the following error:
INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (67.205.130.255:57588) with ID 19
16/10/12 15:25:59 INFO VerifiableProperties: Verifying properties
16/10/12 15:25:59 INFO VerifiableProperties: Property group.id is overridden to
16/10/12 15:25:59 INFO VerifiableProperties: Property zookeeper.connect is overridden to 159.203.189.82:9092,159.203.181.119:9092,162.243.164.18:9092
16/10/12 15:25:59 INFO BlockManagerMasterEndpoint: Registering block manager Yarn-04:33433 with 127.2 MB RAM, BlockManagerId(19, Yarn-04, 33433)
16/10/12 15:25:59 INFO SimpleConsumer: Reconnect due to socket error: java.nio.channels.ClosedChannelException
16/10/12 15:25:59 INFO SimpleConsumer: Reconnect due to socket error: java.nio.channels.ClosedChannelException
16/10/12 15:25:59 ERROR StreamingContext: Error starting the context, marking it as stopped
java.lang.IllegalArgumentException: requirement failed: No output operations registered, so nothing to execute
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.streaming.DStreamGraph.validate(DStreamGraph.scala:163)
at org.apache.spark.streaming.StreamingContext.validate(StreamingContext.scala:513)
at org.apache.spark.streaming.StreamingContext.liftedTree1$1(StreamingContext.scala:573)
at org.apache.spark.streaming.StreamingContext.start(StreamingContext.scala:572)
at com.mm.sa.StreamAnalyzerApp$.delayedEndpoint$com$mm$sa$StreamAnalyzerApp$1(StreamAnalyzerApp.scala:26)
at com.mm.sa.StreamAnalyzerApp$delayedInit$body.apply(StreamAnalyzerApp.scala:15)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at com.mm.sa.StreamAnalyzerApp$.main(StreamAnalyzerApp.scala:15)
at com.mm.sa.StreamAnalyzerApp.main(StreamAnalyzerApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
16/10/12 15:25:59 ERROR ApplicationMaster: User class threw exception: java.lang.IllegalArgumentException: requirement failed: No output operations registered, so nothing to execute
java.lang.IllegalArgumentException: requirement failed: No output operations registered, so nothing to execute
at scala.Predef$.require(Predef.scala:224)
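For context, the failing call is `StreamingContext.start()`, which throws this exact `IllegalArgumentException` whenever no output operation (such as `print()` or `foreachRDD()`) has been registered on any DStream by the time `start()` runs. Below is a minimal sketch of the kind of setup involved (the topic name, broker addresses, and object name are placeholders, not my actual code) showing where that requirement is satisfied:

```scala
// Minimal sketch (hypothetical topic/broker values) using the
// spark-streaming-kafka 0.8 direct API.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils
import kafka.serializer.StringDecoder

object StreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("StreamSketch")
    val ssc = new StreamingContext(conf, Seconds(10))

    val kafkaParams = Map(
      "metadata.broker.list" -> "broker1:9092,broker2:9092,broker3:9092")
    val stream = KafkaUtils.createDirectStream[
      String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("my-topic"))

    // Without at least one output operation like this on some DStream,
    // ssc.start() fails with the IllegalArgumentException shown above.
    stream.map(_._2).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

So the error seems to mean that, in the YARN run, execution never reaches (or skips) the code that registers the output operation before `start()` is called.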
Thanks
Vipin