Livy with Spark 2.1.0 throwing exceptions


Pranay Hasan Yerra

Apr 27, 2017, 8:38:50 AM4/27/17
to livy...@cloudera.org, Janki Akhani
Hi livy-devs,

I have set up Livy to work with sparkmagic and built Livy from the latest master. It works with Spark 1.6.3 but not with Spark 2.1.0.

I'm getting the error below. The documentation mentions that Livy uses reflection to avoid a hard dependency on specific Spark versions. What changes are necessary to make it independent of the Spark version? If that isn't feasible, how do we fix this?
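
For illustration only (this is not Livy's actual source, and the class names are made up), the reflection pattern I understand the docs to mean looks roughly like this: probe for a version-specific class at runtime instead of linking against it at compile time, then load the matching implementation by name.

object SparkVersionProbe {
  // Spark 2.x introduced org.apache.spark.sql.SparkSession; if it is absent we assume 1.x.
  def isSpark2OrLater: Boolean =
    try {
      Class.forName("org.apache.spark.sql.SparkSession")
      true
    } catch {
      case _: ClassNotFoundException => false
    }

  // Load a version-specific implementation by name so neither one is a
  // compile-time dependency. The class names below are hypothetical.
  def loadInterpreter(): AnyRef = {
    val className =
      if (isSpark2OrLater) "com.example.Spark2Interpreter"
      else "com.example.Spark1Interpreter"
    Class.forName(className).newInstance().asInstanceOf[AnyRef]
  }
}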

Thanks,
Pranay

Here are the logs:

17/04/27 05:18:27 INFO InteractiveSession$: Creating LivyClient for sessionId: 0
17/04/27 05:18:27 WARN RSCConf: The configuration key livy.rsc.driver_class has been deprecated as of Livy 0.4 and may be removed in the future. Please use the new key livy.rsc.driver-class instead.
17/04/27 05:18:27 DEBUG InternalLoggerFactory: Using SLF4J as the default logging framework
17/04/27 05:18:27 DEBUG MultithreadEventLoopGroup: -Dio.netty.eventLoopThreads: 24
17/04/27 05:18:27 DEBUG PlatformDependent0: java.nio.Buffer.address: available
17/04/27 05:18:27 DEBUG PlatformDependent0: sun.misc.Unsafe.theUnsafe: available
17/04/27 05:18:27 DEBUG PlatformDependent0: sun.misc.Unsafe.copyMemory: available
17/04/27 05:18:27 DEBUG PlatformDependent0: java.nio.Bits.unaligned: true
17/04/27 05:18:27 DEBUG PlatformDependent: Java version: 8
17/04/27 05:18:27 DEBUG PlatformDependent: -Dio.netty.noUnsafe: false
17/04/27 05:18:27 DEBUG PlatformDependent: sun.misc.Unsafe: available
17/04/27 05:18:27 DEBUG PlatformDependent: -Dio.netty.noJavassist: false
17/04/27 05:18:27 DEBUG PlatformDependent: Javassist: unavailable
17/04/27 05:18:27 DEBUG PlatformDependent: You don't have Javassist in your class path or you don't have enough permission to load dynamically generated classes.  Please check the configuration for better performance.
17/04/27 05:18:27 DEBUG PlatformDependent: -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
17/04/27 05:18:27 DEBUG PlatformDependent: -Dio.netty.bitMode: 64 (sun.arch.data.model)
17/04/27 05:18:27 DEBUG PlatformDependent: -Dio.netty.noPreferDirect: false
17/04/27 05:18:27 DEBUG NioEventLoop: -Dio.netty.noKeySetOptimization: false
17/04/27 05:18:27 DEBUG NioEventLoop: -Dio.netty.selectorAutoRebuildThreshold: 512
17/04/27 05:18:27 DEBUG ThreadLocalRandom: -Dio.netty.initialSeedUniquifier: 0xcf9e526277c63b1f (took 0 ms)
17/04/27 05:18:27 DEBUG ByteBufUtil: -Dio.netty.allocator.type: unpooled
17/04/27 05:18:27 DEBUG ByteBufUtil: -Dio.netty.threadLocalDirectBufferSize: 65536
17/04/27 05:18:27 DEBUG NetUtil: Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
17/04/27 05:18:27 DEBUG NetUtil: /proc/sys/net/core/somaxconn: 128
17/04/27 05:18:27 WARN RSCConf: Your hostname, phasanye-ld1, resolves to a loopback address; using <x.x.x.x>  instead (on interface docker0)
17/04/27 05:18:27 WARN RSCConf: Set 'livy.rsc.rpc.server.address' if you need to bind to another address.
17/04/27 05:18:27 DEBUG RSCClient: Sending JobRequest[c094a599-b112-4510-96a7-21782182b1da].
17/04/27 05:18:27 INFO InteractiveSessionManager: Registering new session 0
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:28 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:28 WARN util.Utils: Your hostname, phasanye-ld1 resolves to a loopback address: 127.0.0.1; using <x.x.x.x> instead (on interface eth0))
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:28 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:28 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:29 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:29 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container))
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:29 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:29 INFO yarn.Client: Setting up container launch context for our AM)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:29 INFO yarn.Client: Setting up the launch environment for our AM container)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:29 INFO yarn.Client: Preparing resources for our AM container)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:29 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Uploading resource file:/tmp/spark-c624a032-944b-4ea5-a6ac-4730f1c0b6d6/__spark_libs__943027900489483945.zip -> hdfs://localhost:9000/user/phasanye/.sparkStaging/application_1489138851905_0050/__spark_libs__943027900489483945.zip)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Uploading resource file:/home/phasanye/spnote/livy/rsc/target/jars/netty-all-4.0.29.Final.jar -> hdfs://localhost:9000/user/phasanye/.sparkStaging/application_1489138851905_0050/netty-all-4.0.29.Final.jar)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Uploading resource file:/home/phasanye/spnote/livy/rsc/target/jars/livy-api-0.4.0-SNAPSHOT.jar -> hdfs://localhost:9000/user/phasanye/.sparkStaging/application_1489138851905_0050/livy-api-0.4.0-SNAPSHOT.jar)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Uploading resource file:/home/phasanye/spnote/livy/rsc/target/jars/livy-rsc-0.4.0-SNAPSHOT.jar -> hdfs://localhost:9000/user/phasanye/.sparkStaging/application_1489138851905_0050/livy-rsc-0.4.0-SNAPSHOT.jar)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Uploading resource file:/home/phasanye/spnote/livy/repl/scala-2.10/target/jars/livy-core_2.10-0.4.0-SNAPSHOT.jar -> hdfs://localhost:9000/user/phasanye/.sparkStaging/application_1489138851905_0050/livy-core_2.10-0.4.0-SNAPSHOT.jar)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Uploading resource file:/home/phasanye/spnote/livy/repl/scala-2.10/target/jars/livy-repl_2.10-0.4.0-SNAPSHOT.jar -> hdfs://localhost:9000/user/phasanye/.sparkStaging/application_1489138851905_0050/livy-repl_2.10-0.4.0-SNAPSHOT.jar)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Uploading resource file:/home/phasanye/spnote/livy/repl/scala-2.10/target/jars/commons-codec-1.9.jar -> hdfs://localhost:9000/user/phasanye/.sparkStaging/application_1489138851905_0050/commons-codec-1.9.jar)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Uploading resource file:/tmp/spark-c624a032-944b-4ea5-a6ac-4730f1c0b6d6/__spark_conf__937611431824949776.zip -> hdfs://localhost:9000/user/phasanye/.sparkStaging/application_1489138851905_0050/__spark_conf__.zip)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO spark.SecurityManager: Changing view acls to: phasanye)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO spark.SecurityManager: Changing modify acls to: phasanye)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO spark.SecurityManager: Changing view acls groups to: )
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO spark.SecurityManager: Changing modify acls groups to: )
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(phasanye); groups with view permissions: Set(); users  with modify permissions: Set(phasanye); groups with modify permissions: Set())
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Submitting application application_1489138851905_0050 to ResourceManager)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO impl.YarnClientImpl: Submitted application application_1489138851905_0050)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: Application report for application_1489138851905_0050 (state: ACCEPTED))
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO yarn.Client: )
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,	 client token: N/A)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,	 diagnostics: N/A)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,	 ApplicationMaster host: N/A)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,	 ApplicationMaster RPC port: -1)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,	 queue: default)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,	 start time: 1493295511917)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,	 final status: UNDEFINED)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,	 tracking URL: http://phasanye-ld1:8088/proxy/application_1489138851905_0050/)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,	 user: phasanye)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO util.ShutdownHookManager: Shutdown hook called)
17/04/27 05:18:32 INFO LineBufferedStream: (stdout: ,17/04/27 05:18:31 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-c624a032-944b-4ea5-a6ac-4730f1c0b6d6)
17/04/27 05:18:32 DEBUG Client: The ping interval is 60000 ms.
17/04/27 05:18:32 DEBUG Client: Connecting to /0.0.0.0:8032
17/04/27 05:18:32 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye: starting, having connections 1
17/04/27 05:18:32 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #0
17/04/27 05:18:32 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #0
17/04/27 05:18:32 DEBUG ProtobufRpcEngine: Call: getApplications took 47ms
17/04/27 05:18:33 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #1
17/04/27 05:18:33 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #1
17/04/27 05:18:33 DEBUG ProtobufRpcEngine: Call: getApplicationReport took 3ms
17/04/27 05:18:33 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #2
17/04/27 05:18:33 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #2
17/04/27 05:18:33 DEBUG ProtobufRpcEngine: Call: getApplicationAttemptReport took 1ms
17/04/27 05:18:33 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #3
17/04/27 05:18:33 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #3
17/04/27 05:18:33 DEBUG ProtobufRpcEngine: Call: getContainerReport took 1ms
17/04/27 05:18:34 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #4
17/04/27 05:18:34 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #4
17/04/27 05:18:34 DEBUG ProtobufRpcEngine: Call: getApplicationReport took 1ms
17/04/27 05:18:34 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #5
17/04/27 05:18:34 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #5
17/04/27 05:18:34 DEBUG ProtobufRpcEngine: Call: getApplicationAttemptReport took 1ms
17/04/27 05:18:34 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #6
17/04/27 05:18:34 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #6
17/04/27 05:18:34 DEBUG ProtobufRpcEngine: Call: getContainerReport took 0ms
17/04/27 05:18:35 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #7
17/04/27 05:18:35 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #7
17/04/27 05:18:35 DEBUG ProtobufRpcEngine: Call: getApplicationReport took 1ms
17/04/27 05:18:35 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #8
17/04/27 05:18:35 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #8
17/04/27 05:18:35 DEBUG ProtobufRpcEngine: Call: getApplicationAttemptReport took 1ms
17/04/27 05:18:35 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #9
17/04/27 05:18:35 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #9
17/04/27 05:18:35 DEBUG ProtobufRpcEngine: Call: getContainerReport took 1ms
17/04/27 05:18:35 DEBUG ResourceLeakDetector: -Dio.netty.leakDetectionLevel: simple
17/04/27 05:18:35 DEBUG Recycler: -Dio.netty.recycler.maxCapacity.default: 262144
17/04/27 05:18:35 DEBUG KryoMessageCodec: Decoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (41 bytes)
17/04/27 05:18:35 DEBUG Cleaner0: java.nio.ByteBuffer.cleaner(): available
17/04/27 05:18:35 DEBUG RpcServer$SaslServerHandler: Handling SASL challenge message...
17/04/27 05:18:35 DEBUG RpcServer$SaslServerHandler: Sending SASL challenge response...
17/04/27 05:18:35 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (98 bytes)
17/04/27 05:18:35 DEBUG KryoMessageCodec: Decoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (275 bytes)
17/04/27 05:18:35 DEBUG RpcServer$SaslServerHandler: Handling SASL challenge message...
17/04/27 05:18:35 DEBUG RpcServer$SaslServerHandler: Sending SASL challenge response...
17/04/27 05:18:35 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (45 bytes)
17/04/27 05:18:35 DEBUG RpcServer$SaslServerHandler: SASL negotiation finished with QOP auth.
17/04/27 05:18:35 DEBUG ContextLauncher: New RPC client connected from [id: 0x2b80e9f1, /172.23.209.80:34300 => /172.17.42.1:56185].
17/04/27 05:18:35 DEBUG KryoMessageCodec: Decoded message of type com.cloudera.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
17/04/27 05:18:35 DEBUG KryoMessageCodec: Decoded message of type com.cloudera.livy.rsc.BaseProtocol$RemoteDriverAddress (72 bytes)
17/04/27 05:18:35 DEBUG RpcDispatcher: [RegistrationHandler] Received RPC message: type=CALL id=0 payload=com.cloudera.livy.rsc.BaseProtocol$RemoteDriverAddress
17/04/27 05:18:35 DEBUG ContextLauncher: Received driver info for client [id: 0x2b80e9f1, /172.23.209.80:34300 => /172.17.42.1:56185]: 172.17.42.1/57097.
17/04/27 05:18:35 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
17/04/27 05:18:35 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$NullMessage (2 bytes)
17/04/27 05:18:35 DEBUG RpcDispatcher: Channel [id: 0x2b80e9f1, /172.23.209.80:34300 :> /172.17.42.1:56185] became inactive.
17/04/27 05:18:35 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (41 bytes)
17/04/27 05:18:35 DEBUG KryoMessageCodec: Decoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (98 bytes)
17/04/27 05:18:35 DEBUG Rpc$SaslClientHandler: Handling SASL challenge message...
17/04/27 05:18:35 DEBUG Rpc$SaslClientHandler: Sending SASL challenge response...
17/04/27 05:18:35 DEBUG KryoMessageCodec: Encoded message of type com.cloudera.livy.rsc.rpc.Rpc$SaslMessage (275 bytes)
javax.security.sasl.SaslException: Client closed before SASL negotiation finished.
	at com.cloudera.livy.rsc.rpc.Rpc$SaslClientHandler.dispose(Rpc.java:416)
	at com.cloudera.livy.rsc.rpc.SaslHandler.channelInactive(SaslHandler.java:92)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at com.cloudera.livy.rsc.rpc.KryoMessageCodec.channelInactive(KryoMessageCodec.java:104)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:828)
	at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:621)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:745)
17/04/27 05:18:35 ERROR RSCClient: Failed to connect to context.
javax.security.sasl.SaslException: Client closed before SASL negotiation finished.
	at com.cloudera.livy.rsc.rpc.Rpc$SaslClientHandler.dispose(Rpc.java:416)
	at com.cloudera.livy.rsc.rpc.SaslHandler.channelInactive(SaslHandler.java:92)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at com.cloudera.livy.rsc.rpc.KryoMessageCodec.channelInactive(KryoMessageCodec.java:104)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
	at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:828)
	at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:621)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:745)
17/04/27 05:18:35 DEBUG RSCClient: Disconnected from context 58693a18-78ec-4734-921e-740ab943f845, shutdown = false.
17/04/27 05:18:36 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #10
17/04/27 05:18:36 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #10
17/04/27 05:18:36 DEBUG ProtobufRpcEngine: Call: getApplicationReport took 1ms
17/04/27 05:18:36 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #11
17/04/27 05:18:36 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #11
17/04/27 05:18:36 DEBUG ProtobufRpcEngine: Call: getApplicationAttemptReport took 1ms
17/04/27 05:18:36 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #12
17/04/27 05:18:36 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #12
17/04/27 05:18:36 DEBUG ProtobufRpcEngine: Call: getContainerReport took 1ms
17/04/27 05:18:37 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #13
17/04/27 05:18:37 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #13
17/04/27 05:18:37 DEBUG ProtobufRpcEngine: Call: getApplicationReport took 2ms
17/04/27 05:18:37 DEBUG InteractiveSession: InteractiveSession 0 app state changed from STARTING to FAILED
17/04/27 05:18:37 DEBUG InteractiveSession: InteractiveSession 0 session state change from starting to dead
17/04/27 05:18:37 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #14
17/04/27 05:18:37 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #14
17/04/27 05:18:37 DEBUG ProtobufRpcEngine: Call: getApplicationAttemptReport took 0ms
17/04/27 05:18:37 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye sending #15
17/04/27 05:18:37 DEBUG Client: IPC Client (1164322053) connection to /0.0.0.0:8032 from phasanye got value #15
17/04/27 05:18:37 DEBUG SparkYarnApp: application_1489138851905_0050 FAILED YARN Diagnostics:
Application application_1489138851905_0050 failed 1 times due to AM Container for appattempt_1489138851905_0050_000001 exited with exitCode: 15
For more detailed output, check application tracking page: http://phasanye-ld1:8088/proxy/application_1489138851905_0050/ Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1489138851905_0050_01_000001
Exit code: 15
Stack trace: ExitCodeException exitCode=15:
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:575)
	at org.apache.hadoop.util.Shell.run(Shell.java:478)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:766)
	at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 15
Failing this attempt. Failing the application.
^C17/04/27 05:18:42 INFO LivyServer: Shutting down Livy server.
17/04/27 05:18:42 INFO InteractiveSession: Stopping InteractiveSession 0...
17/04/27 05:18:42 DEBUG InteractiveSession: InteractiveSession 0 session state change from dead to shutting_down
17/04/27 05:18:42 DEBUG InteractiveSession: InteractiveSession 0 session state change from shutting_down to dead
17/04/27 05:18:42 INFO InteractiveSession: Stopped InteractiveSession 0.

Saisai Shao

Apr 28, 2017, 5:22:28 AM4/28/17
to Pranay Hasan Yerra, Livy Development, Janki Akhani
Our newly added features in Livy rely on SparkListener, and SparkListener's signature changed between Spark 1.6 and Spark 2.0+, so this breaks compatibility across major versions.
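
To illustrate the incompatibility (a rough sketch, not Livy's actual code): in Spark 1.6, org.apache.spark.scheduler.SparkListener is a trait with no-op default methods, while in Spark 2.0+ it is an abstract class. A listener like the hypothetical one below compiles against either version, but the bytecode emitted for extending a trait differs from extending an abstract class, so a jar built against 1.6 typically fails on a 2.x cluster (and vice versa) with errors such as IncompatibleClassChangeError or AbstractMethodError.

import org.apache.spark.scheduler.{SparkListener, SparkListenerJobStart}

// Hypothetical listener, only to show the extension point Livy relies on.
class ExampleProgressListener extends SparkListener {
  override def onJobStart(jobStart: SparkListenerJobStart): Unit = {
    println(s"Job ${jobStart.jobId} started with ${jobStart.stageInfos.size} stages")
  }
}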

The only thing you can do currently is build against a specific Spark version, e.g. mvn package -Dspark-2.1 or mvn package -Dspark-1.6.

We will figure out a way to handle it.

Thanks
Jerry
 


Pranay Hasan Yerra

Apr 28, 2017, 5:55:44 AM4/28/17
to Saisai Shao, Livy Development, Janki Akhani
Thanks, Jerry