java.lang.NoSuchMethodError: io.netty.util.AttributeKey.valueOf for google-cloud-pubsub in a Spark app


ryan.t...@dowjones.com

Dec 13, 2016, 12:19:42 PM
to Google Cloud Dataproc Discussions
I'm getting the following error:

java.lang.NoSuchMethodError: io.netty.util.AttributeKey.valueOf(Ljava/lang/Class;Ljava/lang/String;)Lio/netty/util/AttributeKey;
  at io.grpc.netty.Utils.<clinit>(Utils.java:85)
  at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory.<init>(NettyChannelBuilder.java:311)
  at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory.<init>(NettyChannelBuilder.java:280)
  at io.grpc.netty.NettyChannelBuilder.buildTransportFactory(NettyChannelBuilder.java:230)
  at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:239)
  at com.google.api.gax.grpc.InstantiatingChannelProvider.createChannel(InstantiatingChannelProvider.java:119)
  at com.google.api.gax.grpc.InstantiatingChannelProvider.getChannel(InstantiatingChannelProvider.java:106)
  at com.google.api.gax.grpc.ProviderManager.getChannel(ProviderManager.java:106)
  at com.google.api.gax.grpc.ChannelAndExecutor.create(ChannelAndExecutor.java:67)
  at com.google.api.gax.grpc.ClientSettings.getChannelAndExecutor(ClientSettings.java:78)
  at com.google.cloud.pubsub.spi.v1.PublisherClient.<init>(PublisherClient.java:182)
  at com.google.cloud.pubsub.spi.v1.PublisherClient.create(PublisherClient.java:173)
  at com.google.cloud.pubsub.spi.DefaultPubSubRpc.<init>(DefaultPubSubRpc.java:168)
  at com.google.cloud.pubsub.PubSubOptions$DefaultPubSubRpcFactory.create(PubSubOptions.java:69)
  at com.google.cloud.pubsub.PubSubOptions$DefaultPubSubRpcFactory.create(PubSubOptions.java:63)
  at com.google.cloud.ServiceOptions.getRpc(ServiceOptions.java:478)
  at com.google.cloud.pubsub.PubSubImpl.<init>(PubSubImpl.java:115)
  at com.google.cloud.pubsub.PubSubOptions$DefaultPubSubFactory.create(PubSubOptions.java:44)
  at com.google.cloud.pubsub.PubSubOptions$DefaultPubSubFactory.create(PubSubOptions.java:39)
  at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:465)
  ... 27 elided

This happens when I invoke:

pubSubOptions.getService()

I assume this is because Spark and google-cloud-pubsub depend on two different versions of Netty, so io.netty.util.AttributeKey gets loaded from an older jar that doesn't have this valueOf overload.
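
A quick way to confirm would be to check which jar AttributeKey is actually loaded from and what Netty artifacts are on the classpath, something like this (class name is mine, just a sketch):

import io.netty.util.AttributeKey;
import io.netty.util.Version;

public class NettyVersionCheck {
  public static void main(String[] args) {
    // Print the jar the AttributeKey class was loaded from
    System.out.println(AttributeKey.class
        .getProtectionDomain().getCodeSource().getLocation());

    // Print every Netty artifact/version that Netty itself can see on the classpath
    for (Version v : Version.identify().values()) {
      System.out.println(v);
    }
  }
}

As far as I can tell from the Netty javadocs, AttributeKey.valueOf(Class, String) only exists in Netty 4.1.x, while Spark bundles netty-all 4.0.x, so if this prints Spark's netty-all-4.0.29.Final.jar that would explain the NoSuchMethodError.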

Thanks for your help!




ryan.t...@dowjones.com

Dec 13, 2016, 12:56:55 PM
to Google Cloud Dataproc Discussions
I fixed this by upgrading the Netty jar that ships with Spark from netty-all-4.0.29.Final.jar to netty-all-4.1.3.Final.jar.

Nicolas Phung

Jan 10, 2017, 8:21:33 AM
to Google Cloud Dataproc Discussions
Hello Ryan,

How did you manage to fix this? I've tried replacing /usr/lib/spark/jars/netty-all-4.0.29.Final.jar with /usr/lib/spark/jars/netty-all-4.1.3.Final.jar, but I got another stack trace:

17/01/10 14:11:10 ERROR TransportRequestHandler: Error sending result StreamResponse{streamId=/jars/pubsub-assembly-0.1-SNAPSHOT.jar, byteCount=121539799, body=FileSegmentManagedBuffer{file=/tmp/assembly-0.1-SNAPSHOT.jar, offset=0, length=121539799}} to /10.53.42.116:54396; closing connection
java.lang.AbstractMethodError
at io.netty.util.ReferenceCountUtil.touch(ReferenceCountUtil.java:73)
at io.netty.channel.DefaultChannelPipeline.touch(DefaultChannelPipeline.java:107)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:820)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:733)
at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:111)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:748)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:740)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:826)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:733)
at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:284)
at io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:748)
at io.netty.channel.AbstractChannelHandlerContext.invokeWriteAndFlush(AbstractChannelHandlerContext.java:811)
at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:824)
at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:804)
at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:841)
at io.netty.channel.DefaultChannelPipeline.writeAndFlush(DefaultChannelPipeline.java:1032)
at io.netty.channel.AbstractChannel.writeAndFlush(AbstractChannel.java:296)
at org.apache.spark.network.server.TransportRequestHandler.respond(TransportRequestHandler.java:194)
at org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:150)
at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:111)
at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:372)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:358)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:350)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:372)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:358)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:350)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:372)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:358)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:350)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:372)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:358)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:350)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1334)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:372)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:358)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:926)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:571)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:512)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:426)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:398)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:877)
at java.lang.Thread.run(Thread.java:745)
17/01/10 14:11:10 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /10.53.42.116:46674 is closed
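
My guess is that swapping netty-all alone isn't enough: Spark's own network classes were compiled against Netty 4.0, where ReferenceCounted had no touch() methods, so when the 4.1 pipeline calls ReferenceCountUtil.touch() on one of Spark's buffers it hits an unimplemented method and throws AbstractMethodError. A small check I could run to see which ReferenceCounted interface is actually loaded and whether it declares the touch methods (class name is mine, just a sketch):

import io.netty.util.ReferenceCounted;
import java.lang.reflect.Method;

public class ReferenceCountedCheck {
  public static void main(String[] args) {
    // Print the jar the ReferenceCounted interface was loaded from
    System.out.println(ReferenceCounted.class
        .getProtectionDomain().getCodeSource().getLocation());

    // With Netty 4.1 on the classpath this should list touch() and touch(Object)
    for (Method m : ReferenceCounted.class.getDeclaredMethods()) {
      System.out.println(m);
    }
  }
}

If that's what is happening, dropping a 4.1 netty-all into /usr/lib/spark/jars probably can't work with this Spark build, and the usual workaround would be to shade/relocate Netty inside the application jar instead of touching Spark's copy.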
