Getting RemoteException

deepak singh
Jan 14, 2016, 7:29:40 PM
to gobblin-users


I'm trying to run gobblin-mapreduce with wikipedia.pull but am getting a RemoteException.
Hadoop version: Hadoop 2.6.0-cdh5.5.0
Gobblin built with: ./gradlew clean build -PuseHadoop2 -PhadoopVersion=2.6.0 -x test

------Output ----------

$ ./bin/gobblin-mapreduce.sh --conf ~/wikipedia.pull 

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/deepak_singh/gobblin/gobblin-dist/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p1108.867/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
    at org.apache.hadoop.ipc.Client.call(Client.java:1113)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
    at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
    at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
    at gobblin.runtime.JobContext.<init>(JobContext.java:104)
    at gobblin.runtime.AbstractJobLauncher.<init>(AbstractJobLauncher.java:117)
    at gobblin.runtime.mapreduce.MRJobLauncher.<init>(MRJobLauncher.java:130)
    at gobblin.runtime.mapreduce.CliMRJobLauncher.run(CliMRJobLauncher.java:60)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at gobblin.runtime.mapreduce.CliMRJobLauncher.main(CliMRJobLauncher.java:133)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:160)


Issac Buenrostro
Jan 14, 2016, 7:36:10 PM
to deepak singh, gobblin-users
Try building Gobblin with Hadoop version 2.6.0-cdh5.5.0.

See https://github.com/linkedin/gobblin/wiki/FAQs#how-do-i-compile-gobblin-against-cdh
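With a CDH cluster the build command ends up looking something like this (the hadoopVersion value should match whatever your cluster reports):

$ ./gradlew clean build -PuseHadoop2 -PhadoopVersion=2.6.0-cdh5.5.0 -x test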



deepak singh
Jan 14, 2016, 8:54:13 PM
to gobblin-users, deepa...@gmail.com
Built Gobblin with 2.6.0-cdh5.5.0.

Now I am getting the following exception:

$ ./bin/gobblin-mapreduce.sh --conf ~/wikipedia.pull 

Exception in thread "main" java.lang.UnsupportedOperationException: Not implemented by the DistributedFileSystem FileSystem implementation
    at org.apache.hadoop.fs.FileSystem.getScheme(FileSystem.java:225)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2603)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2613)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2637)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:93)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2680)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2662)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:379)
    at org.apache.hadoop.fs.FileSystem.getLocal(FileSystem.java:350)
    at org.apache.hadoop.util.GenericOptionsParser.validateFiles(GenericOptionsParser.java:398)
    at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:288)
    at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:485)
    at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
    at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
    at gobblin.runtime.mapreduce.CliMRJobLauncher.main(CliMRJobLauncher.java:88)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Prashant Bhardwaj
Jan 15, 2016, 12:13:06 PM
to gobblin-users, deepa...@gmail.com
Make sure you're giving the full namenode address in fs.uri, like hdfs://<namenode>:<port>. Also, can you post the output of hadoop version from your cluster?
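For example, the entry in gobblin-mapreduce.properties should look roughly like this (the hostname here is just a placeholder for your namenode; 8020 is the usual namenode RPC port):

fs.uri=hdfs://namenode.example.com:8020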

deepak singh
Jan 15, 2016, 1:06:54 PM
to Prashant Bhardwaj, gobblin-users
Command to build:

$ ./gradlew build -PuseHadoop2 -PhadoopVersion=2.6.0-cdh5.5.0 -x test

Namenode address in gobblin-mapreduce.properties:

# File system URIs
fs.uri=hdfs://hostname:8020

Hadoop version:

$ hadoop version
Hadoop 2.6.0-cdh5.5.0
Subversion http://github.com/cloudera/hadoop -r fd21232cef7b8c1f536965897ce20f50b83ee7b2
Compiled by jenkins on 2015-11-09T20:37Z
Compiled with protoc 2.5.0
From source with checksum 98e07176d1787150a6a9c087627562c
This command was run using /opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p1108.867/jars/hadoop-common-2.6.0-cdh5.5.0.jar


deepak singh
Jan 15, 2016, 1:27:22 PM
to gobblin-users, prashantb...@gmail.com
Prashant, I am using the correct namenode address.

Has anyone used Gobblin with 2.6.0-cdh5.5.0?
Any other pointers? I'd appreciate your responses, guys.

deepak singh
Jan 15, 2016, 2:04:40 PM
to gobblin-users, prashantb...@gmail.com
Another observation: I see hadoop-core-1.2.1.jar in the gobblin lib directory. Is that expected?
The other hadoop jars are from 2.6.0-cdh5.5.0.

-bash-4.1$ pwd
/home/deepak_singh/gobblin/gobblin-dist/lib

-bash-4.1$ ls -la hadoop-*
-rw-r--r-- 1 deepak_singh     21534 Jan 14 17:02 hadoop-annotations-2.6.0-cdh5.5.0.jar
-rw-r--r-- 1 deepak_singh     73929 Jan 14 17:02 hadoop-auth-2.6.0-cdh5.5.0.jar
-rw-r--r-- 1 deepak_singh   3388062 Jan 14 17:02 hadoop-common-2.6.0-cdh5.5.0.jar
-rw-r--r-- 1 deepak_singh   4203713 Jan 12 15:47 hadoop-core-1.2.1.jar
-rw-r--r-- 1 deepak_singh  10059555 Jan 14 17:02 hadoop-hdfs-2.6.0-cdh5.5.0.jar
-rw-r--r-- 1 deepak_singh    752967 Jan 14 17:27 hadoop-mapreduce-client-common-2.6.0-cdh5.5.0.jar
-rw-r--r-- 1 deepak_singh   1532365 Jan 14 17:02 hadoop-mapreduce-client-core-2.6.0-cdh5.5.0.jar
-rw-r--r-- 1 deepak_singh   1901625 Jan 14 17:02 hadoop-yarn-api-2.6.0-cdh5.5.0.jar
-rw-r--r-- 1 deepak_singh    153964 Jan 14 17:28 hadoop-yarn-client-2.6.0-cdh5.5.0.jar
-rw-r--r-- 1 deepak_singh   1545686 Jan 14 17:02 hadoop-yarn-common-2.6.0-cdh5.5.0.jar
-rw-r--r-- 1 deepak_singh apple_ga   318182 Jan 14 17:30 hadoop-yarn-server-common-2.6.0-cdh5.5.0.jar

Issac Buenrostro
Jan 15, 2016, 2:18:48 PM
to deepak singh, gobblin-users, Prashant Bhardwaj
Can you run a clean before building?
./gradlew clean build

Hopefully that will remove the additional jar.
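If the old jar survives the clean build, a quick way to spot it (assuming the gobblin-dist layout from your listing) is:

$ ls gobblin-dist/lib | grep hadoop-core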

deepak singh
Jan 15, 2016, 2:46:01 PM
to gobblin-users, deepa...@gmail.com, prashantb...@gmail.com
Tried a clean before building with 2.6.0-cdh5.5.0.
lib still contains the same hadoop jar.

Any other pointers, guys?

deepak singh
Jan 15, 2016, 3:29:17 PM
to gobblin-users, deepa...@gmail.com, prashantb...@gmail.com
Can you guys try building it with 2.6.0-cdh5.5.0?

deepak singh
Jan 15, 2016, 4:13:14 PM
to gobblin-users, deepa...@gmail.com, prashantb...@gmail.com
Guys, I resolved that issue by manually deleting hadoop-core-1.2.1.jar from the gobblin lib directory.
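For anyone else hitting this, the workaround amounts to something like the following (path taken from the listing earlier in the thread; adjust for your install):

$ rm /home/deepak_singh/gobblin/gobblin-dist/lib/hadoop-core-1.2.1.jar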

Now I am getting an AccessControlException.
The Gobblin M/R job tries to write at the root level of HDFS, where I don't have write access.
How do we set the M/R job path to a specific location on HDFS, e.g. /user/deepak?

deepak singh
Jan 15, 2016, 6:02:06 PM
to gobblin-users, deepa...@gmail.com, prashantb...@gmail.com
Fixed the issue by providing the correct HDFS paths through command-line arguments:

$ ./bin/gobblin-mapreduce.sh --conf ~/wikipedia.pull --jars ./lib/gobblin-example.jar --fs hdfs://hostname:8020/user/deepak_singh/ --workdir /user/deepak_singh/gobblin/work/
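If the work directory does not exist yet, it may also help to create it up front and confirm it is writable, e.g. with the standard HDFS shell (same paths as in the command above):

$ hadoop fs -mkdir -p /user/deepak_singh/gobblin/work
$ hadoop fs -ls /user/deepak_singh/gobblin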


Thanks Issac and Prashant for helping me out.