Spark simple job cannot find "org/apache/commons/configuration/Configuration", but I'm sure it is in the classpath


Xianying He

Mar 19, 2012, 10:26:01 AM
to Spark Users
spark.SimpleJob: Loss was due to java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
    at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:196)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
    at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:83)


commons-configuration-1.6.jar is in the same dir as hadoop-core-0.20.203.0.jar, and both are included in the classpath.
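
A quick sanity check I can add inside the job itself, to see whether the class is visible to the classloader that actually runs the task rather than just the Eclipse classpath (an illustrative sketch only, not part of my real job):

    // Illustrative diagnostic: verify the class is visible at runtime.
    try {
      Class.forName("org.apache.commons.configuration.Configuration")
      println("commons-configuration is on the runtime classpath")
    } catch {
      case e: ClassNotFoundException =>
        println("commons-configuration is NOT visible here: " + e)
    }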

Any help sir?

Matei Zaharia

Mar 19, 2012, 1:19:01 PM
to spark...@googlegroups.com
How did you add Hadoop 0.20.203.0 to the classpath? It might be confused because Spark itself has a dependency on Hadoop, defined in project/SparkBuild.scala. It would be better to change the version number there to 0.20.203.0 and recompile. Look for the line that says "org.apache.hadoop" % "hadoop-core" % "0.20.0", and change it to "org.apache.hadoop" % "hadoop-core" % "0.20.203.0".
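
For reference, the edit in project/SparkBuild.scala would look roughly like this (a sketch with the surrounding build definition elided):

    // project/SparkBuild.scala -- only the relevant dependency shown
    libraryDependencies ++= Seq(
      // was: "org.apache.hadoop" % "hadoop-core" % "0.20.0",
      "org.apache.hadoop" % "hadoop-core" % "0.20.203.0"
    )

Then recompile Spark so the new hadoop-core version is picked up.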

Matei

Xianying He

Mar 19, 2012, 9:14:28 PM
to Spark Users
Dear sir, thanks for your reply. I finally found what was wrong. In Eclipse everything fit together, but the process running on Mesos uses Spark's own dependencies, which it finds in the "lib_managed" dir. I made the change following your instructions so that "lib_managed" was updated to match, and it runs well.

Best regards!