13/06/21 15:39:51 INFO yarn.ApplicationMaster: All workers have launched.
13/06/21 15:39:51 INFO yarn.ApplicationMaster: Started progress reporter thread - sleep time : 60000
13/06/21 15:39:51 INFO yarn.YarnAllocationHandler: rsrcRequest ... host : *, numContainers : 0, p = 1, capability: memory: 1408
13/06/21 15:39:51 INFO yarn.WorkerRunnable: Prepared Local resources Map(app.jar -> resource {, scheme: "hdfs", port: -1, file: "/user/somsatapathy/spark/7app.jar", }, size: 359513, timestamp: 1371854387434, type: FILE, visibility: APPLICATION, , spark.jar -> resource {, scheme: "hdfs", port: -1, file: "/user/somsatapathy/spark/7spark.jar", }, size: 63823475, timestamp: 1371854387388, type: FILE, visibility: APPLICATION, )
13/06/21 15:39:51 INFO yarn.WorkerRunnable: Prepared Local resources Map(app.jar -> resource {, scheme: "hdfs", port: -1, file: "/user/somsatapathy/spark/7app.jar", }, size: 359513, timestamp: 1371854387434, type: FILE, visibility: APPLICATION, , spark.jar -> resource {, scheme: "hdfs", port: -1, file: "/user/somsatapathy/spark/7spark.jar", }, size: 63823475, timestamp: 1371854387388, type: FILE, visibility: APPLICATION, )
13/06/21 15:39:51 INFO yarn.WorkerRunnable: Setting up worker with commands: List(java -server -XX:OnOutOfMemoryError='kill %p' -Xms1024m -Xmx1024m spark.executor.StandaloneExecutorBackend akka://sp...@10.58.17.191:56529/user/StandaloneScheduler 1 somsatapathy-MacBookPro1 1 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr)
13/06/21 15:39:51 INFO yarn.WorkerRunnable: Setting up worker with commands: List(java -server -XX:OnOutOfMemoryError='kill %p' -Xms1024m -Xmx1024m spark.executor.StandaloneExecutorBackend akka://sp...@10.58.17.191:56529/user/StandaloneScheduler 2 somsatapathy-MacBookPro1 1 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr)
13/06/21 15:39:54 INFO cluster.StandaloneSchedulerBackend: Registered executor: Actor[akka://sparkExecutor@somsatapathy-MacBookPro1:56543/user/Executor] with ID 1
13/06/21 15:39:54 INFO cluster.StandaloneSchedulerBackend: Registered executor: Actor[akka://sparkExecutor@somsatapathy-MacBookPro1:56542/user/Executor] with ID 2
13/06/21 15:39:54 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager somsatapathy-MacBookPro1:56548 with 647.7 MB RAM
13/06/21 15:39:54 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager somsatapathy-MacBookPro1:56549 with 647.7 MB RAM
13/06/21 15:39:54 INFO cluster.YarnClusterScheduler: YarnClusterScheduler.postStartHook done
Exception in thread "Thread-2" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:154)
Caused by: java.lang.IncompatibleClassChangeError: class spark.InnerClosureFinder has interface org.objectweb.asm.ClassVisitor as super class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at spark.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:69)
at spark.ClosureCleaner$.clean(ClosureCleaner.scala:89)
at spark.SparkContext.clean(SparkContext.scala:729)
at spark.RDD.map(RDD.scala:233)
at spark.examples.SparkPi$.main(SparkPi.scala:18)
at spark.examples.SparkPi.main(SparkPi.scala)
... 5 more
This clearly looks like a binary-compatibility issue: judging by the message, `spark.InnerClosureFinder` was compiled against an ASM version where `org.objectweb.asm.ClassVisitor` is an abstract class (ASM 4+), but at runtime an older ASM (3.x, where `ClassVisitor` is still an interface) is being loaded first. What's not clear is which jar is pulling in the conflicting ASM version.
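One way to narrow it down might be to check which of the jars bundle an ASM `ClassVisitor`. This is only a sketch: `app.jar` and `spark.jar` below are placeholders for local copies of the jars named in the log above, so adjust the paths to wherever you have them.

```shell
# check_asm JAR: report whether the jar bundles ASM's ClassVisitor.
# Mixing ASM 3.x (ClassVisitor is an interface) with ASM 4+ (ClassVisitor
# is an abstract class) on one classpath produces exactly this
# IncompatibleClassChangeError.
check_asm() {
  if [ -f "$1" ]; then
    # List the jar's entries and look for the ASM ClassVisitor class file.
    unzip -l "$1" | grep 'org/objectweb/asm/ClassVisitor' \
      || echo "$1: no bundled ASM"
  else
    echo "$1: not found locally"
  fi
}

# Placeholder paths for the jars uploaded to HDFS in the log.
check_asm app.jar
check_asm spark.jar
```

If both jars (or a Hadoop/YARN jar on the node's classpath) carry different ASM versions, whichever one the classloader sees first wins, which would explain the error.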
Any help here?
Thanks,
Som