--
You received this message because you are subscribed to the Google Groups "Spark Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to spark-users...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/hadoop-2.0.4-alpha/tmp/nm-local-dir/usercache/hadoop/appcache/application_1368458375091_0001/filecache/6622766977966537636/1spark.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/hadoop-2.0.4-alpha/share/hadoop/common/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
13/05/13 11:20:25 INFO yarn.ApplicationMaster: running as user hadoop
13/05/13 11:20:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/05/13 11:20:25 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1368458375091_0001_000001
13/05/13 11:20:25 INFO yarn.ApplicationMaster: Connecting to ResourceManager at master/192.168.56.101:8030
13/05/13 11:20:25 INFO yarn.ApplicationMaster: Registering the ApplicationMaster
13/05/13 11:20:25 INFO yarn.ApplicationMaster: Starting the user JAR in a separate Thread
13/05/13 11:20:25 INFO yarn.ApplicationMaster: Waiting for spark driver to be reachable.
13/05/13 11:20:25 ERROR yarn.ApplicationMaster: Failed to connect to driver at null:null
Usage: SparkPi <master> [<slices>]
What was the command used to execute the SparkPi example?
Regards
Mridul
Error: Could not find or load main class spark.deploy.yarn.ApplicationMaster
hmm, ./repl-bin/ contains an invalid file
ok, trying something else
8.1)
[hadoop@master master]$ SPARK_JAR=./core/target/spark-core-assembly-0.8.0-SNAPSHOT.jar ./run spark.deploy.yarn.Client --jar ./examples/target/scala-2.9.3/spark-examples_2.9.3-0.8.0-SNAPSHOT.jar --class spark.examples.SparkPi
13/05/16 13:36:39 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is inited.
13/05/16 13:36:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/05/16 13:36:39 INFO service.AbstractService: Service:org.apache.hadoop.yarn.client.YarnClientImpl is started.
13/05/16 13:36:39 INFO yarn.Client: Got Cluster metric info from ASM, numNodeManagers=1
13/05/16 13:36:39 INFO yarn.Client: Queue info .. queueName=default, queueCurrentCapacity=0.0, queueMaxCapacity=1.0, queueApplicationCount=54, queueChildQueueCount=0
13/05/16 13:36:39 INFO yarn.Client: Max mem capabililty of resources in this cluster 8192
13/05/16 13:36:39 INFO yarn.Client: Setting up application submission context for ASM
13/05/16 13:36:39 INFO yarn.Client: Preparing Local resources
13/05/16 13:36:40 INFO yarn.Client: Uploading core/target/spark-core-assembly-0.8.0-SNAPSHOT.jar to hdfs://master:9000/user/hadoop/spark/58spark.jar
13/05/16 13:36:41 INFO yarn.Client: Uploading examples/target/scala-2.9.3/spark-examples_2.9.3-0.8.0-SNAPSHOT.jar to hdfs://master:9000/user/hadoop/spark/58app.jar
13/05/16 13:36:41 INFO yarn.Client: Setting up the launch environment
13/05/16 13:36:41 INFO yarn.Client: Setting up container launch context
13/05/16 13:36:41 INFO yarn.Client: Command for the ApplicationMaster: java -server -Xmx640m spark.deploy.yarn.ApplicationMaster --class spark.examples.SparkPi --jar ./examples/target/scala-2.9.3/spark-examples_2.9.3-0.8.0-SNAPSHOT.jar --worker-memory 1024 --worker-cores 1 --num-workers 2 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr
13/05/16 13:36:41 INFO yarn.Client: Submitting application to ASM
13/05/16 13:36:41 INFO client.YarnClientImpl: Submitted application application_1368540452662_0058 to ResourceManager at master/192.168.56.101:8032
13/05/16 13:36:42 INFO yarn.Client: Application report from ASM:
application identifier: application_1368540452662_0058
appId: 58
clientToken: null
appDiagnostics:
appMasterHost: N/A
appQueue: default
appMasterRpcPort: 0
appStartTime: 1368725801935
yarnAppState: ACCEPTED
distributedFinalState: UNDEFINED
appTrackingUrl: master:8088/proxy/application_1368540452662_0058/
appUser: hadoop
13/05/16 13:36:43 INFO yarn.Client: Application report from ASM:
application identifier: application_1368540452662_0058
appId: 58
clientToken: null
appDiagnostics:
appMasterHost: N/A
appQueue: default
appMasterRpcPort: 0
appStartTime: 1368725801935
yarnAppState: ACCEPTED
distributedFinalState: UNDEFINED
appTrackingUrl: master:8088/proxy/application_1368540452662_0058/
appUser: hadoop
13/05/16 13:36:44 INFO yarn.Client: Application report from ASM:
application identifier: application_1368540452662_0058
appId: 58
clientToken: null
appDiagnostics:
appMasterHost: master
appQueue: default
appMasterRpcPort: 0
appStartTime: 1368725801935
yarnAppState: RUNNING
distributedFinalState: UNDEFINED
appTrackingUrl: master:8088/proxy/application_1368540452662_0058/
appUser: hadoop
13/05/16 13:36:45 INFO yarn.Client: Application report from ASM:
application identifier: application_1368540452662_0058
appId: 58
clientToken: null
appDiagnostics: Application application_1368540452662_0058 failed 1 times due to AM Container for appattempt_1368540452662_0058_000001 exited with exitCode: 1 due to:
.Failing this attempt.. Failing the application.
appMasterHost: master
appQueue: default
appMasterRpcPort: 0
appStartTime: 1368725801935
yarnAppState: FAILED
distributedFinalState: FAILED
appTrackingUrl: master:8088/cluster/app/application_1368540452662_0058
appUser: hadoop
hadoop says:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/hadoop-2.0.4-alpha/tmp/nm-local-dir/usercache/hadoop/appcache/application_1368540452662_0058/filecache/-8637636619787811316/58spark.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/hadoop-2.0.4-alpha/share/hadoop/common/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
13/05/16 13:36:43 INFO yarn.ApplicationMaster: running as user hadoop
13/05/16 13:36:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/05/16 13:36:44 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1368540452662_0058_000001
13/05/16 13:36:44 INFO yarn.ApplicationMaster: Connecting to ResourceManager at master/192.168.56.101:8030
13/05/16 13:36:44 INFO yarn.ApplicationMaster: Registering the ApplicationMaster
13/05/16 13:36:44 INFO yarn.ApplicationMaster: Starting the user JAR in a separate Thread
13/05/16 13:36:44 INFO yarn.ApplicationMaster: Waiting for spark driver to be reachable.
13/05/16 13:36:44 ERROR yarn.ApplicationMaster: Failed to connect to driver at null:null
Usage: SparkPi <master> [<slices>]
8.2) ok, adding --args yarn-standalone to the command line
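For reference, the full invocation would then be the step 8.1 command plus the new flag (same jar paths as above); as far as I can tell from the Client options, --args is what hands "yarn-standalone" through to SparkPi as its <master> argument:

```shell
# Step 8.1 command with --args added; the flag value should reach
# spark.examples.SparkPi as its <master> argument.
SPARK_JAR=./core/target/spark-core-assembly-0.8.0-SNAPSHOT.jar ./run \
  spark.deploy.yarn.Client \
  --jar ./examples/target/scala-2.9.3/spark-examples_2.9.3-0.8.0-SNAPSHOT.jar \
  --class spark.examples.SparkPi \
  --args yarn-standalone
```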
failure, hadoop log file says:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/hadoop-2.0.4-alpha/tmp/nm-local-dir/usercache/hadoop/appcache/application_1368540452662_0059/filecache/9179936078225164096/59spark.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/hadoop-2.0.4-alpha/share/hadoop/common/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
13/05/16 13:42:31 INFO yarn.ApplicationMaster: running as user hadoop
13/05/16 13:42:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/05/16 13:42:31 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1368540452662_0059_000001
13/05/16 13:42:31 INFO yarn.ApplicationMaster: Connecting to ResourceManager at master/192.168.56.101:8030
13/05/16 13:42:31 INFO yarn.ApplicationMaster: Registering the ApplicationMaster
13/05/16 13:42:31 INFO yarn.ApplicationMaster: Starting the user JAR in a separate Thread
13/05/16 13:42:31 INFO yarn.ApplicationMaster: Waiting for spark driver to be reachable.
13/05/16 13:42:31 ERROR yarn.ApplicationMaster: Failed to connect to driver at null:null
13/05/16 13:42:31 ERROR yarn.ApplicationMaster: Failed to connect to driver at master:0
13/05/16 13:42:32 ERROR yarn.ApplicationMaster: Failed to connect to driver at master:0
13/05/16 13:42:32 ERROR yarn.ApplicationMaster: Failed to connect to driver at master:0
13/05/16 13:42:32 ERROR yarn.ApplicationMaster: Failed to connect to driver at master:0
13/05/16 13:42:32 INFO slf4j.Slf4jEventHandler: Slf4jEventHandler started
13/05/16 13:42:32 ERROR yarn.ApplicationMaster: Failed to connect to driver at master:0
13/05/16 13:42:32 ERROR yarn.ApplicationMaster: Failed to connect to driver at master:0
Exception in thread "Thread-2" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:154)
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: master/192.168.56.101:0
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:297)
    at akka.remote.netty.NettyRemoteServer.start(Server.scala:53)
    at akka.remote.netty.NettyRemoteTransport.start(NettyRemoteSupport.scala:89)
    at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:94)
    at akka.actor.ActorSystemImpl._start(ActorSystem.scala:588)
    at akka.actor.ActorSystemImpl.start(ActorSystem.scala:595)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)
    at spark.SparkEnv$.createFromSystemProperties(SparkEnv.scala:83)
    at spark.SparkContext.<init>(SparkContext.scala:85)
    at spark.examples.SparkPi$.main(SparkPi.scala:14)
    at spark.examples.SparkPi.main(SparkPi.scala)
    ... 5 more
Caused by: java.net.BindException: Cannot assign requested address
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:344)
    at sun.nio.ch.Net.bind(Net.java:336)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:199)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.bind(NioServerSocketPipelineSink.java:140)
    at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.handleServerSocket(NioServerSocketPipelineSink.java:90)
    at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.eventSunk(NioServerSocketPipelineSink.java:64)
    at org.jboss.netty.channel.Channels.bind(Channels.java:569)
    at org.jboss.netty.channel.AbstractChannel.bind(AbstractChannel.java:189)
    at org.jboss.netty.bootstrap.ServerBootstrap$Binder.channelOpen(ServerBootstrap.java:342)
    at org.jboss.netty.channel.Channels.fireChannelOpen(Channels.java:170)
    at org.jboss.netty.channel.socket.nio.NioServerSocketChannel.<init>(NioServerSocketChannel.java:80)
    at org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory.newChannel(NioServerSocketChannelFactory.java:158)
    at org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory.newChannel(NioServerSocketChannelFactory.java:86)
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:276)
    ... 16 more
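The root cause, "Cannot assign requested address", suggests the container tried to bind a server socket to master/192.168.56.101, an address its host does not actually own (or that resolves wrongly there). A minimal probe of that same errno, using a reserved TEST-NET address no local interface should hold (a hypothetical sketch, not Spark code):

```shell
# Binding to an IP that no local interface owns fails with EADDRNOTAVAIL,
# the errno behind "java.net.BindException: Cannot assign requested address".
# 192.0.2.1 is in the reserved TEST-NET-1 range, so it is never local.
python3 - <<'EOF'
import errno, socket

s = socket.socket()
try:
    s.bind(("192.0.2.1", 0))  # port 0 (ephemeral) is fine; the address is not
    print("bound unexpectedly")
except OSError as e:
    print("errno:", errno.errorcode[e.errno])
finally:
    s.close()
EOF
```

So the thing to check here is what "master" resolves to inside the container versus which addresses the node actually has configured.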
ok then,
9) install maven
10) mvn -Phadoop2-yarn clean install -DskipTests=true
error:
/home/hadoop/spark/master/core/src/main/scala/spark/network/netty/FileHeader.scala
lines 3, 5, 6, 7: object netty is not a member of package io
the same for FileClientHandler.scala and BlockFetcherIterator.scala
so, it does not compile
Looks like you did not edit SparkBuild.scala,
so it did not include the yarn code ...
Use of Maven is preferable since it forces us to choose a profile while not requiring code changes.
Regards
Mridul