Unit tests fail sporadically with "Address already in use" exception, Spark 0.7


Rajiv Abraham

Mar 18, 2013, 5:03:03 PM
to spark...@googlegroups.com
Hi Guys,

When I rerun my single unit-test file, I sometimes get an "Address already in use" exception. There is no pattern to when it occurs; rerunning the same test repeatedly fails sporadically.

Is there something I have to do other than clearing spark.master.port (see the clearProperty call in the after block below)?

class MyScalaSuite extends FunSpec with ShouldMatchers with BeforeAndAfter {
  
  var sc: SparkContext = _

  before {
    sc = new SparkContext("local", "test")
  }
  
  after {
    if (sc != null) {
      sc.stop()
      sc = null
    }
    // To avoid Akka rebinding to the same port, since it doesn't unbind immediately on shutdown
    System.clearProperty("spark.master.port")
  }
  
... my test
}

EXCEPTION MESSAGE:

[INFO] [03/18/2013 16:55:25.989] [spray-io-worker-0] [IoWorker] IoWorker thread 'spray-io-worker-0' stopped
Exception encountered when invoking run on a nested suite - Failed to bind to: /192.168.0.13:41062 *** ABORTED ***
  org.jboss.netty.channel.ChannelException: Failed to bind to: /192.168.0.13:41062
  at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:298)
  at akka.remote.netty.NettyRemoteServer.start(Server.scala:53)
  at akka.remote.netty.NettyRemoteTransport.start(NettyRemoteSupport.scala:89)
  at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:94)
  at akka.actor.ActorSystemImpl._start(ActorSystem.scala:588)
  at akka.actor.ActorSystemImpl.start(ActorSystem.scala:595)
  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
  at spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:51)
  at spark.SparkEnv$.createFromSystemProperties(SparkEnv.scala:68)
  at spark.SparkContext.<init>(SparkContext.scala:84)
  ...
  Cause: java.net.BindException: Address already in use
  at sun.nio.ch.Net.bind0(Native Method)
  at sun.nio.ch.Net.bind(Net.java:344)
  at sun.nio.ch.Net.bind(Net.java:336)
  at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:199)
  at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
  at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.bind(NioServerSocketPipelineSink.java:138)
  at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.handleServerSocket(NioServerSocketPipelineSink.java:90)
  at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.eventSunk(NioServerSocketPipelineSink.java:64)
  at org.jboss.netty.channel.Channels.bind(Channels.java:569)
  at org.jboss.netty.channel.AbstractChannel.bind(AbstractChannel.java:187)
  ...
Run completed in 3 seconds, 647 milliseconds.
Total number of tests run: 0


Best Regards,
Rajiv

Matei Zaharia

Mar 20, 2013, 5:37:00 PM
to spark...@googlegroups.com
Hi Rajiv,

That property was recently renamed to spark.driver.port. That's probably why the clear isn't working.
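For readers hitting the same problem, the after block from the original post would then clear the renamed property instead. A minimal sketch, assuming Spark 0.7 (where the property Matei mentions is spark.driver.port):

```scala
import org.scalatest.FunSpec
import org.scalatest.BeforeAndAfter
import org.scalatest.matchers.ShouldMatchers
import spark.SparkContext

class MyScalaSuite extends FunSpec with ShouldMatchers with BeforeAndAfter {

  var sc: SparkContext = _

  before {
    sc = new SparkContext("local", "test")
  }

  after {
    if (sc != null) {
      sc.stop()
      sc = null
    }
    // To avoid Akka rebinding to the same port, since it doesn't unbind
    // immediately on shutdown. In Spark 0.7 the property was renamed from
    // spark.master.port to spark.driver.port.
    System.clearProperty("spark.driver.port")
  }

  // ... tests
}
```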

Matei

--
You received this message because you are subscribed to the Google Groups "Spark Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to spark-users...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.

Rajiv Abraham

Mar 24, 2013, 2:56:49 PM
to spark...@googlegroups.com
Ah, thanks Matei!

2013/3/20 Matei Zaharia <ma...@eecs.berkeley.edu>


--
Take care,
Rajiv