"Too many open files" error from Shark server


l...@groupon.com

Sep 6, 2013, 6:42:59 PM
to spark...@googlegroups.com
I'm seeing this error in my Shark server log, after which the server crashes. Any suggestions?

java.lang.RuntimeException: /tmp/spark-b69b6a14-9353-40cd-a1e7-19effc7cf8fe/broadcast-439 (Too many open files)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:141)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1312)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1104)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:937)
        at shark.SharkServerHandler.execute(SharkServer.scala:204)
        at shark.GatedSharkServerHandler.execute(SharkServer.scala:167)
        at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:629)
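A quick way to check whether the server process really is hitting its file-descriptor limit is to compare its open-descriptor count against the soft limit. This is a Linux-specific sketch (it reads /proc); `$$` stands in for the Shark server's PID, which you would substitute:

```shell
# Linux-only diagnostic sketch: count a process's open file descriptors
# and compare against the soft limit. Replace $$ (this shell's own PID)
# with the PID of the Shark server process.
pid=$$
open=$(ls /proc/"$pid"/fd | wc -l)
limit=$(ulimit -Sn)
echo "process $pid has $open open descriptors (soft limit: $limit)"
```

If the open count is close to the soft limit, the "Too many open files" error is expected.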

Patrick Wendell

Sep 6, 2013, 8:39:08 PM
to spark...@googlegroups.com
Hm, can you try adding a line saying "ulimit -n XXX" in
conf/spark-env.sh where XXX is some large number?

- Patrick
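A sketch of what that line might look like in conf/spark-env.sh. The value 16384 is an arbitrary example, not a number from this thread; whatever you pick must not exceed the hard limit reported by `ulimit -Hn`:

```shell
# Show the current soft and hard per-process file-descriptor limits
ulimit -Sn
ulimit -Hn

# In conf/spark-env.sh you would add a single line such as
# (16384 is an example value; it must stay within the hard limit):
#
#   ulimit -n 16384
#
# Demonstrate the mechanism safely here by re-applying the current
# soft limit (setting a value you already have always succeeds):
ulimit -n "$(ulimit -Sn)"
echo "soft limit now: $(ulimit -Sn)"
```

Note that `ulimit` only affects the shell that runs it and its children, so the line has to go somewhere the Shark/Spark processes inherit from, which is why spark-env.sh is suggested.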