I'm seeing the following error in my Shark server log, after which the server crashes. Any suggestions?
java.lang.RuntimeException: /tmp/spark-b69b6a14-9353-40cd-a1e7-19effc7cf8fe/broadcast-439 (Too many open files)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:141)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1312)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1104)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:937)
at shark.SharkServerHandler.execute(SharkServer.scala:204)
at shark.GatedSharkServerHandler.execute(SharkServer.scala:167)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:629)
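For context, "Too many open files" generally means the JVM process has hit its per-process file-descriptor limit (Spark's shuffle and broadcast files in /tmp can consume many descriptors). A rough sketch of how to inspect and raise that limit on Linux, assuming a hypothetical server PID of 12345:

```shell
# Show the soft file-descriptor limit for the current shell
# (the Shark server inherits this when launched from here).
ulimit -n

# Count the descriptors the server process currently holds
# (12345 is a hypothetical PID; substitute your server's PID):
# ls /proc/12345/fd | wc -l

# Raise the soft limit for this session before restarting the server.
# This can fail if it exceeds the hard limit (check with `ulimit -Hn`):
# ulimit -n 65536
```

For a persistent change across logins, the limit is typically set in /etc/security/limits.conf (via pam_limits) for the user that runs the server, e.g. a `nofile` entry. Whether this resolves the crash depends on whether the descriptor usage is legitimate load or a leak; if the count keeps climbing under idle load, the limit increase only delays the failure.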