Error: "Answer from Java side is empty"

Aleksandr Modestov

May 11, 2016, 5:37:18 AM
to H2O Open Source Scalable Machine Learning - h2ostream
I use Sparkling Water 1.6.3 and Spark 1.6, with Oracle Java 8 or OpenJDK 7.
Every time I transform a Spark DataFrame into an H2O frame, I get this error:

ERROR:py4j.java_gateway:Error while sending or receiving.
Traceback (most recent call last):
  File ".../Spark1.6/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 746, in send_command
    raise Py4JError("Answer from Java side is empty")
Py4JError: Answer from Java side is empty
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server
Traceback (most recent call last):
  File ".../Spark1.6/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 690, in start
    self.socket.connect((self.address, self.port))
  File "/usr/local/anaconda/lib/python2.7/socket.py", line 228, in meth
    return getattr(self._sock,name)(*args)
error: [Errno 111] Connection refused
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server
Traceback (most recent call last):
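
For context, the conversion step that triggers this error typically looks like the sketch below. This assumes the Sparkling Water 1.6-era Python API (`pysparkling`); the exact way `H2OContext` is created may differ between versions, so treat it as illustrative only:

```python
# Sketch only: requires a live Spark 1.6 + Sparkling Water 1.6.x session
# (`sc` and `df` are assumed to already exist in the driver process).
from pysparkling import H2OContext

hc = H2OContext(sc).start()      # starts H2O inside the Spark driver/executor JVMs
h2o_frame = hc.as_h2o_frame(df)  # the Spark -> H2O conversion that fails here
```

If the driver JVM crashes during this call, py4j loses its socket to the JVM, which produces exactly the "Answer from Java side is empty" followed by "Connection refused" messages above.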

My conf-file:
spark.serializer org.apache.spark.serializer.KryoSerializer 
spark.kryoserializer.buffer.max 1500mb
spark.driver.memory 65g
spark.driver.extraJavaOptions -XX:-PrintGCDetails -XX:PermSize=35480m -XX:-PrintGCTimeStamps -XX:-PrintTenuringDistribution  
spark.python.worker.memory 65g
spark.local.dir /data/spark-tmp
spark.ext.h2o.client.log.dir /data/h2o
spark.logConf false
spark.master local[*]
spark.driver.maxResultSize 0
spark.eventLog.enabled True
spark.eventLog.dir /data/spark_log
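
One detail worth checking in the options above: `-XX:-PrintGCDetails` and `-XX:-PrintGCTimeStamps` use a minus sign, which disables those GC logs rather than enabling them, and `-XX:PermSize=35480m` asks the JVM to reserve roughly 35 GB for the permanent generation on top of the 65 GB heap. On Java 8 the PermGen flags are ignored (PermGen was replaced by Metaspace), but on Java 7 that reservation plus the heap can exhaust the machine's RAM by itself. A Java 8 sketch of that line (the 1g cap is an illustrative placeholder, not a recommendation):

```
spark.driver.extraJavaOptions -XX:MaxMetaspaceSize=1g -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
```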

In the code I persist the data (about 5.7 GB).
There is nothing in the H2O log files.
I guess there is enough memory.
Could anyone help me?
Thanks!

Attachment: h2o_127.0.0.1_54321-3-info.log

Michal Malohlava

May 12, 2016, 1:39:59 PM
to h2os...@googlegroups.com
Hi Aleksandr,

Do you have Spark logs from the Spark driver/executors?
It seems the Spark driver JVM died for some reason.

You can try to increase spark.driver.memory or the PermGen size for the driver.
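
For concreteness, applying that suggestion might look like the sketch below (the sizes are placeholders to tune for the machine, and `your_script.py` stands in for the actual job). On Java 7 the relevant flag is `-XX:MaxPermSize`; on Java 8, where PermGen no longer exists, the closest equivalent is `-XX:MaxMetaspaceSize`:

```
spark-submit \
  --conf spark.driver.memory=70g \
  --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=512m" \
  your_script.py
```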

Michal


Aleksandr Modestov

May 13, 2016, 9:04:47 AM
to H2O Open Source Scalable Machine Learning - h2ostream, mic...@h2oai.com
"It seems Spark driver JVM died for some reason." I understand, but I don't see the reason.
"Do you have Spark logs? From Spark driver/executors?" I didn't find anything about the error in them.
"You can try to increase spark.driver.memory or PermGen size for driver." I gave 65 or 70 GB of RAM to Apache Spark. I cannot give more, or I would run out of resources.
I use Java 8, and there is no parameter called "PermGen" there.

On Thursday, May 12, 2016 at 8:39:59 PM UTC+3, Michal Malohlava wrote: