KryoException in deserialization of class with HashMap field


Ovind

Jun 22, 2016, 9:17:14 AM
to kryo-users

I have a Java class which has several String fields and one HashMap field. I am serializing objects of this class with default Kryo serialization and storing them on disk.

After reading them back into memory, deserialization inside a flatMap function on a Spark RDD fails with the following error:

    16/06/22 11:13:05 WARN TaskSetManager: Lost task 20.0 in stage 3.0 (TID 85, localhost): com.esotericsoftware.kryo.KryoException: Unable to find class: Dadaisme
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:138)
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
    at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:610)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:721)
    at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:126)
    at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
    at prSpark.EmPageRank$1.call(EmPageRank.java:227)
    at prSpark.EmPageRank$1.call(EmPageRank.java:1)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$3$1.apply(JavaRDDLike.scala:149)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$3$1.apply(JavaRDDLike.scala:149)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1595)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
    at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1157)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: Dadaisme
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:340)
    at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
    ... 22 more

The stack trace says com.esotericsoftware.kryo.KryoException: Unable to find class: Dadaisme, i.e. a class named "Dadaisme" cannot be found. But "Dadaisme" is not a class anywhere in my program; it is a String value stored in the HashMap field. I am using the latest version of Kryo. Any suggestions?

Martin Grotzke

Jun 22, 2016, 6:16:58 PM
to kryo-users

It's probably hard for you to provide a self-contained test case (one that involves only Kryo), right? That's what we'd need to debug this. One possibility would be to write and register a custom serializer for the class; this *could* resolve the issue, because a custom serializer writes the map contents with known types, so Kryo never has to read a class name per value.
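To illustrate, here is a minimal sketch of such a custom serializer. The Page class and its fields are hypothetical stand-ins for the original poster's class, and the API shown is the Kryo 2.x/3.x Serializer interface:

```java
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.Serializer;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;
import java.util.HashMap;
import java.util.Map;

// Hypothetical class with the shape described in the question:
// several String fields plus a HashMap field.
class Page {
    String title;
    HashMap<String, Double> links = new HashMap<>();
}

// Writes each field explicitly, so no class names are embedded in
// the serialized form of the map's keys and values.
class PageSerializer extends Serializer<Page> {
    @Override
    public void write(Kryo kryo, Output output, Page page) {
        output.writeString(page.title);
        output.writeInt(page.links.size(), true);
        for (Map.Entry<String, Double> e : page.links.entrySet()) {
            output.writeString(e.getKey());
            output.writeDouble(e.getValue());
        }
    }

    @Override
    public Page read(Kryo kryo, Input input, Class<Page> type) {
        Page page = new Page();
        page.title = input.readString();
        int size = input.readInt(true);
        for (int i = 0; i < size; i++) {
            page.links.put(input.readString(), input.readDouble());
        }
        return page;
    }
}
```

The serializer would then be registered on the Kryo instance used for both writing and reading, e.g. kryo.register(Page.class, new PageSerializer()).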

Cheers,
Martin


--
You received this message because you are subscribed to the "kryo-users" group.
http://groups.google.com/group/kryo-users

Ovind

Jun 30, 2016, 7:59:43 AM
to kryo-users
This exception occurred because of a version mismatch between the Kryo libraries used for serialization and deserialization. Spark uses Kryo 2.x by default, while I had used the latest version (3.x) of Kryo to serialize the objects to disk. The serialization and deserialization versions must match.
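In a Maven build, the fix might look like the following sketch. The version and coordinates are illustrative; check the exact Kryo version your Spark distribution pulls in (e.g. with mvn dependency:tree) and pin that same version in the job that writes the data:

```xml
<!-- Illustrative only: Kryo 2.x is published under groupId
     com.esotericsoftware.kryo, while 3.x moved to com.esotericsoftware.
     Pin whatever version mvn dependency:tree shows Spark using. -->
<dependency>
    <groupId>com.esotericsoftware.kryo</groupId>
    <artifactId>kryo</artifactId>
    <version>2.21</version>
</dependency>
```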

Martin Grotzke

Jun 30, 2016, 7:52:34 PM
to kryo-users

Right, the major versions must match. It's also important to study the release notes carefully when upgrading.

Cheers,
Martin
