Facing exception :: java.lang.NoClassDefFoundError (Could not initialize class org.xerial.snappy.Snappy)


ashu.shri

Jan 4, 2016, 9:03:31 AM1/4/16
to DataStax Spark Connector for Apache Cassandra
Hello,

Earlier we were running Spark happily in our environment, but we have now suddenly started getting this exception.
/tmp has all the permissions and is mounted as tmpfs.
What is going wrong, and what is making Spark unhappy now?
Also, can we configure Spark to use some other location instead of the default /tmp directory?
Any advice will be of great help.


java.lang.UnsatisfiedLinkError: /tmp/snappy-unknown-2243995e-9baf-4ec9-953e-c5b52f04d812-libsnappyjava.so: /tmp/snappy-unknown-2243995e-9baf-4ec9-953e-c5b52f04d812-libsnappyjava.so: failed to map segment from shared object: Operation not permitted
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1965)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1890)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1851)
at java.lang.Runtime.load0(Runtime.java:795)
at java.lang.System.load(System.java:1062)
at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:166)
at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:145)
at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
at org.xerial.snappy.SnappyInputStream.hasNextChunk(SnappyInputStream.java:358)
at org.xerial.snappy.SnappyInputStream.rawRead(SnappyInputStream.java:167)
at org.xerial.snappy.SnappyInputStream.read(SnappyInputStream.java:150)
at java.io.ObjectInputStream$PeekInputStream.read(ObjectInputStream.java:2310)

Thanks!

Russell Spitzer

Jan 4, 2016, 12:53:56 PM1/4/16
to DataStax Spark Connector for Apache Cassandra
This can also happen if your tmp mount point is out of space, but I haven't really seen it in any other instances.
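On the original question of moving off /tmp: the "failed to map segment from shared object: Operation not permitted" message is also what you see when /tmp is mounted noexec, because the JVM cannot map the extracted libsnappyjava.so as executable. snappy-java reads the `org.xerial.snappy.tempdir` system property (falling back to `java.io.tmpdir`) to decide where to extract its native library, so a sketch like the one below redirects extraction elsewhere. The path `/var/spark-tmp` is just an example, not a recommendation; use any writable directory on an exec-allowed mount.

```java
// Sketch: point snappy-java's native-library extraction away from /tmp.
// The property must be set before org.xerial.snappy.Snappy is first
// class-loaded, i.e. before any Snappy compression/decompression happens.
public class SnappyTempDir {
    public static void main(String[] args) {
        // Example directory -- substitute one that exists, is writable by
        // the Spark user, and is NOT on a noexec mount.
        System.setProperty("org.xerial.snappy.tempdir", "/var/spark-tmp");

        // snappy-java will now extract libsnappyjava.so under /var/spark-tmp
        // instead of java.io.tmpdir when Snappy is first used.
        System.out.println(System.getProperty("org.xerial.snappy.tempdir"));
    }
}
```

For Spark specifically, you can pass the same property to executors and the driver via the standard extra-JVM-options settings, e.g. `--conf "spark.executor.extraJavaOptions=-Dorg.xerial.snappy.tempdir=/var/spark-tmp"` (and likewise `spark.driver.extraJavaOptions`) on `spark-submit`.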
