Hello,
I'm trying to run a Spark job on Snappy-compressed files and I'm getting the error below. Spark 2.2.0 ships with snappy-java-1.1.2.6.jar, which bundles the Snappy native libraries and is on the executor classpath. From what I understand, that should be sufficient for Snappy support. Is there anything else I should be looking at to get this working?
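In case it's useful, this is roughly the kind of check I can run on a worker node (the `/opt/hadoop/lib/native` path below is just a placeholder for wherever the Hadoop native libraries would live in our install, not a path I've confirmed):

```shell
# Report whether libhadoop.so and its codec bindings (including snappy)
# were found; the UnsatisfiedLinkError suggests this would show "false".
hadoop checknative -a

# If the native libraries exist but aren't being picked up, these
# spark-submit flags point the driver and executors at them
# (directory is a placeholder):
spark-submit \
  --conf spark.driver.extraLibraryPath=/opt/hadoop/lib/native \
  --conf spark.executor.extraLibraryPath=/opt/hadoop/lib/native \
  ...
```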
Thanks,
Frank
org.apache.hadoop.hbase.DoNotRetryIOException (java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z)