Got "native snappy library not available: this version of libhadoop was built without snappy support." when using the hive plugin


changz...@gmail.com

Nov 25, 2016, 5:50:13 AM
to azkaban

Hi, I just tried to run a Hive script on Azkaban 3.0 with the Hive plugin, and these errors came up:


"Unable to get CompressorType for codec (org.apache.hadoop.io.compress.SnappyCodec). This is most likely due to missing native libraries for the codec." and

"native snappy library not available: this version of libhadoop was built without snappy support."


My Hadoop is an HDP distribution and supports SnappyCodec, and every configuration setting in Hadoop and in the Azkaban plugin seems correct. Running Hive directly from the command line works fine. I think the problem may come from a missing native-library configuration, so I added the native library path in plugins/jobtypes/commonprivate.properties, but it still doesn't work. I can't find any other place to set it. Any suggestions?
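(Editor's note: before touching the Azkaban config, it may help to confirm what libhadoop itself reports on the executor host. `hadoop checknative` is a standard Hadoop CLI subcommand; the HDP native directory below is taken from the config later in this message and is an assumption about your install.)

```shell
# Ask libhadoop which native codecs it was actually built with.
# On a working install, the "snappy" line should show "true" and a library path.
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a
else
  echo "hadoop not on PATH; run this on the Azkaban executor host"
fi

# Confirm the HDP native directory really contains libhadoop/libsnappy.
# Path below assumes the HDP 2.4 layout from the config in this thread.
ls -l /usr/hdp/2.4.0.0-169/hadoop/lib/native 2>/dev/null || true
```

If `checknative` reports `snappy: false` even from the command line as the Azkaban user, the problem is on the Hadoop side rather than in the plugin configuration.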


My plugins/jobtypes/commonprivate.properties looks like:


hadoop.security.manager.class=azkaban.security.HadoopSecurityManager_H_2_0
hadoop.home=/usr/hdp/2.4.0.0-169/hadoop
#pig.home=/httx/run/pig-0.15.0
hive.home=/usr/hdp/2.4.0.0-169/hive
hadoop.lib=/usr/hdp/2.4.0.0-169/hadoop/lib
azkaban.should.proxy=false
jobtype.global.classpath=${hadoop.home}/conf_bak,${hadoop.home}/lib/*,${hadoop.home}/client/*,${hadoop.home}/*,/usr/hdp/2.4.0.0-169/hadoop-mapreduce/*,/usr/hdp/2.4.0.0-169/hadoop-mapreduce/lib/*,/usr/hdp/2.4.0.0-169/tez/conf,/usr/hdp/2.4.0.0-169/tez/*,/usr/hdp/2.4.0.0-169/tez/lib/*,/usr/hdp/2.4.0.0-169/hadoop-hdfs/*,/usr/hdp/2.4.0.0-169/hadoop-hdfs/lib/*,/usr/hdp/2.4.0.0-169/hadoop-yarn/*,/usr/hdp/2.4.0.0-169/hadoop-yarn/lib/*
jobtype.global.jvm.args=-Djava.library.path=${hadoop.home}/lib/native
obtain.binary.token=true
obtain.namenode.token=true
obtain.jobtracker.token=true
memCheck.enabled=false
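(Editor's note: one possibility worth ruling out, as a sketch rather than a confirmed fix: `-Djava.library.path` in `jobtype.global.jvm.args` only affects the JVM that Azkaban launches. MapReduce tasks that Hive spawns on the cluster resolve native libraries through the cluster-side environment instead. Assuming the natives live in the versioned HDP tree as above, the standard HDP way to pass them to task JVMs is the `mapreduce.admin.user.env` property in mapred-site.xml:)

```xml
<!-- Hypothetical mapred-site.xml fragment: make task JVMs see the HDP native libs.
     The path is an assumption based on the hadoop.home used in this thread. -->
<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/2.4.0.0-169/hadoop/lib/native</value>
</property>
```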


Thanks


Chang

DMP

Mar 2, 2017, 2:02:43 PM
to azkaban, changz...@gmail.com
Did you solve this issue?