A problem caused by jar conflicts when submitting a job via Hue


gab...@narratiive.com

Apr 19, 2018, 3:30:41 AM
to Hue-Users
Hi Gurus,

I am currently running a job on EMR 5.13.0, which ships Spark 2.3.0. When I submit the Spark job from a shell, it works well. However, when I run it as a Hue workflow job, it fails and throws this exception:

Caused by: java.lang.NoSuchMethodError: net.jpountz.lz4.LZ4BlockInputStream.
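
For reference, the plain shell submission that works is roughly this (the class name and jar path here are placeholders, not my real ones):

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --class com.example.MyJob \
      /home/hadoop/my-fat-jar.jar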

After investigating, I found the root cause: Spark 2.3.0 depends on lz4-java 1.4.0, but when I submit the job via Hue, lz4-java 1.3.0 gets pulled in and overrides Spark's lz4-java 1.4.0.
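
(I suspect the 1.3.0 jar is coming in from the Oozie Spark sharelib; it can be checked with something like the following, assuming the default sharelib location on HDFS:

    hdfs dfs -ls /user/oozie/share/lib/lib_*/spark/ | grep -i lz4

but I may be wrong about where it comes from.)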

Then I configured:
oozie.launcher.mapreduce.user.classpath.first=true in the workflow settings, and
 --conf "spark.yarn.user.classpath.first=true" in the options list,

and packaged lz4-java 1.4.0 into my fat jar (see the sketch below for roughly how this ends up in the workflow).
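
As far as I understand, the spark action that Hue generates should end up roughly like this in workflow.xml (the action name, class and jar path are placeholders):

    <action name="spark-job">
        <spark xmlns="uri:oozie:spark-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>oozie.launcher.mapreduce.user.classpath.first</name>
                    <value>true</value>
                </property>
            </configuration>
            <master>yarn</master>
            <mode>cluster</mode>
            <name>my-spark-job</name>
            <class>com.example.MyJob</class>
            <jar>${nameNode}/user/hadoop/my-fat-jar.jar</jar>
            <spark-opts>--conf spark.yarn.user.classpath.first=true</spark-opts>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>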

But it still does not work. Can anyone help me or give me any hints? Thanks in advance; your help will be greatly appreciated.

Best wishes
Gabriel


