java.lang.NoSuchFieldError: HIVE_TRANSACTIONAL_TABLE_SCAN


Wei Mou

Dec 22, 2016, 8:56:34 AM
to Hue-Users
The architecture: Hue (3.11.0) + Ambari + HDP (2.4.2)


When I submit a shell script from Hue to the cluster, everything is OK, but when I add another Java program to this workflow, the error below occurs.
Can anyone help solve this problem?
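For reference, the failing step is a Sqoop import into HCatalog. Judging from the log below, the command is roughly the following (connection details are anonymized and illustrative; only the table name aaaa is real):

sqoop import \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username dbuser --password-file /user/hue/.sqoop.pwd \
  --table aaaa \
  --delete-target-dir \
  --hcatalog-database default \
  --hcatalog-table aaaa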

---------log info----------
16/12/22 21:52:41 INFO tool.ImportTool: Destination directory aaaa is not present, hence not deleting.
16/12/22 21:52:41 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/12/22 21:52:41 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/12/22 21:52:41 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/12/22 21:52:41 INFO mapreduce.ImportJobBase: Beginning import of aaaa
16/12/22 21:52:41 INFO hcat.SqoopHCatUtilities: Configuring HCatalog for import job
16/12/22 21:52:41 INFO hcat.SqoopHCatUtilities: Configuring HCatalog specific details for job
16/12/22 21:52:41 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `aaaa` AS t LIMIT 1
16/12/22 21:52:41 INFO hcat.SqoopHCatUtilities: Database column names projected : [name, color, age, sex]
16/12/22 21:52:41 INFO hcat.SqoopHCatUtilities: Database column name - info map :
	sex : [Type : 12,Precision : 4,Scale : 0]
	color : [Type : 12,Precision : 4,Scale : 0]
	name : [Type : 12,Precision : 20,Scale : 0]
	age : [Type : 4,Precision : 10,Scale : 0]

16/12/22 21:52:41 INFO hive.metastore: Trying to connect to metastore with URI thrift://node1:9083
16/12/22 21:52:41 INFO hive.metastore: Connected to metastore.
Exception in thread "main" java.lang.NoSuchFieldError: HIVE_TRANSACTIONAL_TABLE_SCAN
	at org.apache.hadoop.hive.ql.io.AcidUtils.setTransactionalTableScan(AcidUtils.java:519)
	at org.apache.hive.hcatalog.mapreduce.FosterStorageHandler.configureInputJobProperties(FosterStorageHandler.java:135)
	at org.apache.hive.hcatalog.common.HCatUtil.getInputJobProperties(HCatUtil.java:458)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.extractPartInfo(InitializeInput.java:161)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:137)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
	at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:343)
	at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:802)
	at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1]
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.impl.MetricsSystemImpl).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
---------------------------------
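From the stack trace, AcidUtils.setTransactionalTableScan references the HiveConf.ConfVars enum constant HIVE_TRANSACTIONAL_TABLE_SCAN, so a NoSuchFieldError there usually means the HiveConf class was loaded from an older Hive jar that predates this field. A rough way to check which jars on a node define HiveConf and whether they contain the field (paths assume the standard HDP layout and may need adjusting):

for j in /usr/hdp/current/sqoop-client/lib/*.jar /usr/hdp/current/hive-client/lib/*.jar; do
  # only look at jars that actually ship HiveConf
  if unzip -l "$j" 2>/dev/null | grep -q 'org/apache/hadoop/hive/conf/HiveConf.class'; then
    # count mentions of the missing enum constant in the ConfVars class
    n=$(unzip -p "$j" 'org/apache/hadoop/hive/conf/HiveConf$ConfVars.class' 2>/dev/null | strings | grep -c HIVE_TRANSACTIONAL_TABLE_SCAN)
    echo "$j -> HIVE_TRANSACTIONAL_TABLE_SCAN occurrences: $n"
  fi
done

A jar reporting 0 is too old for the HCatalog code shown in the stack trace.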

Wei Mou

Dec 25, 2016, 9:34:45 PM
to Hue-Users
Finally, this problem was solved by finding a compatible hive-jdbc jar.

Others should pay close attention to versions on HDP 2.4.2; its Hive version compatibility is not good.
When you add your own jar program to a workflow, its Hive version should match the one shipped with your HDP minor version.
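Concretely, this means making the workflow use the Hive jars that the HDP stack itself ships, rather than a jar bundled from elsewhere. Something like the following (the workflow lib path on HDFS is just an example):

# see which Hive version the HDP stack ships
hive --version
ls /usr/hdp/current/hive-client/lib/hive-jdbc-*.jar

# replace the hive-jdbc jar bundled with the workflow by the stack's own copy
hdfs dfs -rm /user/hue/oozie/workspaces/my_wf/lib/hive-jdbc-*.jar
hdfs dfs -put /usr/hdp/current/hive-client/lib/hive-jdbc-*.jar /user/hue/oozie/workspaces/my_wf/lib/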