Need help with feed data ingestion

ha bach Duong

Mar 10, 2021, 9:03:03 PM
to Kylo Community
I am using Kylo 0.10 with HDP 3.1.1.3.1, Hive 3.0.0.3.1, and Spark in local mode.
When I ingest data, the feed fails with the error: ERROR Hive: Exception when loading partition with parameters

I think this is a Hive bug rather than a Kylo bug. Has anyone run into this bug before?

Detailed log:
2021-03-10 23:28:08,875 INFO  21/03/10 23:28:08 ERROR Hive: Exception when loading partition with parameters  partPath=hdfs://localhost.localdomain:8020/model.db/adt_database/usersample/valid/.hive-staging_hive_2021-03-10_23-28-07_447_8751547744747852404-1/-ext-10000/processing_dttm=1615393393455,  table=usersample_valid,  partSpec={processing_dttm=1615393393455},  loadFileType=KEEP_EXISTING,  listBucketingLevel=0,  isAcid=false,  hasFollowingStatsTask=false
2021-03-10 23:28:08,875 INFO  java.lang.NullPointerException
2021-03-10 23:28:08,875 INFO        at org.apache.hadoop.hive.ql.metadata.Hive.addWriteNotificationLog(Hive.java:2999)
2021-03-10 23:28:08,875 INFO        at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:2074)
2021-03-10 23:28:08,875 INFO        at org.apache.hadoop.hive.ql.metadata.Hive$4.call(Hive.java:2501)
2021-03-10 23:28:08,875 INFO        at org.apache.hadoop.hive.ql.metadata.Hive$4.call(Hive.java:2492)
2021-03-10 23:28:08,875 INFO        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2021-03-10 23:28:08,875 INFO        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
2021-03-10 23:28:08,875 INFO        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2021-03-10 23:28:08,875 INFO        at java.lang.Thread.run(Thread.java:745)
2021-03-10 23:28:08,894 INFO  21/03/10 23:28:08 INFO AnnotationConfigApplicationContext: Closing org.springframework.context.annotation.AnnotationConfigApplicationContext@7d0d91a1: startup date [Wed Mar 10 23:27:57 ICT 2021]; root of context hierarchy
2021-03-10 23:28:08,897 INFO  21/03/10 23:28:08 ERROR Validator: Failed to perform validation: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 1 in table usersample_valid with loadPath=hdfs://localhost.localdomain:8020/model.db/adt_database/usersample/valid/.hive-staging_hive_2021-03-10_23-28-07_447_8751547744747852404-1/-ext-10000;
2021-03-10 23:28:08,897 INFO  org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 1 in table usersample_valid with loadPath=hdfs://localhost.localdomain:8020/model.db/adt_database/usersample/valid/.hive-staging_hive_2021-03-10_23-28-07_447_8751547744747852404-1/-ext-10000;
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.HiveExternalCatalog.loadDynamicPartitions(HiveExternalCatalog.scala:871)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.processInsert(InsertIntoHiveTable.scala:205)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.run(InsertIntoHiveTable.scala:99)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:656)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:656)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:656)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.DataFrameWriter.insertInto(DataFrameWriter.scala:322)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.DataFrameWriter.insertInto(DataFrameWriter.scala:308)
2021-03-10 23:28:08,897 INFO        at com.thinkbiganalytics.spark.DataSet20.writeToTable(DataSet20.java:98)
2021-03-10 23:28:08,897 INFO        at com.thinkbiganalytics.spark.datavalidator.StandardDataValidator.writeToTargetTable(StandardDataValidator.java:313)
2021-03-10 23:28:08,897 INFO        at com.thinkbiganalytics.spark.datavalidator.StandardDataValidator.saveValidToTable(StandardDataValidator.java:248)
2021-03-10 23:28:08,897 INFO        at com.thinkbiganalytics.spark.datavalidator.Validator.run(Validator.java:100)
2021-03-10 23:28:08,897 INFO        at com.thinkbiganalytics.spark.datavalidator.Validator.main(Validator.java:54)
2021-03-10 23:28:08,897 INFO        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2021-03-10 23:28:08,897 INFO        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2021-03-10 23:28:08,897 INFO        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2021-03-10 23:28:08,897 INFO        at java.lang.reflect.Method.invoke(Method.java:498)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2021-03-10 23:28:08,897 INFO  Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 1 in table usersample_valid with loadPath=hdfs://localhost.localdomain:8020/model.db/adt_database/usersample/valid/.hive-staging_hive_2021-03-10_23-28-07_447_8751547744747852404-1/-ext-10000
2021-03-10 23:28:08,897 INFO        at org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(Hive.java:2556)
2021-03-10 23:28:08,897 INFO        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2021-03-10 23:28:08,897 INFO        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2021-03-10 23:28:08,897 INFO        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2021-03-10 23:28:08,897 INFO        at java.lang.reflect.Method.invoke(Method.java:498)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.client.Shim_v3_0.loadDynamicPartitions(HiveShim.scala:1313)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadDynamicPartitions$1.apply$mcV$sp(HiveClientImpl.scala:779)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadDynamicPartitions$1.apply(HiveClientImpl.scala:777)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$loadDynamicPartitions$1.apply(HiveClientImpl.scala:777)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:278)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:216)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:215)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:261)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.client.HiveClientImpl.loadDynamicPartitions(HiveClientImpl.scala:777)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadDynamicPartitions$1.apply$mcV$sp(HiveExternalCatalog.scala:883)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadDynamicPartitions$1.apply(HiveExternalCatalog.scala:871)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$loadDynamicPartitions$1.apply(HiveExternalCatalog.scala:871)
2021-03-10 23:28:08,897 INFO        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2021-03-10 23:28:08,897 INFO        ... 35 more
2021-03-10 23:28:08,897 INFO  Caused by: java.util.concurrent.ExecutionException: java.lang.NullPointerException
2021-03-10 23:28:08,897 INFO        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
2021-03-10 23:28:08,897 INFO        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
2021-03-10 23:28:08,897 INFO        at org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(Hive.java:2547)
2021-03-10 23:28:08,897 INFO        ... 52 more
2021-03-10 23:28:08,897 INFO  Caused by: java.lang.NullPointerException
2021-03-10 23:28:08,897 INFO        at org.apache.hadoop.hive.ql.metadata.Hive.addWriteNotificationLog(Hive.java:2999)
2021-03-10 23:28:08,897 INFO        at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:2074)
2021-03-10 23:28:08,897 INFO        at org.apache.hadoop.hive.ql.metadata.Hive$4.call(Hive.java:2501)
2021-03-10 23:28:08,897 INFO        at org.apache.hadoop.hive.ql.metadata.Hive$4.call(Hive.java:2492)
2021-03-10 23:28:08,897 INFO        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2021-03-10 23:28:08,897 INFO        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
2021-03-10 23:28:08,897 INFO        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2021-03-10 23:28:08,897 INFO        at java.lang.Thread.run(Thread.java:745)
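To check whether this is a Hive/Spark problem rather than a Kylo one, I think the same write path can be exercised directly from spark-shell, outside Kylo. Below is a rough sketch: the DataFrame schema and test row are placeholders I made up; only the target table name (usersample_valid), the processing_dttm partition column, and the dynamic-partition configs are taken from the log and stack trace above.

// Minimal spark-shell sketch to reproduce the failing write outside Kylo.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("hive-dynamic-partition-repro")
  .config("hive.exec.dynamic.partition", "true")
  .config("hive.exec.dynamic.partition.mode", "nonstrict")
  .enableHiveSupport()
  .getOrCreate()

// Placeholder single-row DataFrame; the real table schema is assumed here,
// with processing_dttm as the dynamic partition column in the last position.
val df = spark.sql("SELECT 'user1' AS name, '1615393393455' AS processing_dttm")

// Same call path as in the stack trace (DataFrameWriter.insertInto ->
// HiveExternalCatalog.loadDynamicPartitions). If this also throws the
// NullPointerException from Hive.addWriteNotificationLog, the problem is
// in Hive/Spark rather than in Kylo.
df.write.mode("append").insertInto("usersample_valid")

If the spark-shell run fails with the same NullPointerException at Hive.addWriteNotificationLog, that would support the idea that it is a Hive 3 / Spark issue and not something specific to Kylo.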