Hive on MR3 cannot use JdbcStorageHandler


Carol Chapman

Nov 24, 2021, 11:02:55 AM
to MR3
When I use JdbcStorageHandler in complex queries, I often get the following error:
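For reference, the JDBC-backed tables involved are defined roughly like this (a minimal sketch; the database type, connection URL, credentials, and table names below are placeholders, not my real settings):

```sql
-- Illustrative only: an external table backed by JdbcStorageHandler.
-- All connection details here are placeholders.
CREATE EXTERNAL TABLE orders_jdbc (
  order_id INT,
  amount   DOUBLE
)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
  "hive.sql.database.type" = "MYSQL",
  "hive.sql.jdbc.driver"   = "com.mysql.jdbc.Driver",
  "hive.sql.jdbc.url"      = "jdbc:mysql://dbhost:3306/sales",
  "hive.sql.dbcp.username" = "hive",
  "hive.sql.dbcp.password" = "******",
  "hive.sql.table"         = "orders"
);
```

Simple SELECTs against such a table work; the failure below appears once the table is joined with others in a larger query.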

ERROR : Map 2            1 task           1637769433500 milliseconds: Failed, Some(Failed to create RootInputInitializerManager or VertexManager for Map 2, com.datamonad.mr3.api.common.AMInputInitializerException: WorkerVertex.createRootInputInitializerManager() Map 2
at com.datamonad.mr3.dag.WorkerVertexImpl.createRootInputInitializerManager(WorkerVertex.scala:630)
at com.datamonad.mr3.dag.WorkerVertexImpl.transitionToInitializing(WorkerVertex.scala:474)
at com.datamonad.mr3.dag.WorkerVertexImpl.checkFromCanStartInitializing(WorkerVertex.scala:861)
at com.datamonad.mr3.dag.WorkerVertexImpl.eventInitialize(WorkerVertex.scala:1051)
at com.datamonad.mr3.dag.WorkerVertexImpl.handle(WorkerVertex.scala:961)
at com.datamonad.mr3.dag.WorkerVertex$$anon$1.com$datamonad$mr3$common$AsyncHandling$$super$handle(WorkerVertex.scala:75)
at com.datamonad.mr3.common.AsyncHandling$$anon$1.run(EventHandler.scala:47)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.datamonad.mr3.api.common.MR3UncheckedException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator
at com.datamonad.mr3.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:57)
at com.datamonad.mr3.common.ReflectionUtils.createClazzInstance(ReflectionUtils.java:26)
at com.datamonad.mr3.tez.TezInputInitializer$.getInputInitializer(TezInputInitializer.scala:34)
at com.datamonad.mr3.tez.TezRuntimeEnv.getInputInitializer(TezRuntimeEnv.scala:365)
at com.datamonad.mr3.dag.InputInitializerRunner.<init>(InputInitializerRunner.scala:39)
at com.datamonad.mr3.dag.RootInputInitializerManager$$anonfun$3.apply(RootInputInitializerManager.scala:39)
at com.datamonad.mr3.dag.RootInputInitializerManager$$anonfun$3.apply(RootInputInitializerManager.scala:38)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.immutable.List.map(List.scala:296)
at com.datamonad.mr3.dag.RootInputInitializerManager.<init>(RootInputInitializerManager.scala:38)
at com.datamonad.mr3.dag.WorkerVertexImpl.createRootInputInitializerManager(WorkerVertex.scala:627)
... 11 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.datamonad.mr3.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:45)
... 24 more
Caused by: java.lang.RuntimeException: Failed to load plan: hdfs://xxxxxxxx/tmp/hue/hue/_mr3_session_dir/93d8e942-a66b-4a18-81ec-4c88db9b62d2/hive/_mr3_scratch_dir-c176-742/52f2e7e3-7d99-42a4-9189-701109c466cb/map.xml
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:491)
at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:337)
at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.<init>(HiveSplitGenerator.java:137)
... 28 more
Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: org.apache.hive.storage.jdbc.JdbcInputFormat
Serialization trace:
inputFileFormatClass (org.apache.hadoop.hive.ql.plan.PartitionDesc)
aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:156)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClass(SerializationUtilities.java:183)
at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:326)
at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:314)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:759)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObjectOrNull(SerializationUtilities.java:201)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:132)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:178)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:161)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:216)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:686)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:208)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:703)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:609)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:586)
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:452)
... 30 more
Caused by: java.lang.ClassNotFoundException: org.apache.hive.storage.jdbc.JdbcInputFormat
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
... 53 more
)
ERROR : Terminating unsuccessfully: Vertex failed, vertex_1637251720035_4378_728_04. 
ERROR : Map 7            1 task           1637769433498 milliseconds: Failed, Some(Failed to create RootInputInitializerManager or VertexManager for Map 7, com.datamonad.mr3.api.common.AMInputInitializerException: WorkerVertex.createRootInputInitializerManager() Map 7
[Stack trace identical to the Map 2 trace above, down to the same root cause, java.lang.ClassNotFoundException: org.apache.hive.storage.jdbc.JdbcInputFormat; only the plan file differs: .../_mr3_scratch_dir-c176-742/01ee48fd-c93d-4a92-a558-7d946c72957c/map.xml]
)
ERROR : FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Map 2            1 task           1637769433500 milliseconds: Failed, Some(Failed to create RootInputInitializerManager or VertexManager for Map 2, ...)
Terminating unsuccessfully: Vertex failed, vertex_1637251720035_4378_728_04. Map 7            1 task           1637769433498 milliseconds: Failed, Some(Failed to create RootInputInitializerManager or VertexManager for Map 7, ...)
[This summary repeats, verbatim, the same two stack traces for Map 2 and Map 7 shown above.]
INFO  : Completed executing command(queryId=hue_20211124235709_d14016db-86db-4427-b963-e26fe94daf33); Time taken: 3.34 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Map 2            1 task           1637769433500 milliseconds: Failed, Some(Failed to create RootInputInitializerManager or VertexManager for Map 2, ...)
[Again repeats the Map 2 stack trace shown above; the log is truncated at this point.]
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:452)
... 30 more
Caused by: java.lang.ClassNotFoundException: org.apache.hive.storage.jdbc.JdbcInputFormat
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
... 53 more
)Terminating unsuccessfully: Vertex failed, vertex_1637251720035_4378_728_04. Map 7            1 task           1637769433498 milliseconds: Failed, Some(Failed to create RootInputInitializerManager or VertexManager for Map 7, com.datamonad.mr3.api.common.AMInputInitializerException: WorkerVertex.createRootInputInitializerManager() Map 7
at com.datamonad.mr3.dag.WorkerVertexImpl.createRootInputInitializerManager(WorkerVertex.scala:630)
at com.datamonad.mr3.dag.WorkerVertexImpl.transitionToInitializing(WorkerVertex.scala:474)
at com.datamonad.mr3.dag.WorkerVertexImpl.checkFromCanStartInitializing(WorkerVertex.scala:861)
at com.datamonad.mr3.dag.WorkerVertexImpl.eventInitialize(WorkerVertex.scala:1051)
at com.datamonad.mr3.dag.WorkerVertexImpl.handle(WorkerVertex.scala:961)
at com.datamonad.mr3.dag.WorkerVertex$$anon$1.com$datamonad$mr3$common$AsyncHandling$$super$handle(WorkerVertex.scala:75)
at com.datamonad.mr3.common.AsyncHandling$$anon$1.run(EventHandler.scala:47)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.datamonad.mr3.api.common.MR3UncheckedException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator
at com.datamonad.mr3.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:57)
at com.datamonad.mr3.common.ReflectionUtils.createClazzInstance(ReflectionUtils.java:26)
at com.datamonad.mr3.tez.TezInputInitializer$.getInputInitializer(TezInputInitializer.scala:34)
at com.datamonad.mr3.tez.TezRuntimeEnv.getInputInitializer(TezRuntimeEnv.scala:365)
at com.datamonad.mr3.dag.InputInitializerRunner.<init>(InputInitializerRunner.scala:39)
at com.datamonad.mr3.dag.RootInputInitializerManager$$anonfun$3.apply(RootInputInitializerManager.scala:39)
at com.datamonad.mr3.dag.RootInputInitializerManager$$anonfun$3.apply(RootInputInitializerManager.scala:38)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.immutable.List.map(List.scala:296)
at com.datamonad.mr3.dag.RootInputInitializerManager.<init>(RootInputInitializerManager.scala:38)
at com.datamonad.mr3.dag.WorkerVertexImpl.createRootInputInitializerManager(WorkerVertex.scala:627)
... 11 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.datamonad.mr3.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:45)
... 24 more
Caused by: java.lang.RuntimeException: Failed to load plan: hdfs://xxxxxxxx/tmp/hue/hue/_mr3_session_dir/93d8e942-a66b-4a18-81ec-4c88db9b62d2/hive/_mr3_scratch_dir-c176-742/01ee48fd-c93d-4a92-a558-7d946c72957c/map.xml
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:491)
at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:337)
at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.<init>(HiveSplitGenerator.java:137)
... 28 more
Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: org.apache.hive.storage.jdbc.JdbcInputFormat
Serialization trace:
inputFileFormatClass (org.apache.hadoop.hive.ql.plan.PartitionDesc)
aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:156)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClass(SerializationUtilities.java:183)
at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:326)
at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:314)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:759)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObjectOrNull(SerializationUtilities.java:201)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:132)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:178)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:161)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:216)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:686)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:208)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:703)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:609)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:586)
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:452)
... 30 more
Caused by: java.lang.ClassNotFoundException: org.apache.hive.storage.jdbc.JdbcInputFormat
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
... 53 more
) (state=08S01,code=2)




This problem occurs when a table that uses the JdbcStorageHandler is joined with a table that does not use it.
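
For reference, a minimal sketch of the failing pattern (all table and column names here are hypothetical):

```sql
-- A query reading only the JDBC-backed table may work, but joining it with a
-- regular Hive table fails at split generation with
-- ClassNotFoundException: org.apache.hive.storage.jdbc.JdbcInputFormat.
SELECT o.id, j.f1
FROM   orc_table o                  -- regular (non-JDBC) Hive table
JOIN   dw_jdbc_bridge.jdbc_t1 j    -- table using JdbcStorageHandler
ON     o.id = j.f1;
```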

Carol Chapman

Nov 24, 2021, 11:26:12 AM
to MR3
After further testing, I found that tables using other storage handlers support only simple queries, with results output only to the console. I cannot write data from such a table into another table, nor use such a table in a join query.

Carol Chapman

Nov 24, 2021, 11:34:09 AM
to MR3
I guess the cause of the problem is Tez. The error I encountered is very similar to this issue: [TEZ-4223] Adding new jars or resources after the first DAG runs does not work (ASF JIRA).

Sungwoo Park

Nov 24, 2021, 12:19:07 PM
to Carol Chapman, MR3
Hello,

You can add hive-jdbc-handler-3.1.3.jar to the classpath by adding a path to hive.aux.jars.path in hive-site.xml, e.g.:

<property>
  <name>hive.aux.jars.path</name>
  <value>/home/hive/mr3-run/hive/hivejar/apache-hive-3.1.3-bin/lib/hive-llap-common-3.1.3.jar,/home/gitlab-runner/mr3-run/hive/hivejar/apache-hive-3.1.3-bin/lib/hive-llap-server-3.1.3.jar,/home/gitlab-runner/mr3-run/hive/hivejar/apache-hive-3.1.3-bin/lib/hive-llap-tez-3.1.3.jar,/home/gitlab-runner/mr3-run/hive/hivejar/apache-hive-3.1.3-bin/lib/hive-jdbc-handler-3.1.3.jar</value>
</property>

Then, all queries can use hive-jdbc-handler-3.1.3.jar. (You will need to restart HiveServer2.) If you would like to add hive-jdbc-handler-3.1.3.jar only for specific queries, you can add it as an additional jar, e.g.:

0: jdbc:hive2://indigo1:9852/> add jar hdfs:///tmp/hive-jdbc-handler-3.1.3.jar;

MR3 does not have the problem reported in TEZ-4223 because it creates ClassLoader for each DAG. The patch in TEZ-4223 affects Tez DAGAppMaster, which is not used in MR3.

Cheers,

--- Sungwoo

--
You received this message because you are subscribed to the Google Groups "MR3" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hive-mr3+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/hive-mr3/6bdc7155-4431-40ed-bb82-2d6e774b87bcn%40googlegroups.com.

Carol Chapman

Nov 24, 2021, 10:10:23 PM
to MR3
After I execute the add jar statement, the query still reports an error:

0: jdbc:hive2://xxxxxxxx> add jar hdfs:///user/hue/mr3-lib/hive-jdbc-handler-3.1.3.jar;
INFO  : Added [/tmp/51d28c3a-2595-4668-9bf2-f944be872ae4_resources/hive-jdbc-handler-3.1.3.jar] to class path
INFO  : Added resources: [hdfs:///user/hue/mr3-lib/hive-jdbc-handler-3.1.3.jar]
No rows affected (0.136 seconds)
0: jdbc:hive2://xxxxxxxx> CREATE TEMPORARY EXTERNAL TABLE IF NOT EXISTS  dw_jdbc_bridge.t1
. . . . . . . . . . . . . . . . . . . . . . .> (
. . . . . . . . . . . . . . . . . . . . . . .>    f1           string,
. . . . . . . . . . . . . . . . . . . . . . .>    f2                string,
. . . . . . . . . . . . . . . . . . . . . . .>    f3      string,
. . . . . . . . . . . . . . . . . . . . . . .>    f4          bigint,
. . . . . . . . . . . . . . . . . . . . . . .>    f5                     bigint
. . . . . . . . . . . . . . . . . . . . . . .> )
. . . . . . . . . . . . . . . . . . . . . . .> STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
. . . . . . . . . . . . . . . . . . . . . . .> TBLPROPERTIES (
. . . . . . . . . . . . . . . . . . . . . . .>     "hive.sql.database.type" = "MYSQL",
. . . . . . . . . . . . . . . . . . . . . . .>     "hive.sql.jdbc.driver" = "com.mysql.jdbc.Driver",
. . . . . . . . . . . . . . . . . . . . . . .>     "hive.sql.jdbc.url" = "jdbc:mysql://xxxxxx:3306/xxxx",
. . . . . . . . . . . . . . . . . . . . . . .>     "hive.sql.dbcp.username" = "xxx",
. . . . . . . . . . . . . . . . . . . . . . .>     "hive.sql.dbcp.password" = "xxxx",
. . . . . . . . . . . . . . . . . . . . . . .>     "hive.sql.table" = "asdasdasdasd",
. . . . . . . . . . . . . . . . . . . . . . .>     "hive.sql.dbcp.maxActive" = "1",
. . . . . . . . . . . . . . . . . . . . . . .>     "hive.sql.jdbc.fetch.size" = "1000"
. . . . . . . . . . . . . . . . . . . . . . .> );
INFO  : Compiling command(queryId=hue_20211125110209_7b2a05c9-5c73-4074-b16d-ab5e5d834289): CREATE TEMPORARY EXTERNAL TABLE IF NOT EXISTS  dw_jdbc_bridge.t1
(
f1           string,
f2                string,
f3      string,
f4          bigint,
f5                     bigint
)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
"hive.sql.database.type" = "MYSQL",
"hive.sql.jdbc.driver" = "com.mysql.jdbc.Driver",
"hive.sql.jdbc.url" = "jdbc:mysql://xxxx:3306/xxx",
"hive.sql.dbcp.username" = "xxxx",
"hive.sql.dbcp.password" = "xxxx",
"hive.sql.table" = "xxxxxxxx",
"hive.sql.dbcp.maxActive" = "1",
"hive.sql.jdbc.fetch.size" = "1000"
)
INFO  : Semantic Analysis Completed (retrial = false)
INFO  : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO  : Completed compiling command(queryId=hue_20211125110209_7b2a05c9-5c73-4074-b16d-ab5e5d834289); Time taken: 0.034 seconds
INFO  : Executing command(queryId=hue_20211125110209_7b2a05c9-5c73-4074-b16d-ab5e5d834289): CREATE TEMPORARY EXTERNAL TABLE IF NOT EXISTS  dw_jdbc_bridge.t1
(
f1           string,
f2                string,
f3      string,
f4          bigint,
f5                     bigint
)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
"hive.sql.database.type" = "MYSQL",
"hive.sql.jdbc.driver" = "com.mysql.jdbc.Driver",
"hive.sql.jdbc.url" = "jdbc:mysql://xxxx:3306/xxx",
"hive.sql.dbcp.username" = "xxxx",
"hive.sql.dbcp.password" = "xxxx",
"hive.sql.table" = "xxxxxxxx",
"hive.sql.dbcp.maxActive" = "1",
"hive.sql.jdbc.fetch.size" = "1000"
)
INFO  : Starting task [Stage-0:DDL] in serial mode
INFO  : Completed executing command(queryId=hue_20211125110209_7b2a05c9-5c73-4074-b16d-ab5e5d834289); Time taken: 0.026 seconds
INFO  : OK
No rows affected (0.102 seconds)



Then I execute a simple query:
CREATE TEMPORARY  TABLE IF NOT EXISTS temp.t1 as
select * from
dw_jdbc_bridge.t1;


Then I received the same error message:  java.lang.ClassNotFoundException: org.apache.hive.storage.jdbc.JdbcInputFormat

Sungwoo Park

Nov 25, 2021, 1:56:19 AM
to MR3
It turns out that this is not due to a bug in the Hive-MR3 code, but due to a misconfiguration in mr3-site.xml.

You can set mr3.am.permit.custom.user.class to true, and then everything should work okay. You don't have to change hive-site.xml or execute 'add jar' manually because Hive automatically sends JDBC jars (e.g., hive-jdbc-handler-3.1.3.jar) to MR3. If mr3.am.permit.custom.user.class is set to false, however, the new jars are not included in the classpath and you get ClassNotFoundException.
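
A minimal mr3-site.xml fragment for this setting might look as follows (the property name and its default are taken from the discussion above):

```xml
<property>
  <name>mr3.am.permit.custom.user.class</name>
  <!-- default is false; true lets user-supplied jars (e.g. hive-jdbc-handler)
       be included in the MR3 DAGAppMaster classpath -->
  <value>true</value>
</property>
```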

The default value of mr3.am.permit.custom.user.class is false, and it is also set to false in mr3-site.xml in the MR3 distribution. I am not sure yet if it would be a good decision to set its default value to true. This is because the default value of true means that any user can add custom jar files, and this can be a security problem. Let me think more about this, and if necessary, I will set the default value to true in MR3 1.4.

In your case, setting it to true is probably okay because you use Ranger for authorization.

Thanks a lot for bringing up this issue.

Carol Chapman

Nov 25, 2021, 4:49:31 AM
to MR3
Could we provide a whitelist for loading additional jar files, so that as few unnecessary jars as possible are loaded?
eg: 
<property>
  <name>mr3.am.load.aux.jar.white.list</name>
  <value>xxxxx,xxxxxx,xxxx,xxx</value>
</property>
 
If mr3.am.permit.custom.user.class=false is set, this configuration takes effect.
If mr3.am.permit.custom.user.class=true is set, this configuration is meaningless.

Also, should we document the solution to this problem in Troubleshooting?

Sungwoo Park

Nov 25, 2021, 12:02:08 PM
to MR3
I think using a white list does not fix the problem of rogue users trying to execute malicious code in DAGAppMaster because they could just rename jar files. In my opinion, the permission should be controlled by Ranger. Implementing this configuration key is not hard, but let's think more about why it would be necessary.

Cheers,

--- Sungwoo

Carol Chapman

Nov 29, 2021, 1:02:05 AM
to MR3
Okay, I also think it needs careful consideration.
By the way, is there any plan to release "online pull" in version 1.4?

Sungwoo Park

Nov 29, 2021, 5:35:52 AM
to MR3
No, its development is on hold at the moment :-(

Cheers,
--- Sungwoo

Carol Chapman

May 17, 2022, 4:45:39 AM
to MR3
Hi, I set mr3.am.permit.custom.user.class=true in my cluster, and now the JDBC storage handler appears to work.
But I still found some problems:
CREATE TEMPORARY EXTERNAL TABLE IF NOT EXISTS demo.t1
(
   c1 string,
   c2 string

)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
    "hive.sql.database.type" = "MYSQL",
    "hive.sql.jdbc.driver" = "com.mysql.jdbc.Driver",
    "hive.sql.jdbc.url" = "jdbc:mysql://xxxx:3306/xxxx",

    "hive.sql.dbcp.username" = "xxxx",
    "hive.sql.dbcp.password" = "xxxx",
    --"hive.sql.table" = "t1",
    "hive.sql.query" = "select c1,c2 from t1",

    "hive.sql.dbcp.maxActive" = "1",
    "hive.sql.jdbc.fetch.size" = "1000"
);

If I use hive.sql.table, I can perform normal aggregate queries on this table. But in some scenarios I cannot use this option (the "computation pushdown" and Hive join syntax scenarios do not work well together), so I need to use hive.sql.query.

But if I use hive.sql.query, all my operations return 0 rows. I cannot insert overwrite or aggregate.

Is this a Hive bug? With HDP 3.1.5 Hive I did not see this problem.

Sungwoo Park

May 17, 2022, 7:22:42 AM
to MR3
Do you have an example where "hive.sql.query" correctly returns results? I suspect this is a bug reported in HIVE-22357, which is not backported to Hive 3 on MR3.


Cheers,

--- Sungwoo

Carol Chapman

May 17, 2022, 9:16:34 AM
to MR3
CREATE TEMPORARY EXTERNAL TABLE IF NOT EXISTS demo.t1
(
   c1 string,
   c2 string

)
STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
TBLPROPERTIES (
    "hive.sql.database.type" = "MYSQL",
    "hive.sql.jdbc.driver" = "com.mysql.jdbc.Driver",
    "hive.sql.jdbc.url" = "jdbc:mysql://xxxx:3306/xxxx",

    "hive.sql.dbcp.username" = "xxxx",
    "hive.sql.dbcp.password" = "xxxx",
    --"hive.sql.table" = "t1",
    "hive.sql.query" = "select c1,c2 from t1",

    "hive.sql.dbcp.maxActive" = "1",
    "hive.sql.jdbc.fetch.size" = "1000"
);


select count(1) from demo.t1;          --> result: 0

create table t2 as select * from t1;   --> t2 rows: 0

select * from t1;                      --> row count equals MySQL


I found that this bug was fixed in HDP 3.1.5, but the open source Hive 3.x releases do not fix this problem.

I want to submit a PR for Hive on MR3 that updates the Hive JDBC storage handler code to the HDP-3.1.5.190-1 version.
What should I pay attention to when submitting the PR?

Carol Chapman

May 17, 2022, 9:20:51 AM
to MR3
Or should we take a more radical approach and upgrade the JDBC storage handler code to the latest version on master?

Sungwoo Park

May 17, 2022, 10:55:36 AM
to MR3
Unfortunately pull requests to Hive-MR3 are not accepted because the repository is not updated incrementally (by applying new patches to the branch tip). The repository is maintained in such a way that Apache Hive can also be built after ignoring the last two commits. For example, backporting a target patch often makes it necessary to backport its dependencies first.

At any rate, if you would like to merge patches to Hive-MR3, you have two options:

1) Let us know the patches you would like to merge, and then we will try to merge them in Hive-MR3. We might fail, however, if there are too many conflicts to resolve. If we make it, the Apache Hive community can also benefit from the update because Hive-MR3 is used by Apache Hive users. (Apache Hive 3.1.2 is not maintained seriously, and is unlikely to be maintained in the future. See the recent thread in the Hive mailing list, where the Hive-MR3 repository was discussed in an effort to better maintain Hive 3.1.2.)

2) (Not recommended) You could fork Hive-MR3 and backport patches on your own. You can rebuild Hive-MR3 by following the instruction, as explained in: https://mr3docs.datamonad.com/docs/install/compile-hivemr3/.

We will try to backport patches fixing bugs in the JDBC handler.

Cheers,

--- Sungwoo

Carol Chapman

May 17, 2022, 11:50:56 AM
to MR3

Okay, I think fixing the JDBC handler will not affect MR3 itself. After all, it is only an extension library.

If possible, please upgrade the Hive JDBC storage handler code to the latest master branch.

Also, for your open source repository on GitHub, would you consider opening channels for submitting PRs in the future? That way we could share some of the work and reduce your burden, and the project code could also iterate faster.

Sungwoo Park

May 17, 2022, 12:17:23 PM
to MR3
Alright, we will try to update JDBC storage handler in Hive-MR3.

You are welcome to submit PRs to the MR3 backend in Hive-MR3. However, the Hive side in Hive-MR3 should be maintained by us because it is not updated incrementally. Here is how it is maintained.

1. The Hive-MR3 repo is structured like below. Let's assume that we have backported 100 patches from the master branch:

Hive 3.1.2 - patch 1 - patch 2 - .... - patch 100 - patch #1 for MR3 - patch #2 for MR3

Note that Apache Hive can be built by ignoring the last two commits. (This is why the repository is used by many Apache Hive users.)

2. Now, suppose that we want to backport another patch X. We do NOT add patch X after patch 100 or patch #2 for MR3, as shown below:

(X) Hive 3.1.2 - patch 1 - patch 2 - .... - patch 100 - patch X - patch #1 for MR3 - patch #2 for MR3

(X) Hive 3.1.2 - patch 1 - patch 2 - .... - patch 100 - patch #1 for MR3 - patch #2 for MR3 - patch X

Appending patch X to the end causes a lot of ugly problems and is not really a viable option because of the complexity of Hive. Rather, we try to insert patch X in the right place according to the patch order in the master branch:

(O) Hive 3.1.2 - patch 1 - patch 2 - .... - patch X - ... - patch 100 - patch #1 for MR3 - patch #2 for MR3

In the course of backporting patch X, we may have to insert its dependencies as well.

We have a script that replays all the patches starting from Hive 3.1.2, and whenever we need to backport a new patch, we update the script itself, rebuild the whole repository, run all the tests in Hive on four dedicated servers, analyze the result, and so on. It's a really complicated, painstaking process!

For us, any feedback on which patches to backport will be greatly appreciated, like your suggestion on JDBC driver. We will be happy to take care of the rest :-)

Cheers,

--- Sungwoo

Carol Chapman

Nov 2, 2022, 5:42:31 AM
to MR3
Hi. I compared the JdbcStorageHandler code between Hive on MR3 and Apache Hive, and there is no obvious gap between the two sides.
Therefore, it is very likely that a patch fixing Hive core was not incorporated into MR3, which eventually leads to the query exceptions.