I'm trying to test compaction of Iceberg tables and am getting an error
concerning the MySQL JDBC driver. For historical reasons, we are
still running the Metastore on one of our old Hadoop cluster nodes
and configuring the Hive server on Kubernetes to use it. We can, and
eventually will, move the Metastore to Kubernetes, but it hasn't been
a high priority yet.
Here is the error I'm seeing when running an "alter table ... compact
'major'" query:
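For reference, the statement is of this form (the database and table
names here are placeholders):

```sql
ALTER TABLE db.iceberg_table COMPACT 'major';
```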
2025-05-27T17:13:08,623 ERROR [HiveServer2-Background-Pool: Thread-2820] txn.TxnUtils: Unable to instantiate raw store directly in fastpath mode
java.lang.RuntimeException: Failed to get driver instance for jdbcUrl=jdbc:mysql://kbhadoop01/hive?createDatabaseIfNotExist=true&useSSL=false
at org.apache.hive.com.zaxxer.hikari.util.DriverDataSource.<init>(DriverDataSource.java:114) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hive.com.zaxxer.hikari.pool.PoolBase.initializeDataSource(PoolBase.java:331) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hive.com.zaxxer.hikari.pool.PoolBase.<init>(PoolBase.java:114) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hive.com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:108) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hive.com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:81) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.metastore.datasource.HikariCPDataSourceProvider.create(HikariCPDataSourceProvider.java:102) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.metastore.txn.TxnHandler.setupJdbcConnectionPool(TxnHandler.java:984) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.metastore.txn.TxnHandler.setConf(TxnHandler.java:282) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.metastore.txn.TxnUtils.getTxnStore(TxnUtils.java:151) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.ddl.table.storage.compact.AlterTableCompactOperation.execute(AlterTableCompactOperation.java:90) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.ddl.DDLTask.execute(DDLTask.java:84) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:214) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:105) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.Executor.launchTask(Executor.java:354) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.Executor.launchTasks(Executor.java:327) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.Executor.runTasks(Executor.java:244) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.Executor.execute(Executor.java:105) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:348) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:192) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:145) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:140) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:190) ~[hive-exec-4.0.0.jar:4.0.0]
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:234) ~[hive-service-4.0.0.jar:4.0.0]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:334) ~[hive-service-4.0.0.jar:4.0.0]
at java.security.AccessController.doPrivileged(AccessController.java:712) ~[?:?]
at javax.security.auth.Subject.doAs(Subject.java:439) ~[?:?]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899) ~[hadoop-common-3.3.6.jar:?]
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:354) ~[hive-service-4.0.0.jar:4.0.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) ~[?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
at java.lang.Thread.run(Thread.java:833) ~[?:?]
Caused by: java.sql.SQLException: No suitable driver
at java.sql.DriverManager.getDriver(DriverManager.java:299) ~[java.sql:?]
at org.apache.hive.com.zaxxer.hikari.util.DriverDataSource.<init>(DriverDataSource.java:106) ~[hive-exec-4.0.0.jar:4.0.0]
... 32 more
2025-05-27T17:13:08,623 ERROR [HiveServer2-Background-Pool: Thread-2820] exec.Task: Failed
java.sql.SQLException: No suitable driver
	[same stack trace as above]
2025-05-27T17:13:08,623 ERROR [HiveServer2-Background-Pool: Thread-2820] exec.Task: DDLTask failed, DDL Operation: class org.apache.hadoop.hive.ql.ddl.table.storage.compact.AlterTableCompactOperation
java.lang.RuntimeException: java.lang.RuntimeException: Failed to get driver instance for jdbcUrl=jdbc:mysql://kbhadoop01/hive?createDatabaseIfNotExist=true&useSSL=false
	at org.apache.hadoop.hive.metastore.txn.TxnUtils.getTxnStore(TxnUtils.java:156) ~[hive-exec-4.0.0.jar:4.0.0]
	[same stack trace as above]
Caused by: java.lang.RuntimeException: Failed to get driver instance for jdbcUrl=jdbc:mysql://kbhadoop01/hive?createDatabaseIfNotExist=true&useSSL=false
	[same stack trace as above]
Caused by: java.sql.SQLException: No suitable driver
	[same stack trace as above]
FAILED: Execution Error, return code 40000 from org.apache.hadoop.hive.ql.ddl.DDLTask. No suitable driver
2025-05-27T17:13:08,634 INFO [HiveServer2-Background-Pool: Thread-2820] reexec.ReOptimizePlugin: ReOptimization: retryPossible: false
2025-05-27T17:13:08,634 INFO [HiveServer2-Background-Pool: Thread-2820] reexec.ReExecuteLostAMQueryPlugin: Exception is not a TezRuntimeException, no need to check further with ReExecuteLostAMQueryPlugin
2025-05-27T17:13:08,634 INFO [HiveServer2-Background-Pool: Thread-2820] reexec.ReExecutionDagSubmitPlugin: Got exception message: No suitable driver retryPossible: false
2025-05-27T17:13:08,634 INFO [HiveServer2-Background-Pool: Thread-2820] reexec.ReExecuteOnWriteConflictPlugin: Got exception message: No suitable driver retryPossible: false
2025-05-27T17:13:08,634 ERROR [HiveServer2-Background-Pool: Thread-2820] ql.Driver: FAILED: Execution Error, return code 40000 from org.apache.hadoop.hive.ql.ddl.DDLTask. No suitable driver
I believe I saw possible linkages between workdir-pv/pvc and the
location for the downloaded MySQL driver. Based on earlier
discussions here, we don't use workdir-pv/pvc and instead point Hive
to a location in HDFS. Should I revert that change, or is there
another place I should configure/put the MySQL driver? Or should I
just go ahead and move the Metastore to Kubernetes now?
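In case it helps, the kind of workaround I'm imagining is roughly the
following. This is only a sketch: the pod name, the Connector/J jar
version, and the /opt/hive/lib path are assumptions based on the
standard apache/hive image layout, not something I've confirmed for
our deployment.

```shell
# Sketch: make the MySQL Connector/J jar visible on HiveServer2's
# classpath. Pod name, jar version, and lib path are assumptions;
# adjust for your deployment (or bake the jar into the image).
kubectl cp mysql-connector-j-8.4.0.jar \
    hiveserver2-0:/opt/hive/lib/mysql-connector-j-8.4.0.jar

# Restart HiveServer2 so DriverManager picks up the new driver.
kubectl rollout restart statefulset/hiveserver2
```

I don't know whether that is the recommended approach, hence the
question.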
David
--
David Engel
da...@istwok.net