I'm attempting to query Presto through Spark's JDBC data source and running into issues.
Initially, I tried the Facebook Presto JDBC driver and hit the issue documented here (http://theckang.com/2016/spark-with-presto/): "com.facebook.presto.jdbc.NotImplementedException: Method Connection.prepareStatement is not yet implemented"
So I switched to the Teradata Presto JDBC driver, which does support prepared statements, but ran into a new issue:
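For reference, this is roughly how I'm invoking the read. The URL, schema, and table name below are placeholders rather than my real values, and the driver class name is my best reading of the Teradata documentation, so adjust it to whatever your jar actually ships:

```scala
// Spark 1.6-style JDBC read (matches the DataFrameReader/ResolvedDataSource
// classes in the stack trace). All option values here are placeholders.
val df = sqlContext.read
  .format("jdbc")
  .option("driver", "com.teradata.presto.jdbc4.Driver")        // driver class name may differ per jar
  .option("url", "jdbc:presto://presto-coordinator:8080/hive")
  .option("dbtable", "default.my_table")                       // required by Spark's JDBC source
  .load()
```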
java.sql.SQLException: [Teradata][Presto](100200) Connection string is invalid: dbtable is not recognized.
at com.teradata.presto.core.PRConnection.connect(Unknown Source)
at com.teradata.jdbc.common.BaseConnectionFactory.doConnect(Unknown Source)
at com.teradata.jdbc.common.AbstractDriver.connect(Unknown Source)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$2.apply(JdbcUtils.scala:61)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$2.apply(JdbcUtils.scala:52)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:120)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
at org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:57)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
It appears the 'dbtable' property, which Spark's JDBC source requires, gets passed down to the Teradata driver as a connection property and trips an exception in the driver's property-validation check. I'd remove that validation logic myself, but I can't track down the source code for this driver. Is it open source?
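To make concrete what I suspect is happening, here is a pure-Scala sketch of that kind of strict property validation. The accepted property names and the message format are made up for illustration; the real list lives inside the closed-source driver:

```scala
import java.sql.SQLException

object StrictValidationSketch {
  // Made-up subset of properties a driver might accept; illustrative only.
  val knownProperties = Set("user", "password", "SSL")

  // Reject any connection property the driver doesn't recognize,
  // the way the Teradata driver appears to reject Spark's 'dbtable'.
  def validate(props: Map[String, String]): Unit =
    props.keys.find(!knownProperties.contains(_)).foreach { bad =>
      throw new SQLException(s"Connection string is invalid: $bad is not recognized")
    }

  def main(args: Array[String]): Unit = {
    // Spark forwards its own options (including 'dbtable') along with the
    // connection properties, so a strict driver fails before connecting.
    validate(Map("user" -> "me", "dbtable" -> "default.my_table"))
  }
}
```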