Hello!
I'm working on adding Spark Connect support (#447), and I need to add
an implementation of org.apache.spark.sql.connect.plugin.RelationPlugin to
achieve that. To avoid a dependency mess I'm trying to implement it as a
subproject, but I keep running into problems with the very old version of
sbt and with the spark-packages resolver. Is the sbt-spark-package project
still alive? I see that the last commit was 8 years ago:
https://github.com/databricks/sbt-spark-package And I cannot update the sbt
version because this plugin fails with newer versions.
What do you think about:
- replacing sbt-spark-package with an explicit definition of dependencies
- updating the sbt version (from 0.13.x to 1.9.x)
- rewriting build.sbt using explicit project definitions (lazy val
graphframes = (project in file("."))....)
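To make the proposal concrete, here is a rough sketch of what such an explicit build.sbt could look like, with a separate subproject isolating the Spark Connect plugin. Version numbers, module names, and the spark-connect artifact coordinates are assumptions for illustration, not tested against the actual GraphFrames build:

```scala
// Hypothetical sketch of an explicit multi-project build.sbt for sbt 1.9.x,
// replacing sbt-spark-package. All versions below are illustrative assumptions.
ThisBuild / scalaVersion := "2.12.18"
ThisBuild / version := "0.9.0-SNAPSHOT"

val sparkVersion = "3.5.0" // assumed; pin to whatever Spark version is targeted

// Core module: explicit Spark dependencies marked Provided, since they are
// supplied by the Spark runtime rather than bundled in the artifact.
lazy val graphframes = (project in file("."))
  .settings(
    name := "graphframes",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-graphx" % sparkVersion % Provided,
      "org.apache.spark" %% "spark-sql"    % sparkVersion % Provided
    )
  )

// Subproject holding the RelationPlugin implementation, keeping the
// Spark Connect dependencies out of the core module's classpath.
lazy val connect = (project in file("connect"))
  .dependsOn(graphframes)
  .settings(
    name := "graphframes-connect",
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-connect" % sparkVersion % Provided
    )
  )
```

This layout would let users who don't need Spark Connect depend only on the core artifact, while the connect subproject pulls in the plugin API.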
I'm willing to open an issue, work on the implementation, and open a PR
with the changes.
Best regards,
Sem