Delta Lake 3.0.0rc1 access issue

Venkat Anumala
Jul 6, 2023, 7:05:25 PM
to Delta Lake Users and Developers
Hi Team,

I am trying to run the step below and it is throwing errors. Could you please help us?

pyspark --packages io.delta:delta-core_2.12:3.0.0rc1 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
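For context, the two --conf flags load Delta's SQL extension and swap in Delta's catalog implementation; the same settings can also be applied when building the session in Python, roughly as in the minimal sketch below (assuming the Delta jars are already on the classpath via --packages; the app name is made up):

from pyspark.sql import SparkSession

# Apply the same two settings as the --conf flags above
spark = (
    SparkSession.builder
    .appName("delta-rc-test")  # hypothetical name, any will do
    .config("spark.sql.extensions",
            "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

The full console output follows: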

(venv) C:\Users\kiran\PycharmProjects\Pyspark_practice\venv\Lib\site-packages\pyspark>pyspark --packages io.delta:delta-core_2.12:3.0.0rc1 --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
Python 3.8.10 (tags/v3.8.10:3d8993a, May  3 2021, 11:48:03) [MSC v.1928 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
:: loading settings :: url = jar:file:/C:/Users/kiran/AppData/Local/Packages/PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0/LocalCache/local-packages/Python38/site-packages/pyspark/jars/ivy-2.5.1.jar!/org/apache/ivy/core/settings/ivysettings.xml
Ivy Default Cache set to: C:\Users\kiran\.ivy2\cache
The jars for the packages stored in: C:\Users\kiran\.ivy2\jars
io.delta#delta-core_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-b97995cf-5450-4851-9787-0cd7fd117a20;1.0
        confs: [default]
:: resolution report :: resolve 1461ms :: artifacts dl 0ms
        :: modules in use:
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
                module not found: io.delta#delta-core_2.12;3.0.0rc1

        ==== local-m2-cache: tried

          file:/C:/Users/kiran/.m2/repository/io/delta/delta-core_2.12/3.0.0rc1/delta-core_2.12-3.0.0rc1.pom

          -- artifact io.delta#delta-core_2.12;3.0.0rc1!delta-core_2.12.jar:

          file:/C:/Users/kiran/.m2/repository/io/delta/delta-core_2.12/3.0.0rc1/delta-core_2.12-3.0.0rc1.jar

        ==== local-ivy-cache: tried

          C:\Users\kiran\.ivy2\local\io.delta\delta-core_2.12\3.0.0rc1\ivys\ivy.xml

          -- artifact io.delta#delta-core_2.12;3.0.0rc1!delta-core_2.12.jar:

          C:\Users\kiran\.ivy2\local\io.delta\delta-core_2.12\3.0.0rc1\jars\delta-core_2.12.jar

        ==== central: tried

          https://repo1.maven.org/maven2/io/delta/delta-core_2.12/3.0.0rc1/delta-core_2.12-3.0.0rc1.pom

          -- artifact io.delta#delta-core_2.12;3.0.0rc1!delta-core_2.12.jar:

          https://repo1.maven.org/maven2/io/delta/delta-core_2.12/3.0.0rc1/delta-core_2.12-3.0.0rc1.jar

        ==== spark-packages: tried

          https://repos.spark-packages.org/io/delta/delta-core_2.12/3.0.0rc1/delta-core_2.12-3.0.0rc1.pom

          -- artifact io.delta#delta-core_2.12;3.0.0rc1!delta-core_2.12.jar:

          https://repos.spark-packages.org/io/delta/delta-core_2.12/3.0.0rc1/delta-core_2.12-3.0.0rc1.jar

                ::::::::::::::::::::::::::::::::::::::::::::::

                ::          UNRESOLVED DEPENDENCIES         ::

                ::::::::::::::::::::::::::::::::::::::::::::::

                :: io.delta#delta-core_2.12;3.0.0rc1: not found

                ::::::::::::::::::::::::::::::::::::::::::::::



:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: io.delta#delta-core_2.12;3.0.0rc1: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1528)
        at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185)
        at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:332)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:215)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1111)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Traceback (most recent call last):
  File "C:\Users\kiran\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\site-packages\pyspark\python\pyspark\shell.py", line 66,
 in <module>
    SparkContext._ensure_initialized()
  File "C:\Users\kiran\PycharmProjects\Pyspark_practice\venv\lib\site-packages\pyspark\context.py", line 432, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "C:\Users\kiran\PycharmProjects\Pyspark_practice\venv\lib\site-packages\pyspark\java_gateway.py", line 106, in launch_gateway
    raise RuntimeError("Java gateway process exited before sending its port number")
RuntimeError: Java gateway process exited before sending its port number
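From the resolver report, the artifact was not found in the local .m2 and .ivy2 caches, on Maven Central, or on spark-packages, and the "Java gateway process exited" traceback appears to be just the downstream symptom of spark-submit aborting. My understanding is that in Delta Lake 3.0 the Spark connector artifact was renamed from delta-core to delta-spark, and that release candidates are published to a Sonatype staging repository rather than Maven Central, so the staging repository would have to be passed explicitly. Something along these lines, perhaps (the staging repository URL below is a placeholder; the real one should be in the 3.0.0rc1 release announcement):

pyspark --packages io.delta:delta-spark_2.12:3.0.0rc1 --repositories https://oss.sonatype.org/content/repositories/<staging-repo-id> --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"

Is that the intended way to pull the RC?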

Thanks,
Venkat