Hi all,
I have the following problem: on my HDP 2.5 cluster running Spark (v1.6.2) and Livy, I want to create a Spark Notebook in Hue. Every application we will ever write depends on the package ai.h2o:sparkling-water-core, so I added the following configuration to my spark-defaults.conf: spark.jars.packages = ai.h2o:sparkling-water-core_2.10:1.6.8
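For reference, this is the relevant line in my spark-defaults.conf (as far as I know, the file also accepts plain whitespace between key and value instead of '='):

    # spark-defaults.conf: resolve Sparkling Water from the Maven repositories at startup
    spark.jars.packages = ai.h2o:sparkling-water-core_2.10:1.6.8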
I also set the logging level for both Spark and Livy to DEBUG. As I understand it, Hue creates a Livy interactive session whenever I open a Spark Notebook. Judging from the Livy logs, the package does get loaded (see the attached 'livy-livy-server.out'). The Spark History Server also indicates that the package has been loaded: 'file:/home/livy/.ivy2/jars/ai.h2o_sparkling-water-core_2.10-1.6.8.jar' is listed as an entry in the 'spark.jars' field, and 'http://<LIVY_SERVER>:51720/jars/ai.h2o_h2o-core-3.10.0.7.jar' appears among the classpath entries.
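In case it helps to reproduce this without Hue: my understanding is that opening the notebook is roughly equivalent to creating an interactive session through Livy's REST API, along these lines (a sketch only; 8998 is Livy's default port, adjust to your setup, and the 'conf' map would only be needed if the setting were not already in spark-defaults.conf):

    curl -X POST -H 'Content-Type: application/json' \
         -d '{"kind": "spark", "conf": {"spark.jars.packages": "ai.h2o:sparkling-water-core_2.10:1.6.8"}}' \
         http://<LIVY_SERVER>:8998/sessions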
However, when I then try to import a class from the loaded package in the Spark Notebook in Hue, I get the following error:
> import org.apache.spark.h2o._
error: object h2o is not a member of package org.apache.spark
       import org.apache.spark.h2o._
                               ^
When I start a spark-shell (without any additional configuration on the command line), the same import works!
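For comparison, this is roughly what the working spark-shell session looks like (output abbreviated):

    $ spark-shell
    ...
    scala> import org.apache.spark.h2o._
    import org.apache.spark.h2o._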
What am I doing wrong?