Warning: netlib integration


Marco Biglieri

Feb 1, 2017, 9:12:57 AM
to SparkR Developers
Hello,

I have used glm to create a linear regression model, but I am getting these warning messages:

WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
WARN LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK
WARN LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK
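
For context, the model is fit roughly like this (a minimal sketch, assuming Spark 2.x; the data set and column names are placeholders, not my actual data):

library(SparkR)
sparkR.session()

# Placeholder data; the real job uses a different SparkDataFrame
df <- createDataFrame(iris)

# Fit a Gaussian (linear regression) model with SparkR's glm
model <- glm(Sepal_Length ~ Sepal_Width + Petal_Length, data = df, family = "gaussian")
summary(model)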

To resolve these warnings, I added the dependency to spark-defaults.conf as described in the MLlib guide (https://spark.apache.org/docs/1.2.1/mllib-guide.html):

spark.jars.packages=com.github.fommil.netlib:core:1.1.2
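
For reference, I believe the equivalent command-line form would be the following (the script name here is just an example):

spark-submit --packages com.github.fommil.netlib:core:1.1.2 my_script.R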

But after adding this, I get a NullPointerException:

com.github.fommil.netlib#core added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found com.github.fommil.netlib#core;1.1.2 in central
found net.sourceforge.f2j#arpack_combined_all;0.1 in central
:: resolution report :: resolve 398ms :: artifacts dl 9ms
:: modules in use:
com.github.fommil.netlib#core;1.1.2 from central in [default]
net.sourceforge.f2j#arpack_combined_all;0.1 from central in [default]
---------------------------------------------------------------------
|                  |            modules            ||   artifacts   |
|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
|      default     |   2   |   0   |   0   |   0   ||   2   |   0   |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
confs: [default]
0 artifacts copied, 2 already retrieved (0kB/15ms)
Exception in thread "main" java.lang.NullPointerException
at org.apache.spark.deploy.RPackageUtils$.checkManifestForR(RPackageUtils.scala:95)
at org.apache.spark.deploy.RPackageUtils$$anonfun$checkAndBuildRPackage$1.apply(RPackageUtils.scala:179)
at org.apache.spark.deploy.RPackageUtils$$anonfun$checkAndBuildRPackage$1.apply(RPackageUtils.scala:175)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at org.apache.spark.deploy.RPackageUtils$.checkAndBuildRPackage(RPackageUtils.scala:175)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:306)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:158)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Does anyone know how to solve this problem?

Thanks
Marco
