Need support in setting up SparkR

VIJAY KUMAR

Jun 21, 2016, 1:30:33 PM
to Hadoop Users Group (HUG) Chennai
Dear All,

Could you please help me resolve the error below, which occurs when I try to load the SparkR package in RStudio?

Hadoop version: 2.7.1
R version: 3.3.0
Scala version: 2.11.8
Java version: 1.8.0_73

> Sys.setenv(SPARK_HOME="/opt/spark-1.6.1/bin/spark")
> .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
> library("SparkR", lib.loc = "/opt/spark-1.6.1/lib/")
> library(SparkR)
> sc <- sparkR.init(master = "local")
Launching java with spark-submit command /opt/spark-1.6.1/bin/spark/bin/spark-submit   sparkr-shell /tmp/RtmpHOnIoD/backend_portad47cdd84b
sh: 1: /opt/spark-1.6.1/bin/spark/bin/spark-submit: not found
Error in sparkR.init(master = "local") :
  JVM is not ready after 10 seconds
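
Looking at the "not found" path, it seems SPARK_HOME may be pointing below bin/ rather than at the Spark install root, so SparkR appends bin/spark-submit to a path that already ends in bin/spark. A minimal shell sketch of the doubling (assuming Spark 1.6.1 is unpacked at /opt/spark-1.6.1, as the paths above suggest):

```shell
# SPARK_HOME as set in the R session above: points below bin/, not at the root
SPARK_HOME=/opt/spark-1.6.1/bin/spark
echo "$SPARK_HOME/bin/spark-submit"
# -> /opt/spark-1.6.1/bin/spark/bin/spark-submit  (the path in the error)

# Pointing SPARK_HOME at the install root instead yields the real launcher path
SPARK_HOME=/opt/spark-1.6.1
echo "$SPARK_HOME/bin/spark-submit"
# -> /opt/spark-1.6.1/bin/spark-submit
```

If that is indeed the cause, setting Sys.setenv(SPARK_HOME = "/opt/spark-1.6.1") in R before calling sparkR.init() should let spark-submit be found.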

Any guidance would be much appreciated.

With Warm Regards,

Vijay Kumar