Here are my paths:
MESOS_HOME=/usr/local/mesos
SPARK_HOME=/usr/local/mesos/spark
SCALA_HOME=/usr/local/mesos/scala
My spark_env.sh file contains the following:
#!/usr/bin/env bash
# Set Spark environment variables for your site in this file. Some useful
# variables to set are:
export MESOS_HOME=/usr/local/mesos
# - MESOS_NATIVE_LIBRARY, to point to your Mesos native library (libmesos.so)
export MESOS_NATIVE_LIBRARY=/usr/local/mesos/src/.libs/libmesos.so
# - SCALA_HOME, to point to your Scala installation
export SCALA_HOME=/usr/local/mesos/scala-2.9.2
# - SPARK_CLASSPATH, to add elements to Spark's classpath
# - SPARK_JAVA_OPTS, to add JVM options
# - SPARK_MEM, to change the amount of memory used per node (this should
# be in the same format as the JVM's -Xmx option, e.g. 300m or 1g).
export SPARK_MEM=200m
# - SPARK_LIBRARY_PATH, to add extra search paths for native libraries.
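To make sure these variables are actually picked up, I source the file by hand and check that the native library really exists (plain bash; conf/spark_env.sh is where my file lives):

cd /usr/local/mesos/spark
. conf/spark_env.sh
echo "$MESOS_NATIVE_LIBRARY"    # expect /usr/local/mesos/src/.libs/libmesos.so
ls -l "$MESOS_NATIVE_LIBRARY"   # this fails if libmesos.so is not really there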
Can you explain the steps I should follow to run my Spark code? I have also referred to
https://github.com/mesos/spark/wiki/Running-Spark-Demo-Guide but that is not working for me either.
My code is:
import spark.SparkContext
import SparkContext._

object SparkTest {
  def main(args: Array[String]) {
    if (args.length == 0) {
      System.err.println("Usage: SparkTest <host> [<slices>]")
      System.exit(1)
    }
    val spark = new SparkContext(args(0), "SparkTest")
    val slices = if (args.length > 1) args(1).toInt else 2
    // Word count: split each line on spaces, pair each word with 1, sum per word.
    val myFile = spark.textFile("/opt/test.txt")
    val counts = myFile.flatMap(line => line.split(" "))
                       .map(word => (word, 1))
                       .reduceByKey(_ + _)
    counts.saveAsTextFile("/opt/out2.txt")
    System.exit(0)
  }
}
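For the standalone attempt, this is roughly how I compiled it; the core/target classes path is my guess from the sbt build layout, so I may simply have the classpath wrong:

cd /usr/local/mesos/spark
sbt/sbt compile                  # build Spark itself first, using the bundled sbt script
scalac -classpath core/target/scala-2.9.2/classes SparkTest.scala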
Whatever I have tried so far, I am unable to run this code; it fails with the error "value spark not found".
I have also copied it into the examples directory under SPARK_HOME and tried to execute it with the "./run" script (adding "package spark.examples" to the code for that attempt), but it didn't run either; the exact commands I used are below.
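Concretely, that attempt looked like the following; I was unsure of the correct master URL format, so I tried local mode as well as my Mesos master on its default port:

cd /usr/local/mesos/spark
sbt/sbt compile                                        # rebuild so the examples tree picks up SparkTest
./run spark.examples.SparkTest local                   # local mode first, to take Mesos out of the picture
./run spark.examples.SparkTest mesos://127.0.0.1:5050  # my Mesos master; I am not sure this URL format is right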
Please help me here.