Hi Guys,
I set up Tachyon on my EMR Spark cluster.
I managed to start the Tachyon cluster and it's up and running, but when I start the Spark shell I get the error below and the shell comes up only as a plain Scala shell (i.e. without a SparkContext).
It looks like the host is missing from the Spark event log URI (`hdfs:///spark-logs/...`), but I couldn't figure out where to set it...
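In case it helps, my guess is that the event log directory needs a fully qualified HDFS URI. Something like this in spark-defaults.conf — the NameNode host and port here are just placeholders, not my actual values:

```
# spark-defaults.conf
# Replace <namenode-host>:8020 with the cluster's actual fs.defaultFS
# (8020 is a common HDFS NameNode port on EMR; yours may differ)
spark.eventLog.enabled   true
spark.eventLog.dir       hdfs://<namenode-host>:8020/spark-logs
```

But I'm not sure whether this should go in spark-defaults.conf or whether the real problem is a missing fs.defaultFS in the Hadoop config.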
Any help would be welcome!
Thanks,
Ophir
2015-01-20 13:44:59,800 INFO [main] storage.BlockManagerMaster (Logging.scala:logInfo(59)) - Registered BlockManager
java.io.IOException: Incomplete HDFS URI, no host: hdfs:///spark-logs/application_1421752461071_0007
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:143)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:365)
at org.apache.spark.util.FileLogger.<init>(FileLogger.scala:90)
at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:63)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:352)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:986)
at $iwC$$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)