root@ts80:~/work/spark-0.7.0# ./spark-shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/root/work/tachyon/target/tachyon-0.2.1-jar-with-dependencies.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/root/work/spark-0.7.0/lib_managed/jars/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 0.7.0
      /_/
Using Scala version 2.9.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_21)
Initializing interpreter...
13/05/31 17:20:56 INFO server.Server: jetty-7.x.y-SNAPSHOT
Creating SparkContext...
13/05/31 17:21:03 INFO slf4j.Slf4jEventHandler: Slf4jEventHandler started
13/05/31 17:21:03 INFO storage.BlockManagerMaster: Registered BlockManagerMaster Actor
13/05/31 17:21:03 INFO storage.MemoryStore: MemoryStore started with capacity 323.9 MB.
13/05/31 17:21:03 INFO storage.DiskStore: Created local directory at /tmp/spark-local-20130531172103-6abf
13/05/31 17:21:03 INFO network.ConnectionManager: Bound socket to port 48098 with id = ConnectionManagerId(ts80,48098)
13/05/31 17:21:03 INFO storage.BlockManagerMaster: Trying to register BlockManager
13/05/31 17:21:03 INFO storage.BlockManagerMasterActor$BlockManagerInfo: Registering block manager ts80:48098 with 323.9 MB RAM
13/05/31 17:21:03 INFO storage.BlockManagerMaster: Registered BlockManager
13/05/31 17:21:03 INFO server.Server: jetty-7.x.y-SNAPSHOT
13/05/31 17:21:03 INFO spark.MapOutputTracker: Registered MapOutputTrackerActor actor
13/05/31 17:21:03 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-078c4401-23fb-4372-873e-92090b027690
13/05/31 17:21:03 INFO server.Server: jetty-7.x.y-SNAPSHOT
13/05/31 17:21:03 INFO io.IoWorker: IoWorker thread 'spray-io-worker-0' started
13/05/31 17:21:04 INFO server.HttpServer: akka://spark/user/BlockManagerHTTPServer started on /0.0.0.0:41214
13/05/31 17:21:04 INFO storage.BlockManagerUI: Started BlockManager web UI at http://ts80:41214
Spark context available as sc.
Type in expressions to have them evaluated.
Type :help for more information.
scala> val s = sc.textFile("tachyon://localhost:19998/user/root/input/capacity-scheduler.xml")
13/05/31 17:30:44 INFO storage.MemoryStore: ensureFreeSpace(56931) called with curMem=0, maxMem=339585269
13/05/31 17:30:44 INFO storage.MemoryStore: Block broadcast_0 stored as values to memory (estimated size 55.6 KB, free 323.8 MB)
s: spark.RDD[String] = MappedRDD[1] at textFile at <console>:12
scala> s.count()
13/05/31 17:31:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/05/31 17:31:12 WARN snappy.LoadSnappy: Snappy native library not loaded
java.io.IOException: No FileSystem for scheme: tachyon
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1408)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1429)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:176)
at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
at spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:52)
at spark.RDD.partitions(RDD.scala:168)
at spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:9)
at spark.RDD.partitions(RDD.scala:168)
at spark.SparkContext.runJob(SparkContext.scala:624)
at spark.RDD.count(RDD.scala:490)
at <init>(<console>:15)
at <init>(<console>:20)
at <init>(<console>:22)
at <init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:11)
at .<clinit>(<console>)
at $export(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:629)
at spark.repl.SparkIMain$Request$$anonfun$10.apply(SparkIMain.scala:890)
at scala.tools.nsc.interpreter.Line$$anonfun$1.apply$mcV$sp(Line.scala:43)
at scala.tools.nsc.io.package$$anon$2.run(package.scala:25)
at java.lang.Thread.run(Thread.java:722)
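The `java.io.IOException: No FileSystem for scheme: tachyon` above means Hadoop's `FileSystem` factory has no implementation class registered for the `tachyon://` URI scheme: the Tachyon jar is on the classpath (as the SLF4J multiple-bindings warning at the top shows), but the scheme-to-class mapping is missing. A plausible fix for Tachyon 0.2.x, assuming its Hadoop client class is `tachyon.hadoop.TFS` (check the class name in your Tachyon jar), is to register the scheme in the Hadoop configuration that Spark picks up, e.g. in `conf/core-site.xml`:

```xml
<configuration>
  <!-- Map the tachyon:// scheme to Tachyon's Hadoop-compatible
       FileSystem implementation. The class name tachyon.hadoop.TFS
       is an assumption for Tachyon 0.2.x; verify it against the
       tachyon-*-jar-with-dependencies.jar you actually deployed. -->
  <property>
    <name>fs.tachyon.impl</name>
    <value>tachyon.hadoop.TFS</value>
  </property>
</configuration>
```

After adding the property, restart `spark-shell` and retry `sc.textFile("tachyon://localhost:19998/...")`; Hadoop's `Path.getFileSystem` should then resolve `tachyon://` to the Tachyon client instead of throwing the exception above.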