How do you run the wordcount example with tachyon-0.2.1? The input file input/capacity-scheduler.xml is on HDFS. I thought Tachyon cached files from HDFS automatically on the first read? Or do I need to put files into Tachyon manually with a tool like bin/tachyon tfs?
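In case the automatic caching doesn't kick in, this is roughly the manual workflow I had in mind (copyFromLocal is my guess at the right tfs subcommand; I haven't verified it against the 0.2.1 docs, and the local path is just an example):

```shell
# Create the target directory in Tachyon and copy the input file in by hand
# (assumes the Tachyon master and a local worker are already running).
bin/tachyon tfs mkdir /input
bin/tachyon tfs copyFromLocal capacity-scheduler.xml /input/capacity-scheduler.xml

# Verify the file is actually visible to Tachyon before submitting the job.
bin/tachyon tfs ls /input
```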
hobin-retina:~/work/hadoop-1.1.2 hyoon$ ./bin/hadoop jar hadoop-examples-1.1.2.jar wordcount tachyon://localhost:19998/input/capacity-scheduler.xml tachyon://localhost:19998/capacity-scheduler.xml.wc
2013-05-29 16:19:03.280 java[78058:1703] Unable to load realm info from SCDynamicStore
13/05/29 16:19:04 INFO : Trying to connect master @ localhost/127.0.0.1:19998
13/05/29 16:19:04 INFO : User registered at the master localhost/127.0.0.1:19998 got UserId 8
13/05/29 16:19:04 INFO : Trying to get local worker host : localhost
13/05/29 16:19:04 INFO : Connecting local worker @ hobin-retina/127.0.0.1:29998
13/05/29 16:19:04 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/tmp/hadoop-hyoon/mapred/staging/hyoon/.staging/job_201305291607_0001
13/05/29 16:19:04 ERROR security.UserGroupInformation: PriviledgedActionException as:hyoon cause:org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: tachyon://localhost:19998/input/capacity-scheduler.xml
org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: tachyon://localhost:19998/input/capacity-scheduler.xml
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:235)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:252)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1024)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1041)
at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
I also tried bin/tachyon tfs ls, but had no luck:
hobin-retina:~/work/tachyon-0.2.1 hyoon$ bin/tachyon tfs ls /user
Exception in thread "main" java.lang.NullPointerException
at java.util.Collections.sort(Collections.java:154)
at tachyon.command.TFsShell.run(TFsShell.java:299)
at tachyon.command.TFsShell.main(TFsShell.java:279)
Thanks,
Hobin