Trying to compile Spark 0.6.0 against Hadoop 2.0.0-cdh4.0.0.
I'm running MRv1 (but that shouldn't make a difference, since Spark should only need the HDFS-related classes, yes?).
A hadoop-core-2.0.0-cdh4.0.0 artifact doesn't exist, so I tried building against hadoop-core-2.0.0-mr1-cdh4.0.0 instead, but I get compilation errors, e.g.:
[error] /home/pentreathn/workspace/scala/spark-0.6.0/core/src/main/scala/spark/HadoopWriter.scala:3: object fs is not a member of package org.apache.hadoop
[error] import org.apache.hadoop.fs.FileSystem
[error] ^
[error] /home/pentreathn/workspace/scala/spark-0.6.0/core/src/main/scala/spark/HadoopWriter.scala:4: object fs is not a member of package org.apache.hadoop
[error] import org.apache.hadoop.fs.Path
[error] ^
[error] /home/pentreathn/workspace/scala/spark-0.6.0/core/src/main/scala/spark/HadoopWriter.scala:5: ReflectionUtils is not a member of org.apache.hadoop.util
[error] import org.apache.hadoop.util.ReflectionUtils
[error] ^
[error] /home/pentreathn/workspace/scala/spark-0.6.0/core/src/main/scala/spark/HadoopWriter.scala:6: object io is not a member of package org.apache.hadoop
[error] import org.apache.hadoop.io.NullWritable
[error] ^
[error] /home/pentreathn/workspace/scala/spark-0.6.0/core/src/main/scala/spark/HadoopWriter.scala:7: object io is not a member of package org.apache.hadoop
[error] import org.apache.hadoop.io.Text
[error] ^
[error] error while loading JobConf, Missing dependency 'class org.apache.hadoop.conf.Configuration', required by /home/pentreathn/workspace/scala/spark-0.6.0/lib_managed/jars/org.apache.hadoop/hadoop-core/hadoop-core-2.0.0-mr1-cdh4.0.0.jar(org/apache/hadoop/mapred/JobConf.class)
[error] /home/pentreathn/workspace/scala/spark-0.6.0/core/src/main/scala/spark/SerializableWritable.scala:6: object io is not a member of package org.apache.hadoop
[error] import org.apache.hadoop.io.Writable
[error]
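
For reference, the dependency change amounts to something like the following sbt settings (a rough sketch, not the literal SparkBuild.scala diff; the Cloudera resolver is my assumption about how the CDH artifact gets resolved, but the hadoop-core coordinates match the jar that ends up in lib_managed):

// sbt settings sketch (not the exact SparkBuild.scala change): point the
// Hadoop dependency at the CDH4 MRv1 core artifact and add a resolver that
// can serve CDH artifacts. The resolver URL is an assumption on my part;
// the coordinates match hadoop-core-2.0.0-mr1-cdh4.0.0.jar under lib_managed.
resolvers += "Cloudera Repository" at
  "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies += "org.apache.hadoop" % "hadoop-core" % "2.0.0-mr1-cdh4.0.0"

Given that the errors all complain about org.apache.hadoop.fs/io/util/conf being missing, it looks like the mr1 core jar alone doesn't bring in the Hadoop common classes, so presumably some additional artifact is needed alongside it?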