I am compiling scoobi against an in-house fork of Hadoop. I have made minor changes to account for some API differences, and I am now down to a single issue, shown below. Any ideas what the cause is and how to fix it? I'd really appreciate the help.
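In case it matters, this is roughly how the build points at the fork (a minimal sketch; the resolver URL and the "com.example.hadoop" coordinates below are placeholders for our internal artifacts, not the real values):

// build.sbt (sketch): swap the stock Hadoop client for the in-house fork.
// The resolver URL and dependency coordinates are placeholders.
resolvers += "internal-releases" at "https://repo.example.internal/releases"

libraryDependencies ~= { deps =>
  deps.filterNot(_.name.startsWith("hadoop")) :+
    ("com.example.hadoop" % "hadoop-client" % "2.2.0-internal")
}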
Shuang
> compile
[info] Updating {file:/Users/shuangwu/github/scoobi/}scoobi...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 125 Scala sources and 2 Java sources to /Users/shuangwu/github/scoobi/target/scala-2.10/classes...
[warn] Class org.apache.hadoop.mapreduce.TaskAttemptContext not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.RecordWriter not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.TaskAttemptContext not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.RecordWriter not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.lib.input.FileInputFormat not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.lib.input.FileInputFormat not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.InputSplit not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.TaskAttemptContext not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.RecordReader not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.InputSplit not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.TaskAttemptContext not found - continuing with a stub.
[warn] Class org.apache.hadoop.mapreduce.RecordReader not found - continuing with a stub.
[error]
[error] while compiling: /Users/shuangwu/github/scoobi/src/main/scala/com/nicta/scoobi/io/avro/AvroInput.scala
[error] during phase: typer
[error] library version: version 2.10.4
[error] compiler version: version 2.10.4
[error] reconstructed args: -classpath /Users/shuangwu/github/scoobi/target/scala-2.10/classes:/Users/shuangwu/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.10.4.jar:/Users/shuangwu/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.10.4.jar:.........
[error]
[error] last tree to typer: Literal(Constant(()))
[error] symbol: null
[error] symbol definition: null
[error] tpe: Unit
[error] symbol owners:
[error] context owners: class GenericAvroKeyInputFormat -> package avro
[error]
[error] == Enclosing template or block ==
[error]
[error] Template(
[error] AvroKeyInputFormat[T] // parents
[error] ValDef(
[error] private
[error] "_"
[error] <tpt>
[error] <empty>
[error] )
[error] // 2 statements
[error] DefDef( // def <init>: <?> in class GenericAvroKeyInputFormat
[error] <method>
[error] "<init>"
[error] []
[error] List(Nil)
[error] <tpt>
[error] Block(
[error] Apply(
[error] super."<init>"
[error] Nil
[error] )
[error] ()
[error] )
[error] )
[error] DefDef( // override def createRecordReader: <?> in class GenericAvroKeyInputFormat
[error] <method> override
[error] "createRecordReader"
[error] []
[error] // 1 parameter list
[error] ValDef(
[error] <param>
[error] "split"
[error] "InputSplit"
[error] <empty>
[error] )
[error] ValDef(
[error] <param>
[error] "context"
[error] "TaskAttemptContext"
[error] <empty>
[error] )
[error] <tpt>
[error] Block(
[error] ClassDef(
[error] final
[error] "$anon"
[error] []
[error] Template(
[error] RecordReader[AvroKey[T], NullWritable] // parents
[error] ValDef(
[error] private
[error] "_"
[error] <tpt>
[error] <empty>
[error] )
[error] // 8 statements
[error] DefDef(
[error] 0
[error] "<init>"
[error] []
[error] List(Nil)
[error] <tpt>
[error] Block(
[error] Apply(
[error] super."<init>"
[error] Nil
[error] )
[error] ()
[error] )
[error] )
[error] ValDef(
[error] private <mutable> <defaultinit>
[error] "delegate"
[error] AppliedTypeTree(
[error] "AvroKeyRecordReader"
[error] "T"
[error] )
[error] <empty>
[error] )
[error] DefDef(
[error] 0
[error] "initialize"
[error] []
[error] // 1 parameter list
[error] ValDef(
[error] <param>
[error] "split"
[error] "InputSplit"
[error] <empty>
[error] )
[error] ValDef(
[error] <param>
[error] "context"
[error] "TaskAttemptContext"
[error] <empty>
[error] )
[error] "scala"."Unit"
[error] Block(
[error] // 2 statements
[error] ValDef(
[error] 0
[error] "schema"
[error] <tpt>
[error] DataFileReader.openReader(new FsInput(split.asInstanceOf[FileSplit].getPath, context.getConfiguration), new GenericDatumReader[T]())."getSchema"
[error] )
[error] Assign(
[error] "delegate"
[error] Apply(
[error] new AvroKeyRecordReader[T]."<init>"
[error] "schema"
[error] )
[error] )
[error] Apply(
[error] "delegate"."initialize"
[error] // 2 arguments
[error] "split"
[error] "context"
[error] )
[error] )
[error] )
[error] DefDef(
[error] 0
[error] "nextKeyValue"
[error] []
[error] Nil
[error] <tpt>
[error] "delegate"."nextKeyValue"
[error] )
[error] DefDef(
[error] 0
[error] "getCurrentKey"
[error] []
[error] Nil
[error] <tpt>
[error] "delegate"."getCurrentKey"
[error] )
[error] DefDef(
[error] 0
[error] "getCurrentValue"
[error] []
[error] Nil
[error] <tpt>
[error] "delegate"."getCurrentValue"
[error] )
[error] DefDef(
[error] 0
[error] "getProgress"
[error] []
[error] Nil
[error] <tpt>
[error] "delegate"."getProgress"
[error] )
[error] DefDef(
[error] 0
[error] "close"
[error] []
[error] Nil
[error] <tpt>
[error] "delegate"."close"
[error] )
[error] )
[error] )
[error] Apply(
[error] new $anon."<init>"
[error] Nil
[error] )
[error] )
[error] )
[error] )
[error]
[error] == Expanded type of tree ==
[error]
[error] TypeRef(TypeSymbol(final abstract class Unit extends AnyVal))
[error]
[error] uncaught exception during compilation: java.lang.AssertionError
[trace] Stack trace suppressed: run last compile:compile for the full output.
[error] (compile:compile) java.lang.AssertionError: assertion failed: org.apache.hadoop.mapreduce.lib.input.FileInputFormat
[error] Total time: 12 s, completed Dec 2, 2014 12:06:43 AM
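For context, the "Enclosing template" in the dump corresponds to the GenericAvroKeyInputFormat class in AvroInput.scala. Reconstructed from the tree above (the imports are my reading of it; I have not modified this file), it is roughly:

import org.apache.avro.file.DataFileReader
import org.apache.avro.generic.GenericDatumReader
import org.apache.avro.mapred.{AvroKey, FsInput}
import org.apache.avro.mapreduce.{AvroKeyInputFormat, AvroKeyRecordReader}
import org.apache.hadoop.io.NullWritable
import org.apache.hadoop.mapreduce.{InputSplit, RecordReader, TaskAttemptContext}
import org.apache.hadoop.mapreduce.lib.input.FileSplit

// Reads the writer schema out of the Avro container file, then delegates to
// the stock AvroKeyRecordReader for the actual record reading.
class GenericAvroKeyInputFormat[T] extends AvroKeyInputFormat[T] {
  override def createRecordReader(split: InputSplit, context: TaskAttemptContext) =
    new RecordReader[AvroKey[T], NullWritable] {
      private var delegate: AvroKeyRecordReader[T] = _

      def initialize(split: InputSplit, context: TaskAttemptContext): Unit = {
        val schema = DataFileReader.openReader(
          new FsInput(split.asInstanceOf[FileSplit].getPath, context.getConfiguration),
          new GenericDatumReader[T]()).getSchema
        delegate = new AvroKeyRecordReader[T](schema)
        delegate.initialize(split, context)
      }

      def nextKeyValue()     = delegate.nextKeyValue()
      def getCurrentKey()    = delegate.getCurrentKey()
      def getCurrentValue()  = delegate.getCurrentValue()
      def getProgress()      = delegate.getProgress()
      def close()            = delegate.close()
    }
}

My guess is that the assertion fires because AvroKeyInputFormat's superclass, org.apache.hadoop.mapreduce.lib.input.FileInputFormat, is one of the classes the earlier warnings say could not be found on the compile classpath, but I'm not sure why the fork doesn't provide it.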