A "not found: type T" error in Scala means that the type is not visible to the compiler. This is a compile error, not a runtime error, which is one reason the REPL can be disadvantageous compared to writing a set of executable code and submitting it through spark-submit: in the REPL it is hard to discern compile errors from runtime errors.
Either way, if the compiler can't find the type, it is either misspelled or not in scope. Make sure you import the package that contains that type; in this case, import geotrellis.spark._
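To make the scoping rule concrete, here is a minimal, self-contained illustration in plain Scala (not GeoTrellis; the `geo` object and its `SpatialKey` are made up for the example):

```scala
object Scopes {
  // A stand-in "package": the type lives here, not at the top level.
  object geo { case class SpatialKey(col: Int, row: Int) }

  // Without the import below, referring to `SpatialKey` by its bare name
  // would fail to compile with: not found: type SpatialKey
  import geo._
  val key: SpatialKey = SpatialKey(2, 3)
}
```

The same logic applies to the real GeoTrellis types: either import the package or refer to the type by its fully qualified path.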
The "SpatialKey" type is what tells the catalog the type of key you need the raster to be returned with. We currently support SpatialKey and SpaceTimeKey, which holds both spatial and temporal information. If you ingest a raster with a SpatialKey (which the HadoopIngestCommand does), you need to get that layer out and specify the SpatialKey type. This allows the returned RDD[T] to be typed against the key, so what you get back is RDD[SpatialKey].
It might be tough to figure out which types live in which packages, and therefore what to import. Because the development code is not yet well documented, you'll have to dive a bit into the GeoTrellis source. You can go to the GitHub repo, press 't', and type in the type name; if there is a file with the same name (which there often is, and there is for SpatialKey.scala), you can see where that code lives.
Alternatively, until you get comfortable navigating around, you could do a blanket import of the types:
import geotrellis.spark._
import geotrellis.op.local._
import geotrellis.io._
import geotrellis.io.hadoop._
import geotrellis.raster._
import geotrellis.vector._
I think this should cover most of the types. If you run into a type that still cannot be found, let me know and I'll tell you what the appropriate import is.