Hi.
I am trying, in Java, to reload a persisted indexed SpatialRDD from an object file.
Through the Spark shell and Scala, the example from the documentation works:
var savedRDD = new SpatialRDD[Geometry]
savedRDD.indexedRawRDD = sc.objectFile[SpatialIndex]("hdfs://PATH")
While the Scala example needs only the path argument, the corresponding SparkContext method called from Java requires three arguments (the minimum number of partitions and a ClassTag in addition to the path).
I am not familiar with Scala, but from some examples I came up with the following Java code, which fails to compile with the error shown below:
SpatialRDD<Geometry> spatialRDD = new SpatialRDD<>();
ClassTag<SpatialIndex> spIdxTag = scala.reflect.ClassTag$.MODULE$.apply(SpatialIndex.class);
spatialRDD.indexedRawRDD = spark.sparkContext().objectFile("hdfs://PATH", 1, spIdxTag);
incompatible types: no instance(s) of type variable(s) T exist so that org.apache.spark.rdd.RDD<T> conforms to org.apache.spark.api.java.JavaRDD<org.locationtech.jts.index.SpatialIndex>
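If I understand the error correctly, sparkContext().objectFile returns a Scala RDD, while indexedRawRDD expects a JavaRDD. The only workaround I have come up with so far is to go through JavaSparkContext instead, whose objectFile already returns a JavaRDD and needs no ClassTag (a sketch, not yet tested against my cluster; `spark` is my existing SparkSession):

```java
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.locationtech.jts.index.SpatialIndex;

// Wrap the existing SparkContext in a JavaSparkContext
JavaSparkContext jsc = JavaSparkContext.fromSparkContext(spark.sparkContext());

// JavaSparkContext.objectFile returns JavaRDD<T> directly,
// so no ClassTag argument is required
JavaRDD<SpatialIndex> indexedRDD = jsc.objectFile("hdfs://PATH");
spatialRDD.indexedRawRDD = indexedRDD;
```

I am not sure whether this is the idiomatic way, or whether I should instead wrap the Scala RDD with JavaRDD.fromRDD using the ClassTag.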
Could anyone offer a hint on how objectFile should be called from Java?
Thanks