import scala.collection.mutable.ArrayBuffer
import geotrellis.raster.RasterExtent

sc.hadoopGeoTiffRDD(path)
  .flatMap { case (projectedExtent, tile) =>
    // Build a RasterExtent so we can convert (col, row) grid indices
    // to map coordinates.
    val rasterExtent = RasterExtent(projectedExtent.extent, tile.cols, tile.rows)
    val rows = new ArrayBuffer[(Double, Double, Double)](tile.cols * tile.rows)
    tile.foreachDouble { (col, row, z) =>
      val (lng, lat) = rasterExtent.gridToMap(col, row)
      rows.append((lng, lat, z))
    }
    rows
  }
  .map(....)
--
You received this message because you are subscribed to the Google Groups "geotrellis-user" group.
To unsubscribe from this group and stop receiving emails from it, send an email to geotrellis-us...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
What you can do in the meantime is use the "split" method to break large GeoTiffs up into smaller tiles, and then repartition the RDD so that you do not run into out-of-memory errors (which I am assuming is the motivation for the question).
An example is here https://github.com/lossyrob/geotrellis-ned-example/blob/master/src/main/scala/elevation/Main.scala#L89
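A rough sketch of that pattern follows. The GeoTrellis API details here are from memory and may differ between versions; the 512x512 sub-tile size, the `TileLayout`-based `split` call, and the partition count of 200 are all assumptions to tune for your data and cluster (the linked Main.scala shows the full pattern, including tracking each sub-tile's extent).

```scala
import geotrellis.raster._

sc.hadoopGeoTiffRDD(path)
  .flatMap { case (projectedExtent, tile) =>
    // Lay the large tile out as a grid of roughly 512x512 sub-tiles.
    val layout = TileLayout(
      math.ceil(tile.cols / 512.0).toInt,
      math.ceil(tile.rows / 512.0).toInt,
      512, 512)
    // split returns the sub-tiles so each record in the RDD is small.
    tile.split(layout)
  }
  // Spread the smaller tiles across more partitions so no single
  // task has to hold a huge raster in memory.
  .repartition(200)
```

The point is that memory pressure comes from each partition materializing whole source GeoTiffs; splitting first and then repartitioning keeps per-task memory bounded.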