1. Use laszip to compress the individual .LAS files (Agisoft output) to LAZ, with the additional options "-rescale 0.01 0.01 0.01 -lax -append -merged".
2. Use lastile to tile the merged .LAZ file into 25 x 25 m tiles (with a 5 m buffer), which yields roughly 2 million points per tile (I used -extra_pass to derive the number of points per tile).
3. Use lassort to spatially sort the individual tiles.
4. Use lascanopy to derive the 1st-percentile elevation per 5 x 5 m grid cell, to obtain a low-noise threshold.
5. Use lasheight to classify isolated points against the elevation surface generated in step 4.
6. Use lasground to classify ground points with -wilderness -ultra_fine and -bulge 1.0.
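For context, the six steps above can be sketched as a shell script. All paths, folder names, and the merged file name are placeholders rather than my actual setup, the lascanopy percentile/step flags are my assumption of the syntax, and the script only echoes each command so it can be reviewed before running:

```shell
# Dry-run sketch of the pipeline above; every path and file name is a
# placeholder. Each command is built as a string and echoed only.

IN_DIR="E:/project/01_las"      # placeholder: folder with Agisoft .LAS output
TILE_DIR="E:/project/02_tiles"  # placeholder: folder for the 25 m tiles

# (1) compress + rescale + merge to a single LAZ
step1="laszip -i $IN_DIR/*.las -rescale 0.01 0.01 0.01 -lax -merged -o $IN_DIR/merged.laz"

# (2) 25 x 25 m tiles with a 5 m buffer; -extra_pass first counts the points
step2="lastile -i $IN_DIR/merged.laz -tile_size 25 -buffer 5 -extra_pass -odir $TILE_DIR -olaz"

# (3) spatially sort each tile
step3="lassort -i $TILE_DIR/*.laz -odix _s -olaz"

# (4) 1st-percentile elevation on a 5 m grid (flags assumed)
step4="lascanopy -i $TILE_DIR/*_s.laz -step 5 -p 1 -obil"

# (5) the lasheight call depends on the chosen noise thresholds, so it
#     is omitted from this sketch

# (6) ground classification with the settings from the list above
step6="lasground -i $TILE_DIR/*_s.laz -wilderness -ultra_fine -bulge 1.0 -odix _g -olaz"

for cmd in "$step1" "$step2" "$step3" "$step4" "$step6"; do echo "$cmd"; done
```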
I get stuck when generating the individual tiled DTMs:
las2dem -lof file_list.2920.txt -elevation -keep_class 2 -use_tile_bb -odir "E:\...\06_Tiles_DTM_las2dem" -obil
--> crashes once it encounters more than 1.5 million points in particular tiles and draws a diagonal black bar across the output.
blast2dem -lof file_list.7028.txt -elevation -keep_class 2 -use_tile_bb -odir "E:\...\06_Tiles_DTM_blast2dem" -obil
--> runs through all the tiles, but for a large number of them it reports "Fatal error: vertex not in hash. need pre-order mesh" and then writes empty .BIL files (see screenshot).
From previous posts I gather that this error usually shows up when tiles with more than 2 billion points are processed, yet my tiles contain only 1-2 million points each. I have tried different "number of cores" settings, running individual tiles (such as the example in the second screenshot), different step sizes, different kill-triangle settings, and GeoTIFF output instead of .BIL, all to no avail. Larger tile sizes (50 m, 100 m, etc.) did not work either.
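For reference, this is roughly how I test an individual tile in isolation. The tile name is a placeholder, the lasinfo/lasvalidate lines are just the sanity checks I run first, and the script only echoes the commands:

```shell
# Single-tile reproduction sketch; the tile name is a placeholder.
# Commands are echoed rather than executed.

TILE="E:/project/02_tiles/tile_1000_2000_g.laz"  # placeholder tile name

inspect="lasinfo -i $TILE -compute_density"   # header and density sanity check
validate="lasvalidate -i $TILE"               # check for LAS spec violations
dem="blast2dem -i $TILE -elevation -keep_class 2 -use_tile_bb -obil"

for cmd in "$inspect" "$validate" "$dem"; do echo "$cmd"; done
```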
Any help would be appreciated. Once this works and I have evaluated the performance, I will seriously consider a commercial license.
Thanks,