
blast2dem fails to produce output for laz files >10GB


sant...@tula.org

Mar 7, 2025, 1:34:11 PM
to LAStools - efficient tools for LiDAR processing
Hi Jochen!

Running licensed version 250304.  

Using blast2dem, we are trying to batch-produce hillshade rasters from ground-only LAZ files (merged with lasmerge from classified tiles, dropping buffers and keeping class 2) for many projects in a historical archive.

The general command used: blast2dem -i $project_ground_merge.laz -step 3 -hillshade -otif
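
For reference, the merge step described above would look something like the sketch below (the input path is a placeholder, and -drop_withheld assumes the tile buffers were flagged as withheld, e.g. by lastile -flag_as_withheld; adjust to however your buffers are marked):

# keep only ground (class 2) points and drop flagged buffer points while merging
lasmerge -i classified_tiles/*.laz -keep_class 2 -drop_withheld -o project_ground_merge.laz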

This workflow has worked for most projects in the archive, but it produces empty (1 KB) TIF output and many .tmp files for large-extent surveys where the ground_merge.laz files are >10GB in size (see attached screenshots).

For those large files I'm getting:  FATAL ERROR: vertex not in hash. Need pre-order mesh.  

Older posts (2015) in the group point to this behaviour being observed for files with more than 2 billion points. Is this still a hard limit for blast2dem? Is there a workaround that doesn't rely on [re-]tiling the data into perhaps a few huge tiles? Our aim is to "quickly" generate these "check hillshades" without artifacts at tile boundaries (hence running blast2dem on the ground_merge.laz files).
 
Thank you very much in advance and congratulations on carrying Martin's work forward and building on it as you guys have. 

[Attachments: Empty_blast2dem_output.jpg, blast2dem_command.jpg]

The ground_merge.laz illustrated in the screenshots has 2,053,371,495 points in it. 

Dave Munge

Mar 7, 2025, 2:32:36 PM
to LAStools - efficient tools for LiDAR processing
Hello,

For performance or other problems with blast2dem, I often use two tips that Martin himself shared with me in the past:

-thin_with_grid 0.25: Martin even told me to increase the number to 1. The idea is that there are far more points in the LAS than needed and that, according to the literature, thinning them causes no problems and is even desirable. Here only one point per 25 cm grid cell is kept.

-translate_raw_xy_at_random 2 2: This applies a small random jitter to the data to avoid errors from a triangulation that would otherwise be degenerate. I do not know whether this tip is still needed with the 2025 version of LAStools, but I still use it so as not to take any chances.
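
Applied to the original command, the two tips would look something like the sketch below (the input name is a placeholder; with the common 0.01 m coordinate scale factor, a raw translation of 2 amounts to a jitter of up to about 2 cm):

# thin to one point per 1 m cell and jitter the raw XY coordinates before triangulating
blast2dem -i project_ground_merge.laz -thin_with_grid 1 -translate_raw_xy_at_random 2 2 -step 3 -hillshade -otif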

Jochen Rapidlasso

Mar 7, 2025, 2:55:46 PM
to LAStools - efficient tools for LiDAR processing
Hi,
the 2 billion points are not a fixed limit; it depends on your data.
In general, tiling is the best approach.
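A buffered tiling followed by per-tile rasters avoids artifacts at tile boundaries; a rough sketch (tile size and buffer are placeholder values, and -use_tile_bb clips each raster back to the tile interior so the buffers do not appear in the output; if your blast2dem build does not accept -use_tile_bb, las2dem does):

# tile the ground points with a buffer, flagging buffer points as withheld
lastile -i classified_tiles/*.laz -keep_class 2 -tile_size 10000 -buffer 50 -flag_as_withheld -odir temp_tiles -olaz
# rasterize each tile, clipping the output to the tile's own bounding box
blast2dem -i temp_tiles/*.laz -step 3 -hillshade -use_tile_bb -odir hillshades -otif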
To optimize the work of the algorithm you can also spatially sort and index your data first; this should allow bigger files.
If you do not need the full resolution in your output, it is a good idea to choose as large a step size as possible, or to thin/grid your data to that step size: this reduces the point count and improves both speed and the maximum feasible file size.
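
A minimal sketch of that route on the merged file (file names are placeholders; lassort rearranges the points into a spatially coherent order, lasindex writes a .lax spatial index next to the file, and the -thin_with_grid value is matched to the -step):

# sort points into spatially coherent order, then add a spatial index
lassort -i project_ground_merge.laz -o project_ground_merge_sorted.laz
lasindex -i project_ground_merge_sorted.laz
# thin to the output resolution while rasterizing
blast2dem -i project_ground_merge_sorted.laz -thin_with_grid 3 -step 3 -hillshade -otif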

Cheers,

Jochen @rapidlasso