lassort -gps_time crash


Barbara Hofmann

Oct 23, 2017, 10:09:16 AM
to LAStools - efficient tools for LiDAR processing
Hi all,

I have a problem with lassort.

My data set consists of 2000+ tiles which have been processed using a combination of LAStools and TerraScan, both commercially licensed. Somewhere along the line some of our files became corrupted and we are attempting to fix them with
lassort -gps_time ....
and a couple of other steps.

This approach works great for 1998 of the files, but LAStools crashes on the lassort step with the standard pop-up message for the remaining 2 files. I have looked at the files and cannot find any reason why there should be a problem. They sort fine without the -gps_time flag.

Does anyone know what the cause for this problem could be?

Many thanks,


Kirk Waters - NOAA Federal

Oct 23, 2017, 11:01:55 AM
You might check if those two files are larger than the others. I think lassort does an in-core heap sort and can run out of memory. Just a guess. I suspect Martin would need access to one of the files to diagnose.


Barbara Hofmann

Oct 24, 2017, 3:51:10 AM
to LAStools - efficient tools for LiDAR processing
Hi Kirk,

Thanks for the reply. I thought that too, but the files are not particularly big (72,408 records, 208 KB). Larger files have worked just fine.
I've attached a sample file.



Barbara Hofmann

Oct 25, 2017, 4:01:11 AM
to LAStools - efficient tools for LiDAR processing
We solved the mystery, in case somebody is wondering.
I had duplicate points in those files, which apparently causes problems for lassort.
Once I removed them everything worked as expected.
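For anyone wanting to sanity-check a tile before sorting: the real cleanup uses lasduplicate on the LAS files themselves, but the core idea is just counting repeated (x, y, z) records. A minimal, library-free Python sketch with synthetic point tuples standing in for actual LAS records:

```python
from collections import Counter

def find_duplicates(points):
    """Return the (x, y, z) coordinates that occur more than once."""
    counts = Counter(points)
    return [p for p, n in counts.items() if n > 1]

# Synthetic stand-in for LAS point records: (1.0, 2.0, 3.0) appears twice.
points = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0), (1.0, 2.0, 3.0), (7.0, 8.0, 9.0)]

dupes = find_duplicates(points)  # -> [(1.0, 2.0, 3.0)]
```

lasduplicate operates on the actual LAS records and offers options for what counts as a duplicate, so it is the right tool for production data; this sketch only shows the check conceptually.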



Eduardo Marques

Mar 23, 2020, 2:08:37 PM
to LAStools - efficient tools for LiDAR processing
Hey Barbara,

I ran into the same problem, and so far removing the duplicates is working for me too.

Thanks for the topic.


Apr 21, 2020, 12:36:40 AM
to LAStools - efficient tools for LiDAR processing
Hi Barbara,
Thanks for sharing your solution. I have noticed that duplicate points can be problematic sometimes, especially with Terrestrial Laser Scanning data, which makes me conclude that it's good practice to include cleaning via lasduplicate as part of the workflow. Your thoughts?


Martin Isenburg

Jul 11, 2021, 8:42:43 PM
to LAStools - efficient command line tools for LIDAR processing

Thanks for this feedback. What happens in the presence of excessive duplicate points is that the quicksort routine fails with a stack overflow. In future versions of lassort this error will be caught and the run aborted in a controlled manner, with the suggestion to use lasduplicate to check for excessive duplicate points.
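To illustrate the point (a sketch, not LAStools' actual code): a naive quicksort that recurses on both partitions degrades to recursion depth proportional to n when all keys are equal, because every partition is maximally lopsided. With many points sharing one GPS time, that depth exhausts the stack:

```python
import random

def partition(a, lo, hi):
    """Lomuto partition; elements <= pivot move left. With all-equal keys,
    everything compares <= pivot, so the split is always (n-1, 0)."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quicksort_depth(a, lo=0, hi=None, depth=1):
    """Sort a in place and return the deepest recursion level reached."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return depth
    p = partition(a, lo, hi)
    return max(quicksort_depth(a, lo, p - 1, depth + 1),
               quicksort_depth(a, p + 1, hi, depth + 1))

random.seed(1)
mixed = random.sample(range(256), 256)
shallow = quicksort_depth(mixed)     # distinct shuffled keys: roughly logarithmic depth
deep = quicksort_depth([7.5] * 256)  # all-equal keys: one recursion level per element
```

Production sorts avoid this with three-way ("fat pivot") partitioning or by recursing only into the smaller partition and looping on the larger one, either of which keeps duplicate-heavy input from blowing the stack.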


