Issues using txt2las to convert a .pts file to .las

Clayton Sexton

Feb 7, 2019, 1:12:25 AM2/7/19
to LAStools - efficient tools for LiDAR processing
I am attempting to convert a 315 GB .pts file (approximately 4.8 billion points) to .las, and it fails/crashes at exactly 80 GB of .las output. I have tried this with the LAStools command line as well as Pointzip, with the same result. Below is the command string I am using in LAStools. The reason I try to force -set_version 1.4 is that after it fails and I run lasinfo on the output, it claims the header size is only 225 and needs to be 375. With smaller tests, however, it correctly sets the header size to 375.

txt2las64 -ipts D:\Cyclone\Exports\Test.pts -o D:\Cyclone\Exports\Test.las -parse xyzi -set_version 1.4 -set_scale 0.001 0.001 0.001

Any help would be greatly appreciated!

Thanks,
Clayton Sexton

Evon Silvia

Feb 19, 2019, 7:36:46 PM2/19/19
to last...@googlegroups.com
Assuming 38 bytes per point (PDRF 8), 80 GB is pretty close to the storage requirement for 2,147,483,647 points, which is the maximum value for a signed 32-bit integer. My guess is that some kind of container used internally is maxing out at the 32-bit integer range.
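A quick back-of-the-envelope check of that estimate (a sketch in Python; 38 bytes is the LAS 1.4 PDRF 8 record length):

```python
# Storage needed for INT32_MAX points at 38 bytes per point (PDRF 8).
INT32_MAX = 2**31 - 1                      # 2,147,483,647
total_bytes = 38 * INT32_MAX
print(f"{total_bytes:,} bytes")            # 81,604,378,586 bytes
print(f"{total_bytes / 2**30:.1f} GiB")    # ~76.0 GiB, i.e. roughly 80 GB
```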

Try breaking your .pts file into smaller files (maybe 1 GB each).

Evon

Martin Isenburg

Feb 19, 2019, 7:52:59 PM2/19/19
to LAStools - efficient command line tools for LIDAR processing
Hello,

Point type 0 is used, so it is 20 bytes per point. Otherwise the same logic applies, with an unsigned 32-bit integer maxing out after 4,294,967,295 points; at 20 bytes each that is exactly 80 GiB, which matches where the write failed. But this file really does seem to contain all its points. Below are the first few lines. As each line is around 51 to 53 bytes (plus the CR), we get 6,379,507,497 * 53 bytes = 314 GB.
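The same arithmetic, checked in Python (point size, counter limit, and line length as stated above; GiB used for the "80gb" and "314 GB" figures):

```python
# An unsigned 32-bit point counter overflows after 4,294,967,295 points;
# at 20 bytes per point (LAS point type 0) that is exactly 80 GiB,
# which matches the write failing at "exactly 80gb" of LAS output.
UINT32_MAX = 2**32 - 1
print(f"{20 * UINT32_MAX / 2**30:.1f} GiB")    # 80.0 GiB

# Total PTS size: point count from the header line times ~53 bytes/line.
points = 6_379_507_497
print(f"{points * 53 / 2**30:.1f} GiB")        # ~314.9 GiB
```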

I stand by my claim that the software engineer who made the Cyclone exporter for the ASCII formats PTS and PTX default to 6 decimal digits per coordinate (aka micrometer resolution) is primarily responsible for mankind not having a colony on Mars yet, due to the insane waste in storage space, transmission bandwidth, and computation time. Simply storing this file with 3 instead of 6 decimal digits, for example, would shorten each line by 9 characters and the file by 6,379,507,497 * 9 bytes = 53 GB.
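Checking that savings claim (3 fewer digits on each of the three coordinates is 9 bytes per line):

```python
# Bytes saved by printing coordinates with 3 instead of 6 decimal digits.
points = 6_379_507_497
saved = points * 9                         # 9 characters fewer per line
print(f"{saved:,} bytes")                  # 57,415,567,473 bytes
print(f"{saved / 2**30:.1f} GiB")          # ~53.5 GiB, the "53 GB" above
```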

(o:

6379507497
552239.748291 3560092.352783 1090.970947 -1431 0 0 0
552239.749268 3560092.353271 1090.970947 -1385 0 0 0
552239.748291 3560092.351807 1090.971436 -1644 0 0 0
552239.749756 3560092.351807 1090.971436 -1411 0 0 0
552239.747803 3560092.352783 1090.970947 -1348 0 0 0
552239.749756 3560092.354736 1090.971436 -1381 0 0 0
552239.938721 3560092.124756 1090.999756 -844 0 0 0
552239.929443 3560092.118408 1090.998779 -962 0 0 0
552239.927979 3560092.118408 1090.999268 -863 0 0 0
552239.928955 3560092.117432 1090.999756 -851 0 0 0
552239.926514 3560092.120361 1090.998779 -1087 0 0 0
552239.930420 3560092.115479 1090.999756 -799 0 0 0
552239.931396 3560092.115967 1090.999756 -856 0 0 0
552239.927490 3560092.120850 1090.998779 -925 0 0 0
552239.927002 3560092.119873 1090.998779 -932 0 0 0
552239.927002 3560092.120361 1090.999268 -1299 0 0 0
552239.925049 3560092.122314 1090.997314 -1120 0 0 0
552239.923584 3560092.123779 1090.997314 -1225 0 0 0
552239.925049 3560092.124268 1090.996826 -1110 0 0 0
[...]

Martin Isenburg

Feb 21, 2019, 11:04:09 PM2/21/19
to LAStools - efficient tools for LiDAR processing
Hello,

after exchanging a number of emails, several experimental runs, and an exchange of console outputs and lasinfo reports, the issue was finally solved. There was a bug in LASlib that *only* occurred when attempting to write more than 4,294,967,295 points to a single uncompressed LAS file. I am surprised this bug was able to survive seven years (or more) in the code. It suggests that nobody has attempted to write LAS files containing over 4 billion points using LASlib, LASzip, or txt2las before, and that is a good thing. This bug does *not* occur when writing compressed LAZ. Hence, had Clayton run

txt2las -i Test.pts -ipts -parse xyzi  -set_scale 0.001 0.001 0.001 -set_version 1.4 -o Test.laz

instead of 

txt2las -i Test.pts -ipts -parse xyzi  -set_scale 0.001 0.001 0.001 -set_version 1.4 -o Test.las

then this bug might have gone unnoticed for another 7 years. It was fixed in the latest release, and Clayton successfully turned the 315 GB PTS file containing 6,379,507,497 points in ASCII format into a 118 GB binary LAS file. After running LASzip on the 118 GB LAS file it went down to a 13.4 GB LAZ file, and apparently the Recap software he intends to import this point cloud into does support LAZ.

Going from a 315 GB PTS file to a 13.4 GB LAZ file is a reduction in size by a factor of more than 23. Are there any Leica Cyclone developers on this list? Adding LAZ to the list of supported Cyclone output formats might be worth considering.
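For reference, the compression factors implied by the file sizes reported above:

```python
# File sizes (GB) as reported in this thread.
pts_gb, las_gb, laz_gb = 315, 118, 13.4
print(f"PTS -> LAZ: {pts_gb / laz_gb:.1f}x")   # 23.5x, "more than 23"
print(f"LAS -> LAZ: {las_gb / laz_gb:.1f}x")   # 8.8x
```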

Thanks for working with me through that, Clayton.

Martin @rapidlasso