hello from @lastools,
the new LASzip 2.0 version for lossless compression of LAS files features:
* random access decompression
* better compression for swath files (when points are in scan order)
* improved GPS time compression for points from interleaved swaths
* full backwards compatibility
now all LAStools will by default produce LASzip v2.0 compressed
content whenever the output is requested in the LAZ format. for
example:
laszip *.las
lasthin -i original.las -o thinned.laz
las2las -i long.las -o short.laz -subseq 0 1000000
las2las -i lidar.las -o lidar_utm15.laz -utm 15T
txt2las -i ascii.txt -parse xyzit -o lidar.laz
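
for those who drive LASlib directly instead of the command line tools,
here is a minimal C++ sketch of how compressed output falls out of the
usual copy loop. it follows the standard LASlib read/write pattern;
treat it as a sketch rather than the canonical example:

#include "lasreader.hpp"
#include "laswriter.hpp"

int main()
{
  // open the uncompressed input
  LASreadOpener lasreadopener;
  lasreadopener.set_file_name("original.las");
  LASreader* lasreader = lasreadopener.open();

  // the ".laz" extension alone requests LASzip compression on output
  LASwriteOpener laswriteopener;
  laswriteopener.set_file_name("original.laz");
  LASwriter* laswriter = laswriteopener.open(&lasreader->header);

  // copy all points; compression happens transparently on write
  while (lasreader->read_point()) laswriter->write_point(&lasreader->point);

  laswriter->close();
  delete laswriter;
  lasreader->close();
  delete lasreader;
  return 0;
}

note that nothing LASzip-specific appears in the code: the output file
extension is all that selects compressed versus uncompressed writing.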
one of the biggest advantages of random access decompression is that
the spatial indexing and database functionality of LAStools is now
also available for compressed content. simply create the LAX file(s)
for one or many LAZ file(s) and query away:
lasindex huge_lidar.laz
las2las -i huge_lidar.laz -inside_tile 640000 4200000 1000 -o tile.laz
lasgrid -i huge_lidar.laz -inside_tile 640000 4200000 1000 -o tile.asc
lasinfo -i huge_lidar.laz -inside_tile 640000 4200000 1000
lasindex *.laz
las2las -i *.laz -merged -inside_tile 410000 2300000 10000 -o tile.laz
lasgrid -i *.laz -merged -inside_tile 410000 2300000 10000 -o tile.asc
lasinfo -i *.laz -merged -inside_tile 410000 2300000 10000
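
the same area-of-interest query can also be issued through the LASlib
API. a hedged sketch (inside_tile() is declared in lasreader.hpp; that
the LAX index gets exploited here exactly as in the command line tools
is my assumption):

#include "lasreader.hpp"

int main()
{
  LASreadOpener lasreadopener;
  lasreadopener.set_file_name("huge_lidar.laz");
  LASreader* lasreader = lasreadopener.open();

  // restrict reading to the 1000 by 1000 tile with lower left corner
  // (640000, 4200000); with a "huge_lidar.lax" index alongside the
  // file, chunked LAZ lets the reader seek straight to the relevant
  // chunks instead of decompressing every point
  lasreader->inside_tile(640000, 4200000, 1000);

  while (lasreader->read_point())
  {
    // ... process lasreader->point ...
  }

  lasreader->close();
  delete lasreader;
  return 0;
}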
for LASlib - the LGPL-licensed read/write API used by LAStools - which
has the LASzip codebase built into its core, a fresh download of
http://www.cs.unc.edu/~isenburg/lastools/download/lastools.zip gets
you the LASzip 2.0 functionality for all LAStools and the updated
LASlib API with Makefile.
for libLAS - the BSD-licensed read/write API - which links to the
LASzip compression library, a fresh download from
http://laszip.org
will get you LASzip 2.0 functionality as soon as the libLAS 1.7
release is out.
cheers,
martin @lastools
PS: Special thanks to Howard Butler @liblas for suggesting early on to
integrate "chunking" into LASzip. The default "chunk size" is 50000
points, meaning that the random access granularity is in chunks of
50000 points. Hence, if you want the LASreader to seek to point number
10,049,999, the API will start decompression at point 10,000,000 but
skip the first 49,999 points in a user-transparent manner. In
contrast, with the old LASzip v1.0 the API always needs to start
decompression at point 0. The chunk size can be controlled via the
'-chunk_size 20000' command line option on write. Note that smaller
chunks give finer random access but lower the compression rate.
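
To make the seek arithmetic concrete, here is a tiny stand-alone C++
sketch. The names chunk_start and points_to_skip are mine for
illustration, not part of the LASzip API:

#include <cstdio>

int main()
{
  const long long chunk_size = 50000;  // LASzip 2.0 default chunk size
  const long long target = 10049999;   // point we want to seek to

  // decompression must begin at the start of the enclosing chunk ...
  long long chunk_start = (target / chunk_size) * chunk_size;  // 10,000,000
  // ... and the points before the target within that chunk get skipped
  long long points_to_skip = target % chunk_size;              // 49,999

  printf("start decompression at point %lld, skip %lld points\n",
         chunk_start, points_to_skip);
  return 0;
}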