Do not understand the height values in extra_bytes after lasheight


mauro...@gmail.com

Nov 26, 2020, 9:48:32 AM
to LAStools - efficient tools for LiDAR processing

Hello,

I'm trying to exclude some points in a point cloud based on the following workflow:

I have a point cloud with four point source IDs assigned, of which the last two cause problems.

My idea is:
1. classify ground points from this file using only the first two point source ids
2. calculate height of points in the original file using the ground points from 1.
3. store these heights as -extra_bytes (cm resolution)

If I run lasinfo on the file with the height information, I get a record "attribute 0" with values between -77.62 and 577.65 (in meters; this looks quite reasonable).

Now, if I open up the point cloud in CloudCompare and visualize the attribute "height above ground", I get -25000 for all points on the ground, more negative values for points below ground, and less negative values for points above ground. This is confusing, but I could handle it.

But it also happens that points below the ground show increasingly negative values (beyond -25000) up to some point, and then even lower points start to show positive numbers.

If I run a LAStools command on it, I assume I should specify heights in meters to exclude points. But with the above example, where points start to show positive numbers beyond a certain distance below the ground, this certainly doesn't lead to a usable result.

I'm not sure if I'm making a mistake when assigning the heights to the extra bytes with lasheight (the lasinfo results actually seem reasonable).

Thanks.

Mauro

Martin Isenburg

Nov 29, 2020, 7:50:02 AM
to LAStools - efficient command line tools for LIDAR processing
Hello Mauro,

I am not quite sure what the question is, and as you neither provide a lasinfo report nor include a link to a sample LAZ file, I am afraid I cannot help you with this. Maybe it's a case for the CloudCompare community to find out where this 25000 offset comes from?

Martin

--
Download LAStools at
http://lastools.org
http://rapidlasso.com
Be social with LAStools at
http://facebook.com/LAStools
http://twitter.com/LAStools
http://linkedin.com/groups/LAStools-4408378
Manage your settings at
http://groups.google.com/group/lastools/subscribe
---
You received this message because you are subscribed to the Google Groups "LAStools - efficient tools for LiDAR processing" group.
To unsubscribe from this group and stop receiving emails from it, send an email to lastools+u...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/lastools/76f56175-b946-4fb1-a1b0-6be0a40ba067n%40googlegroups.com.

mauro...@gmail.com

Nov 30, 2020, 6:38:12 AM
to LAStools - efficient tools for LiDAR processing
Hello Martin,

OK, another try:

The point cloud I'm working with is a photogrammetrically derived product from one of our workflows, so the point source ID records which model run the points originate from. As mentioned, we have problems with the points from two model runs, to which we assign point source IDs 21 and 22. My aim is to get some kind of "ground" estimate from points with point source IDs 11 and 12 (these will mostly not represent real ground points, especially in forested areas) and then exclude as many of the big outliers originating from point source IDs 21 and 22 as possible by some kind of height filtering.

I prepared a folder with the data of one of the tiles causing problems. It can be downloaded here:

My workflow is as follows:

# 1) make a ground estimate surface using only point source IDs 11 and 12 (omitting point source IDs 21 and 22)
lasground_new -i ...\LAS_164486_2018_10_LV95LN02.laz -o ...\LAS_164486_2018_10_LV95LN02_ps11_12_ground.laz -wilderness -bulge 5 -offset 10 ^
-keep_point_source_between 11 12

# 2) keep only ground points
las2las -i ...\LAS_164486_2018_10_LV95LN02_ps11_12_ground.laz -o ...\LAS_164486_2018_10_LV95LN02_ps11_12_onlyGround.laz -keep_class 2

# 3) calculate heights of points in the original file with the ground points from point source IDs 11 and 12, and store them as extra_bytes
lasheight -i ...\LAS_164486_2018_10_LV95LN02.laz -o ...\LAS_164486_2018_10_LV95LN02_lh.laz -ground_points ...\LAS_164486_2018_10_LV95LN02_ps11_12_onlyGround.laz ^
-store_as_extra_bytes

I have now found where the problem shows up and why the steps below do not lead to the expected results. If I normalize the point cloud with -replace_z, the result looks very reasonable (but I actually cannot use the normalized version for further processing).

# 4) calculate height of points with ground points from point source id 11 and 12 and replace z
lasheight -i ...\LAS_164486_2018_10_LV95LN02.laz -o ...\LAS_164486_2018_10_LV95LN02_lh_replacez.laz ^
-ground_points ...\LAS_164486_2018_10_LV95LN02_ps11_12_onlyGround.laz -replace_z

If I compare the lasinfo outputs for "LAS_164486_2018_10_LV95LN02_lh.laz" and "LAS_164486_2018_10_LV95LN02_lh_replacez.laz", the problem becomes visible:
in "LAS_164486_2018_10_LV95LN02_lh.txt" the heights stored as extra_bytes range from -77.63 to 577.62, whereas in "LAS_164486_2018_10_LV95LN02_lh_replacez.txt" the heights of the points range from -620.39 to 123.60.

It might be that I did not understand how storing heights as extra_bytes works. I must be doing something completely wrong.

I would then do something like the following, but for the moment this doesn't lead to reasonable results because of the discrepancies mentioned above.
If I classify points from "LAS_164486_2018_10_LV95LN02_lh.laz" as class 7, either those with heights above 0 or those below 0, I run into the problem that points which are actually far below 0 seem to have positive heights:

# 5) classify points with heights (stored as extra bytes) above 0 as class 7
las2las -i ...\LAS_164486_2018_10_LV95LN02_lh.laz -o ...\LAS_164486_2018_10_LV95LN02_lh_above0.laz -classify_attribute_above_as 0 0 7

# 6) classify points with heights (stored as extra bytes) below 0 as class 7
las2las -i ...\LAS_164486_2018_10_LV95LN02_lh.laz -o ...\LAS_164486_2018_10_LV95LN02_lh_below0.laz -classify_attribute_below_as 0 0 7

mauro...@gmail.com

Mar 24, 2021, 6:23:19 AM
to LAStools - efficient tools for LiDAR processing
Hello,

I'm re-raising a question I asked four months ago but have not found an answer to yet. It is probably not a common issue, but maybe someone knows how to deal with it.

Problem:
1. I run image matching for 500 m^2 tiles from two different image strips with two different image matching algorithms
2. I assign this information to the point source ID as follows: first strip: 11 and 12; second strip: 21 and 22
3. In some cases the second image strip leads to severe outliers, which cannot be filtered by default (see image)

Possible solution where I'm stuck:
1. Calculate a rough ground surface only from points of the first strip (point source IDs 11 and 12) --> "laz_first_stripe_ground"
2. Store only the ground points to a separate file --> "laz_first_stripe_groundpoints"
3. Calculate the height above ground of every point in the file with all point source IDs, using the ground points from "laz_first_stripe_groundpoints", and store it as extra bytes:
lasheight -i "laz_file_all_source_id" -o "laz_file_all_source_id_h_eb" -store_as_extra_bytes -ground_points "laz_first_stripe_groundpoints"
4. I then want to delete points by their height above ground (but this is not easily possible because of the problem explained below)

The problem I'm now facing is that the height above ground stored in attribute 0 seems to have an overflow problem (two-byte short).
If I run lasinfo on "laz_file_all_source_id_h_eb" I get heights above ground ranging from -77 to 577, and if I visualize the dataset in CloudCompare I get values of -25000 for points very close to the ground. For points below the ground, the values first decrease down to -32768 (as expected for a two-byte short) and then, for even lower points, jump to +32767 and decrease again.

So I think the problem I'm facing is that the ground (= 0 in lasinfo) holds the value -25000, and therefore below -77 (= -32768) the overflow occurs: values wrap around to +32767 and then decrease again.
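The -25000 reading and the overflow point are consistent with a centimeter scale plus a fixed 250 m offset (as confirmed in the reply below). A minimal sketch of that encoding in Python, with illustrative helper names (this is not LAStools code):

```python
# Illustrative sketch of storing height above ground as a signed 16-bit
# value: centimeter scale, fixed +250 m offset (assumed per this thread).
SCALE = 0.01    # centimeter resolution
OFFSET = 250.0  # fixed offset in meters

def raw_i16(height_m: float) -> int:
    """Raw stored value for a height in meters (no overflow handling)."""
    return round((height_m - OFFSET) / SCALE)

# A point exactly on the ground stores -25000 -- the value CloudCompare shows.
print(raw_i16(0.0))                       # -25000
# Representable height range of a signed 16-bit integer under this scheme:
print(round(-32768 * SCALE + OFFSET, 2))  # -77.68
print(round(32767 * SCALE + OFFSET, 2))   # 577.67
```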

Thanks for any help,

Mauro
tile1.PNG

Martin Isenburg

Mar 24, 2021, 7:37:01 AM
to LAStools - efficient command line tools for LIDAR processing
Hello,

the easiest solution would be to simply use

-store_precise_as_extra_bytes

instead of

-store_as_extra_bytes

which will store the height as a signed 32-bit integer at millimeter resolution. A signed 32-bit integer ranges from -2147483648 to 2147483647, and because these values are scaled by 0.001 to go from millimeters to meters, that gives you a height range from -2,147,483.648 meters to 2,147,483.647 meters. As long as your noise in z is less than plus or minus 2 million meters, that should work. But you will use two more bytes per point in storage.
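Assuming the precise variant stores plain millimeters with no offset (which the symmetric range above suggests), the arithmetic checks out like this (illustrative Python, not LAStools code):

```python
# Illustrative check of the signed 32-bit millimeter encoding (assumed: no offset).
I32_MIN, I32_MAX = -2**31, 2**31 - 1  # -2147483648 .. 2147483647
SCALE_MM = 0.001                      # millimeter resolution

def raw_i32(height_m: float) -> int:
    """Raw stored millimeter value for a height in meters."""
    return round(height_m / SCALE_MM)

# Height range in meters representable by this scheme:
print(round(I32_MIN * SCALE_MM, 3))   # -2147483.648
print(round(I32_MAX * SCALE_MM, 3))   # 2147483.647
# The heights from this thread fit comfortably:
print(raw_i32(577.65))                # 577650
```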

For '-store_as_extra_bytes' the height is stored as a signed 16-bit integer at centimeter resolution. A signed 16-bit integer ranges from -32768 to 32767, and because these values are scaled by 0.01 to go from centimeters to meters, that would give us a height range from -327.68 meters to 327.67 meters. Because in general the height above ground is more important than the height below ground, the more compact 16-bit representation adds a fixed offset of 250 meters to the height. This means we can express heights from -77.68 meters to 577.67 meters with two bytes at centimeter resolution.

I assume things go wrong when your heights go below -77.68 meters, at which point a wrap-around happens. I guess that really should be fixed. One obvious way of fixing this would be to clamp any height below -77.68 to -77.68 and any height above 577.67 to 577.67. Would that be reasonable? I am surprised this has not come up before.
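The wrap-around can be reproduced in a few lines of illustrative Python (assuming the centimeter scale and 250 m offset described above; the helper names are mine):

```python
SCALE = 0.01    # centimeter resolution
OFFSET = 250.0  # fixed offset in meters

def wrap_i16(raw: int) -> int:
    """Emulate two's-complement truncation to a signed 16-bit integer."""
    return (raw + 32768) % 65536 - 32768

def round_trip(height_m: float) -> float:
    """Encode a height, truncate to int16 (no clamping), decode again."""
    raw = wrap_i16(round((height_m - OFFSET) / SCALE))
    return raw * SCALE + OFFSET

print(round(round_trip(-50.0), 2))    # -50.0   (inside the valid range)
print(round(round_trip(-100.0), 2))   # 555.36  (wrapped: reads as far above ground)
```

This is exactly the pattern reported above: once a point drops below the representable minimum, its stored height suddenly reads as a large positive value.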

Regards,

Martin

Terje Mathisen

Mar 24, 2021, 8:12:00 AM
to LAStools - efficient tools for LiDAR processing
I'll make a strong suggestion for saturating logic here: it is far better to get plateaus above/below the supported range than to get any form of wrap-around.

On x86 with SIMD instructions you can even do this for free, but I suspect the need for clamping will be very rare, which means that naive code with a test/branch will be perfectly predicted and perform very well indeed.
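A saturating encoder along these lines is a small change (a sketch assuming the centimeter scale and 250 m offset described earlier in the thread; not LAStools code):

```python
SCALE = 0.01    # centimeter resolution
OFFSET = 250.0  # fixed offset in meters
I16_MIN, I16_MAX = -32768, 32767

def encode_saturating(height_m: float) -> int:
    """Clamp the raw value to the representable range instead of wrapping."""
    raw = round((height_m - OFFSET) / SCALE)
    return max(I16_MIN, min(I16_MAX, raw))

# Out-of-range heights plateau at the limits instead of wrapping around:
print(encode_saturating(-100.0))   # -32768
print(encode_saturating(1000.0))   # 32767
print(encode_saturating(0.0))      # -25000 (ground, unchanged)
```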

(I was a volunteer member, between 2016 and 2019, of the working group that made the latest update to the IEEE 754 FP standard.)

Terje



-- 
- <Terje.M...@tmsw.no>
"almost all programming can be viewed as an exercise in caching"

mauro...@gmail.com

Mar 24, 2021, 10:05:11 AM
to LAStools - efficient tools for LiDAR processing
Hello Martin,

thanks a lot, yes this seems to work.
One strange thing, though, is that lasinfo now shows attribute 0 ranging from 0 to 114 (which should be negative). But if I actually run
las2las -i "myLazWith_precise_extra_bytes" -o "myLazWith_precise_extra_bytes_onlyBelowMinus60" -keep_attribute_below 0 -60
only points lower than minus 60 are kept (so this part works as expected).

Without understanding too much about the value ranges, your suggestion of clamping below -77.68 and above 577.67 would also work for my case.

Thanks again,

Mauro

Martin Isenburg

Mar 24, 2021, 10:15:22 AM
to LAStools - efficient command line tools for LIDAR processing
Hi,

That sounds like a bug either in the output of lasinfo or in the way that lasheight stores the "more precise" height as extra bytes. Could you please send me the lasheight command line that you are using and the full lasinfo report on the resulting output LAZ file that exhibits this faulty behaviour?


Regards,

Martin



Martin Isenburg

Mar 25, 2021, 6:14:59 AM
to LAStools - efficient command line tools for LIDAR processing
Hello,

It turns out there was a bug in lasinfo for I32 attributes. It should look okay with this version:

https://www.dropbox.com/s/6a2mi7zi2rbbs1p/lasinfo.exe

Regards,

Martin
 

mauro...@gmail.com

Mar 25, 2021, 9:18:54 AM
to LAStools - efficient tools for LiDAR processing
Hello Martin,

yes that fixed it.

Thanks,

Mauro