Classification of ALS and image-based Digital Surface Model


Philipp

Apr 23, 2024, 6:50:10 AM
to LAStools - efficient tools for LiDAR processing

Hello,

I am trying to classify various point clouds (ALS and image-based Digital Surface Model) as part of my thesis, extract buildings as .shp files, and then compare them. The classification of the ALS point cloud and subsequent extraction of buildings works perfectly. However, I encounter problems with lasground when dealing with the image-based Digital Surface Model. Trees are also being identified as ground points, and the actual ground after lasground has many gaps or holes. Additionally, I am not getting satisfactory results with lasclassify; the "Building" class contains vegetation, and vice versa, with buildings being classified as vegetation. Are there any tricks or tips to consider for the classification of non-ALS-based point clouds to achieve a reasonably clean classification? I could also provide the image-based Digital Surface Model that I'm trying to classify, if it helps.

Thank you in advance!

Philipp

Jochen Rapidlasso

Apr 23, 2024, 6:51:32 AM
to LAStools - efficient tools for LiDAR processing
Hi Philipp,
yes, please extract a SMALL sample and append it here or send it as a private mail and we will have a look.
Also, please provide the detailed workflow you have used so far.

Thanks,

Jochen @rapidlasso

Philipp

Apr 23, 2024, 7:24:01 AM
to LAStools - efficient tools for LiDAR processing
Hi Jochen,

unfortunately, the file is too large. However, it would be good to see the complete file with all facets, which is why I have provided the file via a link (https://www.dropbox.com/scl/fi/0v83q0vf9vy1wt9mfmved/Neubaugebiet_clip.laz?rlkey=tlztjwa3q8guhb1g80smbq6c5&st=pp7n02j4&dl=0). I hope that's okay. I don't have a proper workflow yet, as I am already quite unhappy with the ground classification, and my understanding of LAStools isn't (yet) deep enough to know exactly which command to use for each problem. So far, I've been trying out various commands in hopes of getting a reasonably satisfactory result. Additionally, it seems that every command up to lasclassify produces the same classification result each time.

Jochen Rapidlasso

Apr 24, 2024, 7:37:06 AM
to LAStools - efficient tools for LiDAR processing
Hi Philipp,
thanks for the data. This is what it looks like:
[image: lidar1.jpg]
To detect the ground in a LiDAR dataset we need ground points.
To check whether we have ground points, we made a cut along the green band using laslook.
Now we check the data in this band. We expect a flat area on the right side and, to the left of it, the forest with high vegetation but also some ground points in between.

[image: lidar2.jpg]
But this is what we get (right side, where the forest begins): the right side has ground points, but the forest does not.
So it is impossible to calculate the ground without any information about it.

Your dataset is probably derived from orthophotos and is not a LiDAR dataset.
A LiDAR sensor would/should be able to hit some points on the ground.

Cheers,

Jochen @rapidlasso

Philipp

Apr 24, 2024, 8:14:04 AM
to LAStools - efficient tools for LiDAR processing
Hi Jochen,

thanks for your efforts. I know, this dataset is a photogrammetrically generated point cloud derived from orthophotos, so there aren't any ground points under the vegetation in the middle of the point cloud. The other point cloud from the same area that I'm working with is LiDAR, and it works great with that. However, with the dataset I've sent, it's not working as well. I'm not really interested in the ground points; I'm interested in the building footprints that are generated with lasboundary in the end, because my goal is to compare the building footprints of the same area, one set from LiDAR and one from the point cloud I've sent. In this process, parts of the vegetation are also classified as buildings. Moreover, right from the start with lasground, not all ground points are recognized as such, and sometimes vegetation in the middle of the point cloud is also classified as ground. What would be the/your approach to achieve a clean classification of such a point cloud that is not based on LiDAR in order to get building footprints?

Thanks in advance,

Philipp

Philipp

Apr 24, 2024, 8:14:18 AM
to LAStools - efficient tools for LiDAR processing
This is the workflow that I used:

2) las2las:

las2las -i Neubaugebiet_clip.laz -set_classification 0 -o Neubaugebiet_Unclassified.laz

3) lasground:

lasground -i Neubaugebiet_Unclassified.laz -step 35 -coarse -bulge 2.0 -o Neubaugebiet_Ground.laz

4) lasheight:

lasheight -i Neubaugebiet_Ground.laz -o Neubaugebiet_nDOM.laz

5) lasclassify:

lasclassify -i Neubaugebiet_nDOM.laz -planar 0.15 -rugged 0.4 -o Neubaugebiet_Classified.laz

6) lasboundary:

lasboundary -i Neubaugebiet_Classified.laz -disjoint -concavity 1.5 -keep_class 6 -o Neubaugebiet_Building.shp

As I said, I'm not a pro with LAStools, so I don't know whether some of these commands are unnecessary.
And these are the results I get as a shapefile. As you can see, small parts of the vegetation in the middle of the area also get classified as building.
Neubaugebiet_Building.shx
Neubaugebiet_Building.shp
Neubaugebiet_Building.dbf
Neubaugebiet_Building.prj

Jochen Rapidlasso

Apr 30, 2024, 9:17:30 AM
to LAStools - efficient tools for LiDAR processing
Hi Philipp,
what you've done so far doesn't look bad at all.
As usual, there is not just one path to the target; there is certainly a lot to try out to see which way is best.
Working with your data, we have some general hints which may help you improve your process.
First, your data are too large to run many tests. So we cut out a small area - with some forest and some buildings - to do the tests there.
Finally, we transfer these tests to the whole dataset.
We focus on the building area, because you are interested in the building footprints.
Also, your dataset contains an error in the GeoTIFF entry, so we correct this at the beginning to avoid an error message on every run.
This command gives us a small and clean test file, 1.3 million points instead of 31 million.
 
las2las64 -i Neubaugebiet_clip.laz -o tmp_crop.laz -keep_xy 366820 5716621 366999 5716839 -set_ogc_wkt 1026 -remove_vlrs_from_to 0 2

Having a horizontal look at the data, we see that there are some noise points above and below the real data.
Denoising helps to improve the ground detection.

lasnoise64 -i tmp_crop.laz -o tmp_denoise.laz -remove_noise -step 0.3 -isolated 5

Next we use the new ground algorithm to get ground points where we are outside the forest area.

lasground_new64 -i tmp_denoise.laz -o tmp_gnd.laz -extra_fine -offset 0.5

It is very important to check every step of the process individually.
It only makes sense to go to the next process step if the previous step has reasonable results.
Otherwise we have to improve the current step instead of carrying the problem over to the next steps.
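
As an illustration (not part of the original workflow; file names taken from the commands above), one quick way to eyeball an intermediate result is to view it and to rasterize only the ground class; gaps in the ground then show up immediately as holes in the image:

:: hedged sketch: view the classified file, then grid the ground class (class 2) to a quick PNG
lasview64 -i tmp_gnd.laz
lasgrid64 -i tmp_gnd.laz -keep_class 2 -step 1 -elevation -false -o tmp_gnd_check.png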

Calculating the height is a challenging part, because we do not have ground points in the forest area.
We try some adjustments to the default height calculation to improve your results:
- we classify all points above 18 meters as trees, because none of the buildings are taller than this value.
- we drop some green and pale colors to fragment the trees and prevent them from being detected as buildings.

lasheight64 -i tmp_gnd.laz -o tmp_h.laz -replace_z -classify_above 18 5 ^
  -drop_HSV 222d 277d 0.2 0.33 0.18 0.33 ^
  -drop_HSV 0d 359d 0.05 0.3 0.05 0.5 ^
  -drop_HSV 280d 333d 0.0 0.2 0.1 0.3

Now we do the classification, ignoring our already classified high trees.

lasclassify64 -i tmp_h.laz -o tmp_c06.laz -step 0.6 -height_in_z -ignore_class 5

The most relevant factor is the step width.
To test the result, we did some iterations over it (see the loop sketch after the images).

[image: c05.png — step 0.5]
[image: c06.png — step 0.6]
[image: c07.png — step 0.7]
[image: c08.png — step 0.8]
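
A minimal sketch of such an iteration (not from the original post; file names assumed from above, meant to be run from a .bat file):

:: hedged sketch: run lasclassify with several step widths and compare the results
for %%s in (0.5 0.6 0.7 0.8) do (
  lasclassify64 -i tmp_h.laz -o tmp_c%%s.laz -step %%s -height_in_z -ignore_class 5
)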
Optionally, we test further arguments of lasclassify like -wide_gutters -rugged -planar.
We check which result best fits our needs.
That result we take to the next step to determine the building outlines.
The most relevant argument is the concavity, so we again do some iterations (a sketch follows below).
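
For illustration only (the flags are taken from the lasboundary call earlier in the thread; the file names and concavity values are assumptions):

:: hedged sketch: iterate lasboundary over a few concavity values and compare the shapefiles
lasboundary64 -i tmp_c06.laz -keep_class 6 -disjoint -concavity 1.5 -o tmp_bldg_c15.shp
lasboundary64 -i tmp_c06.laz -keep_class 6 -disjoint -concavity 2.5 -o tmp_bldg_c25.shp
lasboundary64 -i tmp_c06.laz -keep_class 6 -disjoint -concavity 4.0 -o tmp_bldg_c40.shp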

There are certainly many ways to improve the workflow, which is not yet perfect.
The tree areas contain many surface-like elements, and the buildings contain irregularities and shadow areas.

Our team is currently working on a tool that can also classify point clouds derived from orthophotos using a completely new algorithm. News on this (don’t expect anything before fall) will be published on our blog https://rapidlasso.de/news/ and here in the group.

Best regards,

Jochen @rapidlasso

Philipp

May 6, 2024, 5:27:55 AM
to LAStools - efficient tools for LiDAR processing
Hey Jochen,

thank you very much for your effort, you definitely helped me a lot.

However, I still have a few questions, especially regarding the error messages from the point cloud. For example, after running lasground, I get these error messages: "GeographicTypeGeoKey: look-up for 4258 not implemented" and "GeogGeodeticDatumGeoKey: look-up for 6258 not implemented". I suppose this is what you mean. And you corrected this with the option "-set_ogc_wkt 1026"? What exactly does this option do, and what does the error mean? And what does the option "-remove_vlrs_from_to 0 2" do? Also, I couldn't find anything about "drop_HSV" in any README; could you also explain what that is about?

Regarding the point cloud, I also tried something else. I "colored" the point cloud again using the TIFF so that I have NIR in addition to RGB, and with -keep_NDVI, I managed to remove a large part of the vegetation.

Best regards,

Philipp

Jochen Rapidlasso

May 7, 2024, 5:10:11 AM
to LAStools - efficient tools for LiDAR processing
Hi Philipp,
that was a good idea to use the NIR values if you have them available.
About the georeferencing: this is a subject of its own; we recommend reading some basics about it and looking at the other posts about CRS, ...
In your case you had this CRS/projection information in the VLR part of your data file:

variable length header record 1 of 3:
  reserved             43707
  user ID              'LASF_Projection'
  record ID            34735 ----------------- GeoKeyDirectoryTag Record
  length after header  112
  description          'Georeferencing Information'
    GeoKeyDirectoryTag version 1.1.0 number of keys 13
      key 1024 tiff_tag_location 0 count 1 value_offset 1 - GTModelTypeGeoKey: ModelTypeProjected
      key 1026 tiff_tag_location 34737 count 34 value_offset 0 - GTCitationGeoKey: PCS Name = ETRS_1989_UTM_Zone_32N
      key 2048 tiff_tag_location 0 count 1 value_offset 4258 - GeographicTypeGeoKey: look-up for 4258 not implemented
      key 2054 tiff_tag_location 0 count 1 value_offset 9102 - GeogAngularUnitsGeoKey: Angular_Degree
      key 2056 tiff_tag_location 0 count 1 value_offset 7019 - GeogEllipsoidGeoKey: Ellipse_GRS_1980
      key 2057 tiff_tag_location 34736 count 1 value_offset 0 - GeogSemiMajorAxisGeoKey: 6378137
      key 2058 tiff_tag_location 34736 count 1 value_offset 1 - GeogSemiMinorAxisGeoKey: 6356752.314
      key 2059 tiff_tag_location 34736 count 1 value_offset 2 - GeogInvFlatteningGeoKey: 298.2572221
      key 2062 tiff_tag_location 34736 count 7 value_offset 3 - GeogTOWGS84GeoKey: TOWGS84[0,0,0,0,0,0,0]
      key 3072 tiff_tag_location 0 count 1 value_offset 25832 - ProjectedCSTypeGeoKey: ETRS89 / UTM 32N
      key 3073 tiff_tag_location 34737 count 23 value_offset 34 - PCSCitationGeoKey: ETRS_1989_UTM_Zone_32N
      key 3076 tiff_tag_location 0 count 1 value_offset 9001 - ProjLinearUnitsGeoKey: Linear_Meter
      key 4099 tiff_tag_location 0 count 1 value_offset 9001 - VerticalUnitsGeoKey: Linear_Meter
variable length header record 2 of 3:
  reserved             43707
  user ID              'LASF_Projection'
  record ID            34736  ----------------- GeoDoubleParamsTag Record (Optional)
  length after header  80
  description          'Double Param Array'
    GeoDoubleParamsTag (number of doubles 10)
      6.37814e+006 6.35675e+006 298.257 0 0 0 0 0 0 0
variable length header record 3 of 3:
  reserved             43707
  user ID              'LASF_Projection'
  record ID            34737 ------------------- GeoAsciiParamsTag Record (Optional)
  length after header  57
  description          'GeoAsciiParamsTag'
    GeoAsciiParamsTag (number of characters 57)
      PCS Name = ETRS_1989_UTM_Zone_32N|ETRS_1989_UTM_Zone_32N|

At least one of them,
      key 2048 tiff_tag_location 0 count 1 value_offset 4258 - GeographicTypeGeoKey: look-up for 4258 not implemented
doesn't look valid, so we decided to just use key 1026
      key 1026 tiff_tag_location 34737 count 34 value_offset 0 - GTCitationGeoKey: PCS Name = ETRS_1989_UTM_Zone_32N
as WKT and drop the existing projection VLRs.
This is what
... -set_ogc_wkt 1026 -remove_vlrs_from_to 0 2
is supposed to do.
But the CRS fix was not really the subject here; we just wanted to get rid of an error message in the output.
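
If you want to check the fix yourself, a hedged example (file names taken from earlier in the thread) is to dump the header before and after with lasinfo and compare the VLR section:

:: hedged example: print the header and VLRs without scanning all points
lasinfo64 -i Neubaugebiet_clip.laz -no_check
lasinfo64 -i tmp_crop.laz -no_check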

About color filtering please see

Best,

Jochen @rapidlasso
