Donuts are not supported for clipping

Florian Gandor

unread,
Jul 20, 2016, 2:05:18 PM7/20/16
to LAStools - efficient tools for LiDAR processing
Hi,

I use LAStools as much as possible for basic operations like clipping, density computation, overlap control, merging, point filtering, etc., because LAStools is very efficient at these operations and (given the appropriate license) parallelizing the tasks is easy.
A few weeks ago I developed a workflow to merge dense 3D points from image matching, with blending in the transition areas. For this I needed a way to clip point clouds while keeping 100 % of their points, and to clip the transition areas (polygons with a donut hole in the middle) where the blending algorithms were applied.
The problem was that LAStools did not handle polygons with donut holes for this kind of clipping. Will this functionality be added to lasclip?

My workaround was the DonutBridgeBuilder from FME plus a very small negative buffer to prevent LAStools from complaining about near-identical coordinates. It was no big deal for me, as my polygons are prepared with FME anyway, but I suppose adding support for donuts would be a great help for the lasclip tool and for LAStools users.
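For readers without FME, a similar tiny negative buffer could presumably also be produced with GDAL's ogr2ogr and its SQLite dialect (just a sketch, assuming a GDAL build with SpatiaLite support; the layer and file names are placeholders, not my actual data):

ogr2ogr -dialect SQLite ^
        -sql "SELECT ST_Buffer(geometry, -0.01) AS geometry FROM transition_area" ^
        transition_area_buffered.shp transition_area.shp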

Keep on the good work,

Florian

Michael Perdue

unread,
Jul 21, 2016, 2:27:35 AM7/21/16
to LAStools - efficient command line tools for LIDAR processing
Hello,

I can't say whether the requested functionality will ever be included, but a little while ago I ran into an unrelated challenge, and the workflow I used to solve it may also be useful to others dealing with problematic vector files. In short, you can create a rasterized binary mask of your vector layer and use lascolor/las2las to isolate the points inside the region of interest. In the example below I created a series of nested rings (ComplexShape.shp) and a synthetic point cloud (SyntheticCloud.laz) with 1 m X/Y spacing. I rasterized the shape and then used that raster to isolate the points:

:: Rasterize your vector file in your favorite software. CLI rocks,
:: so I use GDAL, but GRASS, QGIS, Manifold or whatever...

gdal_rasterize -tr 0.1 0.1 ^
               -burn 1 ^
               -ot Byte ^
               -l ComplexShape ComplexShape.shp BinaryMask.tif

:: Write the value into the LAS file. In my example, if the point is
:: inside the shape and you used a command analogous to the one given
:: above, then a value of 1 will be written into the field for the
:: red band. If it is outside the shape, it will be given a value of 0.

lascolor -dont_scale_rgb_up ^
         -zero_rgb ^
         -image BinaryMask.tif ^
         -i SyntheticCloud.laz ^
         -o SyntheticCloud_colorized.laz

:: You can use -point_type to drop the RGB fields once the file
:: has been filtered.

las2las -keep_RGB_red 1 1 ^
        -point_type 1 ^
        -i SyntheticCloud_colorized.laz ^
        -o SyntheticCloud_insideROI.laz

:: And the resulting point cloud...

lasview -i SyntheticCloud_insideROI.laz




The advantage of this methodology is that the complexity of your vector layer is largely irrelevant: holes, islands, holes inside of islands, a ridiculous number of vertices... it doesn't matter. The method is also comparatively fast when using vectors with an unreasonable number of vertices (we occasionally receive massive, complex AOIs with millions of vertices from our forestry clients that have to be simplified).
The disadvantage is that the selection process is "fuzzy": the maximum error in deciding whether a point is inside or outside the ROI is the diagonal of a raster cell (~14.1 cm in the example). You can increase the resolution to improve the accuracy, but the raster mask becomes unruly for large areas at high resolutions. And when the raster resolution is much coarser than the point spacing, you end up with ugly blockiness that is particularly evident along diagonal edges. So the user has to make some compromises on what they can and can't live with when following this workflow.
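For instance, halving the cell size in the rasterization step roughly halves that worst-case error to about 7.1 cm, at the price of a mask with four times as many cells (a sketch reusing the command from above, with only the resolution and output name changed):

gdal_rasterize -tr 0.05 0.05 ^
               -burn 1 ^
               -ot Byte ^
               -l ComplexShape ComplexShape.shp BinaryMask_5cm.tif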

Cheers,

Mike



Floris Groesz

unread,
Jul 22, 2016, 9:00:22 AM7/22/16
to LAStools - efficient tools for LiDAR processing
That is an interesting approach, Mike.
eCognition follows this approach (more or less), and we use it when rasterization is "allowed". However, sometimes the clipping must be done 100 % correctly, and then a vector method is necessary. The reason is not always accuracy but also logical consistency.
Practically, it might be easier to improve the handling of island/donut shapefiles (I guess, but I am not the one to confirm this). Problems like too many vertices and topological errors are quite easy to fix in general GIS software, and I think those problems should be fixed as much as possible at the source.
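As a concrete example of fixing such problems at the source, newer GDAL releases can presumably repair many invalid geometries in a single pass with ogr2ogr's -makevalid option (only a sketch, assuming GDAL 3.0+ built against a recent GEOS; the file names are placeholders):

ogr2ogr -makevalid fixed_polygons.shp broken_polygons.shp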

Floris

Martin Isenburg

unread,
Jul 28, 2016, 8:56:50 AM7/28/16
to LAStools - efficient command line tools for LIDAR processing
Hello,

in the next release (this weekend) there will be support for '-donuts' in lasclip. 


D:\LAStools\bin>lasclip -v -i in.laz ^
                        -poly donut.shp ^
                        -donuts ^
                        -o donut.laz
creating clip polygon took 0 sec.
removed points 3454005 (3049758). outputting 806352 points took 3.899 sec.
done with 'donut.laz'. total time 3.899 sec.

D:\LAStools\bin>lasclip -v -i in.laz ^
                        -poly donut.shp ^
                        -classify_as 8 -interior -donuts ^
                        -o donut_classified.laz
creating clip polygon took 0 sec.
outputting 4260357 points (806352 classified) took 5.512 sec.
done with 'donut_classified.laz'. total time 5.512 sec.

No pressure. No donuts. (-;

Regards,

Martin @rapidlasso

--
lasclip_donut_shp.jpg
lasclip_donut.jpg
lasclip_donut_classified_interior.jpg

monst...@web.de

unread,
Aug 23, 2016, 6:50:45 AM8/23/16
to LAStools - efficient tools for LiDAR processing
Hello,
I just tested the "donuts" option on my data and it did improve the output, which is why I'm posting in this topic.
But I do have many questions relating to DSM computation and my workflow.

I have one dataset of "first pulse" (fp, --> lasgrid first pulse_komp) measurements and another one of "last pulse ground" (lpg) measurements, plus a synthetic dataset containing only the waterbodies (--> viewer waterbodies). I received the data like this. The aim is to compute a DTM, a DSM and a normalized DSM.
In the lpg data, all buildings and waterbodies are gaps with no data, which is no problem as these gaps are covered by the triangulation and then rasterized.

In the fp data there are some holes of missing data where the waterbodies are, but these data gaps are very irregular (--> again lasgrid first pulse_komp).
For the DSM I see one problem: if I compute the DSM the same way as the DTM, I think the triangulation will use z-values from the tree tops around the waterbodies (there are many trees jutting out over the water surface) and thus lift the water surface by several meters, or compute weird slopes where there is only flat water.
To avoid this I tried:

1) lasboundary to create a shapefile outlining the areas of missing data in the fp data.
Actually, I don't know exactly how I managed to produce the file; yesterday I went crazy over the point limitation issue and the added bounding boxes, and today I found the file I thought I had failed to produce.
What I do know is that I chose a very small value for the concavity (~2) and the "holes" option,
and that I tiled the fp data into 100 square tiles to avoid the point limitation. I just don't know how I put it back together.
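For what it's worth, step 1 probably boils down to something like the following (just a sketch of what I think I did, with placeholder file names; '-merged' should treat all the tiles as one input):

lasboundary -i fp_tiles\*.laz -merged ^
            -concavity 2 -holes ^
            -o fp_gap_clip.shp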

2) lasclip to clip exactly those areas from the waterbody data that are missing in the fp data (--> lasclip interface for overview):

lasclip -i waterbodies.laz ^
        -poly fp_gap_clip.shp ^
        -donuts -interior -odir %OUTPUT% ^
        -o clipped waterbodies_donut.laz

--> clipped waterbodies_donut.png

I also get a lot of these warnings: "polygon x has duplicate point at count y (0)".
Can I ignore them? If not, is there a way to handle this problem?

3) Finally, I wanted to merge the output of the clipping with the fp data to get improved data coverage of the water surfaces. I have not done that yet, because the results are not satisfying.

The problem is that the output of the clipping, as I said, has improved with the "donuts" option but could still be much better. I think it will be a problem and will introduce errors in the DSM if the synthetic waterbody data overlaps with the first pulse data.
I also thought about just merging both and then removing duplicates, so that the fp data comes first and points from the second input are only kept where the fp data is missing a point. But as the waterbody data is synthetic, it has a different point spacing and its x/y values are not the same as in the fp data, so I suspect there won't be any duplicate points to filter.
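If the overlap turns out to be acceptable, step 3 itself would presumably be a plain merge (a sketch with placeholder file names):

lasmerge -i first_pulse.laz clipped_waterbodies_donut.laz ^
         -o first_pulse_filled.laz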

So I wondered
a) whether there is a completely different workflow to fill the data gaps, or
b) whether there is a way to refine my workflow, or
c) whether maybe I'm being much too picky, or
d) whether the overlap of the two datasets is not as problematic as I think (and why).

I'd appreciate any advice.
Greetings

Petra

P.S.: Just to let you know, I'm a university student, not a professional yet, and this is the first time I am working with LAStools and doing some programming. Maybe you can take that into account in your answer... and I'm sorry for the messy post, I'm pretty confused.







viewer waterbodies.PNG
lasclip interface for overview.PNG
lasgrid first pulse_komp.jpg
clipped waterbodies_donut.PNG

Karol Krajewski

unread,
Oct 18, 2016, 9:38:28 AM10/18/16
to LAStools - efficient tools for LiDAR processing
Martin,

I'm afraid there are still some problems with clipping to donut polygons. I've been trying to work with shapefiles containing donuts, and each time I try them (even with the -donuts switch) lasclip returns a number of errors and fails to process:

WARNING: polygon 2 has duplicate point at count 2301 (0)
WARNING: polygon 3 has duplicate point at count 3411 (0)
WARNING: polygon 3 has duplicate point at count 3413 (0)
ERROR: cannot find donut edge 289 to 290 in contrained TIN


This is the case for two sources of polygon creation (FME and Global Mapper); however, it seems to work fine when the same polygon is created in QGIS using the OGR dissolve method. The polygon I'm using is shown in the attached file.

I understand this may not be directly a LAStools problem, but I wonder why it wouldn't accept geometry from a different source, even though it is perfectly valid and passes all the geometry tests I could put it through.

Thanks,
K
Donuts.PNG

Martin Isenburg

unread,
Oct 23, 2016, 9:33:57 PM10/23/16
to LAStools - efficient command line tools for LIDAR processing
Hello Karol,

looks like you might have found a rare (?) bug. But to evaluate this on my end I would need the SHP and LAZ files together with the exact command line you used, so I can run it in the debugger (assuming I can replicate the error).

Regards from Sri Lanka,

Martin @rapidlasso

--

Karol Krajewski

unread,
Nov 26, 2016, 8:07:24 AM11/26/16
to LAStools - efficient tools for LiDAR processing
Hi Martin,

Thanks a lot for fixing this issue in the latest release (161114). We've been clipping to various complex shapes with hundreds of donuts, made in a variety of software, and have had no problems so far.

PS. Just one thing to note, as it's not in the README yet: the -quiet switch suppresses the duplicate vertex warnings from stdout and makes your screen a bit "cleaner".
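For example (placeholder file names, just to show where the switch goes):

lasclip -i tile.laz -poly aoi_donuts.shp -donuts -quiet -o tile_clipped.laz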

Thx,
Karol

Ruben Wauters

unread,
Mar 23, 2023, 5:45:06 AM3/23/23
to LAStools - efficient tools for LiDAR processing
Hi,

I have a similar error. I am using the latest version, so that should not be the problem.
How can I fix this?

Data folder: question donut

Command: lasclip -i "...\*.las" -poly "...\GRB50.shp" -odir "..." -interior -quiet -donut -odix "_clip" -olas
(I also tried it with '-donuts', because apparently both are possible.)

The folder contains the following files:
  • 174000_174000_tile_clip.las: the LAS file that needs to be clipped by a polygon
  • GRB50.shp: the shapefile used to clip the LAS file (this shapefile contains all buildings in the city with a 50 m buffer around them)
  • Problem.png: a picture of the problem. Some buildings in the shapefile (pink) are like donuts, but not all. When clipping without the '-donut' option, the inner information is lost (marked in red). When clipping with the '-donut' option, there is an error.
    • In the picture, the yellow is another shapefile of the region of interest, but I want to delete the buildings from the LiDAR data because they cause problems in further analysis (tree segmentation). The green polygons are segmented trees.
  • Error.png: the error that I get
Thanks for helping me out!
Ruben

On Saturday, 26 November 2016 at 14:07:24 UTC+1, Karol Krajewski wrote:

Jochen Rapidlasso

unread,
Mar 25, 2023, 6:55:38 AM3/25/23
to LAStools - efficient tools for LiDAR processing
Hi Ruben,
the SHP file contains just ONE feature/polygon (with 1,141,972 vertices).
This is quite hard to handle.
You would have to create/save your SHP in such a way that every building is one polygon.
I am still testing/working on this issue and will give a more detailed answer later; this is just a first hint.

Best regards,

Jochen @rapidlasso

Jochen Rapidlasso

unread,
Mar 31, 2023, 6:20:06 AM3/31/23
to LAStools - efficient tools for LiDAR processing
Hi Ruben,
after doing some more research on your huge SHP:

From what you told me you want to do, the arguments "-interior -donut" are the right ones.
You will need the donut argument, because many of your polygons have an inner area which you do not want to cut away.
So I focused on this and tried to make it work.

First: I was wrong in my previous post.
We can handle SHP files with many features as well as ones with just one MEGA shape.

Your shapefile is not so nice in other respects:
it produces a lot of warnings about repeated points and duplicate points.
The duplicate points were correctly ignored on the outer boundary,
but not within the inner donut polygons.

There are different solutions.

Maybe the best one is to fix the SHP.
I used QGIS to remove the doubles and very nearby vertices:
> Vector geometry > Simplify > tolerance 0.01 meters > save as "GRB50simple.shp"

lasclip64 -i 174000_174000_tile_clip.las -poly GRB50simple.shp -o outx.laz -interior -donut

gives a reasonable result and also works with previous lasclip versions.
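As an aside, the same kind of cleanup could presumably also be scripted with GDAL's ogr2ogr instead of the QGIS GUI (a sketch using the same 0.01 m tolerance; not what I actually ran):

ogr2ogr -simplify 0.01 GRB50simple.shp GRB50.shp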

Going further, we needed to change lasclip itself.
So with version 230330 (out yesterday) you get some improvements:

First, we give more detailed information about which points are repeated or duplicate.
The stderr output now gives the exact x/y position of the duplicates.
So if you want to fix the points, you have a chance to do so.

WARNING: polygon 982 has 2 repeated point(s) at count 46642 (172321.03/171805.19). ignoring ...
WARNING: polygon 1171 has 1 repeated point(s) at count 56724 (173470.04/171885.87). ignoring ...

The coordinates behind the count are the x/y values of the SHP point.

If there was a repeated or duplicate point within a donut shape, previous versions of the program gave an ERROR.
We changed the program to handle this situation and just give a warning:

Versions prior to 230330:
ERROR: cannot find donut edge 134862 to 134863 in contrained TIN
Version 230330 and above:
WARNING: cannot find donut edge 134862(176381.92/172850.76) to 134863(176381.90/172850.69) in contrained TIN

The coordinates of the edge are also given, so you have a chance to fix the SHP.
If you do not fix it, the program runs through and presents a result anyway.
In general: you should always take warnings very seriously.
A result produced with warnings beforehand may not be trustworthy.

For the data you presented the result looks good:

lasclip_result_sample.jpg

Best regards,

Jochen @rapidlasso

Ruben Wauters

unread,
Apr 11, 2023, 7:45:53 AM4/11/23
to LAStools - efficient tools for LiDAR processing
Dear Jochen,

Thanks a lot! This helps me a lot, because I could not find a way around this issue.

I downloaded the new LAStools version, and indeed the ERRORs changed to WARNINGs.
Now I can continue my workflow.

I was wondering: is the result that you generated (the picture) from fixing the SHP and then running the command, or from just running the command and ignoring the WARNINGs?

Thank you.
Ruben

On Friday, 31 March 2023 at 12:20:06 UTC+2, Jochen Rapidlasso wrote:

Jochen Rapidlasso

unread,
Apr 11, 2023, 7:50:31 AM4/11/23
to LAStools - efficient tools for LiDAR processing
Dear Ruben,
thanks for the positive feedback! I think the picture shows the result when I ignored the warnings and imported it into QGIS, but only on a subsample of the data.
Best regards,

Jochen @rapidlasso