Point cloud resampling - multiple thinned point clouds from one


MelFed

Jul 11, 2016, 3:14:50 AM
to LAStools - efficient tools for LiDAR processing
Hi everyone,

I am currently trying to write a workflow that takes me from a large area, high density point cloud (norm_warra_rotated.laz) to a series of grids (warragrids.zip) where each grid is used to make multiple instances of the point cloud thinned to various point densities.

The main issues I am having are:
1) I want to thin the point cloud to a point density relative to area (1 hit/m2), but need to resample based on pulse (meaning that I need to thin using the GPS time stamps). However, I can't seem to find a workaround that lets me select the point density based on area using the GPS time stamp information.
2) I need to make multiple instances of the thinned point cloud. For example, from an original lidar file at 40 hits/m2 I want to make 40 independent point clouds, each with 1 hit/m2. I need to make as many independent point clouds as possible from the available data (which varies by site).

My current attempt to work through this is:
1) lasclip - clip the lidar point cloud using the grids (keeps the grid ID as the name of the file going forward). This allows me to clip the point clouds based on each grid cell size (already generated - see warragrids.zip).
2) lassplit - split the lidar point cloud in each grid cell into individual flight lines
3) lassort - sort the individual flight line files by GPS time
4) lasthin - stepsize = gridcell size, use the -random and -seed switches to ensure I am choosing random pulses for each iteration

Key notes:
- the resampling unit is pulses, not hits
- I need as many independent point clouds as possible at multiple point densities
- I need a way to thin the point cloud by specifying an intended density while filtering on GPS time
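For what it's worth, the pulse-based random thinning that step 4 aims at can be prototyped outside LAStools. The sketch below is illustrative Python on synthetic (x, y, z, gps_time) tuples, not LAS I/O, and the function name is hypothetical: it groups returns by GPS time stamp and keeps randomly chosen whole pulses until the target density is reached.

```python
import random
from collections import defaultdict

def thin_by_pulse(points, area_m2, target_density, seed=0):
    """Randomly keep whole pulses (grouped by GPS time stamp) until the
    kept returns reach roughly target_density hits per square meter.
    Each point is an (x, y, z, gps_time) tuple; synthetic, not LAS I/O."""
    pulses = defaultdict(list)
    for p in points:
        pulses[p[3]].append(p)          # group returns by GPS time stamp
    order = list(pulses.keys())
    random.Random(seed).shuffle(order)  # random pulse selection, reproducible
    budget = target_density * area_m2   # total returns we may keep
    kept = []
    for t in order:
        if len(kept) >= budget:
            break
        kept.extend(pulses[t])          # keep or drop each pulse as a whole
    return kept
```

Different seeds select different pulse subsets, which is one way to get multiple independent thinned clouds from the same tile.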

Ideally, I would like to write a batch file that runs through each of these steps for each area (in this case, Warra; I have many others).

Any insight or suggestions would be greatly appreciated. I have attached the original lidar cloud and the multiple grids I am using.

Kind regards,
Mel

norm_warra_rotated.laz
warragrids.zip

Terje Mathisen

Jul 11, 2016, 8:47:24 AM
to last...@googlegroups.com
MelFed wrote:
> Hi everyone,
>
> I am currently trying to write a workflow that takes me from a large
> area, high density point cloud (norm_warra_rotated.laz) to a series of
> grids (warragrids.zip) where each grid is used to make multiple
> instances of the point cloud thinned to various point densities.
So effectively you are trying to determine the minimum pulse densities
needed to do various levels of classification/feature extraction?

>
> The main issues I am having are:
> 1) I want to thin the point cloud to a point density relative to area
> (1 hit/m2), but would need to resample based on pulse (meaning that I
> need to thin using the GPS time stamps). However, I cant seem to find
> a work around that lets me select the point density based on area
> using the GPS time stamp information.

This means that with a point cloud that has many returns per pulse
(I have a 40 points/sq meter collection myself with up to 7 returns
per pulse) you get a larger flight/time distance between each pulse
you keep, right?
> 2) I need to make multiple instances of the thinned point cloud. For
> example, I have an original lidar file that is 40 hits/m2, I want to
> make 40 independent point clouds each with 1hit/m2. I need to make as
> many independent point clouds as possible from the available data
> (which varies based on site).
This is in fact fairly easy to do:

1) Read in the LAZ tile and group the points by GPS time, counting both
the number of pulses and the number of returns.

2) Use total_returns / area_in_square_meters to calculate the number of
separate instances.

3) Distribute the groups from (1) across the instances from (2); this
gives a near-optimal distribution of points to instances while making
sure that all points from a given pulse end up in the same instance.

If the source tile is sorted in acquisition/GPS time order then you
should be able to just pick up the info for (2) from the LAZ header,
initialize all required instances, and then make a single pass over the
collected points, switching to the next instance each time you get a new
pulse.
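Step (2) above is simple arithmetic: with the 40 hits/m2 file Mel describes and a 1 hit/m2 target it yields 40 instances. A minimal sketch (function name hypothetical):

```python
def count_instances(total_returns, area_m2, target_density=1.0):
    """How many independent thinned clouds a tile can supply at
    target_density returns per square meter (hypothetical helper)."""
    return int(total_returns / (area_m2 * target_density))

# e.g. a 100 m x 100 m tile holding 400000 returns (40/m2)
# supports 40 instances at 1 hit/m2
```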

Terje


>
> My current attempt to work through this is:
> 1) lasclip - clip the lidar point cloud using the grids (keeps the
> grid ID as the name of the file going forward). This allows me to clip
> the point clouds based on each grid cell size (already generated - see
> warragrids.zip).
> 2) lassplit - split the lidar point cloud in each grid cell into
> individual flight lines
> 3) lassort - sort the individual flight line files by GPS time
> 4) lasthin - stepsize = gridcell size, use the -random and -seed
> switches to ensure I am choosing random pulses for each iteration
>
> Key notes:
> - the resampling unit is pulses, not hits
> - I need as many independent point clouds as possible at multiple
> point densities
> - I need a way to thin the point cloud based on giving an intended
> density but filters using GPS time
>
> Ideally, I would like to write a batch file that runs through each of
> these steps for each area (in this case, Warra; I have many others).
>
> Any insight or suggestions would be greatly appreciated. I have
> attached the original lidar cloud and the multiple grids I am using.
>
> Kind regards,
> Mel
>
> --
> Download LAStools at
> http://lastools.org
> http://rapidlasso.com
> Be social with LAStools at
> http://facebook.com/LAStools
> http://twitter.com/LAStools
> http://linkedin.com/groups/LAStools-4408378
> Manage your settings at
> http://groups.google.com/group/lastools/subscribe


--
- <Terje.M...@tmsw.no>
"almost all programming can be viewed as an exercise in caching"

MelFed

Jul 12, 2016, 3:46:30 AM
to LAStools - efficient tools for LiDAR processing, terje.m...@tmsw.no
Hi Terje
Thanks for your quick response.
Unfortunately, I don't really understand your suggestion. I have some comments below.

On Monday, 11 July 2016 22:47:24 UTC+10, Terje Mathisen wrote:
MelFed wrote:
> Hi everyone,
>
> I am currently trying to write a workflow that takes me from a large
> area, high density point cloud (norm_warra_rotated.laz) to a series of
> grids (warragrids.zip) where each grid is used to make multiple
> instances of the point cloud thinned to various point densities.
So effectively you are trying to determine minimum pulse densities
needed to do various level of classification/feature extraction?

In one instance I am trying to determine the minimum pulse densities to do different classifications, but I don't just need one instance; I need as many as possible. This means that each version of the thinned point cloud needs to be different from the others. I don't see how your approach ensures that the thinned clouds are different.
>
> The main issues I am having are:
> 1) I want to thin the point cloud to a point density relative to area
> (1 hit/m2), but would need to resample based on pulse (meaning that I
> need to thin using the GPS time stamps). However, I cant seem to find
> a work around that lets me select the point density based on area
> using the GPS time stamp information.

This means that with a point cloud with lots of returns for each pulse
(I have a 40 points/sq meter collection myself which has up to 7 returns
for each pulse) you get larger flight/time distance between each pulse
you keep, right?

I don't necessarily think that just because a pulse has multiple returns the time distance between kept pulses would be larger. I don't want to adjust the distance between pulses; I just want a random selection of pulses at a set density, regardless of the distance between the points.
 
> 2) I need to make multiple instances of the thinned point cloud. For
> example, I have an original lidar file that is 40 hits/m2, I want to
> make 40 independent point clouds each with 1hit/m2. I need to make as
> many independent point clouds as possible from the available data
> (which varies based on site).
This is in fact fairly easy to do:

1) Read in the LAZ tile, group them by GPS time while counting both
number of pulses and number of returns.

I don't understand what you mean by "group" them by GPS time; I have them sorted by GPS time. Why does the number of returns matter if my base unit is pulses? I will not be separating points from their pulses during thinning.
 
2) Use total_returns / area_in_square_meters to calculate number of
separate instances.

I don't understand what the "number of separate instances" means.
 
3) Distribute the groups from (1) across the instances from (2), this
gives a near-optimal distribution of points to instances while making
sure that all points from a given pulse will end up in the same instance.

If the source tile is sorted in aquisition/gps time order then you
should be able to just pick up the (2) info from the LAZ header,
initialize all requires instances and then make a single pass over the
collected points, switching to the next instance each time you get a new
pulse.


I think I just don't understand the wording. Any extra explanation would be very helpful.
Kind regards,
Melissa

Terje Mathisen

Jul 12, 2016, 5:03:56 AM
to MelFed, LAStools - efficient tools for LiDAR processing
MelFed wrote:
> Hi Terje
> Thanks for your quick response.
> Unfortunately, I dont really understand your suggestion. I have some
> comments below.

I obviously failed to explain it properly. :-(

Here's a pseudo-code version:

(x0,y0,x1,y1) = getextent();
int area = abs((x1-x0)*(y1-y0));
int nrofpoints = getnumberofpoints();

int instances = (int) (nrofpoints/area); // How many points per sq meter?

lazinstance instance[instances]; // Initialize this many instances

int prevtime = -1;
int inst = -1;
foreach (points_in_laz_file) {
    (x,y,z,r,n,t) = nextpoint();
    if (t != prevtime) { // Switch to next instance!
        inst++;
        if (inst >= instances) inst = 0;
    }
    instance[inst].savepoint(x,y,z,r,n,t);
}

The result of this will be to split all pulses evenly (round robin)
across all instances, with an average point density of the specified 1
point per sq meter.

OK?
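For readers who want to try this outside LAStools, the round-robin split (including the prevtime update that the follow-up post adds) might look like this in runnable Python; the (x, y, z, gps_time) tuples and function name are illustrative, not LAS I/O:

```python
def round_robin_split(points, n_instances):
    """Distribute whole pulses round-robin across n_instances output clouds.
    points must be sorted by GPS time (tuple index 3); a new time stamp
    advances to the next instance, so all returns of one pulse stay together."""
    instances = [[] for _ in range(n_instances)]
    prev_time = None
    inst = -1
    for p in points:
        t = p[3]
        if t != prev_time:              # new pulse: switch instance
            inst = (inst + 1) % n_instances
            prev_time = t               # remember last GPS time stamp
        instances[inst].append(p)
    return instances
```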

Terje


Terje Mathisen

Jul 12, 2016, 5:16:28 AM
to MelFed, LAStools - efficient tools for LiDAR processing
Terje Mathisen wrote:
> MelFed wrote:
>> Hi Terje
>> Thanks for your quick response.
>> Unfortunately, I dont really understand your suggestion. I have some
>> comments below.
>
> I obviously failed to explain it properly. :-(
>
> Here's a pseudo-code version:
>
> (x0,y0,x1,y1) = getextent();
> int area = abs((x1-x0)*(y1-y0));
> int nrofpoints = getnumberofpoints();
>
> int instances = (int) (nrofpoints/area); // How many points per sq
> meter?
>
> lazinstance instance[instances]; // Initialize this many instances
>
> int prevtime = -1;
> int inst = -1;
> foreach (points_in_laz_file) {
> (x,y,z,r,n,t) = nextpoint();
> if (t != prevtime) { // Switch to next instance!
> inst++;
> if (inst >= instances) inst = 0;
> }
> instance[inst].savepoint(x,y,z,r,n,t);

OOPS! I'm missing a crucial line here:
prevtime = t; // Remember last GPS pulse timestamp

Martin Isenburg

Jul 22, 2016, 10:26:25 AM
to LAStools - efficient command line tools for LIDAR processing
Hello,

The easiest way to lower the density in order to simulate lower-density data "correctly" is simply to drop the overlap in flightlines, either using lasoverage:

D:\LAStools\bin>lasoverage -i norm_warra_rotated.laz -odix _over -olaz
D:\LAStools\bin>lasview -i norm_warra_rotated_over.laz -drop_class 12

or by dropping a few individual flightlines with '-drop_point_source 45 44':

D:\LAStools\bin>lasinfo -i norm_warra_rotated.laz -nh -nr -nv -nmm -histo point_source 1
lasinfo (160721) report for norm_warra_rotated.laz
WARNING: there is coordinate resolution fluff (x10) in XY
point source id histogram with bin size 1
  bin 13 has 31025
  bin 14 has 10736
  bin 44 has 34211
  bin 45 has 103187
  bin 46 has 83475
  bin 47 has 539
  average point source id 40.1542 for 263173 element(s)

D:\LAStools\bin>lasview -i norm_warra_rotated.laz -drop_point_source 45 44

To go even lower and still keep each pulse "intact" (either removing it completely, i.e. all its returns, or keeping it completely, i.e. all its returns) you can use the '-thin_with_time' filter that is available for all LAStools:

lasview -i norm_warra_rotated.laz  -thin_with_time 0.01
lasview -i norm_warra_rotated.laz  -thin_with_time 0.005
lasview -i norm_warra_rotated.laz  -thin_with_time 0.001
lasview -i norm_warra_rotated.laz  -thin_with_time 0.0005
lasview -i norm_warra_rotated.laz  -thin_with_time 0.0001
lasview -i norm_warra_rotated.laz  -thin_with_time 0.00005

Or just for a single flightline:

lasview -i norm_warra_rotated.laz -keep_point_source 45 -thin_with_time 0.01
lasview -i norm_warra_rotated.laz -keep_point_source 45 -thin_with_time 0.005
lasview -i norm_warra_rotated.laz -keep_point_source 45 -thin_with_time 0.001
lasview -i norm_warra_rotated.laz -keep_point_source 45 -thin_with_time 0.0005
lasview -i norm_warra_rotated.laz -keep_point_source 45 -thin_with_time 0.0001
lasview -i norm_warra_rotated.laz -keep_point_source 45 -thin_with_time 0.00005
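As a rough mental model of what a pulse-intact time thinning does (an assumption about the filter's behavior, not the actual LAStools implementation): keep every return of the first pulse that falls into each GPS-time interval of the given length, and drop all other pulses whole. In illustrative Python on synthetic tuples:

```python
def thin_with_time(points, step):
    """Keep all returns of the first pulse seen in each GPS-time bin of
    width `step` seconds; drop every other pulse entirely.  Assumes points
    are sorted by GPS time (tuple index 3).  Illustrative model only."""
    kept = []
    kept_bins = set()       # time bins that already contributed a pulse
    kept_times = set()      # GPS time stamps of the pulses we kept
    for p in points:
        t = p[3]
        if t in kept_times:                 # another return of a kept pulse
            kept.append(p)
        elif int(t / step) not in kept_bins:  # first pulse in this time bin
            kept_bins.add(int(t / step))
            kept_times.add(t)
            kept.append(p)
    return kept
```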

Finally, by first translating the GPS time you can get different subsets passing the same filter. Note that you must first create these temporary files; you cannot do it in one step because filters are applied before transforms.

las2las -i norm_warra_rotated.laz -translate_gps_time 0.00025 -odix _00025 -olaz
las2las -i norm_warra_rotated.laz -translate_gps_time 0.00050 -odix _00050 -olaz
las2las -i norm_warra_rotated.laz -translate_gps_time 0.00075 -odix _00075 -olaz

lasview -i norm_warra_rotated.laz -keep_point_source 45 -thin_with_time 0.001
lasview -i norm_warra_rotated_00025.laz -keep_point_source 45 -thin_with_time 0.001
lasview -i norm_warra_rotated_00050.laz -keep_point_source 45 -thin_with_time 0.001
lasview -i norm_warra_rotated_00075.laz -keep_point_source 45 -thin_with_time 0.001
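The effect of the translation can be modeled the same way: shifting all time stamps by a fraction of the thinning step moves the bin boundaries, so a different set of pulses survives the same '-thin_with_time' value. A small illustrative sketch (pure Python on synthetic time stamps, not the actual filter code):

```python
def thin_shifted(times, step, offset):
    """Model of '-translate_gps_time offset' followed by
    '-thin_with_time step': keep the first pulse time per shifted bin.
    Purely illustrative; not the actual LAStools filter code."""
    kept = []
    bins = set()
    for t in sorted(times):
        b = int((t + offset) / step)    # shifted time bin for this pulse
        if b not in bins:
            bins.add(b)
            kept.append(t)              # first pulse in this bin survives
    return kept
```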

Regards,

Martin @rapidlasso
thin_with_time_0_001_translate_000.png
thin_with_time_0_001_translate_025.png
thin_with_time_0_001_translate_050.png
thin_with_time_0_001_translate_075.png