Duplicate point removal by constant dx, dy, dz offset


Lui

May 6, 2016, 5:17:42 PM
to LAStools - efficient tools for LiDAR processing
Hello,

it seems that the Slovenian Environment Agency really has a LiDAR dataset from hell! First, it is in zLAS format, and second, it has duplicate points! But someone (I suspect the company that did the scanning) created duplicate points that have a fixed dx, dy, dz offset and almost the same timestamp (0.000001 s difference). It is a small offset: 0.1, 0.1, 0.04 m, but it really hurts working with the LiDAR dataset, and the file sizes are twice as big as they should be. It doesn't seem to be possible to remove those pesky duplicate points with lasduplicate, or am I wrong? Does anyone know a solution to this kind of problem?

Data can be downloaded from their portal: LIDAR Slovenia.
Just be sure to download the GKOT (all points) version of the point cloud.

Best regards

Lui

Nicolas Cadieux

May 6, 2016, 5:58:34 PM
to last...@googlegroups.com

Hi,

If LAStools can't help, send me a file. I am currently stuck with a similar problem and working on a Python-based solution.

How many files are we looking at?

Nicolas


Terje Mathisen

May 7, 2016, 2:51:33 AM
to last...@googlegroups.com

As Nicolas also wrote, this sounds like an interesting problem!

From your description it sounds like all other attributes are constant
and the (x,y,z,t) values have those fixed offsets?

Just post an exact link to one or two lidar blocks and I'll also see if
I can write a filter to detect (and either remove or mark as withheld?)
the extra set.

foreach point seen:

create one additional point with -delta, then count duplicates.

write back only those points with a count of exactly 2; this will be a
match for a single original point and a single shifted point which the
process has shifted back.
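Terje's counting scheme can be sketched in a few lines of Python. This is a minimal illustration over synthetic coordinate tuples, not LAStools code; the 0.1/0.1/0.04 offset comes from the values reported in this thread, and the centimeter rounding is an assumption matching the 0.01 scale factor of the files:

```python
from collections import Counter

OFFSET = (0.1, 0.1, 0.04)  # the constant shift reported for this dataset

def _key(p, digits=2):
    # quantize to centimeters, matching the 0.01 scale factor of the files
    return tuple(round(c, digits) for c in p)

def keep_originals(points, offset=OFFSET):
    """Insert every point twice (as-is and shifted back by the offset) and
    keep only points whose own position was hit exactly twice: once by
    themselves and once by their shifted twin."""
    counts = Counter()
    for p in points:
        counts[_key(p)] += 1
        counts[_key(tuple(c - d for c, d in zip(p, offset)))] += 1
    return [p for p in points if counts[_key(p)] == 2]

pts = [(431000.00, 136000.00, 300.00),  # original
       (431000.10, 136000.10, 300.04),  # its shifted duplicate
       (431002.00, 136001.00, 301.50),  # original
       (431002.10, 136001.10, 301.54)]  # its shifted duplicate
print(keep_originals(pts))
```

A real filter would also have to decide what to do with points whose count is not exactly 2 — genuine lone points (count 1) or accidental coincidences where two real points happen to sit one offset apart (count 3 or more).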

Terje
--
- <Terje.M...@tmsw.no>
"almost all programming can be viewed as an exercise in caching"

Martin Isenburg

May 8, 2016, 7:55:33 AM
to LAStools - efficient tools for LiDAR processing
Hello Lui,

I would be willing to work with you and the Slovenian Environment Agency (ARSO) and whomever else is interested to develop a LAStools workflow and maybe some custom filters to fix this data. As a "payment" for such services I would merely require ARSO to publish the fixed data in a proper open format such as LAS, zipped LAS, or LAZ. This would also get it out of the "HALL OF SHAME" (*) where anyone appears who stores open LiDAR in a closed format and thereby - voluntarily or involuntarily - allows a large GIS company to use a good idea (aka "an open data policy") as an evil opportunity (aka "vendor lock-in via a proprietary format to a commercial software").

ARSO & friends ... get back to me if interested ... (-:

Regards,

Martin @rapidlasso

(*) D:\LAStools> more HALL_OF_SHAME.txt
HALL OF SHAME
=============

1) Slovenian Environment Agency or ARSO (Agencija Republike Slovenije za okolje)

   for providing "pseudo-open" access to their national LiDAR data by only
   offering downloads in the proprietary "Optimized LAS" or "zLAS" format
   also known as the "LAZ clone"

   use the "LASliberator" [1,2,3] to set the enslaved Slovenian LiDAR free

2) ...

The "HALL OF SHAME" is our funny-provocative and dicussion-starting way to
assure that our "open LiDAR data" is stored and distributed in our "open
LiDAR formats" to the benefits of users on any platform using any software.

We list agencies or portals that are distributors of "locked-up LiDAR"
and promote proprietary LiDAR formats by providing raw "pseudo-open" point
clouds exclusively in closed formats such as *.sid, *.zlas, *.rar, ...

We are taking "HALL OF SHAME" nominations ... (-;

Terje Mathisen

May 8, 2016, 2:57:49 PM
to last...@googlegroups.com

I'm in!

I suspect that the given fixed offsets (0.1, 0.1, 0.04 and 1 µs) will
turn out not to be completely fixed: if they are the result of applying
some strange local reference geoid, then they are likely to vary
slightly over the entire country, so the search for properly paired
duplicates will have to apply some form of fuzziness, possibly adaptive.
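One way to get that adaptivity would be to estimate the local offset empirically before matching, using the ~1 µs timestamp gap as the pairing signal. A sketch in plain Python over synthetic (x, y, z, gps_time) tuples; the 5 µs pairing threshold is my assumption, not a value from the data:

```python
import statistics

def estimate_offset(points, max_dt=5e-6):
    """Estimate the duplication offset (dx, dy, dz) for a local block of
    points by pairing records whose GPS times differ by less than max_dt
    and taking the median of the coordinate differences."""
    pts = sorted(points, key=lambda p: p[3])  # sort by gps_time
    dx, dy, dz = [], [], []
    for a, b in zip(pts, pts[1:]):
        if b[3] - a[3] < max_dt:
            dx.append(b[0] - a[0])
            dy.append(b[1] - a[1])
            dz.append(b[2] - a[2])
    if not dx:
        raise ValueError("no timestamp-adjacent pairs found")
    return (statistics.median(dx), statistics.median(dy), statistics.median(dz))

# three original pulses, each followed 1 microsecond later by its shifted twin
pulses = []
for i, t in enumerate((100.0, 100.5, 101.0)):
    x, y, z = 431000.0 + i, 136000.0 + i, 300.0 + i
    pulses.append((x, y, z, t))
    pulses.append((x + 0.15, y + 0.15, z + 0.07, t + 1e-6))
print(estimate_offset(pulses))
```

Running this per tile (or per flight strip) would give a locally adapted offset to feed into the pairing step, instead of assuming one constant shift for the whole country.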

I agree completely with your suggestion that all the data we fix should
be published in LAS/LAZ primarily. If they insist (or have managed to
sign a really braindead contract with a GIS company not to be mentioned
in a polite discussion) then they could of course also convert the
results back to that closed format afterwards, as long as the open
formats stay as the primary source.

Terje

Žiga Kokalj

May 9, 2016, 2:43:07 AM
to last...@googlegroups.com
Hi!

A great idea, and you are right. The offset is not fixed; I've found it to be 0.15, 0.15, 0.07 and 1 µs in places. There is more than one dataset really, but the one most worth repairing (because the others are derivatives) is GKOT in the D96 national coordinate system (EPSG code 3794). The names of the files start with TM, e.g. http://gis.arso.gov.si/lidar/gkot/b_35/D96TM/TM_460_94.zlas. As far as I'm aware, there is no other reason for the data to be stored in a proprietary format than a misjudged decision by an overburdened technician. I'll point the right people to this discussion and hope for the best.

Best wishes,

Žiga

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Žiga Kokalj, PhD
Institute of Anthropological and Spatial Studies
Research Centre of the Slovenian Academy of Sciences and Arts
Novi trg 2, SI - 1000 Ljubljana, Slovenia

Centre of Excellence for Space Sciences and Technologies
Novi trg 2, SI - 1000 Ljubljana, Slovenia
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Lui

May 12, 2016, 3:17:37 PM
to LAStools - efficient tools for LiDAR processing
Hi,

I did a little research into what is going on with this LiDAR data, and in the D48 GKOT dataset (an old Slovenian coordinate system) the offset seems to be the same everywhere (0.1, 0.1, 0.04). I would really like to process the whole dataset (I have it all on my computer in the D48 coordinate system), but I don't have the time to do it.
I also wonder why this kind of point cloud was delivered or made available at all. There are several options:
- commercial interest / lack of QA by the company that did the scanning
- commercial interest / lack of QA by the institution that processed the point cloud
- ARSO decided to deliver the point cloud for free, but with a "price"
I personally don't believe in a lack of QA, but it is possible to have a mishap when exporting the whole dataset at once.

I have no commercial interest in this dataset; I just ordered it and got it as a part of my professional curiosity. It is a pity that such useful data as a point cloud has such flaws. Oh, and I really like the idea that ARSO should publish the fixed data in an open format.

Regards

Lui

Martin Isenburg

May 12, 2016, 6:07:34 PM
to LAStools - efficient command line tools for LIDAR processing
Hi Lui,

if the offset really is constant and that small, it would be reasonably easy for me to write a dedicated 'lasfix' tool that carefully eliminates the duplicates without removing (any significant amount of) actual data. So I guess we are waiting to hear what Žiga's contacts at ARSO are saying, and we need to find out what the "original copy" really is. It would be best to fix that "original copy" instead of having to repeat the exercise more than once on different reprojected versions of the data.

I think we will first have to recreate the original flight strips (with lassplit) and then process each strip individually to make sure we don't miss removing duplicates that ended up in different files (due to tile boundaries).
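The strip-wise idea can be illustrated with a small grouping step. This is a hypothetical Python sketch over point dicts carrying a point_source_ID field (in a real LAStools workflow, lassplit would do this on the files themselves):

```python
from collections import defaultdict

def group_by_strip(points):
    """Bucket points by point_source_ID (the flight line number) so that a
    duplicate pair split across tile boundaries is still examined within
    one and the same strip after re-merging the tiles."""
    strips = defaultdict(list)
    for p in points:
        strips[p["point_source_ID"]].append(p)
    return dict(strips)

tiles = [  # two tiles, each holding part of flight strips 301 and 302
    [{"point_source_ID": 301, "x": 1.0}, {"point_source_ID": 302, "x": 2.0}],
    [{"point_source_ID": 301, "x": 1.1}, {"point_source_ID": 302, "x": 2.1}],
]
merged = [p for tile in tiles for p in tile]
strips = group_by_strip(merged)
print(sorted(strips))
```

Deduplicating each strip's bucket separately then avoids missing a pair whose two halves were filed into different tiles.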

Regards,

Martin @rapidlasso


Primoz Kogovsek

May 13, 2016, 6:18:03 AM
to LAStools - efficient tools for LiDAR processing

Hi to all.

My name is Primož Kogovšek. I am in charge of the GIS department at the (Hall of Shame nominee) Slovenian Environment Agency (ARSO).

Žiga informed me of the duplicated points in our LiDAR data distribution.

First of all, I would like to thank you for detecting these duplicated point entries in our data. It is a huge proof of concept for why opening government data is helpful. In my presentations I often argue that opening our data not only has economic benefits but also helps us improve our data!

Secondly, I would like to thank you for your readiness to help us fix the data. I am accepting your offer with open arms. But since I am part of the public administration, which is a very rigid system, I must first organize a meeting at the Ministry of the Environment and Spatial Planning (they ordered the LiDAR data) with their contractor (the Geodetic Institute of Slovenia). There we will discuss our next steps.

Once the data is fixed, we will be more than happy to return the favour to the open data community by publishing our LiDAR data in a "proper" format.

I hope to be back soon with good news.

Primož Kogovšek

Slovenian Environment Agency

Martin Isenburg

May 15, 2016, 9:03:09 AM
to LAStools - efficient command line tools for LIDAR processing
Hello Primoz,

thank you for your message and also for the incredible courage to identify yourself as a HALL OF SHAME nominee on the 3127-member-strong LAStools user group ... (-; ... we could not agree more with your statements on the benefits of open LiDAR and hope that the other European countries such as Germany, Poland, the Czech Republic, Switzerland, Scotland, ... will also implement similar open data policies to the benefit of their economies.

I was just at the Lindau 3D Forum in Germany where one well-known researcher was talking about deriving various beneficial information on a national scale from LiDAR. He was - ironically - repeatedly resorting to examples and data from the Netherlands during his presentation. Why? Because my native Germany has one of the worst European open data policies that I know of when it comes to LiDAR - in part due to its federalism, which means that the survey departments and ministries of each state each get to make their own decisions ... )-:

I'll be waiting for your go-ahead note as well as the pointer to which data set is the "original copy" before I start downloading some representative portion of the data, liberating it from its closed format, fixing (hopefully) what ails it, and finally delivering a batch script that you can run on your end on the full data set.

Best Regards,

Martin @rapidlasso



Martin Isenburg

Jun 29, 2016, 7:10:56 AM
to LAStools - efficient command line tools for LIDAR processing
Hello Primoz,

it's been over a month and I have not heard back from you about fixing the duplicate point issue in the national LiDAR data set of Slovenia (and also about "liberating" it from the proprietary format it is kept in). Any updates on when this will be moving forward?

Regards,

Martin @rapidlasso

PS: I discussed this with Khaleesi (also known as Daenerys Stormborn of House Targaryen) whom I met while paddling in the Mediterranean Sea. She seems to share our passion for liberating what is enslaved and promised "I will not let those [LiDAR points] that are free slide back into chains". Khaleesi pledged allegiance to House LAZ and offered - if needed - to send one of her dragons to assist in the battle against the Inland Empire ... (-;

Jost Hobic

Nov 25, 2016, 1:29:42 PM
to LAStools - efficient tools for LiDAR processing
I just checked the ARSO LiDAR portal and they have uploaded all LiDAR files in LAZ format! The data is also still available in zLAS. I didn't check whether the offset issue is fixed.


Martin Isenburg

Nov 25, 2016, 5:57:58 PM
to LAStools - efficient command line tools for LIDAR processing
Hello,

this is great news! I have removed ARSO from the HALL OF SHAME (of the next release) ... (-:

I just downloaded one of their LiDAR tiles (see attached image) provided in the open and compressed LAZ format and have found nothing out of the ordinary. Only the CRS is missing but that's missing almost everywhere ... (-:

E:\LAStools\bin>lasoverlap -i TM_499_110.laz -min_diff 0.1 -max_diff 0.3 -opng -step 1.5

E:\LAStools\bin>lasinfo -i TM_499_110.laz
lasinfo (161114) report for TM_499_110.laz
reporting all LAS header entries:
  file signature:             'LASF'
  file source ID:             0
  global_encoding:            1
  project ID GUID data 1-4:   00000000-0000-0000-0000-000000000000
  version major.minor:        1.2
  system identifier:          ''
  generating software:        'TerraScan'
  file creation day/year:     13/2016
  header size:                227
  offset to point data:       229
  number var. length records: 0
  point data format:          1
  point data record length:   28
  number of point records:    30238611
  number of points by return: 20794686 5287896 2638289 1096471 335056
  scale factor x y z:         0.01 0.01 0.01
  offset x y z:               0 0 0
  min x y z:                  499000.00 109999.99 231.68
  max x y z:                  500000.10 111000.03 453.52
the header is followed by 2 user-defined bytes
LASzip compression (version 2.2r0 c2 50000): POINT10 2 GPSTIME11 2
reporting minimum and maximum for all LAS point record entries ...
  X            49900000   50000010
  Y            10999999   11100003
  Z               23168      45352
  intensity           0      65527
  return_number       1          7
  number_of_returns   1          7
  edge_of_flight_line 0          1
  scan_direction_flag 1          1
  classification      0          7
  scan_angle_rank   -35         37
  user_data           0          0
  point_source_ID   301       2374
  gps_time 78846017.654421 109849085.114752
number of first returns:        20794686
number of intermediate returns: 4155007
number of last returns:         20797035
number of single returns:       15508117
WARNING: there are 73527 points with return number 6
WARNING: there are 12686 points with return number 7
overview over number of returns of given pulse: 15508117 5297372 4626223 3045612 1307547 372640 81100
histogram of classification of points:
         3610529  never classified (0)
           33347  unclassified (1)
        10790346  ground (2)
         1502114  low vegetation (3)
         2303362  medium vegetation (4)
         9486553  high vegetation (5)
         2479938  building (6)
           32422  noise (7)

E:\LAStools\bin>lasvalidate -i TM_499_110.laz -o report.xml
This is version 160512 of the LAS validator. Please contact
me at 'martin....@rapidlasso.com' if you disagree with
validation reports, want additional checks, or find bugs as
the software is still under development. Your feedback will
help to finish it sooner.
needed 21.27 sec for 'TM_499_110.laz' fail

E:\LAStools\bin>more report.xml
<?xml version="1.0" encoding="UTF-8"?>
<LASvalidator>
  <report>
    <file>
      <name>TM_499_110.laz</name>
      <path>TM_499_110.laz</path>
      <version>1.2</version>
      <system_identifier></system_identifier>
      <generating_software>TerraScan</generating_software>
      <point_data_format>1</point_data_format>
      <CRS>not valid or not specified</CRS>
    </file>
    <summary>
      fail
    </summary>
    <details>
      <fail>
        <variable>CRS</variable>
        <note>file does not specify a Coordinate Reference System with GEOTIFF tags</note>
      </fail>
      <warning>
        <variable>system identifier</variable>
        <note>empty string. first character is '\0'</note>
      </warning>
      <warning>
        <variable>return number</variable>
        <note>there are 73527 points with a return number of 6</note>
      </warning>
      <warning>
        <variable>return number</variable>
        <note>there are 12686 points with a return number of 7</note>
      </warning>
      <warning>
        <variable>return number</variable>
        <note>there are 372640 points with a number of returns of given pulse of 6</note>
      </warning>
      <warning>
        <variable>return number</variable>
        <note>there are 81100 points with a number of returns of given pulse of 7</note>
      </warning>
    </details>
  </report>
  <total>
    fail
    <details>
      <pass>0</pass>
      <warning>0</warning>
      <fail>1</fail>
    </details>
  </total>
  <version>
    160512 built with LASread version 1.0 (160119)
  </version>
  <command_line>
    lasvalidate -i TM_499_110.laz -o report.xml
  </command_line>
</LASvalidator>

E:\LAStools\bin>lassort -i TM_499_110.laz -gps_time -return_number -odix _sorted -olaz

E:\LAStools\bin>lasview -i TM_499_110_sorted.laz

E:\LAStools\bin>lasreturn -i TM_499_110_sorted.laz  -check_return_numbering
checked returns of 5282391 multi and 15527127 single return pulses. took 20.806 secs
missing: 41655 duplicate: 1277 too large: 0 zero: 0
missing
=======
6670 returns with n = 2 and r = 1 are missing
5678 returns with n = 2 and r = 2 are missing
4378 returns with n = 3 and r = 1 are missing
4111 returns with n = 3 and r = 2 are missing
4455 returns with n = 3 and r = 3 are missing
2645 returns with n = 4 and r = 1 are missing
2564 returns with n = 4 and r = 2 are missing
2594 returns with n = 4 and r = 3 are missing
2529 returns with n = 4 and r = 4 are missing
887 returns with n = 5 and r = 1 are missing
884 returns with n = 5 and r = 2 are missing
932 returns with n = 5 and r = 3 are missing
889 returns with n = 5 and r = 4 are missing
861 returns with n = 5 and r = 5 are missing
214 returns with n = 6 and r = 1 are missing
222 returns with n = 6 and r = 2 are missing
225 returns with n = 6 and r = 3 are missing
219 returns with n = 6 and r = 4 are missing
205 returns with n = 6 and r = 5 are missing
201 returns with n = 6 and r = 6 are missing
38 returns with n = 7 and r = 1 are missing
46 returns with n = 7 and r = 2 are missing
46 returns with n = 7 and r = 3 are missing
44 returns with n = 7 and r = 4 are missing
44 returns with n = 7 and r = 5 are missing
38 returns with n = 7 and r = 6 are missing
36 returns with n = 7 and r = 7 are missing
duplicate
========
1277 returns with n = 7 and r = 7 are duplicate

E:\LAStools\bin>lasreturn -i TM_499_110_sorted.laz  -histo return_distance 0.05 0.0 2.99
processed returns of 'TM_499_110_sorted.laz'
return distances [meter] histogram with bin size 0.05
  bin [0,0.05) has 1431
  bin [0.05,0.1) has 1571
  bin [0.1,0.15) has 1761
  bin [0.15,0.2) has 2245
  bin [0.2,0.25) has 3087
  bin [0.25,0.3) has 4511
  bin [0.3,0.35) has 6685
  bin [0.35,0.4) has 10632
  bin [0.4,0.45) has 17119
  bin [0.45,0.5) has 25578
  bin [0.5,0.55) has 37387
  bin [0.55,0.6) has 51415
  bin [0.6,0.65) has 70081
  bin [0.65,0.7) has 96526
  bin [0.7,0.75) has 116965
  bin [0.75,0.8) has 129920
  bin [0.8,0.85) has 141119
  bin [0.85,0.9) has 150657
  bin [0.9,0.95) has 157744
  bin [0.95,1) has 163902
  bin [1,1.05) has 170043
  bin [1.05,1.1) has 171893
  bin [1.1,1.15) has 178689
  bin [1.15,1.2) has 181827
  bin [1.2,1.25) has 184652
  bin [1.25,1.3) has 186324
  bin [1.3,1.35) has 189476
  bin [1.35,1.4) has 188802
  bin [1.4,1.45) has 189206
  bin [1.45,1.5) has 188858
  bin [1.5,1.55) has 184749
  bin [1.55,1.6) has 183686
  bin [1.6,1.65) has 178675
  bin [1.65,1.7) has 167668
  bin [1.7,1.75) has 158023
  bin [1.75,1.8) has 148391
  bin [1.8,1.85) has 140165
  bin [1.85,1.9) has 132439
  bin [1.9,1.95) has 125217
  bin [1.95,2) has 119448
  bin [2,2.05) has 114490
  bin [2.05,2.1) has 109254
  bin [2.1,2.15) has 103888
  bin [2.15,2.2) has 99480
  bin [2.2,2.25) has 96674
  bin [2.25,2.3) has 93575
  bin [2.3,2.35) has 91134
  bin [2.35,2.4) has 87802
  bin [2.4,2.45) has 85833
  bin [2.45,2.5) has 83935
  bin [2.5,2.55) has 81500
  bin [2.55,2.6) has 78918
  bin [2.6,2.65) has 77708
  bin [2.65,2.7) has 76169
  bin [2.7,2.75) has 74491
  bin [2.75,2.8) has 72663
  bin [2.8,2.85) has 70885
  bin [2.85,2.9) has 69657
  bin [2.9,2.95) has 67003
  bin [2.95,3) has 3235467
  average return distances [meter] 2.07128 for 9429093 element(s)
checked returns of 5282391 multi and 15527127 single return pulses. took 19.16 secs



TM_499_110.jpg
TM_499_110_over.png
TM_499_110_diff.png

eleanor...@gmail.com

Feb 12, 2017, 8:07:33 PM
to LAStools - efficient tools for LiDAR processing
Hi Martin - forgive me if I am responding in the wrong place; this is actually my first time navigating a Google group.

I have a quick question with regard to lasduplicate if the offset is definitely uniform, say in the case that some idiot *cough* merged the shifted and unshifted data into one and didn't notice until after classification was complete.

If we know which set of points are "correct", is there a way to isolate them?

My Google-fu appears to have shut down completely, so please forgive me if this question has been answered elsewhere.

Cheers,

Eleanor 

Terje Mathisen

Feb 13, 2017, 6:19:02 AM
to last...@googlegroups.com
This is in fact doable. I would handle the problem by loading all points
twice, with both the in-file value and the value after subtracting the
known offset.

This would lead to each original point occurring four times in the output,
and two of these four would be identical!
I.e., by only outputting the points with a count == 2 we get just the
desired set:

# Count each point at its own position and at its position minus the offset.
foreach (@points) {
    my ($x, $y, $z) = split;
    $points{sprintf("%1.2f\t%1.2f\t%1.2f", $x, $y, $z)}++;
    $x -= $XOFFSET; $y -= $YOFFSET; $z -= $ZOFFSET;
    $points{sprintf("%1.2f\t%1.2f\t%1.2f", $x, $y, $z)}++;
}
# Keep only positions seen exactly twice: one original plus one shifted twin.
@points = ();
foreach (keys %points) {
    if ($points{$_} == 2) {
        push(@points, $_);
    }
}

It might be possible to do this using LAStools only, by first creating a
file with the fixed offset subtracted, merging that with the original, and
then using lasduplicate to output the points that now have duplicates:

:: Reverse the original (false) shift:
las2las -i source.laz -translate_xyz <-DX> <-DY> <-DZ> -o unshifted.laz
:: merge them
lasmerge -i source.laz -i unshifted.laz -o doubled.laz
:: Find the duplicated xyz points:
lasduplicate -i doubled.laz -record_removed -unique_xyz -o fixed.laz

Terje

Martin Isenburg

Feb 14, 2017, 7:11:40 PM
to LAStools - efficient command line tools for LIDAR processing
Hello Eleanor,

hard to say if this can be repaired. There is no readily available function for that in LAStools ... but sometimes a combination of tools can do the trick. It would help if you could post a link to a compressed LAZ file that contains this shifted and unshifted data as one. Whether this can be fixed may depend on the amount and the direction of the shift (vertical only?) as well as how much processing was done on the data since (tiling?).

Regards,

Martin @rapidlasso
