Matching Panoramic image to LIDAR scan geometry


Dobbo

Apr 5, 2011, 7:41:55 PM
to PTGui Support
We recently went out and used a high precision LIDAR unit to scan a
large outdoor area.

We shot panoramas using a 5D Mkii with a 15mm lens (6 around + 1 up +
1 down)

We placed a marker in the ground everywhere we shot a pano. These
markers were picked up clearly by the LIDAR scanner.

When we place a sphere in 3D at the location of the marker, we are
unable to get the LIDAR data to match in perspective with the
panorama.

If you face one direction you can get things to appear to match.
Rotate the camera to look the opposite direction, and the pano does
not match up with the scan data.

This leads me to believe that PTGui is munging the source data and
warping it in a way that yields imagery that is not perspective-correct.

Question 1: Given that I'm using the Canon 15mm lens on a full frame
sensor, which setting should I be using for the "Minimize Lens
Distortion" feature in the Optimizer?

Question 2: Has anyone ever tried designing a method of calibrating a
panoramic image to see if the image is staying perspectively correct?
I was thinking about building a perfect cube space out of plywood or
something, and painting a black & white grid on the interior and
shooting test panos inside it to see how uniform the grid stays (or
does not stay) through the panorama processing workflow.
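For what it's worth, the ideal equirectangular mapping is simple enough to write down, and a quick sketch (Python; the 8192x4096 output size is hypothetical) shows that even a geometrically perfect panorama will curve a straight grid line unless that line is vertical in the scene or lies along the horizon:

```python
import math

W, H = 8192, 4096  # hypothetical equirectangular output size

def project(x, y, z):
    """Ideal equirectangular projection, camera at origin.
    x = right, y = up, z = forward. Returns (column u, row v)."""
    yaw = math.atan2(x, z)                              # -pi..pi
    pitch = math.asin(y / math.sqrt(x*x + y*y + z*z))   # -pi/2..pi/2
    return ((yaw / (2 * math.pi) + 0.5) * W,
            (0.5 - pitch / math.pi) * H)

# A straight horizontal grid line 1 m up on a wall 1 m away:
# the v (row) coordinate is not constant -- the line bows even in
# a perfect panorama, because only the horizon and vertical scene
# lines map to straight lines in equirectangular projection.
for x in (-1.0, -0.5, 0.0, 0.5, 1.0):
    u, v = project(x, 1.0, 1.0)
    print(f"x={x:+.1f}  u={u:7.1f}  v={v:7.1f}")
```

So a bowed grid line in the cube test would not by itself indicate an error; the check would have to be against this ideal mapping rather than against straightness.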

Thanks for any insights.
-robb

gravityimage

Apr 5, 2011, 10:51:06 PM
to PTGui Support
I like the idea of shooting inside a box with a grid pattern, and
would like to see what it would reveal about perspective. Some of the
white squares could be white plexiglass to let light in.
Bill

John Houghton

Apr 6, 2011, 3:32:38 AM
to PTGui Support
On Apr 6, 12:41 am, Dobbo <j...@2xlgames.com> wrote:
> If you face one direction you can get things to appear to match.
> Rotate the camera to look the opposite direction, and the pano does
> not match up with the scan data.

Could you give some indication of how bad the mismatch is?

> Question 1: Given that I'm using the Canon 15mm lens on a full frame
> sensor, which setting should I be using for the "Minimize Lens
> Distortion" feature in the Optimizer?

"Heavy + lens shift" is the setting to use, and you really need to
calibrate the lens to evaluate the lens parameters accurately, once
and for all. (See below).

> Question 2: Has anyone ever tried designing a method of calibrating a
> panoramic image to see if the image is staying perspectively correct?

I have used the stars in the night sky as a target for calibrating the
lens, for which very accurate mapping is available. See
http://www.johnhpanos.com/starcal.htm . The method is slightly
flawed, as there are significant diffraction effects down towards the
horizon. It's best, therefore, to use the upper part of the visible
hemisphere only, taking multiple shots in the case of very wide angle
lenses. For a 15mm you could take two shots pitched down from the
zenith by +/-23 degrees and optimize both of these to the sky map
using only the overlap area of the two images for control points
assignment.
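For a sense of scale, Bennett's empirical formula gives the approximate atmospheric refraction at a given apparent altitude; a quick sketch (Python, assuming a standard atmosphere) shows why the upper part of the sky is the safer calibration target:

```python
import math

def bennett_refraction_arcmin(apparent_alt_deg):
    """Approximate atmospheric refraction (arcminutes) at a given
    apparent altitude in degrees, via Bennett's empirical formula
    (valid for standard temperature and pressure)."""
    h = apparent_alt_deg
    return 1.0 / math.tan(math.radians(h + 7.31 / (h + 4.4)))

for alt in (0, 5, 10, 45, 67):
    print(f"altitude {alt:2d} deg: refraction ~ "
          f"{bennett_refraction_arcmin(alt):6.2f} arcmin")
```

At 67 degrees altitude (i.e. 23 degrees down from the zenith, as suggested above) the refraction is well under an arcminute, versus roughly half a degree right at the horizon.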

John

John Houghton

Apr 6, 2011, 3:49:34 AM
to PTGui Support
On Apr 6, 8:32 am, John Houghton <j.hough...@ntlworld.com> wrote:

> there are significant diffraction effects down towards the
> horizon.

I should have written refraction rather than diffraction!

John

PTGui Support

Apr 6, 2011, 4:19:58 AM
to pt...@googlegroups.com
Hi Robb,

Some more ideas:

Did you use a properly calibrated panoramic head?

What's the average control point error after optimization? You should be
able to get this down to 1 or 2 pixels.

Ensure that you have control points spread over the entire overlap area.
If the control points are clustered in a small area add a few by hand.

If you can export your LIDAR image in equirectangular 360x180 format,
you can use it as a reference image for alignment. See the second half
of 5.29:
http://www.ptgui.com/support.html#5_29
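The mapping from scanner-centred 3D points to a 360x180 equirectangular image is straightforward; here is a rough sketch (Python/NumPy, with made-up point data and a hypothetical 4096x2048 output) of how such a reference image could be rasterized from the scan:

```python
import numpy as np

def lidar_to_equirect(points, intensity, width=4096, height=2048):
    """Splat scanner-centred 3D points (N x 3; x = right, y = up,
    z = forward) into a 360x180 equirectangular intensity image
    suitable for use as an alignment reference."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)
    yaw = np.arctan2(x, z)               # -pi..pi
    pitch = np.arcsin(y / r)             # -pi/2..pi/2
    u = ((yaw / (2 * np.pi) + 0.5) * width).astype(int) % width
    v = ((0.5 - pitch / np.pi) * height).astype(int).clip(0, height - 1)
    img = np.zeros((height, width), dtype=np.float32)
    img[v, u] = intensity                # last point wins per pixel
    return img

# Toy cloud: one point straight ahead, one to the left, one overhead.
pts = np.array([[0.0, 0.0, 5.0], [-5.0, 0.0, 0.0], [0.0, 5.0, 0.001]])
img = lidar_to_equirect(pts, np.array([1.0, 0.5, 0.8]))
print(img.shape, img.max())
```

A real export would splat scan intensity or depth for every point and keep the nearest point per pixel, but the projection itself is the part that matters for alignment.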

Joost



Bwanaq

Apr 6, 2011, 5:34:15 AM
to PTGui Support
To add colour to the points in the point cloud from a LIDAR scan, the
360° panorama needs to be coincident with the scanner location,
i.e. the lens Entrance Pupil (Nodal Point) needs to be at the same
location as the centre of the scan head. Otherwise the parallax
resulting from the different camera location will result in the wrong
colours being attached to the points in the point cloud.

Imagine a wall say 10m from the scanner with a post say 5m from the
scanner in front of it.
There will be a hole in the point cloud behind the post.
If the panorama is taken to one side of the scanner then the colours
of the post will be superimposed on the wall and the colours of the
wall on the post.
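The size of that error is easy to estimate: the angular parallax between the post and the wall is atan(d/5) minus atan(d/10) for a camera offset d, which converts to pixels via the panorama width. A quick sketch (Python; the 8192 px pano width is an assumption):

```python
import math

def parallax_pixels(offset_m, near_m, far_m, pano_width=8192):
    """Approximate misregistration (in pixels) between a near and a
    far object when the camera sits offset_m sideways from the scan
    centre, in an equirectangular pano of the given width."""
    angle = math.atan(offset_m / near_m) - math.atan(offset_m / far_m)
    return angle * pano_width / (2 * math.pi)

# The example above: post at 5 m, wall at 10 m behind it.
for offset_cm in (1, 5, 10):
    px = parallax_pixels(offset_cm / 100.0, 5.0, 10.0)
    print(f"{offset_cm:3d} cm offset -> ~{px:5.1f} px of colour smear")
```

Even a few centimetres of offset produces a visible smear of wrong colours at depth edges, which is why the entrance pupil and scan centre need to coincide.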

PTGui does a great job creating panoramas that can be used to
introduce "true" colour to the point cloud.
http://www.hugha.co.uk/Acrobat%20PDF%20Files/Spherical-Panoramas-for-HDS-Point-Clouds.pdf

If you would like to discuss, you will find my contact address at
http://www.hugha.co.uk/About.htm

Jeffrey Martin

Apr 7, 2011, 2:43:53 AM
to pt...@googlegroups.com
Yeah, I was going to point this out: if the LIDAR scanner is not in the exact same location as the camera, it's not going to match!

perspixe

Apr 7, 2011, 8:09:30 AM
to PTGui Support
A company that makes big panoramic heads has released a unit that does
just that: one single device doing both the 3D scan and the
photography, apparently with the same entrance pupil.
It's called the RodeonMetric. A very nice device, and it's on my list.

Hugh

Apr 7, 2011, 12:02:31 PM
to PTGui Support

> Question 2: Has anyone ever tried designing a method of calibrating a
> panoramic image to see if the image is staying perspectively correct?
> I was thinking about building a perfect cube space out of plywood or
> something, and painting a black & white grid on the interior and
> shooting test panos inside it to see how uniform the grid stays (or
> does not stay) thru the panorama processing work flow.
>

For me, the easiest way of ascertaining the accuracy of a panoramic
image is to take the set of images for the panorama so that the
Entrance Pupil (Nodal Point) of the camera lens is coincident with the
centre of a high density Scan World (i.e. the centre of the scanner),
provided you have access to an HDS (High Definition Survey) scanner.
As long as there are objects with clearly defined edges at different
distances from this centre, it is easy to see whether the correct
colour from the panoramic image is applied to the points in the point
cloud (e.g. a post closer to the centre than a background wall of a
completely different colour).

In the testing used for:
http://www.hugha.co.uk/NodalPoint/Lens-Investigation.htm
both the scanner and the panohead-mounted camera were positioned in a
Leica Tribrach on a concrete pillar, so that the two devices were
coincident to sub-millimetre accuracy.