Simple lens calibration (was: Filling in LensFun database)


Bruno Postle

Jul 9, 2008, 6:07:27 PM
to cre...@lists.freedesktop.org, Hugin ptx
On Wed 18-Jun-2008, Bruno Postle wrote on the CREATE list:
>
>The hugin lens correction tutorial hasn't been written. I'll
>attach an email I wrote recently describing what this tutorial
>would look like.

[snip]

Terry Duell has written a tutorial describing a very simple
technique for calibrating lens distortion parameters with hugin:

http://hugin.sourceforge.net/tutorials/calibration/

The advantage of this technique is that it doesn't rely on perfectly
rectangular buildings or require multiple overlapping photos taken
with a panoramic head: just a simple scene that anyone can construct.

The other advantage is that it has real potential for automatic
processing.

--
Bruno

Ken Warner

Jul 9, 2008, 6:44:32 PM
to hugi...@googlegroups.com
Very useful information -- thanks for taking the time.

Johannes Zander

Jul 9, 2008, 10:08:05 PM
to hugin and other free panoramic software
> Terry Duell has written a tutorial describing a very simple
> technique for calibrating lens distortion parameters with hugin:

This looks very promising. Sadly, it seems that it's not possible to
reproduce this tutorial with Harry's latest SVN build (3133) on OS X. I
haven't found support for straight line control points in the GUI. Is
it possible to activate this somehow?

Cheers
Johannes

Bruno Postle

Jul 10, 2008, 5:32:09 AM
to Hugin ptx

They are there, but you need to know where to look ;-)

http://wiki.panotools.org/Hugin_Control_Points_tab#Straight_line_control_points

--
Bruno

Ir. Hj. Othman bin Hj. Ahmad

Jul 10, 2008, 9:07:40 AM
to hugin and other free panoramic software
I tried the tutorial with Hugin v0.7.0.3061 on Windows XP, using the
sample image.

I notice that this technique is useful only for finding lens
distortion parameters.

The view (field of view or focal length) has to be extracted from the
image file.

The images from my camera, a Sony DSC-S600, do not contain focal
length data that Hugin can extract correctly.

Using the sample image in the tutorial, I can get the view data, but
the resulting distortion values are too large, which shows that it is
difficult to place the straight line control points correctly.

I notice that, when stitching a set of pictures to form a 360x180
panorama and optimising up to the distortion parameters, the
resulting parameters are almost identical across the different sets.

Now I just use the lens data from the optimised set of pictures that
forms the best panorama and apply it to all the other sets. So you
could say that I'm not trying to get the best picture, but to speed up
my processing time: there is no more need for me to optimise beyond
the second stage, i.e. positions.

I don't use a pano head, just a normal tripod, and now I just use a
"Philipod", i.e. a string attached to a weight.

Simon Oosthoek

Jul 10, 2008, 9:44:54 AM
to hugi...@googlegroups.com
Bruno Postle wrote:
> On Wed 18-Jun-2008, Bruno Postle wrote on the CREATE list:
>> The hugin lens correction tutorial hasn't been written. I'll
>> attach an email I wrote recently describing what this tutorial
>> would look like.
>
> [snip]
>
> Terry Duell has written a tutorial describing a very simple
> technique for calibrating lens distortion parameters with hugin:
>
> http://hugin.sourceforge.net/tutorials/calibration/

Hi

Just a few comments on the tutorial...

I really like the simplicity of it and it taught me something new about
hugin (the straight line CPs, which I've always found missing ;-) )

The sample image (230.jpg) is a bit of a tricky one, because the one
linked on the site is a bit too small to place accurate marks on the
lines; the contrast is simply too low and the resolution too coarse. I
did my ultimate best (which may not be much ;-) and got:
a: 0.18826
b: -0.60268
c: 0.60079

which is quite a lot, and not the same as the tutorial values
a=0.00104, b=-0.00169 and c=0.00509

(I can't help thinking the tutorial might have been written with the
full image resolution rather than the file linked in the tutorial; a
lower resolution image with the original EXIF might give confusing
lens parameters.)

So if this is the problem, I'd suggest replacing the image with the
original or updating the tutorial with the right values for the sample
image. Better still would be using a different picture with more easily
visible lines...

The .pto file is attached...

Cheers

Simon

230-230.pto

Tduell

Jul 10, 2008, 8:42:58 PM
to hugin and other free panoramic software


On Jul 10, 11:44 pm, Simon Oosthoek <simon.oosth...@gmail.com> wrote:

>
> (I can't help thinking the tutorial might have been written with the
> full image resolution and not the file linked in the tutorial, a lower
> resolution with the original exif might give confusing lens parameters?)
>
> So if this is the problem, I'd suggest replacing the image with the
> original or updating the tutorial with the right values for the sample
> image. Better still would be using a different picture with more easily
> visible lines...

You are quite correct; the example image is considerably reduced in
scale compared to the one actually used for the tutorial.
That was done simply to reduce the size of the download.
I would suggest setting up your own string line arrangement and
working with that; you should see quite an improvement.
The aim of the tutorial is really just to show the way.

Cheers,
Terry

Tduell

Jul 10, 2008, 8:44:11 PM
to hugin and other free panoramic software


On Jul 10, 11:07 pm, "Ir. Hj. Othman bin Hj. Ahmad"
<othm...@lycos.com> wrote:
> I tried the tutorial on a Windows XP  Hugin v 0.7.0.3061 using the
> sample image.
>
> I notice that this technique is useful only for finding lens
> distortion parameters.
>
> The view(field of view or focal length) is to be extracted from the
> image file.

I think you need at least two images to determine FOV.

Cheers,
terry

Simon Oosthoek

Jul 11, 2008, 1:22:16 AM
to hugi...@googlegroups.com

Hi Terry

That's what I did (though I'm not sure I was precise enough), and this
is what I got for my 50mm lens on my Sony A100:
a: 0.03355
b: -0.13657
c: 0.16691

Still quite large values, aren't they? However, the resulting image
(in preview) doesn't look strangely distorted, so they can't be that
far off, I think...

For the tutorial, a quick note that the sample image is scaled and
will thus produce different values might be good.

Cheers

Simon

Tduell

Jul 11, 2008, 5:54:41 AM
to hugin and other free panoramic software


On Jul 11, 3:22 pm, Simon Oosthoek <simon.oosth...@gmail.com> wrote:

> that's what I did (though I'm not sure I was precise enough) and this is
> what I got for my 50mm lens on my sony a100:
> a: 0.03355
> b:-0.13657
> c: 0.16691
>
> Still quite large values, aren't they? However the resulting image (in
> preview) doesn't look strangely distorted, so they can't be that far off
> I think...

If you use a thin string and have an arrangement roughly as per the
tute, it should be relatively easy to be very precise, and I would
have guessed difficult to get a bad result... unless the string was
not taut and the lines didn't cross the areas of the image where the
distortion is greatest. I'm guessing a bit here.
They do look like largish values, but others may have a better feel
for that.
>
> For the tutorial a quick note that the sample image is scaled and thus
> will result in different values might be good....

OK.

Cheers,
Terry

Ir. Hj. Othman bin Hj. Ahmad

Jul 12, 2008, 9:55:18 AM
to hugin and other free panoramic software


On Jul 11, 1:22 pm, Simon Oosthoek <simon.oosth...@gmail.com> wrote:
> Tduell wrote:
>

...

> Hi Terry
>
> that's what I did (though I'm not sure I was precise enough) and this is
> what I got for my 50mm lens on my sony a100:
> a: 0.03355
> b:-0.13657
> c: 0.16691
>
> Still quite large values, aren't they? However the resulting image (in
> preview) doesn't look strangely distorted, so they can't be that far off
> I think...

I also got similarly large values, which I believe are not the true
values for the lens from the tutorial.

But PanoTools will give the correct parameters based on the control
points that we insert; too few points results in large deviations in
the resulting parameters.

PanoTools optimises the values of these distortion parameters to
minimise the errors in the placement of the control points.

Having starting values that match the actual values will help
PanoTools converge faster, but for best results it is best to
re-optimise these distortion parameters for each set of panoramic
pictures. There will be slight variations due to the way we/autopano
place control points.

As an experiment I used my mobile phone camera (a Motorola V3x) to
take a 360 degree panoramic picture. I was finally able to get a good
estimate for the FOV after more than four trials: since it is a 360
degree panorama, the FOV can be adjusted until the picture closes into
a full 360 degrees.

V3x: Motorola Lens
a=0.0823533
b=-0.222375
c=0.176544

Sony Dsc-s600 Carl Zeiss Tessar
a=-0.0081414
b=0.0305603
c=-0.0899093

Microtek MV300 camera: Chinese lens
a=0.0761687
b=-0.24371
c=0.220429

My conclusion has to be that the tutorial's lens calibration is not
useful for panoramic work: it is better to let PanoTools adjust these
parameters for each set of pictures, for the best panoramic stitching.

You can save the values for reference and quick lens setup. Based on
my experience with my Sony compact, using autopano-sift-c without
keypoint refining, the values don't vary much, so they are also a
useful indicator of the true distortion parameters for the lens.

Using just one picture for calibration may not be good enough for
consistent results.

Tom Sharpless

Jul 13, 2008, 2:33:57 PM
to hugin and other free panoramic software
Hi Terry,

I am puzzled about one point in the tutorial. You say the pano
projection has to be set to rectilinear, even when calibrating a
fisheye image. I don't understand why the output projection should
have any effect on the optimization. Can you explain?

Regards, Tom

Erik Krause

Jul 13, 2008, 3:17:41 PM
to hugin-ptx
On Sunday, July 13, 2008 at 11:33, Tom Sharpless wrote:

> I am puzzled about one point in the tutorial. You say the pano
> projection has to be set to rectilinear, even when calibrating a
> fisheye image. I don't understand why the output projection should
> have any effect on the optimization. Can you explain?

PanoTools treats line control points in output space. That is, with
straight line control points the line will only be straight in
rectilinear projection, which is the only projection that keeps lines
that are straight in reality straight in the image. Read
http://wiki.panotools.org/Lens_correction_model and particularly
http://wiki.panotools.org/Panotools_internals#Line_control_points for
details.
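For reference, the correction model on the wiki page linked above boils down to a radial polynomial; here is a minimal Python sketch (the normalisation of the radius and the d = 1 - a - b - c convention are the usual PanoTools ones, stated here from memory rather than from the source code):

```python
def correct_radius(r, a, b, c):
    """PanoTools-style radial correction: r_src = (a r^3 + b r^2 + c r + d) r,
    where r is the destination radius normalised so r = 1 at half the
    smaller image dimension, and d = 1 - a - b - c keeps r = 1 fixed."""
    d = 1.0 - a - b - c
    return (a * r**3 + b * r**2 + c * r + d) * r

# with Simon's a, b, c values from this thread the centre stays put and
# the normalisation radius maps (almost exactly) to itself:
print(correct_radius(0.0, 0.03355, -0.13657, 0.16691))  # 0.0
print(correct_radius(1.0, 0.03355, -0.13657, 0.16691))  # ~1.0
```

Because d absorbs the other three coefficients, a, b and c only reshape the image between centre and edge, which is why seemingly large values can still produce an image that "doesn't look strangely distorted".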

The idea of using string lines is clever, but one must keep in mind
that a string line is never exactly straight. Its own weight always
bends it down more or less; the string tension must be very high for
the string to be almost straight...
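How much the string sags can be estimated with the standard small-sag (parabolic) approximation; a quick sketch, where the 1 g/m string and 5 N tension figures are illustrative assumptions, not measurements:

```python
def midspan_sag(span_m, mass_per_m_kg, tension_n, g=9.81):
    """Small-sag (parabolic) approximation for a horizontal string:
    sag = w L^2 / (8 T), with w the weight per unit length in N/m."""
    w = mass_per_m_kg * g
    return w * span_m**2 / (8.0 * tension_n)

# a 2 m span of ~1 g/m string under 5 N of tension sags about a millimetre:
print(round(midspan_sag(2.0, 0.001, 5.0) * 1000, 2))  # ~0.98 (mm)
```

So with a light string pulled reasonably tight, the sag is on the order of a millimetre over a couple of metres, usually well below the placement accuracy of hand-set control points.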

best regards
Erik Krause
http://www.erik-krause.de

Tom Sharpless

Jul 13, 2008, 6:16:09 PM
to hugin and other free panoramic software
Thanks, Erik

I was supposing that straight line control points would be mapped to
the panosphere (where they are great circles), as I have seen several
papers on calibrating fisheye lenses that way. Maybe we need such a
thing in hugin.

And maybe also a different parameter set for fisheye lenses. The
notion of "ideal function plus correction polynomial" works fine for
normal lenses, where the ideal function R = tan(theta) is well defined
(R = radius in image / focal length; theta = view angle from the optic
axis). But fisheyes really have no "ideal function". Several that
have been characterized recently were well fit by functions such as
R = a * sin(b * theta) or even R = a * sin(b * theta) + c * tan(d *
theta). The first form is an equal-solid-angle projection when a = 2
and b = 0.5; however, Thoby found that the best fit for the excellent
Nikon 10.5 lens had roughly a = 1.5 and b = 0.7. The second form is a
mixture of that and a generalized rectilinear (c = d = 1) or
stereographic (c = 2, d = 0.5) curve. And it is well known that
fisheye calibrations with the PanoTools parameters are rather
ambiguous: many sets of a, b, c can match the same lens. So it
might clear up a lot of confusion if we used a better model for
fisheyes. I would propose using the mixed form with just three free
parameters, by insisting that a + c = 1 (and letting "focal length"
take up the difference).
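The candidate curves above are easy to compare numerically; a small sketch (the a = 1.5, b = 0.7 figures are the rough Thoby fit quoted above, not measured values):

```python
import math

def r_rectilinear(theta):
    # ideal rectilinear mapping: R = tan(theta), R in focal-length units
    return math.tan(theta)

def r_equisolid(theta):
    # equal-solid-angle fisheye: R = a sin(b theta) with a = 2, b = 0.5
    return 2.0 * math.sin(0.5 * theta)

def r_thoby(theta):
    # empirical R = a sin(b theta) with the rough Nikon 10.5 figures
    return 1.5 * math.sin(0.7 * theta)

for deg in (10, 40, 70):
    t = math.radians(deg)
    print(deg, round(r_rectilinear(t), 3), round(r_equisolid(t), 3),
          round(r_thoby(t), 3))
```

The three curves agree near the axis and diverge strongly toward the edge of the field, which is exactly where a polynomial correction bolted onto the wrong ideal function has to work hardest.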

Regards, Tom

Rik Littlefield

Jul 14, 2008, 12:05:19 AM
to hugi...@googlegroups.com
Erik Krause wrote:
> The idea of using string lines is clever, but one must keep in mind
> that a string line is never exactly straight. It's own weight always
> bends it down more or less. The string tension must be very high in
> order to be almost straight...
Try using a plumb line, hanging vertically, so the line will be
perfectly straight even if it is not very tight.

Take several pictures, rotating the camera so that the line appears in
numerous places within the frames, at different distances from the
center. It is not necessary to rotate around the NPP.

Then optimize a single set of lens parameters across all frames to make
all the lines straight, leaving pitch/roll/yaw locked at zero for all
frames.
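This procedure can be phrased as a small objective function; the following is a hedged sketch, assuming the PanoTools-style radial polynomial (d = 1 - a - b - c) and near-vertical plumb lines (so x is fit as a linear function of y); minimising it over one shared (a, b, c), e.g. with a generic optimizer, would be the "optimize a single set of lens parameters across all frames" step:

```python
import numpy as np

def undistort(pts, a, b, c):
    # PanoTools-style radial polynomial, pts given as (x, y) pairs
    # in normalised image coordinates centred on the image centre
    pts = np.asarray(pts, float)
    d = 1.0 - a - b - c
    r = np.hypot(pts[:, 0], pts[:, 1])
    return pts * (a * r**3 + b * r**2 + c * r + d)[:, None]

def straightness_cost(lines, a, b, c):
    """Sum of squared line-fit residuals over all marked plumb lines;
    zero when every undistorted line comes out perfectly straight."""
    cost = 0.0
    for pts in lines:
        q = undistort(pts, a, b, c)
        m, k = np.polyfit(q[:, 1], q[:, 0], 1)  # fit x = m*y + k
        cost += float(np.sum((q[:, 0] - (m * q[:, 1] + k)) ** 2))
    return cost

# a perfectly straight vertical line costs (near) zero with no correction:
line = [(0.3, y) for y in np.linspace(-0.5, 0.5, 11)]
print(straightness_cost([line], 0.0, 0.0, 0.0) < 1e-12)  # True
```

A real implementation would also fit lines at arbitrary angles (total least squares rather than polyfit), but the shape of the objective is the same.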

This method tested OK for me a couple of years ago, but I have not used
it a lot so I don't know what all can go wrong with it.

--Rik

Johannes Zander

Jul 14, 2008, 6:00:24 PM
to hugin and other free panoramic software
>> I haven't found support for straight line control points in the GUI. [..]
>
> They are there, but you need to know where to look ;-)

Ok, that was hidden really well (and it was even mentioned in the
tutorial -- the shame ...).

Maybe this feature is hidden too well, because I had even seen the
item in the menu while searching for straight-line support, but I
didn't realize what it was for. It differs fundamentally from the
other items in the menu: it is not a type that can be selected to
describe a point pair; instead it is a function that creates a new
entry in the menu.

The simplest thing that comes to my mind to make this a bit clearer
would be to add a divider line in the pop-up between the types and
this special item.

Cheers
Johannes

Tom Sharpless

Jul 14, 2008, 8:11:50 PM
to hugin and other free panoramic software
Right on, Rik

Rotating the camera to take multiple images of one straight line
target, then optimizing only the lens parameters, is a very clever
idea. You can cover the whole field of a fisheye lens without having
to build a circular target or worry about calibrated rotations.

And it is perfectly OK if all the calibration lines are parallel,
because what actually determines the calibration parameters is just
the radial distances of the control points from the projection center
= image center + (d,e). So the important thing is to cover the range
of possible radii as fully as possible. For the same reason, the lines
need not extend all the way across the image; a little past center on
one side and all the way out on the other is fine.
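The radial coverage described above is easy to check for a given set of control points; `radius_fraction` below is a hypothetical helper (the (d, e) centre shift follows the PanoTools convention):

```python
import math

def radius_fraction(x, y, width, height, d=0.0, e=0.0):
    """Distance of a control point from the projection centre (image
    centre shifted by (d, e)), as a fraction of the half-diagonal --
    the quantity that actually drives the radial calibration."""
    cx, cy = width / 2.0 + d, height / 2.0 + e
    return math.hypot(x - cx, y - cy) / math.hypot(width / 2.0, height / 2.0)

# a corner point sits at the full radius, the centre at zero:
print(radius_fraction(0, 0, 1000, 750))      # 1.0
print(radius_fraction(500, 375, 1000, 750))  # 0.0
```

Sorting these fractions for all control points quickly shows whether the set spans the range of radii or clusters near the centre.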

Cheers, Tom


Oskar Sander

Feb 25, 2009, 10:07:56 AM
to hugi...@googlegroups.com
Sorry for waking up an old thread.

I'd like to use this method to calibrate a UW camera set-up, that is,
a camera in a housing with a wide angle lens attached to the housing.
Obviously this set-up behaves very differently from the camera on
land, so I need to do the calibration in the pool, which means I need
to plan the exercise beforehand.

The pool has a lot of natural lines in its tiles; the question is
whether these would be good enough, or whether I must make my own
lines, using a little buoy with a weight and orange string, to get
enough accuracy.

The purpose of the calibration would be to be able to correct
individual images using fulla (and maybe directly in Lightroom
eventually), but also to have starting values when stitching linear
panoramas of wreck sites.


Cheers
Oskar


--
/O

michael crane

Feb 25, 2009, 10:40:56 AM
to hugi...@googlegroups.com
2009/2/25 Oskar Sander <oskar....@gmail.com>:

> the purpose of the calibration would be to be able to correct
> individual images using Fulla (and maybe directly in Lightroom
> eventually) but also to have a starting value when stitching linear
> panoramas of wreck sites

Why does it need to be underwater to calibrate?

regards

mick

Carl von Einem

Feb 25, 2009, 11:39:11 AM
to hugi...@googlegroups.com
Put an empty drinking glass on your kitchen table.

Fix a laser pointer so it aims from a higher point on one side of the
glass to the table surface on the other side of the glass. Then mark
the spot where the laser pointer hits the table.

Fill your glass with tap water and watch the beam being refracted at a
different angle when travelling from glass into water instead of glass
into air.

The same water-resistant lens has different values when used in
standard or underwater conditions.
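The kitchen experiment is just Snell's law; a quick sketch, using nominal refractive indices (~1.0 air, ~1.33 water, ~1.5 glass) as assumptions:

```python
import math

def refraction_angle(theta_deg, n1, n2):
    """Snell's law n1 sin(t1) = n2 sin(t2); refracted angle in degrees."""
    s = n1 * math.sin(math.radians(theta_deg)) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection")
    return math.degrees(math.asin(s))

# a ray meeting the port at 30 degrees bends differently depending on
# whether air or water sits in front of the glass:
print(round(refraction_angle(30.0, 1.0, 1.5), 1))   # 19.5 (air -> glass)
print(round(refraction_angle(30.0, 1.33, 1.5), 1))  # 26.3 (water -> glass)
```

Several degrees of difference at the first surface is why the same housed lens needs a separate underwater calibration.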

Carl

Mick Crane

Feb 25, 2009, 12:06:36 PM
to hugi...@googlegroups.com
I assumed refraction was the issue but wondered why the calibration
would be different underwater. It is light falling on the lens in both
cases.
Regards

Mick

Carl von Einem

Feb 25, 2009, 12:15:20 PM
to hugi...@googlegroups.com
Yes, but a different medium (water instead of air) is used at one
surface of the glass. That's where the angle of the light ray changes.

Carl

Pit Suetterlin

Feb 26, 2009, 4:03:57 AM
to hugi...@googlegroups.com

Hi Oskar,

I tried the several-lines method for calibrating my lens, and it indeed worked
quite nicely.

Oskar Sander wrote:
>
> sorry for waking up an old thread.
>
> I'd like to use this method to calibrate a UW camera set-up. That is,
> a camera in a housing with a wide angle lens attached on the housing.
> Obviously this set-up is very much different from the camera on land
> so I need to do it in the pool, so I need to plan the exercise in
> beforehand.
>
>
> The pool have a lot of natural lines to use in the tiles of the pool,
> the question is if these would be good enough, or if I must make my
> own lines using a little buoy with a weight and orange string to get
> enough accuracy?

My experience was that the lines from the spacings between tiles look
nice until you start using them for setting the control points. The
exact placement gets a bit uncertain, so you get higher noise, which
feeds back into the derived parameters unless you have many points.
I.e., if you use them, I'd make sure to take a lot of images with
lines going through the FOV at different positions and at different
angles. If I remember correctly I was using something like 10 images,
defining some 30 lines in there...

Pit

--
Dr. Peter "Pit" Suetterlin http://www.astro.su.se/~pit
Institute for Solar Physics
Tel.: +34 922 405 590 (Spain) P.Suet...@royac.iac.es
+46 8 5537 8534 (Sweden) Peter.Su...@astro.su.se

Tom Sharpless

Feb 26, 2009, 10:12:26 AM
to hugin and other free panoramic software
Hi Pit

On Feb 26, 4:03 am, Pit Suetterlin <P.Suetter...@royac.iac.es> wrote:
>   Hi Oskar,
>
> I tried the several-lines method for calibrating my lens, and it indeed worked
> quite nicely.
>
> Oskar Sander wrote:
>
> > sorry for waking up an old thread.
>
> > I'd like to use this method to calibrate a UW camera set-up.  That is,
> > a camera in a housing with a wide angle lens attached on the housing.
> > Obviously this set-up is very much different from the camera on land
> > so I need to do it in the pool, so I need to plan the exercise in
> > beforehand.
>
> > The pool have a lot of natural lines to use in the tiles of the pool,
> > the question is if these would be good enough, or if I must make my
> > own lines using a little buoy with a weight and orange string to get
> > enough accuracy?
>
> My experience was that the lines from the spacings between tiles look nice
> until you start and use them for setting the control points.  The exact
> placement gets a bit uncertain and thus you get a higher noise which falls
> back on the derived parameters unless you have many points.
> I.e., if you use them I'd make sure to take a lot of images with lines going
> through the FOV at different positions and in different angles.  If I remember
> correct I was using something like 10 images, defining some 30 lines in
> there...

Did your calibration image set cover 360 degrees?

It seems to me that if you use multiple images for calibration, then
the correct focal length or fov becomes a critical parameter (which is
not the case for single-image straight line calibration). And it is
known that the PT optimizer will almost always choose a wrong value
for that unless you force it to be right by insisting on closure of a
full circular image set.

But the straight line control point optimization (as currently
implemented) requires that the output projection be rectilinear. So
naively I would suppose that only a part of your straight line data
would be usable, unless the rectilinear error is computed separately
for each image, ignoring the rotation that aligns it on the
panosphere?

Some time ago I looked into the possibility of making libpano optimize
straight lines on the panosphere (where they become great circles)
instead of in a rectilinear projection. My hope was to better
support calibration of fisheye lenses, which is a continuing
problem. My code gave worse results than the existing method, so I
gave it up.

However I still think Hugin needs an easy and reliable way to do
straight line lens calibration. I believe that after many years of
using various calibration systems, photogrammetrists finally decided
the straight line method was best. And they often use naturally
occurring straight lines rather than special calibration rigs. The
key is software that can follow lines and estimate their positions to
subpixel accuracy. The raw image of a calibration line will in
general be curved, so a human has to designate which lines are
straight in reality -- but not set dozens of control points on them.

Maybe this would make a good GSOC project.

Regards, Tom

michael crane

Feb 26, 2009, 10:38:45 AM
to hugi...@googlegroups.com
2009/2/25 Carl von Einem <ca...@einem.net>:

>
> Yes, but a different medium (water instead of air) is used at one
> surface of the glass. That's where the angle of the light ray changes.

Oh you mean the lens is inside a box ?
I thought that the lens would be in the water.
regards

mick

Carl von Einem

Feb 26, 2009, 11:18:23 AM
to hugi...@googlegroups.com
The box would be the lens housing; the lens itself consists of a
certain number of glass elements with air between them. In most cases
there is a second box behind box #1: the camera body, also filled with
air. If the whole combination is surrounded by water it's either an
underwater camera or broken ;-) Note that even underwater cameras can
get very wet inside if not maintained properly.

Light rays (emitted by sun or underwater flash) are reflected by an
object and travel through water -> glass -> air (awfully simplified) to
the sensor/film surface.

A similar problem is IR photography: the focus index is calculated for a
certain bandwidth of light. Try to only use the IR spectrum and you will
notice that these rays also behave in a different fashion.

After rereading this I think it's a wonder that a photographer is
still able to focus on the main subject :-)

Cheers,
Carl

Pit Suetterlin

Feb 27, 2009, 4:10:55 AM
to hugi...@googlegroups.com

Hi Tom,

Tom Sharpless wrote:

> Did your calibration image set cover 360 degrees?

No, this was an a-b-c calibration only. I left the FOV fixed.

> It seems to me that if you use multiple images for calibration, then
> the correct focal length or fov becomes a critical parameter (which is
> not the case for single-image straight line calibration). And it is
> known that the PT optimizer will almost always choose a wrong value
> for that unless you force it to be right by insisting on closure of a
> full circular image set.

Hmm, good point. I didn't check if a (small) change in the FOV results in
different a-b-c, but it seems somewhat reasonable.

> But the straight line control point optimization (as currently
> implemented) requires that the output projection be rectilinear. So
> naively I would suppose that only a part of your straight line data
> would be usable, unless the rectilinear error is computed separately
> for each image, ignoring the rotation that aligns it on the
> panosphere?

Yes. I kept all images at yaw/pitch/roll=0 and thus had a maze of lines
within the FOV and requested each of the lines to be straight in the output...

> Some time ago I looked into the possibility of making libpano optimize
> straight lines on the panosphere (where they become great circles)
> instead of in a rectilinear projection.

Yes, I remember.

> My hope was to better support calibration of fish eye lenses, which is a
> continuing problem. My code gave worse results than the existing method, so
> I gave it up.

I guess the reason is that your method depends strongly on the correct
orientation of the image (y,p,r), doesn't it? So it would be a good
thing for panoramas in general, but maybe less so for calibration only?

> However I still think Hugin needs an easy and reliable way to do
> straight line lens calibration. I believe that after many years of
> using various calibration systems, photogrammetrists finally decided
> the straight line method was best. And they often use naturally
> occurring straight lines rather than special calibration rigs. The
> key is software that can follow lines and estimate their positions to
> subpixel accuracy. The raw image of a calibration line will in
> general be curved, so a human has to designate which lines are
> straight in reality -- but not set dozens of control points on them.

Hmm, indeed, an automatic line finding algorithm might be a handy
thing. Sounds like a job for Hough transforms or the like. But for
fisheye images that must really be an awkward job...
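The Hough idea can be sketched in a few lines of numpy; this is a toy accumulator over (rho, theta) bins, not production line-finding, and for fisheye images the "lines" would of course be curves, as noted:

```python
import numpy as np

def hough_accumulate(points, max_rho, n_theta=180, n_rho=200):
    """Vote for line parameters rho = x cos(theta) + y sin(theta):
    each edge point adds one vote per theta bin at its rho."""
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n_theta, endpoint=False)
    rhos = np.linspace(-max_rho, max_rho, n_rho)
    acc = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in points:
        r = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.digitize(r, rhos) - 1
        acc[idx, np.arange(n_theta)] += 1
    return acc, rhos, thetas

# a synthetic vertical edge x = 5 should peak near theta = 0, rho = 5:
pts = [(5, y) for y in range(20)]
acc, rhos, thetas = hough_accumulate(pts, max_rho=30)
i, j = np.unravel_index(acc.argmax(), acc.shape)
print(abs(thetas[j]) < 1e-9, abs(rhos[i] - 5) < 0.5)  # True True
```

Subpixel accuracy would then come from refining around the accumulator peak rather than from the coarse bins themselves.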

Cheers,
