The lens database problem


Robert

Mar 17, 2010, 10:32:03 AM
to hugin and other free panoramic software
Hi,

I am starting this discussion on hugin-ptx because I would like to
hear some opinions, and here I reach a wider audience. Hopefully this
is alright.

I have been thinking for quite some time now about the "lens database
problem". Well, there *is* the old PTLens database, but this is now
about 4 years old and not too useful anymore. Finding new calibration
parameters with Hugin and tca_correct is no witchery, but:

(1) There are no tools to maintain the PTLens database, so the user is
pretty much on his/her own.
(2) At least the available old PTLens library does not seem to support
CA correction.
(3) The license of the DB data is unclear (or could someone please
elaborate on that matter?).

Then there's lensfun. Andrew apparently saw the limitations of the old
PTLens DB, and tried to fix it. But the lensfun lib design has its
problems as well:

(1) Over-engineered: do we really need e.g. half a dozen Pentax mounts
which are all marked as 'compatible'? For all practical purposes,
different versions of the same mount do not have to be discerned by
software; it just makes things complicated.
(2) Modern cameras more or less invalidate the 'mount based' approach
of lensfun (see below).
(3) No maintenance tools either, same as with the old PTLens DB.

The problem with modern cameras is that they try to do lens correction
themselves. This is currently most prominent with (Micro) Four Thirds,
but it is a development more or less all manufacturers are working on
to different extents.

It makes a huge difference whether you use e.g. a Zuiko 14-42 MFT lens
(Olympus brand) on an Olympus or on a Panasonic MFT camera. The
Olympus E-PL1 will correct to a subtly different extent than e.g. the
Panasonic DMC-GH1, and some models will not correct at all. If the
lens firmware does not provide correction coefficients (or provides
them in a format the camera does not understand), the camera will not
correct. So even different lens generations can behave differently on
the same camera, even though they might be very similar from an
optical point of view. And if you use raw images, it depends on which
converter you use whether the result will be pre-corrected or not.

In-camera correction is usually not perfect, so one *can* further
improve things even in those pre-corrected images (provided one has
the correct coefficients). The fact remains that it becomes
increasingly difficult (read: nearly impossible) to treat lens and
camera as separate entities. In fact, PTLens never really has: AFAIK
it only knows "groups" of lenses, which are more or less identified as
camera models. The drawback of this approach is that it will not
usually automatically find correction data for an EOS-350 if e.g. the
correction data is only stored for an EOS-300 (at least if I
understood its behaviour correctly).

To add to the problems, there are multi-aspect sensors (currently
mainly Panasonic) and aspect-cropping in other camera models, e.g.
resulting in different diagonal crop factors for different aspects...
so there is not just 'data for camera model XYZ', but 'data for camera
model XYZ at specific settings'.

Despite its drawbacks, PTLens's approach allows it to quite easily
accommodate different behaviour of cameras at different settings: it
just provides several entries for the same camera, and if unable to
automatically identify the correct one it relies on the user to
choose. For the user this process is usually straightforward, and
actually I cannot think of a 'better' solution.

So my proposal is this: I will try to implement yet another re-design
of the PTLens DB. I am thinking of the following:

(1) Structure close to PTLens's DB, but including TCA parameters and
possibly a mechanism to properly identify 'variants' of cameras (i.e.,
different settings)
(2) SQLite based backend (a rough schema sketch follows below)
(3) Import/export tools to stay compatible with tools working with
PTLens's text files
(4) Extend the PTLens format to accommodate TCA parameters (hopefully
older software will just ignore those lines)
(5) Provide a PTLens-like frontend to (a) correct images directly if
wanted and, more importantly, (b) allow viewing/editing/adding
correction parameters
(6) Allow users to create and save their own lens data (possibly in
the home directory); tested/working data can be exported from there,
sent to the DB maintainer(s) and (hopefully easily) imported into the
'main' DB
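
To give a rough idea of the direction for (2) -- nothing final, all
table and column names below are just placeholders -- the
camera/lens/calibration split could look something like this,
sketched here with Python's sqlite3 module:

  import sqlite3

  con = sqlite3.connect("lensdb.sqlite")
  con.executescript("""
      CREATE TABLE camera (
          id          INTEGER PRIMARY KEY,
          maker       TEXT,
          model       TEXT,
          variant     TEXT,  -- e.g. a multi-aspect setting such as '3:2'
          crop_factor REAL
      );
      CREATE TABLE lens (
          id    INTEGER PRIMARY KEY,
          maker TEXT,
          model TEXT
      );
      -- one row per camera(-variant)/lens/focal length combination
      CREATE TABLE calibration (
          id           INTEGER PRIMARY KEY,
          camera_id    INTEGER REFERENCES camera(id),
          lens_id      INTEGER REFERENCES lens(id),
          focal_length REAL,
          a REAL, b REAL, c REAL,      -- PTLens distortion terms
          tca_red TEXT, tca_blue TEXT  -- TCA coefficient lists
      );
  """)
  con.commit()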

I cannot think of a way to implement these changes inside the lensfun
XML specification, so a re-design cannot be avoided (and in that case,
a clean-slate approach should be best, IMHO). Of course getting this
to work will take quite some time, but I have to design the DB
structure before I can actually get to work on it, which is why I am
in need of opinions on the matter. So what do you think?

Regards,
Robert

Bruno Postle

Mar 17, 2010, 4:36:49 PM
to hugin and other free panoramic software
On Wed 17-Mar-2010 at 07:32 -0700, Robert wrote:

>So my proposal is this: I will try to implement yet another re-design
>of the PTLens DB. I am thinking of the following:
>
>(1) Structure close to PTLens's DB, but including TCA parameters and
>possibly a mechanism to properly identify 'variants' of cameras (i.e.,
>different settings)
>(2) SQLite based backend
>(3) Import/export tools to stay compatible with tools working with
>PTLens's text files
>(4) Extend the PTLens format to accommodate TCA parameters (hopefully
>older software will just ignore those lines)
>(5) Provide a PTLens-like frontend to (a) correct images directly if
>wanted and more importantly (b) allow to view/edit/add correction
>parameters

My questions are: what is it for? And who is it for?

Who uses PTLens? Barrel distortion in single-shot photos never
bothered me enough to want to correct it - rather, when I do want to
give a photo this much attention I'm as likely to want to recenter
and rotate it at the same time - something that Hugin is well suited
for.

If you are stitching photos it doesn't make much sense to first
batch correct distortion in a separate tool, as this loses edge
pixels.

A lens database would be useful within a RAW converter: since the
converter is resampling the photo anyway, it makes sense to fix TCA
and distortion at the same time - i.e. a database that RAW converters
can use transparently to improve results would be a good thing.

You could describe Hugin as primarily a tool that understands the
differences between a photo and the reality it represents - being
able to stitch panoramas is a convenient result of that
understanding. So it is funny that Hugin doesn't have any
preconceived knowledge of real lenses and cameras; it generally has
to figure all this stuff out every time it runs - but we get by OK.

Having said that, Hugin would be better if it knew all the image
parameters upon loading a photo, not least because it could use this
to improve control point detection.

>(6) Allow users to create and save their own lens data (possibly in
>the home directory); tested/working data can be exported from there,
>sent to the DB maintainer(s) and (hopefully easily) imported into the
>'main' DB

This is the real problem that lensfun faced, who is going to
maintain this database? Somebody has to accept and validate
contributions and this is a _lot_ of tedious work.

One of the suggestions we had for lensfun was that Hugin could gain
an option to submit EXIF and corresponding calibration data every
time it does a stitch. This could be done via HTTP, and if it was
done REST-fully there would be no need for a database application to
receive the data; everything would be in the server logs and
extremely scalable.

This isn't as crazy as it sounds; the panotools a,b,c lens
parameters can't simply be averaged, but it seems that the lens
model that Tom Sharpless developed for calibrate_lens is suitable.

I.e. collect a lot of data and build the database statistically with
a minimum of human input; the shipped database would then represent
the cameras and lenses in use in the real world rather than just
those whose owners can be bothered to contribute.

--
Bruno

John McAllister

Mar 17, 2010, 6:40:20 PM
to hugi...@googlegroups.com
I think that it is worth noting that Canon bundles software that corrects distortion, vignetting and TCA - for their own recognised lenses.
 
I recently upgraded to a 500D from a 350D and, naturally, Adobe Camera Raw CS2 did not play well with the new raw .CR2 files.
 
When I looked at Canon's bundled apps (Digital Photo Professional), I was impressed with the raw facilities but also pleased to find that it did a very good job on the three basic corrections.
 
I don't want to push proprietary solutions but, if you've bought the kit, take a good look at what you've got.
 
You can sort out the basic corrections, properly and every time, before even thinking about lens.info files, vignetting correction (surely only a problem with low-light photography) and those rare occasions when chromatic aberration intrudes.
 
My experience of PTLens was that its corrections didn't seem to correspond to what I actually saw with my own equipment.
 
It is dead easy to work out your own lens distortion corrections.
Vignetting and chromatic aberration only cause problems in limited cases, and are easily dealt with by Hugin.
 
I'm not sure that a database would prove to be particularly useful.
 
John...

Lukáš Jirkovský

Mar 18, 2010, 4:44:56 AM
to hugi...@googlegroups.com
Hi Robert,

On 17 March 2010 15:32, Robert <robert...@googlemail.com> wrote:
> Hi,
>
> starting this discussion on hugin-ptx because I would like to hear
> some opinions, and here I reach a greater audience. Hopefully this is
> alright.
>
> I have been thinking for quite some time now about the "lens database
> problem". Well, there *is* the old PTLens database, but this is now
> about 4 years old and not too useful anymore. Finding new calibration
> parameters with Hugin and tca_correct is no witchery, but:
>
> (1) there are no tools to maintain the PTLens database, so the user is
> pretty much on his/her own

I think this is a really big problem with the old PTLens database (I
mean the one which is freely available on the internet). However,
with lensfun it's the same (the only difference is that it's updated
from time to time).

> (2) At least the available old PTLens library does not seem to support
> CA correction
> (3) The license of the DB data is unclear (or could someone please
> elaborate on that matter)

I don't remember very well, but I think it was something like "free to use".

> (1) Over-engineered: do we really need e.g. half a dozen Pentax mounts
> which all are marked as 'compatible'? For all purposes, different
> versions of the same mount do not have to be discerned by software, it
> just makes things complicated
> (2) Modern cameras more or less invalidate the 'mount based' approach
> of lensfun (see below)
> (3) no maintenance tools as well, same as the old PTLens DB
>

I never used lensfun, but it sounds horrifying.
Regarding Four Thirds: I saw images after doing lens correction in
the bundled software and they looked awful, because it only stretched
some parts of the image. IMO something more robust would be welcome.

>
> So my proposal is this: I will try to implement yet another re-design
> of the PTLens DB. I am thinking of the following:
>
> (1) Structure close to PTLens's DB, but including TCA parameters and
> possibly a mechanism to properly identify 'variants' of cameras (i.e.,
> different settings)
> (2) SQLite based backend
> (3) Import/export tools to stay compatible with tools working with
> PTLens's text files
> (4) Extend the PTLens format to accommodate TCA parameters (hopefully
> older software will just ignore those lines)
> (5) Provide a PTLens-like frontend to (a) correct images directly if
> wanted and more importantly (b) allow to view/edit/add correction
> parameters
> (6) Allow users to create and save their own lens data (possibly in
> the home directory); tested/working data can be exported from there,
> sent to the DB maintainer(s) and (hopefully easily) imported into the
> 'main' DB
>

I really like your idea. I already took a glance at your Photoropter
library and it looks really simple to use. If it is accompanied by a
good lens database and nice tools to update it, it will be awesome.

Robert

Mar 18, 2010, 5:27:01 AM
to hugin and other free panoramic software
On 17 Mar., 21:36, Bruno Postle <br...@postle.net> wrote:
> My questions are what is it for? and who is it for?

I am currently thinking of open source RAW converters mainly, but a
command line tool for batch processing should have its uses as well.
If Hugin can profit from lens data DB lookups, all the better, but
compared to the effort of setting up a proper stitching project,
manually loading lens data is not *that* big of a deal. Apart from
that, distortion data can be determined automatically by Hugin in many
cases anyway.

> Who uses PTLens? barrel distortion in single-shot photos never
> bothered me enough to want to correct it - Rather when I do want to
> give a photo this much attention I'm as likely to want to recenter
> and rotate it at the same time - Something that Hugin is well suited
> for.

Correct. However, considering modern 'hybrid' cameras (Micro Four
Thirds, Samsung NX etc.) with ultra-compact lens constructions,
distortion in raw images once again becomes a serious problem, e.g.:

http://www.dpreview.com/reviews/olympusep1/page22.asp

The commercial raw converters partially license the algorithms used by
the camera manufacturers to replicate in-camera correction, but that
is not always optimal (and the free projects --RawTherapee, Rawstudio,
UFRaw etc.-- do not even have that option).

> A lens database would be useful within a RAW converter, since the
> converter is resampling the photo it makes sense to fix tca and
> distortion at the same time - i.e. a database that RAW converters
> can use transparently to improve results would be a good thing.

Exactly (see above).

> You could describe Hugin as primarily a tool that understands the
> differences between a photo and the reality it represents - Being
> able to stitch panoramas is a convenient result of that
> understanding.  So it is funny that Hugin doesn't have any
> preconceived knowledge of real lenses and cameras, it generally has
> to figure all this stuff out every time it runs - But we get by ok.
>
> Having said that, Hugin would be better if it knew all the image
> parameters upon loading a photo, not least because it could use this
> to improve control point detection.

As I said, if my idea proves to be workable and Hugin can be adapted
to make use of it, all the better. But ironically, Hugin is the only
software at the moment that can be used to _determine_ correction
parameters -- maybe exactly because it does not assume anything about
the physics of a photograph.

> This is the real problem that lensfun faced, who is going to
> maintain this database?  Somebody has to accept and validate
> contributions and this is a _lot_ of tedious work.

The problem that lensfun faces is that it is neither easy nor
straightforward to create your own datasets. If it is easy enough,
quite a lot of people will probably even be happy to calibrate their
lenses themselves (provided they only have to do it *once*).
Concerning validation: if someone sends me data for a lens to include
in the database, I cannot validate it without having the lens and/or
several test shots (and it takes a lot of time, of course). So I can
only trust user contributions, but I do not see this as a real
problem. If a correction model is wrong, sooner or later there will
be a bug report on it. The crucial point is the existence of proper
DB maintenance tools.

> This isn't as crazy as it sounds, the panotools a,b,c lens
> parameters can't simply be averaged, but it seems that the lens
> model that Tom Sharpless developed for calibrate_lens is suitable.

I was unaware that calibrate_lens implements a new correction model as
well. I will have a look, not least because Photoropter will have to
'learn' about it...

> i.e. collect a lot of data and build the database statistically with
> a minimum of human input, the shipped database would then represent
> the cameras and lenses in use in the real world rather than those
> with owners who can be bothered to contribute.

I do not think that this will work. First of all, a *lot* of people
distrust software sending 'feedback' or the like to some servers
without intervention. Second, identifying a lens is not trivial and
very often requires at least some minimal amount of user input. Not to
mention adapted lenses (no EXIF data at all) and 'new' lenses that are
not yet present in the database. You might end up with an ugly mess
instead of a usable database.

Regards,
Robert

Robert

Mar 18, 2010, 5:35:17 AM
to hugin and other free panoramic software
On 17 Mar., 23:40, "John McAllister" <sp...@blueyonder.co.uk> wrote:

> I recently upgraded to a 500D from a 350D and, naturally, Adobe Camera Raw CS2 did not play with the new raw.CR2 files.
>
> When I looked at Canon's bundled apps (Digital Photo Professional), I was impressed with the raw facilities but also pleased to find that it did a very good job on the three basic corrections.

I find it rather curious that you actually seem to advise against
implementing a feature in open source software on the grounds of
closed-source software already providing it... That said, there are
lots of raw converters out there, and not all of them implement the
manufacturers' default lens corrections. E.g. I use Olympus myself,
and their 'Studio' converter actually corrects distortion quite
nicely. But it has other limitations, because of which I hardly use
it nowadays. And e.g. Bibble uses the PTLens DB and engine, AFAIK.

> I'm not sure that a database would prove to be particularly useful.

Not all people use the same equipment, and for some combinations, lens
db lookup in open-source software will be a greater asset than for
others.

Regards,
Robert

Robert

Mar 18, 2010, 7:49:50 AM
to hugin and other free panoramic software
> I never used lensfun, but it sounds horrifying.

Let's just say I dislike lensfun's API, which is the reason for the
start of Photoropter.

> Regarding Four Thirds: I saw images after doing lens correction in
> the bundled software and they looked awful, because it only stretched
> some parts of the image. IMO something more robust would be welcome.

The problem with Four Thirds is mainly aspect ratios. For example,
Olympus's models use 4:3, while Panasonic's are multi-aspect (4:3,
3:2, 16:9). 3:2 on a Panasonic GH1 has crop factor 2.0, but if you
crop to 3:2 on an Olympus model then the crop factor changes from
2.00 to 2.08...
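
To make that number explicit (back-of-the-envelope, using the nominal
17.3x13.0 mm Four Thirds sensor and the ~43.3 mm diagonal of the 35mm
frame):

  4:3 diagonal: sqrt(17.3^2 + 13.0^2) = 21.6 mm   =>  43.3/21.6 = 2.00
  3:2 crop:     17.3 x 11.5 mm, diagonal 20.8 mm  =>  43.3/20.8 = 2.08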

The PTLens normalised coordinate system does not deal with this, and
just applying correction parameters from e.g. a Nikon and
compensating for the crop factor 1.5->2.0 is _not_ sufficient.
lensfun cannot deal with this, as it has no API for it. Photoropter
however _can_, provided you tell it the 'parameter aspect' as well as
the crop factor.

> I really like you idea. I already took a glance at your photoropter
> library and it looks really simple to use. If it is accompanied with a
> good lens library and nice tools to update library it will be awesome.

Let's see how far I get. I have quite a few ideas on the database
structure; I will try to write them down in Photoropter's
technical/background documentation for the next release. I will
probably also write some documentation on the PTLens file format.

Regards,
Robert

Bruno Postle

Mar 18, 2010, 7:55:42 AM
to hugi...@googlegroups.com
On 18 March 2010 09:27, Robert <robert...@googlemail.com> wrote:

[snip collecting Hugin calibration data automatically]

> I do not think that this will work. First of all, a *lot* of people
> distrust software sending 'feedback' or the like to some servers
> without intervention.

It would be disabled by default. I'd suggest a checkbox in the pop-up
window that asks "You have a good fit, apply?"; the checkbox would
say "Submit good calibration data to the Photoropter database?"

> Second, identifying a lens is not trivial and
> very often requires at least some minimal amount of user input. Not to
> mention adapted lenses (no EXIF data at all) and 'new' lenses that are
> not yet present in the database.

If the metadata isn't sufficient to identify a lens/camera combination
then by definition automatic lens correction isn't going to work.
Ignore these photos and let the users cope with the problem as they do
already - concentrate on problems that can be solved.

--
Bruno

Robert

Mar 18, 2010, 8:02:02 AM
to hugin and other free panoramic software
On 18 Mar., 10:27, Robert <robert.fe...@googlemail.com> wrote:
> I was unaware that calibrate_lens implements a new correction model as
> well. I will have a look, not least because Photoropter will have to
> 'learn' about it...

On that note, I have taken a look inside 'calibrate_lens' and would
like to ask some questions... first of all: what. the. heck?

Okay, I understand that apparently a spline function is used, and the
output of calibrate_lens is currently just spline coefficients. But
where exactly is the connection to the (u,v) parameters Tom describes
in lensFunc.h for his generic lens model? That model makes sense, and
is in fact quite an impressive idea. But there is no further
explanation.

So, how do those 7 spline parameters correspond to the different lens
geometries? If I want to implement the model in Photoropter, I have to
know what is supposed to be the actual "model parameter set". If I
start calibrate_lens with e.g. "-l 2" (equidistant fish), does this
just set up different start values for the Levenberg-Marquardt
optimiser, or what does it do?

Regards,
Robert

Tom Sharpless

Mar 18, 2010, 12:55:43 PM
to hugin and other free panoramic software
Hi All

Bruno speaks my mind.

The most important problems with a lens-and-camera database for Hugin
are
1) designing it to support the real needs of panotools users
2) getting reliable data on actual equipment
3) maintaining and distributing the data.

1) Although such a db must support the way Hugin/panotools works now,
designing it to only do that would be fatal. It is a chicken-and-egg
situation: panotools can't do better without a db, and a db alone
can't make that happen. The PT lens/camera model is in serious need
of improvements, which must be co-developed with a database that
supports them. But there are significant technical as well as
operational problems with separating lens calibration from the
present scheme, in which all sources of control point error are
folded into a single overall optimization. And some attention must be
given to the problem of modeling the projection functions of
'non-rectilinear' lenses.

2) means not only mining camera mfgrs' specs, but providing tools by
which a photographer can calibrate his own lenses. Yes, I mean the
database is no good without a fisheye calibration procedure anyone can
use.

3) Much of the info -- especially on sensors, mfgrs' codes, etc. --
should be kept up to date by a dedicated central agency, and updates
must be easy to get hold of. Bruno's idea of an online query facility
is not far-fetched, but the now-traditional automatic update of a
local db off the internet is essential. That process needs to
preserve any local values set by the photographer, and should be
smart enough to load only info on equipment the photographer claims
to have.

All that said, I wish Robert good luck and success on this
undertaking, and hope it really happens.

Cheers, Tom

Tom Sharpless

Mar 18, 2010, 1:43:36 PM
to hugin and other free panoramic software
Hi Robert

calibrate_lens is part of an R&D project that was never completed.
Although I contributed a lot of ideas and code, I'm not in a position
to say whether the program as it now stands makes any kind of sense,
and if so, what.

The piece called LensFunc is a prototype for a possible addition to
libpano that would support both calibrating lenses and using
calibrated lenses in stitching. It contains a basic model for camera
and lens parameters but no specific lens projection curve model. The
cubic spline is just a handy way to represent smooth curves, used to
make it possible to supply 'forward' and 'inverse' functions for any
monotonic radial projection curve. The actual curves would be
generated by optimizing the parameters of functions specifically
designed as models for particular classes of lenses. Those might not
be easily invertible; the spline provides a way of inverting them
numerically -- provided they are monotonic. When I last worked on
this I was having trouble using PT's 4th order polynomial in the model
functions, because the optimizer too often generated non-monotonic
curves. So any further development will need to use a better-behaved
model function for fisheyes. The standard tangent curve plus 4th
order polynomial is a perfectly usable model for normal lenses, but
something else is needed for fisheye lenses.
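
(To illustrate the inversion idea with a toy sketch -- this is *not*
the LensFunc code, just the principle: any monotonic radial function
can be inverted numerically, the spline merely makes that cheap. In
Python, using simple bisection:)

  import math

  def invert(f, y, lo=0.0, hi=3.0, tol=1e-9):
      # bisection: find x with f(x) = y, f assumed monotonic increasing
      while hi - lo > tol:
          mid = 0.5 * (lo + hi)
          if f(mid) < y:
              lo = mid
          else:
              hi = mid
      return 0.5 * (lo + hi)

  # e.g. invert a stereographic curve r = 2*tan(theta/2) at r = 0.9
  theta = invert(lambda t: 2.0 * math.tan(t / 2.0), 0.9)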

The generic fisheye (u,v) lens model is based on the idea of modeling
lens projections as linear combinations of some of the ideal
functions that fisheyes often approximate. I think I used the
stereographic, equal-angle, and equal-area curves there, so (u,v) span
a triangular space, something like the ones used in color mapping.
More recently I have had fairly good luck modeling fisheye lenses with
a mixture of just the stereographic and equal-area functions; however
it might be necessary to add a 'french curve'-type correction,
controlled by one or two additional parameters, to get really good
fits -- and I have not yet found a suitable one (i.e. one that can't
produce non-monotonic curves).

regards, Tom

Aleksandr Milewski

Mar 18, 2010, 2:49:47 PM
to hugi...@googlegroups.com
On 3/18/10 9:55 AM, Tom Sharpless wrote:
> Hi All

> 2) means not only mining camera mfgrs' specs, but providing tools by
> which a photographer can calibrate his own lenses. Yes, I mean the
> database is no good without a fisheye calibration procedure anyone can
> use.

Related, I'm in the midst of hacking up some scripts for my own use for
populating the EXIF data for manual focus lenses on various cameras.

(This effort is mostly non-pano related, though many of our inexpensive
fisheyes fall into this category.)

I'd love to be able to pull general lens metadata from a db like
this, both so my asset manager (Aperture, in my case) can see it, and
so that running these images through a tool that uses this db can
find the right lens corrections.

The next obvious step would be to support using EXIF (MakerNote?) or XMP
to store the correction data in/with the file itself...

-Zandr

Bruno Postle

Mar 18, 2010, 7:25:49 PM
to hugin and other free panoramic software
On Thu 18-Mar-2010 at 09:55 -0700, Tom Sharpless wrote:
>
>3) Much of the info -- especially on sensors, mfgr's codes, etc. --
>should be kept up to date by a dedicated central agency, and updates
>must be easy to get hold of. Bruno's idea of an online query facility
>is not farfetched, but the now traditional automatic update of a local
>db off the internet is essential.

Actually I'm suggesting the opposite: we could harvest the results
of thousands of people stitching panoramas and use them to build the
database. For this we need some way to reasonably 'average' lens
parameters; I suggested an internal conversion from the panotools
polynomial to a spline model would work for averaging.

--
Bruno

Lukáš Jirkovský

Mar 19, 2010, 4:31:50 AM
to hugi...@googlegroups.com

That sounds nice. If I got it right, you mean that the user would
send the information to the database and the database would publish
the parameters which are the most frequent for a given lens. I like
that idea because it doesn't require anyone to check the parameters.

Lukas

John McAllister

Mar 19, 2010, 5:27:36 AM
to hugi...@googlegroups.com
Remember, part of the information that any database would require is the resolution of the sensor, as the distortion parameters relate to the pixel dimensions of the image.
 
John

Robert

Mar 19, 2010, 5:55:51 AM
to hugin and other free panoramic software

On 19 Mar., 10:27, "John McAllister" <sp...@blueyonder.co.uk> wrote:
> Remember, part of the information that any database would require would be the resolution of the sensor, as the distortion parameters relate to the pixel dimensions of the image.

Not exactly: in the current normalised coordinate system, one mainly
needs the crop factor. To convert distortion data between different
cameras, one needs both crop factors (and aspect ratios if they
differ).

However, the DB would need to store at least *some* information on the
camera, that's right, and that's definitely not something that can be
just averaged.

Regards,
Robert

Bruno Postle

Mar 21, 2010, 6:58:20 AM
to hugin and other free panoramic software
On Wed 17-Mar-2010 at 20:36 +0000, Bruno Postle wrote:
>
>One of the suggestions we had for lensfun was that Hugin could gain
>an option to submit EXIF and corresponding calibration data every
>time it does a stitch. This could be done via HTTP, and if it was
>done REST-fully there would be no need for a database application to
>receive the data, everything would be in the server logs and
>extremely scalable.

So here is a bit of code as a test. It needs Panotools::Script 0.24
and a couple of standard perl modules.

What it does is read a .pto project, find the first lens, gather
some lens and EXIF data, and submit it via HTTP (currently to my
home server, but this is configurable).

Run it like this:

lens-submit someproject.pto

..or

lens-submit *.pto

...or even this if you like:

find . -name "*.pto" -exec lens-submit '{}' \;

The data on the server looks like this; it is quite anonymous
(except the IP address, which can be stripped):

192.168.1.99 - - [21/Mar/2010:00:45:58 +0000] "GET /?w=3968&a=0&Rd=0&d=0&Vx=0&Rb=0&h=2232&Vy=0&g=0&f=0&t=0&e=0&Vc=0&Va=1&Ra=0&Vb=0&v=33.4&c=0&Rc=0&b=-0.000199020908138464&Vd=0&Re=0&FocalLengthIn35mmFormat=60&ImageWidth=3968&ScaleFactor35efl=4.6875&ColorSpace=1&Model=DMC-LX3&ResolutionUnit=2&YResolution=180&FileType=JPEG&FOV=33.3985166785074&FNumber=2.8&ImageHeight=2232&FocalLength=12.8&Software=Ver.1.3++&XResolution=180&Make=Panasonic&ExifVersion=0221 HTTP/1.1" 200 - "-" "lens-submit/0.25"
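
Getting the fields back out of such a line is trivial; for example, a
quick Python sketch (run against a shortened copy of the line above):

  from urllib.parse import urlparse, parse_qs

  line = ('192.168.1.99 - - [21/Mar/2010:00:45:58 +0000] '
          '"GET /?w=3968&b=-0.000199020908138464&v=33.4'
          '&Model=DMC-LX3&Make=Panasonic HTTP/1.1" '
          '200 - "-" "lens-submit/0.25"')

  request = line.split('"')[1]                # 'GET /?... HTTP/1.1'
  query = urlparse(request.split()[1]).query  # the ?key=value part
  record = {k: v[0] for k, v in parse_qs(query).items()}
  print(record["Make"], record["Model"], record["b"], record["v"])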

The difficult bit will be to write a tool that does something useful
with this information.

--
Bruno

lens-submit

Robert

Mar 21, 2010, 4:56:26 PM
to hugin and other free panoramic software
> The difficult bit will be to write a tool that does something useful
> with this information.

Yes, that's the crucial point, I'm afraid. Just having Hugin submit
data is not much use before it is decided what shall be done with
that information. However, the more I think about it, the more a
CDDB-like online database could actually work. There will be quite a
few wrinkles to be ironed out, but we'll see. At the moment I am
thinking of writing a small GUI that hopefully can act as a testbed
in the not-too-far future. Since I am doing this in my spare time, it
will however take some time until it does something "really useful".

Until then, I'll have to think about the generic lens model. I am not
convinced that generic spline parameters are the way to go, and it is
not guaranteed that one can actually just 'average' them and hope to
achieve something meaningful. The beauty of the original model by Tom
is that it just uses two parameters (u,v) to describe more or less all
fisheye cases and the rectilinear one in a generic and exact way, thus
providing something conceptually close to a lens model 'Hilbert
space' (but unfortunately non-linear).

From a physical point of view, fitting those two model parameters
and, on top of that, a (hopefully small) PTLens correction seems much
more sound to me. If the model fit is good enough, a PTLens
polynomial should be quite 'well-behaved', making fitting stable.
However, both model and PTLens correction are by definition
non-linear (making simple averaging of data impossible). Let me think
about it, maybe I'll have an inspiration.

Regards,
Robert

Robert

Mar 22, 2010, 11:58:19 AM
to hugin and other free panoramic software
Hi,

I've been thinking a bit more about modelling fisheye lenses in Hugin
and other tools. The universal model Tom describes in the sources of
calibrate_lens is not a linear combination of stereographic, equisolid
and equidistant models, but is rather based upon the following idea:

r(theta)=p*kt*tan(theta/kt)+(1-p)*ks*sin(theta/ks).

This model is intriguing in that it shows the close relationship
between the various mapping functions. It also includes the
stereographic (p=1, kt=2), equisolid (p=0, ks=2), orthographic (p=0,
ks=1), equidistant (p=1, kt=infinity) and rectilinear (p=1, kt=1)
cases. The (u,v) mapping maps the three parameters to a triangular
space, since one does not really have three degrees of freedom but
only two. So far it is very impressive. However, it is of course
impractical, since the model is non-linear in its parameters.
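
To make the special cases concrete, a quick numerical check (Python,
with r in units of the focal length):

  import math

  def r(theta, p, kt=2.0, ks=2.0):
      # r(theta) = p*kt*tan(theta/kt) + (1-p)*ks*sin(theta/ks)
      return (p * kt * math.tan(theta / kt)
              + (1.0 - p) * ks * math.sin(theta / ks))

  t = math.radians(60)
  print(r(t, p=1, kt=2))       # stereographic: 2*tan(theta/2)
  print(r(t, p=0, ks=2))       # equisolid:     2*sin(theta/2)
  print(r(t, p=0, ks=1))       # orthographic:  sin(theta)
  print(r(t, p=1, kt=1))       # rectilinear:   tan(theta)
  print(r(t, p=1, kt=1e9), t)  # kt -> infinity: equidistant, r -> theta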

The current approach of Hugin is to map an arbitrary fisheye to the
equidistant case. This actually works not too badly, although Tom is
right about the 4th order correction polynomial that maps the input
data to the equidistant case being non-monotonic. However, this
should not matter too much, since one can IMHO get by without having
to invert it. The approach shows the greatest problems when trying to
map orthographic and stereographic fishes to the equidistant case.
The current approach also has the problem that it sets d=1-(a+b+c),
which changes the apparent focal length of the lens.
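
(For reference, the polynomial in question -- a minimal Python sketch
of the PanoTools/PTLens radial correction, which also shows why d != 1
rescales the image:)

  def ptlens(r, a, b, c, d=None):
      # PanoTools radial correction: r_src = r * (a*r^3 + b*r^2 + c*r + d)
      if d is None:
          d = 1.0 - (a + b + c)  # Hugin's default: keeps r = 1 fixed
      return r * (a * r**3 + b * r**2 + c * r + d)

  # near the centre r_src ~ d*r, so any d != 1 scales the whole image,
  # i.e. changes the apparent focal length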

The orthographic case is special in that it is limited to theta < pi/2
(as is the gnomonic/rectilinear case). Because of that, it deserves
special treatment; however, to my knowledge only one fisheye has ever
been built that actually implements that mapping. The stereographic
mapping diverges for theta->pi, so it shows rather large differences
to the equidistant/equisolid cases. Because of that, a polynomial
mapping does not work too well here, meaning that this mapping should
be treated as special, too.

To make a long story short: I would propose the following:

(1) Hugin support not only for equidistant, but also for the other
three 'standard mappings' on the input side
(2) Either expose the 'd' parameter of the PTLens model to the UI or
lock it to 1.0 for fisheye cases

The choice of which model is 'best' for a given lens can be left to
the user, or it can be decided in an external lens calibration
application (I am currently thinking about writing one for
Photoropter anyway). E.g. the Samyang 8mm is not really stereographic
but AFAIK rather implements 3f*tan(theta/3), and the Peleng implements
3f*sin(theta/3); however, the necessary corrections are rather small
and can be quite easily accomplished using the PTLens model if one
checks in a first step which model shows the smallest deviations.

Therefore, the only parameters that would need to be averaged if
different people submit calibration data would be the PTLens
parameters. If someone used a different lens geometry, the server can
either drop the submitted data or create an alternative batch of
parameter sets. However, for any given geometry, the PTLens correction
is linear in all its parameters, making it easy to determine the
average curve (by averaging the parameters).
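
In the simplest case, the server-side 'averaging' for one
lens/geometry could then be as dumb as this (Python sketch, with
made-up parameter values):

  # hypothetical submitted (a, b, c) sets for one lens at one geometry
  submissions = [
      (-0.012, 0.031, -0.050),
      (-0.010, 0.028, -0.047),
      (-0.011, 0.030, -0.052),
  ]
  n = len(submissions)
  a, b, c = (sum(col) / n for col in zip(*submissions))
  d = 1.0 - (a + b + c)  # the usual normalisation
  print(a, b, c, d)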

Regards,
Robert

dmg

Mar 22, 2010, 12:04:27 PM
to hugin-ptx
>
> To make a long story short: I would propose the following:
>
> (1) Hugin support not only for equidistant, but also for the other
> three 'standard mappings' on the input side
> (2) Either expose the 'd' parameter of the PTLens model to the UI or
> lock it to 1.0 for fisheye cases
>

It is not difficult. Start by creating a new input projection in
libpano. Once that is done, it will be relatively trivial to add
support in hugin. The user can then choose the model that best suits
the lens.

--dmg

> The choice which model is 'best' for a given lens can be left to the
> user to decide, or it can be decided in an external lens calibration
> application (I am currently thinking about writing one for
> Photoropter, anyway). E.g. the Samyang 8mm is not really stereographic
> but AFAIK rather implements 3f*tan(theta/3), and the Peleng implements
> 3f*sin(theta/3); however, the necessary corrections are rather small
> and can be quite easily accomplished using the PTLens model if one
> checks in a first step which model shows the smallest deviations.
>
> Therefore, the only parameters that would need to be averaged if
> different people submit calibration data would be the PTLens
> parameters. If someone used a different lens geometry, the server can
> either drop the submitted data or create an alternative batch of
> parameter sets. However, for any given geometry, the PTLens correction
> is linear in all its parameters, making it easy to determine the
> average courve (by averaging the parameters).
>
> Regards,
> Robert
>


--
--dmg

---
Daniel M. German
http://turingmachine.org

Bruno Postle

Mar 22, 2010, 6:17:00 PM
to hugin-ptx
On Mon 22-Mar-2010 at 09:04 -0700, Daniel M. German wrote:
>>
>> To make a long story short: I would propose the following:
>>
>> (1) Hugin support not only for equidistant, but also for the other
>> three 'standard mappings' on the input side

>> (2) Either expose the 'd' parameter of the PTLens model to the UI or
>> lock it to 1.0 for fisheye cases

This would be a major change and would break existing Hugin project
files without any obvious benefit for the average user.

> It is not difficult. Start by creating a new input projection in
> libpano. Once that is done, it will be relatively trivial to add
> support in hugin. The user can then choose the model that best
> suits the lens.

I'm pretty sure that more fisheye input models have already been
added to libpano13; from the docs for input formats:

# The 'o' lines describe input images. One line per image is
# required
# The width and height of the image is obtained from image
# f0 projection format,
# 7 - Mirror
# 8 - Fisheye Orthographic
# 10 - Fisheye Stereographic
# 19 - Fisheye Equisolid

--
Bruno

Robert

Mar 22, 2010, 7:40:12 PM
to hugin and other free panoramic software
On 22 Mar., 23:17, Bruno Postle <br...@postle.net> wrote:
> >> (1) Hugin support not only for equidistant, but also for the other
> >> three 'standard mappings' on the input side
> >> (2) Either expose the 'd' parameter of the PTLens model to the UI or
> >> lock it to 1.0 for fisheye cases
>
> This would be a major change and would break existing Hugin project
> files without any obvious benefit for the average user.

Would you care to elaborate on how adding support for additional
fisheye geometries would break existing project files? Exposing the
'd' parameter would need an additional configuration option (i.e.,
'Auto-calculate d for fixed image height (default)' or similar), but
if done sensibly this also would not break anything. In fact, my
proposal was made exactly with continuity/consistency in mind, as
this is a change that need not break anything. On the other hand,
alternative ideas like using a generic spline curve (proposed by
yourself, by the way) *would* certainly break things, and I doubt
they would be vastly superior in terms of results.

As to the point of 'no obvious benefit to the average user', I find it
interesting how you define 'average' and 'no obvious benefit'. To say
something is not needed/wanted by users even if one of those users
just requested it is a rather fascinating concept.

The support for fisheye lenses in Hugin quite frankly sucks at the
moment, and this is something that you can hear/read from several
people; it's not just me. Especially emulating stereographic fishes
(e.g. the Samyang 8mm) using the current mechanism is a big pain, and
the results leave quite a lot to be desired (deviations using
equidistant+PTLens can still get as high as 1 percent near the image
border, that's 20-30 pixels...). The possibility to fix the 'd' param
at 1.0 (as it is the only correct value if the focal length is known)
and to use a more adequate model would both hugely improve things.

Considering that libpano13 already seems to support all the standard
geometries, the current situation is getting not only annoying but
rather ridiculous. Or are you saying that nobody is using these
lenses? Sorry, that would be like saying nobody is using the Peleng
8mm. Not everyone is using Sigma or Nikkor fishes for 1+ kEUR apiece.
If you were saying (more or less) nobody is using orthographic
fishes, then I would agree, considering that the only lens ever made
was the Nikkor 10mm OP, and that has been unavailable for nearly 35
years...

Just my $0.02.

Regards,
Robert

Felix Hagemann

Mar 23, 2010, 5:34:33 PM
to hugi...@googlegroups.com
On 21 March 2010 11:58, Bruno Postle <br...@postle.net> wrote:
> So here is a bit of code as a test.  It needs Panotools::Script 0.24 and a
> couple of standard perl modules.
>
> What it does is to read a .pto project, find the first lens, gather some
> lens and EXIF data, and submits it via HTTP (currently to my home server,

Wow, that's an extremely interesting concept. Are you interested in
receiving submissions now? Should these only be high-quality projects
(e.g. only with a pano head, not handheld)? There are about 150
project files on my hard disk waiting to be submitted.

> The difficult bit will be to write a tool that does something useful with
> this information.

As a first step it will be very interesting to look into the data and
examine the scatter of the correction curves that the submitted
parameters produce for one lens + camera combination. I'd be happy to
do that; unfortunately I just can't be sure when I will have the
necessary time available.

Felix

Bruno Postle

Mar 23, 2010, 8:08:54 PM
to hugi...@googlegroups.com
On 23 March 2010 17:34, Felix Hagemann <felix.h...@gmail.com> wrote:
> On 21 March 2010 11:58, Bruno Postle <br...@postle.net> wrote:
>>
>> What it does is to read a .pto project, find the first lens, gather some
>> lens and EXIF data, and submits it via HTTP (currently to my home server,
>
> Wow, that's an extremely interesting concept. Are you interested in
> receiving submissions now? Should these only be high quality projects
> (e.g. only with pano head not handheld)? There about 150 project files
> on my harddisk waiting to be submitted.

I think any use of this data would have to be able to cope with
variable quality, so go ahead and submit. The only way to find out if
the approach is going to work is with some real data.

There is no way this is going to overload the server, so anyone can
feel free to run lens-submit as much as they like.

>> The difficult bit will be to write a tool that does something useful with
>> this information.
>
> As a first step it will be very interesting to look into the data and
> look at the scatter of the correction curves that the submitted
> parameters produce for one lens + camera combination. I'd be happy to
> do that, unfortunately I just can't be sure when I will have the
> necessary time available.

Me too, but hopefully somebody will get a chance to do something with it.

--
Bruno
