Hi,
I recently started doing photo stitching with Hugin and I love it. You
guys really made the learning curve shallow, while also providing all
the command-line tools to run things in bash scripts or Makefiles for
'pro' work. Nice!
I have a question that came up while experimenting with HDR panoramas
(I like to do night panoramas:
https://plus.google.com/113917390722624035964/posts/FsNq9ECfpM3 ). For
these 360/180 panoramas with multiple exposures, I commonly have over
100 pictures to stitch (I don't have a fisheye yet). It
takes Hugin a long time to match up the adjacent pictures, because it
compares each picture with every other picture looking for matching
control points. That is an O(n^2) operation, which takes forever (I
didn't actually time it because I walked away from the computer, but
one hour is probably the right order of magnitude).
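To make the O(n^2) point concrete, here is a toy Python sketch (my own, nothing to do with Hugin's internals) of how many image pairs an all-against-all control-point search has to examine:

```python
def pairs(n):
    # every unordered pair of distinct pictures: n choose 2
    return n * (n - 1) // 2

print(pairs(10))   # a 10-shot panorama: 45 pair comparisons
print(pairs(100))  # a 100-shot panorama: 4950 pair comparisons
```

So going from 10 to 100 pictures multiplies the matching work by more than 100x, which matches what I'm seeing.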
Is it possible to tell Hugin in advance which pictures are adjacent
(or partially overlapping)? That way, it would only need to match
control points between those adjacent pictures. So essentially, for
each picture, tell it which one is to its right, left, above, and
below, and which ones mostly overlap with it (e.g. the stacked
exposures of an HDR bracket).
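Here's a hypothetical sketch of what I mean (the data structure and names are purely my own invention, not any real Hugin interface): given a neighbour list per picture, the matcher would only have to look at those pairs.

```python
# picture index -> pictures known to overlap it (right/left/up/down
# plus the other exposures of the same HDR stack)
neighbours = {
    0: [1, 4, 8],
    1: [0, 2, 5, 9],
    2: [1, 3, 6],
}

def candidate_pairs(neighbours):
    # collect each overlapping pair exactly once
    pairs = set()
    for img, adj in neighbours.items():
        for other in adj:
            pairs.add((min(img, other), max(img, other)))
    return sorted(pairs)

# the pair list now grows roughly linearly with the number of
# pictures, instead of quadratically
print(candidate_pairs(neighbours))
```

With a fixed number of neighbours per shot this would be O(n) pairs instead of O(n^2).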
I presume this should be possible by telling it the pan and tilt
angles, ideally directly in the EXIF or XMP metadata. I plan to build
a motorized panorama head, which would then make it possible to add
this metadata to the pictures automatically.
Of course, _ideally_, this would even come with error margins (42
degrees +1.1 / -1.5), so that Hugin could use them as a hint for the
range in which it has to search for matching control points.
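To make the angle-hint idea concrete, here is a toy sketch (again entirely hypothetical, with names I made up) of how known yaw angles plus a worst-case error margin could rule out pairs before any pixel matching happens:

```python
def may_overlap(yaw_a, yaw_b, margin, hfov):
    """Can two shots share control points at all?

    yaw_a, yaw_b: recorded pan angles in degrees
    margin:       worst-case angle error per shot, in degrees
    hfov:         horizontal field of view of the lens, in degrees
    """
    # shortest angular distance between the two viewing directions
    diff = abs((yaw_a - yaw_b + 180) % 360 - 180)
    # only bother matching if the fields of view can still intersect
    # after allowing for the worst-case angle error on both shots
    return diff - 2 * margin < hfov

print(may_overlap(0, 30, 1.5, 50))   # -> True: frames clearly overlap
print(may_overlap(0, 120, 1.5, 50))  # -> False: too far apart to bother
```

The same margin could also narrow down *where* in the overlapping region to search for matches, not just *whether* to search.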
Also, this technique might make it easier to stitch largely
featureless areas (say, the sky).
Is there something like this?
-henner