Image Stitching and Exposure Fusion with pv


kfj

Feb 28, 2021, 5:23:32 AM
to hugin and other free panoramic software
Dear group!
I introduced HDR blending with pv some time back, and since then I have felt tempted to tackle exposure fusion and image stitching as well. Hugin usually delegates these tasks to enfuse and enblend, and I have used both programs to good effect for a long time, so I was aiming at using similar techniques in pv. enfuse and enblend rely on multiresolution blending, as published by Peter J. Burt and Edward H. Adelson in their article 'A Multiresolution Spline With Application to Image Mosaics'. The same technique was later adapted for exposure fusion by Tom Mertens, Jan Kautz and Frank Van Reeth, as described in their article 'Exposure Fusion'.
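To give a flavour of the Mertens et al. idea (a hedged sketch, not pv's actual code): each exposure gets a per-pixel quality weight and the weighted results are normalised. Here is a minimal Python toy of just the 'well-exposedness' term from the paper; the full method also uses contrast and saturation weights and blends via pyramids:

```python
import math

def well_exposedness(v, sigma=0.2):
    # Gauss curve centred on mid-grey 0.5, as in the 'Exposure Fusion' paper
    return math.exp(-((v - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_pixel(exposures):
    # naive per-pixel fusion: normalised weighted average of the bracket
    weights = [well_exposedness(v) for v in exposures]
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, exposures)) / total
```

A mid-grey value gets full weight; near-black and near-white values contribute little, so the fused pixel leans towards the best-exposed shot.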
pv uses b-splines for interpolation, and its image pyramids are a variant based on b-splines. This has proven effective for viewing images with pv for many years now, and it's also what I use in my implementation of the adapted multiresolution blending algorithm. This provides a fresh, modern implementation based on my library vspline, which is fast because it is multithreaded and uses SIMD code.
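For readers who don't know the Burt/Adelson scheme: the sketch below is a deliberately tiny 1D Python toy, not pv's implementation (which is multithreaded SIMD C++ built on b-splines). Here a 3-tap binomial filter stands in for the low-pass, and the expand step is crude sample duplication:

```python
def reduce_(sig):
    # low-pass with the binomial kernel (.25, .5, .25), then keep every other sample
    sm = [0.25 * sig[max(i - 1, 0)] + 0.5 * sig[i] + 0.25 * sig[min(i + 1, len(sig) - 1)]
          for i in range(len(sig))]
    return sm[::2]

def expand(sig, n):
    # crude upsampling back to n samples by duplication
    return [sig[min(i // 2, len(sig) - 1)] for i in range(n)]

def blend(a, b, mask, levels=3):
    # multiresolution blending: combine the Laplacian levels of a and b,
    # steered by a Gaussian pyramid of the blending mask
    if levels == 1 or len(a) < 4:
        return [m * x + (1 - m) * y for x, y, m in zip(a, b, mask)]
    la = [x - e for x, e in zip(a, expand(reduce_(a), len(a)))]
    lb = [x - e for x, e in zip(b, expand(reduce_(b), len(b)))]
    top = blend(reduce_(a), reduce_(b), reduce_(mask), levels - 1)
    base = expand(top, len(a))
    return [s + m * x + (1 - m) * y for s, x, y, m in zip(base, la, lb, mask)]
```

Low frequencies get blended over wide transition zones (the coarse mask levels), high frequencies over narrow ones - which is what hides the seam.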
I have pushed a prototype to the master branch of my pv repo which offers both exposure fusion and image stitching using this new code. For now the code is Linux-only - if you feel adventurous, do your own merge to the msys2 or mac branches. To try it out, simply load a PTO containing a registered exposure bracket or a panorama into pv, select the ROI and press 'U' for an exposure fusion or 'P' for a stitch. You may want to pass --snapshot_magnification=... on the command line to get larger output; the other snapshot-related parameters should apply as well.
Consider this a 'sneak preview' - the code is, as of this writing, not yet highly optimized, but it should be functional. The output is rendered in the background and may take some time to complete. It's best to work from the command line, where there is some feedback on the process. I'll tweak the code in the days to come, and I intend to provide a version with alpha channel processing as well. I'll post again once everything is 'production grade' and fully documented.
One large benefit of having these new capabilities in pv is that it does away with the need for intermediate images and external helper programs: the intermediates are produced and processed internally in RAM, so there is no disk traffic, and the internal intermediates have full single-precision float resolution, which makes for good image quality. On the downside, due to the RAM-based intermediates, output size is limited.
Kay

Harry van der Wolf

Feb 28, 2021, 1:09:25 PM
to hugi...@googlegroups.com
I just built it on an Ubuntu-derived 64bit system. It compiles fine. Before testing further I did the same on my Debian 10 buster armhf Raspberry Pi 4. There it ends in an error.
Don't put too much effort into it. The rpi4 is my server, but I just wanted to test if it would compile there (and then test it via tightvnc).

clang++ -c -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_no_rendering.cc -o pv_no_rendering.o
clang++ -c -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_initialize.cc -o pv_initialize.o
clang++ -c -D PV_ARCH=PV_FALLBACK -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_rendering.cc -o pv_fallback.o
clang++ -c -mavx -D PV_ARCH=PV_AVX -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_rendering.cc -o pv_avx.o
clang: warning: argument unused during compilation: '-mavx' [-Wunused-command-line-argument]
clang++ -c -mavx2 -D PV_ARCH=PV_AVX2 -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_rendering.cc -o pv_avx2.o
clang: warning: argument unused during compilation: '-mavx2' [-Wunused-command-line-argument]
clang++ -c -mavx512f -D PV_ARCH=PV_AVX512f -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_rendering.cc -o pv_avx512f.o
clang: warning: argument unused during compilation: '-mavx512f' [-Wunused-command-line-argument]
clang++ -c -D PV_ARCH=PV_FALLBACK -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_combine.cc -o pv_cmb_fallback.o
clang++ -c -mavx -D PV_ARCH=PV_AVX -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_combine.cc -o pv_cmb_avx.o
clang: warning: argument unused during compilation: '-mavx' [-Wunused-command-line-argument]
clang++ -c -mavx2 -D PV_ARCH=PV_AVX2 -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_combine.cc -o pv_cmb_avx2.o
clang: warning: argument unused during compilation: '-mavx2' [-Wunused-command-line-argument]
clang++ -c -mavx512f -D PV_ARCH=PV_AVX512f -Ofast -std=c++11 -D USE_VC -D VECTORIZE -c -fno-math-errno -Wno-unused-value pv_combine.cc -o pv_cmb_avx512f.o
clang: warning: argument unused during compilation: '-mavx512f' [-Wunused-command-line-argument]
clang++ -DUSE_TINYFILEDIALOGS -std=c++11 -c -o file_dialog.o file_dialog.cc
clang++ -std=c++11 -o pv pv_no_rendering.o pv_initialize.o pv_fallback.o pv_avx.o pv_avx2.o pv_avx512f.o pv_cmb_fallback.o pv_cmb_avx.o pv_cmb_avx2.o pv_cmb_avx512f.o file_dialog.o -lpthread -lVc -lvigraimpex -lsfml-window -lsfml-graphics -lsfml-system -lexiv2
/usr/bin/ld: pv_no_rendering.o: in function `__cxx_global_var_init.375':
pv_no_rendering.cc:(.text.startup[_ZN4Vc_16detail12RunCpuIdInitILi0EE3tmpE]+0x18): undefined reference to `Vc_1::CpuId::init()'
/usr/bin/ld: pv_no_rendering.o: in function `_GLOBAL__sub_I_pv_no_rendering.cc':
pv_no_rendering.cc:(.text.startup+0x6ec): undefined reference to `Vc_1::CpuId::s_processorFeatures7B'
/usr/bin/ld: pv_no_rendering.cc:(.text.startup+0x6f8): undefined reference to `Vc_1::CpuId::s_processorFeaturesC'
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [makefile:33: pv] Error 1


Kay F. Jahnke

Mar 1, 2021, 2:40:46 AM
to hugi...@googlegroups.com
On 28.02.21 at 19:09, Harry van der Wolf wrote:

> I just built it on an Ubuntu-derived 64bit system. It compiles fine.

Great! Thanks for reporting back. Do try it out, and share your
experience. To me, the implementation of the Burt/Adelson algorithm is a
major breakthrough.

> Before testing further I did the same on my Debian 10 buster armhf
> Raspberry pi 4. Now it ends in error.
> Don't put too much effort in it. The rpi4 is my server, but I just
> wanted to test if it would compile there (and then test it via tightvnc).

pv is currently Intel/AMD only; the Raspi uses an ARM processor. I'd
like to try a port to ARM eventually, especially to support newer macs,
but this will require a bit more fiddling than just supporting yet
another Intel ISA.

What would be required are quite different compiler flags for the
rendering code (so, not -mavx etc.) and setting up the dispatcher to
send rendering jobs to the code for the right ISA.

This should not be too hard if Vc supports ARM well, and if it doesn't,
pv can fall back on code which isn't using Vc. The problem is that I
don't have any ARM hardware, so it's hard for me to test. Would you like
to cooperate on an ARM port? I could produce code which should run on
ARM, but lacking testing hardware I'd probably need several attempts
before it comes out right. I'd set up an ARM branch, you can watch it
for new commits, try the build and communicate via the issue tracker.

Kay

Monkey

Mar 1, 2021, 8:10:11 AM
to hugin and other free panoramic software
> and its use of image pyramids relies on a variant of image pyramids based on b-splines.

> To me, the implementation of the Burt/Adelson Algorithm is a
major breakthrough.

Is there a simple way to explain how this/these differ from a standard Laplacian pyramid?

Harry van der Wolf

Mar 1, 2021, 11:07:19 AM
to hugi...@googlegroups.com
Hi,

I understand that ARM is different. That's also why I mentioned to not put too much effort into it.

Yes, I'm willing to cooperate in porting to armhf for the Raspberry Pis. I also have another rpi4 which runs Ubuntu Mate 20.04 (currently), but my (headless) server is always on.
I use the other one for all kinds of things, so helping to port pv could be another option for it.

I have no idea though how much the "normal" ARM chips differ from the new Apple M1. That might be a completely different beast (the 8-core is, at least when it comes to performance).
I read that Apple uses Rosetta (v2) for running Intel-based programs on the new M1, just like 10-12 years ago when they switched from PowerPC to Intel and ran PowerPC apps on Intel. I also guess the universal binaries will come to life again. At that time I took over from Ippei Ukai for building Hugin on Mac - a platform I left in 2012 or so for being too closed.
So maybe you do not have to port immediately, but can simply run your Intel-based macOS version on the new M1s.


On the RPi4 I also did the: 
git clone https://github.com/VcDevel/Vc.git
cd Vc
git checkout 1.4
mkdir build
cd build
cmake -DCMAKE_CXX_COMPILER=clang++ -DBUILD_TESTING=0 ..
make
sudo make install

And then tried to rebuild pv, but that doesn't make a difference. The exact same error.

======================================
With regard to Intel macOS: I had just built multiblend for David on my old (by now terribly slow) 2008 MacBook Pro, so I tried the same for pv, also using MacPorts (as Homebrew does not have vigra and I want to keep it simple).
(install macports using pkg)
(from terminal do a: sudo port -v selfupdate)

Getting dependencies:
sudo port install sfml
sudo port install exiv2
sudo port install vigra -python38 -python39
sudo port install Vc
(This installs a whole lot of other dependencies as well. Note also the uppercase in Vc )

Get and build pv:
git clone https://bitbucket.org/kfj/pv
cd pv
git checkout mac
git status (to check whether you have the mac branch)
make

It builds fine and I only did some viewing of images until now.

Harry

T. Modes

Mar 1, 2021, 11:59:03 AM
to hugin and other free panoramic software
I found this a little bit sad. Instead of pooling development effort, everyone is working on their own code.
Yes, it is easier to work on your own code. But in the longer run it would be better to combine development skills and work on a common code base.
This would also have the advantage that not all code needs to be written and debugged several times in several iterations.
So please consider working on a common code base, or at least port the improvements to enblend/enfuse.

Thomas


Kay F. Jahnke

Mar 2, 2021, 5:20:14 AM
to hugi...@googlegroups.com
On 01.03.21 at 14:10, Monkey wrote:

> > and its use of image pyramids relies on a variant of image pyramids
> > based on b-splines.
>
> > To me, the implementation of the Burt/Adelson Algorithm is a
> major breakthrough.
>
> Is there a simple way to explain how this/these differ from a standard
> Laplacian pyramid?

My code uses a different generation scheme for the Gaussian pyramids.
The Laplacian pyramids are not made explicit but are calculated
on-the-fly by functor composition - the result is the same, namely the
difference between a scaled-up 'top' level and the unmodified 'bottom'
level at every stage - but the process is less memory-hungry and uses
vspline's fast multithreaded SIMD code to do the operation in one go.

Standard Gaussian pyramids have a set of samples at each level. To get
the next smaller level (the 'reduce' stage), a digital low-pass filter
is applied - usually a small-kernel FIR filter - and the result is
decimated, discarding every other sample. Everything works along the
initial sampling grid; every pyramid stage is precisely half as large
as the previous one. The 'expand' stage uses the same FIR filter for
upscaling from a grid filled with zeros where only every other sample
is filled in from the 'top' level.
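The 'expand' stage just described can be sketched like this (a Python toy, not pv code; Burt and Adelson actually use a 5-tap kernel, while this uses the 3-tap binomial and simple clamping at the boundaries):

```python
def expand_fir(top, n):
    # classic 'expand': zero-stuff the coarse level onto a fine grid,
    # then low-pass filter; the factor 2 restores the amplitude lost
    # to the interleaved zeros
    z = [0.0] * n
    for i, v in enumerate(top):
        if 2 * i < n:
            z[2 * i] = v
    return [2 * (0.25 * z[max(i - 1, 0)] + 0.5 * z[i] + 0.25 * z[min(i + 1, n - 1)])
            for i in range(n)]
```

Away from the boundaries, a constant coarse signal expands back to the same constant, which is the sanity check for such a filter pair.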

My pyramid technique has a b-spline at each level. This is a continuous
function, and low-pass-filtering is done by modifications of the
b-spline evaluation function; the resulting (continuous) low-pass signal
can then be sampled at arbitrary positions. This method is grid-free:
you can choose decimation steps different from two, and you're free to
place the samples which the smaller-level spline is built from in
off-grid positions. I use this in pv to get a set of splines which have
the same boundary behavior, so by slotting in a simple shift+scale, I
can evaluate them with the same coordinates at every level, which
simplifies the code a great deal. The 'expand' stage of the pyramid uses
simple b-spline evaluation, which is good for upscaling a signal.
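The grid-free idea can be illustrated with another toy in Python, with plain linear interpolation standing in for pv's b-spline evaluation (the function names are made up for the example):

```python
def sample(sig, x):
    # evaluate the signal at an arbitrary (off-grid) position x
    # via linear interpolation between the neighbouring samples
    i = max(0, min(int(x), len(sig) - 2))
    t = x - i
    return (1 - t) * sig[i] + t * sig[i + 1]

def reduce_offgrid(sig, step=1.5):
    # grid-free 'reduce': resample at spacing 'step', which need not be 2
    n = max(2, int(len(sig) / step))
    return [sample(sig, k * step) for k in range(n)]
```

Because the evaluation is continuous, the decimation step can be any real number, and the new samples need not coincide with the original grid.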

pv offers the parameter 'pyramid_scaling_step', which allows you to
build 'steeper' or 'shallower' pyramids by scaling each level with an
arbitrary real factor - 2.0 by default, but sensible values go down to,
say, 1.1. You can simply try it out and see if it makes any difference
- the effect is subtle and hard to spot unless you know what to look
for; if you're interested I can say more about it.
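To see what the parameter changes structurally, here is a small hypothetical helper (not pv's actual code) that lists the level sizes a given scaling step produces:

```python
import math

def level_sizes(n, step=2.0, min_size=4):
    # sizes of successive pyramid levels for a given scaling step
    sizes = [n]
    while sizes[-1] > min_size:
        nxt = math.ceil(sizes[-1] / step)
        if nxt == sizes[-1]:
            break
        sizes.append(nxt)
    return sizes
```

With step 2.0 a 1024-sample signal yields nine levels; a smaller step like 1.1 gives many more, closer-spaced levels.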

Employing my spline-based pyramid technique to implement an equivalent
of the 'classic' Burt and Adelson (B&A) approach is experimental, but
with the latest pv commits I provide a working implementation, so it can
be evaluated and experimented with, to see where its advantages and
disadvantages are, and to figure out where my initial choices in
mimicking the B&A algorithm can be improved. It may well turn out to be
inferior to the classic approach - the intention in pv is to produce
'decent' stitches and exposure fusions for the currently displayed view.
If I can get the quality up to a level which approaches the standard
technique, this would be great, but I'm not making such claims yet.

To sum it up: the main differences from 'classic' pyramids and B&A in
the current pv code are the off-grid evaluation for up- and
downsampling, an arbitrary scaling factor from one pyramid level to the
next, the use of a small binomial FIR filter (.25, .5, .25) for
low-pass filtering, and the use of b-spline evaluation for upsampling.

I hope this answers your question - the B&A algorithm and image pyramids
are quite involved stuff, and it's hard to explain the differences
'simply' because the algorithm in itself is not simple.

Kay

Kay F. Jahnke

Mar 2, 2021, 5:43:16 AM
to hugi...@googlegroups.com
On 01.03.21 at 17:07, Harry van der Wolf wrote:
> Hi,
>
> I understand that ARM is different. That's also why I mentioned to not
> put too much effort into it.

It should not be too hard to get pv to run on ARM, the question is how
*fast* I can make it. I can set up the compilation to not use Vc for the
SIMD code, which makes everything standard C++11, with no hardware
dependencies. If Vc can be used, this should make it faster.

> Yes, I'm willing to cooperate in porting to armhf for the raspberry pi'
> s. I also have another rpi4 which runs Ubuntu Mate 20.04 (currently),
> but my (headless) server is always on.
> But I use the other one for all kind of things, so helping to port pv
> could be another option.
I'll set up a branch called 'native' which will offer three targets:

- scalar version, not making an attempt at SIMD whatsoever
- vspline SIMD implementation, using vspline's SIMD emulation
- Vc SIMD implementation, attempting to use Vc on ARM.

All of the targets will be compiled with '-march=native', telling the
compiler to compile for the ISA it's running on. The binaries will only
run on the build machine or better. Then we can take it from there -
especially the Vc target may not build without tweaking.

I'll get in touch once I have the branch set up, and you can pull the
branch and see whether it compiles on your ARM machine.

> I have no idea though how much the "normal" ARM chips vary from the new
> Apple Intel M1. That might be a completely different beast (the 8-core
> is at least when it comes to performance).
> I read that Apple uses the same Rosetta (V2) for running Intel based
> programs on the new M1. (Just like 10-12 years ago when they switched
> from PowerPC to Intel to run PowerPC apps on Intel. I also guess the
> universal binaries will come to life again. At that time I took over
> from Ippei Ukai for building Hugin on Mac. A platform I left in 2012 or
> so for being too closed.)
> So maybe you do not have to port immediately,but can simply run your
> intel based MacOS version on the new M1' s.

No, let's give it a go. It's not much work on my side, a first try at
your side takes a few minutes, and if it succeeds, we have an ARM port
of pv to play with. I anticipate that the 'native' approach should even
function on all platforms (including intel/AMD), but the resulting
binaries will be hardware-dependent, this is why I went for automatic
ISA detection in the intel/AMD version, to make it easier for users and
packagers.

>
>
> On the RPi4 I also did the:
> git clone https://github.com/VcDevel/Vc.git
> cd Vc
> git checkout 1.4
> mkdir build
> cd build
> cmake -DCMAKE_CXX_COMPILER=clang++ -DBUILD_TESTING=0 ..
> make
> sudo make install
> And then tried to rebuild pv, but that doesn' t make a difference. The
> exact same error.

Wait for the new branch, hopefully the outcome will be better.

> ======================================
> With regard to Intel MacOS: I had just built multiblend for David on my
> old (by now terribly slow) 2008 MacBook Pro, so I tried the same for pv
> also using macports (as home brew does not have vigra and I want to
> keep it simple).
> (install macports using pkg)
> (from terminal do a: sudo port -v selfupdate)
>
> Getting dependencies:
> sudo port install sfml
> sudo port install exiv2
> sudo port install vigra -python38 -python39
> sudo port install Vc
> (This installs a whole lot of other dependencies as well. Note also the
> uppercase in Vc )
>
> Get and build pv:
> git clone https://bitbucket.org/kfj/pv
> cd pv
> git checkout mac
> git status (to check whether you have the mac branch)
> make
>
> It builds fine and I only did some viewing of images until now.

Thank you for confirming a mac build! This is great - you're now only
the third person I know of who's succeeded in building pv on a mac. I
don't have a mac myself, so it's hard for me to code for it. Please
note that the mac branch has not yet been updated to contain my new
code; I'll do that soon-ish. Old macs are like old PCs, they won't have
fast SIMD units. I routinely build on an old IBM Thinkpad R60e with a
Core 2 Duo, to confirm that the code works on 32bit intel hardware with
SSE only. It works, but it takes forever... My desktop is a Haswell
Core i5, which has AVX2. This is where it starts being fun :D

Since you're the one who's building hugin on the mac, maybe you can
give me a hint on how to write a 'port file' to provide a pv package
for MacPorts? Is that hard? And is it hard to provide a package which
users can simply download from the app store? I proposed bundling pv
with hugin, because it would make a good addition to the suite - what
do you say?

Kay

Kay F. Jahnke

Mar 2, 2021, 6:00:38 AM
to hugi...@googlegroups.com
On 01.03.21 at 17:59, T. Modes wrote:

> I found this a little bit sad. Instead of merging the developing power
> everyone is working on its own code.

pv started out as a demo program for my b-spline interpolation library,
vspline. Initially it only displayed single images. Then I extended it
to display synoptic views of several images, and the natural file
format for this is PTO, which is widely used. So I extended pv to read
PTO format. But all the code in pv and vspline is completely new; it
does not rely on libpano or libhugin at all, and I can see no way of
easily integrating it into hugin or its helper programs.

> Yes, it is easier to work on own code. But for the longer one it would
> be better to combine the developing skills and work on a common code base.
> This would also have the advantage that not all code needs to be written
> and debugging several times in several iterations.

The code base is totally different. pv and hugin code are quite
incompatible; the only common ground is the use of PTO format. pv is a
panorama viewer, and stitching is only a nice-to-have recent addition.

> So please consider working at a common code base or at least port the
> improvements to enblend/enfuse.

I took nothing from enblend/enfuse but the idea of what I wanted and the
choice of the Burt and Adelson image splining algorithm. My code is for
embedded use in pv, and it's quite alien to enblend and enfuse, as I
have explained in my reply to Monkey concerning the use of image
pyramids. There is nothing to port as of yet.

If the enblend/enfuse developers are interested in my new
implementation of the B&A algorithm, they are free to use it - it's all
FOSS (GPL v3) and amply documented. For now I recommend just playing
with my new code to see if it works out fine - it's still experimental
and may not turn out to be as good as the 'classic' implementation in
enblend/enfuse. And enblend/enfuse have a much wider selection of
parameters and variants and integrate well with hugin, so they will
remain the standard for some time to come, and they do well what they
do.

Kay

Harry van der Wolf

Mar 2, 2021, 7:53:45 AM
to hugi...@googlegroups.com


On Tue, 2 Mar 2021 at 11:43, 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com> wrote:

Thank you for confirming a mac build! This is great, you're now only the
third person I know of who's succeeded in building pv on a mac. I don't
have a mac myself, so it's hard for me to code for it. Please note that
the mac branch has not yet been updated to contain my new code, I'll do
that soon-ish. Old macs are like old pcs, they won't have fast SIMD
units. I routinely build on an old IBM Thinkpad R60e with a core2 Duo,
to confirm that the code works on 32bit intel hardware with SSE only. It
works, but it takes forever... My desktop is a Haswell core-i5, which
has AVX2. This is where it starts being fun :D

Since you're the one who's building hugin on the mac, maybe you can give
me a hint on how to write a 'port file' to provide a pv package for
macports? Is that hard? And is it hard to provide a package which users
can simply download from the app store? I proposed to bundle pv with
hugin, because it would make a good addition to the suite, what do you say?

Kay


I was not correct, or at least not clear enough, in my answer. I did take over from Ippei Ukai, but I also mentioned that I abandoned Mac in 2012/2013 or so. I simply never threw my 2008 MacBook away.
At that time I also maintained the MacPorts build of hugin and enblend/enfuse, but that is now also long, long ago. But I will have a look at it.

The current Hugin builders on Mac are Niklas Mischkulnig and Bob Campbell. All credits to them (not to me).

I certainly do not have an Apple M1 and will not buy one either. I am a happy Linux user since 1993 (early adopter) who simply wandered off for a couple of years, both to Apple and Windows.

Making a package for the MacOS app store is a different thing.
You have to register as a developer at Apple. I did a complete rant some days ago about how I hate how Apple does things, but also realised at that time that I had registered around 2008, when I took over from Ippei. And yes, that still worked.

- So, you have to register at Apple as a developer, giving quite some info.
- You have to offer your bundle to Apple, where Apple checks for malware/Apple standards/"other things". That is actually a good thing, but it will take time, and sometimes your bundle is rejected while you know it is perfectly fine. I have a Java app which I also packaged as a macOS bundle. Apart from the fact that I won't release it via the App Store, it would not be accepted either, as it is built outside Xcode, thereby lacking some controls/standards (and Apple spyware?).

With regard to bundling with Hugin:
I agree with Thomas that where possible we should integrate. You already explained the big differences in code, so that might not be possible. I can't judge that, as I'm not a C/C++ programmer.
As it is all open source, it might still be possible to migrate/integrate it. And otherwise: could it be possible to call pv from hugin as an enblend replacement, just like multiblend can be called from hugin (with a simpler command set) as an enblend replacement?
Also: panomatic was first an external tool, and most of it is now integrated in cpfind.
I hope we see the same for pv's blending/fusing functionality if it really is an improvement.
My personal point of view is to help where I can if "something" can mean an improvement for all of us. And if it is mature, to try to integrate it.

Harry




Kay F. Jahnke

Mar 3, 2021, 4:32:50 AM
to hugi...@googlegroups.com
On 02.03.21 at 13:53, Harry van der Wolf wrote:
>
>
> Op di 2 mrt. 2021 om 11:43 schreef 'Kay F. Jahnke' via hugin and other
>
> Since you're the one who's building hugin on the mac, maybe you can
> give
> me a hint on how to write a 'port file' to provide a pv package for
> macports? Is that hard? And is it hard to provide a package which users
> can simply download from the app store? I proposed to bundle pv with
> hugin, because it would make a good addition to the suite, what do
> you say?
>
> Kay
>
>
> I was not correct, or at least not clear enough, in my answer. I did
> take over from Ippei Ukai, but also mentioned I abandoned Mac in
> 2012/2013 or so. I simply never threw my 2008 macbook away.
> At that time I also maintained the macports build of hugin and
> enblend/enfuse, but that is now also long, long ago. But I will have a
> look at it.

Sorry, I did not read your mail properly. I can sympathize with leaving
the mac universe. I've been a Linux early adopter as well - working on
unix networks in the nineties changed my views as to what an OS should
be like, and I started out with Xenix, then Slackware Linux on my home
PC. Later on I used Windows for some time, with Cygwin to have a POSIX
environment. Then it was back to Linux again, happy as you are. I keep
a W10 install on one of my Thinkpads to run Windows builds of pv, so
that I can distribute Windows binaries to people who don't want to
build themselves.

> The current Hugin builders on Mac are Niklas Mischkulnig and Bob
> Campbell. All credits to them (not to me).

Maybe they'll take note of pv and become interested in the mac version -
after all the branch is already there and you demonstrated that it's
dead easy to create a binary. Sadly, my work so far has not got much
attention.

> I certainly do not have an Apple M1 and will not buy one either. I am a
> happy Linux user since 1993 (early adapter) and simply wandered off for
> a couple of years, both to Apple and Windows.

I must admit that the M1 is very attractive, though. It might be
worthwhile to just buy a small machine with this chip - I'd like a
macbook air, because I don't have an up-to-date laptop - my newest
thinkpad is eight years old or so.

> Making a package for the MacOS app store is a different thing.
> You have to register as a developer at Apple. I did a complete rant some
> days ago about how I hate how Apple does things, but also realised at
> that time that I had registered around 2008, when I took over from
> Ippei. And yes, that still worked.

I wouldn't want to do that if I can help it - that's why I was hoping to
piggyback on the hugin suite. I still think it would make a nice
addition. Where is the good cross-platform FOSS panorama viewer in the
hugin package? I was frustrated because I could not find one, so I wrote
one. I hear others use 'Panini' but I never managed to get it to work
for me. So here's pv.

> - So, you have to register at Apple as a developer giving quite some info.
> - You have to offer your bundle to Apple where Apple checks for
> malware/Apple standards/"other things". That is actually a good thing,
> but will take time and sometimes your bundle is rejected while you know
> it is perfectly fine. I have a java app which I also packaged as a MacOS
> bundle. Apart from the fact that I won't release it via the apple store,
> it will not be accepted either as it is built outside Xcode thereby
> lacking some controls/standards (and Apple spyware?).

I wouldn't be surprised...

> With regard to bundling with Hugin:
> I agree with Thomas that where possible we should integrate. You already
> explained the big differences in code, so that might not be possible. I
> can't judge that as I'm not a C/C++ programmer.

It is in fact completely new technology. Potentially disruptive. It's
probably not too hard to extract the blending and fusing code into a
separate program, but the beauty of pv is in the total integration of
all components - you simply load your PTO file into pv like any other
image, adjust the view until you have what you want, and press 'P' to
have a panorama rendered in the background. No intermediate images on
disk, no helper programs. Once the stitch is rendered, you can simply
open pv's file select dialog (press 'F') and load the freshly-made
panorama in the same session. Same goes for exposure fusions or HDR merges.

> As it is all open source, it might still be possible to
> migrate/integrate it. And otherwise: could it be possible to call pv
> from hugin to run it as enblend replacement. Just like multiblend can
> be called from hugin (with a simpler command set) as enblend replacement?
> Also: panomatic was first an external tool and most of it is now
> integrated in cpfind.
> I hope we see the same for pv blending/fusing functionality if it really
> is an improvement.

Now it's my turn to be 'a bit sad'. I've spent about seven man-years
creating a b-spline library and a beautiful, powerful panorama viewer
which now can even do some stitching and exposure fusion on top of
everything else. It would be sad to just rip it apart and take a few
bits as helper programs for hugin. Think of pv as a 'preview on
steroids'. There was a discussion in this group about whether it
wouldn't make sense to work more from the preview window. pv is my
answer to this discussion: it's what used to be called 'WYSIWYG'.
Pretty much everything happens immediately; you see all parameter
changes 'live'. The only things which I can't show 'live' are exposure
fusions and stitches, because the B&A algorithm takes a lot of time. I
may be able to tweak it so that I can produce a few frames per second,
showing stitches and fusions live, but for now they are rendered by a
background thread to a file on disk. For playing with my stitching and
fusion code, this should be good enough for now.

I even made it easy to use pv as a preview for hugin and the likes: if
you work on a panorama in hugin, you can just have a pv window open on
the same PTO. As soon as you save the PTO in hugin, you can simply
press 'F1' in pv and it will update to the changes in the PTO. I see pv
rather as a third 'preview' window which hugin etc. can use alongside
the other two, which are good for some tasks where my code is not - I
don't handle all panotools projections, I don't deal with control
points, I can't switch individual images on and off... - but I sure can
give a good view of the PTO.

> My personal point of view is to help where I can if "something" can mean
> an improvement for all of us. And if it is mature try to integrate it.

If my code turns out to be good, I'd be happy to see it used by others.
I just don't want to go fiddling with other people's software to force
my new stuff in. Integration into extant software is better done by the
original authors, who know their way around their code. As far as pv
and its new components are concerned, I want to stay upstream and focus
on innovation.

Now for the attempt to port pv to ARM. I have prepared the branch I
announced yesterday, and you can try if it compiles on your Raspi:

git checkout native
make

This will try to build three targets: pv_scalar, pv_vspsimd and
pv_vcsimd - you can also build them separately if you like, it's a phony
'all' target compiling the lot. The first one makes no attempt at using
SIMD code, the second one uses vspline's 'implicit' SIMD code, and the
third uses Vc. I think the first two will build on the Raspi, the third
may or may not. If any of the targets build, you should have a
functional pv variant to run on the same machine which it was built on
(it uses -march=native). If the target using Vc builds, it should be
the fastest. Please try it out and let me know how it went!

When I build the 'native' branch targets on my desktop machine, the
performance difference is quite noticeable. My standard test is a
1000-frame automatic pan over a full spherical panorama. The per-frame
rendering times I get are roughly like this:

- pv_vcsimd: 7.1 msec
- pv_vspsimd: 12.4 msec
- pv_scalar: 15.5 msec

So the Vc version is more than twice as fast as the scalar one, and the
'implicit SIMD' version comes out in the middle.

Kay

yuv

unread,
Mar 3, 2021, 7:27:16 AM3/3/21
to hugi...@googlegroups.com
On Wed, 2021-03-03 at 10:32 +0100, 'Kay F. Jahnke' via hugin and other
free panoramic software wrote:

> I must admit that the M1 is very attractive, though. It might be
> worthwhile to just buy a small machine with this chip - I'd like a
> macbook air, because I don't have an up-to-date laptop - my newest
> thinkpad is eight years old or so.

Guilty as charged. Typing this on a 2013 MacBook Air, when the 2013
Sony Vaio, very similar tech specs, is already on the electro-junk heap
with broken parts not worth repairing. 2018 Microsoft Surface Book Pro
(or whatever mindboggling name MSFT's marketing has thought up for that
device) is already defective (the contact at the hinge between tablet
and keyboard keeps breaking up); 2019 Lenovo Thinkpad T495s has already
been into warranty once and if left connected to its power brick for
too long has seizures. As a result, I am now the only family member
without an iDevice and the M1 has joined the school-mandated iPad. I
did get a used iPhone to test the experience for myself, and so far it
feels like being in a golden cage. It is still a prison, even if
golden.

Apple does a lot of things very well. Sadly, at the critical juncture,
it retains control of the T1 / encryption keys instead of giving
control to the user, who should be allowed to choose between Apple's
encryption keys, their own encryption keys, or no encryption at
all.


> > With regard to bundling with Hugin:
> >
> It is in fact completely new technology. Potentially disruptive.

I have not touched panorama photography for ages, so forgive me if I am
missing something. If it is so disruptive, why not add to pv the
missing functions from Hugin rather than the other way around?

And in both cases, what is the main obstacle to moving functionality from
pv to Hugin or from Hugin to pv?

If memory serves me well, the current preview mode in Hugin was born as
a project to improve the existing preview. Turned out that a complete
replacement was not possible/desirable and so the two lived side-by-
side in the same package. The difference here is that pv started life
in a separate GUI toolkit, or am I mistaken?

--
Yuval Levy, JD, MBA, CFA
Ontario-licensed lawyer


Harry van der Wolf

unread,
Mar 3, 2021, 7:33:52 AM3/3/21
to hugi...@googlegroups.com


Op wo 3 mrt. 2021 om 10:32 schreef 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com>:

> The current Hugin builders on Mac are Niklas Mischkulnig and Bob
> Campbell. All credits to them (not to me).

Maybe they'll take note of pv and become interested in the mac version -
after all the branch is already there and you demonstrated that it's
dead easy to create a binary. Sadly, my work so far has not got much
attention.


It should not be too difficult to add it to the Hugin bundle package. The hugin package consists of 4 separate bundles (Hugin.app, HuginStitchProject.app, PTBatcherGUI.app, calibrate_lens_gui.app).
If this is not "allowed", it is not too difficult either to build a separate pv bundle. Not difficult (if you know how to do it), but quite some work.
I would use macports for this over homebrew. Homebrew uses the libraries available on the system as much as possible, while macports builds as many libraries itself as it can and falls back on the real system libraries only where it must. The macports install process therefore takes much longer, since it builds more libraries: homebrew takes libiconv, libexpat, libz, liblzma, etcetera from MacOS, whereas macports builds these itself.
For a "my system only" install, homebrew is the more convenient, easier and faster way to install new ports.
Macports gives you more portability and MacOS "cross-version" compatibility if you bundle an app.

Note: Mac bundles do have the Info.plist setting
<key>NSHighResolutionCapable</key>
<string>True</string>


The latest Hugin build had issues with this and those were apparently just solved by Bob and Niklas (one of the other conversations about the hugin mac build).
I only have that old Macbook Pro and can't test high resolution functionality of pv.
 
Now it's my turn to be 'a bit sad'. I've spent about seven man years
creating a b-spline library and a beautiful, powerful panorama viewer
which now can even do some stitching and exposure fusion on top of
everything else. It would be sad to just rip it apart and take a few
bits as helper programs for hugin. Think of pv as a 'preview on
steroids'. There was a discussion in this group about whether it would
make sense to work more from the preview window. pv is my answer to that
discussion: it's what used to be called 'WYSIWYG'. Pretty much everything
happens immediately, you see all parameter changes 'live'. The only
things which I can't show 'live' are exposure fusions and stitches,
because the B&A algorithm takes a lot of time. I may be able to tweak it
so that I can produce a few frames per second showing stitches and
fusions live, but for now they are rendered by a background thread to a
file on disk. For playing with my stitching and fusion code, this should
be good enough for now.

I even made it easy to use it as a preview for hugin and the like: if
you work on a panorama in hugin, you can just have a pv window open to
the same PTO. As soon as you save the PTO in hugin, you can simply press
'F1' in pv and it will update to the changes in the PTO. I see pv rather
as a third 'preview' window which hugin etc. can use alongside the other
two, which are good for some tasks where my code is not - I don't handle
all panotools projections, I don't deal with control points, I can't
switch individual images on and off... - but I sure can give a good view
at the PTO.

Sorry to make you sad. ;)
You have some very good arguments (at least in my eyes).
Like in all Open Source projects: "if you feel an itch, scratch it". You simply did what many other developers did (and what I did myself).
Look-alike packages can still have different origins, different goals and different applications (and, based on your arguments, different technology).
And I did not know of all the effort you already put into it.

 
Now for the attempt to port pv to ARM. I have prepared the branch I
announced yesterday, and you can try if it compiles on your Raspi:

   git checkout native
   make

This will try to build three targets: pv_scalar, pv_vspsimd and
pv_vcsimd - you can also build them separately if you like, it's a phony
'all' target compiling the lot. The first one makes no attempt at using
SIMD code, the second one uses vspline's 'implicit' SIMD code, and the
third uses Vc. I think the first two will build on the Raspi, the third
may or may not.

While typing this mail, I compiled your new code on my RPi4 server. All 3 versions compile fine, but as they use (Open)GL code, they won't run in a VNC window and I didn't setup X on my server.
I will do the same on my other RPi4 with a monitor connected, but I don't have time for it now.

Harry 

Kay F. Jahnke

unread,
Mar 3, 2021, 11:54:16 AM3/3/21
to hugi...@googlegroups.com
Am 03.03.21 um 13:27 schrieb yuv:

> On Wed, 2021-03-03 at 10:32 +0100, 'Kay F. Jahnke' via hugin and other
> free panoramic software wrote:
>

>>> With regard to bundling with Hugin:
>>>
>> It is in fact completely new technology. Potentially disruptive.
>
> I have not touched panorama photography for ages, so forgive me if I am
> missing something. If it is so disruptive, why not adding to pv the
> missing functions from Hugin rather than the other way around?

In a way this is what I've been doing. pv started out as a simple
panorama viewer demo to demonstrate how my library (vspline) was
well-suited for geometric transformations. And then it turned out that
the technology I had developed and which I was using to write the first
version was very well suited to do other image processing jobs. I do a
lot of stuff with images - not only panoramas - and one thing which had
always annoyed me was the break I had between panoramas and 'normal'
images: when a panorama showed in my image viewer, I had to tell the
image viewer to 'open it with the panorama viewer'. And the pano viewer
and the image viewer would have different controls - with panoramas I
like using QTVR mode, and you don't really get that in 'normal' image
viewers. So I added 'normal' image viewing capabilities to pv, in order
to be able to view all images, panoramic or not, with the same software.
And the same UI. And a slide show mode which would show panoramas as
such, with the right projection and FOV read from the metadata. And,
frankly, the image quality most 'normal' viewers provide just isn't
really good enough for my taste. b-splines are simply top notch.
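For readers wondering why b-splines interpolate so well: evaluating a uniform cubic b-spline at any position is just a weighted sum of four neighbouring coefficients, with smooth piecewise-polynomial weights. A minimal 1-D sketch of my own - vspline's real API is far more general (any degree, any dimension):

```cpp
#include <array>
#include <cassert>
#include <cmath>
#include <vector>

// weights of the four basis functions covering fractional position
// t in [0,1) - the classic uniform cubic b-spline polynomials
std::array<float, 4> bspline_weights(float t) {
    float t2 = t * t, t3 = t2 * t;
    return { (1 - 3 * t + 3 * t2 - t3) / 6.0f,
             (4 - 6 * t2 + 3 * t3) / 6.0f,
             (1 + 3 * t + 3 * t2 - 3 * t3) / 6.0f,
             t3 / 6.0f };
}

// evaluate a 1-D cubic b-spline with coefficients 'coef' at position x
// (x must be at least 1 and at most coef.size() - 3 for valid indexing)
float eval_spline(const std::vector<float> &coef, float x) {
    int base = static_cast<int>(std::floor(x));
    std::array<float, 4> w = bspline_weights(x - base);
    float sum = 0.0f;
    for (int k = 0; k < 4; ++k)
        sum += w[k] * coef[base - 1 + k];
    return sum;
}
```

The weights always sum to one, and feeding in linearly increasing coefficients reproduces a linear ramp exactly - which is why splines don't introduce brightness bias.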

I used a game engine (SFML) for the UI, in fact I decided to implement a
fair amount of 'gamification' and to *not* use a traditional GUI library
like wxWidgets or Qt - instead I programmed a simple 'immediate mode
GUI' with just a few rows of buttons and left the remainder of the UI to
be done with mouse gestures and keystrokes, like a computer game. This
enabled me to 'have nothing between me and my images' - no menus, no
dialog boxes, no popup windows. Of course one has to 'learn to fly' to
enjoy it fully.

So now I could view images/panoramas just fine. What if I wanted to
share them? Would be nice to have a snapshot facility. And when doing
snapshots, why not make them in the background with a very good
interpolator, to several times the screen's size? Turned out to be quite
easy with what I already had.

Oh, I can do snapshots! How about I add viewing and snapshots in
different target projections? I might as well display in e.g. spherical
and when I do a snapshot of that, it's another spherical, maybe with a
fixed horizon or brightness or black point or white balance... and why
only snapshots, why not throw a bit more image processing into the works?
How about HDR? I added HDR blending and live viewing of brackets, so one
can 'explore' the shadows or the clouds by simply varying brightness
(try a horizontal secondary-button click-drag gesture) - and without
having to first create HDR output, but with the *option* to export the
blended bracket to openEXR with a simple keystroke.

Hey, so I can HDR-blend! Maybe I can even do exposure fusion!? How about
just running a B&A image splining algorithm in the background to create
an exposure-fused snapshot? Bingo!

Now, having the B&A code, how about I just feed it spatial masks instead
of brightness-related ones? The code to stitch images should be
precisely the same, give or take a few tweaks... it works!
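The point about swapping masks is the crux: the blending machinery stays identical, only the weight generation changes. A hedged sketch of my own of the two kinds of weights (the actual quality measures in Mertens et al. also include contrast and saturation):

```cpp
#include <cassert>
#include <cmath>

// exposure fusion: favour pixels close to mid-grey (the
// 'well-exposedness' measure from Mertens, Kautz and Van Reeth)
float well_exposedness(float v, float sigma = 0.2f) {
    float d = v - 0.5f;
    return std::exp(-d * d / (2.0f * sigma * sigma));
}

// stitching: a hard spatial seam mask - 1 on this image's side of
// the seam, 0 on the other; the pyramid machinery then turns the
// hard edge into a soft, multiresolution blend
float seam_mask(int x, int seam) {
    return x < seam ? 1.0f : 0.0f;
}
```

Feed the first kind of weight to the blender and you get exposure fusion; feed the second and you get a stitch - the same downstream code in both cases.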

It all fell into place naturally. No need to take code from somewhere
else. Look at my implementation of the B&A algorithm, and compare it to
'traditional' approaches, and judge for yourself:

https://bitbucket.org/kfj/pv/src/master/pv_combine.cc

Compared to traditional code, this is alien technology. It's all written
using the functional paradigm, you simply compose SIMD-capable functors
and feed them to transform functions, filling in arrays as the functors
are 'rolled out' with efficient multithreading. Once you get used to
working with that technology, you don't want traditional code anymore.
That's why I think it may be disruptive, and why I prefer to code stuff
myself, using vspline.
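To give a flavour of that functional style, here is a bare-bones sketch of composing elementwise functors and 'rolling them out' over an array with a few threads. This is plain C++ of my own, not vspline's actual interface, and it has no SIMD - vspline additionally vectorizes the inner loop:

```cpp
#include <algorithm>
#include <cassert>
#include <thread>
#include <vector>

// compose two elementwise functors into one: x -> g(f(x))
template <class F, class G>
auto compose(F f, G g) {
    return [f, g](float x) { return g(f(x)); };
}

// 'roll out' a functor over an array, splitting the work over threads
template <class F>
void transform_mt(const std::vector<float> &in, std::vector<float> &out,
                  F f, unsigned nthreads = 2) {
    std::vector<std::thread> pool;
    std::size_t chunk = (in.size() + nthreads - 1) / nthreads;
    for (unsigned t = 0; t < nthreads; ++t) {
        std::size_t lo = t * chunk;
        std::size_t hi = std::min(in.size(), lo + chunk);
        pool.emplace_back([&in, &out, f, lo, hi] {
            for (std::size_t i = lo; i < hi; ++i)
                out[i] = f(in[i]);
        });
    }
    for (auto &th : pool)
        th.join();
}
```

The caller only ever writes small, pure functors; the rolling-out code owns threading (and, in vspline's case, vectorization), so those concerns never leak into the pixel arithmetic.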

> And in both cases, what is the main obstacle to move functionality from
> pv to Hugin or from Hugin to pv?

So I just can't see much point in trying to move code from hugin to pv.
I have programmed everything new, from scratch, using multithreaded SIMD
code on the CPU. If I want to, I can shell out to helper programs
myself, that's no big deal, but it's much faster to work within the same
process and keep the data in memory. Hugin is great for getting the
registration right, and it does the job of providing a GUI for panotools
well. But when it comes to adding capabilities to pv, I prefer to use my
own, modern code base. It's more fun like that as well, and I admit that
I like having the say of what is done and what isn't. pv and hugin
function very well side-by-side - I've even experimented with code where
other programs can signal pv to perform a refresh - I don't get the
reaction time faster than, say, 100msec, so it's not enough to control
animations frame by frame, but for an 'external preview' it's perfectly
fine. All that a software like hugin needs to know is pv's PID and then
it can signal pv to update its image, as if the user had pressed 'F1'.
No need to integrate further, really.
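The 'external preview' handshake needs nothing more than POSIX signals. A minimal illustration of both sides - the specific signal, SIGUSR1, is my assumption here, not necessarily what pv actually listens to:

```cpp
#include <cassert>
#include <csignal>
#include <signal.h>
#include <sys/types.h>
#include <unistd.h>

namespace {
// set asynchronously by the handler, polled by the viewer's main loop
volatile std::sig_atomic_t refresh_requested = 0;
void on_refresh(int) { refresh_requested = 1; }
}

// viewer side: install a handler for the refresh signal
bool install_refresh_handler() {
    return std::signal(SIGUSR1, on_refresh) != SIG_ERR;
}

// controller side (hugin, say): nudge the viewer identified by its PID
bool request_refresh(pid_t viewer_pid) {
    return kill(viewer_pid, SIGUSR1) == 0;
}
```

The controlling program only needs the viewer's PID; the viewer polls the flag once per frame and reloads the PTO when it is set.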

Hugin and pv are *complementary*.

> If memory serves me well, the current preview mode in Hugin was born as
> a project to improve the existing preview. Turned out that a complete
> replacement was not possible/desirable and so the two lived side-by-
> side in the same package. The difference here is that pv started life
> in a separate GUI toolkit, or am I mistaken?

So, no toolkit but an 'immediate mode GUI' -
https://en.wikipedia.org/wiki/Immediate_mode_GUI - SFML did not provide
one, but the community had a few hints ready, so I took it from there
and wrote the GUI myself as well. That was fun, too, and I did hope it
would lure some users to my lair... and not having a fat dependency for
the GUI toolkit makes pv lean.

Personally, I don't use the GUI much, apart from starting slideshows or
setting numerical values precisely - or for 'override arguments' -
command line arguments 'injected' at run time to modify the current
viewing cycle. But it does no harm having it.

Kay

Kay F. Jahnke

unread,
Mar 3, 2021, 12:37:23 PM3/3/21
to hugi...@googlegroups.com
Am 03.03.21 um 13:33 schrieb Harry van der Wolf:
>
>
> Op wo 3 mrt. 2021 om 10:32 schreef 'Kay F. Jahnke' via hugin and other
> free panoramic software <hugi...@googlegroups.com
> <mailto:hugi...@googlegroups.com>>:
>
>
> > The current Hugin builders on Mac are Niklas Mischkulnig and Bob
> > Campbell. All credits to them (not to me).
>
> Maybe they'll take note of pv and become interested in the mac
> version -
> after all the branch is already there and you demonstrated that it's
> dead easy to create a binary. Sadly, my work so far has not got much
> attention.
>
>
> It should not be too difficult to add it to the Hugin bundle package.
> The hugin package consists of 4 separate bundles (Hugin.app,
> HuginStitchProject.app, PTBatcherGUI.app, calibrate_lens_gui.app).
> If this is not "allowed", it is not too difficult either to build a
> separate pv bundle. Not difficult (if you know how to do it), but quite
> some work.

I'd much prefer if I did not have to do this myself. When I offered my
library, vspline, to debian, they said to me: 'if you want your software
to show up in debian anytime soon, you'll have to package it yourself'.
Gulp. So I sat myself down and learned debian packaging, found a mentor,
made mistakes, learned more... it took *months*. I remember a time when
you could excite people by offering them good software - people would
say 'hey that looks great, we'll package it' - nowadays even the search
engines only find what 1000000 social media users have 'liked',
relevance does not seem to be an issue anymore, and everyone already has
everything, why would they even look at what you've made? I felt my work
was relevant, so I did the debian packaging myself, but I would much
rather have spent my time coding.

Anybody out there?

> I would use macports for this over homebrew. Homebrew uses as much as
> possible the libraries available on the system. Macports tries to build
> as many libraries itself as it can and only uses the real system
> libraries. That macports build process on installing takes much longer
> as it will build more libraries. Homebrew uses libiconv, libexpat, libz,
> liblzma, etcetera from MacOS. Macports builds these itself.
> For a "my system only", homebrew is a more convenient, easier and faster
> way to install new ports.
> Macports gives you more portabality and MacOS "cross-version"
> compatibility if you bundle an app.

Macports was the choice I made when I first ported pv to the mac. And it
had all the dependencies. But it is asking a lot from users. To just be
able to use macports at all, I had to first install xcode, which takes
very long indeed, and you may have a notion of your typical mac user...
they want some nice shiny icon in the store to click on, rather than
having to go through installing first xcode, then macports, then git and
clang++ and Vc, and finally load a port file...

> Note: Mac bundles do have the Info.plist setting
> <key>NSHighResolutionCapable</key>
> <string>True</string>
>
> The latest Hugin build had issues with this and those were apparently
> just solved by Bob and Niklas (one of the other conversations about the
> hugin mac build).
> I only have that old Macbook Pro and can't test high resolution
> functionality of pv.

My friend, who's done the last few test builds on a fat iMac, complained
about the GUI looking funny on his retina display. So I added code which
makes it adapt to the desktop size, and you can also simply scale and
move it with mouse gestures. After that he did not complain anymore and
only said that everything works, but he's a busy man and does not spend
too much of his time testing my software.

Of course if pv is rendering to a 5K screen in native resolution (which
is what it does by default), animations may stutter, because it's hard
to get 60fps at that size. But you can just tell pv to lower resolution
for animated sequences (once the animation, like a zoom or pan, stops,
it will switch to full resolution still images) - so pv works well with
high resolution, and because it does not depend on a GUI library, but
instead renders the GUI itself with a bit of openGL code, the screen can
be any resolution. No problem.

> Now it's my turn to be 'a bit sad'. I've spent about seven man years
> creating a b-spline library and a beautiful, powerful panorama viewer
> which now can even do some stitching and exposure fusion on top of
> everything else. It would be sad to just rip it apart and take a few
> bits as helper programs for hugin.
> ...
>
> Sorry to make you sad. ;)
> You have some very good arguments (at least in my eyes).

Thanks

> Like in all Open Source projects: "if you feel an itch, scratch it". You
> simply did what many other developers did (and what I did myself).
> Look-alike packages can still have different origins, different goals
> and different appliances (and based on your arguments: different
> technology).
> And I did not know of all the effort you already put into it.

The biggest part was my library, vspline. That's the real achievement,
and it does not 'just' calculate every conceivable type of b-spline, but
it has extremely efficient multithreaded code to 'roll out' SIMD-capable
functors over n-dimensional arrays. The trouble is that it's novel and
not so easy to grasp: all the new concepts interact tightly, so it's
hard to find a way in. Without vspline, I could never have hoped to get 60fps
of geometrically transformed frames from the CPU.

> Now for the attempt to port pv to ARM. I have prepared the branch I
> announced yesterday, and you can try if it compiles on your Raspi:
>
>    git checkout native
>    make
>
> This will try to build three targets: pv_scalar, pv_vspsimd and
> pv_vcsimd - you can also build them separately if you like, it's a
> phony
> 'all' target compiling the lot. The first one makes no attempt at using
> SIMD code, the second one uses vspline's 'implicit' SIMD code, and the
> third uses Vc. I think the first two will build on the Raspi, the third
> may or may not.

> While typing this mail, I compiled your new code on my RPi4 server. All
> 3 versions compile fine, but as they use (Open)GL code, they won't run
> in a VNC window and I didn't setup X on my server.
> I will do the same on my other RPi4 with a monitor connected, but I
> don't have time for it now.

Now, that makes me happy :D

ARM isn't merely a different platform, but a different architecture
altogether, and to get a build first go is just great! I am so glad
about clang++. It sure makes my code a lot faster than g++ - I suppose
it handles the functional paradigm better, and even if I type-erase my
functors it can still compose them into the fastest configuration
possible. And it's there on many platforms and architectures and just
compiles the same code everywhere, as far as I have seen it. So, my
gratitude goes to the makers of clang++, and also to the giants on whose
shoulders I'm standing, namely Ullrich Köthe, the original author of
vigra, whose thesis finally convinced me of generic programming, and
Matthias Kretz, the author of Vc, whose thesis made me realize that
there is more to SIMD than simply using a few intrinsics and that SIMD
programming is a state of mind - and that's mentioning just the two who
I have to thank most.

Thank you for trying the ARM build, and let's hope that the binary
actually *works*... Hope to hear about that soon-ish.

I couldn't have done it without openGL, either. SFML provides access to
openGL functionality with very little fuss, and it's cross-platform as
well, so I can leave all the nitty-gritty to it. Great software. Highly
recommended.

Kay

Harry van der Wolf

unread,
Mar 3, 2021, 4:20:41 PM3/3/21
to hugi...@googlegroups.com
Hi Kay and everyone else interested.

It is not so hard to create a bundle for pv after all. It took me an hour.
Currently the app does not have an .icns icon pack, so it has a default icon.
It also contains only a minimal, strictly necessary Info.plist, but that should be expanded.
PV.png
If you create a nice 512x512x300 dpi or 1024x1024x300 dpi PNG image, it is easy to create an icns icon pack (which shows app icons based on your system's resolution)
I copied the app bundle, the license and the readme into a distributable dmg.
dmgpicture.png



As a dmg containing a bundled app could be flagged as a security issue, I can't attach it to this mail.
Download from here: 

Harry

Kay F. Jahnke

unread,
Mar 4, 2021, 2:47:31 AM3/4/21
to hugi...@googlegroups.com
Am 03.03.21 um 22:20 schrieb Harry van der Wolf:

> Hi Kay and everyone else interested.
>
> It is not so hard to create a bundle for pv after all.
> It took me an hour.

Wow! Thanks for your work! I did think about building a Flatpak for
linux, but it looked like quite a bit of stuff one had to learn and do,
so I kept postponing it - and building on Linux is really easy.

But I have a Windows version ready and I'll follow your example and
share it publicly. It needs a 64bit system. All DLLs are included, no
installation necessary. Here it is:

https://www.magentacloud.de/lnk/luiIBoVK

Password: hugin-ptx

Just unpack the ZIP file anywhere, navigate to find 'pv.exe' or plain
'pv' if you have opted to hide filename extensions on your system and
double-click on it. You'll get a file select box where you can pick
images and PTOs. When you see an image displayed, move the mouse up to
the top margin, and three rows of buttons will show. Many buttons have
to be click-held for the desired effect. secondary-click-hold the
buttons for a quick explanation and alternative access to the functions.
For everything else, please refer to the included documentation in
README.html

Enjoy. Comments welcome. And keep in mind that this is a development
snapshot (made yesterday), it may be buggy and the newest functionality
is not yet documented properly.

Kay

Harry van der Wolf

unread,
Mar 4, 2021, 3:05:46 AM3/4/21
to hugi...@googlegroups.com


Op do 4 mrt. 2021 om 08:47 schreef 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com>:
Am 03.03.21 um 22:20 schrieb Harry van der Wolf:

 > Hi Kay and everyone else interested.
 >
 > It is not so hard to create a bundle for pv after all.
 > It took me an hour.

Wow! Thanks for your work! I did think about building a Flatpak for
linux, but it looked like quite a bit of stuff one had to learn and do,
so I kept postponing it - and building on Linux is really easy.


I have built 2 Linux AppImages by now (3 if you count my Hugin try a few weeks ago, but that's an unstable one).
It takes some time to learn how to build an AppImage, but once you know how, it is not so difficult (as with most things you have learned). It is mostly copying, pasting and combining the "things" the Linux build system already compiled for you. The same goes for a .deb: I built two of those as well, but never offered them to Debian.

Harry

 

Kay F. Jahnke

unread,
Mar 4, 2021, 5:43:43 AM3/4/21
to hugi...@googlegroups.com
Am 03.03.21 um 22:20 schrieb Harry van der Wolf:
>
> It is not so hard to create a bundle for pv after all. It took me an hour.
> Currently the app does not have an .icns iconpack, so it has a default icon
> It also contains a minimal, strictly necessary Info.plist, but that
> should be expanded.
> If you create a nice 512x512x300 dpi or 1024x1024x300 dpi PNG image, it
> is easy to create an icns icon pack (which shows app icons based on your
> systems resolution)

I think I would be happy with a readymade image. It's in the public
domain, from Wikimedia Commons, and depicts Odin's ravens, Hugin and
Munin, sitting on a branch. I think this is apt. The site has several
download sizes apart from the SVG, but maybe an SVG is even better to
make an icon pack? This is the one:

https://commons.wikimedia.org/wiki/File:Odin%27s_ravens_right.svg

I suppose something simple with a stark contrast will also be good for
lower resolutions, and should be easily recognizable.

Kay

Harry van der Wolf

unread,
Mar 4, 2021, 8:11:27 AM3/4/21
to hugi...@googlegroups.com
All,

As a lunch activity ;)
Nothing changed in pv. I only added the icon set and rebuilt the bundle and dmg.
image.png
(Screenshot in Dutch. Had not noticed that yesterday :) )

It needs to be run on a Mac from subfolder mac-bundle inside the pv build folder.
This is only after you have built pv as binary, but that is part of the pv build instruction.
This all requires more time and effort to "beautify" it, but it simply works.

(previous link removed)

@Kay: You will need to add versioning to pv. That can be used to verify the binary ( pv --version ), and create bundles, exes, debs, appimages, and so on.
I now simply used 1.0 for the bundle.

Harry


Op do 4 mrt. 2021 om 11:43 schreef 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com>:
--
A list of frequently asked questions is available at: http://wiki.panotools.org/Hugin_FAQ
---
You received this message because you are subscribed to the Google Groups "hugin and other free panoramic software" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hugin-ptx+...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/hugin-ptx/534e68e2-9ffa-d215-c078-e5da90663f09%40yahoo.com.

Kay F. Jahnke

unread,
Mar 4, 2021, 10:48:54 AM3/4/21
to hugi...@googlegroups.com
Am 04.03.21 um 14:11 schrieb Harry van der Wolf:

> As a lunch activity ;)
> Nothing changed in pv. I only added the icon set and rebuilt
> the bundle and dmg.

Looks nice, thanks! I like the icon :)

> Script and files to build the bundle and dmg:
> https://mega.nz/file/...
> It needs to be run on a Mac from subfolder mac-bundle inside the pv
> build folder.

Is it okay if I put a link to your script and the bundle on the
bitbucket page? I'd also like to give you credit for the effort, is it
okay to mention you by name?

> This all requires more time and effort to "beautify" it, but it simply
> works.

Anything you would like me to do to help beautifying?

>> @Kay: You will need to add versioning to pv. That can be used
> to verify the binary ( pv --version ), and create bundles, exes,
> debs, appimages, and so on.
> I now simply used 1.0 for the bundle.

Good point. I'll figure out a numbering scheme.
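For the record, the version query Harry suggests needs only a few lines; the version string below is of course hypothetical, since pv has no official numbering scheme yet:

```cpp
#include <cassert>
#include <string>
#include <vector>

// hypothetical version string - to be replaced by whatever scheme
// pv eventually adopts
const std::string PV_VERSION = "1.0.0";

// true if any command line argument asks for the version; the caller
// would then print PV_VERSION and exit before opening a window
bool wants_version(const std::vector<std::string> &args) {
    for (const std::string &a : args)
        if (a == "--version")
            return true;
    return false;
}
```

The same string can then be injected into the Info.plist, the .deb control file, and so on by the packaging scripts.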

Kay

Harry van der Wolf

unread,
Mar 4, 2021, 12:28:27 PM3/4/21
to hugi...@googlegroups.com


Op do 4 mrt. 2021 om 16:48 schreef 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com>:

Is it okay if I add put a link to your script and the bundle on the
bitbucket page? I'd also like to give you credit for the effort, is it
okay to mention you by name?

That's OK. I actually wanted to do a pull request that would simply add a subfolder for mac building to your repo, but then I would have to register at bitbucket, fork it, etc., so I simply zipped it.
In my cross-platform repos I mostly add a folder "packaging" with subfolders like "macos", "windows", debian (for a .deb") and AppImage, with scripts in there and "base" files.
I leave it completely up to you what you want.

And with regard to the dmg: Once pv is stable, and thereby your exe package, the mac bundle, the AppImage or Flatpak, and whatever other distribution option, you might start adding releases to your repo.
 
Anything you would like me to do to help beautifying?


With beautifying I mean:
- Comment the script a little bit more.
- Make some sed and/or awk steps where you start the script with a version number and it automatically updates the Info.plist.
- Expand the Info.plist with PTO MIME document typing; it now only covers jpg, jpeg, tif, tiff and png. (And I forgot OpenEXR; that needs to be added as well.) This allows a user to double-click or right-click an image and use "open with ...".

Those last 3 steps are 20-30 minutes of work (once the versioning is implemented) as I will simply copy the structure from my repos.

Harry

Harry van der Wolf

unread,
Mar 4, 2021, 12:36:41 PM3/4/21
to hugi...@googlegroups.com
Compiling pv on my Ubuntu Mate raspberry pi 4.
A very short action:
 make
clang++ -c -Ofast -march=native -std=c++11 -c -fno-math-errno -Wno-unused-value pv_no_rendering.cc -o pv_no_rendering.o
clang: error: the clang compiler does not support '-march=native'
make: *** [makefile:103: pv_no_rendering.o] Error 1


So it works fine on Debian Buster on my server RPi4, but not on my RPi4 with Ubuntu Mate 20.04 (Ubuntu 20.04.2 LTS; uname -a: Linux ubuntu 5.4.0-1028-raspi #31-Ubuntu SMP PREEMPT Wed Jan 20 11:30:45 UTC 2021 aarch64 aarch64 aarch64 GNU/Linux)

So I removed the -march=native from the common_compiler_flags in the makefile and started make again.

Now it builds 3 versions which I quickly checked on a partial pano on a 1920x1080 monitor. 
See attached txt files.

Harry
pv_vspsimd.txt
pv_scalar.txt
pv_vcsimd.txt

Kay F. Jahnke

unread,
Mar 4, 2021, 3:14:41 PM3/4/21
to hugi...@googlegroups.com
Am 04.03.21 um 18:36 schrieb Harry van der Wolf:

> Compiling pv on my Ubuntu Mate raspberry pi 4.
> A very short action:
>  make
> clang++ -c -Ofast -march=native -std=c++11 -c -fno-math-errno
> -Wno-unused-value pv_no_rendering.cc -o pv_no_rendering.o
> clang: error: the clang compiler does not support '-march=native'
> make: *** [makefile:103: pv_no_rendering.o] Error 1

Just a quick thank-you back before I log off for the day...

That's funny. -march=native means 'compile using all instructions the
local processor supports' and usually yields the fastest binary the
compiler can produce for the local machine, and I've used the flag for
ages without problems. Without that flag, the compiler usually picks a
conservative fallback ISA which is assumed to work 'everywhere' (that
would be all ARM chips in your case) and is usually slow - e.g. on intel
machines that's SSE, which pretty much every intel processor still alive
can handle. So not getting the 'native' ISA may slow things down,
especially when the vector units, which make pv fast, are not used.

Of course the CPU and ISA are not the only factors for speed - pv also
uses a lot of memory bandwidth.

There is another alternative - to use g++ instead of clang++. On intel
machines, I get slower binaries with g++, but that does not mean it's
the same on ARM. -march=native is really a g++ option, and clang++ is
usually 100% compatible - though on AArch64, older clang versions only
accept -mcpu=native instead, which may be what you're running into...

And did you check the compiler's version as well?

> So it works fine on Debian Buster on my server RPi4, but not on my RPi4
> with Ubuntu Mate 20.04 (Ubuntu 20.04.2 LTS; uname -a: Linux ubuntu
> 5.4.0-1028-raspi #31-Ubuntu SMP PREEMPT Wed Jan 20 11:30:45 UTC 2021
> aarch64 aarch64 aarch64 GNU/Linux)
>
> So I removed the -march=native from the common_compiler_flags in the
> makefile and started make again.
>
> Now it builds 3 versions which I quickly checked on a partial pano on a
> 1920x1080 monitor.
> See attached txt files.

That's the proof I was hoping for. It's not fast (didn't expect that
from the little machine) but it *works* :D

and that suggests that all we'd need for an M1 build is someone with a
new mac - would be best to compile on that machine, but one might
cross-compile and only test the binary on the M1.

Anybody out there?

Concerning packaging I'll take a while longer to respond.

Kay

Harry van der Wolf

Mar 5, 2021, 3:56:42 AM
to hugi...@googlegroups.com


Op do 4 mrt. 2021 om 21:14 schreef 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com>:

And did you check the compiler's version as well?

Sorry, yes. I forgot to mention.
clang version 10.0.0-4ubuntu1
Target: aarch64-unknown-linux-gnu
Thread model: posix 


Kay F. Jahnke

Mar 5, 2021, 5:39:17 AM
to hugi...@googlegroups.com
Am 05.03.21 um 09:56 schrieb Harry van der Wolf:
>
>
> Op do 4 mrt. 2021 om 21:14 schreef 'Kay F. Jahnke' via hugin and other
> free panoramic software <hugi...@googlegroups.com
> <mailto:hugi...@googlegroups.com>>:
>
>
> And did you check the compiler's version as well?
>
>
> Sorry, yes. I forgot to mention.
> clang version 10.0.0-4ubuntu1

Still mystified by this, but let's ignore it for now - just build for
that Raspi without -march=native.

Regarding versioning: I decided to use git tags after all, and I've
pushed a new version to all 'interesting' branches and set a tag 1.0.2.

The versioning is done with a simple shell script, and the makefile
keeps a header 'pv_version.h' up-to-date, which pv in turn uses to
'know' the current version. To get pv to echo its version, just type

./pv --version | tail -1

If you're in the base folder of the source tree, you can also issue

./echo_version.sh

If you want to know the version from git, use

git tag --sort=creatordate | tail -1
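
For illustration, the tag-to-header step could look like this in
Python - a sketch only, since the real pv_version.sh is a shell script
and both the layout of pv_version.h and the PV_VERSION macro name used
here are my assumptions:

```python
# Hypothetical sketch of generating a version header from the newest
# git tag. The real pv repo does this with a shell script
# (pv_version.sh); the header layout and the PV_VERSION macro name
# are assumptions, not pv's actual contents.
import subprocess

def newest_tag():
    """Return the most recently created git tag, e.g. '1.0.2'."""
    out = subprocess.run(
        ["git", "tag", "--sort=creatordate"],
        capture_output=True, text=True, check=True).stdout
    return out.strip().splitlines()[-1]

def version_header(version):
    """Render a minimal C header exposing the version string."""
    return ('#ifndef PV_VERSION_H\n'
            '#define PV_VERSION_H\n'
            f'#define PV_VERSION "{version}"\n'
            '#endif\n')
```

Regenerating the header from the tag on each build is what lets the
binary 'know' its version without anyone editing a source file by hand.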

I'll be solely responsible for the numbers for the time being, using the
script

pv_version.sh

The new version might be worth testing - I'm really happy now with the
stitching, having ironed out a few glitches. My next post will have a
quick how-to to whet your appetite!

Kay

Kay F. Jahnke

Mar 5, 2021, 5:56:15 AM
to hugi...@googlegroups.com
In a way I'm surprised that so far nobody has actually reported doing
what I propose in this thread, namely performing a stitching job or
exposure fusion, using pv's adapted Burt & Adelson image splining
algorithm. I admit my documentation isn't so easy to access on this
topic, so here's a brief how-to:

First, you need a PTO file. pv only processes a subset of possible PTO
content, and the stitching algorithm has some limitations, so:

- your images should (for now) not have an alpha channel.

- source images should be rectilinear, spherical, cylindrical,
stereographic or fisheye.

- the PTO's 'p-line' is ignored, you may set the output projection with
a command-line parameter.

- masks are also ignored.

- for this trial, please use sRGB input (like, JPGs or PNGs) or EXR.

I recommend you work from the command line with pv in windowed mode, so
you can see what pv echoes. Let's say your PTO is called 'pano.pto'. Try
this (I'll explain the parameters afterwards):

./pv -W --feathering=10 --target_projection=spherical \
--snapshot_magnification=4 --window_width=1000 \
--window_height=500 pano.pto

You'll see pv's 'live stitch' - a 'quick' stitch which only uses slight
feathering between the facets.

Now use the arrow keys and the zoom keys - or mouse gestures - to show
in the viewer window what will later be in your stitched panorama. Tip:
if you lose the horizon, press 'L' for 'level'. There is no view
cropping in pv. 'Empty' space shows black, which is no problem. If the
aspect ratio is not right, just reshape the window. pv is WYSIWYG, so
the output will have the same size as the window.

If the brightness values of the images don't seem to fit well, press
'Shift + L' to do an automatic brightness balance - don't worry about
slight differences after that.

Now press 'P'. pv will now stitch the panorama in the background - you
could carry on viewing or making more snapshots etc. - you just can't
'leave' the current image (set) until all snapshots, stitching jobs etc
which depend on it are done. For now, you may just press 'escape', pv
will terminate once the stitch is ready.

From the command line, you can see to which filename the stitch was
saved. Try and open it in pv to convince yourself that the stitch has in
fact happened - it should look much better than the 'live stitch' and be
close to what you are used to from using enblend.
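
To see what pv's adapted Burt & Adelson blending is doing conceptually,
here is a small pure-Python sketch in one dimension. Note that pv builds
its pyramids from b-splines; this toy version uses a plain [1, 2, 1]/4
kernel and nearest-neighbour upsampling, so it illustrates the scheme,
not pv's actual implementation.

```python
# Toy 1-D sketch of Burt & Adelson multiresolution blending.
# pv derives its pyramids from b-splines; here a simple [1, 2, 1]/4
# kernel stands in, so this is an illustration, not pv's code.

def smooth(s):
    """Smooth with a [1, 2, 1]/4 kernel, clamping at the edges."""
    n = len(s)
    return [(s[max(i - 1, 0)] + 2 * s[i] + s[min(i + 1, n - 1)]) / 4.0
            for i in range(n)]

def downsample(s):
    """Keep every second sample."""
    return s[::2]

def upsample(s, n):
    """Nearest-neighbour upsample to length n, then smooth."""
    return smooth([s[min(i // 2, len(s) - 1)] for i in range(n)])

def laplacian_pyramid(s, levels):
    """Per-level detail signals plus the low-resolution residue."""
    pyr = []
    for _ in range(levels):
        low = downsample(smooth(s))
        pyr.append([a - b for a, b in zip(s, upsample(low, len(s)))])
        s = low
    return pyr + [s]

def gaussian_pyramid(s, levels):
    """The signal at successively lower resolutions."""
    pyr = [s]
    for _ in range(levels):
        s = downsample(smooth(s))
        pyr.append(s)
    return pyr

def blend(a, b, mask, levels=3):
    """Blend a and b level by level, then collapse the result."""
    la = laplacian_pyramid(a, levels)
    lb = laplacian_pyramid(b, levels)
    gm = gaussian_pyramid(mask, levels)
    out = [[m * x + (1.0 - m) * y
            for m, x, y in zip(gm[i], la[i], lb[i])]
           for i in range(levels + 1)]
    s = out[-1]                       # start from the residue
    for i in range(levels - 1, -1, -1):
        s = [d + u for d, u in zip(out[i], upsample(s, len(out[i])))]
    return s
```

A hard 0/1 mask blended this way shows no visible seam, because low
frequencies get mixed over wide regions while high frequencies are only
mixed near the mask border - that is the whole point of the
multiresolution spline.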

Here is the explanation of the parameters:

-W run pv in a window, not in fullscreen mode

--feathering=10 use a bit of feathering at the facet borders and for the
stitching masks

--target_projection=spherical view and create output in spherical projection

--snapshot_magnification=4 make the output 4X as large as the viewer window

--window_width=1000 viewer window's width

--window_height=500 viewer window's height

pano.pto your PTO file
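
The --feathering idea above can be sketched in pure Python: turn a hard
0/1 stitching mask into weights that ramp over a few samples at each
border. This is a 1-D illustration of the concept, not pv's actual
feathering code.

```python
# Toy 1-D sketch of feathering: turn a hard 0/1 stitching mask into
# weights that ramp linearly over 'width' samples at each border.
# This only illustrates the idea; pv's actual feathering differs.

def feather(mask, width):
    """Distance-to-border weights, clipped to [0, 1]."""
    n = len(mask)
    # distance (in samples) from each '1' sample to the nearest '0'
    dist = [0 if m == 0 else n for m in mask]
    for i in range(1, n):                 # forward sweep
        dist[i] = min(dist[i], dist[i - 1] + 1)
    for i in range(n - 2, -1, -1):        # backward sweep
        dist[i] = min(dist[i], dist[i + 1] + 1)
    return [min(d / width, 1.0) for d in dist]
```

With width=2, a mask like [0, 1, 1, 1, 1, 0] becomes a gentle ramp
instead of a hard cut, which is what hides small brightness steps at
facet borders in the 'live stitch'.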

So you see, stitching with pv is no rocket science, and you can take it
from there! To do an exposure fusion, let's suppose you have the
registered bracket in 'bracket.pto'. Try this:

./pv -W --next_after_fusion=yes --snapshot_like_source=yes bracket.pto

You'll see a viewer window while pv does a batch job, and when the job
is done the window closes:

--next_after_fusion=yes behaves as if you had pressed first 'U' then
'Tab' - so it launches the exposure fusion, then proceeds to the next
image - or terminates, since there is no next image.

--snapshot_like_source=yes creates output which has the same size and
projection as the input

That's just to give you a taste for pv's batch mode, which you can in
fact use to batch-process entire collections of PTOs and images,
stitching, fusing, HDR-merging or snapshotting along the way to your
heart's content ;)

Enjoy

Kay


Harry van der Wolf

Mar 5, 2021, 6:43:38 AM
to hugi...@googlegroups.com


Op vr 5 mrt. 2021 om 11:56 schreef 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com>:
In a way I'm surprised that so far nobody has actually reported doing
what I propose in this thread, namely performing a stitching job or
exposure fusion, using pv's adapted Burt & Adelson image splining
algorithm. I admit my documentation isn't so easy to access on this
topic, so here's a brief how-to:


I did not report anything back and did not do a panorama job, but I did some "enfusing" jobs on bracket images: brackets of 3 images and brackets of 5 images.
I used align_image_stack to align them (hand-shot images) and let AIS create a pto.
Then I used pv to open the pto and pressed "U" and "Shift-U" to fuse them.
I also used these ptos with a complete command-line parameter batch set (from your Readme), but pv does not exit after the last pto.
Both on the RPi4 and my Linux laptop.

I did not report back as I am also a complete newbie to pv and first wanted to explore it a bit deeper.

Results are good and completely comparable with enfuse.
Panos this weekend.

Harry

Kay F. Jahnke

Mar 5, 2021, 1:18:20 PM
to hugi...@googlegroups.com
Am 05.03.21 um 12:43 schrieb Harry van der Wolf:
>
>
> Op vr 5 mrt. 2021 om 11:56 schreef 'Kay F. Jahnke' via hugin and other
> free panoramic software <hugi...@googlegroups.com
> <mailto:hugi...@googlegroups.com>>:
>
> In a way I'm surprised that so far nobody has actually reported doing
> what I propose in this thread, namely performing a stitching job or
> exposure fusion, using pv's adapted Burt & Adelson image splining
> algorithm. I admit my documentation isn't so easy to access on this
> topic, so here's a brief how-to:
>
> I did not report anything back and did not do a panorama job, but I did
> some "enfusing" jobs on bracket images.brackets of 3 images and brackets
> of 5 images.
> Using align_image_stack to align them (hand shot images) and let AIS
> create a pto.

That's a good way of doing it, and sets reasonable Ev values as well as
doing the registration. Good to batch as well. I tend to collect all
brackets from a take into separate folders (I have a script for that),
then make the PTOs one per folder. After that, a pv batch job (or
several). You can also make HDR-blended images instead of exposure
fusions, by using

--snapshot_extension=exr --blending=hdr

which you can also batch with

--next_after_snapshot=yes

and you may want

--aeb_auto_brightness=yes

for a batch job, because that does a light balance before blending -
especially for JPGs, the brightness values from the PTOs aren't always
optimal.

> Then use pv to open the pto and use "U" and "Shift-U" to fuse them.

It's nice to go through the set 'manually' - also to set the shape of
the output.

Shift+U simply takes the first image as output template and renders to
that size and projection. You can actually choose to use any shot of the
bracket as the template, with --snapshot_facet=...

I sometimes use the darkest shot as frame reference - the second, or
number 1 as pv counts from zero, for my Canon cameras - because it's
most likely to have non-overexposed content all over the frame.

'U' - without shift - fuses the current view, with the same shape and
content as the window you see on-screen. Like, you can work with a
square window and a fixed 1000x1000 output size and produce frames for
you know what.

> I also used these ptos with a complete command-line parameter batch set
> (from your Readme), but pv does not exit after the last pto.
> Both on the RPi4 and my Linux laptop.

It should, though...

It's nice if you send the actual command line you used if something
unexpected happens - makes it easier for me to figure out if something
is genuinely wrong. But your hint made me look into the logic, and I
actually spotted and fixed a bug here, which seems unrelated, though.
Not the last bug...

> I did not report back as I am also a complete newby to pv and first
> wanted to explore it a bit deeper.

No problem. Newbies tend to come up with problems that 'experts' would
never manage to produce ;)

> Results are good and completely comparable with enfuse.

I take that as a compliment! With the default settings, pv only uses
'exposure_weight' with the standard values for sigma and mu, as proposed
by Mertens, Kautz and Van Reeth - also the standard in enfuse. enfuse
adds a default saturation weight of .2, which pv does not (yet)
support. You can vary mu and sigma through command line parameters. I
also added contrast weighting as an experiment (--contrast_weight=...)
which I calculate from the gradient squared magnitude of the b-splines
in the pyramid - the derivative of the splines is easy to get, so this
is quick and precise, but slightly different from standard B&A and
enfuse. I do the GSM separately over the colour channels, and I have
the impression that mixing in a bit of contrast weight (like .2) makes
the result more 'vivid'.
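
To make the weighting concrete, here is a small pure-Python sketch of
the well-exposedness weight with the standard mu = 0.5 and sigma = 0.2,
plus a central-difference stand-in for the gradient squared magnitude.
pv computes the GSM from the b-spline derivatives and works on 2-D
colour data, so treat this 1-D version - including the names
fusion_weights and contrast_mix - as illustration only.

```python
# Sketch of Mertens/Kautz/Van Reeth 'well-exposedness' weighting (the
# Gaussian around mid-grey that enfuse and pv use by default) plus a
# simple gradient-squared contrast term. The way the two terms are
# combined here is illustrative, not pv's exact code.
import math

MU, SIGMA = 0.5, 0.2   # the standard values

def exposure_weight(v):
    """Gaussian well-exposedness weight for an intensity v in [0, 1]."""
    return math.exp(-((v - MU) ** 2) / (2.0 * SIGMA ** 2))

def contrast_weight(signal, i):
    """Gradient squared magnitude via central differences (1-D toy)."""
    left = signal[max(i - 1, 0)]
    right = signal[min(i + 1, len(signal) - 1)]
    g = (right - left) / 2.0
    return g * g

def fusion_weights(signals, i, contrast_mix=0.2):
    """Per-image weights at position i, normalised to sum to 1."""
    raw = [exposure_weight(s[i]) + contrast_mix * contrast_weight(s, i)
           for s in signals]
    total = sum(raw) or 1.0
    return [w / total for w in raw]
```

The weight peaks at mid-grey and falls off toward the clipped ends of
the range, so each output pixel is dominated by whichever exposure
renders it best.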

> Panos this weekend.

Have fun! And pull and build afresh before you use pv - right now I'm
pushing new stuff frequently.

Kay

Harry van der Wolf

Mar 6, 2021, 5:02:07 AM
to hugi...@googlegroups.com
PV 1.0.2 bundle for MacOS.

Op vr 5 mrt. 2021 om 19:18 schreef 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com>:
--
A list of frequently asked questions is available at: http://wiki.panotools.org/Hugin_FAQ
---
You received this message because you are subscribed to the Google Groups "hugin and other free panoramic software" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hugin-ptx+...@googlegroups.com.

Kay F. Jahnke

Mar 6, 2021, 6:02:47 AM
to hugi...@googlegroups.com
Am 06.03.21 um 11:01 schrieb Harry van der Wolf:

> PV 1.0.2 bundle for MacOS.
> ...

Thanks!

I put a 'scripts' folder into the mac branch and copied the files from
your mac-scripts 1.0.2 into it. So now the bundling-code is under
revision control.

I also did a bug fix removing gradient-like artifacts from fused image
margins - pushed to all branches, but no new version number yet.
Kay

yuv

Mar 6, 2021, 2:03:45 PM
to hugi...@googlegroups.com
On Wed, 2021-03-03 at 17:54 +0100, 'Kay F. Jahnke' via hugin and other
free panoramic software wrote:
> Am 03.03.21 um 13:27 schrieb yuv:
>
> > why not adding to pv the
> > missing functions from Hugin rather than the other way around?
>
> In a way this is what I've been doing.

From far away, it looks like there are four alternatives:
(a) do nothing;
(b) distribute pv along Hugin;
(c) integrate pv's functionalities into Hugin; or
(d) integrate Hugin's functionalities into pv.

How far is pv from being a replacement to Hugin, and do you see it
going there?

Conversely, how different is pv from Hugin and how difficult would
that integration work be? I suspect that the use of a different GUI
toolkit is a major obstacle, but perhaps your expertise proves my
intuition wrong?

Which brings us to what I understand is your proposal: to distribute
pv alongside Hugin. The cost to Hugin is "only" more build/distribution
complexity. Or am I missing something? And it does not seem to be
such a heavy toll, based on Harry's feedback here. Do the benefits to
Hugin's users justify the extra build/distribution complexity? And are
there things that can be done to reduce that complexity, e.g. moving pv
from bitbucket to the same repo as Hugin?

Distributing pv alongside Hugin would give pv more visibility, and
photographers would receive an additional, valuable tool in their
toolbox. But what would it give Hugin other than more
build/distribution complexity? Note that the question is only meant to
be thought-provoking and does not represent an opinion. It is only the
opinion of those who do the heavy lifting that counts; and ultimately,
as in all open source projects, the driver is users' needs.

Hugin is already a bundle of more or less integrated tools and
libraries. It is proof that the unix philosophy of one tool for one
task is not universal. One such limit is a GUI workflow tool like
Hugin, where the integration of editing / stitching / blending tools
has enabled usability that was not available when all of these
functions were in isolated single-purpose tools.

Does it make sense to distribute Multiblend with Hugin as well, and
integrate it as a possible workflow option?


> I decided [...] to *not* use a traditional GUI library
> like wxWidgets or Qt

Maybe that is the technical obstacle, and it needs reconsideration?
Integration is a two-way street.


> Look at my implementation of the B&A algorithm, and compare it
> to 'traditional' approaches, and judge for yourself:
>
> https://bitbucket.org/kfj/pv/src/master/pv_combine.cc
>
> Compared to traditional code, this is alien technology.

I am sure there are experts in alien technology. I am not one, don't
ask for my judgment. All I see is a beautiful and useful tool that has
fulfilled a niche left open by the existing tools. I see the same in
Multiblend. And I recall how Hugin started as an overall GUI to
visualize, control, and synchronize different such specialized tools.

The general question becomes: what tests need to be met for a tool to:
(a) be distributed with Hugin (the minimum common denominator)?
(b) get a tab or other UI element within Hugin as an addition to the
existing tabs (i.e. pv as the third view mode)?
(c) replace a tool within Hugin (IIRC some control point finders have
been deprecated/removed)?

With a clear set of rules for the above, you'd have a target to work
toward.

I, as an external spectator who no longer has the time to shoot, edit,
stitch panos, can only cheer from the sidelines the efforts of
developers and builders sharing their newest development and trying to
make them work for the rest of us.

Monkey

Mar 6, 2021, 5:22:17 PM
to hugin and other free panoramic software
Tried to build on Ubuntu Mate and got this:

/usr/include/Vc/scalar/../common/../sse/intrinsics.h:601:13: error: argument to '__builtin_ia32_vec_ext_v4sf' must be a constant integer
            _MM_EXTRACT_FLOAT(f, v, i);
            ^~~~~~~~~~~~~~~~~~~~~~~~~~
/usr/lib/llvm-10/lib/clang/10.0.0/include/smmintrin.h:876:11: note: expanded from macro '_MM_EXTRACT_FLOAT'
  { (D) = __builtin_ia32_vec_ext_v4sf((__v4sf)(__m128)(X), (int)(N)); }
          ^                                                ~~~~~~~~
1 error generated.
make: *** [makefile:61: pv_avx.o] Error 1


Kay F. Jahnke

Mar 7, 2021, 2:22:39 AM
to hugi...@googlegroups.com
Am 06.03.21 um 23:22 schrieb Monkey:

> Tried to build on Ubuntu Mate and got this:
>
> /usr/include/Vc/scalar/../common/../sse/intrinsics.h:601:13: error:
> argument to '__builtin_ia32_vec_ext_v4sf' must be a constant integer
>             _MM_EXTRACT_FLOAT(f, v, i);
>             ^~~~~~~~~~~~~~~~~~~~~~~~~~
> /usr/lib/llvm-10/lib/clang/10.0.0/include/smmintrin.h:876:11: note:
> expanded from macro '_MM_EXTRACT_FLOAT'
>   { (D) = __builtin_ia32_vec_ext_v4sf((__v4sf)(__m128)(X), (int)(N)); }
>           ^                                                ~~~~~~~~
> 1 error generated.

Please be more specific: what Linux version is it, and which version of
Vc are you including? The error is in Vc, did you build Vc from source?
That's most likely to succeed, please pick the 1.4 branch. You can find
instructions for compiling Vc from source on my bitbucket site, please
refer to the section 'Building pv on a debian-based system'.

If you are in a hurry to get code running and want to work around the Vc
problem, you can compile without Vc: just remove the '-D USE_VC'
statement in the makefile (compiler_flags, line 43). The resulting
binary will be about 30% slower than the Vc version. For experiments
trying to get the code compiled, you may also want to check out the
'native' branch of pv, which offers two non-Vc targets and has faster
turn-around.

Good luck
Kay

Harry van der Wolf

Mar 7, 2021, 4:05:39 AM
to hugi...@googlegroups.com
Sorry. Long mail.

Op za 6 mrt. 2021 om 20:03 schreef yuv <goo...@levy.ch>:

From far away, it looks like there are four alternatives:
(a) do nothing;
(b) distribute pv along Hugin;
(c) integrate pv's functionalities into Hugin; or
(d) integrate Hugin's functionalities into pv.


My vote would be for (b), just like Kay himself also mentions.
 
How far is pv from being a replacement to Hugin, and do you see it
going there?

Not there (yet). Hugin has way more tweaking options to create great looking panos. Translations for example are not supported by pv (yet?), and neither are masks.
Next to that: The Hugin assistant makes things way more easy than currently in pv. Of course: also getting the most out of Hugin, as in pv, requires some in-depth knowledge.
Also: pv's goal is to also improve your already created images (or panoramas).
 
Conversely, how different is pv from Hugin and how difficult would be
that integration work?  I suspect that the use of a different GUI
Toolkit is a major obstacle, but your expertise my intuition wrong?

Which brings us to what I understand is your proposal, to distributed
pv along Hugin.  The cost to Hugin are "only" more build/distribution
complexity.  Or am I missing something?  And it does not seem to be
such a heavy toll, based on Harry's feedback here.  Do the benefits to
Hugin's users justify the extra build/distribution complexity?  And are
there things that can be done to reduce that complexity, e.g. moving pv
from bitbucket to the same repo as Hugin?


I can't and won't say anything about the difference in code. I leave that to the experts.
But indeed: I see it as a supplementary tool. At least for the time being.
It can be added to Hugin as an extra app bundle (apple) or as a tool, or as an extra exe (on windows). Linux is of course the easiest one.
When looking at Hugin, like yuv described, Hugin is really a combination of all kind of tools.
Enblend and enfuse need to be dragged in (or multiblend for that matter). libpano13 is often too old and needs to be built as well. Both on windows and mac almost everything needs to be downloaded and compiled.

For me integration is not one big monolithic binary doing everything, but a suite consisting of nicely integrated tools via APIs.
If you look at web technology where all kind of technologies need to be connected, you see that APIs (or interfaces or "command line" parameters on a local platform, if you like) are the way forward to connect and integrate different technologies.
This also allows for developments (leaps in developments) in the separate tools as long as the API or interfaces remains consistent. Adding an extra option will not break anything.
I am not a programmer (well, an amateur at best), but in my work we bring a lot to the cloud where you have saas, paas and iaas solutions on a number of platforms (AWS, Microsoft, Google to name the biggest ones) which you want to talk to each other (in a safe and secure way). Nobody nowadays does everything in one "tool".

Why not pv also? It uses the same libraries as already necessary for Hugin and all the tools, with one "extra": Vc, and Vc needs to be of a high enough version. Which means it can be treated like libpano.

And with regard to integration: pv started as panorama viewer. In Hugin you could add as setting: "Open created panorama in pv", just like we do with another non-C, non WxWidgets, non-integrated tool being exiftool acting on our created pano from our original images. 
Opening pv from Hugin after creation gives you an immediate view on your results.
And pv can in that case also give you immediate options to tweak your created panorama.

And in 5 years or 10 years? Who knows.

Harry
 

Bruno Postle

Mar 7, 2021, 4:22:45 AM
to Hugin ptx
On Sun 07-Mar-2021 at 10:05 +0100, Harry van der Wolf wrote:
>Op za 6 mrt. 2021 om 20:03 schreef yuv:
>
>> From far away, it looks like there are four alternatives:
>> (a) do nothing;
>> (b) distribute pv along Hugin;
>> (c) integrate pv's functionalities into Hugin; or
>> (d) integrate Hugin's functionalities into pv.
>>
>My vote would be for (b), just like Kay himself also mentions.

I think this would be good, it makes sense to have a panorama viewer
in the Hugin bundle. Though aside from getting it working on all
architectures, there is a question of who maintains it once it is in
the Hugin codebase? There will be bugs that need fixing, weird new
compiler/library combinations, translations etc... Note that
currently there is a single person doing code maintenance,
collecting translations, _and_ doing the releases - these could
easily be different roles.

--
Bruno

Kay F. Jahnke

Mar 7, 2021, 5:22:20 AM
to hugi...@googlegroups.com
Am 07.03.21 um 10:05 schrieb Harry van der Wolf:
> Sorry. Long mail.
>
> Op za 6 mrt. 2021 om 20:03 schreef yuv
>
>
> From far away, it looks like there are four alternatives:
> (a) do nothing;
> (b) distribute pv along Hugin;
> (c) integrate pv's functionalities into Hugin; or
> (d) integrate Hugin's functionalities into pv.
>
>
> My vote would be for (b), just like Kay himself also mentions.

That's also what I would propose. The distribution process is already
established, and building pv as yet another binary to 'ride out' with a
hugin bundle should not be too hard - I have already put a lot of work
into making porting easy, as you can see from the fact that we have
proof of build and execution on Linux, macOS, Windows and even on the
Raspi's ARM processor.

Due to my licensing, everyone who complies with the GPLv3 is of course
free to 'rip my code apart' or fork pv, but I think there's not so much
interest for that - maybe after some time. And I like being upstream -
the debacle with hugin's python interface which has never made it far
beyond linux and broke with every new release is a warning to me: I'd
rather be responsible for my own code, remain upstream, and fork out
'release' versions which might be packaged with hugin, if the hugin crew
thinks that's a good idea. Packagers might also simply provide a
'flavour' of hugin which includes pv, and let the user choose which she
picks.

>
> How far is pv from being a replacement to Hugin, and do you see it
> going there?
>
> Not there (yet). Hugin has way more tweaking options to create great
> looking panos. Translations for example are not supported by pv (yet?),
> and neither are masks.

That is correct. Really, pv is a panorama *viewer*, and so far the
stitching and blending capabilities are 'nice to have' extra features.
They are also a playground for my new technology, my library 'vspline'
which does all the 'heavy lifting' in the background - and for its
application to image processing. I use pv as a showcase of how image
processing applications can profit from vspline, and as a handy test bed
to work on my image processing ideas. I hope that some of my development
work turns out to be attractive to other people so that they may *want*
to use vspline or some of pv's code - for example for blending and
stitching. I'm making an offer.

Having an application out which exhibits desirable features may help
popularize the underlying technology. It's interesting stuff, and it's
sad that so far it's largely ignored - I put a lot of effort into it,
and vspline is now quite mature and stable. Releasing it through debian
makes new versions 'trickle down' quite slowly - that's why I include
full vspline source code in pv's repo; vspline is header-only.

> Next to that: The Hugin assistant makes things way more easy than
> currently in pv. Of course: also getting the most out of Hugin, as in
> pv, requires some in-depth knowledge.
> Also: pv's goal is to also improve your already created images (or
> panoramas).

And to *display* them adequately. With smooth zooms and pans, QTVR-like
control, easy snapshotting, slideshows, etc. pp. - this is pv's core
functionality. Display your images and panoramas. Under a common
surface, with high quality, and an interface which 'puts nothing between
you and your images'. And with many options for experts who want to go
'deeper'.

> Conversely, how different is pv from Hugin and how difficult would be
> that integration work?  I suspect that the use of a different GUI
> Toolkit is a major obstacle, but your expertise my intuition wrong?
>
> Which brings us to what I understand is your proposal, to distributed
> pv along Hugin.  The cost to Hugin are "only" more build/distribution
> complexity.  Or am I missing something?  And it does not seem to be
> such a heavy toll, based on Harry's feedback here.  Do the benefits to
> Hugin's users justify the extra build/distribution complexity?  And are
> there things that can be done to reduce that complexity, e.g. moving pv
> from bitbucket to the same repo as Hugin?
>
>
> I can't and won't say anything about the difference in code. I leave
> that to the experts.
> But indeed: I see it as a supplementary tool. At least for the time being.
> It can be added to Hugin as an extra app bundle (apple) or as a tool, or
> as an extra exe (on windows). Linux is of course the easiest one.
> When looking at Hugin, like yuv described, Hugin is really a combination
> of all kind of tools.
> Enblend and enfuse need to be dragged in (or multiblend for that
> matter). libpano13 is often too old and needs to be built as well. Both
> on windows and mac almost everything needs to be downloaded and compiled.
>
> For me integration is not one big monolithic binary doing everything,
> but a suite consisting of nicely integrated tools via APIs.

I say it again: pv and hugin are complementary. If, eventually, it turns
out that one may benefit from the other in a more direct way (like code
sharing) that can be dealt with later.

hugin and pv code are very different: hugin uses a set of external
helper programs for many things, and is based on a C++ wrap of an old C
library, libpano. It relies on disk storage for intermediate results. pv
is 'monolithic' and keeps everything in memory (limiting the maximum
size of image/pano it can handle) - and everything is programmed from
scratch in C++11, relying on vspline's multithreaded SIMD code.
Please do have a look at vspline's and pv's source code and convince
yourself that it also is well-structured and well-commented - I find
some of hugin's code base hard to access.

> If you look at web technology where all kind of technologies need to be
> connected, you see that APIs (or interfaces or "command line" parameters
> on a local platform, if you like) are the way forward to connect and
> integrate different technologies.

pv and hugin do not need to share any API. pv shows images, and PTOs.
PTO format is not exclusive to hugin, but a panotools standard. That's
all the common ground that's needed. pv even has its own format for
synoptic views (adding, for example, display of 'cubemaps' which
hugin/panotools does not use) - but I wanted to open it up to the
'panotools universe' to make it easy for users: they already have their
panoramas in PTO, so why force them to use my PTO-to-ini-file
translator? PTO is quite easy to parse. And everyone has images!

> This also allows for developments (leaps in developments) in the
> separate tools as long as the API or interfaces remains consistent.
> Adding an extra option will not break anything.

I agree.

> I am not a programmer (weel, an amateur at best), but in my work we
> bring a lot to the cloud where you have saas, paas, iaas solutions on a
> number of platforms (AWS, Microsoft, Google to name the biggest ones)
> which you want to talk to each other (in a safe and secure way). Nobody
> nowadays does everything in one "tool".
>
> Why not pv also? It uses the same libraries as already necessary for
> Hugin and all the tools, with one "extra": Vc, and Vc needs to be of
> high enough version. Which means it can be treated like libpano.

I think pv is actually easy to build - compared to software like hugin.
Vc is a bit tricky: it's also a very good library, but like vigra it's
not so widely used, and does not have too much manpower behind it. It's
author, Matthias Kretz, has been part of formulating the upcoming C++
standard std::simd, and he is currently working on producing a good
reference implementation for this standard, and aims at implementing Vc
2.0 using std::simd. I have an experimental pv version out using
std::simd which can use his std::simd implementation, but the standard
has a few shortcomings, which hamper its use: it omits SIMD
gather/scatter operations and also Vc's combined load/shuffle and
shuffle/store routines, which are very relevant for working with
interleaved data (like pixel arrays). I have coded around these issues,
but the resulting code is definitely slower than using Vc, which has all
these capabilities - and for smooth animations, every millisecond counts,
so I stick with Vc for now.
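To make the interleaving point concrete, here is a small, stdlib-only Python sketch of the data movement involved. It only illustrates the access pattern - pv's actual code does this with SIMD vectors via Vc, and the names below are made up for the example:

```python
# Interleaved pixel buffer: four RGB pixels stored as R,G,B,R,G,B,...
interleaved = list(range(12))

# 'Gather' step: pick every third element, starting at the channel's
# offset. This strided access is what a SIMD gather - or Vc's combined
# load/shuffle - performs in one vector operation per channel.
r = interleaved[0::3]
g = interleaved[1::3]
b = interleaved[2::3]

# With the channels deinterleaved ('planar'), per-channel arithmetic
# becomes plain vertical math, which vectorizes trivially.
g = [2 * v for v in g]

# 'Scatter' step: interleave the channel vectors back into one buffer,
# the counterpart of a SIMD shuffle/store.
out = [v for px in zip(r, g, b) for v in px]

print(r)  # [0, 3, 6, 9]
```

Emulating the gather and scatter steps with scalar loads, as std::simd currently forces you to, is where the speed difference comes from.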

> And with regard to integration: pv started as panorama viewer. In Hugin
> you could add as setting: "Open created panorama in pv", just like we do
> with another non-C, non WxWidgets, non-integrated tool being exiftool
> acting on our created pano from our original images.
> Opening pv from Hugin after creation gives you an immediate view on your
> results.
> And pv can in that case also give you immediate options to tweak your
> created panorama.

Let me stress this point, too: with pv, you can directly look at the
PTO, just like at any other image. You don't have to stitch first. You
have the full set of options a panorama viewer offers to inspect the PTO
in great detail. You can set the output projection as you like, and
still zoom and pan etc. - and if you like what you see you can instantly
have 'hard copies' with the press of a single key.

My notion of 'live stitching' is an attempt to even make stitching
unnecessary - if the images fit well, the brightness is correctly
adjusted, and there is a bit of feathering, oftentimes that's perfectly
good enough - and it's instantly available. 'Live stitching' also has a
few advantages: if you have tele shots mixed in with interesting detail,
you can zoom in to the tele shot's full resolution. With stitched panos,
you only ever get the resolution the pano was stitched to. And if the
images vary in brightness, you still have the full range visible - you
can 'peek' into dark shadows or bright clouds, if your take has them -
in a stitched image, stuff gets clipped unless you stitch to HDR. Check
it out, it's great, I'm sure you'll like it. All this talk is academic,
really - Harry and I have made bundles for macOS and Windows, all you
need to do is download and try it. What are you waiting for?

Kay

Kay F. Jahnke

unread,
Mar 7, 2021, 5:47:45 AM3/7/21
to hugi...@googlegroups.com
On 07.03.21 at 10:22, Bruno Postle wrote:
I propose that for now I remain upstream and responsible for the code. I
can fork out releases which I consider sufficiently stable. The hugin
builders can simply clone my git repo, *there are already branches for
all major platforms*. What's missing is the *platform integration* -
like where it is installed, if it is installed at all (my Windows
bundles are stickware, so people can just put them on a DVD alongside
their images and give their users a free viewer to look at them),
what icon is displayed for it, what mime types are associated - and I
don't know enough about Windows or macOS to provide that, and when it
comes to Linux, I already have a debian package to maintain, so I'd be
happy if someone else could do that part - maybe someone who can build
AppImages or flatpaks. The hugin builders/packagers would be in a good
position to do so, because they already know how to roll out stuff to
their platform. Harry has demonstrated that for macOS this is not
terribly hard to do - but I could not have done this, I don't even have
a mac.

We don't know if pv will be well-received by the users - it may be too
complicated, not international enough, too slow for dual core laptops...
but throwing it out from a bundle is just as easy as putting it in and
helping it to a wider audience. Then people can decide whether they like
it or not. If people do like it, there will be ways to keep it afloat,
I'm sure. I'll do my best to keep offering a stand-alone version, and
when I get around to it, I'll try and set up a CI/CD pipeline to build
automatically and have binaries for every push, but it would be nice to
get some help for that as well, just as Harry has done for macOS. And I
did also propose that packagers might offer a 'flavour' of the hugin
bundle offering my code as an addition. No reason to make an either/or
decision.

For now I'm quite busy getting the code up to the features I intend it
to have, and that's what I think I'm good at.

Kay

yuv

unread,
Mar 7, 2021, 8:16:16 AM3/7/21
to hugi...@googlegroups.com
On Sun, 2021-03-07 at 09:22 +0000, Bruno Postle wrote:
> currently there is a single person doing code maintenance,
> collecting translations, _and_ doing the releases - these could
> easily be different roles.

This! What made Hugin great and successful in years past was the co-
operation and co-ordination of the different teams / authors to the
point of trusting and sharing full access to their respective code
repositories; being able to inter-changeably assume the different roles
for the different packages; reaching mutual agreement on repositories
and other tools to use to simplify interchangeability of roles; co-
ordinate releases, bug fixes, etc.

That co-ordination seems to have gone missing, and re-building it will
require sacrifices and compromises on all ends.

I have just taken a half-hour stroll down memory lane. The main
comment I have: Why have no admins been added to Hugin / Panotools /
Enblend in ages? My memory fades, but all the names I see are people
that were either there before me, or were added on my initiative. Who
has taken care of the team after I moved on because life?

* have any new contributors been invited?
* if yes, why have they not accepted the invitation?
* if no, why?
* how many other contributors has the ecosystem missed? not just admins
(the highest level of access/trust).

Below follows a longish (and incomplete) list of signs of what is in my
view organizational rot. Those signs point to extra work. I am not
asking the existing and dwindling team to take on that extra work. I
am saying that the organizational rot is the consequence of the
failure to welcome and embrace new contributors. Who wants to join a
dwindling team that does not welcome change?

adding team members requires two thrusts:
(a) individuals inspired to become team members, to bring new energy,
new ideas, new code, and eventually change to the projects;
(b) an existing team that is welcoming new contributors and accepts the
change that they bring.

where are those thrusts?

And now for the longish list.

* why are the projects still using Mercurial? IIRC back when the
switch to a unified VCS was made, Mercurial won because it was the one
with the most even support amongst operating systems (Windows!). Meanwhile,
the landscape has shifted, git is the tool. That would require some
flexibility from all individuals, including Kay (bitbucket? exotic! I
would not adopt it at this time).

* why are the projects still on Sourceforge? History is past, and they
may have corrected their blunders. But there is no going back, and today
Gitlab is the place to be (which would also replace Launchpad, which I
introduced ages ago).

* it pains me to see F*book promoted on Hugin's website. Really?
It surely explains the reduced traffic to this mailing list, which means
fewer potential contributors coming its way. Don't expect an
influencing/advertising tool that has an interest in isolating users into
echo chambers to be a positive for this community. If I had been around
when that decision came up for discussion, I would have vehemently
opposed it.

* the Enblend/Enfuse website is still... mine! What is the new
generation of contributors waiting for before leaving their mark? The
last news item is from five years ago? OK, the tool has matured and
there is no need for new releases, but someone with a little bit of
marketing flair could point to the continued flow of Hugin releases,
and now to Multiblend, a very interesting development?

* the Panorama Tools website lists its last news item from almost eight
years ago! The latest release tarball is from two years ago.

I could probably find a bunch more, but the point is: encourage new
contributors to take over and replace the current generation and take
this community to its future with as little interference as possible
from us oldies. Else, the community will die when we inevitably do.
So please, please, please, make this a more welcoming place to new
contributions. Kay and Harrie are no newbies, so imagine how difficult
this is for someone who has not been around for that long. The
alternative is to fade into irrelevance, which is why I asked the
thought-provoking question of whether pv is already a replacement for
Hugin.

Gunter Königsmann

unread,
Mar 7, 2021, 9:59:11 AM3/7/21
to hugi...@googlegroups.com, yuv
As a user the only thing I wonder about is if multiblend2 will be the new default instead of enblend. And we could try harder to explain the projection types to the users. And we could add a mosaic wizard. Besides that I cannot complain.

Kind regards,

Gunter.

--
A list of frequently asked questions is available at: http://wiki.panotools.org/Hugin_FAQ
You received this message because you are subscribed to the Google Groups "hugin and other free panoramic software" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hugin-ptx+...@googlegroups.com.


Kay F. Jahnke

unread,
Mar 7, 2021, 12:30:45 PM3/7/21
to hugi...@googlegroups.com
On 07.03.21 at 14:16, yuv wrote:
> On Sun, 2021-03-07 at 09:22 +0000, Bruno Postle wrote:
>> currently there is a single person doing code maintenance,
>> collecting translations, _and_ doing the releases - these could
>> easily be different roles.
>
> This! What made Hugin great and successful in years past was the co-
> operation and co-ordination of the different teams / authors to the
> point of trusting and sharing full access to their respective code
> repositories; being able to inter-changeably assume the different roles
> for the different packages; reaching mutual agreement on repositories
> and other tools to use to simplify interchangeability of roles; co-
> ordinate releases, bug fixes, etc.

I felt it was difficult to keep my foot in the door. Sort of
disheartened. Like, I got stern admonishments rather than friendly
encouragement. I prefer to run 'my own show' now, where I don't step on
anyone's toes.

> That co-ordination seems to have gone missing, and re-building it will
> require sacrifices and compromises on all ends.

I'm not sure if there is the will to rebuild stuff, much less sacrifice
anything. I have tried to help keep my python interface afloat, but
apart from that really I wouldn't want to touch any of hugin's code
anymore. I'd even let that go and not be too sad about it, if someone
came up with a better free stitcher, but I'm used to it and it does its
job, more or less. And I can tweak the sources locally to bend them to
my will. And fuse my stacks separately, when hugin insists that my
assignment of stacks is irrelevant to its heuristic approach. When it
comes to python, what I now use is cppyy
(https://cppyy.readthedocs.io/en/latest/index.html), that's much cooler
than swig.

> I have just taken a half-hour stroll down memory lane. The main
> comment I have: Why have no admins been added to Hugin / Panotools /
> Enblend in ages? My memory fades, but all the names I see are people
> that were either there before me, or were added on my initiative. Who
> has taken care of the team after I moved on because life?
>
> * have any new contributors been invited?
> * if yes, why have they not accepted the invitation?
> * if no, why?
> * how many other contributors has the ecosystem missed? not just admins
> (the highest level of access/trust).

Look at the hugin website: 'hugin is now stable'. I suppose that's the
idea. Yes, there have been great innovations in the past few years - the
control point tool, entering names of raw files to be converted to TIFF
by external helper programs, a dynamic range compression button, and I
discovered one can even use the mouse wheel to change the hfov in the
openGL preview! And that's only mentioning the stuff that immediately
springs to my mind!

A couple of weeks ago I even managed to unlock the positions of images
in a stack! It was easy! I only had to find the right submenu and click
on a bit of empty white space to get the dialog offering me to do so.
Only took me half an hour to figure it out, with the help of looking
through a few postings on hugin-ptx, which I would quote now if I could
find them again with the search tool - I suppose I need other keywords
than 'stack' and 'unlock'.

> Below follows a longish (and incomplete) list of signs of what is in my
> view organizational rot. Those signs point to extra work. I am not
> asking the existing and dwindling team to take on that extra work. I
> am saying that the organizational rot is the consequence of the
> failure to welcome and embrace new contributors. Who wants to join a
> dwindling team that does not welcome change?

Good question.

> adding team members requires two thrusts:
> (a) individuals inspired to become team members, to bring new energy,
> new ideas, new code, and eventually change to the projects;
> (b) an existing team that is welcoming new contributors and accepts the
> change that they bring.
>
> where are those thrusts?

> And now for the longish list.
>
> * why are the projects still using Mercurial? IIRC back when the
> switch to a unified VCS was made, Mercurial won because it was the one
> with the most even support amongst operating systems (Windows!). Meanwhile,
> the landscape has shifted, git is the tool. That would require some
> flexibility from all individuals, including Kay (bitbucket? exotic! I
> would not adopt it at this time).

I've been on bitbucket with all my projects for many years - for
upstream git repos, online presence and issue tracking bitbucket is just
fine, and I like that they are independent. They've been good to me,
nice and helpful, and I won't turn my back on them just because they are
'exotic'.
Downstream - that would be hugin, when it comes to pv - can be anything
doing git. The debian science team, where I maintain vspline, has moved
to gitlab, and I was happy about it, the old solution was a bit awkward.
gitlab is clean and capable, I think it's a good choice. But it's
definitely git now wherever it's hosted, I agree.

> * why are the projects still on Sourceforge? History is past, and they
> may have corrected their blunders. But there is no going back, and today
> Gitlab is the place to be (which would also replace Launchpad, which I
> introduced ages ago).

I never liked sourceforge, and I did not create an account back when I
contributed to hugin, because I did not want to sign the terms of
service. I wanted nothing to do with it, that's why I sent in patches
instead. I'd still like to see hugin move.

> * it pains me to see F*book promoted on Hugin's website. Really?
> It surely explains the reduced traffic to this mailing list, which means
> fewer potential contributors coming its way. Don't expect an
> influencing/advertising tool that has an interest in isolating users into
> echo chambers to be a positive for this community. If I had been around
> when that decision came up for discussion, I would have vehemently
> opposed it.

Gosh, I did not even see that. Yeah, posting to hugin-ptx has not been
rewarding for me recently. I started several attempts to get *anyone*
interested in my program. I did not get very far - a couple of
enterprising linuxers who built it and never said much more about it. Do
they use it? Do they like it? I have no idea. Some thanked me and even
gave a bit of praise.

When I offered ready-made Windows binary packages 'for evaluation'
recently, I had *two* takers, who never wrote back when I sent them the
download links, either to me or to the list. I had something else in
mind with 'evaluation'. And with a couple of linux people building pv, I
had thought I'd get at least ten Windows users. Not so. Funny that now,
all of a sudden, when I announce I can even stitch and fuse with pv,
there is some echo. Hardly anyone seemed to be interested in 'pv, the
multiplatform FOSS panorama and image viewer'. Maybe fb has an
integrated viewer.

> * the Enblend/Enfuse website is still... mine! What is the new
> generation of contributors waiting for before leaving their mark? The
> last news item is from five years ago? OK, the tool has matured and
> there is no need for new releases, but someone with a little bit of
> marketing flair could point to the continued flow of Hugin releases,
> and now to Multiblend, a very interesting development?

I don't think people at enfuse/enblend are interested anymore. I still
saw the same flaws (primary-coloured sparkle, which I assume is a failure
to do saturation arithmetic) after years, so now I'm happy I have my
own variant of the algorithm. I just haven't managed seam optimization -
yet.

> * the Panorama Tools website lists its last news item from almost eight
> years ago! The latest release tarball is from two years ago.

Panorama tools... That was not moving even years ago when I was
contributing to hugin. It's *stable*.

> I could probably find a bunch more, but the point is: encourage new
> contributors to take over and replace the current generation and take
> this community to its future with as little interference as possible
> from us oldies. Else, the community will die when we inevitably do.
> So please, please, please, make this a more welcoming place to new
> contributions. Kay and Harrie are no newbies, so imagine how difficult
> this is for someone who has not been around for that long. The
> alternative is to fade into irrelevance, which is why I asked the
> thought-provoking question of whether pv is already a replacement to
> Hugin.

Since you're playing devil's advocate (hah) - I can wrap all panotools
transformations in vspline, most of them are even easy to port to proper
vector code. I could throw in a few calls to helper programs like
cpfind, celeste, PTOptimizer - and that would yield a 'core' program
which can do the most important bits, one could take it from there. But,
as I said, hugin is good for image registration, and as Harry and Bruno
said, it can still do quite a few things which pv doesn't, and wasn't
meant to do. I still favor two separate programs, but I'd be happy to
see hugin evolve. Maybe some of the tricks pv can do can serve as an
inspiration. And, with pv, I like to 'move fast and break things'. Come
for the ride!

Kay

Kay F. Jahnke

unread,
Mar 8, 2021, 4:30:14 AM3/8/21
to hugi...@googlegroups.com
I've made a new pv release, version 1.0.3

This now has code to do:

- exposure fusion
- image stitching
- faux bracketing

And documentation for the three tasks, in a new chapter in the
documentation named 'Exposure Fusion and Image Stitching', which you can
also find displayed on my bitbucket repo's front page (just scroll
down). The code is still experimental, and I still have not included
alpha channel processing, but it's stabilizing nicely, I hope there are
no major glitches.

Harry, would you mind making another mac bundle? So far we haven't heard
of any takers, but maybe they just haven't reported back... I'll build
another windows bundle soon and upload that to my cloud storage.

Kay

Kay F. Jahnke

unread,
Mar 8, 2021, 5:50:08 AM3/8/21
to hugi...@googlegroups.com
There's a new windows stickware bundle of pv, version 1.0.3:

https://www.magentacloud.de/lnk/w7Cohv9v

Password: hugin-ptx

This bundle has updated code for

- image stitching
- exposure fusion
- faux brackets

Please refer to the documentation on my bitbucket repo

https://bitbucket.org/kfj/pv

for more details; there is now a new chapter 'Exposure Fusion and Image
Stitching' - also included in README.html in the bundle.

Kay

Kornel Benko

unread,
Mar 8, 2021, 7:00:48 AM3/8/21
to hugi...@googlegroups.com
On Wed, 3 Mar 2021 10:32:40 +0100,
"'Kay F. Jahnke' via hugin and other free panoramic software"
<hugi...@googlegroups.com> wrote:

> If my code turns out to be good, I'd be happy to see it used by others.
> I just don't want to go fiddling with other people's software to force
> my new stuff in. Integration into extant software is better done by the
> original authors, who know their way around their code. As far as pv and
> it's new components are concerned, I want to stay upstream and focus on
> innovation.

Hi Kay,
just to give some feedback about using pv.
Compiles fine (source from the repo), happy to use it.
I have to specify the font though to run pv.
Works fine on .pto files.
Thanks for the nice tool.

Kornel

Kay F. Jahnke

unread,
Mar 8, 2021, 11:10:31 AM3/8/21
to hugi...@googlegroups.com
On 08.03.21 at 13:00, Kornel Benko wrote:
> On Wed, 3 Mar 2021 10:32:40 +0100,
> "'Kay F. Jahnke' via hugin and other free panoramic software"
> <hugi...@googlegroups.com> wrote:
>
>> If my code turns out to be good, I'd be happy to see it used by others.
>> ...
>
> just to give some feedback about using pv.
> Compiles fine (source from the repo), happy to use it.
> I have to specify the font though to run pv.

Yes, pv expects the font in the same folder where the binary is - so if
you move the binary somewhere else, best to move the font along with it,
and if not, well - you found out already ;)

You can use other fonts as well if the Sansation font does not please
you (any TTF should do, probably others as well), but other fonts don't
always fit well into the buttons, and they don't always have all the
characters pv needs.

> Works fine on .pto files.
> Thanks for the nice tool.

You're welcome, thanks for reporting back!

Kay

Harry van der Wolf

unread,
Mar 8, 2021, 12:02:48 PM3/8/21
to hugi...@googlegroups.com
Hi,

I compiled 1.0.3 on my 64bit GalliumOS 3.1 (Ubuntu 18.04 LTS) laptop. Started pv directly with 5 already aligned tifs. (In Align_image_Stack I created earlier both a pto and the aligned tifs: align_image_stack -v -p 8X.pto -a 8X P10048x.JPG )

command: pv 8X000?.tif
<snip>
interpreting trailing args as image files...
image file: 8X0000.tif
image file: 8X0001.tif
image file: 8X0002.tif
image file: 8X0003.tif
image file: 8X0004.tif
</snip>

Pressed 'u' and this happened
pv: pv_no_rendering.cc:9063: int inner_main(): Assertion `nfacets' failed.

Started pv again in the same way and pressed 'Shift-U' and exactly the same error happened.

Then I started pv with the pto as in: pv 8X.pto
Using 'U' or 'Shift-U' now works fine.
I did not do further tests yet.

Harry

Kay F. Jahnke

unread,
Mar 8, 2021, 2:43:35 PM3/8/21
to hugi...@googlegroups.com
On 08.03.21 at 18:02, Harry van der Wolf wrote:
> Hi,
>
> I compiled 1.0.3 on my 64bit GalliumOS 3.1 (Ubuntu 18.04 LTS) laptop.
> Started pv directly with 5 already aligned tifs.

This won't work.

> (In Align_image_Stack I
> created earlier both a pto and the aligned tifs: align_image_stack -v -p
> 8X.pto -a 8X P10048x.JPG )
>
> command: pv 8X000?.tif
> <snip>
> interpreting trailing args as image files...
> image file: 8X0000.tif
> image file: 8X0001.tif
> image file: 8X0002.tif
> image file: 8X0003.tif
> image file: 8X0004.tif
> </snip>
>
> Pressed 'u' and this happened
> pv: pv_no_rendering.cc:9063: int inner_main(): Assertion `nfacets' failed.
>
> Started pv again in the same way and pressed 'Shift-U' and exactly the
> same error happened.

For *synoptic* displays of several images you need image registration
first - your invocation is fine for slideshows etc, where you look at
*one image after the other* - try pressing 'Tab' to get from one to the
next. If you want several images to show at the same time, you must use
a PTO file or an ini file to tell pv how they are positioned relative to
each other. That's why it says "assertion 'nfacets' failed": I use
nfacets as the number of images in a synoptic display, like a panorama
or a bracket. If it's 0, there is nothing synoptic - there are zero
facets. Hence the failed assertion.

> Then I started pv with the pto as in: pv 8X.pto
> Using 'U' or 'Shift-U' now works fine.
> I did not do further tests yet.

So there - with a PTO it's fine. Maybe I should be less harsh about
people trying to fuse single images...

I hope to eventually offer services to do simple image registration in
pv - this would require control point detection and optimization - but
that's a major step and I'm not ready yet to do it.

If you have images which you know are aligned perfectly, an alternative
to a PTO is an 'ini' file, which is not very hard to make. Please refer
to the documentation - look e.g. in README.rst, line 1746ff. Here's an
example:

In an ini file feeding a synoptic view, use projection=facet_map and
introduce each image with a facet=8X0000.tif line etc. - every key-value
pair has to go into a separate line. blending=hdr is just for looking at
the images correctly, and the fov is mandatory:

projection=facet_map
blending=hdr
facet_hfov=50.0
facet=8X0000.tif
facet=8X0001.tif
facet=8X0002.tif
facet=8X0003.tif
facet=8X0004.tif

Save that as xx.ini, then load it with pv's file select dialog or pass
it on the command line *as if it were an image*. Works like opening PTO
files. When loaded, you'll need to light-balance, because the file is
missing brightness values, so press 'Shift+L'. Then you can
exposure-fuse the images with 'U' or 'Shift+U'.
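Since the format is just one key=value pair per line, ini files like the one above are easy to generate for whole batches. A hypothetical Python helper - the keys and the hfov value are taken from the example above, the function name is made up; adjust everything to your images:

```python
def make_ini(images, hfov=50.0):
    """Build the text of a pv ini file for a synoptic view of
    pre-aligned images: one key=value pair per line."""
    lines = ["projection=facet_map",
             "blending=hdr",
             f"facet_hfov={hfov}"]
    lines += [f"facet={name}" for name in images]
    return "\n".join(lines) + "\n"

# generate an ini for the five bracket images from the example
text = make_ini([f"8X{i:04d}.tif" for i in range(5)])
print(text, end="")
```

Write the result to e.g. xx.ini and load it as described above.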

You may also want to look at the sections in the documentation titled
'Acceptable Input' and 'Displaying Several Images in one Session'.

Kay

Harry van der Wolf

unread,
Mar 9, 2021, 1:53:40 PM3/9/21
to hugi...@googlegroups.com
Hi,

You released version 1.0.3, but forgot to update the pv_version* files.

Harry

Kay F. Jahnke

unread,
Mar 9, 2021, 3:52:52 PM3/9/21
to hugi...@googlegroups.com
On 09.03.21 at 19:53, Harry van der Wolf wrote:
>
> You released version 1.0.3, but forgot to update the pv_version* files.

Thanks for telling me. I'll fix it tomorrow!

Kay

Kay F. Jahnke

unread,
Mar 10, 2021, 3:16:48 PM3/10/21
to hugi...@googlegroups.com
On 09.03.21 at 19:53, Harry van der Wolf wrote:
>
> You released version 1.0.3, but forgot to update the pv_version* files.

Okay, forget 1.0.3 - I just pushed 1.0.6, after some more refactoring
today: small modifications, one bug fix and updates to the README. I
hope things can slow down a bit now. I did a few tens of exposure fusions
and faux brackets plus half a dozen stitches, and all seems well, but
processing of images with an alpha channel is still missing from the
fusion and stitching code.

I had to do some wrangling with the versioning code, hence the version
jump... hope it does now what it's supposed to do.

I hope this version can now stand for some time, I think I'll do some
more tweaking for speed but the results should be good as it is. Sorry
for being a bit frantic over the last few days, thanks for your patience.

Note that stitches and fuses triggered by pressing keys (U, P, E) from
the viewer are artificially slowed down to leave performance for
continued viewing. If you don't like that, use --snapshot_threads=... to
increase the number of threads from the default of 1 (maybe a bit too
low...). Batch jobs use as many threads as the machine has cores.

Kay

Kay F. Jahnke

unread,
Mar 12, 2021, 5:50:09 AM3/12/21
to hugi...@googlegroups.com
Dear all!

I've just released pv 1.0.7, which now has code to stitch and
exposure-fuse images with alpha channel as well. This makes the feature
pretty complete. Stitching and fusing alpha channel images should do
'the obvious' and look similar to the 'live stitching' and 'live HDR
blending' which pv displays on-screen. The alpha channel handling is
quite elaborate: instead of simply discarding all pixels which don't
have full opacity, I make an attempt at mathematically correct alpha
blending. The code is fresh and not well-tested - check it out; comments,
as ever, are welcome.
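For anyone curious what 'mathematically correct alpha blending' means in general terms: the standard Porter-Duff 'over' operator on premultiplied-alpha pixels handles partial opacity cleanly and is associative, so layers can be composited in stages. A minimal Python sketch of that operator for a single pixel - this illustrates the general principle, not pv's actual code:

```python
def over(fg, bg):
    """Composite one premultiplied-alpha pixel over another.

    Each pixel is (r, g, b, a) with the colour channels already
    multiplied by alpha; the result is again premultiplied.
    """
    fr, fgr, fb, fa = fg
    br, bgr, bb, ba = bg
    w = 1.0 - fa  # how much of the background shows through
    return (fr + w * br, fgr + w * bgr, fb + w * bb, fa + w * ba)

# half-opaque red over opaque blue (premultiplied):
print(over((0.5, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0, 1.0)))
# -> (0.5, 0.0, 0.5, 1.0)
```

Premultiplication is what makes the formula a single multiply-add per channel, with no special cases for zero alpha.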

If your images don't have an alpha channel, but you want to use
alpha-only features like --facet_fade_out, just start pv with
--alpha=yes: it will add an alpha channel to the images and produce
output with alpha channel if your output format supports it (like TIFF
and EXR).

I've also made a new bundle for windows, which you can download here:

https://www.magentacloud.de/lnk/f5iohtoR

Password: hugin-ptx

Kay

Harry van der Wolf

unread,
Mar 12, 2021, 1:14:46 PM3/12/21
to hugi...@googlegroups.com
And a macos bundle with 1.0.7 version which you can download here:

Harry

On Fri, 12 Mar 2021 at 11:50, 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com> wrote:

Kay F. Jahnke

unread,
Mar 12, 2021, 3:28:16 PM3/12/21
to 'Kay F. Jahnke' via hugin and other free panoramic software
On 12.03.21 at 11:49, 'Kay F. Jahnke' via hugin and other free
panoramic software wrote:
> Dear all!
>
> I've just released pv 1.0.7, which now has code to stitch and
> exposure-fuse images with alpha channel as well
> ...

Sorry, I pushed a buggy version: the non-alpha stitching and exposure
fusion code went into an endless loop. I've fixed that and updated all
branches in the repo.

Kay F. Jahnke

unread,
Mar 13, 2021, 4:16:20 AM3/13/21
to hugi...@googlegroups.com
To whet your appetite: a little script to automatically exposure-fuse
exposure brackets with hugin tools and pv. Some of you have asked how pv
can be made to fit into the hugin toolset, and this is an example, where
it can already fit in quite well; in this example it will replace the
use of first nona, then enfuse, and produce an exposure fusion straight
from a freshly-generated PTO file. Save the script to 'pv_fuse.sh' and
make it executable.

#! /bin/bash

# pv_fuse.sh - exposure-fuse an exposure bracket

# Assuming you have a set of bracketed shots in your present
# working directory, this is a bash script to automatically create
# a PTO file 'bracket.pto', register the images, and then
# exposure-fuse the images in full size with pv to an openEXR file,
# creating output named bracket.fused.exr. Call this script with the
# set of images you want to fuse, like

# pv_fuse.sh IMG1.JPG IMG2.JPG IMG3.JPG

# first make a pto file 'bracket.pto' and register the images:

pto_gen -p0 -o bracket.pto "$@"
pano_modify -p0 --fov=AUTO --canvas=AUTO -obracket.pto bracket.pto
cpfind --sieve2size=5 -o bracket.pto bracket.pto
autooptimiser -p -o bracket.pto bracket.pto
pano_modify -p0 --fov=AUTO --canvas=AUTO -obracket.pto bracket.pto

# now process it with pv - note that you'll have to replace
# '/path/to/pv' to where the pv binary resides on your system.

/path/to/pv -W \
--blending=hdr \
--snapshot_like_source=yes \
--snapshot_facet=1 \
--snapshot_extension=exr \
--aeb_auto_brightness=yes \
--snapshot_prefix=bracket \
--next_after_fusion=yes \
bracket.pto

##### end of script file


Here's an explanation of the parameters passed to pv:

-W

show the image set which is processed in a Window, not full-screen

--blending=hdr

while the image shows during processing, display it with HDR blending.
This is only cosmetic.

--snapshot_like_source=yes

create output with the same size, orientation and projection as one of
the images in the PTO file

--snapshot_facet=1

namely image #1 - pv counts from zero, so this is the *second* one;
with my Canon camera, it's the darkest of the lot.

--snapshot_extension=exr

The default would be to create a JPG, but I like EXR. Take your pick!

--aeb_auto_brightness=yes

The scripts generating the PTO file do assign Ev values gleaned from the
images, but this option tells pv to look at them again and figure out
the relative brightness, just in case.

--snapshot_prefix=bracket

When saving the result to disk, use 'bracket' as its base name. In this
example, the output will come out as bracket.fused.exr.

--next_after_fusion=yes

After doing an exposure fusion from the current set of images, proceed
to the next image or PTO file in line. Here, there is no other input
lined up, so pv will terminate.

And that's it! Enjoy!

A hint: pv is good to *display* EXR files as well, in case you've
wondered how to look at the result ;)

Kay

Harry van der Wolf

unread,
Mar 13, 2021, 7:58:49 AM3/13/21
to hugi...@googlegroups.com
Thanks Kay,

I use a similar script, but using align_image_stack from the hugin tools (and I optimised the pv command with yours. Thanks).
I replace the lines
pto_gen -p0 -o bracket.pto "$@"
pano_modify -p0 --fov=AUTO --canvas=AUTO -obracket.pto bracket.pto
cpfind --sieve2size=5 -o bracket.pto bracket.pto
autooptimiser -p -o bracket.pto bracket.pto
pano_modify -p0 --fov=AUTO --canvas=AUTO -obracket.pto bracket.pto


(Note: I changed "pano_modify -p0 --fov=AUTO --canvas=AUTO -obracket.pto bracket.pto" to "pano_modify -p0 --fov=AUTO --canvas=AUTO -o bracket.pto bracket.pto" in the script.)

with one line:
align_image_stack --gpu -p bracket.pto -v "$@"

When timing with "time pv_fuse.sh" versus "time ais_pv_fuse.sh" (my align_image_stack version), the latter is twice as fast, and in my preliminary tests I do not find differences in the output.
Next to that: I use the same script with enfuse, meaning align_image_stack followed by enfuse. enfuse is currently faster than pv and does not lock up my laptop the way pv does. However, the "big wait" is in the panotools, so I don't care about the speed difference between pv and enfuse.
Note also the maximum amount of memory used by pv versus align_image_stack/enfuse, where ais takes the 0.98GB and enfuse itself approx. 500MB. All runs used the same five 9.1 megapixel bracketed images and all created a jpg instead of an (open)EXR image.
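For reference, the align_image_stack variant can be written out as a complete script along these lines (a sketch: the script name 'ais_pv_fuse.sh' is taken from this message, the pv options from Kay's script above, and '/path/to/pv' is a placeholder):

```shell
#! /bin/bash

# ais_pv_fuse.sh - exposure-fuse a bracket, registering with
# align_image_stack instead of the pto_gen/cpfind/autooptimiser chain.
# Usage: ais_pv_fuse.sh IMG1.JPG IMG2.JPG IMG3.JPG

# register the images; -p writes a PTO file, --gpu uses the GPU
align_image_stack --gpu -p bracket.pto -v "$@"

# exposure-fuse with pv - replace '/path/to/pv' with the actual location
/path/to/pv -W \
    --blending=hdr \
    --snapshot_like_source=yes \
    --snapshot_facet=1 \
    --snapshot_extension=exr \
    --aeb_auto_brightness=yes \
    --snapshot_prefix=bracket \
    --next_after_fusion=yes \
    bracket.pto
```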

Some figures:
pv_fuse (using the pano tools and pv)
real 2m16,020s
user 6m3,251s
sys 0m17,520s
3.7GB

ais_pv_fuse.sh (using align_image_stack and pv)
real 1m22,573s
user 4m2,008s
sys 0m16,343s
3.7GB

enfuse.sh (using align_image_stack and enfuse)
real 0m53,653s
user 1m20,007s
sys 0m4,532s
0.98GB

Both pv and enfuse use all 4 cores of my laptop, but pv, for whatever reason, almost completely blocks it.
I will do further tests.

best,
Harry

Kornel Benko

unread,
Mar 13, 2021, 9:30:57 AM3/13/21
to hugi...@googlegroups.com
On Sat, 13 Mar 2021 15:11:23 +0100
Harry van der Wolf <hvd...@gmail.com> wrote:

> Sorry,
>
> I made the patch on my Mac and forgot about the changes for Linux 18.04
>
> See attached patch 2 including the changes of the previous patch as well.
> Do not apply the previous "mac_exiv2_bundle.patch", but only this
> "mac_exiv2_bundle_2.patch"
>

We have to wait for Kay anyway. He is ATM the only one I know of who can commit.

Applying the patch locally I got errors.

patching file CMakeLists.txt
patching file CMakeModules/FindEXIV2.cmake
can't find file to patch at input line 163
Perhaps you used the wrong -p or --strip option?
The text leading up to this was:
--------------------------
|diff --git a/scripts/make_bundle.sh b/scripts/make_bundle.sh
|index 16f73d8..40cd80c 100755
|--- a/scripts/make_bundle.sh
|+++ b/scripts/make_bundle.sh
--------------------------
File to patch:

.................................
Here, the scripts directory does (of course) not exist.

Apart from that, pv is still compilable. Thanks for the patch.
(Ask Kay to include your scripts dir)

Kornel

Harry van der Wolf

unread,
Mar 13, 2021, 10:07:14 AM3/13/21
to hugi...@googlegroups.com
I'm sorry.
The linux version is the master branch.
Mac works on the origin/mac branch.

So I should actually create 2 patches.
I simply acted too quickly and combined the two.

So actually the first patch should go on the mac branch, and I should make a second patch for the master branch.
Or better: Change the makefile in the master branch so that it checks whether it runs on Apple, gnu linux or win32/win64, and then remove the mac branch.

And in the CMakeLists.txt file we should have at line 38 something like
# prevent in-tree building
if("${CMAKE_SOURCE_DIR}" STREQUAL "${CMAKE_BINARY_DIR}")
    message(FATAL_ERROR "In-source builds are not allowed.")
endif()



Best,
Harry



Harry


On Sat 13 Mar 2021 at 15:30, Kornel Benko <Kornel...@berlin.de> wrote:

Kornel Benko

unread,
Mar 13, 2021, 10:15:25 AM3/13/21
to hugi...@googlegroups.com
On Sat, 13 Mar 2021 16:06:58 +0100
Harry van der Wolf <hvd...@gmail.com> wrote:

> I'm sorry.
> The linux version is the master branch.
> Mac works on the origin/mac branch.
>
> So I should actually create 2 patches.
> I simply acted too quickly and combined the two.
>
> So actually the first patch should go on the mac branch, and I should make
> a second patch for the master branch.
> Or better: Change the makefile in the master branch so that it checks
> whether it runs on Apple, gnu linux or win32/win64, and then remove the mac
> branch.
>
> And in the CMakeLists.txt file we should have at line 38 something like
> # prevent in-tree building
> if("${CMAKE_SOURCE_DIR}" STREQUAL "${CMAKE_BINARY_DIR}")
> message(FATAL_ERROR "In-source builds are not allowed.")
> endif()
>

Good hint. I never liked the in-source builds anyway.

> Best,
> Harry
>
>
>
> Harry
>
Kornel

Harry van der Wolf

unread,
Mar 13, 2021, 10:29:26 AM3/13/21
to hugi...@googlegroups.com


On Fri 12 Mar 2021 at 21:28, 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com> wrote:

Sorry, I pushed a buggy version: the non-alpha stitching and exposure
fusion code went into an endless loop. I've fixed that and updated all
branches in the repo.

I've also made a new bundle for windows, which you can download here:

https://www.magentacloud.de/lnk/f5iohtoR

Password: hugin-ptx

Kay


And also a new mac bundle, which you can download from here:

Harry
 

Kay F. Jahnke

unread,
Mar 13, 2021, 11:20:27 AM3/13/21
to hugi...@googlegroups.com
On 13.03.21 at 13:58 Harry van der Wolf wrote:
>
> I use a likewise script but then using align_image_stack from the hugin
> tools (and optimised the pv command with your command. Thanks).
> I replace the lines
> pto_gen -p0 -o bracket.pto "$@"
> pano_modify -p0 --fov=AUTO --canvas=AUTO -obracket.pto bracket.pto
> cpfind --sieve2size=5 -o bracket.pto bracket.pto
> autooptimiser -p -o bracket.pto bracket.pto
> pano_modify -p0 --fov=AUTO --canvas=AUTO -obracket.pto bracket.pto
>
> (Note: I changed "pano_modify -p0 --fov=AUTO --canvas=AUTO -obracket.pto
> bracket.pto" to "pano_modify -p0 --fov=AUTO --canvas=AUTO -o bracket.pto
> bracket.pto" in the script.)
>
> with one line:
> align_image_stack --gpu -p bracket.pto -v "$@"

This should come out just the same. Using your way would have been more
instructive in my example - I should have thought of that. I just copied
the script I had used to batch-process my brackets from last summer.

> when using "time pv_fuse.sh" or "time ais_pv_fuse.sh" (my
> align_image_stack version), it is twice as fast and with my
> preliminary tests I do not find differences.
> Next to that: I use the same script with enfuse, meaning
> align_image_stack followed by enfuse. Enfuse is currently faster than pv
> and does not entirely lock my laptop like pv does. However, the "big
> wait" is in the panotools. I don't care about the speed difference
> between pv and enfuse.

You can throttle pv by passing --snapshot_threads=...
The default is to use as many threads as there are cores on the machine.
If you want a well-behaved background job and the time it takes is not
an issue, pass 1.

> Note also the max amount of memory used by pv versus
> align_image_stack/enfuse where ais takes the 0.98GB and enfuse approx.
> 500MB. All runs using the same five 9.1 megapixel bracketed images and
> all creating a jpg instead of an (open)exr image.

One reason why pv uses a lot of memory is because it builds a b-spline
interpolator for *magnifying* views. If your job requires only 1:1 or
less, try passing --build_pyramids=no, which should reduce the memory
footprint and be faster as well. If you want to reduce memory even
further and your job is 1:1 (like here), you can even add
--build_raw_pyramids=no.

And you can also try and omit the separate 'quality' interpolator by
passing --quality_interpolator_degree=1 - the default of three is to
give you nice magnifications for still images up to single pixels blown
up to screen size :D
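Combining the throttling and memory-saving options mentioned above, a well-behaved 1:1 batch run might look like this (a sketch; the option names come from the preceding paragraphs, the rest of the command mirrors the fusion script from earlier in the thread):

```shell
# 1:1 batch fusion with a reduced memory footprint and one worker thread:
# no b-spline pyramids for magnified views, no raw pyramids, and a
# degree-1 (bilinear) 'quality' interpolator instead of the cubic default
/path/to/pv -W \
    --snapshot_threads=1 \
    --build_pyramids=no \
    --build_raw_pyramids=no \
    --quality_interpolator_degree=1 \
    --next_after_fusion=yes \
    bracket.pto
```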

But I do admit pv is memory-hungry. There has to be a lot of stuff ready
to use for smooth animations - pan and zoom without stutter and the
likes. pv is - at heart - a viewer, and the stitching and fusing
capabilities are for now just nice-to-have additions.

The next point to consider is mathematics: if I remember correctly,
enfuse uses a highly optimized integer-math implementation of the B&A
image splining algorithm - and it may even offload work to the GPU,
which would explain why it blocks up your system less - pv is CPU only,
and does all calculations in single-precision float, and with pyramid
levels based on cubic b-splines. That's a lot of CPU load, with the SIMD
units running on all threads you put to use.

> Some figures:
> pv_fuse (using the pano tools and pv)
> real 2m16,020s
> user 6m3,251s
> sys 0m17,520s
> 3.7GB
>
> ais_pv_fuse.sh (using align_image_stack and pv)
> real 1m22,573s
> user 4m2,008s
> sys 0m16,343s
> 3.7GB
>
> enfuse.sh (using align_image_stack and enfuse)
> real 0m53,653s
> user 1m20,007s
> sys 0m4,532s
> 0.98GB

So, for now, pv won't get much faster, but try again with the
memory-savers I proposed above to see if you can squash memory use
significantly. And also keep in mind that I haven't spent a lot of time
tweaking the fusion code - I have it working all right, but especially
the version processing images with alpha channel still has room for
improvement, and I should be able to squash memory use further.

> Both pv and enfuse use all 4 cores in my laptop where pv, for whatever
> reason, almost completely blocks my laptop.

Try throttling, as explained above.

> I will do further tests.

Thanks for your continued interest!

What I'd also like to see is a close look at image quality. I put all
efficiency considerations aside and went for the best image quality I
could produce. And my new alpha-blending code should also produce very
nice results when it comes to manual deghosting - I don't look at the
masks in the PTO yet, but you should now be able to simply feed images
with unwanted content 'erased' to transparency (should even be best with
a feathered brush on the erase tool), and as long as the other images
provide content, the 'ghosts' should become invisible, thanks to the B&A
magic.

Kay

Kay F. Jahnke

unread,
Mar 13, 2021, 11:51:38 AM3/13/21
to hugi...@googlegroups.com
Sorry, @Harry, I accidentally answered to you only, here's to the list:

On 13.03.21 at 16:06 Harry van der Wolf wrote:

> The linux version is the master branch.
> Mac works on the origin/mac branch.

@ Harry, Kornel

Yeah, I figured the mac-only stuff would best be placed in the mac
branch, because it means nothing to the other branches, whereas the
CMakeLists.txt is multi-platform, so I put it in master.

Kornel proposed to keep the build files updated by doing ordinary
diff -u diffs against single files, which I would find easier than
patches against the whole repo or branches of it - so far the number of
files affected is quite low. I only touched the CMakeLists.txt yesterday
to put in the copyright header and some initial comments and I don't
expect to have too much to contribute to it, so we shouldn't run into
messy conflicts.

Is that okay with you guys?

> So I should actually create 2 patches.
> I simply acted too quickly and combined the two.
>
> So actually the first patch should go on the mac branch, and I should
> make a second patch for the master branch.
> Or better: Change the makefile in the master branch so that it checks
> whether it runs on Apple, gnu linux or win32/win64, and then remove the
> mac branch.

So... which is what and what should I put in the repo now??

> And in the CMakeLists.txt file we should have at line 38 something like
> # prevent in-tree building
> if("${CMAKE_SOURCE_DIR}" STREQUAL "${CMAKE_BINARY_DIR}")
>     message(FATAL_ERROR "In-source builds are not allowed.")
> endif()

You might as well simply trust the user. Most people who've come so far
as doing their own builds should know that in-tree builds aren't cool. I
think it's best to keep it simple, no need to regulate everything. Just
my personal opinion...

Kay

Kornel Benko

unread,
Mar 13, 2021, 12:18:42 PM3/13/21
to hugi...@googlegroups.com
On Sat, 13 Mar 2021 17:51:26 +0100
"'Kay F. Jahnke' via hugin and other free panoramic software"
<hugi...@googlegroups.com> wrote:

> Sorry, @Harry, I accidentally answered to you only, here's to the list:
>
> Am 13.03.21 um 16:06 schrieb Harry van der Wolf:
>
> > The linux version is the master branch.
> > Mac works on the origin/mac branch.
>
> @ Harry, Kornel
>
> Yeah, I figured the mac-only stuff would best be placed in the mac
> branch, because it means nothing to the other branches, whereas the
> CMakeLists.txt is multi-platform, so I put it in master.
>
> Kornel proposed to keep the build files updated by doing ordinary
> diff -u diffs against single files, which I would find easier than
> patches against the whole repo or branches of it - so far the number of
> files affected is quite low. I only touched the CMakeLists.txt yesterday
> to put in the copyright header and some initial comments and I don't
> expect to have too much to contribute to it, so we shouldn't run into
> messy conflicts.
>
> Is that okay with you guys?

Yes from my side.

> > So I should actually create 2 patches.
> > I simply acted too quickly and combined the two.
> >
> > So actually the first patch should go on the mac branch, and I should
> > make a second patch for the master branch.
> > Or better: Change the makefile in the master branch so that it checks
> > whether it runs on Apple, gnu linux or win32/win64, and then remove the
> > mac branch.
>
> So... which is what and what should I put in the repo now??
>
> > And in the CMakeLists.txt file we should have at line 38 something like
> > # prevent in-tree building
> > if("${CMAKE_SOURCE_DIR}" STREQUAL "${CMAKE_BINARY_DIR}")
> >     message(FATAL_ERROR "In-source builds are not allowed.")
> > endif()
>
> You might as well simply trust the user. Most people who've come so far
> as doing their own builds should know that in-tree builds aren't cool. I
> think it's best to keep it simple, no need to regulate everything. Just
> my personal opinion...

I don't trust them. The messy things start if one tries to run a cmake build _and_ an
automake build in-source. Better to avoid the mess - fewer things that can go wrong.

> Kay

Kornel

Kay F. Jahnke

unread,
Mar 13, 2021, 1:12:43 PM3/13/21
to hugi...@googlegroups.com
> ...

>>> So I should actually create 2 patches.
>>> I simply acted too quickly and combined the two.

I sucked in the patch as best I could and pushed --all; have a look whether
you're both happy with the state of the master and mac branches in the repo
now. And let's do single-file patches from now on - that saves me headaches...

Kay

Harry van der Wolf

unread,
Mar 13, 2021, 1:13:14 PM3/13/21
to hugi...@googlegroups.com


On Sat 13 Mar 2021 at 18:18, Kornel Benko <Kornel...@berlin.de> wrote:
On Sat, 13 Mar 2021 17:51:26 +0100
"'Kay F. Jahnke' via hugin and other free panoramic software" wrote:

> Is that okay with you guys?

Yes from my side.


OK from my side as well.

 

> You might as well simply trust the user. Most people who've come so far
> as doing their own builds should know that in-tree builds aren't cool. I
> think it's best to keep it simple, no need to regulate everything. Just
> my personal opinion...

I don't trust. The messy things start if one tries to use cmake _and_ automake-build in
source. Better to omit the mess, less things that may go wrong.


I don't even trust myself for that matter. When I was working on the Mac I worked in the main folder, and when I was done modifying the CMakeLists.txt and adding the CMakeModules folder and content, I tried a "cmake .." which of course didn't work - an error is easily made.
And having the check in the script doesn't bother the user at all. And if the user does it wrong, he/she is immediately told why he/she should not do that. Keep things neat and tidy, I would say.

Please find attached 2 patches.
The cmake.patch is universal and can be applied in all branches. It contains fixes to the CMakeLists.txt and the CMakeModules (and the out-of-tree build)
The mac_bundle.patch is only for the mac branch.
Both patches were made with "diff -u" and created from the root source directory.

make_bundle.sh.patch
cmake.patch

Harry van der Wolf

unread,
Mar 13, 2021, 1:14:25 PM3/13/21
to hugi...@googlegroups.com


On Sat 13 Mar 2021 at 19:12, 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com> wrote:

I sucked in the patch best I could and pushed --all, have a look if
you're both happy with the state of master and mac branches in the repo
now. and let's do single-file patches from now on, save me headaches...

Kay


ohoh :(
Crossing mails!

Harry 

Kornel Benko

unread,
Mar 13, 2021, 3:33:24 PM3/13/21
to hugi...@googlegroups.com
On Sat, 13 Mar 2021 19:12:32 +0100
"'Kay F. Jahnke' via hugin and other free panoramic software"
<hugi...@googlegroups.com> wrote:

Missing changes to LICENSE file.

Substitute all double-quotes with single-quotes to fit in the cpack-variable
CPACK_RPM_PACKAGE_LICENSE.

Kornel
LICENSE.patch

Kornel Benko

unread,
Mar 13, 2021, 3:44:27 PM3/13/21
to hugi...@googlegroups.com
On Sat, 13 Mar 2021 21:33:18 +0100
Kornel Benko <Kornel...@berlin.de> wrote:
I changed the single-quote in brief_description.txt to a dash.
The output of 'dpkg -s pv' is more readable.
Compare
pv can show -normal- images, panoramas in various formats
with
pv can show &apos;normal&apos; images, panoramas in various formats

Kornel
brief_description.txt.patch

Kornel Benko

unread,
Mar 13, 2021, 3:47:31 PM3/13/21
to hugi...@googlegroups.com
Am Sat, 13 Mar 2021 21:44:23 +0100
schrieb Kornel Benko <Kornel...@berlin.de>:

> Am Sat, 13 Mar 2021 21:33:18 +0100
> schrieb Kornel Benko <Kornel...@berlin.de>:
>
> > Am Sat, 13 Mar 2021 19:12:32 +0100
> > schrieb "'Kay F. Jahnke' via hugin and other free panoramic software"
> > <hugi...@googlegroups.com>:
> >
> > > > ...
> > >
> > > >>> So I should actually create 2 patches.
> > > >>> I simply acted too quickly and combined the two.
> > >
> > > I sucked in the patch best I could and pushed --all, have a look if
> > > you're both happy with the state of master and mac branches in the repo
> > > now. and let's do single-file patches from now on, save me headaches...
> > >
> > > Kay
> > >
> >

Change variable name CPACK_RPM_PACKAGE_LICENSE to more general CPACK_PACKAGE_LICENSE.

Kornel

CMakeLists.txt.patch

Kay F. Jahnke

unread,
Mar 13, 2021, 4:17:11 PM3/13/21
to hugi...@googlegroups.com
Kornel, I applied both your patches to master.

Kay

Kornel Benko

unread,
Mar 13, 2021, 4:22:20 PM3/13/21
to hugi...@googlegroups.com
Am Sat, 13 Mar 2021 21:47:26 +0100
schrieb Kornel Benko <Kornel...@berlin.de>:

> Am Sat, 13 Mar 2021 21:44:23 +0100
> schrieb Kornel Benko <Kornel...@berlin.de>:
>
> > Am Sat, 13 Mar 2021 21:33:18 +0100
> > schrieb Kornel Benko <Kornel...@berlin.de>:
> >
> > > Am Sat, 13 Mar 2021 19:12:32 +0100
> > > schrieb "'Kay F. Jahnke' via hugin and other free panoramic software"
> > > <hugi...@googlegroups.com>:
> > >
> > > > > ...
> > > >
> > > > >>> So I should actually create 2 patches.
> > > > >>> I simply acted too quickly and combined the two.
> > > >
> > > > I sucked in the patch best I could and pushed --all, have a look if
> > > > you're both happy with the state of master and mac branches in the repo
> > > > now. and let's do single-file patches from now on, save me headaches...
> > > >
> > > > Kay
> > > >
> > >
>

Try to omit in-source build.

Kornel
CMakeLists.txt.2.patch

Kornel Benko

unread,
Mar 13, 2021, 4:25:00 PM3/13/21
to hugi...@googlegroups.com
On Sat, 13 Mar 2021 22:16:59 +0100
"'Kay F. Jahnke' via hugin and other free panoramic software"
<hugi...@googlegroups.com> wrote:

> Kornel, I applied both your patches to master.
>
> Kay
>

Thanks Kay.

Kornel

Kay F. Jahnke

unread,
Mar 14, 2021, 3:07:14 AM3/14/21
to hugi...@googlegroups.com
commit 9797ba3a5ed67ba8a7edb8bbdd54ee07caac5de4 (HEAD -> master,
origin/master, origin/HEAD)
Date: Sun Mar 14 08:00:47 2021 +0100

two more patches from Kornel modifying CMakeLists.txt:

Change variable name CPACK_RPM_PACKAGE_LICENSE to more general
CPACK_PACKAGE_LICENSE.

Try to omit in-source build.

Kay

Harry van der Wolf

unread,
Mar 14, 2021, 7:04:39 AM3/14/21
to hugi...@googlegroups.com


On Sat 13 Mar 2021 at 17:20, 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com> wrote:

 If your job requires only 1:1 or
less, try passing --build_pyramids=no, which should reduce the memory
footprint and be faster as well. 
 
And you can also try and omit the separate 'quality' interpolator by
passing ---quality_interpolator_degree=1 - the default of three is to
give you nice magnifications for still images up to single pixels blown
up to screen size :D

Using 
             --build_pyramids=no \
             --quality_interpolator_degree=1 \

makes it faster and decreases memory use from 3.7GB to 3.2GB. It also helps pv not to block my laptop completely.

I also used the new --snapshot_threads and set it to 3.
However, pv still uses all 4 cores/threads and when I "time" the command it is exactly the same as without this parameter. 
 
The next point to consider is mathematics: if I remember correctly,
enfuse uses a highly optimized integer-math implementation of the B&A
image splining algorithm - and it may even offload work to the GPU,
which would explain why it blocks up your system less - pv is CPU only,
and does all calculations in single-precision float, and with pyramid
levels based on cubic b-splines. That's a lot of CPU load, with the SIMD
units running on all threads you put to use.


enfuse/enblend uses OpenMP.
 
 I don't look at the
masks in the PTO yet, but you should now be able to simply feed images
with unwanted content 'erased' to transparency (should even be best with
a feathered brush on the erase tool), and as long as the other images
provide content, the 'ghosts' should become invisible, thanks to the B&A
magic.

I do not entirely understand what you mean. Do you mean I should erase some unwanted part in one of my bracketed images? Or does pv do that? Or do you have to do it in the live view?
This is really interesting. I personally like sunrises and sunsets, and you mostly need bracketed sets to get a good dynamic range.
They sometimes contain unwanted parts, or enfuse (the program I have used up to now) does not pick the part I want. I sometimes fall back on hugin for its masks. (enfuse does understand masks, but I have to create them in hugin, so I might as well finish in hugin.)

So: do you have such images (with a PTO) as an example that shows this functionality?

Harry


Harry van der Wolf

unread,
Mar 14, 2021, 9:20:41 AM3/14/21
to hugi...@googlegroups.com
Another script for pv for Linux/BSD/MacOS. 

It uses align_image_stack from the hugin tools, and it also uses exiftool (if you have hugin, you almost certainly also have exiftool).
It works for jpg, tif and exr.
It uses "--build_pyramids=no --quality_interpolator_degree=1" as it is meant for batch use and to decrease memory usage.

if the script is started without parameters, it mentions:
* First give the target file name like "<path>/mypvfused.jpg" or "<path>/mypvfused.tif" or "<path>/mypvfused.exr"
* Then give the input files. That can be as "*.jpg", or as "one.jpg 2.jpg 3.jpg"
* Examples:
* /path/to/pvimagefuser.sh mypvfused.jpg br1.jpg br2.jpg br3.jpg
* Or:
* /path/to/pvimagefuser.sh mypvfused.jpg *.jpg


It uses exiftool to copy all metadata from the first image into the created image. This doesn't work for exr images as exiftool does not recognise these.
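The metadata copy can presumably be done with a single exiftool call like this (a sketch; "$1" standing for the first input image and "$target" for the fused output are placeholder variable names, since the script itself is only attached, not shown here):

```shell
# copy all metadata from the first source image into the fused result;
# -overwrite_original avoids leaving a *_original backup file behind
exiftool -TagsFromFile "$1" -all:all -overwrite_original "$target"
```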

See attached. Save it to some location of your liking and do a "(sudo) chmod +x pvimagefuser" (or whatever you want to name it)
It is a rewrite of my enfuse fusing script.

Harry
pvimagefuser

yuv

unread,
Mar 14, 2021, 10:16:01 AM3/14/21
to hugi...@googlegroups.com
On Sun, 2021-03-07 at 18:30 +0100, 'Kay F. Jahnke' via hugin and other
free panoramic software wrote:
> On 07.03.21 at 14:16 yuv wrote:
> > On Sun, 2021-03-07 at 09:22 +0000, Bruno Postle wrote:
> > > currently there is a single person doing code maintenance,
> > > collecting translations, _and_ doing the releases - these could
> > > easily be different roles.
> >
> > This! What made Hugin great and successful in years past was
> > [SNIPPED MY OWN DESCRIPTION OF A TEAM]
>
> I felt it was difficult to keep my foot in the door. Sort of
> disheartened. Like, I got stern admonishments rather than friendly
> encouragement.

Sorry for the bad experience. My memory is fading, but I remember
trying to welcome you best as I could. If I did not, please accept my
belated apology.

> I prefer to run 'my own show' now, where I don't step on
> anyone's toes.

Could it be that the current maintainer also prefers to run 'his own
show?'


> I'm not sure if there is the will to rebuild stuff, much less
> sacrifice anything.

Then there is nothing I can help with. With all due respect, I draw
your attention to the similarities between pv and Hugin. Replace
'rebuild stuff' with 'build team.' pv and Hugin are at two different
life-stages, but prevented from reaching the next level by two dominant
personalities who, for different reasons, are not willing to make
compromises and prefer running 'their own shows.' Just acknowledging
this is a major step forward, but it is not enough. The personality
has to be willing to sacrifice something. Then we can speak about
wasteful vs purposeful sacrifice.


> I have tried to help keep my python interface afloat [...]

Comments are at a technical level. Need to fix the team level first.
Going technical here is diverting from the problem that asks for a fix.


> Look at the hugin website: 'hugin is now stable'.

anyone believing any software stable or mature is not looking enough
into the future, or they are just making the statement for marketing
purposes. Drinking that statement's Koolaid is fatal. 'stability' is
an illusion. particularly in tech.

> there have been great innovations in the past few years

indeed. the question is whether the community is nurturing these
innovations. the alternative is to let the proprietary model innovate
faster and better. the end-user does not care. the end-user will get
the most affordable functionality to them at the lowest available
price.


> > Who wants to join a dwindling team who does not welcome change?
>
> Good question.


Declaring a project stable is another way to say it does not welcome
change.


> I'd still like to see hugin move [from Sourceforge].

You'll have to discuss that with the current maintainer. Same for the
adoption of git instead of Mercurial. These are operational level
comments, and like your technical level comments I snipped earlier, are
a distraction from the real issue here: governance. 'run my own
show.' The current maintainer got permission to run his own show
because the rest of the team, including me, bailed out and implicitly
supported the show as it was. I cannot speak for the former team-
mates, but I have no wish to 'come back' and confront a 'run my own
show' maintainer only to add a different 'run my own show' maintainer.
I actually have no time to 'come back' at all. I am just saddened by
the fact that what I left behind has rotten. If I was still active in
the project, I would extend you admin access. Oh, but wait, you are
the one who does not want a SourceForge account? I did not like
SourceForge either. But I did get the account, and I did work within
the team to move the bug tracker to Launchpad and to achieve consensus
on replacing Subversion. I did not get to 'run my show' on Subversion,
else we'd be in git. I accepted that back then git was an inferior
experience for those on the team that were on Windows, and compromised.


> Gosh, I did not even see that [F*Book]. Yeah, posting to hugin-ptx
> has not been rewarding for me recently.

The world is changing under the corrupting influence of the fake news
business. However, if posting to a mailing list or publishing software
under a license that encourages distribution and multiplication is done
for the purpose of reward, I strongly encourage self-reflection if not
therapy. Give without expectations, or don't give, for you will be
eaten alive.


> Since you're playing devil's advocate (hah) - I can wrap all
> panotools transformations in vspline

please, do! be an inspiration. this is still technical, and does not
address the governance issues that ultimately lead to the demise of the
community. you may try to create a new community around pv (look, you
got already some traction!) but the limit will be the 'run my own show'
and not operational or technical.


> with pv, I like to 'move fast and break things'.
> Come for the ride!

just reading the past few exchanges, if you had granted Kornel and
Harry write privilege to pv's repository, it would have moved faster.
one of them could have even released, saving your brain cycles for the
inspirational level tech. not sure about the breaking, but that is
what makes the ride interesting. The best is when you grant privilege
and promotes those that are faster and better than you. Which is what
happened with Hugin. The worst is when it turns out that those who are
faster and better than you neglect the other passengers in the train.
All of a sudden, they find themselves alone in the locomotive 'running
their own show' while the wagon behind have disconnected, one after
another, and gone to other tracks.

I like what I see happening with pv. Up to you to decide if you are
strong enough to water down some of the 'run my own show' in exchange
for 'running together.' A similar reflection will come sooner or later
on the Hugin side. All I can hope is that both will reach that
reflection point in useful time, and a community can be once again
energized to team up. I am happy to facilitate team building, but I
will not be prodding reluctant non-participants that 'run their own
show.'

--
Yuval Levy, JD, MBA, CFA
Ontario-licensed lawyer


Kay F. Jahnke

unread,
Mar 14, 2021, 12:12:16 PM3/14/21
to hugi...@googlegroups.com
On 14.03.21 at 12:04 Harry van der Wolf wrote:
>
>
> Op za 13 mrt. 2021 om 17:20 schreef 'Kay F. Jahnke' via hugin and other
>
> I also used the new --snapshot_threads and set it to 3.
> However, pv still uses all 4 cores/threads and when I "time" the command
> it is exactly the same as without this parameter.

Sorry, that was a quick shot: batch jobs always set the number of
threads to the number of cores; the option --snapshot_threads only
affects the number of threads used when the job is launched via the UI
with 'U' or 'Shift+U'. I'll change it so that the option is always honoured.

> I don't look at the
> masks in the PTO yet, but you should now be able to simply feed images
> with unwanted content 'erased' to transparency (should even be best
> with
> a feathered brush on the erase tool), and as long as the other images
> provide content, the 'ghosts' should become invisible, thanks to the
> B&A
> magic.
>
>
> I do not entirely understand what you mean. Do you mean I should erase
> some unwanted part in one of my bracketed images? Or does pv do that? Or do
> you have to do that in the live view?

Let me explain in detail.

Sometimes you have things in your images you don't want to see in the
final output. In a stitching/image fusion context these unwanted bits
are often called 'ghosts', and the process of suppressing them so that
they aren't visible is called 'deghosting'. For stitching, this is often
done automatically, and a commonly used method is 'Khan deghosting',
used e.g. by hugin, see
http://hugin.sourceforge.net/docs/html/namespacedeghosting.html

pv does not have automatic deghosting (yet), so if an image of the set
you are fusing/stitching has unwanted bits, you must mask it out 'manually'.

In hugin, you can assign an 'exclusion mask', which results in the
partial image, rendered by nona, coming out transparent where the mask
is set. But pv does not process hugin's mask information (yet), so if
you want parts masked out for processing with pv, you have to 'manually'
make the unwanted bits transparent. You can do that e.g. with gimp: add
an alpha channel to your image if it doesn't have one yet, then pick the
'eraser' tool and select a *soft* brush to 'erase' the unwanted bits
with a bit of 'feathering'. Store the image. If pv reads such an image
(it must be run in alpha mode, so use pv --alpha=yes) it will exclude
the transparent parts - but not totally; instead, it will honour the
feathering.

> This is really interesting. I personally like sunrises and sunsets. You
> mostly need bracketed sets to get a good dynamic range.
> They sometimes contain unwanted parts, or enfuse (up till now my used
> program) does not pick the part I want. I sometimes fall back on hugin
> for its use of masks. (enfuse does understand masks, but I do have to
> create them in hugin so I can just as well finish in hugin)

You're right, enfuse does understand masks; in fact I think enfuse
expects all its input to come with an alpha channel. How you create the
transparency - hugin mask, then nona - or e.g. the gimp using an eraser
- is irrelevant. But enfuse has a limitation (to my knowledge, please
double-check): it's either-or: if there is the slightest bit of
transparency in a pixel in an incoming image, it will completely
exclude the pixel. And hugin also has a limitation: the masks are
either-or as well: inside the mask is totally transparent, outside
totally opaque.

pv, on the other hand, will accept both images with alpha channel and
images without alpha channel. And if there is an alpha channel, it will
make an attempt at correct alpha blending, so you can fade stuff out (a
process called feathering or mask feathering), rather than having a
sharp discontinuity from totally opaque to totally ignored. This should
result in better output, because discontinuities are usually bad,
especially in large-scale gradients like the blue sky, where
discontinuities are very noticeable.

Because pv uses (hopefully) correct alpha blending, the results should
be very good, and masked-out bits should be even less apparent than with
other techniques. This is where I'd like feedback, just to see if my
hopes to have implemented a higher-quality image blending method are
justified. It should even be quite possible to do more interesting
things with the alpha channel than just erasing unwanted bits - in pv,
the quality masks for the B&A image splining algorithm are *weighted*
with the alpha channel, so you might put whatever content you want into
the alpha channel to make things appear 'less strongly' where the alpha
channel is 'less opaque'. This should open up a whole new set of
possibilities, but we can discuss this some other time.
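
To make the weighting idea concrete, here is a toy per-pixel sketch of
alpha-weighted blending (my illustration of the principle, not pv's
actual code; pv does this on image pyramids, and the function and
values below are purely hypothetical):

```python
def blend_weighted(pixels, qualities, alphas):
    """Blend per-image pixel values, weighting each image's quality
    mask by its alpha channel, as opposed to an either-or mask.
    pixels, qualities, alphas: one value per contributing image."""
    # weight = quality * alpha: a feathered (partially transparent)
    # pixel still contributes, just less strongly
    weights = [q * a for q, a in zip(qualities, alphas)]
    total = sum(weights)
    if total == 0.0:
        return 0.0  # no image contributes here
    return sum(p * w for p, w in zip(pixels, weights)) / total

# two images overlap; the second is half faded out by a feathered eraser
print(blend_weighted([0.2, 0.8], [1.0, 1.0], [1.0, 0.5]))  # roughly 0.4
```

With a hard either-or mask the second image would contribute fully or
not at all; with the feathered alpha it fades smoothly, which is what
avoids the visible discontinuity.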

> So: Do you have such images (with pto) as example that shows this
> functionality?

Sorry, not just right at hand, but I think you get my drift. If I find
the time, maybe I can come up with an example, please bear with me.

Kay

Kay F. Jahnke

unread,
Mar 14, 2021, 12:19:52 PM3/14/21
to hugi...@googlegroups.com
On 14.03.21 at 14:20, Harry van der Wolf wrote:
> Another script for pv for Linux/BSD/MacOS.
>
> It uses align_image_stack from hugin tools and it also uses exiftool (if
> you have hugin, you almost certainly also have exiftool)

sure do ;)

> ...

> It uses exiftool to copy all metadata from the first image into the
> created image. This doesn't work for exr images as exiftool does not
> recognise these.

openEXR does not support EXIF metadata. That's indeed an annoying
shortcoming of the openEXR format. I intend to implement a work-around for
it. You may have noticed that when storing to other formats (like TIFF,
JPG, PNG) pv assigns projection and FOV metadata, so if you read pv's
output into pv again, you get the correct display automatically. So what
I'll do is to store a pv 'ini file' with the relevant information and a
reference to the EXR file. Then you can open the ini file and you'll get
the correct display. Like an XMP sidecar for raw processing which has
the 'development' info.

> See attached. Save it to some location of your liking and do a "(sudo)
> chmod +x pvimagefuser" (or whatever you want to name it)
> It is a rewrite of my enfuse fusing script.

I'll check it out soon, thanks for sharing!

Kay

Kay F. Jahnke

unread,
Mar 14, 2021, 2:23:15 PM3/14/21
to hugi...@googlegroups.com
On 14.03.21 at 15:15, yuv wrote:
> On Sun, 2021-03-07 at 18:30 +0100, 'Kay F. Jahnke' via hugin and other
>>
>> I felt it was difficult to keep my foot in the door. Sort of
>> disheartened. Like, I got stern admonishments rather than friendly
>> encouragement.
>
> Sorry for the bad experience. My memory is fading, but I remember
> trying to welcome you best as I could. If I did not, please accept my
> belated apology.

I sure don't mean *you*. In fact I'm grateful for the pointers you've
given me and enjoyed our exchange (there was a lot of that, wasn't
there?). I think I wouldn't be where I am now as a software developer if
it weren't for the odd nudge you gave me.

>> I prefer to run 'my own show' now, where I don't step on
>> anyone's toes.
>
> Could it be that the current maintainer also prefers to run 'his own
> show?'

I won't do any guesswork. I'll just go ahead with my thing for now, and
if it turns out people like it, we'll see what becomes of it. I'm happy
to share and collaborate; with pv it's just that I am the driving force,
and I accept that role for the time being. And if other projects are
interested enough in my techniques to consider adopting something, I'm
the last to turn them away.

>> I'm not sure if there is the will to rebuild stuff, much less
>> sacrifice anything.
>
> Then there is nothing I can help with. With all due respect, I draw
> your attention to the similarities between pv and Hugin. Replace
> 'rebuild stuff' with 'build team.' pv and Hugin are at two different
> life-stages, but prevented from reaching the next level by two dominant
> personalities who, for different reasons, are not willing to make
> compromises and prefer running 'their own shows.' Just acknowledging
> this is a major step forward, but it is not enough. The personality
> has to be willing to sacrifice something. Then we can speak about
> wasteful vs purposeful sacrifice.

pv *is* my own show - this is not really a choice I've made, but it's
because, until recently, no-one has opted to join in. I haven't put much
effort in getting people to join, admittedly, and for a good reason, I
think:

When it comes to things panoramic, pv is a complete rewrite. Of course
I've drawn on panotools stuff - some of panotools is very cleverly done
indeed, so why not - but it's my opinion that to conceive of a complex
new software, one best does it alone or only with a very small close-knit
team. I did it alone, and so I have an image of the whole thing present
in my mind and I can change and rearrange quite effortlessly. Getting
other people in requires a communication overhead, which I'm currently
unwilling to expend for the core development. But I'm happy to help
making pv - as it is - more accessible, and I think that's happening. I
first wanted something to show, rather than just talk about stuff. So
that's where I am now, and when the software becomes more easily
available, it will turn out whether what I have to show is as attractive
as it seems to me.

If you want to help me, and pv, please simply *use it* and *share your
thoughts on it*. And, if you like it, *share it with others* who you
think can benefit from it. I think it's good enough to attract people by
its virtues, but if it's not seen, nobody will know about them.

When it comes to team-building, that should come naturally. I'm not in a
rush to get anywhere - for now, I'm simply sharing what I came up with
when I tried to make 'the panorama and image viewer I always wanted'. I
don't have some sort of agenda.

>> I have tried to help keep my python interface afloat [...]
>
> Comments are at a technical level. Need to fix the team level first.
> Going technical here is diverting from the problem that asks for a fix.

I don't mean to go technical on you; this is merely a statement of the
extent of my involvement with *hugin* right now. My presence on
hugin-ptx is more for the second part of its longer title: 'other free
panoramic software'. I see hugin-ptx as a platform to discuss more than
just hugin, so that's where I feel I fit in, and I take the liberty to
post and exchange views on hugin-ptx which may have not much to do with
hugin - like pv.

>> I'd still like to see hugin move [from Sourceforge].
>
> You'll have to discuss that with the current maintainer.
> ...

It's really just a minor itch, and I won't scratch it. I'd actually be
more excited to see someone fork hugin, maybe with some fresh ideas. I'm
not the one.

I don't know who the current maintainer is, and I'm not really involved
with hugin anymore. I keep a clone of the repo in my source tree and
pull every now and then, and sometimes I install the package - mostly
when I do system upgrades. After pulling and building, or after
installing, I usually need a few fixes to get it to run so I can use it.
I see it evolves very slowly now, but I can live with that. Lately I
tried to help a bit, when there was a problem with my contribution,
which I saw by chance, but that was about it.

>> Gosh, I did not even see that [F*Book]. Yeah, posting to hugin-ptx
>> has not been rewarding for me recently.
>
> The world is changing under the corrupting influence of the fake news
> business. However, if posting to a mailing list or publishing software
> under a license that encourages distribution and multiplication is done
> for the purpose of reward, I strongly encourage self-reflection if not
> therapy. Give without expectations, or don't give, for you will be
> eaten alive.

I don't quite get you. Why do I post to a mailing list? Not to just
blare out whatever I have to say, but also to get an echo - preferably
from what one might think of as the 'peer group'. That's the reward I'm
talking about - exchange of thoughts, of new bits and bobs, hints,
opinions, dev talk. I'm not interested in money for my project, if
that's what you think I mean with 'reward'. I share my work out of a
sense of kinship with the free software world, which has been good to me
and enriched my life immensely. I want to give something back - after
making software for 35 years, I feel it's time for my 'masterpiece', and
I want to give it to everyone who can benefit from it. And my
masterpiece is *not* pv, it's vspline. But that is quite beyond most
people - it combines several novel techniques, and each relies on the
other, so it's hard to understand. pv is the showcase to demonstrate
vspline's viability in a field which is not too esoteric - namely image
processing. And it's been a source of inspiration to find bits that
vspline is missing.

What I dislike most about the whole social media business is that there
seems to be no more search for relevance. No-one will 'like' my software
within their limited attention span, and the search engines amplify this
effect by being blind to 'deep' insight. I have a good example for what
I mean: I publish some of my panoramas on 360cities. 360cities shares -
or shared, I didn't check recently - with google earth. Now I do full
sphericals, often from passes and peaks, with views often spanning
hundreds of square kilometres. The images are relevant to a large area.
But because few people have clicked on them so far, you have to
'approach' the point where I took them very closely and zoom in a lot,
before the little icon pops up - often in some inhospitable place in the
middle of nowhere, where no-one thinks of going because there is
'nothing' there, so the panorama is never seen - and therefore never
up-ranked. Whereas some irrelevant cute-puppy-image down in the valley
may be spotted from 'outer space' if you get my drift. So clearly there
is no notion of relevance in the ranking algorithm. Try and find pv in a
search engine!

>> Since you're playing devil's advocate (hah) - I can wrap all
>> panotools transformations in vspline
>
> please, do! be an inspiration. this is still technical, and does not
> address the governance issues that ultimately lead to the demise of the
> community. you may try to create a new community around pv (look, you
> got already some traction!) but the limit will be the 'run my own show'
> and not operational or technical.

Thank you for the advice, and I'll keep it in mind. See, I'm already
trying to show people how they can 'slot in' pv instead of other tools,
I'll just take my time 'sucking in' more functionality - I don't want to
overextend myself. And there are technical reasons: I still have the
python thing at the back of my head, and currently interfacing with pv
is a bit awkward, it's mainly CL stuff and a bit of (G)UI. I'd like to
eventually open pv up to become a platform, with plugins slotting in to
provide all sorts of services, like projections, filters, I/O - I just
got carried away in the last one or two years when I drifted into
synoptic displays of several images: I started with cubemaps, and all of
a sudden all sorts of interesting new possibilities opened up - all
the way to my recent implementation of the B&A algorithm. So the UI and
python aspect was postponed. I got very inspired myself and happily let
myself be carried away. I do in fact have a lot of fun with pv ;)

>> with pv, I like to 'move fast and break things'.
>> Come for the ride!
>
> just reading the past few exchanges, if you had granted Kornel and
> Harry write privilege to pv's repository, it would have moved faster.

As you say, bitbucket is a bit 'exotic'. If people so desire, they can
simply do forks and pull requests, as is the state of the art. I don't
have to grant anyone repo access for that. And I'd sure like to do
exchange on bitbucket's issue tracker rather than cluttering hugin-ptx
when it comes to technical stuff, but I have to pick up people where
they are - if they want to send in patches, that's fine with me even if
I have to suck them in manually. I'm in no rush. If I get bored applying
diffs, we'll think of something I'm sure.

> one of them could have even released, saving your brain cycles for the
> inspirational level tech. not sure about the breaking, but that is
> what makes the ride interesting. The best is when you grant privilege
> and promote those that are faster and better than you. Which is what
> happened with Hugin. The worst is when it turns out that those who are
> faster and better than you neglect the other passengers on the train.
> All of a sudden, they find themselves alone in the locomotive 'running
> their own show' while the wagons behind have disconnected, one after
> another, and gone to other tracks.

One of my aims in life is to achieve a high level of vertical
integration. It's good sometimes to be forced to see other aspects of
the process than just the 'inspirational level', even though it may not
be so 'glorious'. And it makes me humble, because I see all sort of
things which I can't yet do or do well.

> I like what I see happening with pv. Up to you to decide if you are
> strong enough to water down some of the 'run my own show' in exchange
> for 'running together.' A similar reflection will come sooner or later
> on the Hugin side. All I can hope is that both will reach that
> reflection point in useful time, and a community can be once again
> energized to team up. I am happy to facilitate team building, but I
> will not be prodding reluctant non-participants that 'run their own
> show.'

Again, I'll bear your advice in mind. If you look at the exchange with
my two newly-found collaborators, I think my style of dealing with team
work is quite apparent. I'm very happy for them to help the project
along, and try to be friendly and encouraging and to learn from them as
well. I'll just play it by ear for now, see how it goes. And who knows -
maybe the 'new kid on the block' will inspire the old neighborhood to do
some new tricks.

Kay

Kay F. Jahnke

unread,
Mar 14, 2021, 2:29:34 PM3/14/21
to hugi...@googlegroups.com
On 14.03.21 at 14:20, Harry van der Wolf wrote:
> Another script for pv for Linux/BSD/MacOS.
>
> ...

Harry, you could use --snapshot_prefix=...

if you pass --snapshot_prefix=my_prefix

you get my_prefix.fused.exr

Kay

Harry van der Wolf

unread,
Mar 14, 2021, 3:02:30 PM3/14/21
to hugi...@googlegroups.com
Hi,

Thanks for the tip, but that does not really help.

I had mv "${PTO}.pv.1.fused.${SUFFIX}" "${OUTPUT}" in the script.

When I use --snapshot_prefix="${PREFIX}", I get an image like ${PREFIX}1.fused.${SUFFIX}.

The "1.fused" does not really help in this case as I will now get "${PREFIX}1.fused.${SUFFIX}" and I still need to do the renaming.
Or accept the "1.fused" and do some "magic" with name shuffling in my exiftool command, because it is not the name I specified.

I am so stubborn that if I choose an output name, I also want that exact output name. ;)
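
For what it's worth, the renaming step could be wrapped in a tiny
helper so a script does not have to hard-code the suffix. This is just
a hypothetical sketch following the '1.fused' naming shown above; the
function name is my own invention:

```python
import glob
import os

def rename_fused(prefix, output):
    """Find the single '<prefix>1.fused.*' file pv wrote and rename it
    to the exact name the user asked for."""
    matches = glob.glob(glob.escape(prefix) + "1.fused.*")
    if len(matches) != 1:
        raise RuntimeError("expected exactly one fused output, got %r" % matches)
    os.rename(matches[0], output)
```

E.g. rename_fused("pano_", "sunset.exr") would pick up
pano_1.fused.exr and move it to sunset.exr.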

Harry

On Sun 14 Mar 2021 at 19:29, 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com> wrote:
--
A list of frequently asked questions is available at: http://wiki.panotools.org/Hugin_FAQ
---
You received this message because you are subscribed to the Google Groups "hugin and other free panoramic software" group.
To unsubscribe from this group and stop receiving emails from it, send an email to hugin-ptx+...@googlegroups.com.

Kay F. Jahnke

unread,
Mar 15, 2021, 3:26:13 AM3/15/21
to hugi...@googlegroups.com
On 14.03.21 at 20:02, Harry van der Wolf wrote:
> Hi,
>
> Thanks for the tip, but that does not really help.
>
> I had mv "${PTO}.pv.1.fused.${SUFFIX}" "${OUTPUT}" in the script.
>
> When I use --snapshot_prefix="${PREFIX}", I get an image like
> ${PREFIX}1.fused.${SUFFIX}.
>
> The "1.fused" does not really help in this case as I will now get
> "${PREFIX}1.fused.${SUFFIX}" and I still need to do the renaming.
> Or accept the "1.fused" and do some "magic" with name shuffling in my
> exiftool command, because it is not the name I specified.

Okay, you got me there - I forgot about the number.

> I am so stubborn that if I choose an output name, I also want that exact
> output name. ;)

Feel free! If that's the only problem, I'm happy.

I usually work with folders. My workflow is like this:

- initially, all images are in one folder

- I run a script detecting brackets, which puts each bracket in a
separate folder, named something like IMG_0001_IMG_0003. If you're
interested, I'll share the script.

- next I run the pto-generating script in each of these folders,
producing a PTO named bracket.pto in each folder.

- Now I run pv:

pv [all kinds of args] IMG_????_IMG_????/bracket.pto

- finally, a quick loop:

for d in IMG_????_IMG_????
do
    mv "$d"/*.fused.exr "$d.exr"
done

- now I have one exr per bracket 'back' in the 'main' folder, ready for
the next step (like, stitch the fused images) and I have the PTOs in the
separate folders for future reference: merge to HDR, take out bits,
retouch... and I can also do a slideshow with the unbracketed images
mixed with the fused brackets:

ls *.JPG *.exr | sort | /path/to/pv -d5 -

Kay

Harry van der Wolf

unread,
Mar 15, 2021, 3:31:12 AM3/15/21
to hugi...@googlegroups.com


On Mon 15 Mar 2021 at 08:26, 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com> wrote:

I usually work with folders. My workflow is like this:

- initially, all images are in one folder

- I run a script detecting brackets, which puts each bracket in a
separate folder, named sth. like IMG_0001_IMG_0003. If you're
interested, I'll share the script.

- next I run the pto-generating script in each of these folders,
producing a PTO named bracket.pto in each folder.


Can you share that bracket-detecting script?

Harry

Kay F. Jahnke

unread,
Mar 15, 2021, 6:55:44 AM3/15/21
to hugi...@googlegroups.com
On 15.03.21 at 08:30, Harry van der Wolf wrote:
>
>
> On Mon 15 Mar 2021 at 08:26, 'Kay F. Jahnke' via hugin and other
> free panoramic software <hugi...@googlegroups.com
> <mailto:hugi...@googlegroups.com>> wrote:
>
>
> I usually work with folders. My workflow is like this:
>
> - initially, all images are in one folder
>
> - I run a script detecting brackets, which puts each bracket in a
> separate folder, named sth. like IMG_0001_IMG_0003. If you're
> interested, I'll share the script.
>
> - next I run the pto-generating script in each of these folders,
> producing a PTO named bracket.pto in each folder.
>
>
> Can you share that bracket detecting script?

Bear with me... it's a python script and the exiv2 module has changed
quite a bit, I'll need some time to get it running again.

Kay

Harry van der Wolf

unread,
Mar 15, 2021, 9:37:47 AM3/15/21
to hugi...@googlegroups.com


On Mon 15 Mar 2021 at 11:55, 'Kay F. Jahnke' via hugin and other free panoramic software <hugi...@googlegroups.com> wrote:

Bear with me... it's a python script and the exiv2 module has changed
quite a bit, I'll need some time to get it running again.

No problem. You gave me a hint that I should have thought of myself. So thanks. Really. :)
I am a heavy exiftool user (both on command line and via my jExifToolGUI) and in exiftool I can do  "exiftool -S -csv -filename -BracketSettings -ExposureCompensation -Sequencenumber fz82-br1*.jpg"
which gives me:
SourceFile,FileName,BracketSettings,ExposureCompensation,SequenceNumber
fz82-br1_20210310-01.jpg,fz82-br1_20210310-01.jpg,"5 Images, Sequence 0/-/+",0,1
fz82-br1_20210310-02.jpg,fz82-br1_20210310-02.jpg,"5 Images, Sequence 0/-/+",-1,2
fz82-br1_20210310-03.jpg,fz82-br1_20210310-03.jpg,"5 Images, Sequence 0/-/+",+1,3
fz82-br1_20210310-04.jpg,fz82-br1_20210310-04.jpg,"5 Images, Sequence 0/-/+",-2,4
fz82-br1_20210310-05.jpg,fz82-br1_20210310-05.jpg,"5 Images, Sequence 0/-/+",+2,5

So I can make my own script running over a folder, but I will investigate it better first. I could use some output options from exiftool to format it slightly better for easier parsing.
I might even add that to my jExifToolGUI as some kind of "export" option (pushing it to align_image_stack/pv or align_image_stack/enfuse) after having modified the metadata and keywords.
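
A minimal sketch of what such a script could do (hypothetical; it
assumes the CSV layout exiftool printed above, and starts a new
bracket whenever SequenceNumber resets to 1):

```python
import csv
import io

# sample rows in the format shown above (exiftool -csv output)
SAMPLE = """SourceFile,FileName,BracketSettings,ExposureCompensation,SequenceNumber
fz82-br1_20210310-01.jpg,fz82-br1_20210310-01.jpg,"5 Images, Sequence 0/-/+",0,1
fz82-br1_20210310-02.jpg,fz82-br1_20210310-02.jpg,"5 Images, Sequence 0/-/+",-1,2
fz82-br1_20210310-03.jpg,fz82-br1_20210310-03.jpg,"5 Images, Sequence 0/-/+",+1,3
fz82-br1_20210310-04.jpg,fz82-br1_20210310-04.jpg,"5 Images, Sequence 0/-/+",-2,4
fz82-br1_20210310-05.jpg,fz82-br1_20210310-05.jpg,"5 Images, Sequence 0/-/+",+2,5
"""

def group_brackets(csv_text):
    """Group file names into brackets: a new bracket starts whenever
    SequenceNumber resets to 1."""
    groups = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if int(row["SequenceNumber"]) == 1 or not groups:
            groups.append([])
        groups[-1].append(row["FileName"])
    return groups

print(group_brackets(SAMPLE))  # one bracket of five files
```

In a real script the CSV text would of course come from running
exiftool over a folder rather than from an inline sample.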

Harry

Kay F. Jahnke

unread,
Mar 15, 2021, 11:48:45 AM3/15/21
to hugi...@googlegroups.com
On 15.03.21 at 14:37, Harry van der Wolf wrote:
>
>
> On Mon 15 Mar 2021 at 11:55, 'Kay F. Jahnke' via hugin and other
> free panoramic software <hugi...@googlegroups.com
> <mailto:hugi...@googlegroups.com>> wrote:
>
>
> Bear with me... it's a python script and the exiv2 module has changed
> quite a bit, I'll need some time to get it running again.
>
>
> No problem. You gave me a hint that I should have thought about myself.
> So thanks. Really. :)

Hey, you're impatient! Never mind, find my script attached. I think the
pyexiv2 module changed with the move to python3, so I had to put in a
bit of work to get it to run again. But I've used the method for many
years to good effect. Goes without saying you use my scripts at your own
risk ;)

You'll need python3 and the pyexiv2 module, which you can get via pip3:

pip3 install pyexiv2

then just launch my script with a bunch of images, like:

bracket.py 3 *.JPG *.jpg

You should see output like this (I made a trial set):

IMG_20190624_112027_0.jpg has no CanonFi.BracketMode EXIF tag
using time proximity as sole criterion
IMG_20190624_112027_1.jpg has no CanonFi.BracketMode EXIF tag
using time proximity as sole criterion
IMG_20190624_112027_2.jpg has no CanonFi.BracketMode EXIF tag
using time proximity as sole criterion
distance 0:00:00
append ((datetime.datetime(2019, 6, 24, 11, 20, 27),
'IMG_20190624_112027_0.jpg'), (datetime.datetime(2019, 6, 24, 11, 20,
27), 'IMG_20190624_112027_1.jpg'))
distance 0:00:00
append ((datetime.datetime(2019, 6, 24, 11, 20, 27),
'IMG_20190624_112027_1.jpg'), (datetime.datetime(2019, 6, 24, 11, 20,
27), 'IMG_20190624_112027_2.jpg'))
distance 386 days, 22:08:21
distance 0:00:00
append ((datetime.datetime(2020, 7, 15, 9, 28, 48), 'IMG_3078.JPG'),
(datetime.datetime(2020, 7, 15, 9, 28, 48), 'IMG_3079.JPG'))
distance 0:00:00
append ((datetime.datetime(2020, 7, 15, 9, 28, 48), 'IMG_3079.JPG'),
(datetime.datetime(2020, 7, 15, 9, 28, 48), 'IMG_3080.JPG'))
distance 0:00:20
distance 0:00:00
append ((datetime.datetime(2020, 7, 15, 9, 29, 8), 'IMG_3081.JPG'),
(datetime.datetime(2020, 7, 15, 9, 29, 8), 'IMG_3082.JPG'))
distance 0:00:01
append ((datetime.datetime(2020, 7, 15, 9, 29, 8), 'IMG_3082.JPG'),
(datetime.datetime(2020, 7, 15, 9, 29, 9), 'IMG_3083.JPG'))
---------------------------------
processing group ['IMG_20190624_112027_0.jpg',
'IMG_20190624_112027_1.jpg', 'IMG_20190624_112027_2.jpg']
processing group ['IMG_3078.JPG', 'IMG_3079.JPG', 'IMG_3080.JPG']
processing group ['IMG_3081.JPG', 'IMG_3082.JPG', 'IMG_3083.JPG']

so in this case, three folders were created and the images belonging
together were moved there.

This works well even for large folders (hundreds of files) and is quite
fast as well.

Kay

bracket.py

Kay F. Jahnke

unread,
Mar 16, 2021, 11:34:13 AM3/16/21
to hugi...@googlegroups.com
[master 0b7d5d5] commits two build-related patches from Kornel, and the
change of the project name to 'lux' in CMakeLists.txt

I committed to master for now, trickle-down to the other branches will
follow soon-ish, I want to do more refactoring on the stitching code,
and there may be an issue with stitching alpha images as well, which I
want to investigate first.

Kay

Kay F. Jahnke

unread,
Mar 16, 2021, 1:29:29 PM3/16/21
to 'Kay F. Jahnke' via hugin and other free panoramic software
On 16.03.21 at 16:34, 'Kay F. Jahnke' via hugin and other free
panoramic software wrote:
> [master 0b7d5d5] commits two build-related patches from Kornel, and the
> change of the project name to 'lux' in CMakeLists.txt

Pushed the changes to build code to the repo, will take my time for the
c++ work.

Thanks for your patches!

Kay

Harry van der Wolf

unread,
Mar 16, 2021, 2:19:18 PM3/16/21
to hugi...@googlegroups.com
Question of compilation of pv/lux.

Whether you use cmake or simply the makefile-based make, compilation takes quite some time.
When using "make -j 4" to use 4 cores for compilation instead of 1, the compilation is much faster.
Of course, not every piece of software supports this; at least some years ago it didn't (I was the Apple maintainer for avidemux, which did not support it), and I really don't know if that has changed in general.
Can I use multiple cores to compile pv/lux?
I did a compilation this evening and so far I did not find issues.

Harry

Kornel Benko

unread,
Mar 16, 2021, 2:25:28 PM3/16/21
to hugi...@googlegroups.com
On Tue, 16 Mar 2021 19:19:02 +0100,
Harry van der Wolf <hvd...@gmail.com> wrote:
No problems here Harry. I always use 'make -j16'.

Kornel

Kay F. Jahnke

unread,
Mar 17, 2021, 3:08:18 AM3/17/21
to hugi...@googlegroups.com
On 16.03.21 at 19:25, Kornel Benko wrote:
> On Tue, 16 Mar 2021 19:19:02 +0100,
> Harry van der Wolf <hvd...@gmail.com> wrote:
>
>> Question of compilation of pv/lux.
>>
>> ...
>
> No problems here Harry. I always use 'make -j16'.

Maybe -j16 is taking it a bit far - you have to have a big machine for
that. Some of the sources need a lot of memory to compile, and raising
the number of threads too much may be counterproductive, driving the
system into swapping. I use -j6 on my four-core.

But yes, multithreaded compiles are no problem.

Kay

Kay F. Jahnke

unread,
Mar 17, 2021, 6:13:06 AM3/17/21
to hugi...@googlegroups.com
I'd like to tackle the find-the-font issue. CMake installs the font to a
platform-specific location. The easiest way to deal with that is to
simply hard-code this location in the C++ code and have it differ
between the branches, so in the master branch I've modified the code
pv_no_rendering.cc accordingly. It now does this:

- if the user has passed a font on the CL, that takes preference.

- next, lux will check the pwd; if the font isn't there,

- lux will try to access the target-specific location, assuming it has
been installed on the local machine

- if that fails, lux will exit
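
In Python pseudocode, the lookup order amounts to something like this
(a paraphrase for illustration only; lux itself is C++, and the
install path below is just a placeholder for the platform-specific
location baked in at build time):

```python
import os
import sys

# placeholder for the baked-in, platform-specific install location
INSTALLED_FONT = "/usr/local/share/lux/fonts/Sansation_Regular.ttf"
FONT_NAME = "Sansation_Regular.ttf"

def find_font(cl_font=None):
    """Return the first usable font, mirroring the search order:
    command line, then the working directory, then the install path."""
    candidates = []
    if cl_font:
        candidates.append(cl_font)       # 1. user-supplied font wins
    candidates.append(FONT_NAME)         # 2. look in the pwd
    candidates.append(INSTALLED_FONT)    # 3. platform install location
    for path in candidates:
        if os.path.exists(path):
            return path
    sys.exit("no font found")            # 4. give up
```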

I've groomed the CMakeLists.txt a bit and also added code to copy the
font's README to the font path, as required by the font's license.

For now I've only changed the master branch to this behaviour, so you
can try it out. Let me know what you think. If this works satisfactorily
on linux, I'll change the other branches accordingly.

@Harry: can you please tell me what path the font is installed to on the
mac, so I can adapt the mac branch?

I did a trial 'sudo make install' on my machine and it works fine. I
don't like having platform-specific code, but for this single issue I
can just about live with it ;)

The last push to master also should have resolved an issue with
stitching images with alpha channel, so 'manual deghosting' by erasing
unwanted bits from partials where other partials have content to 'fill
the gap' should now work as expected for stitches - exposure fusions and
faux brackets weren't affected.

I'll carry on grooming and refactoring for a while, and I may be able to
save some more memory. With the last commit, I already introduced new
code which only uses single-float masks, where pv had used RGB data for
convenience's sake. This should already have produced the bulk of the
memory savings I think are possible.

@Harry: please check memory use again, let me know how far I got memory
consumption down, and if the lowering of the number of threads with
--snapshot_threads now works for you.

Kay

Kornel Benko

unread,
Mar 17, 2021, 7:01:53 AM3/17/21
to hugi...@googlegroups.com
On Wed, 17 Mar 2021 11:12:56 +0100,
"'Kay F. Jahnke' via hugin and other free panoramic software"
<hugi...@googlegroups.com> wrote:

> I'd like to tackle the find-the-font issue. CMake installs the font to a
> platform-specific location. The easiest way to deal with that is to
> simply hard-code this location in the C++ code and have it differ
> between the branches, so in the master branch I've modified the code
> pv_no_rendering.cc accordingly. It now does this:
>
> - if the user has passed a font on the CL, that takes preference.
>
> - next, lux will check the pwd. if the font isn't there
>
> - lux will try and access the target-specific location, assuming it has
> been installed on the local machine
>
> - if that fails, lux will exit
>
> I've groomed the CMakeLists.txt a bit and also added code to copy the
> Font's README to the font path, as required by the font's license.

You could have used

if(APPLE)
# check APPLE before UNIX: CMake sets both to true on macOS
set(DataDir "${CMAKE_INSTALL_PREFIX}/${_pv}.app/Contents/Resources")
elseif(WIN32)
set(DataDir "${CMAKE_INSTALL_PREFIX}/Resources/")
elseif(UNIX)
set(DataDir "${CMAKE_INSTALL_PREFIX}/share/${_pv}/")
endif()
add_definitions(-DPV_FONTDATADIR="${DataDir}/fonts")

and later

install(FILES Sansation_Regular.ttf Sansation_1.31_ReadMe.txt DESTINATION
"${DataDir}/fonts")

> For now I've only changed the master branch to this behaviour, so you
> can try it out. Let me know what you think. If this works satisfactorily
> on linux, I'll change the other branches accordingly.
>
> @Harry: can you please tell me what path the font is installed to on the
> mac, so I can adapt the mac branch?
>
> I did a trial 'sudo make install' on my machine and it works fine. I
> don't like having platform-specific code, but for this single issue I
> can just about live with it ;)
>
> The last push to master also should have resolved an issue with
> stitching images with alpha channel, so 'manual deghosting' by erasing
> unwanted bits from partials where other partials have content to 'fill
> the gap' should now work as expected for stitches - exposure fusions and
> faux brackets weren't affected.
>
> I'll carry on grooming and refactoring for a while, and I may be able to
> save some more memory. With the last commit, I already introduced new
> code which only uses single-float masks, where pv had used RGB data for
> convenience's sake. This should already have produced the bulk of memory
> savings I think possible.
>
> @Harry: please check memory use again, let me know how far I got memory
> consumption down, and if the lowering of the number of threads with
> --snapshot_threads now works for you.
>
> Kay
>

Kornel

Bruno Postle

unread,
Mar 17, 2021, 7:29:13 AM
to Hugin ptx
On Wed 17-Mar-2021 at 11:12 +0100, Hugin ptx wrote:
>I'd like to tackle the find-the-font issue. CMake installs the font to
>a platform-specific location. The easiest way to deal with that is to
>simply hard-code this location in the C++ code and have it differ
>between the branches, so in the master branch I've modified the code
>pv_no_rendering.cc accordingly. It now does this:
>
>- if the user has passed a font on the CL, that takes preference.
>
>- next, lux will check the pwd. if the font isn't there
>
>- lux will try and access the target-specific location, assuming it
>has been installed on the local machine

You have hard-coded /usr/local in this check:

"/usr/local/share/lux/Sansation_Regular.ttf"

Whereas (for example in Linux distribution package), this would be:

"/usr/share/lux/Sansation_Regular.ttf"

CMAKE_INSTALL_PREFIX defaults to "/usr/local", but can be set to
whatever location the builder likes (for rpm I'm setting it to
"/usr"). Can you use this definition in a macro to compile-in the
installed location of the font?

--
Bruno

Kay F. Jahnke

unread,
Mar 17, 2021, 12:52:20 PM
to hugi...@googlegroups.com
Am 17.03.21 um 12:29 schrieb Bruno Postle:
> On Wed 17-Mar-2021 at 11:12 +0100, Hugin ptx wrote:
>> I'd like to tackle the find-the-font issue. CMake installs the font to
>> a platform-specific location. The easiest way to deal with that is to
>> simply hard-code this location in the C++ code and have it differ
>> between the branches, so in the master branch I've modified the code
>> pv_no_rendering.cc accordingly. It now does this:
>>
>> - if the user has passed a font on the CL, that takes preference.
>>
>> - next, lux will check the pwd. if the font isn't there
>>
>> - lux will try and access the target-specific location, assuming it
>> has been installed on the local machine
>
> You have hard-coded /usr/local in this check:
>
>    "/usr/local/share/lux/Sansation_Regular.ttf"
>
> Whereas (for example in Linux distribution package), this would be:
>
>    "/usr/share/lux/Sansation_Regular.ttf"

Thanks for pointing this out - I did fear it wasn't as easy as my quick
shot :(

> CMAKE_INSTALL_PREFIX defaults to "/usr/local", but can be set to
> whatever location the builder likes (for rpm I'm setting it to "/usr").
> Can you use this definition in a macro to compile-in the installed
> location of the font?

Good question.

@Kornel: Maybe cmake can produce something like a pv_config.h header
which has such platform-specific information? That should not be too
hard - and it might be nice to maybe have a bit more platform-specific
info at hand, like the user's home directory, to put stuff in which
persists from session to session, and the location of the binary...

I wouldn't be surprised if cmake does that all the time - I'm just not
very good at it.

Kay

Kornel Benko

unread,
Mar 17, 2021, 2:11:54 PM
to hugi...@googlegroups.com
Am Wed, 17 Mar 2021 17:52:10 +0100
schrieb "'Kay F. Jahnke' via hugin and other free panoramic software"
<hugi...@googlegroups.com>:
I can do that. I have to know the variable-names and their value.
And of course the desired name of the header file.

Kornel

Kay F. Jahnke

unread,
Mar 18, 2021, 4:56:09 AM
to hugi...@googlegroups.com
Am 17.03.21 um 19:11 schrieb Kornel Benko:
> Am Wed, 17 Mar 2021 17:52:10 +0100
> schrieb "'Kay F. Jahnke' via hugin and other free panoramic software"

>> @Kornel: Maybe cmake can produce something like a pv_config.h header
>> which has such platform-specific information? That should not be too
>> hard - and it might be nice to maybe have a bit more platform-specific
>> info at hand, like the user's home directory, to put stuff in which
>> persists from session to session, and the location of the binary...
>>
>> I wouldn't be surprised if cmake does that all the time - I'm just not
>> very good at it.
>>
>
> I can do that. I have to know the variable-names and their value.
> And of course the desired name of the header file.

Sorry, @Kornel, I think this was yet another one of my quick shots. I
have doubts now after sleeping over it. If you were to generate a C++
header file which is compiled into the program, the resulting binary
contains a hard-coded path - even if this is only the third option if
the other two options fail. Can we be sure that such a binary will not
be produced/used on another platform/distro where it's bound to fail?

I've thought about this some more, and I think the ideal solution would
be to have an agent which resides on a machine and can produce
system-specific information. What I'd like is something which is not
part of the program, but local to the machine, yet can be accessed in a
platform-independent way. How about environment variables? As far as I
know they are available on every platform and they are accessible from
C++ via a standard header. To get the information into the program, they
are definitely a viable path. What do you think?
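
[Editor's note: the standard header in question is <cstdlib>; a minimal sketch of the env-var lookup, with the variable name passed in as a parameter — the helper name is illustrative, not pv's actual code:]

```cpp
// Reading a configuration value from the environment with std::getenv,
// which is standard C++ and available on all major platforms. It returns
// nullptr when the variable is unset, so the result must be checked
// before constructing a std::string from it.
#include <cstdlib>
#include <string>

std::string value_from_environment(const char * name)
{
  const char * p = std::getenv(name);
  return p ? std::string(p) : std::string();
}
```

In the thread's example, the call would be `value_from_environment("LUX_GUI_FONT")`, with an empty result meaning the variable is not set.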

If we were to go with this idea, the question is, how can we *set* the
environment variables in a platform-specific way, i.e. set an
environment variable LUX_GUI_FONT to
/usr/local/share/lux/Sansation_Regular.ttf on ubuntu? We can't use cmake
directly, because it won't be available on the user's machine, so there
must be an indirect way. If the installation process could set the
required variables persistently, builders could modify that step to fit
their needs, and the binary would remain platform-independent. I
wouldn't be surprised if cmake's package-building code would have
built-in support to set environment variables, but I couldn't find it
when I looked just now. One problem with session-wide environment
variables is that they won't be available until the user has logged in
again (speaking about linux here, if we were to modify e.g. ~/.profile)

Another way to have the right environment variable would be to launch
lux via a helper script, which I like anyway because it's a good way for
experienced users to configure the launch. The script can export the
environment variable, then launch lux. I think I'd favor this option.

I have added code to pv_no_rendering.cc to try and glean the value of an
environment variable LUX_GUI_PATH, if neither the GUI font from the
command line (if any) nor in the pwd (if any) can be loaded. It's the
'last resort'. With this option in the code, we can experiment if this
idea is a viable path.
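
[Editor's note: the three-step lookup order described above can be sketched as follows. The function and variable names, and the file-existence check via std::ifstream, are illustrative assumptions, not the actual code in pv_no_rendering.cc:]

```cpp
// Sketch of the font lookup chain: command-line argument first, then the
// pwd, then the LUX_GUI_PATH environment variable as a last resort.
#include <cstdlib>
#include <fstream>
#include <string>

static bool file_exists(const std::string & path)
{
  return ! path.empty() && std::ifstream(path).good();
}

// returns an empty string if no candidate font file can be found
std::string find_gui_font(const std::string & cl_font)
{
  if (file_exists(cl_font))
    return cl_font;                       // 1. command line takes preference
  if (file_exists("Sansation_Regular.ttf"))
    return "Sansation_Regular.ttf";       // 2. look in the pwd
  const char * env = std::getenv("LUX_GUI_PATH");
  if (env && file_exists(env))
    return env;                           // 3. last resort: environment
  return std::string();                   // caller decides whether to exit
}
```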

@Bruno: what do you think of this idea?

Kay

Kornel Benko

unread,
Mar 18, 2021, 5:45:16 AM
to hugi...@googlegroups.com
Am Thu, 18 Mar 2021 09:56:00 +0100
schrieb "'Kay F. Jahnke' via hugin and other free panoramic software"
<hugi...@googlegroups.com>:

> Am 17.03.21 um 19:11 schrieb Kornel Benko:
> > Am Wed, 17 Mar 2021 17:52:10 +0100
> > schrieb "'Kay F. Jahnke' via hugin and other free panoramic software"
>
> >> @Kornel: Maybe cmake can produce something like a pv_config.h header
> >> which has such platform-specific information? That should not be too
> >> hard - and it might be nice to maybe have a bit more platform-specific
> >> info at hand, like the user's home directory, to put stuff in which
> >> persists from session to session, and the location of the binary...
> >>
> >> I wouldn't be surprised if cmake does that all the time - I'm just not
> >> very good at it.
> >>
> >
> > I can do that. I have to know the variable-names and their value.
> > And of course the desired name of the header file.
>
> Sorry, @Kornel, I think this was yet another one of my quick shots. I
> have doubts now after sleeping over it. If you were to generate a C++
> header file which is compiled into the program, the resulting binary
> contains a hard-coded path - even if this is only the third option if
> the other two options fail. Can we be sure that such a binary will not
> be produced/used on another platform/distro where it's bound to fail?

Sure we can. Because the installed ttf is exactly there, where it is expected.
In CMakeLists we specify the destination. You have to use 'PV_FONTDATADIR' in
pv_no_rendering.cc to find the font.

> I've thought about this some more, and I think the ideal solution would
> be to have an agent which resides on a machine and can produce
> system-specific information. What I'd like is something which is not
> part of the program, but local to the machine, yet can be accessed in a
> platform-independent way. How about environment variables? As far as I
> know they are available on every platform and they are accessible from
> C++ via a standard header. To get the information into the program, they
> are definitely a viable path. What do you think?

This agent would also have the same problem, so I am not optimistic.

> If we were to go with this idea, the question is, how can we *set* the
> environment variables in a platform-specific way, i.e. set an
> environment variable LUX_GUI_FONT to
> /usr/local/share/lux/Sansation_Regular.ttf on ubuntu? We can't use cmake
> directly, because it won't be available on the user's machine, so there
> must be an indirect way. If the installation process could set the
> required variables persistently, builders could modify that step to fit
> their needs, and the binary would remain platform-independent. I
> wouldn't be surprised if cmake's package-building code would have
> built-in support to set environment variables, but I couldn't find it
> when I looked just now. One problem with session-wide environment
> variables is that they won't be available until the user has logged in
> again (speaking about linux here, if we were to modify e.g. ~/.profile)

No need, since the executable knows the position.

> Another way to have the right environment variable would be to launch
> lux via a helper script, which I like anyway because it's a good way for
> experienced users to configure the launch. The script can export the
> environment variable, then launch lux. I think I'd favor this option.

Sure, that is how I am using it, but is not ideal IMO.

> I have added code to pv_no_rendering.cc to try and glean the value of an
> environment variable LUX_GUI_PATH, if neither the GUI font from the
> command line (if any) nor in the pwd (if any) can be loaded. It's the
> 'last resort'. With this option in the code, we can experiment if this
> idea is a viable path.

I have seen it already. Again, why not use PV_FONTDATADIR?

> @Bruno: what do you think of this idea?
>
> Kay
>

The patch in CMakeLists.txt should take care of blanks in path to the ttf file.

Kornel
CMakeLists.txt.4.patch

Kay F. Jahnke

unread,
Mar 18, 2021, 5:53:03 AM
to hugi...@googlegroups.com
Am 18.03.21 um 10:45 schrieb Kornel Benko:
> Am Thu, 18 Mar 2021 09:56:00 +0100
> schrieb "'Kay F. Jahnke' via hugin and other free panoramic software"
> <hugi...@googlegroups.com>:
>
>> Am 17.03.21 um 19:11 schrieb Kornel Benko:
>>> Am Wed, 17 Mar 2021 17:52:10 +0100
>>> schrieb "'Kay F. Jahnke' via hugin and other free panoramic software"
>>
>>>> @Kornel: Maybe cmake can produce something like a pv_config.h header
>>>> which has such platform-specific information? That should not be too
>>>> hard - and it might be nice to maybe have a bit more platform-specific
>>>> info at hand, like the user's home directory, to put stuff in which
>>>> persists from session to session, and the location of the binary...
>>>>
>>>> I wouldn't be surprised if cmake does that all the time - I'm just not
>>>> very good at it.
>>>>
>>>
>>> I can do that. I have to know the variable-names and their value.
>>> And of course the desired name of the header file.
>>
>> Sorry, @Kornel, I think this was yet another one of my quick shots. I
>> have doubts now after sleeping over it. If you were to generate a C++
>> header file which is compiled into the program, the resulting binary
>> contains a hard-coded path - even if this is only the third option if
>> the other two options fail. Can we be sure that such a binary will not
>> be produced/used on another platform/distro where it's bound to fail?
>
> Sure we can. Because the installed ttf is exactly there, where it is expected.
> In CMakeLists we specify the destination. You have to use 'PV_FONTDATADIR' in
> pv_no_rendering.cc to find the font.

Please explain. What do you mean by 'use'? Is PV_FONTDATADIR an
environment variable? Or something CMake will patch in the source?

Kay