On 25 Jan., 12:58, George R <george...
> I find that sometimes when I launch Optimise for the first time on a
> new project it will run for a very long time.
I had a case recently when I optimized a bunch of panoramas with
autooptimiser. I went for a walk and anticipated coming home to an
optimized batch, but the process hung on one pano and had made it to
700 (!) iterations. I killed the process so the rest could have a go.
Mind you this was the only time something like that happened, and I
promptly proposed a sanity check for the optimizer:
... as you can see, no one bothered replying to my proposal, so your
chances probably aren't very good :(
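For what it's worth, the sanity check I had in mind was roughly this: cap the iteration count and bail out early once the error stops improving, so one stuck pano can't hold up a whole batch. A minimal sketch (my own names and thresholds, not hugin's actual optimizer code):

```python
def optimize(step, error, max_iter=100, min_improvement=1e-6):
    """Drive an iterative optimizer with two sanity checks:
    a hard iteration cap, and an early stop once the error
    improvement per iteration stalls below min_improvement."""
    prev = error()
    for i in range(1, max_iter + 1):
        step()          # one optimizer iteration
        cur = error()   # total error after this iteration
        if prev - cur < min_improvement:
            return cur, i, "stalled"
        prev = cur
    return prev, max_iter, "iteration cap hit"

# Toy objective that halves the error each step.
state = {"err": 1.0}
err, n_iter, reason = optimize(
    lambda: state.update(err=state["err"] * 0.5),
    lambda: state["err"],
)
# stops after ~20 iterations instead of running to 700
```

With a check like this, the 700-iteration hang would have been cut off long before I got back from my walk.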
> The image set is 8 stacks of 3x bracketed exposures; each image is a
> 70Mb 16-bit TIF.
> a) do other people notice this? Is it only on 16-bit TIFF files? Is
> it only on MacOS-X?
Apart from the 700-iterations mishap, I do projects like this (8-9
stacks of 16-bit TIFFs, 70MB each) quite regularly on my Thinkpad
(32-bit Intel Core 2 Duo with 3GB RAM, so it's an old box, running
Kubuntu 11.04) and have never noticed anything amiss.
> b) would it be possible for Optimise to make some sort of check either
> initially or between iterations, perhaps something like:
On my machine, with all versions I can remember, when running
optimization jobs from hugin, if I stop the optimizer by clicking
'Cancel', the result of the last iteration is nevertheless kept and
applied, rather than the run being cancelled and everything left as it
was before. Is that different on the Mac?
> - At the end of an iteration if a control point distance is more than
> X (more than x% away from the average?) and hasn't changed more than y
> % in that iteration then start to ignore it
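The quoted proposal could be sketched like this (a hypothetical illustration; `flag_outliers` and the x/y thresholds are my own, not anything in hugin):

```python
def flag_outliers(prev_dist, cur_dist, x=3.0, y=0.01):
    """After an iteration, flag control points whose distance is
    more than x times the average AND changed by less than a
    relative fraction y in that iteration, as candidates to ignore."""
    avg = sum(cur_dist) / len(cur_dist)
    flagged = []
    for i, (p, c) in enumerate(zip(prev_dist, cur_dist)):
        rel_change = abs(p - c) / max(p, 1e-12)
        if c > x * avg and rel_change < y:
            flagged.append(i)
    return flagged

# Three well-behaved CPs and one stuck at distance 50:
flag_outliers([1.1, 0.9, 1.0, 50.0],
              [1.0, 0.8, 0.9, 50.0])
# flags index 3, the stuck control point
```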
The optimizer algorithm does something similar to that. If it didn't
form a notion of which CPs are outliers, it could never arrive at a