Hmm, from my reading (and limited experience) with shooting frames to
be turned into an HDR image, you shoot one frame at whatever exposure
(at a fixed aperture) properly exposes the darkest parts, one frame at
the corresponding exposure that properly exposes the lightest parts,
and then space the intermediate exposures 2-3 stops apart between
them.
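To put rough numbers on it: if the shadows meter at, say, 1/15 s and
the highlights at 1/1000 s at your chosen aperture, that's a spread of
about 6 stops, so with 2-stop spacing you'd end up shooting something
like 1/15, 1/60, 1/250 and 1/1000.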
But I do my HDR image processing in QtPFSGui, not Hugin.
--
Gnome Nomad
gnome...@gmail.com
wandering the landscape of god
http://www.cafepress.com/otherend/
OK. I was just remembering doing some HDR sequences a while back,
seeing ranges of 7-9 stops, and thinking that a range of 4 stops isn't
very much ...
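(For scale: a 9-stop range means the brightest metered area is about
2^9 = 512 times brighter than the darkest, while 4 stops is only a
factor of 16.)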
> As I'm on this topic again, I'd like to add another hopefully helpful
> hint. Even though I already put my middle exposure at -1, the +1
> exposure is still way too bright, but I need it for the deep shadows.
> Nevertheless the result of the fusion often comes out too bright. It
> may just be my specific process, but I have often found that on top of
> tweaking --exposure-sigma, lowering --exposure-mu helps - it makes
> the overall result slightly darker and prefers the darker exposures,
> while there is still an appreciable fill light from the lightest exposure.
> So I set my default parameters for enfuse to
>
> --exposure-sigma=.1 --saturation-weight=0 --exposure-mu=.35
>
> and often enough that hits the spot :)
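>
> For anyone who wants to copy this, the full call then looks something
> like the following (the file names are just placeholders for the
> bracketed shots):
>
> enfuse --exposure-sigma=.1 --saturation-weight=0 --exposure-mu=.35 \
>   -o fused.tif bracket_-3.tif bracket_-1.tif bracket_+1.tif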
Cool. I still prefer to use QtPFSGui for my HDR images. I think I'd do
that even if I were going to combine the resulting image into a panorama.
On 27 Jul., 09:28, Jeffrey Martin <360c...@gmail.com> wrote:
> Personally I've had the best results using +1.5 and -0.5 exposures derived
> from a single raw file.
Differently processed single raw files are perfect source material -
if the scene can be captured with the dynamic range available. I don't
know what you have, but my sensor's dynamic range is somewhere in the
12 to 14 bit range, and my landscapes sometimes just don't fit into
that. I wish they would.
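If you want to script that route rather than click through a raw
converter, something along these lines should work (a rough sketch,
dcraw flags from memory; the brightness factors are 2^1.5 and 2^-0.5,
and the file names are placeholders):

  dcraw -w -W -T -b 2.83 IMG_0001.CR2 && mv IMG_0001.tiff plus_1.5ev.tif
  dcraw -w -W -T -b 0.71 IMG_0001.CR2 && mv IMG_0001.tiff minus_0.5ev.tif
  enfuse -o fused.tif plus_1.5ev.tif minus_0.5ev.tif

(-w = camera white balance, -W = don't auto-brighten, -T = write TIFF,
-b = brightness multiplier.)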
In fact I think the technology to expose the
sensor for a fixed period of time and then count the photons is silly.
What would be much more sensible is measuring the time it takes each
cell to reach saturation. When the exposure is finally stopped, those
cells which aren't full can still be photon-counted to define the
shadows. Store the result in a floating-point format and you end up
with a truly HDR raw image without any fuss, and you can then proceed
by exposure-blending different versions of it instead of doing the
cumbersome tone-mapping. I don't know whether that's technically
feasible, but why not?
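To put toy numbers on it: if a cell with a full-well capacity of, say,
40,000 electrons fills up after 1/500 s, the implied intensity is
40,000 x 500 = 20,000,000 electrons per second; a cell that collects
only 2,000 electrons over the whole 1/60 s exposure works out to
120,000 per second. That's a ratio of about 170:1, or well over 7
stops, captured in a single frame.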
Get to know the CCD chip in your camera. I don't know which CCD Canon
uses. In my Maxxum 7D, the CCD easily blows highlights.
> Wonder if anyone thought of that?
http://www.google.com/patents/about?id=MkDJAAAAEBAJ&dq=measure+time+to+saturation+%2Bphotography
coming to a camera in your neighborhood photo shop in 17 years?
Yuv
On 28 Jul., 09:39, Jeffrey Martin <360c...@gmail.com> wrote:
I suppose my 450D isn't too far from a 550D, but I'll nevertheless do
a couple of test shots to see if I can also recover two stops from
what it reckons is blown. What raw converter do you use?
You may want to take a look at RawTherapee 3.x. It can do a very good
job of restoring the highlights. I was able to get this [1], while the
in-camera JPEG looks like this [2], using the method described at [3].
It still has some ugly magenta tint in places, but RT was still able
to recover an incredible amount of detail.
[1] http://blender6xx.ic.cz/pub/_MG_2054.jpg
[2] http://blender6xx.ic.cz/pub/_MG_2054.thumb.jpg
[3] http://rawtherapee.com/forum/viewtopic.php?t=2907
Lukas
Odd, works flawlessly here.
> Furthermore I can't follow the method at
> [3] since my current version of RawTherapee [3.01 alpha 1] does not
> have a 'RAW' tab. I use up all my masochism using and programming for
> hugin ;-) I'll give it a while and once it says it's something like
> 3.2 final I'll give it a good long try again. Am I too impatient?
>
> As far as I can see from what I get (before it crashes) blown is
> blown. If the sensor limit is hit there just isn't anything you can
> do.
That is true. "Highlight recovery" works by using the extra bit depth
(12 bits) of the raw image to try to reconstruct an approximate 8-bit
color. The sensor in my Minolta is very quick to blow highlights, and
highlight recovery doesn't work very well there, if at all.
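As a rough illustration with made-up numbers: a 12-bit raw sample runs
from 0 to 4095, but the camera's JPEG rendering may place its white
point well below that, say around raw value 3000. The values between
3000 and 4095 then hold roughly log2(4095/3000), i.e. about half a
stop, of highlight detail the converter can still pull back - which is
why recovery helps with slightly clipped skies but can't resurrect a
photosite that has actually filled up.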