Are they the same bitwise, or do they just look the same when opened in
Photoshop? The latter would be no surprise, since you can't see the true
content of an HDR file. Photoshop always applies some kind of auto
exposure, which gives about the same result as for a standard photo.
You can see the difference if you push the exposure slider until the
highlights clip by about the same amount. You should need different
values for the different exposure steps, at least if the source images
were exposed differently.
If you open the images in Photomatix you can display the HDR histogram,
which should clearly show that they have different mean brightness.
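That auto-exposure effect is easy to demonstrate. Here is a minimal Python sketch (toy pixel values, no real image I/O) showing why two HDR files with clearly different mean brightness can still display identically in a normalizing viewer:

```python
# Two "HDR images" that differ only by a global exposure factor.
hdr_a = [0.02, 0.5, 1.8, 6.4, 25.0]   # made-up linear radiance values
hdr_b = [v * 4.0 for v in hdr_a]      # same scene, two stops brighter

# Their mean brightness (what an HDR histogram would show) differs:
mean_a = sum(hdr_a) / len(hdr_a)
mean_b = sum(hdr_b) / len(hdr_b)
print(mean_b / mean_a)  # 4.0 -- clearly different files

# But an auto-exposed preview normalizes each image to its own maximum,
# so both display identically:
disp_a = [v / max(hdr_a) for v in hdr_a]
disp_b = [v / max(hdr_b) for v in hdr_b]
print(disp_a == disp_b)  # True -- indistinguishable on screen
```

The same reasoning explains why you need different slider positions for differently exposed source files before the previews match.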
The hdr blend planes are provided for the same reason the LDR blend
planes are: to edit for ghosts between the different exposure steps. For
HDR blend planes you need a program that is capable of using and editing
HDR layers.
--
Erik Krause
http://www.erik-krause.de
I think HDR layers are supported since Photoshop CS3 (I still have CS2).
> Hello Erik
> Maybe I have what we call in Switzerland a "Knopf" (a knot in my
> head). In which sense are HDR layers supported in PS? Do you mean
> having them as normal layers with a special mode, or as an
> "Ebenenstapel" (a layer stack)?
> regards
> David
HDR layers are not supported in CS3 Standard, but AFAIK they are in CS3
Extended.
>
> On 14 Nov., 21:17, Erik Krause <erik.kra...@gmx.de> wrote:
>> On 14.11.2010 21:07, Kempinski wrote:
>>
>> > May I ask, what Do You mean, which software allows to combine these
>> > hdr layers to a real hdr?
>>
>> I think HDR layers are supported since Photoshop CS3 (I still have CS2).
>>
>> --
>> Erik Krause
>> http://www.erik-krause.de
>
--
Bjørn K Nilssen - http://bknilssen.no - 3D and panoramas.
> On Nov 15, 8:53 am, Laszlo <laszloadr...@gmail.com> wrote:
>> I use CS4 extended but the hdr blend plane file and the hdr file are
>> the same: one layer, the background. The LDR .ps file has the layers
>> but is LDR.
>> What program is able to work with HDR blend planes?
>
> There might be a terminology misunderstanding:
> LDR refers to 8 or 16 bit integer data/files.
> HDR refers to a 32 (or even 64bit) floating point data/file which can
> hold a far greater range of brightness values (aka dynamic range).
The way bit depths are used for files is very confusing IMO.
You have both 32 bit (HDR float) and 32 bit (RGBA), which confuses many
people.
And you have 8 bit (RGB = 24 bit) and 8 bit (grey = 8 bit).
And 64 bit (RGBA x 16 bit) and 64 bit (HDR float).
It would be nice to have some consistency here, like 8 bpc (bits per
channel)?
A lot of people have problems with dpi/ppi, but bit depth is certainly not
any easier...
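To spell out that ambiguity, a tiny Python sketch of bits per channel versus bits per pixel (the channel counts are just the usual RGB/RGBA conventions):

```python
# "N bit" is ambiguous: it may mean per channel (bpc) or per pixel.
def bits_per_pixel(bits_per_channel, channels):
    return bits_per_channel * channels

print(bits_per_pixel(8, 4))   # 32 -- 8 bpc integer RGBA
print(bits_per_pixel(32, 1))  # 32 -- a single 32 bit float channel
print(bits_per_pixel(8, 3))   # 24 -- the common "8 bit" RGB file
print(bits_per_pixel(16, 4))  # 64 -- 16 bpc integer RGBA
print(bits_per_pixel(32, 3))  # 96 -- 32 bit float RGB, one HDR layout
```

Two very different pixel formats can thus both be advertised as "32 bit", which is exactly the confusion above.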
> LDR blend planes are mathematically transformed into a single HDR
> plane.
> Since we can't directly see/use the HDR image data, it must be down-
> converted (aka tone-mapped) into an 8 or 16 bit file again for all
> practical purposes (with the exception of e.g. light spheres).
>
> (enfuse uses a similar path, but without the floating point
> transformation)
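For a concrete idea of what such a down-conversion can look like, here is a minimal Python sketch of a global tone mapping curve in the style of the Reinhard operator, L/(1+L), which compresses 0..infinity into 0..1 (real tone mappers are far more elaborate, often local):

```python
# A minimal global tone mapper: display = L / (1 + L).
# It maps the unbounded HDR luminance range smoothly into 0..1,
# which can then be quantized to an 8 or 16 bit integer file.
def tone_map(luminance):
    return luminance / (1.0 + luminance)

for L in [0.0, 0.5, 1.0, 10.0, 1000.0]:
    print(L, "->", round(tone_map(L), 3))
# 0.0 maps to 0.0, 1.0 maps to 0.5, and very bright values approach 1.0
```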
>
> Based on that, why do you need HDR blend planes?
Wouldn't that allow easier ghost removal?
> no, ghost removal happens in the LDR stage, not in the HDR stage. once
> it has been merged to HDR, it is what it is.
If that were really the case there would be no need for HDR blend
planes. I guess Joost doesn't implement something that is completely
useless. When the HDR blend planes were first introduced I asked pretty
much the same question. The answer was that each HDR blend plane
contains one warped and blended exposure step in HDR space.
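A hedged Python sketch of that idea (a toy linear camera model with made-up exposure times; real merging also weights by the camera response curve):

```python
# Toy model: three bracketed LDR exposures of the same pixel,
# shot at 1/4x, 1x and 4x exposure time. LDR values live in 0.0..1.0,
# with anything brighter clipped to 1.0 (the "low" in LDR).
true_radiance = 3.0
exposure_times = [0.25, 1.0, 4.0]
ldr = [min(true_radiance * t, 1.0) for t in exposure_times]
print(ldr)  # [0.75, 1.0, 1.0] -- the two longer exposures clipped

# Dividing each LDR value by its exposure time brings all steps into
# one common HDR radiance space. Conceptually these are the HDR blend
# planes: one warped exposure step each, already in HDR units.
hdr_planes = [v / t for v, t in zip(ldr, exposure_times)]
print(hdr_planes)  # [3.0, 1.0, 0.25] -- only the unclipped one is right

# A merge keeps the unclipped measurements; ghost removal works the
# same way, by masking a plane out wherever something moved in it.
valid = [p for v, p in zip(ldr, hdr_planes) if v < 1.0]
merged = sum(valid) / len(valid)
print(merged)  # 3.0
```

Because the planes share one brightness scale, editing ghosts between them in HDR space is just masking, as with LDR blend planes.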
> On Nov 15, 11:53 am, Bjørn K Nilssen <b...@bknilssen.no> wrote:
>> > LDR refers to 8 or 16 bit integer data/files.
>> > HDR refers to a 32 (or even 64bit) floating point data/file which can
>> > hold a far greater range of brightness values (aka dynamic range).
>>
>> The way bit depths are used for files is very confusing IMO.
>> You have both 32 bit (HDR float) and 32 bit (RGBA), which confuses
>> many people.
>> And you have 8 bit (RGB = 24 bit) and 8 bit (grey = 8 bit).
>> And 64 bit (RGBA x 16 bit) and 64 bit (HDR float).
>> It would be nice to have some consistency here, like 8 bpc (bits per
>> channel)?
>> A lot of people have problems with dpi/ppi, but bit depth is certainly
>> not any easier...
>
> Bjørn,
>
> I tried to make it very simple and very clear, and I think your reply
> doesn't really help making the whole issue easier to understand.
Sorry about that. It wasn't really a reply, just a bit of venting my
frustration over the inconsistent usage of terms like 'x-bit files',
mixing integer and float, and bits per channel with bits per pixel.
You did make it clear!
Another way to describe the difference between LDR and HDR could be:
LDR = 0.0->1.0 for each channel/pixel, in 8 or 16 bit integer steps
HDR = 0.0->infinity for each channel/pixel, effectively stepless.
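That description can be made concrete with a small Python sketch (toy numbers, no image library):

```python
# LDR: values live on a fixed integer grid between 0.0 and 1.0.
# For 8 bit that grid has 256 steps; anything brighter than 1.0 clips.
def to_ldr8(value):
    clipped = min(max(value, 0.0), 1.0)
    return round(clipped * 255)  # one of only 256 codes

print(to_ldr8(0.5))   # 128
print(to_ldr8(7.3))   # 255 -- clipped, the highlight detail is gone

# HDR: a float can hold the same 7.3 (or 7300.0) without clipping,
# and the spacing of representable values scales with magnitude,
# so in practice the range behaves as stepless.
hdr_value = 7.3
print(hdr_value)      # 7.3 -- preserved
```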
>> > Based on that, why do you need HDR blend planes?
>> Wouldn't that allow easier ghost removal?
>
> no, ghost removal happens in the LDR stage, not in the HDR stage. once
> it has been merged to HDR, it is what it is.
OK. I've never tried it, but I thought you might get each position (with
x exposures) as a separate HDR layer.
But apparently that is not the case...
According to
http://en.wikipedia.org/wiki/Adobe_Photoshop#Version_history
Photoshop CS3 supports HDR layers. I must admit that I never tried it,
since I stopped using HDR once exposure fusion became available (I'm
happy with CS2).
The HDR blend planes have the same brightness and pixel format as the
HDR panorama, they are indeed intended for ghost removal.
Joost
So what is your workflow? You blend to HDR blend planes, then open the
blend planes in Photoshop and layer them? Do you then merge the layers
to HDR after removing ghosts, or do you use layer blending to come up
with your final image?
Moving people are always a pain - sometimes I will blend the image and then later on patch out that area after I've done all my other adjustments
Do you merge them into a 32-bit image after manually removing the ghosts?
Ryan
So you are not using PTGui to create the HDR files from your bracketed
images, but you are loading ready made HDR files?
In that case you don't need the blend planes as they will indeed be
identical to the panorama. You've skipped this step by creating your
own HDRs.
> So how do I edit my hdr results for ghosting using the blend planes?
> Or what am I doing wrong?
Just like you would edit any (non HDR) panorama:
- using the Mask function in PTGui 9:
http://www.ptgui.com/beta.html
- or by post processing in photoshop. See 9.10 (in your case you need
Individual HDR Layers):
http://www.ptgui.com/support.html#9_10
- or by applying an alpha channel to your source images. See 6.8:
http://www.ptgui.com/support.html#6_8
Joost
If you need ghost removal I would recommend using a dedicated HDR
program for now.
Joost
Only a bit. Different masks for different exposures of the same image
currently work only if you don't link the bracketed shots. If you use
this feature very carefully you can possibly avoid ghosts. But it is
hard to guess, since you don't have an instant preview.
I've reverted to enfusing aligned images prior to stitching for handheld
bracketed shots. It's much easier to control alignment and ghosting, and
it's much easier to edit layers later.
How do you succeed in aligning bracketed handheld fisheye HDR/enfuse
images outside PTGui?
I tried processing some fisheye brackets (not quite aligned) in
Photomatix a few years ago, with very little success. Probably because
they needed to be processed/warped/defished before aligning? In PTGui
they aligned fine when not linking them.
Yes, the program must know the geometry, which determines the way
things get misaligned. A simple shift and roll, as for rectilinear
lenses, doesn't work.
> In PTGui they aligned fine when not linking them.
Exactly. I started aligning them in PTGui but then switched to
align_image_stack, which is called automatically by my
enfuse_align_droplet (Windows only, part of the hugin installer).
align_image_stack needs to be told the geometry (fisheye and FoV).
Stitching unlinked shots in PTGui I always had the problem that PTGui
trades off alignment of the brackets against alignment of the whole
panorama. And later on I had problems retouching errors because of
contrast differences between the single warped images and the exposure
fused layer. Now, if a bracketed set doesn't align, I can selectively
replace a misaligned image with a fake exposure from a neighboring
bracket. Stitching is much faster and editing is easy. And last but not
least, I get a better tonal range from enfusing single images compared
to enfusing the complete blended panorama.
http://www.360cities.net/image/les-calanques-sugiton-southern-france was
made this way.