I do HDR shots and processing. The tone mapping processes themselves
tend to produce halos and shadows around parts of images in my experience.
On 2/26/21 11:27 PM,
mkogoj...@gmail.com wrote:
> " I know at least /.../ darktable can output in linear color space
> without applying a gamma curve "
> Okay, so part of the solution is to go into Darktable and disable the
> base curve. This worked, though it's too bad it adds another step to the
> workflow. Choosing the darktable option from Hugin's import doesn't give any
> options to specify, such as styles that could be used to define linear
> settings... does it work well for you? I'd try it, but for now it
> just crashes.
Good luck with Darktable. I've never been able to figure out its user
interface. I use RawTherapee to process my RAW files.
I'm pretty sure all programs that process RAW files have a way to set
and save preferences, so I'd think you could save a preference to output
in linear color space without the gamma curve?
> "The response curve is applied to all color channels. /.../ Maybe there
> is a problem with the white balance instead."
> Camera manufacturers add subtle changes to the RGB output channels
> in-camera. It's mostly done to keep a cohesive look across their cameras,
> much like a modern photographer may use a certain LUT they like so all of
> their work looks the same. Normally raw files are immune to this because
> there is no in-camera processing going on, but the point here is to check
> for inconsistencies. It could be anything - sensor age or wear.
Camera manufacturers could also be compensating for issues in the
sensors themselves.
I've also seen sensor tests (I think at DXOMark?) indicating that
sensors' effective color depth (in bits) changes over the ISO range.
Generally, the higher the ISO, the lower the effective bit depth.
I always shoot at ISO 100, so I don't know what impact shooting at higher
ISOs might have.
> "Hmm, if you opt for linear color, does color space matter?"
> Technically those are two different things. A colour space can be any
> coordinate system you can describe within the limits of human visual
> ability (roughly 400-700 nm), while linearity means having a constant step
> "distance" within that coordinate system. I imagine.
I tend to think of linearity as an absolute number recording the
sensor's response to whatever light frequencies it's sensitive to. For
instance, some digital camera sensors are sensitive to the infrared
ranges used by some autonomous/assisted car systems, and their sensors
have been damaged by such cars' laser-generated beams.
Color space, yeah, that's different.
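To make the distinction concrete, here's a small sketch in Python (just the
standard sRGB transfer function applied to made-up values, nothing specific
to any camera). The color space fixes the primaries and white point; the
transfer curve decides whether equal steps in the file mean equal steps in
light:

# Sketch: same color space, two encodings of the same light levels.
# Linear values are proportional to the light hitting the sensor; the
# sRGB transfer curve redistributes them so dark tones get more codes.
def srgb_encode(linear):
    """Standard sRGB transfer function (IEC 61966-2-1)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

# Equal steps in linear light...
for lin in (0.0, 0.25, 0.5, 0.75, 1.0):
    # ...become unequal steps once gamma-encoded.
    print(f"linear {lin:.2f} -> sRGB-encoded {srgb_encode(lin):.3f}")

A linear 0.25 comes out around 0.54 after encoding, which is why a file
written without the curve looks dark until the viewer applies one.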
> "What impact would [15bit colour depth] that have on "linear" color space?"
> More colours between full black (0,0,0) and full white (1,1,1). If the
> system is new or very uncommon it could cause demosaicing problems, but
> that is a different topic.
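To put a rough number on "more colours" (just counting code values per
channel, nothing camera-specific):

# Quick count of distinct code values per channel at a few bit depths.
for bits in (8, 12, 14, 15, 16):
    print(f"{bits:>2}-bit: {2 ** bits:>5} levels per channel")
# 8-bit: 256, 12-bit: 4096, 14-bit: 16384, 15-bit: 32768, 16-bit: 65536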
Color depth is just how finely the intensity range detected by the
particular sensor gets recorded. Some sensors don't have the same color
depth across the three color channels.
Some cameras truncate their sensor readings before recording them as a
RAW file. My old camera's sensor recorded 16-bit color but cut off the
high bit in the RAW file. I think some other cameras use 15-bit sensors
but truncate the colors down to 12-bit when recording. I think it's a
function of how much noise removal the camera manufacturer wants in
their camera?
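If it helps, here's a toy sketch of that kind of truncation with a made-up
16-bit reading (not any particular camera's pipeline):

# Toy sketch: truncating a raw sample from 16-bit to 12-bit precision.
# The reading is made up; real pipelines differ per manufacturer.
sample_16bit = 53417              # 16-bit reading, range 0..65535
sample_12bit = sample_16bit >> 4  # drop the 4 lowest bits, range 0..4095
print(sample_16bit, "->", sample_12bit)   # 53417 -> 3338

The bits that get dropped are the finest intensity steps the sensor
reported, which is where that noise trade-off shows up.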