
Were CCDs better on colour than CMOS?


Rich
Jan 25, 2009, 3:46:57 PM
I heard this refrain time and again on the Olympus Dpreview group: that the
E-1 (CCD) had better colour than the NMOS sensors in the other Olympus
models. This is the first time I've seen it mentioned on the Nikon
group(s).

http://forums.dpreview.com/forums/read.asp?forum=1039&message=30779163

Chris L Peterson
Jan 25, 2009, 5:34:01 PM
On Sun, 25 Jan 2009 14:46:57 -0600, Rich <no...@nowhere.com> wrote:

>I heard this refrain time and again on the Olympus Dpreview group, that the
>E-1 (CCD) had better colour than the NMOS sensors in the other Olympus
>models. This is the first time I've seen it mentioned on the Nikon group

Getting accurate color on ordinary images with any electronic sensor (or
film, for that matter) is an extremely complex problem - much more so
than for astronomical imaging. People are instantly aware of very subtle
color errors when comparing an image to reality (which you can't do with
most astronomical targets). So each camera maker uses their own
proprietary magic to manipulate the data into a realistic gamut. Some do
better than others, and some do better on some camera models than on
others.

There are differences between CCDs and CMOS detectors that manifest
themselves in different ways, but I don't see color accuracy as one of
those. Maybe on older cameras, where CMOS sensors were more limited in
dynamic range than CCDs, there could be some processing issues. But for
the last few years, cameras with CCDs and CMOS have both had similar
dynamic range - about 12 bits - which is more than enough to process and
reduce to 8-bit images.
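
To make that last step concrete, here is a minimal Python sketch of
reducing 12-bit linear data to 8 bits through a gamma curve (the 1/2.2
exponent is an assumed stand-in for a camera's proprietary tone curve):

import numpy as np

# Minimal sketch: 12-bit linear sensor values -> 8-bit output.
# The 1/2.2 gamma is an assumption; real cameras apply their own
# proprietary tone curves on top of something like this.
def twelve_to_eight_bit(raw, gamma=1.0 / 2.2):
    linear = raw.astype(np.float64) / 4095.0      # normalize 12-bit range
    encoded = np.clip(linear, 0.0, 1.0) ** gamma  # perceptual encoding
    return np.round(encoded * 255.0).astype(np.uint8)

raw = np.array([0, 64, 512, 2048, 4095])          # sample 12-bit values
print(twelve_to_eight_bit(raw))                   # [  0  39  99 186 255]

Because the curve compresses highlights, 12 bits of linear data
comfortably survive the trip to 8 bits.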
_________________________________________________

Chris L Peterson
Cloudbait Observatory
http://www.cloudbait.com

Rich
Jan 26, 2009, 4:39:17 AM
Chris L Peterson <c...@alumni.caltech.edu> wrote in
news:kmppn4p19mgn5i2hd...@4ax.com:

Bit depth aside, photogs know from experience when something is "off." For
instance, I know one photog who kept a Nikon D200 after buying a D300
because they just could not render the same skin tone quality with the
newer model. No profile would do it. This, despite obvious improvements
in other areas (like noise) going from CCD to CMOS.

Quadibloc
Jan 26, 2009, 9:02:46 AM
On Jan 26, 2:39 am, Rich <n...@nowhere.com> wrote:
> This, despite obvious improvements
> in other areas (like noise) going from CCD to CMOS.

Interesting.

Noise used to be one of the big problems with CMOS image sensors.

I was reading up on this to find out what the differences were.
Originally, CCD sensors were used because they had good performance,
while the first CMOS sensors gave very noisy images. Then the CMOS
design was changed so that, instead of simply one light-sensing
element per cell, each cell also got a small amplifier. This
improved the noise performance.

Today, CMOS sensors have two advantages - they can be read out more
quickly, and they have inherent resistance to "blooming". But CCD
sensors have the advantage that the sensor area covers nearly the
whole chip, and the advantage that since there is only one set of
electronics reading the charge from all the cells, the image is
inherently uniform.

Thus, CCD is still the traditional choice, although some say that CMOS
has caught up with it.

John Savard

Chris L Peterson
Jan 26, 2009, 9:59:51 AM
On Mon, 26 Jan 2009 03:39:17 -0600, Rich <no...@nowhere.com> wrote:

>Bit depth aside, photogs know from experience when something is "off." For
>instance, I know one photog who kept a Nikon D200 after buying a D300
>because they just could not render the same skin tone quality with the
>newer model. No profile would do it. This, despite obvious improvements
>in other areas (like noise) going from CCD to CMOS.

CMOS sensors (in general) don't have better noise specs than CCDs (in
general). It's just that CMOS designers have found ways to reduce
readout noise so that the two are similar. For long exposures (as in
astronomy), both types of sensors still require cooling, and CCDs still
have superior dynamic range and more accurate readout, which is why they
remain the primary choice for scientific imaging.

In any case, I'm sure the assessment that something is "off" is
accurate; I just attribute it to something other than the difference
between CMOS and CCD sensor technology.

Pierre Vandevenne
Jan 26, 2009, 10:02:21 AM
Quadibloc <jsa...@ecn.ab.ca> wrote in news:b29b5bcd-0942-4d1a-8d31-
b6ad39...@n33g2000pri.googlegroups.com:

> whole chip, and the advantage that since there is only one set of
> electronics reading the charge from all the cells, the image is
> inherently uniform.

Yes, the non-linearity of the response of those small amplifiers remains
an issue.



> Thus, CCD is still the traditional choice, although some say that CMOS
> has caught up with it.

From a digital photography point of view, definitely. The Canon D30
(launched in 2000 if my memory serves me well, not the later 30D) was
arguably the first widely available, very good DSLR using a CMOS sensor.
Since then, I believe all Canon DSLRs have used CMOS sensors.

For high speed video, CCD is definitely on the way out.

There are a few SPIE papers exploring the use of CMOS technology for
astronomical applications.

Doug Jewell
Jan 26, 2009, 4:30:31 PM
I currently have a Samsung GX10 (a rebadged Pentax K10D) and a
Canon 450D. The Sammy has a CCD sensor and the Canon a CMOS
sensor. While they do have slightly different colour rendering,
for the most part I'd put it down to the post-processing rather
than the sensor technology.

One thing I have seen differ radically, though, is the
rendering of violet flowers. The Sammy gets them fairly
close, while the Canon nearly always renders them far too
blue. I don't know if this is because of the sensor
technology, or whether it is more to do with the design of
the colour filters on the sensor. What I suspect is
happening is that the red filter on the Sammy also lets
through a little light in the violet spectrum - which, to
match the human eye, it must do. The Canon, OTOH, is only
letting through light in the red spectrum.
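
A toy numeric illustration of that hypothesis (all curves here are
made-up Gaussians, not measured data): give one camera's red filter a
small secondary lobe near 420 nm and compare how much "red" each camera
sees in a violet subject.

import numpy as np

# Toy model - all curves hypothetical - of why a red filter with a
# secondary lobe near 420 nm renders violet less blue.
wl = np.arange(380.0, 701.0)                  # wavelengths in nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

blue          = gauss(460, 30)
red_with_lobe = gauss(600, 40) + 0.15 * gauss(420, 15)  # "Sammy"-like
red_plain     = gauss(600, 40)                          # "Canon"-like
violet        = gauss(420, 15)                # violet flower's light

for name, red in (("with lobe", red_with_lobe), ("plain", red_plain)):
    r = np.trapz(red * violet, wl)
    b = np.trapz(blue * violet, wl)
    print(f"red filter {name}: R/B response to violet = {r / b:.2f}")

# The filter with the violet lobe reports a meaningful red component,
# pulling the rendered colour from pure blue toward violet/purple.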

Pierre Vandevenne
Jan 26, 2009, 4:40:59 PM
On Jan 26, 10:30 pm, Doug Jewell <a...@and.maybe.ill.tell.you> wrote:

> through a little light in the violet spectrum - which, to
> match the human eye, it must do. The Canon, OTOH, is only
> letting through light in the red spectrum.

Here's a typical Canon response curve

http://www.astrosurf.com/buil/eos40d2/filter.htm

Couldn't find the Samsung one, though.

Rich
Jan 26, 2009, 7:32:30 PM
Quadibloc <jsa...@ecn.ab.ca> wrote in news:b29b5bcd-0942-4d1a-8d31-
b6ad39...@n33g2000pri.googlegroups.com:

> Today, CMOS sensors have two advantages - they can be read out more
> quickly, and they have inherent resistance to "blooming". But CCD
> sensors have the advantage that the sensor area covers nearly the
> whole chip, and the advantage that since there is only one set of
> electronics reading the charge from all the cells, the image is
> inherently uniform.

Three advantages. CMOS sensors use less power.

Rich
Jan 26, 2009, 7:33:32 PM
Pierre Vandevenne <pie...@datarescue.com_ns> wrote in
news:Xns9B9FA3281A94Cp...@127.0.0.1:

I think they've already started using them on a small scale. No 1G or
larger CMOS sensors at the back of professional telescopes yet. But it
seems like CMOS might be good on satellites.

Chris L Peterson
Jan 26, 2009, 8:19:18 PM
On Mon, 26 Jan 2009 18:32:30 -0600, Rich <no...@nowhere.com> wrote:

>Three advantages. CMOS sensors use less power.

Not necessarily. CMOS sensors are low power, but use it continuously
during exposures. CCD sensors use no power during exposures, only during
readout (and not much, even then). Any advantage of power savings is
tied to short exposure imaging, which doesn't generally include
astronomical imaging.

Of course, for astronomical imaging you normally use a cooled camera,
and the cooler uses several orders of magnitude more power than any type
of sensor.
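
A back-of-envelope comparison makes the point (the wattage and
readout-energy figures below are invented for illustration only):

# Hypothetical figures, just to show why the CMOS power advantage
# is tied to short exposures.
cmos_power_w  = 0.2     # drawn continuously while exposing
ccd_readout_j = 0.5     # CCD spends energy only at readout

for exposure_s in (0.01, 1.0, 60.0, 600.0):
    cmos_j = cmos_power_w * exposure_s
    print(f"{exposure_s:7.2f} s: CMOS {cmos_j:8.2f} J vs CCD {ccd_readout_j:.2f} J")

# For snapshots the CMOS energy per frame is negligible; for a
# 10-minute astronomical exposure the continuous draw dominates.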

Chris L Peterson
Jan 26, 2009, 8:21:02 PM
On Mon, 26 Jan 2009 18:33:32 -0600, Rich <no...@nowhere.com> wrote:

>I think they've already started using them on a small scale. No 1G or
>larger CMOS sensors at the back of professional telescopes yet. But it
>seems like CMOS might be good on satellites.

I don't know of any CMOS sensor being used for professional astronomical
imaging, but they are used in related applications, like wavefront
sensors for adaptive optics, and for autoguiders. In these applications
they have distinct advantages over CCDs.

MitchAlsup
Jan 27, 2009, 6:04:37 PM
A) Color is a function of the filter (array) in front of the sensor
cells. These filter arrays can be put in front of CCDs or CMOS sensors
with little change in output color attributable to the CCD-versus-CMOS
architecture of the cells.

B) CMOS sensors need to have the voltage turned on during the charge
accumulation stage or the charge can leak away. This only takes
leakage amounts of power (small). However, since camera CMOS sensors
are designed for short duration imaging, the A/D channels are left
powered on, and these do burn power and heat up one corner (or two) of
the sensor chip during the charge accumulation phase. In principle,
one COULD design a CMOS sensor that was power-competitive with CCD
sensors during the charge accumulation phase - it's just not been done
(yet). Most of the power of a CMOS sensor goes into the A/D channels.

C) There are situations where CMOS devices would not take well to being
chilled to N2 temperatures. I don't know the doping profiles used in
CMOS transistors, but some are not as tolerant of liquid-N2 temps as
CCDs are. Thermo-electric coolers don't get cold enough to run into
those transistor operating limits. In principle this could be
engineered away at some cost in ease of fabrication (i.e. yield).

D) CMOS sensors have faster readout times due to having a
transconductance amplifier in the sensor cell (or shared between 2
or 4 cells). This is a single transistor that takes the stored charge,
uses it as a gate voltage, and this voltage in turn causes the
transistor to operate as a current source. The column receiver supplies
the current and converts the current from the cell back to a voltage
for the A/D converters. Thus the stored charge does not have to compete
with the heavily loaded column wire, making the job of sensing the
stored charge much easier.

E) Camera sensors have A/D systems where the first 5%-ish and the last
5%-ish of the potential voltage range are not used. This means that
there does not have to be any temperature stabilization in the A/D
channels - and it costs about 0.3 bits of A/D resolution. So a
12-bit A/D sensor ends up with a dynamic range of 11.2 bits rather
than the 11.5 bits you would otherwise expect. I suspect the better
CCD A/Ds COULD temperature-stabilize and gain most of this minor loss
in S/N back. Whether they do or not is not known to me.
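
The bit-loss figure above is easy to check (simple arithmetic, not from
the post):

from math import log2

# Bits lost by discarding the ends of the A/D voltage range.
for usable in (0.90, 0.80):        # keep 90% (5% per end) or 80%
    print(f"usable {usable:.0%}: {log2(usable):+.2f} bits")

# Keeping 90% of the range costs only about 0.15 bits; the quoted
# 0.3-bit loss corresponds to keeping roughly 80% of the range.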

F) Depending on what KIND of noise one is talking about, either CMOS
or CCD sensors win. Readout noise is won by CMOS sensors because
all the tricky logic is on the same chip (correlated double sampling
readout); overall noise is won by CCD converters because (typically)
more money is spent on getting the A/D right - and on running it slower
so the flicker (1/f) noise is lower. CCDs win handily once dark
current noise is included (more than dozens of seconds of exposure), as
is typical for astronomical sensing.

Thus, CMOS makes for better sensors when the duration of the image
capture is short and there may be multiple images taken in a short
amount of time, while CCD sensors are better when the duration of
image capture is long.
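
A toy noise model (all electron counts hypothetical) shows the
crossover this summary describes: low read noise wins short exposures,
low dark current wins long ones.

import math

# Read noise and dark-current shot noise add in quadrature.
def total_noise_e(read_e, dark_e_per_s, exposure_s):
    return math.sqrt(read_e ** 2 + dark_e_per_s * exposure_s)

for t in (0.01, 1.0, 30.0, 300.0):
    cmos = total_noise_e(read_e=3.0, dark_e_per_s=1.0, exposure_s=t)
    ccd = total_noise_e(read_e=8.0, dark_e_per_s=0.05, exposure_s=t)
    print(f"{t:6g} s: CMOS-like {cmos:5.1f} e-, cooled-CCD-like {ccd:5.1f} e-")

# Short exposures favour the low-read-noise sensor; past a crossover
# (here somewhere above 30 s) the low-dark-current sensor wins.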

mitch

Quadibloc
Jan 27, 2009, 7:24:35 PM
On Jan 27, 4:04 pm, MitchAlsup <MitchAl...@aol.com> wrote:
> A) Color is a function of the filter (array) in front of the sensor
> cells.

Since color depends on the relative proportions of the three primaries
used - if, indeed, three primaries are used - there is one sensor
characteristic that will show up in color rendering, even if it is also
detectable elsewhere: linearity.

Sensors need to be linear - or linearized by software - and they need
to be normalized to a gamma of 1 when the color is determined from the
various inputs, even if the gamma of the luminance portion of the
picture is decreased later to enhance dynamic range.

Software can, of course, make up for known nonlinearities, and adjust
contrast, but I can believe that this might be harder to do for some
sensor types - especially if the maker doesn't bother to make these
adjustments at all, which may happen in at least some cases.
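
As a sketch of what such a software fix looks like (the response curve
here is invented): measure the sensor's transfer function, then invert
it with a lookup table so channel ratios are taken at gamma 1.

import numpy as np

# Hypothetical measured response: sensor compresses highlights slightly.
codes = np.arange(4096)
measured = 4095.0 * (codes / 4095.0) ** 0.95

# Inverse lookup table: raw code -> implied linear value.
lut = np.interp(codes, measured, codes.astype(np.float64))

def linearize(raw_codes):
    return lut[raw_codes]

# Colour should be judged on linearized values, at gamma 1:
r, g, b = linearize(np.array([1000, 2000, 3000]))
print(r / g, b / g)   # channel ratios computed on linear data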

John Savard

John Sheehy
Jan 28, 2009, 11:26:46 AM
Quadibloc <jsa...@ecn.ab.ca> wrote in news:8e63cb3d-c33d-47e8-99ae-
bff844...@e1g2000pra.googlegroups.com:

> Software can, of course, make up for known nonlinearities, and adjust
> contrast, but that this might be harder to do for some sensor types is
> not unbelievable to me. Especially if they don't bother to make these
> adjustments at all, which may happen in at least some cases.

If a converter had a feature to profile a camera with provided test
targets, it could test for all sorts of ills:

1) Dead, stuck, and hot pixels

2) Non-linearities at sensor saturation (if relevant; many clip below
that point at base ISO)

3) Repeating black offset patterns in the sensor

4) Repeating gain differences for individual sensels

5) Differences in amplifier gain for multi-channel systems, etc.

All these things could be compensated for, to make for a more flexible
RAW conversion. Your camera wouldn't have to be perfect; it would just
need to be consistent! Anything consistent could be compensated for in
every RAW presented from that same camera.

There are many other things a converter could look for dynamically -
artifacts that may be identifiable even if not repeatable.

These checks may make conversion slower, so obviously they'd need to be
options in an "advanced" tab.

Chris L Peterson
Jan 28, 2009, 12:07:52 PM
On Wed, 28 Jan 2009 16:26:46 GMT, John Sheehy <J...@no.komm> wrote:

>If a converter had a feature to profile a camera with provided test
>targets, it could test for all sorts of ills:
>
>1) Dead, stuck, and hot pixels
>
>2) Non-linearities at sensor saturation (if relevant; many clip below
>that point at base ISO)
>
>3) Repeating black offset patterns in the sensor
>
>4) Repeating gain differences for individual sensels
>
>5) Differences in amplifier gain for multi-channel systems, etc.
>
>All these things could be compensated for, to make for a more flexible
>RAW conversion. Your camera wouldn't have to be perfect; it would just
>need to be consistent! Anything consistent could be compensated for in
>every RAW presented from that same camera.

That's what cameras already do. The raw pixel values are corrected for
all sorts of things before the actual RAW file is saved.

John Sheehy
Jan 28, 2009, 1:42:26 PM
Chris L Peterson <c...@alumni.caltech.edu> wrote in
news:l341o41vudrv8jgkv...@4ax.com:

> That's what cameras already do. The raw pixel values are corrected for
> all sorts of things before the actual RAW file is saved.

Only to a limited extent. Many of these artifacts remain in real RAW
files.
