12 Megapixels:
http://www.pbase.com/andersonrm/image/109265567
24 Megapixels:
http://www.pbase.com/andersonrm/image/109265645
1/res_out = 1/res_lens + 1/lens_sensor.
eg: you're neglecting lens resolution losses.
--
-- r.p.e.35mm user resource: http://www.aliasimages.com/rpe35mmur.htm
-- r.p.d.slr-systems: http://www.aliasimages.com/rpdslrsysur.htm
-- [SI] gallery & rulz: http://www.pbase.com/shootin
-- e-meil: Remove FreeLunch.
-- usenet posts from gmail.com and googlemail.com are filtered out.
I think you meant to write 1/res_out = 1/res_len + 1/res_sensor, you
had lens_sensor.
But the real equation is closer to 1/res_out = sqrt(1/res_len^2 + 1/res_sensor^2)
Still you are correct that the lens needs to be taken into account,
and can be the limiting factor.
Scott
I think I meant to have my brain on when I wrote that.
>
> But the real equation is closer to 1/res_out = sqrt(1/res_len^2 + 1/res_sensor^2)
Yes - more brain idle, though I would have written it:
1/res_out^2 = 1/res_lens^2 + 1/res_sens^2.
> Still you are correct that the lens needs to be taken into account,
> and can be the limiting factor.
Everything in series has a limiting factor.
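For concreteness, the RMS combination can be checked numerically. A minimal Python sketch (the function name is mine; the only requirement is that both resolutions use the same units, e.g. lp/mm):

```python
import math

def combined_res(res_lens, res_sensor):
    """RMS-style combination: 1/res_out^2 = 1/res_lens^2 + 1/res_sensor^2."""
    return 1.0 / math.sqrt(1.0 / res_lens**2 + 1.0 / res_sensor**2)

# A 100 lp/mm lens on a sensor resolving 80 lp/mm:
print(round(combined_res(100, 80), 1))  # 62.5 -- worse than either alone
```

Note the output is below both inputs - each series element drags the system down, as said above.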
> Yes - more brain idle, though I would have written it:
>
> 1/res_out^2 = 1/res_lens^2 + 1/res_sens^2.
>
>> Still you are correct that the lens needs to be taken into account,
>> and can be the limiting factor.
>
> Everything in series has a limiting factor.
FWIW, I find that it's quite possible to get corner-to-corner sharp images
from the 5D2. Most of the wide angle lenses I own need to be stopped down to
f/11 or f/16 (which is an irritation and is why I'll be buying the ZE 21/2.8
if it comes out), but at 35mm and longer most lenses are sharp at the
corners at f/5.6 or f/8.
So the lens is only the limiting factor if one isn't thinking/trying.
--
David J. Littleboy
Tokyo, Japan
I've always had an open mind about this sort of comparison, but this one
reeks of fraud.
I have some new D90s, which inherit much from the D300, and similar shots
from them, compared to my D3, don't show anywhere near the difference those
examples do. Moreover, my Fujifilm S5 Pro (a D200 in drag), which has to
interpolate its image 2x to get 12 megapixels, produces almost as detailed
results as the D90, just a little less sharp.
The part of the D300 image that has me thinking a lot of manipulation has
been applied to degrade it is the way detail in the grey bush at the foot
of the shot looks as if it first had its black level lifted and was then
blurred... Certainly not a camera-direct example, and definitely an image I
could do much with to make it look comparable to the D3x image if I were
trying to pull the wool over someone's eyes.
D-Mac.info
Hmmm, comparing an APS-C sensor to an FX sensor?
Well, there is the difference between absolute performance, pixel
peeping and plain making good photography.
Oddly, when I posted the "Megapixel challenge" before Christmas there
was only one taker (other than me) - and he shot on film (acquitting
himself well at that - I scanned his film on my 9000ED). The photos
were corner shots only, BTW, though shot in the sweet aperture spot.
Ehmmm... a lens does not have a resolution in megapixels. Both formulas
make no sense.
> Still you are correct that the lens needs to be taken into account,
> and can be the limiting factor.
Yep.
--
Alfred Molon
------------------------
Olympus 50X0, 8080, E3X0, E4X0, E5X0, E30 and E3 forum at
http://tech.groups.yahoo.com/group/MyOlympus/
http://myolympus.org/ photo sharing site
Alfred,
It makes sense if you use the same spatial units, whether that be microns
or megapixels, and then use the appropriate formula. But you are right
in that using one figure for lens performance may simplify things too
much.
David
"megapixels" is not a resolution - it is a quantity.
Resolution is typically in lp/mm for such a case.
Megapixels is a count over a surface area, so not too useful - lp/mm
would be the 'typical' unit.
In both cases one is simplifying things by assuming a single linear
measure (pixel effective size or lens point-spread function), and then
applying an RMS-style addition of these values to produce an effective
system resolution. This may work well if all the system MTFs are
Gaussian, but not so well otherwise. For a given sensor size, you could
express the lens resolution as "megapixels", or you could express the
sensor resolution as line-pairs per mm. All approximations, of course.
David
>> Megapixels is a count over a surface area, so not too useful - lp/mm
>> would be the 'typical' unit.
>
> In both cases one is simplifying things by assuming a single linear
> measure (pixel effective size or lens point-spread function), and then
> applying an RMS-style addition of these values to produce an effective
> system resolution. This may work well if all the system MTFs are
> Gaussian, but not so well otherwise. For a given sensor size, you could
> express the lens resolution as "megapixels", or you could express the
> sensor resolution as line-pairs per mm. All approximations, of course.
Check my math, but I believe using the area (megapixels) simplifies the
equation to 1/tot = 1/len + 1/sens. Just convert the lens to Mpix as
res^2*48*72 (where the lens res is in lp/mm); e.g., a 100 lp/mm lens
comes out to about 35 Mpix "resolution" equivalent.
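Working the arithmetic through in code (assuming two pixels per line pair, i.e. Nyquist sampling, over a 24x36mm frame; the helper name is mine), a 100 lp/mm lens lands near 35 Mpix:

```python
def lens_mpix(lp_per_mm, width_mm=36.0, height_mm=24.0):
    """Equivalent pixel count for a lens resolution given in lp/mm,
    at two pixels per line pair (Nyquist), over the given frame size."""
    return (2 * lp_per_mm * width_mm) * (2 * lp_per_mm * height_mm) / 1e6

print(lens_mpix(100))  # 34.56 "megapixels" -- i.e. res^2 * 48 * 72 / 1e6
```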
> It makes sense if you use the same spatial units, whether that be microns
> or megapixels, and then use the appropriate formula. But you are right
> in that using one figure for lens performance may simplify things too
> much.
The problem is that a lens does not have a resolution - it has a certain
MTF at a certain spatial frequency. From a signal processing perspective
it is sort of a low pass filter, which attenuates the high frequencies.
... and the sensor and anti-alias filters have an MTF as well, as do the
other elements of the system. Using a single value like "resolution" is
just an approximation that may, or may not, work well.
Cheers,
David
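The cascade behaviour is easy to illustrate: component MTFs multiply, so the system is always worse than its weakest part. A toy sketch with invented Gaussian-ish curves (the f50 numbers are made up, not measured from any real lens or sensor):

```python
def mtf(f, f50):
    """Toy MTF that falls to 0.5 at spatial frequency f50 (cycles/mm)."""
    return 0.5 ** ((f / f50) ** 2)

f = 40.0                        # spatial frequency of interest, cycles/mm
lens, sensor = mtf(f, 60), mtf(f, 80)
system = lens * sensor          # MTFs in series multiply
print(round(lens, 2), round(sensor, 2), round(system, 2))  # 0.73 0.84 0.62
```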
Waste of time. I shot Kodak's new Ektar 100 and scanned it, and it has
about 1/2 the resolution of a D300, so your Sony A900 should wipe the
floor with any 35mm film scan. Michael Reichmann is right: film's day
is done in this respect.
Not to mention that the subject has an effect on resolution. If you
shoot 3"-wide power lines against the sky at a distance of 5 miles,
you'll see them on the sensor, even if their physical dimension makes
theoretical resolution impossible. This is a case of contrast-induced
"super resolution," whereas a low-contrast subject of the same size
will never show up on the sensor.
The sensor has a pixel count - something the filter and the lens do not.
You're mixing up things.
The subject has zero effect on the system MTF, of course.
However, the subject contrast will determine how high a spatial frequency
you can use before the signal from the subject is lost in the system noise.
We used to use the term "minimum resolvable modulation", but "minimum
resolvable contrast" may be more in fashion now.
http://en.wikipedia.org/wiki/Minimum_resolvable_contrast
When you have a line (or point) source, what I think you are saying is
that 20% modulation over one pixel may effectively be detected by the eye
as 5% modulation over 4 pixels - for example.
Cheers,
David
The sensor, filter and lens all have an MTF, as does the Bayer processing.
All the same - the transfer function at a certain cycles per mm (or
whatever unit you choose). Of course, you need to measure the MTF in the
same spatial units, but it doesn't matter what those spatial units are.
David
> The sensor, filter and lens all have an MTF, as does the Bayer processing.
> All the same - the transfer function at a certain cycles per mm (or
> whatever unit you choose). Of course, you need to measure the MTF in the
> same spatial units, but it doesn't matter what those spatial units are.
The sensor is a discrete sampler and for practical purposes you consider
it the source of the signal. All other things, the AA filter, the lens
and Bayer interpolation have an impact on the high frequency content.
If you want to know the overall performance of the system, all the
components must be taken into account. Changes in the MTF before and
after the sensor can have distinctly different effects on the image,
though. I'm thinking aliasing here.
Cheers,
David
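Aliasing is the concrete case: subject detail above the sensor's Nyquist limit folds back and appears as a false lower frequency. A quick sketch (the numbers are invented for illustration):

```python
fs = 60.0     # sampling rate, samples per mm (Nyquist = 30 cycles/mm)
f_in = 90.0   # subject detail at 90 cycles/mm, well above Nyquist
alias = abs(f_in - round(f_in / fs) * fs)  # fold back toward [0, fs/2]
print(alias)  # 30.0 -- shows up as spurious 30 cycles/mm "detail"
```

This is exactly why MTF lost before the sampler (AA filter, lens) behaves differently from MTF lost after it (Bayer interpolation): only the former can prevent the fold-back.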
He didn't say different. He just said the choice of units does not
matter (as long as you are consistent).