I ask because Vuescan's 'Calibrate' function appears to have no effect on
the Coolscan 4000 (at least, not on mine) so I have no way of initiating the
calibration manually (or do I?). I read in previous posts that the Vuescan
'Calibrate' command does not have any effect with some other scanners also.
Is this because Vuescan has its own cunning method, or is it assumed
that the scanner calibrates itself automatically? Hence my original
question.
BTW I did forward a problem report to Ed Hamrick but have received no reply
(perhaps because I forgot to zip up the log file - sorry Ed!)
--
John
Replace 'nospam' with 'todnet' when replying.
> Is anyone familiar with the internal calibration system of the Nikon
> Coolscan 4000 scanner? Does it calibrate each time the film is removed, or
> is it supposed to happen at regular intervals when the scanner is idle?
It does not happen automatically but has to be initiated by the
software.
> I ask because Vuescan's 'Calibrate' function appears to have no effect on
> the Coolscan 4000 (at least, not on mine) so I have no way of initiating the
> calibration manually (or do I?). I read in previous posts that the Vuescan
> 'Calibrate' command does not have any effect with some other scanners also.
That's true. It doesn't work on the LS 40 either.
> Is this because Vuescan has its own cunning method or that it is assumed
> that the scanner calibrates itself automatically, hence my original
> question.
You need to start NikonScan if you want to calibrate the scanner. For a
long time I thought calibration was not necessary (and probably Ed
thinks the same), but I had problems with colour streaks in some
underexposed negs. I did a calibration using NikonScan and the streaks
were gone.
> BTW I did forward a problem report to Ed Hamrick but have received no reply
Send it again and insist on an answer. Maybe if more people do that
he'll address the problem.
--
Erik Krause
Digital contrast problems: http://www.erik-krause.de/contrast
I'll do that. Thanks for your advice and information.
>I ask because Vuescan's 'Calibrate' function appears to have no effect on
>the Coolscan 4000 (at least, not on mine)
You will only notice it if the calibration is bad to begin with, and
that isn't often the case with the LS-4000 for the reasons above.
>so I have no way of initiating the
>calibration manually (or do I?).
There certainly is in NikonScan - just hit the Calibrate button in the
software, under Scanner Extras.
>I read in previous posts that the Vuescan
>'Calibrate' command does not have any effect with some other scanners also.
>Is this because Vuescan has its own cunning method or that it is assumed
>that the scanner calibrates itself automatically, hence my original
>question.
>
>BTW I did forward a problem report to Ed Hamrick but have received no reply
>(perhaps because I forgot to zip up the log file - sorry Ed!)
>
Why do you think it is worth a problem report? Do you have a problem
other than not being able to see the calibration change? It would be a
problem if you did see the calibration change! ;-)
--
Kennedy
Yes, Socrates himself is particularly missed;
A lovely little thinker, but a bugger when he's pissed.
Python Philosophers (replace 'nospam' with 'kennedym' when replying)
When I say the calibration doesn't work, I don't mean I can't see a
difference - I mean it doesn't make the scanner do anything. When I execute
the 'calibrate' command in NikonScan, I can hear the scanner spring to life.
This does not happen with the Vuescan calibrate command, so I concluded that
the command is non-functional.
The reason I am worried is that when using Vuescan (assuming I have not
already opened NikonScan previously), I do not know whether the scanner
is well calibrated or not - it may or may not be. If I can manually initiate the
calibration, I know it is calibrated. Unfortunately, Vuescan does not let me
do this.
When using NikonScan, I have had experiences from time to time with odd
colour balance which have been cured by executing the 'Calibrate' command.
Therefore I would like to have this option in Vuescan also.
Some good news - I took Erik Krause's advice and emailed Ed Hamrick again
and got a prompt reply - the command hasn't been implemented yet but will be
in the next few weeks. Sounds like you're right and that it only worked with
an earlier scanner. Seems like a surprising oversight given Ed's familiarity
with Nikon scanners, but then he has got seriously bogged down with the
Minolta issue and there are only 24 hours in a day (I assume Ed doesn't
sleep :-) )
>I wonder if, in his attempts to solve the calibration issue with the
>Minolta 5400, Ed has corrupted the calibration procedure completely!
As I recently predicted this was bound to happen due to the
demonstrated "fragility" and rigidity of VueScan's code which, in this
case, is not flexible enough to account for the full range of
variations.
Don.
Here is some helpful information about calibration:
Coolscans are unstable. The calibration procedure equalizes the
channels, which then drift again as the chip temperature changes. The
equalization lasts only a short time, because once the scanner is hot
the channels quickly drift out of balance again.
Look at this link:
http://www.coolscan.pl/temp/9K-stab.jpg
This is an example of 9K stability over time.
The best solution for acceptable stability is to unplug the scanner
and plug it in again after about 30 minutes (or more).
Another, more extreme solution is a hardware mod that provides good
passive or active cooling of some of the chips.
This instability is important/hazardous when you use LUT profiles.
On the other hand it is less dangerous to use matrix profiles;
note that a LUT profile is more accurate and better for Coolscans,
but a matrix profile is safer and more stable.
That is because a matrix doesn't use the white point to calculate
colours, and the white point changes during the scanning process.
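To illustrate the difference (the numbers below are made up; this is not Nikon's or VueScan's actual profile data): a matrix profile is a single fixed 3x3 linear transform, so a global drift only scales the result, whereas a LUT profile looks each level up in a table that was measured at a particular white point.

```python
import numpy as np

# Hypothetical 3x3 matrix profile -- one fixed linear transform.
M = np.array([[0.90, 0.05, 0.05],
              [0.04, 0.92, 0.04],
              [0.02, 0.03, 0.95]])

scanner_rgb = np.array([0.8, 0.5, 0.3])   # one scanned pixel
print(M @ scanner_rgb)                    # matrix profile: a single multiply

# A LUT profile would instead look each value up in a table measured at
# a particular white point -- if the white point has since drifted, the
# table entries no longer match the scanner's actual response.
```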
As an alternative to frequent calibration or a hardware modification,
you can use more than one profile, or a mathematical analysis (and a
software swap) for each following scan (starting from the second scan).
Scanning at high resolution (such as 4000 dpi with 16x multisampling)
will show differences between the start and end of the image. This
problem occurs when your picture has a neutral background, for example.
The calibration option is suitable when you are not batch scanning.
I don't know if it's possible to add a "Calibrate before each scan"
option to NikonScan or VueScan.
Probably not, because when the "eject" function runs (e.g. in the
SF-2xx) the next image is inserted automatically.
But if you scan non-stop I advise making pauses or installing an
additional cooling solution.
PS1. I have to sell a new SA-30 with a 2-year guarantee, with handover
in NY (it is in Poland now, but my friends fly from Poland to NY
frequently).
PS2. Sorry for my English - I hope that you understood me well.
Kind regards
Maciek
> I ask because Vuescan's 'Calibrate' function appears to have no effect on
> the Coolscan 4000 (at least, not on mine) so I have no way of initiating the
> calibration manually (or do I?). I read in previous posts that the Vuescan
> 'Calibrate' command does not have any effect with some other scanners also.
I pressed Ed again on that problem and he provided a test version. You
can download it from
http://www.hamrick.com/files/testcali.exe (Windows)
http://www.hamrick.com/files/testcali.dmg (Mac OS X)
I haven't tested it myself yet, so I have no comments.
I've just downloaded and installed the Windows version now and tried the
calibrate function - based on the noises from the scanner, it sounds like
it works fine. I haven't done any scans yet but I have some to do tonight,
so I'll give it a more thorough test and let you know. So far, so good.
The plots present on the link above show the global stability, probably
dominated by variation in the LED output intensity with temperature. A
similar global drift of black level occurs as well, due to the ADC
reference and CCD output bias.
Basically, getting stability to the precision of 16 bits, as in the
LS-9000, would test most non-temperature-stabilised systems. However, if
you compare test swatches of the colour extremes of your global drift
test, the difference in colour is barely perceptible - and I doubt it
would be noticed by eye other than on side by side swatches. So I don't
think the effect you have noted is particularly problematic, although I
have no doubt that it is present.
What I did find interesting, though, is that your plot was referenced to
peak white 5 minutes after the scanner was turned on, yet within
90 seconds of the start of the scan the worst offending channel had
reduced its drift rate to less than a quarter (crudely approximated from
your graph) of the start drift. This would indicate that the best
course of action would be to calibrate and take a second scan
immediately after the first has completed, or at least got 2 minutes
into itself. The first scan can then be discarded and the second kept.
This would seem to be a lot better than switching off and on all the
time, which is never good for electronics.
The primary function of the calibration is to normalise cell to cell
non-uniformity and in that respect the dark non-uniformity varies more
as a function of temperature than the response non-uniformity. The
global white reference is generally a function of the auto-exposure and
post processing - however it is well documented that NikonScan does not
implement a white point reference, hence one of the selling points of
Vuescan.
Global variations can be corrected after the image is captured - even if
they have drifted. Unfortunately, non-uniformity cannot - at least
certainly not as readily.
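To make that distinction concrete: per-cell calibration is essentially the classic flat-field correction. The sketch below is the general technique only (not VueScan's or NikonScan's actual code); the dark and flat frames stand in for what the scanner captures during calibration.

```python
import numpy as np

def flat_field(raw, dark, flat):
    """Classic per-cell correction: subtract each cell's dark level,
    then divide by that cell's response to uniform illumination."""
    gain = flat - dark
    return (raw - dark) / gain * gain.mean()

# Toy example: three CCD cells with unequal offsets and gains,
# imaging a uniform mid-grey target.
dark = np.array([10.0, 12.0, 11.0])      # per-cell dark frame
flat = np.array([200.0, 240.0, 220.0])   # per-cell white reference
raw = dark + (flat - dark) * 0.5         # uniform grey, as each cell sees it
corrected = flat_field(raw, dark, flat)
# After correction all three cells report the same value. A purely
# global drift (the same offset added to every cell) could still be
# removed after capture, but this per-cell structure could not.
```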
Nevertheless, your plots have given me an idea on how to assess the loss
of calibration accuracy over time for the Coolscan. Unfortunately, the
first attempt indicated no change at all, so I am planning something a
bit more accurate for the second iteration.
OK, first results, only for dark nonuniformity.
What I did was to scan the full frame of a 'slide' of aluminium foil so
that no illumination from the LEDs reached the scanner. To be even more
accurate I shut off the room lighting and switched off the computer
monitor whilst the scan was being captured so that no ambient light was
leaking into the scanner - the only light in the room was the LED
leakage out of the scanner and the green LED on the front panel -
spooky! ;-)
Immediately before the data capture, I calibrated the scanner, so there
were at most 5 seconds between the end of calibration and the start of
the scan. I captured a full resolution monochrome frame at 16x
multisampling at 14 bits per sample, with gamma set to 1.0, analogue
gain set to +2 (the maximum limit on monochrome scans) and all other
parameters switched off. Total time for the scan was 12 min 38 s,
and I saved this as a tiff file.
I then read the tiff file with a quick and dirty Delphi program I
knocked up this afternoon to measure the mean, peak to peak variance,
and standard deviation of each line in the frame. Since each line is
captured in sequence, the mean gives a measure of how the black level
increases with time throughout the scan period, the pk-pk variance shows
the worst case non-uniformity within that black level and the standard
deviation shows a sort of average of the non-uniformity. This data was
simply written to a text file for importing into Excel for charting and
further manipulation.
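The same per-line statistics can be sketched in a few lines of Python/NumPy rather than Delphi (a hypothetical reconstruction of the measurement, not Kennedy's actual program):

```python
import numpy as np

def line_stats(frame):
    """Per scan line: mean (tracks black-level drift over the scan),
    peak-to-peak range (worst-case non-uniformity) and standard
    deviation (an average measure of non-uniformity)."""
    return frame.mean(axis=1), np.ptp(frame, axis=1), frame.std(axis=1)

# Synthetic dark frame standing in for the 14-bit capture: black level
# creeping up line by line, plus random dark noise.
rng = np.random.default_rng(0)
lines, cells = 200, 4000
frame = np.arange(lines)[:, None] * 0.5 + rng.normal(50.0, 5.0, (lines, cells))
means, pk_pk, stdev = line_stats(frame)
# `means` climbs with line number, mimicking the drift in the plots; the
# three arrays can then be dumped to text for charting, as described above.
```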
Results are at http://www.kennedym.demon.co.uk/results/gamma10.jpg
The global black level shows a similar degradation with scan time as the
global white level that Maciek's results show. However the degradation
of the pk-pk non-uniformity is a lot less, and has virtually flattened
out by half way through the scan time. This is borne out by the plot of
the standard deviation of each line, which is more of an average measure
of the non-uniformity and gives less weight to the outliers. Since
the outliers are the lines that are visible, this is usually the plot
that matters.
To put it into context for images, the raw frame was converted to gamma
2.2 in NikonScan to retain the full 16-bit precision and an unlimited
gamma slope. The results are almost identical and are shown at:
http://www.kennedym.demon.co.uk/results/gamma22.jpg for reference. Only
the scale changes really, because these are all well up from the true
black level.
So this looks like a gradual increase in black level as the scan
progresses, with much less increase in non-uniformity. I would
therefore conclude that this is dominated by features in the scanner
that apply to all of the CCD cells, for example the ADC reference or the
CCD bias level itself. There is a component that is probably due to
increased dark level, but it is negligible.
>
> I've just downloaded and installed the Windows version now and tried the
> calibrate function - based on the noises from the scanner, it sounds to
> work fine. I haven't done any scans yet but I have some to do tonight so I'll
> give it a more thorough test and let you know. So far, so good.
>
>
I've scanned a few slides now and as far as I can tell, everything works
as it should. Version 8.1.38 was released last night and the calibration
feature is included.
--
John
>So this looks like a gradual increase in black level as the scan
>progresses, with much less increase in non-uniformity. I would
>therefore conclude that this is dominated by features in the scanner
>that apply to all of the CCD cells, for example the ADC reference or the
>CCD bias level itself. There is a component that is probably due to
>increased dark level, but it is negligible.
It would also be interesting to see a comparison with a scanner that
has a conventional light source. Presumably, the temperature variations
of such a scanner would be considerably greater, putting the
(negligible) LED variations into perspective. Does that make sense?
Don.
Hi Kennedy,
now my English will be really not so good, because I temporarily have
no access to a dictionary :-)
> The plots present on the link above show the global stability, probably
> dominated by variation in the LED output intensity with temperature.
Gospel truth. My experiments with CCD cooling had no effect on the
stability of the high tones (around the white point). Only cooling the
chips which control the LEDs succeeded. When I cool the left side of
the board, two channels are stable, and when I cool the right side of
the board, the third channel and IR are stable.
That is because the chips which control the first pair of channels are
placed on the left side, and the rest on the right.
Installing cooling in the 9K is no problem, because it is bigger (and
has more room inside) in comparison to the smaller Coolscans.
Additionally, in the small-format Coolscans the LEDs are packed in a
black box (not openable). I'm not sure whether this box contains only
the LEDs or the LEDs with all the electronic control chips.
I know that small differences between the channels are not a problem,
because I can correct them via software, but they are a problem for my
colorimetric experiments (I specialize in CM).
Additionally, these two approaches are not the same:
a) scan the original channel luminance
b) scan the modified channel luminance and correct it via software
because, when I correct scans which have a small Dmin value via
software, the levels corrections only restore the global equalization;
they do not restore the original contrast in small ranges in the high
tones (e.g. luminance between 89 and 90), which will be partially lost.
To correct this I need a precision curve tool which can be set up with
high precision.
I'm not sure if I understand your experiments on the black point well
(I must print your mails and take them home, where I have a
dictionary :-), but remember that:
a) when you scan any material you should fully defocus the lens
   to eliminate any structural fluctuation
b) when you analyze scans line by line you should be careful,
   because the previous line affects the next line - this is an effect of:
   b1) electrical behaviour of the CCD
   b2) the lack of ideal precision in positioning the channels (in
       Coolscans the RGBI channels are slightly shifted - about 0.2-0.5 pixel)
ps. I'm happy that I'm not the only one a little crazy about precision ;-)
And I hope that you will understand me at least partially.
Regards
Maciek
I've followed this interesting thread, and now I have a question for
you, if you don't mind of course. :)
>Installing cooling in the 9K is no problem, because it is bigger (and
>has more room inside) in comparison to the smaller Coolscans.
I'm planning to purchase precisely an LS-9000.
Your considerations about calibration and CCD stability made me worry
a bit: basically, do you think this scanner can show appreciable
fluctuations in CCD response while heating up, and that those
fluctuations are not easily recovered by NikonScan's "calibrate"
command?
About cooling the chips: which cooling method are you using?
Simple fans or Peltier devices?
Thanks!
>ps. I'm happy that not only me litle crazy about precision ;-)
Lots of people here (me included) are crazy about precision! You're
in good company. :-)
BTW, Kennedy is an imaging sensor designer, and there's practically no
end to his quest for precision. ;-)
I save lots of his messages as knowledge base.
Bye!
Fernando
> b2) the lack of ideal precision in positioning the channels (in
> Coolscans the RGBI channels are slightly shifted - about 0.2-0.5 pixel)
Could that be due to chromatic aberration?
Or: How do you rule out chromatic aberration?
I have a coolscan 5000 and always thought the slight offset of the
channels was caused by CA.
regards, wim
--
www.wiskerke.com
However there is another issue that is common with scanners which is
often termed chromatic aberration as well. This is actually caused by
the movement of the RGB trilinear CCD together with the illumination
source, causing objects to exhibit colour distortion at edges - one
colour at the leading edge and the complementary colour at the trailing
edge, in the axis of the scan head movement. The design of the Nikon,
with a single polychromatic detector and separate RGBI illumination
sources, effectively prevents this latter problem from arising.
>> The plots present on the link above show the global stability, probably
>> dominated by variation in the LED output intensity with temperature.
>
>Gospel truth. My experiments with CCD cooling had no effect on the
>stability of the high tones (around the white point). Only cooling the
>chips which control the LEDs succeeded.
That confirms what I suspected! ;-)
> When I cool the left side of
>the board, two channels are stable, and when I cool the right side of
>the board, the third channel and IR are stable.
>That is because the chips which control the first pair of channels are
>placed on the left side, and the rest on the right.
LEDs and LDs have a notoriously unstable output amplitude with
temperature. That is why the analogue gain is controlled by the
exposure time to the LED, rather than attempt to adjust the intensity.
>Installing cooling in the 9K is no problem, because it is bigger (and
>has more room inside) in comparison to the smaller Coolscans.
>Additionally, in the small-format Coolscans the LEDs are packed in a
>black box (not openable). I'm not sure whether this box contains only
>the LEDs or the LEDs with all the electronic control chips.
>
I haven't opened my LS-4000 up yet. I have opened every other Nikon I
have owned, usually to clean them, but I now have a pretty good working
procedure to keep the scanner free from dirt and dust accumulation so I
haven't had to resort to cleaning this one yet. No doubt that time will
come eventually and I will have to delve inside. ;-)
>I know that small differences between the channels are not a problem,
>because I can correct them via software, but they are a problem for my
>colorimetric experiments (I specialize in CM).
>Additionally, these two approaches are not the same:
>a) scan the original channel luminance
>b) scan the modified channel luminance and correct it via software
>
>because, when I correct scans which have a small Dmin value via
>software, the levels corrections only restore the global equalization;
>they do not restore the original contrast in small ranges in the high
>tones (e.g. luminance between 89 and 90), which will be partially lost.
>To correct this I need a precision curve tool which can be set up with
>high precision.
>
Yep, I understand your problem, but it is pretty specialised and not
typical of the average, or even most specialised users.
I suspect that you have had to develop most of this procedure yourself.
>I'm not sure if I understand your experiments on the black point well
>(I must print your mails and take them home, where I have a
>dictionary :-), but remember that:
>
>a) when you scan any material you should fully defocus the lens
>   to eliminate any structural fluctuation
I fully agree with you - but in this case I was scanning an opaque
image, so no defocus was necessary. For higher illuminations this would
be an absolute requirement.
>b) when you analyze scans line by line you should be careful,
>   because the previous line affects the next line - this is an effect of:
>   b1) electrical behaviour of the CCD
>   b2) the lack of ideal precision in positioning the channels (in
>       Coolscans the RGBI channels are slightly shifted - about 0.2-0.5 pixel)
>
Again I agree, but I don't think this is an issue with the tests I did,
because it was a uniform, perfectly black target.
>ps. I'm happy that I'm not the only one a little crazy about precision ;-)
Oh lots of us are after precision, but by the sound of things, you are
looking for (and need) a lot more precision and stability than the rest
of us, so you are probably out there on your own.
I am glad you have found this group though, because it sounds like you
might be doing a lot of work that many of us can benefit from.
Also, although the range of focus distances is limited, there is a
(small?) risk of changing the effective exposure depending on the focus
setting. On a truly opaque 'slide' there is no need to defocus,
although one should take care to avoid in-scanner stray light (by
reducing ambient light levels and using a black slide surface, e.g.
aluminium foil with a black sensor-side surface).
Bart
Maybe I missed something, but I assume that was to determine the
stabilization period needed. I usually allow the scanner, and the film
inside the scanner, to reach a more stable equilibrium by allowing a
period of 'heating up'. Frequent previews not only let it reach that
state sooner, but also reduce film movement *during* the actual scan.
Bart
Not really - although I did the measurement soon after power up to
assess something like the worst case mean drift. I am sure that it will
improve if repeated after being powered and used for some time, as one
would normally do, and the shape of the curve towards the end of the
scan period supports this.
However the calibration was to reduce the non-uniformity to its minimum
level as close as possible to the start of the scan, so that its
degradation through the scan period from that optimum could be assessed,
and to determine if that degradation was similar to the mean black
drift.
Having examined the images in a little more detail this evening, I now
think I may be being somewhat overcritical in placing emphasis on the
pk-pk result.
Although this certainly is the pk-pk non-uniformity in each scan line,
there is no guarantee that the peaks and troughs occur at the same
cells on each scan line. By increasing the contrast in the original
image by about 1000x in Photoshop (levels taken to 0-15 twice then 0-63
or similar on the third step) there is some evidence of this line
structure, but at a level well below the pk-pk amplitude present even
with 16x multiscanning. So I need to do a bit of filtering on the data
and repeat the stats.
I will answer you at (or after) the weekend, because right now I have
a lot of rush orders for scanning services (this is normal on a Friday).
Regards
Maciek
OK, now I have had some time to examine the data and filter it to give a
better measure of the line to line variation immediately after
calibration and its degradation with time.
What I did to the original data was fairly trivial. I wanted to reduce
the random noise amplitude but retain any variation along each line, so
the original data was simply averaged over 30 pixels along the scan
direction. This reduces the variation between samples but retains any
linear nonuniformity across the CCD. Obviously the mean dark level
stays virtually the same, although the filtering smooths the curve a
little. 30 samples averaged out of a total of 5782 isn't a vast amount
of smoothing.
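The filtering step might look something like this in NumPy (a sketch under the assumption that "averaged over 30 pixels along the scan direction" means a 30-sample moving average over successive lines at each CCD cell, which is what preserves the per-cell structure):

```python
import numpy as np

def smooth_along_scan(frame, width=30):
    """Moving average over `width` successive scan lines, column by
    column: random noise shrinks by roughly sqrt(width), while any
    fixed per-cell (per-column) non-uniformity is left untouched."""
    kernel = np.ones(width) / width
    return np.apply_along_axis(np.convolve, 0, frame, kernel, mode="valid")

# Synthetic frame: pure random dark noise on a flat background.
rng = np.random.default_rng(1)
frame = rng.normal(100.0, 8.0, (600, 500))
smoothed = smooth_along_scan(frame)
# Per-line pk-pk and standard deviation both drop, as in the filtered
# plots, while the per-line means are essentially unchanged.
```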
The results are shown at
http://www.kennedym.demon.co.uk/results/filtered.jpg
For reference, the original unfiltered data is at
http://www.kennedym.demon.co.uk/results/gamma10.jpg
As expected, both the pk-pk non-uniformity (what will be most visible as
line structure) and the standard deviation are significantly lower in
the filtered plots, indicating that the line structure is actually
substantially less than the random dark noise - by about a factor of 4
or so. This isn't as low as is required for the line structure to be
completely invisible in the random noise if the image was subjected to
extreme level shifts, but at near normal luminance it is below visible
thresholds.
At the rate the pk-pk level is increasing, it would be expected to
become visible and objectionable after two or three scans but, as
explained previously, this data was captured pretty soon after power on
to capture the worst case. If I can remember, I'll try to capture
another set of data after the scanner has been on and working for an
hour or so.
Hi Fernando again!
> I'm planning to purchase precisely an LS-9000.
Brilliant decision. Congratulations.
Affection for the LS-9000 is frequently similar to that for a woman:
you will probably both love it and hate it, but for sure it will shape
the course of your life :-)
> Your considerations about calibration and CCD stability made me worry
> a bit:
Don't worry. For this price you have no better alternative.
The next (slightly better) level of scanners begins at a 5-digit price.
The exception is the Imacon, but:
a. it scans only dry (with dust and scratches)
b. it is still a CCD (noise increases with increasing density)
c. I haven't tested it ;-)
> basically, do you think this scanner can show appreciable
> fluctuations in CCD response while heating and so ,and that those
> fluctuations are not easily recovered by NikonScan "calibration"
> command?
Yes, but only when the scanner is very hot (after a long continuous
job). In general, when the scanner is not heavily loaded in one
session, calibration is helpful.
> About cooling the chips: which cooling method are you using?
> Simple fans or Peltier devices?
Two flat fans. A better solution is to install a large flat passive
cooler on the board (you will have to cut off the small radiators
firmly attached to the control chips), because:
a. passive cooling doesn't need additional power
b. it doesn't produce any noise
An LS-9000 with a Kami wet feeder and 8x or 16x multisampling is an
excellent solution, but you should remember that this type of scanning
takes some time.
Regards
Maciek
>An LS-9000 with a Kami wet feeder and 8x or 16x multisampling is an
>excellent solution, but you should remember that this type of scanning
>takes some time.
http://groups.yahoo.com/group/coolscan8000-9000/
Lots of smart and experienced scanner users there. See ya.
PS: I don't ever use more than 4x multisampling.
rafe b.
http://www.terrapinphoto.com