
Calibration Of Electronic Equipment In The Home Workshop


Too_Many_Tools

Feb 28, 2007, 9:27:45 PM
I have a well stocked test bench at home containing a range of analog,
digital and RF test equipment as I am sure most of you also do.

Well the question I have is how do you handle the calibration of your
equipment? What do you use for calibration standards for resistance,
voltage, current and frequency?

Links to recommended circuits, pictures and sources would be
appreciated.

Since this is a need for anyone who has test equipment, I hope to see
a good discussion on this subject.

Thanks

TMT

MassiveProng

Feb 28, 2007, 9:50:52 PM
On 28 Feb 2007 18:27:45 -0800, "Too_Many_Tools"
<too_man...@yahoo.com> Gave us:

If you have used test gear, and you do not intend to PAY to have it
calibrated, you'd be best off leaving it all the fuck ALONE!

James Sweet

Feb 28, 2007, 9:59:43 PM


What sort of equipment are you trying to calibrate? You can do a search
online for various voltage reference sources. Accurate frequency
reference can be picked up off the airwaves from transmitters such as
WWV. What methods you use depend on what the equipment is and how
accurate you need it to be.

a7yvm1...@netzero.com

Feb 28, 2007, 10:23:41 PM
On Feb 28, 9:50 pm, MassiveProng
<MassivePr...@thebarattheendoftheuniverse.org> wrote:

> If you have used test gear, and you do not intend to PAY to have it
> calibrated, you'd be best off leaving it all the fuck ALONE!

As alone as you on a Friday night?

Jim Yanik

Feb 28, 2007, 10:33:06 PM
"Too_Many_Tools" <too_man...@yahoo.com> wrote in
news:1172716065....@8g2000cwh.googlegroups.com:

Wavetek and Fluke both make nice all-in-one calibrators for voltmeters and
scopes; they do V, I, R, timing, and bandwidth checks. Not cheap, though.
There are plenty of older, used cal standards on the market, too.
Getting them certified may be a problem due to their age.

For frequency counters, you need a WWV- or GPS-based receiver.

High-end stuff you send out to a lab.
(Consider them your "primary standards".)

A warning: calibration procedures for some TEK gear may be written around
their recommended list of standards, and be difficult or impossible to do
fully with substitutes.
Especially their video test gear.

--
Jim Yanik
jyanik
at
kua.net

Ken Smith

Feb 28, 2007, 10:50:23 PM
In article <1172716065....@8g2000cwh.googlegroups.com>,

Too_Many_Tools <too_man...@yahoo.com> wrote:
>I have a well stocked test bench at home containing a range of analog,
>digital and RF test equipment as I am sure most of you also do.
>
>Well the question I have is how do you handle the calibration of your
>equipment? What do you use for calibration standards for resistance,
>voltage, current and frequency?

For frequency, you can use WWV. You need:
A short wave radio with an audio output.
Perhaps an audio filter tuned to about 1 kHz.
A generator you wish to calibrate near the WWV frequency.
A frequency counter that is not too far off.

Procedure:
Tune in WWV.
Put a wire on the generator and set it to WWV - 1 kHz.
Listen for the tone and move stuff around until it sounds good.
Feed the tone into the filter.
Place the counter on the output of the filter.

The number on the counter is X Hz away from 1 kHz when the generator is X Hz
off from WWV - 1 kHz.
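The bookkeeping in this procedure can be sketched in a few lines (a rough illustration; the 10 MHz carrier and the example counter reading are assumptions, not from the post):

```python
# Bookkeeping for the WWV beat method described above.
# Assumptions (not from the post): WWV carrier at 10 MHz, generator
# set 1 kHz below it, so a perfect setup gives a 1000 Hz beat tone.

WWV_HZ = 10_000_000        # assumed WWV carrier
TARGET_BEAT_HZ = 1_000     # generator setpoint is WWV - 1 kHz

def generator_error_hz(counter_hz: float) -> float:
    """Generator offset from its setpoint, given the measured beat."""
    return counter_hz - TARGET_BEAT_HZ

def generator_error_ppm(counter_hz: float) -> float:
    """The same offset relative to the generator's setpoint frequency."""
    return generator_error_hz(counter_hz) / (WWV_HZ - TARGET_BEAT_HZ) * 1e6

# Counter reads 1003 Hz -> generator is 3 Hz high, about 0.3 ppm.
print(generator_error_hz(1003.0))                # 3.0
print(round(generator_error_ppm(1003.0), 3))     # 0.3
```

The same arithmetic holds for any WWV carrier; only the ppm conversion changes with the assumed frequency.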

--
kens...@rahul.net forging knowledge

Phil Allison

Feb 28, 2007, 10:57:23 PM

"Too_Many_Tools" <too_man...@yahoo.com> wrote in message
news:1172716065....@8g2000cwh.googlegroups.com...


** Groper alert !

>I have a well stocked test bench at home containing a range of analog,
> digital and RF test equipment as I am sure most of you also do.
>
> Well the question I have is how do you handle the calibration of your
> equipment? What do you use for calibration standards for resistance,
> voltage, current and frequency?


** A few 0.1% precision resistors do the first job for DMMs.

A Natsemi "LH0070" 10.000 volt (+/- 0.02%) voltage reference IC with a
0.1% resistor divider chain giving 1.000 & 0.1000 volts for the DC volts
ranges of DMMs and scopes does the second.

For AC volts, a scope screen with an internal graticule is used to establish
the p-p amplitude of a sine wave - then it can be used to check the AC
ranges of DMMs etc. - to 1% accuracy.

A 12 MHz crystal oscillator (11.99993 MHz @ 25 C) checks the DFM -
calibrated using an off-air standard frequency transmission picked up on a
scanner.
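As a rough cross-check on the accuracy such a setup can deliver, here is a worst-case error budget for a 10.000 V reference driving a two-resistor 10:1 divider (a sketch; the tolerance-stacking model and values are assumptions, not Phil's figures):

```python
# Worst-case error at the 1.000 V tap of a 10:1 divider fed from a
# 10.000 V reference. Assumptions (not from the post): LH0070-style
# +/-0.02% reference, two-resistor divider built from 0.1% parts
# (nominal 9k over 1k, ratio 1/10).

REF_TOL = 0.0002   # +/-0.02% reference tolerance
R_TOL = 0.001      # +/-0.1% resistor tolerance

def divider_ratio_error(r_tol: float) -> float:
    """Worst-case fractional error of Rb/(Ra+Rb) with Ra low, Rb high."""
    ra, rb = 9.0 * (1 - r_tol), 1.0 * (1 + r_tol)
    nominal = 0.1
    return abs(rb / (ra + rb) - nominal) / nominal

total = REF_TOL + divider_ratio_error(R_TOL)
print(f"worst-case error at the 1.000 V tap: {total * 100:.2f}%")  # ~0.20%
```

So with 0.1% parts the divided outputs are good to roughly 0.2% worst case, which is consistent with using them to check 3.5-digit DMM ranges.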

....... Phil


David L. Jones

Feb 28, 2007, 11:40:51 PM

If you've got that sort of gear at home then usually you have better
(and calibrated) gear at work as well, in which case most of us would
simply bring in our gear from home and spot check it against the good
gear.

In the absence of this gear, you can simply use precision components.
Voltage reference chips with 0.05% or better are cheap and readily
available.
0.01% resistors are available too.

If you have multiple meters, for example, you can also keep an eye on
them by comparison. Using any old component, if all the meters read
the same then you can be pretty confident they haven't drifted.
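That cross-check amounts to a simple consistency test; a minimal sketch (the meter accuracy figures are placeholders, not real specs):

```python
# Cross-checking several meters against one another, as suggested above.
# If every pairwise difference is within the combined stated accuracy of
# the two meters, there is no evidence that any of them has drifted.

from itertools import combinations

def meters_consistent(readings, accuracies):
    """readings: each meter's reading of the same component.
    accuracies: each meter's accuracy as a fraction (0.005 = 0.5%)."""
    for (v1, a1), (v2, a2) in combinations(zip(readings, accuracies), 2):
        if abs(v1 - v2) > abs(v1) * a1 + abs(v2) * a2:
            return False
    return True

# Three meters on one nominal 10 kohm resistor:
print(meters_consistent([10020.0, 10015.0, 10023.0], [0.005] * 3))  # True
print(meters_consistent([10020.0, 9800.0, 10023.0], [0.005] * 3))   # False
```

Agreement only shows the meters haven't drifted relative to one another; a common-mode error would go unnoticed, which is why a precision component is still worth having.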

Checking scope horizontal timebases is easy with a crystal oscillator
and divider.

There are various methods for getting an accurate frequency standard,
but one of the newest is using a GPS-derived reference. Second-hand
rubidium standards can also be had on eBay.

Generally though, good quality test gear does not drift out of spec,
so the need for regular calibration is minimal.

Dave :)

Too_Many_Tools

Mar 1, 2007, 12:47:49 AM
>
> If you have used test gear, and you do not intend to PAY to have it
> calibrated, you'd be best off leaving it all the fuck ALONE!

LOL

You're a real ray of sunshine, aren't you?

Now go out and play in traffic while we adults talk about serious
stuff..

TMT


On Feb 28, 8:50 pm, MassiveProng
<MassivePr...@thebarattheendoftheuniverse.org> wrote:
> On 28 Feb 2007 18:27:45 -0800, "Too_Many_Tools"
> <too_many_to...@yahoo.com> Gave us:

Too_Many_Tools

Mar 1, 2007, 12:50:49 AM

Thanks for the (positive) comments so far.

I look forward to any more you might want to offer.

Any circuits or examples others have done?

Any cal boxes that anyone has built?

TMT

Robert Baer

Mar 1, 2007, 1:04:41 AM
You left many things unsaid: (a) traceable to NBS/NIST or not; (b) if
not, how many reliable digits; (c) at what cost.
You can make a 5 V "standard" that any loading will not damage, with
error better than 0.5 mV, that runs for at least 6 months with no
observable change - and the cost is only a few dollars (uses
off-the-shelf parts).
You can buy, through Digi-Key, resistors rated at 0.05% and at 0.1% -
rather decent as references.
You can buy a Fluke bench meter rated at 6.5 digits and even pay a
bit more for traceability.

Robert Baer

Mar 1, 2007, 1:09:35 AM
David L. Jones wrote:

True, 0.01% resistors are available, *but* they are extremely
expensive (over $100 each) and they are made when and if the
manufacturer sees fit to do so.

Homer J Simpson

Mar 1, 2007, 1:13:45 AM

"Too_Many_Tools" <too_man...@yahoo.com> wrote in message
news:1172728249.8...@z35g2000cwz.googlegroups.com...

> I look forward to any more you might want to offer.

If it was me, I would start by reading Scroggie's Radio Laboratory Handbook.


Glenn Gundlach

Mar 1, 2007, 1:29:22 AM

I agree with Prong on this one. I've worked in repair and broadcast
for 35 years. Unless you have a compelling reason to change it, leave
it alone. Of course, this assumes it's good stuff to begin with like
Fluke and Tek.

The only time I altered a Fluke 8060 was an eBay purchase. When
testing some new boards, I was reading 4.998 on the 5 volt ref. Never
saw any boards that far off and then tried the other Fluke which was
as expected. The eBay meter got a little 'tweak' but that was very
unusual. BTW, the 5V ref on the boards was an ADI AD588 which is
almost good enough for the 8060. I would use it to cal a 3.5 digit
meter with no qualms.

GG

doug

Mar 1, 2007, 3:46:08 AM
This works if you only need about 1 part in a million. The movement of
the ionosphere makes WWV useless for real calibration. This was, of
course, wonderful when we had nothing else. It is far better to get
a GPS standard (they are used on cell sites and show up on eBay) and
just use it for the timebase all the time. Alternatively, use a Rb
source. They were also used in cell sites and are easily available.
They cannot move more than about a part in 100 million and they make
excellent time bases for frequency counters.

David L. Jones

Mar 1, 2007, 1:45:45 AM

Not so.
RS Components have 0.01% resistors for AU$34.50 (US$26)
Farnell have 0.02% for as little as AU$20

Dave :)

Dr. Anton T. Squeegee

Mar 1, 2007, 2:16:19 AM
In article <1172716065....@8g2000cwh.googlegroups.com>,
too_man...@yahoo.com (known to some as Too_Many_Tools) scribed...

> I have a well stocked test bench at home containing a range of analog,
> digital and RF test equipment as I am sure most of you also do.

<snippety>

I could post pictures... ;-)

> Well the question I have is how do you handle the calibration of your
> equipment? What do you use for calibration standards for resistance,
> voltage, current and frequency?

Hmm. Excellent question.

For frequency, I actually have three different references, all GPS-
locked. One is my primary reference, an HP Z3801, as retired from a
cellphone site. The second and third ones are both combination clocks
and freq-references, one from Trak Systems (now Trak Microwave) and the
other from Odetics/Zypher. All three use a very stable OCXO that is
constantly disciplined by the GPS receiver.

Long-term accuracy is on the order of 1E-11 or so. In other
words, about as good as you can get without being NIST certified.

I don't have good primary voltage or current references as yet.
That's on the 'Acquire' list for scrounging this year. For resistance,
simple Pomona plugs with 0.01% tolerance resistors work pretty well for
2-wire. For anything more, I will probably have to rent one of the Fluke
all-in-ones.

I'm just beginning to gather the goodies I need for calibrating my
O-scope collection. That will eventually consist of Tektronix leveled
sine-wave generators, and one of their CG5xxx series calibration
generators.

Keep the peace(es).


--
Dr. Anton T. Squeegee, Director, Dutch Surrealist Plumbing Institute
(Known to some as Bruce Lane, KC7GR)
http://www.bluefeathertech.com -- kyrrin a/t bluefeathertech d-o=t calm
"Salvadore Dali's computer has surreal ports..."

JW

Mar 1, 2007, 6:20:04 AM
On 28 Feb 2007 22:45:45 -0800 "David L. Jones" <alt...@gmail.com> wrote
in Message id: <1172731545.0...@z35g2000cwz.googlegroups.com>:

>On Mar 1, 5:09 pm, Robert Baer <robertb...@earthlink.net> wrote:

[...]

>> True, 0.01% resistors are available, *but* they are extremely
>> expensive (over $100 each) and they are made when and if the
>> manufacturer sees fit to do so.
>
>Not so.
>RS Components have 0.01% resistors for AU$34.50 (US$26)
>Farnell have 0.02% for as little as AU$20

You guys are paying *way* too much. We use Riedon .01% precision resistors
in our A/D products, and pay about 5 bucks apiece. Their site is down at
the moment, but even Digikey has .01% resistors for around the same price:
http://www.digikey.com/scripts/DkSearch/dksus.dll?Criteria?Ref=3107&Site=US&Cat=34342147

JW

Mar 1, 2007, 6:30:38 AM
On 28 Feb 2007 21:50:49 -0800 "Too_Many_Tools" <too_man...@yahoo.com>
wrote in Message id:
<1172728249.8...@z35g2000cwz.googlegroups.com>:

>I look forward to any more you might want to offer.
>
>Any circuits or examples others have done?

For a cheap voltage reference, I would look into Analog Devices' AD780
series. The AD780BN has an initial error of +/-1 mV and is available in a
plastic 8-pin DIP for easy assembly. Download the data sheet and you'll
find sample circuit diagrams. Be sure to use a nice clean power supply and
good decoupling practices around the device.

For resistance, see my post with a link to Digikey.


MassiveProng

Mar 1, 2007, 7:02:39 AM
On 28 Feb 2007 19:23:41 -0800, a7yvm1...@netzero.com Gave us:


I think there is a fly buzzing around the room. Perhaps it's only
flatulence.

Of no consequence either way.

Anthony Fremont

Mar 1, 2007, 7:05:56 AM

You're amazing. You don't even know what equipment, qualifications or needs
he has, yet you're right there with THE answer. Let's take an example. I
have a 20 year old Hitachi scope, as you know; the voltage cal is way off
(~20%) in a couple of ranges. Are you suggesting that I should drag it
across town, spend $200 and be without it for 2 weeks just to get it
adjusted by some obstinate, E-1 grade line tech, instead of using a brand
new DMM with 0.03% accuracy to tweak it myself? I'm quite sure that my
Micronta is up to the task, to be honest.

If someone doesn't need traceable calibration, then why should they pay for
it? Especially if they have the resources to do it themselves. I'm
thinking of buying a cheap used Rb time base from eBay so I can cal my old
Protek freq counter and adjust the timebase on my Hitachi scope; it's
certainly cheaper than having it done. Using a PIC driven by an ordinary
can xtal, and a quartz wristwatch of known accuracy, I was able to tweak the
xtal to within about 1-2 ppm over the course of a week or two. Of course you
know that's impossible, don't you?
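The wristwatch method works because accumulated time error and fractional frequency error are the same ratio; a minimal sketch of the arithmetic (the numbers are illustrative, not Anthony's measurements):

```python
# Converting accumulated clock drift into frequency error, as in the
# PIC-and-wristwatch trick above. Numbers are illustrative.

def drift_ppm(seconds_gained: float, elapsed_seconds: float) -> float:
    """Fractional frequency error, in ppm, implied by gaining
    (positive) or losing (negative) seconds over an interval."""
    return seconds_gained / elapsed_seconds * 1e6

two_weeks = 14 * 24 * 3600   # 1,209,600 s
print(round(drift_ppm(1.0, two_weeks), 3))   # 0.827 ppm for 1 s gained
```

This is why the method needs a week or two: the longer the interval, the smaller the frequency error a given timing resolution can resolve.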

MassiveProng

Mar 1, 2007, 7:09:33 AM
On Thu, 1 Mar 2007 03:50:23 +0000 (UTC), kens...@green.rahul.net (Ken
Smith) Gave us:

Joe Kane's audio/video setup discs on LaserDisc, DVD, and now
HD DVD are the bee's knees for a lot of audio-spectrum sine wave
tones.

DVD $20, player $40; TV and audio gear already owned.

Catching it off some transmission has to be far more inaccurate. A
shortwave receiver worth having is $100 plus. Then add in audio gear?

Catching you thinking old is better than new after your brow beating
of BAH... priceless.

Hehehe... just kidding...

MassiveProng

Mar 1, 2007, 7:18:32 AM
On 28 Feb 2007 21:47:49 -0800, "Too_Many_Tools"
<too_man...@yahoo.com> Gave us:

>>


>> If you have used test gear, and you do not intend to PAY to have it
>> calibrated, you'd be best off leaving it all the fuck ALONE!
>
>LOL
>
>You're a real ray of sunshine, aren't you?
>
>Now go out and play in traffic while we adults talk about serious
>stuff..
>
>TMT


Serious? Look, dumbfuck, if you are concerned with calibration,
then you should be concerned enough to do it right. Asking here the
way you did means that you are beyond your depth to start with.

If you are too stupid to take your device to a place where freshly
calibrated devices are, and check it against them, you are too stupid
to be attempting to do it with some patched up method in the home
without cal manuals from the makers of all those devices. Far too
stupid.

So fuck you, pops.

You prove that numeric age does not an adult make. The traffic I
play in runs at 30GHz, so you are screwed with that presumption as
well. I had calibrated meters back in 1970, and knew more then than
you do know.

MassiveProng

Mar 1, 2007, 7:23:46 AM
On 28 Feb 2007 21:50:49 -0800, "Too_Many_Tools"
<too_man...@yahoo.com> Gave us:

>


>Thanks for the (positive) comments so far.
>
>I look forward to any more you might want to offer.
>
>Any circuits or examples others have done?
>
>Any cal boxes that anyone have built?
>
>TMT


All the circuits suggested here, while appearing good on the surface,
are likely to introduce more error into your instruments than they
correct.

THAT IS WHY REAL calibration services are used, and why I suggested
that if you do not intend to have it professionally calibrated, you...
YOU IN PARTICULAR, should just leave the gear alone, as you are too
fucking stupid to do it without introducing the aforementioned error.

In other words, they are better off UNcalibrated and still reliable,
than after any futzing around a twit like you will do with them. You
LACK the competence.

You want to attack my adulthood, fine, little boy. You are
BRAINLESS for this task.

I have worked in cal labs and in QA for years. You lose, sonny.

MassiveProng

Mar 1, 2007, 7:24:59 AM
On Thu, 01 Mar 2007 06:09:35 GMT, Robert Baer
<rober...@earthlink.net> Gave us:


Or if the order is large enough.

MassiveProng

Mar 1, 2007, 7:32:24 AM
On Thu, 1 Mar 2007 06:05:56 -0600, "Anthony Fremont"
<spam...@nowhere.com> Gave us:

>You're amazing. You don't even know what equipment, qualifications or needs
>he has, yet you're right there with THE answer. Lets take an example. I
>have a 20 year old Hitachi scope as you know, the voltage cal is way off
>(~20%) in a couple of ranges. Are you suggesting that I should drag it
>across town, spend $200 and be without it for 2 weeks just to get it
>adjusted by some obstinate, E-1 grade line tech, instead of using a brand
>new DMM w .03% accuracy to tweak it myself? I'm quite sure that my Micronta
>is up to the task to be honest.


You are to be disappointed. The scale dial on that scope is likely
fitted with resistors, and one of them has shifted, which shifts all
the dividers on the dial below it.

It is not a calibration issue. It is a repair issue.

So shove it up your ass, you E-1 grade dipshit.

Your micronta? Bwuahahahahahah!

Anthony Fremont

Mar 1, 2007, 7:43:01 AM
MassiveProng wrote:
> On Thu, 1 Mar 2007 06:05:56 -0600, "Anthony Fremont"
> <spam...@nowhere.com> Gave us:
>
>> You're amazing. You don't even know what equipment, qualifications
>> or needs he has, yet you're right there with THE answer. Lets take
>> an example. I have a 20 year old Hitachi scope as you know, the
>> voltage cal is way off (~20%) in a couple of ranges. Are you
>> suggesting that I should drag it across town, spend $200 and be
>> without it for 2 weeks just to get it adjusted by some obstinate,
>> E-1 grade line tech, instead of using a brand new DMM w .03%
>> accuracy to tweak it myself? I'm quite sure that my Micronta is up
>> to the task to be honest.
>
>
> You are to be disappointed. The scale dial on that scope is likely
> fitted with resistors, and one of them has shifted, which shifts all
> the dividers on the dial below it.

Given that the symptom is not as you describe, I figure you are
probably wrong, again. If so, I imagine I can fix it.

> It is not a calibration issue. It is a repair issue.

You just know it all don't you?

> So shove it up your ass, you E-1 grade dipshit.

Just had to get that anal jab in there, huh?

> Your micronta? Bwuahahahahahah!

Since 3% accuracy is considered good in the scope world, I think it would do
fine.


Tim Shoppa

Mar 1, 2007, 8:40:25 AM
On Feb 28, 9:27 pm, "Too_Many_Tools" <too_many_to...@yahoo.com> wrote:
> I have a well stocked test bench at home containing a range of analog,
> digital and RF test equipment as I am sure most of you also do.
>
> Well the question I have is how do you handle the calibration of your
> equipment? What do you use for calibration standards for resistance,
> voltage, current and frequency?

It depends entirely on what you need the equipment for.

If for any legal reason you need NBS traceability, then the question
of how and how often is already answered by your regulatory agencies.

If you don't, then I cannot imagine that a couple of off-the-shelf
precision resistors, voltage references, and frequency references
(total cost: $10) would not be good enough for sanity checking for
almost any pedestrian use.

If you're the sort who keeps equipment on your bench just to calibrate
equipment on your bench just to calibrate equipment on your bench,
then any rational argument about traceability is pointless because
you've already set yourself up in an infinite circular loop.

Tim.

Ken Smith

Mar 1, 2007, 9:33:15 AM
In article <12uctfb...@corp.supernews.com>, doug <doug@doug> wrote:
>Ken Smith wrote:
>> In article <1172716065....@8g2000cwh.googlegroups.com>,
>> Too_Many_Tools <too_man...@yahoo.com> wrote:
>>
>>>I have a well stocked test bench at home containing a range of analog,
>>>digital and RF test equipment as I am sure most of you also do.
>>>
>>>Well the question I have is how do you handle the calibration of your
>>>equipment? What do you use for calibration standards for resistance,
>>>voltage, current and frequency?
>>
>>
>> For frequency, you can use WWV. You need:
>> A short wave radio with an audio output.
>> Perhaps an audio filter tuned to about 1KHz.
>> A generator you wish to calibrate near the WWV frequency.
>> A frequency counter that is not too far off.
>>
>> Procedure:
>> Tune in WWV.
>> Put wire on generator and set it to WWV-1KHz
>> Listen for tone and move stuff around until it sounds good.
>> Feed tone into the filter.
>> Place the counter on the output of the filter.
>>
>> The number on the counter is X Hz away from 1KHz when the generator is XHz
>> off from WWV-1KHz.
>>
>>
>>
>This works if you only need about 1 part in a million.

One ppm is enough for almost all the test equipment you will find in
places like eBay.

You can do better if you average over longer periods.

[.....]


>just use it for the timebase all the time. Alternately, use a Rb
>source. They were also used in cell sites and are available easily.
>They cannot move more than about a part in 100 million and they make
>excellent time bases for frequency counters.

I think someone messed up a decimal. You just made a Rb clock 100 times
worse than WWV.

chuck

Feb 28, 2007, 9:37:38 PM

A lot of good points have been made already so I'll just add a small one.

Don't mess with calibration of quality equipment unless you have reason
to believe the calibration is off AND THAT IS ADVERSELY AFFECTING YOUR
WORK PRODUCTS. An amazing amount of electronics work has been done using
equipment with non-current calibration stickers, some of which was out
of calibration.

If metrology is something that interests you as a hobby, then jump into
it and have fun. Tim's last paragraph ought to be printed and framed.

Chuck


Jim Yanik

Mar 1, 2007, 10:27:32 AM
Dr. Anton T. Squeegee <Spamme...@dev.null> wrote in
news:MPG.20501d9ee...@192.168.42.197:

TEK sold their TM500/5000 line to TEGAM years ago,they may still make some
calibration products. www.tegam.com

--
Jim Yanik
jyanik
at
kua.net

Jim Yanik

Mar 1, 2007, 10:40:20 AM
chuck <nos...@nospam.org> wrote in
news:1172759...@sp6iad.superfeed.net:

> Tim Shoppa wrote:
>> On Feb 28, 9:27 pm, "Too_Many_Tools" <too_many_to...@yahoo.com>
>> wrote:
>>> I have a well stocked test bench at home containing a range of
>>> analog, digital and RF test equipment as I am sure most of you also
>>> do.
>>>
>>> Well the question I have is how do you handle the calibration of
>>> your equipment? What do you use for calibration standards for
>>> resistance, voltage, current and frequency?
>>
>> It depends entirely on what you need the equipment for.
>>
>> If for any legal reason you need NBS traceability, then the question
>> of how and how often is already answered by your regulatory agencies.
>>
>> If you don't, then I cannot imagine that a couple off-the-shelf
>> precision resistors, voltage references, and frequency references
>> (total cost: $10) would not be good enough for sanity checking for
>> almost any pedestrian uses.
>>
>> If you're the sort who keeps equipment on your bench just to
>> calibrate equipment on your bench just to calibrate equipment on your
>> bench, then any rational argument about traceability is pointless
>> because you've already set yourself up in an infinite circular loop.
>>
>> Tim.

Reminds me of the local TV station techs who insisted that the video gear
of theirs I serviced and calibrated was off, and it turned out their 75 ohm
termination was 87 ohms. Other techs double-terminated monitors and
complained of low brightness, tried to tweak it in, and screwed it all up.
Or they would have a "reference" generator at the end of hundreds of feet
of coax and complain it was a few percent off.

>>
>
> A lot of good points have been made already so I'll just add a small
> one.
>
> Don't mess with calibration of quality equipment unless you have
> reason to believe the calibration is off AND THAT IS ADVERSELY
> AFFECTING YOUR WORK PRODUCTS. An amazing amount of electronics work
> has been done using equipment with non-current calibration stickers,
> some of which was out of calibration.
>
> If metrology is something that interests you as a hobby, then jump
> into it and have fun. Tim's last paragraph ought to be printed and
> framed.
>
> Chuck

This is good advice, because without a service manual and cal procedure, you
have no way of knowing what adjustments INTERACT with others.
Adjust a power supply, and gain and timing go out the window.
Frequency-response tweaks can affect more than one area of the signal.

For example,
TEK 475s have multiple vertical gain adjustments, and different adjustments
for the 2/5-10 mV ranges. And the gain affects frequency response.

krw

Mar 1, 2007, 11:37:04 AM
In article <Xns98E66C96A31...@64.209.0.86>,
jya...@abuse.gov says...

<snip>

> this is good advice,because without a service manual and cal procedure,you
> have no way of knowing what adjustments INTERACT with others.
> Adjust a power supply,and gain and timing goes out the window.
> Freq.response tweaks can affect more than one area of the signal.
>
>
> for example,
> TEK 475s have multiple vertical gain adjustments,and different adjustments
> for the 2/5-10mv ranges.And the gain affects F-response.
>

A *real* good example of calibration adjustments that interact is
the trigger delay lines used in 60's era scopes. Without a
program, calibrating one was impossible (hard enough with).

--
Keith

Bud--

Mar 1, 2007, 1:24:56 PM
Ken Smith wrote:

> In article <1172716065....@8g2000cwh.googlegroups.com>,


> Too_Many_Tools <too_man...@yahoo.com> wrote:
>
>>I have a well stocked test bench at home containing a range of analog,
>>digital and RF test equipment as I am sure most of you also do.
>>
>>Well the question I have is how do you handle the calibration of your
>>equipment? What do you use for calibration standards for resistance,
>>voltage, current and frequency?
>
>

> For frequency, you can use WWV. You need:
> A short wave radio with an audio output.
> Perhaps an audio filter tuned to about 1KHz.
> A generator you wish to calibrate near the WWV frequency.
> A frequency counter that is not too far off.
>
> Procedure:
> Tune in WWV.
> Put wire on generator and set it to WWV-1KHz
> Listen for tone and move stuff around until it sounds good.
> Feed tone into the filter.
> Place the counter on the output of the filter.
>
> The number on the counter is X Hz away from 1KHz when the generator is XHz
> off from WWV-1KHz.
>

I believe the color subcarrier in a color TV is phase locked to the
transmitted signal and, for network studio transmissions, is derived
from a cesium clock. From what I have read it is more accurate than WWV
and doesn't require extra equipment other than a TV displaying an image
with a studio source. Frequency is 3.579545 MHz.

--
bud--

doug

Mar 1, 2007, 4:15:40 PM
No. One part in 100 million (10^-8) is 100 times better than wwv
(10^-6). There is lots of excellent equipment on ebay that can take
advantage of this level of accuracy. The main point is that you do
not have to think about it very often. The low cost counters that have
uncompensated or poorly compensated timebases are basically useless for
any serious work. The other nice part about the high stability
references is that you can distribute it to all the synthesizers on your
bench and everything is coherent.
Of course it depends on what you do. For my ham work, one ppm is fine.
I do other work where the Rb source is not good enough.

Clint Sharp

Mar 1, 2007, 2:39:00 PM
In message <v9gdu2d4f59ckjgf4...@4ax.com>, MassiveProng
<Massiv...@thebarattheendoftheuniverse.org> writes

> Joe Kane's audio video set up discs on Laser Disc, and DVD, and now
>HD DVD are the bee's knees for a lot of audio spectrum sine wave
>tones.
>
> DVD $20, Player $40 TV and Audio gear already owned.
Aren't you reliant on the stability of the DVD player reference clock
for the stability of the test tones? I would suspect a GPS reference
would be considerably better. Of course, I could be wrong!

>
> Catching it off some transmission has to be far more inaccurate.
> Short wave receiver worth having $100 plus. Then add in audio gear?
>
> Catching you thinking old is better than new after your brow beating
>of BAH... priceless.
>
> Hehehe... just kidding...

--
Clint Sharp

clifto

Mar 1, 2007, 5:22:06 PM
Glenn Gundlach wrote:
> I agree with Prong on this one. I've worked in repair and broadcast
> for 35 years. Unless you have a compelling reason to change it, leave
> it alone. Of course, this assumes it's good stuff to begin with like
> Fluke and Tek.

A few years ago I got stuck with the Telequipment scope, the last junk
available. It was sadly out of calibration. I whipped up a few voltage
dividers and took other stuff and spent an hour calibrating the vertical
channels and horizontal timebase. I got my work done. The head tech
wanted to know how, and I told him, and he hit the ceiling, ran to the
other room, grabbed the scope and sent it out for calibration.

He was strangely silent when the scope came back. When I asked him
outright about it, he didn't look up from his work when he mumbled
something about the vertical being 0.4% off reference.

Mission-critical? Send it out. Need for extreme precision? Send it out.
But if 2-5% is good enough and you've got some time and imagination,
you can get it awfully close to where you need it.

--
"...global warming is an apocalyptic faith whose preachers demand sacrifices
of others that they find far too painful for themselves."
-- Andrew Bolt, in Australia's Herald Sun

clifto

Mar 1, 2007, 5:32:18 PM
Bud-- wrote:
> I believe the color subcarrier in a color TV is phase locked to the
> transmitted signal and, for network studio transmissions, is derived
> from a cesium clock. From what I have read it is more accurate than WWV
> and doesn't require extra equipment other than a TV displaying an image
> with a studio source. Frequency is 3.579545 MHz.

I've seen that discussed elsewhere, and although it would take me a week
to find the particulars, (1) the frequency can be off as much as 10 Hz
by FCC standards, (2) from what I've read it's frequently off by more
than that, even on network feeds, (3) IIRC they don't even use the good
clocks on the networks any more, (4) NIST clocks are going to be a couple
of orders of magnitude better than the best a network would buy for the
purpose of meeting FCC regulations, (5) IIRC the frequency should
actually be 3,579,545.454545454545..... Hz, and (6) Doppler shift on
the incoming television signal could potentially cause the subcarrier
frequency to vary up and down.
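Point (5) checks out: the NTSC color subcarrier is defined as the exact ratio 315/88 MHz, which produces the repeating decimal quoted above:

```python
# NTSC color subcarrier, defined as the exact ratio 315/88 MHz.
from fractions import Fraction

f_sc = Fraction(315, 88) * 1_000_000   # exact subcarrier frequency in Hz
print(float(f_sc))                     # ~3,579,545.4545 Hz, repeating
```

The commonly quoted "3.579545 MHz" is this exact value rounded to the nearest hertz.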

Chris Jones

Mar 1, 2007, 5:42:36 PM
Anthony Fremont wrote:

> If someone doesn't need traceable calibration, then why should they pay
> for
> it? Especially if they have the resources to do it themselves. I'm
> thinking of buying a cheap used Rb time base from e-bay so I can cal my
> old Protek freq counter and adjust the timebase on my Hitachi scope, it's
> certainly cheaper than having it done. Using a PIC driven by an ordinary
> can xtal, and a quartz wristwatch of known accuracy, I was able to tweak
> the
> xtal to within about 1-2ppm over the course of a week or two. Of course
> you know that's impossible, don't you?

I don't think the oscillator on a PIC would be good to 2ppm absolute
accuracy even with a very good xtal, unless you FIRST calibrate it against
something that has already been calibrated, therefore it doesn't get you
far. It will however be good enough to calibrate your scope since that
would only need 1% or so, and any old crystal should achieve that, even
with a fairly primitive oscillator. For frequency calibration, your best
bet is to receive an off-air standard, for example GPS or in many countries
there are low frequency standard transmissions (50kHz, 60kHz, 77.5kHz or
others, look up which ones are available in your country). It is quite
feasible to build your own receiver for these. These transmitters are
maintained to a higher accuracy than any piece of hardware that a hobbyist
could afford (e.g. 2 parts in 10^12).
http://www.npl.co.uk/time/msf/ctm001v05.pdf

Chris
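The wristwatch comparison Anthony describes above is worth spelling out, since it is just drift arithmetic: any time offset you can resolve by eye, accumulated over a long enough baseline, becomes a usable ppm measurement. A minimal sketch, with illustrative numbers rather than anyone's actual measurements:

```python
def drift_ppm(drift_seconds: float, elapsed_seconds: float) -> float:
    """Fractional frequency error, in parts per million, implied by an
    accumulated time drift over an elapsed interval."""
    return drift_seconds / elapsed_seconds * 1e6

week = 7 * 86_400  # seconds in one week

# Resolving ~0.5 s of drift against a known-good watch after a week:
print(drift_ppm(0.5, week))  # ~0.83 ppm
```

The longer you wait, the finer the resolution, which is why a week or two is enough to get an ordinary can crystal into the 1-2 ppm neighborhood.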

David L. Jones

unread,
Mar 1, 2007, 5:49:31 PM3/1/07
to
On Mar 1, 9:20 pm, JW <n...@dev.nul> wrote:
> On 28 Feb 2007 22:45:45 -0800 "David L. Jones" <altz...@gmail.com> wrote
> in Message id: <1172731545.077695.220...@z35g2000cwz.googlegroups.com>:

>
> >On Mar 1, 5:09 pm, Robert Baer <robertb...@earthlink.net> wrote:
>
> [...]
>
> >> True, 0.01% resistors are available, *but* they are extremely
> >> expensive (over $100 each) and they are made when and if the
> >> manufacturer sees fit to do so.
>
> >Not so.
> >RS Components have 0.01% resistors for AU$34.50 (US$26)
> >Farnell have 0.02% for as little as AU$20
>
> You guys are paying *way* too much. We use Riedon .01% precision resistors
> in our A/D products, and pay about 5 bucks apiece. Their site is down at
> the moment, but even Digikey has .01% resistors for around the same price:http://www.digikey.com/scripts/DkSearch/dksus.dll?Criteria?Ref=3107&S...

It's not too much when you only want a couple, and you can have it
within the hour.

If you want volume and price matters, sure, you shop around.

Dave :)

Ken Smith

unread,
Mar 1, 2007, 9:19:39 PM3/1/07
to
In article <12ue9cj...@corp.supernews.com>, doug <doug@doug> wrote:
>Ken Smith wrote:
>> In article <12uctfb...@corp.supernews.com>, doug <doug@doug> wrote:
>>
A mistake; go read it if you want.

>No. One part in 100 million (10^-8) is 100 times

Yes, I misread the statement.

[....]


>uncompensated or poorly compensated timebases are basically useless for
>any serious work.

That depends a lot on your definition of "serious". There are lots of
things where just being within 100PPM is more than good enough. RS232 is
ok up to 5% error. If the so called 60Hz in your motor home was actually
59.9Hz, I don't think you would mind.


> The other nice part about the high stability
>references is that you can distribute it to all the synthesizers on your
>bench and everything is coherent.
>Of course it depends on what you do. For my ham work, one ppm is fine.
>I do other work where the Rb source is not good enough.

A lot of them have worse short term noise than a good OCXO.

Dr. Anton T. Squeegee

unread,
Mar 1, 2007, 10:22:32 PM3/1/07
to
In article <Xns98E66A6AEDF...@64.209.0.86>, jya...@abuse.gov
(known to some as Jim Yanik) scribed...

<snippety>

> www.tegam.com

Unfortunately, according to their product list, they have
discontinued ALL their O-Scope calibration products.

However, that's not necessarily the end of the world, as it were.
This simply means that there is a better chance of such showing up on
the surplus market, which consequently creates a much better chance of
my finding something useful. ;-)

Thanks.

Too_Many_Tools

unread,
Mar 1, 2007, 10:22:16 PM3/1/07
to
On Mar 1, 6:18 am, MassiveProng

<MassivePr...@thebarattheendoftheuniverse.org> wrote:
> On 28 Feb 2007 21:47:49 -0800, "Too_Many_Tools"
> <too_many_to...@yahoo.com> Gave us:

Damn...you back already? Well I guess the traffic was light today.

Well MiniPrick since you are here taking up bytes, why don't you prove
to us how really brilliant you are?

Why don't you rub both of your brain cells together and come up with a
setup that a home lab can use for general cal purposes? You ARE smart
enough to do that, aren't you? Don't disappoint all of us now....we
are waiting....so either put up or shut up "Genius".

And oh yeah....that's Mr. Dumbfuck to you MiniPrick....now get to
work. LOL

TMT

Too_Many_Tools

unread,
Mar 1, 2007, 10:28:47 PM3/1/07
to
>
> You want to attack my adulthood, fine, little boy. You are
> BRAINLESS for this task.

I wasn't attacking your adulthood MiniPrick....an adult would not
behave in the manner you are.

Now a child....yes a child could easily be acting this way....so from
now on we will call you MiniPrick.


>
> I have worked in cal labs and in QA for years. You lose, sonny.
>

Laugh....laugh....laugh....emptying trash cans is NOT working in cal
labs and QA.

Hey MiniPrick....you done with the homework assignment yet?

TMT

On Mar 1, 6:23 am, MassiveProng


<MassivePr...@thebarattheendoftheuniverse.org> wrote:
> On 28 Feb 2007 21:50:49 -0800, "Too_Many_Tools"

> <too_many_to...@yahoo.com> Gave us:

Robert Baer

unread,
Mar 2, 2007, 12:21:35 AM3/2/07
to
David L. Jones wrote:
> On Mar 1, 5:09 pm, Robert Baer <robertb...@earthlink.net> wrote:
>
>>David L. Jones wrote:

>>
>>>On Mar 1, 12:27 pm, "Too_Many_Tools" <too_many_to...@yahoo.com> wrote:
>>
>>>>I have a well stocked test bench at home containing a range of analog,
>>>>digital and RF test equipment as I am sure most of you also do.
>>
>>>>Well the question I have is how do you handle the calibration of your
>>>>equipment? What do you use for calibration standards for resistance,
>>>>voltage, current and frequency?
>>
>>>>Links to recommended circuits, pictures and sources would be
>>>>appreciated.
>>
>>>>Since this is a need for anyone who has test equipment, I hope to see
>>>>a good discussion on this subject.
>>
>>>If you've got that sort of gear at home then usually you have better
>>>(and calibrated) gear at work as well, in which case most of us would
>>>simply bring in our gear from home and spot check it against the good
>>>gear.
>>
>>>In the absence of this gear, you can simply use precision components.
>>>Voltage reference chips with 0.05% or better are cheap and readilly
>>>available.
>>>0.01% resistors are available too.
>>
>>>If you have multiple meters for example, you can also keep an eye on
>>>them by comparison. Using any old component, if all three meters read
>>>the same then you can be pretty confident they haven't drifted.
>>
>>>Checking scope horizontal timebases is easy with a crystal oscillator
>>>and divider.
>>
>>>There are various methods for getting an accurate frequency standard,
>>>but one of the newest methods is using a GPS derived reference. Second
>>>hand Rubidium standards can also be had on eBay.
>>
>>>Generally though, good quality test gear does not drift out of spec,
>>>so the need for regular calibration is minimal.
>>
>>>Dave :)
>>
>> True, 0.01% resistors are available, *but* they are extremely
>>expensive (over $100 each) and they are made when and if the
>>manufacturer sees fit to do so.
>
>
> Not so.
> RS Components have 0.01% resistors for AU$34.50 (US$26)
> Farnell have 0.02% for as little as AU$20
>
> Dave :)
>
You guys outside the US have it soooo good!
Here in the US, it is like i mentioned.

Robert Baer

unread,
Mar 2, 2007, 12:38:04 AM3/2/07
to
JW wrote:

> On 28 Feb 2007 22:45:45 -0800 "David L. Jones" <alt...@gmail.com> wrote
> in Message id: <1172731545.0...@z35g2000cwz.googlegroups.com>:


>
>
>>On Mar 1, 5:09 pm, Robert Baer <robertb...@earthlink.net> wrote:
>
>

> [...]


>
>
>>> True, 0.01% resistors are available, *but* they are extremely
>>>expensive (over $100 each) and they are made when and if the
>>>manufacturer sees fit to do so.
>>
>>Not so.
>>RS Components have 0.01% resistors for AU$34.50 (US$26)
>>Farnell have 0.02% for as little as AU$20
>
>

> You guys are paying *way* too much. We use Riedon .01% precision resistors
> in our A/D products, and pay about 5 bucks apiece. Their site is down at
> the moment, but even Digikey has .01% resistors for around the same price:

> http://www.digikey.com/scripts/DkSearch/dksus.dll?Criteria?Ref=3107&Site=US&Cat=34342147
"Your search criteria has expired"
Furthermore a search on "34342147" (no quotes) gets zero matches.
A search on "3107" (no quotes) gets matches that are not better than 1%.
Strangely enough, a search on "precision resistors" (no quotes) is as
bad.
Worse, a search for "resistors" and wading thru the various types
gets *at best* Chip Resistor-Thin Film (67311 items) with 0.02% as the
best tolerance listed.
So......
Where are those mysterious 0.01% resistors???

Robert Baer

unread,
Mar 2, 2007, 12:41:28 AM3/2/07
to
MassiveProng wrote:

"Bee's Knees"?? Was that before or after the "Cat's pajamas"?

Robert Baer

unread,
Mar 2, 2007, 12:43:15 AM3/2/07
to
MassiveProng wrote:

Ahhh yesss....the Golden Rule.

Robert Baer

unread,
Mar 2, 2007, 12:57:43 AM3/2/07
to
Tim Shoppa wrote:

Exactly and completely correct in all aspects.
I made a 0.1% resistance reference box: 100 ohms, 1K, 10K, 100K, 1M,
10M and 100M that has been invaluable.
I made a voltage reference box using an Intersil (was Xicor) 5V FGA
reference powered by a 9V battery; good for source and sink and that
initial accuracy of 0.5mV was hard to beat; my HP 5326B verifies the
value within its accuracy as well as the 0.5mV of the reference; both in
the same region of fuzziness - so not too bad.
My handheld DVMs are "fair"; actually the 3.5 digit one is more
stable and reliable in readings than the 4.5 digit.
That one can be set by only one pot which either makes the resistor
readings within spec or makes the DC readings in spec - but not both; i
opted for DC reading accuracy.
I hate it when i have to fiddle with what seems to be a perfectly
good meter, just to make it read correctly (based on two other references).
One of these daze, i may be rich enough to get a Fluke 8845A; and if
*really* rich, will pay for a traceable meter!
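A reference box like the ones above is mostly useful as a bounds check: a 0.1% resistor pins the true value inside a narrow window, so a meter reading outside that window plus the meter's own spec means something has drifted. A sketch of that check (the example readings are made up):

```python
def within_tolerance(reading: float, nominal: float, ref_tol: float, meter_tol: float) -> bool:
    """True if 'reading' is consistent with a reference of value
    nominal +/- ref_tol, measured by a meter specced at +/- meter_tol
    (tolerances as fractions, e.g. 0.001 for 0.1%)."""
    worst_case = nominal * (ref_tol + meter_tol)
    return abs(reading - nominal) <= worst_case

# 10k 0.1% reference, meter claiming 0.5% accuracy (0.6% combined window):
print(within_tolerance(10_037, 10_000, 0.001, 0.005))  # True: inside the window
print(within_tolerance(10_080, 10_000, 0.001, 0.005))  # False: meter likely out
```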

Robert Baer

unread,
Mar 2, 2007, 1:00:33 AM3/2/07
to
Jim Yanik wrote:

Some TV stations had so much RFI that even a VOM had trouble reading
properly on any scale!

Robert Baer

unread,
Mar 2, 2007, 1:01:35 AM3/2/07
to
Bud-- wrote:

Check.

MassiveProng

unread,
Mar 2, 2007, 2:21:35 AM3/2/07
to
On Thu, 1 Mar 2007 06:43:01 -0600, "Anthony Fremont"
<spam...@nowhere.com> Gave us:

>Since 3% accuracy is considered good in the scope world,

Huh?

> I think it would do
>fine.

Bwuahahahahahahaha! Hilarious!

MassiveProng

unread,
Mar 2, 2007, 2:46:48 AM3/2/07
to
On 1 Mar 2007 19:28:47 -0800, "Too_Many_Tools"
<too_man...@yahoo.com> Gave us:

>Laugh....laugh....laugh....emptying trash cans is NOT working in cal
>labs and QA.

Said the utter retard that needed to ask in a BASIC electronics
group about something which he should already know if he planned to
attempt such a procedure.

Nice try, retard boy. Too bad you are wrong.... again.

>Hey MiniPrick....you done with the homework assignment yet?

That of calling you the retarded fuckhead that you are? Sure...
done.

>TMT, the total Usenet retard

Yep... that'd be you. Your nym is more correct than you'll ever
know. You're a jack-of-no-trades.

You're a real piece of shit... errr... work, there, bub.

My first advice was spot on. To make a proper cal, the source has
to be ten times better than the accuracy you wish to claim for the
instrument.

NONE of the circuits given in this thread are good enough. ALL of
those IC chips drift with T so much that calling them a cal source is
ludicrous. So are you if you think I don't know quality assurance and
proper procedure.

You ain't it.

MassiveProng

unread,
Mar 2, 2007, 2:53:06 AM3/2/07
to
On Thu, 1 Mar 2007 19:39:00 +0000, Clint Sharp
<cl...@clintsmc.demon.co.uk> Gave us:

>Aren't you reliant on the stability of the DVD player reference clock
>for the stability of the test tones?

For audio? Absolutely. It had nothing to do with the DVD player's
clock. There are several industry standard tones provided, and the
disc replaced hardware TV test generators for years.

It carries DTS and THX certified content.

That is the current reference standard for MPEG, if you know who
they are. That's good enough for me. I can verify the setup of my
FPD, and I can setup my stereo <sic>with the audio diagnostic and
setup section.

> I would suspect a GPS reference
>would be considerably better. Of course, I could be wrong!

Hey, chucko! He didn't give a GPS source. You don't get to change
the scene, pal! A subsequent poster mentioned a GPS setup.

Go back and read.

David L. Jones

unread,
Mar 2, 2007, 2:53:14 AM3/2/07
to
On Mar 2, 6:21 pm, MassiveProng

<MassivePr...@thebarattheendoftheuniverse.org> wrote:
> On Thu, 1 Mar 2007 06:43:01 -0600, "Anthony Fremont"
> <spam-...@nowhere.com> Gave us:

>
> >Since 3% accuracy is considered good in the scope world,
>
> Huh?

Yep, didn't you know that scope you are using is only a few % accurate
on the vertical scale?

> > I think it would do
> >fine.
>
> Bwuahahahahahahaha! Hilarious!

Hardly, it would be perfectly adequate for the job actually.

Dave :)

MassiveProng

unread,
Mar 2, 2007, 2:59:15 AM3/2/07
to
On 1 Mar 2007 19:22:16 -0800, "Too_Many_Tools"
<too_man...@yahoo.com> Gave us:

>And oh yeah....that's Mr. Dumbfuck


You're an idiot. You're busted.

Proven beyond your depth.

You are too stupid to even know that if you DID make a cal source
device, you would have to get it calibrated to make it worth a shit.

There is no backwoods calibration of perfectly good gear. All there
is is some hillbilly fucktard like you fucking up what was once
perfectly good gear by thinking you have one tenth the brains you need
to do such a chore correctly. You do not.

So fuck off.

MassiveProng

unread,
Mar 2, 2007, 3:21:35 AM3/2/07
to
On Thu, 01 Mar 2007 16:32:18 -0600, clifto <cli...@gmail.com> Gave us:

>Bud-- wrote:
>> I believe the color subcarrier in a color TV is phase locked to the
>> transmitted signal and, for network studio transmissions, is derived
>> from a cesium clock. From what I have read it is more accurate than WWV
>> and doesn't require extra equipment other than a TV displaying an image
>> with a studio source. Frequency is 3.579545 MHz.
>
>I've seen that discussed elsewhere, and although it would take me a week
>to find the particulars, (1) the frequency can be off as much as 10 Hz
>by FCC standards, (2) from what I've read it's frequently off by more
>than that, even on network feeds, (3) IIRC they don't even use the good
>clocks on the networks any more, (4) NIST clocks are going to be a couple
>of orders of magnitude better than the best a network would buy for the
>purpose of meeting FCC regulations, (5) IIRC the frequency should
>actually be 3,579,545.454545454545..... Hz, and (6) Doppler shift on
>the incoming television signal could potentially cause the subcarrier
>frequency to vary up and down.


So there!

MassiveProng

unread,
Mar 2, 2007, 3:31:07 AM3/2/07
to
On 1 Mar 2007 23:53:14 -0800, "David L. Jones" <alt...@gmail.com>
Gave us:

>Yep, didn't you know that scope you are using is only a few % accurate
>on the vertical scale?


You guys must be behind the times.

MassiveProng

unread,
Mar 2, 2007, 3:32:14 AM3/2/07
to
On 1 Mar 2007 23:53:14 -0800, "David L. Jones" <alt...@gmail.com>
Gave us:

>


>Hardly, it would be perfectly adequate for the job actually.


Wrong. That could easily leave the scope over 6% off.

It takes a much finer source to calibrate a device than the final
accuracy of the device being calibrated, dipshit.

Anthony Fremont

unread,
Mar 2, 2007, 4:16:36 AM3/2/07
to
MassiveProng wrote:
> On Thu, 1 Mar 2007 06:43:01 -0600, "Anthony Fremont"
> <spam...@nowhere.com> Gave us:
>
>> Since 3% accuracy is considered good in the scope world,
>
> Huh?

Yeah, maybe you should do some reading.

JW

unread,
Mar 2, 2007, 6:09:57 AM3/2/07
to
On Fri, 02 Mar 2007 05:38:04 GMT Robert Baer <rober...@earthlink.net>
wrote in Message id:
<0LOFh.7549$_73....@newsread2.news.pas.earthlink.net>:

>JW wrote:
>
>> On 28 Feb 2007 22:45:45 -0800 "David L. Jones" <alt...@gmail.com> wrote
>> in Message id: <1172731545.0...@z35g2000cwz.googlegroups.com>:
>>
>>
>>>On Mar 1, 5:09 pm, Robert Baer <robertb...@earthlink.net> wrote:
>>
>>
>> [...]
>>
>>
>>>> True, 0.01% resistors are available, *but* they are extremely
>>>>expensive (over $100 each) and they are made when and if the
>>>>manufacturer sees fit to do so.
>>>
>>>Not so.
>>>RS Components have 0.01% resistors for AU$34.50 (US$26)
>>>Farnell have 0.02% for as little as AU$20
>>
>>
>> You guys are paying *way* too much. We use Riedon .01% precision resistors
>> in our A/D products, and pay about 5 bucks apiece. Their site is down at
>> the moment, but even Digikey has .01% resistors for around the same price:
>> http://www.digikey.com/scripts/DkSearch/dksus.dll?Criteria?Ref=3107&Site=US&Cat=34342147
> "Your search criteria has expired"

I hate it when that happens...

> Furthermore a search on "34342147" (no quotes) gets zero matches.
> A search on "3107" (no quotes) gets matches that are not better than 1%.
> Strangely enough, a search on "precision resistors" (no quotes_ is as
>bad.
> Worse, a search for "resistors" and wading thru the various types
>gets *at best* Chip Resistor-Thin Film(67311 items) with 0.02% as the
>best or tolerance listed.
> So......
> Where are those mysterious 0.01% resistors???

Put .01% into the search box and you'll get a link to the .01% resistors.
For quantity 1 they are 5-6 bucks apiece.

Or maybe this link will work if it doesn't expire, that is:
http://www.digikey.com/scripts/DkSearch/dksus.dll?Criteria?Ref=916&Site=US&Cat=34342147

carneyke

unread,
Mar 2, 2007, 6:57:46 AM3/2/07
to
On Mar 1, 10:40 am, Jim Yanik <jya...@abuse.gov> wrote:
> chuck <nos...@nospam.org> wrote innews:1172759...@sp6iad.superfeed.net:

>
>
>
>
>
> > Tim Shoppa wrote:
> >> On Feb 28, 9:27 pm, "Too_Many_Tools" <too_many_to...@yahoo.com>
> >> wrote:
> >>> I have a well stocked test bench at home containing a range of
> >>> analog, digital and RF test equipment as I am sure most of you also
> >>> do.
>
> >>> Well the question I have is how do you handle the calibration of
> >>> your equipment? What do you use for calibration standards for
> >>> resistance, voltage, current and frequency?
>
> --
> Jim Yanik
> jyanik
> at
> kua.net

I just got done calibrating a AM503 & A6302 current probe / amp
somebody took a screwdriver to. Without the manual and all required
gear (PG506), and cal fixtures it would never have worked properly
again. I work in a cal lab and the best part about iso9002 was
requiring the sealing stickers (cal void if seal is broken). We never
used them prior to iso certification.

carneyke

unread,
Mar 2, 2007, 7:02:49 AM3/2/07
to
> used them prior to iso certification.

Forgot to mention : If you want to cal your own gear, mark any pots /
vari-caps and write down any software codes BEFORE changing. Do not
adjust the compensation capacitors in any Tektronix attenuators
without a PG506 and a procedure.

Jim Yanik - This note isn't for you, as you have seen the damage
too.....

Too_Many_Tools

unread,
Mar 2, 2007, 3:14:20 PM3/2/07
to
On Mar 2, 1:46 am, MassiveProng

<MassivePr...@thebarattheendoftheuniverse.org> wrote:
> On 1 Mar 2007 19:28:47 -0800, "Too_Many_Tools"
> <too_many_to...@yahoo.com> Gave us:

So it sounds like you are having a problem finding two brain cells
MiniPrick...try harder.

No more of your excuses....SHOW us how great you are.

Laugh...laugh...laugh....

TMT

Clint Sharp

unread,
Mar 2, 2007, 1:19:08 PM3/2/07
to
In message <8llfu2pm4lvaqph1u...@4ax.com>, MassiveProng
<Massiv...@thebarattheendoftheuniverse.org> writes

>On Thu, 1 Mar 2007 19:39:00 +0000, Clint Sharp
><cl...@clintsmc.demon.co.uk> Gave us:
>
>>Aren't you reliant on the stability of the DVD player reference clock
>>for the stability of the test tones?
>
> For audio? absolutely. It had nothing to do with the DVD player's
>clock. There are several industry standard tones provided, and the
>disc replaced hardware TV test generators for years.
So varying the reference clock of a DVD player doesn't affect the pitch
of any tones played back off a disk? OK. I wondered because it does on a
CD player.

>
> It carries DTS and THX certified content.
K, so it's good for setting up home theatre equipment at the very
least.

>
> That is the current reference standard for MPEG, if you know who
>they are.
Not personally, but I may have heard of them in passing.

> That's good enough for me. I can verify the setup of my
>FPD,
FPD?

>and I can setup my stereo <sic>with the audio diagnostic and
>setup section.

>
>> I would suspect a GPS reference
>>would be considerably better. Of course, I could be wrong!
>
> Hey, chucko! He didn't give a GPS source. You don't get to change
>the scene, pal!

Hey 'pal' I didn't try to change the scenario, I just speculated that a
GPS source would be better.


>A subsequent poster mentioned a GPS setup.
> Go back and read.

No need, I read that post, that's why I mentioned it. Turn down the
aggression a notch or two, I only asked a question and speculated that
there might be a better way.
--
Clint Sharp

David L. Jones

unread,
Mar 2, 2007, 4:06:24 PM3/2/07
to
On Mar 2, 7:32 pm, MassiveProng
<MassivePr...@thebarattheendoftheuniverse.org> wrote:
> On 1 Mar 2007 23:53:14 -0800, "David L. Jones" <altz...@gmail.com>

> Gave us:
>
>
>
> >Hardly, it would be perfectly adequate for the job actually.
>
> Wrong. That could easily leave the scope over 6% off.
>
> It takes a much finer source to calibrate a device than the final
> accuracy of the device being calibrated, dipshit.

Not in this case.
If he used a meter with 0.5% accuracy on DC volts then he could check
and adjust his scope's vertical scale to the same 0.5% accuracy.

And if you start crapping on about the tolerance of the resistor chain
adding up etc, then you haven't thought about this one hard enough...

Dave :)
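The transfer Dave describes works like this: put a stable DC source on the scope input, read it with the DMM, and adjust the vertical gain until the trace sits at the corresponding deflection. The result is no better than the DMM spec plus how finely you can position the trace, a budget worth writing down (the 0.05-division readability figure below is an assumption):

```python
def transfer_cal_error_pct(dmm_pct: float, readability_div: float, deflection_div: float) -> float:
    """Worst-case percent error of a DMM-to-scope transfer calibration:
    the DMM's spec plus trace-positioning resolution as a fraction of deflection."""
    positioning_pct = readability_div / deflection_div * 100
    return dmm_pct + positioning_pct

# 0.5% DMM, trace readable to ~0.05 div, 6 divisions of deflection:
print(transfer_cal_error_pct(0.5, 0.05, 6.0))  # ~1.3%, adequate for a 2-3% scope
```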

David L. Jones

unread,
Mar 2, 2007, 4:20:17 PM3/2/07
to
On Mar 2, 7:31 pm, MassiveProng
<MassivePr...@thebarattheendoftheuniverse.org> wrote:
> On 1 Mar 2007 23:53:14 -0800, "David L. Jones" <altz...@gmail.com>

> Gave us:
>
> >Yep, didn't you know that scope you are using is only a few % accurate
> >on the vertical scale?
>
> You guys must be behind the times.

My 6000 series Agilent is not behind the times, and it's only +/-2%
accurate on the vertical scale.
A good analog scope like say the Tek2465 is only 2% as well.

Perhaps those two are the exception huh? Care to post some links to
prove otherwise?
I could post until the cows come home scopes that are no better than a
few % accurate on the vertical scale.

Dave :)

MassiveProng

unread,
Mar 2, 2007, 5:19:18 PM3/2/07
to
On Fri, 2 Mar 2007 18:19:08 +0000, Clint Sharp
<cl...@clintsmc.demon.co.uk> Gave us:

>> That's good enough for me. I can verify the setup of my
>>FPD,
>FPD?


Flat Panel Display.

You ain't too sharp, Sharp.

MassiveProng

unread,
Mar 2, 2007, 5:21:23 PM3/2/07
to
On Fri, 2 Mar 2007 18:19:08 +0000, Clint Sharp
<cl...@clintsmc.demon.co.uk> Gave us:

>>the scene, pal!


>Hey 'pal' I didn't try to change the scenario, I just speculated that a
>GPS source would be better.


There's no doubt it would be better for RF frequency locks. Since we
use them at work, I have no doubts about their capabilities.

Audio though? Most of the respondents referred to audio spectrum
frequencies.

MassiveProng

unread,
Mar 2, 2007, 5:30:31 PM3/2/07
to
On 2 Mar 2007 13:06:24 -0800, "David L. Jones" <alt...@gmail.com>
Gave us:

>On Mar 2, 7:32 pm, MassiveProng


If you set a scope up with 0.5% accurate source validator, the scope
will NOT have that accuracy level. It will ONLY have that accuracy
level at that set point, and that is even questionable.

David L. Jones

unread,
Mar 2, 2007, 6:09:30 PM3/2/07
to
On Mar 3, 9:30 am, MassiveProng
<MassivePr...@thebarattheendoftheuniverse.org> wrote:
> On 2 Mar 2007 13:06:24 -0800, "David L. Jones" <altz...@gmail.com>

Which is why you do it for each range and then spot check it to see
that there is no funny business. Perfectly valid technique for home
calibration of a scope vertical scale.

Dave :)
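"Each range" in practice means one test point per vertical setting, chosen for a large deflection so the reading error stays small. A sketch that tabulates the targets, assuming a typical 1-2-5 attenuator sequence (the exact range list varies by scope):

```python
# Typical 1-2-5 vertical settings; pick the input that gives ~6 divisions.
ranges_v_per_div = [0.002, 0.005, 0.01, 0.02, 0.05, 0.1, 0.2, 0.5, 1, 2, 5]
deflection_div = 6

for v_per_div in ranges_v_per_div:
    target_v = v_per_div * deflection_div
    print(f"{v_per_div:7.3f} V/div -> apply {target_v:7.3f} V DC, expect {deflection_div} div")
```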

Jim Yanik

unread,
Mar 2, 2007, 8:14:33 PM3/2/07
to
"David L. Jones" <alt...@gmail.com> wrote in
news:1172870416....@s48g2000cws.googlegroups.com:

> On Mar 2, 7:31 pm, MassiveProng
><MassivePr...@thebarattheendoftheuniverse.org> wrote:
>> On 1 Mar 2007 23:53:14 -0800, "David L. Jones" <altz...@gmail.com>
>> Gave us:
>>
>> >Yep, didn't you know that scope you are using is only a few % accurate
>> >on the vertical scale?
>>
>> You guys must be behind the times.
>
> My 6000 series Agilent is not behind the times, and it's only +/-2%
> accurate on the vertical scale.
> A good analog scope like say the Tek2465 is only 2% as well.

Better look again; IIRC, it's 1.25%. That does not include the cursors.

It's not really significant, as you can't get that resolution on the screen.

ehsjr

unread,
Mar 2, 2007, 11:29:42 PM3/2/07
to
Robert Baer wrote:
> JW wrote:
>
>> On 28 Feb 2007 22:45:45 -0800 "David L. Jones" <alt...@gmail.com> wrote
>> in Message id: <1172731545.0...@z35g2000cwz.googlegroups.com>:
>>
>>
>>> On Mar 1, 5:09 pm, Robert Baer <robertb...@earthlink.net> wrote:
>>
>>
>>
>> [...]
>>
>>
>>>> True, 0.01% resistors are available, *but* they are extremely
>>>> expensive (over $100 each) and they are made when and if the
>>>> manufacturer sees fit to do so.
>>>
>>>
>>> Not so.
>>> RS Components have 0.01% resistors for AU$34.50 (US$26)
>>> Farnell have 0.02% for as little as AU$20
>>
>>
>>
>> You guys are paying *way* too much. We use Riedon .01% precision
>> resistors
>> in our A/D products, and pay about 5 bucks apiece. Their site is down at
>> the moment, but even Digikey has .01% resistors for around the same
>> price:
>> http://www.digikey.com/scripts/DkSearch/dksus.dll?Criteria?Ref=3107&Site=US&Cat=34342147
>>
>
> "Your search criteria has expired"
> Furthermore a search on "34342147" (no quotes) gets zero matches.
> A search on "3107" (no quotes) gets matches that are not better than 1%.
> Strangely enough, a search on "precision resistors" (no quotes_ is as
> bad.
> Worse, a search for "resistors" and wading thru the various types gets
> *at best* Chip Resistor-Thin Film(67311 items) with 0.02% as the best or
> tolerance listed.
> So......
> Where are those mysterious 0.01% resistors???
>

Try DigiKey Part # MR102-100-.01-ND

Ed

ehsjr

unread,
Mar 3, 2007, 12:22:58 AM3/3/07
to
Too_Many_Tools wrote:
> I have a well stocked test bench at home containing a range of analog,
> digital and RF test equipment as I am sure most of you also do.
>
> Well the question I have is how do you handle the calibration of your
> equipment? What do you use for calibration standards for resistance,
> voltage, current and frequency?
>
> Links to recommended circuits, pictures and sources would be
> appreciated.
>
> Since this is a need for anyone who has test equipment, I hope to see
> a good discussion on this subject.
>
> Thanks
>
> TMT
>

The real question is how much precision do you
really need in the home "lab"? How often have
you needed to use your DMM with how many
*accurate* significant digits? 100 minus some
*very* small percent of the time, 2 significant
digits is all you need. Do you _really_ care
if your 5.055 volt reading is really 5.06 or 5.04?

Oh hell yes, I want to puff out my chest like everyone
else and think I have *accurate* equipment.

But I'm curious as to what home circuits need meters
that can read voltage accurately to 3 decimal places?
2 decimal places? The question for current measurement:
in what home brew circuit design/troubleshooting do you
need accuracy below the tens of mA digit? *Need*, not
*want*. Do you even trust your DMM on an amps setting
for those measurements, or do you measure the current
indirectly? How about ohms? Would you trust any
DMM, regardless of who calibrated it, to measure
down in the milliohm numbers?

To me, the design of the circuit being measured has
to take care of all of that crap. If it is so
poorly designed that a 10 mV departure from nominal
(that is missed by my inaccurate meter) will keep
it from working, that suggests other problems.
Yes, the home "lab" person wants extreme accuracy
to as many decimal places as he can get. But when does
he ever really need it?

None of this is to argue against having the best
instrumentation you can afford, or references to
check it against, or paying for calibration and so
forth. But for myself, I need a dose of reality
from time to time when I start drooling over some
accuracy specs that I will never need at home. My
bet is that most of us are seduced by that same muse.

Ed

Anthony Fremont

unread,
Mar 3, 2007, 6:35:01 AM3/3/07
to

You surely didn't mean tens of _mA_, did you? I build stuff with PICs as
you know, and some of it is designed to run on batteries and needs to go for
long periods of time unattended. The current draw for a 12F683 running at
31kHz is 11uA, sleep current is 50nA. If I could only measure current to
"tens of mA", I'd never know if the PIC was setup right for low current draw
and I certainly couldn't have any idea of expected battery life. I wouldn't
even know if it was sleeping until it ate thru some batteries in a few days
instead of six or eight months. I think I have a need to measure fractions
of a uA.
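Those microamp figures are exactly what a battery-life estimate hinges on; with a duty-cycled part the average draw is dominated by how often it wakes. A back-of-envelope sketch using the 12F683 numbers above plus an assumed coin-cell capacity:

```python
def battery_life_days(capacity_mah: float, active_ma: float, sleep_ma: float, awake_fraction: float) -> float:
    """Estimated battery life from the average current of a duty-cycled load."""
    avg_ma = awake_fraction * active_ma + (1 - awake_fraction) * sleep_ma
    return capacity_mah / avg_ma / 24

# 11 uA running at 31 kHz, 50 nA asleep, awake 1% of the time,
# CR2032-class cell (~220 mAh, assumed):
print(battery_life_days(220, 0.011, 0.00005, 0.01))
```

At these currents the answer comes out in decades, which really means the cell's self-discharge, not the load, sets the limit.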

> *want*. Do you even trust your DMM on an amps setting
> for those measurements, or do you measure the current
> indirectly? How about ohms? Would you trust any
> DMM, regardless of who calibrated it, to measure
> down in the miliohm numbers?
>
> To me, the design of the circuit being mesured has
> to take care of all of that crap. If it is so
> poorly designed that a 10 mV departure from nominal
> (that is missed by my innaccurate meter) will keep
> it from working, that suggests other problems.
> Yes, the home "lab" person wants extreme accuracy
> to as many decimal places as he can get. But when does
> he ever really need it?

When he needs it he needs it, what can I say? Do I really "need" a new DSO?
Well I've managed to get by all this time without one, so maybe you think I
don't really "need" one. I see it like this though, I don't get a lot of
time to tinker anymore, and I'd like to spend it more productively instead
of fumbling around and trying to devise silly methods to make my existing
equipment do something it wasn't designed for (like going off on a tangent
to build a PIC circuit that will trigger my scope early so I can try to see
some pre-trigger history).

> None of this is to argue against having the best
> instrumentation you can afford, or references to
> check it against, or paying for calibration and so

I don't know if I really agree with that. ;-)

Clint Sharp

unread,
Mar 2, 2007, 6:06:13 PM3/2/07
to
In message <bm8hu29f86pm64qki...@4ax.com>, MassiveProng
<Massiv...@thebarattheendoftheuniverse.org> writes
Ahh, so you're recommending calibrating test equipment using the same
DVD as you use to set up your boob tube. Explains a lot.
--
Clint Sharp

MassiveProng

unread,
Mar 3, 2007, 4:20:14 PM3/3/07
to
On Fri, 2 Mar 2007 23:06:13 +0000, Clint Sharp
<cl...@clintsmc.demon.co.uk> Gave us:

>In message <bm8hu29f86pm64qki...@4ax.com>, MassiveProng
><Massiv...@thebarattheendoftheuniverse.org> writes
>>On Fri, 2 Mar 2007 18:19:08 +0000, Clint Sharp
>><cl...@clintsmc.demon.co.uk> Gave us:
>>
>>>> That's good enough for me. I can verify the setup of my
>>>>FPD,
>>>FPD?
>>
>>
>> Flat Panel Display.
>>
>> You ain't too sharp, Sharp.
>Ahh, so you're recommending calibrating test equipment using the same
>DVD as you use to set up your boob tube. Explains a lot.

No, dipshit. I recommended an industry standard source, and I don't
"set up" my boob tube, you brainless twit; I CHECK its setup, and use
the disc to actually set up my home audio system.

This ain't the days of the sixties where you get to tweak a bunch of
pots.

Stop being so think.

Clint Sharp
Mar 3, 2007, 6:56:14 PM
In message <4hpju2t7fpiqg2t0v...@4ax.com>, MassiveProng
<Massiv...@thebarattheendoftheuniverse.org> writes

> No, dipshit. I recommended an industry standard source,
To calibrate test gear. As specified in the original post.

>and I don't
>"set-up" my boob tube you brainless twit,
So you don't know how to access the service menu and make changes to the
setup of your boob tube. Fair enough, I thought someone as knowledgeable
as you would know how to do that but I guess I was wrong.

> I CHECK its setup, and use
>the disc to actually set-up my home audio system.
Good for you, please explain how the OP was going to use a DVD to
calibrate his test equipment.

>
> This ain't the days of the sixties where you get to tweak a bunch of
>pots.
No, you don't, all the adjustments are done via menu now.
>
> Stop being so think.
I think, you just rant. Please get it right. Maybe you could use that
DVD to calibrate your anger response, maybe you could eBay it and your
home audio system to pay for some anger management?
--
Clint Sharp

Too_Many_Tools
Mar 3, 2007, 8:20:54 PM
Good comments Ed.

I want to thank everyone else who has offered *positive* comments
also.

Like I said, I think this is a need for anyone who has equipment at
home.

TMT

> Ed

MassiveProng
Mar 3, 2007, 8:38:01 PM
On 2 Mar 2007 15:09:30 -0800, "David L. Jones" <alt...@gmail.com>
Gave us:

>Which is why you do it for each range and then spot check it to see


>that there is no funny business. Perfectly valid technique for home
>calibration of a scope vertical scale.
>
>Dave :)

It doesn't matter how many "places" you "spot check" it, you are not
going to get the accuracy of your comparison standard on the device
you intend to set with it. What you do is take the basic INaccuracy
of the device needing to be set, and add to it the basic INaccuracy of
the standard to which you are setting it. You CANNOT get any closer
than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
to make the scope that accurate. You need a *finer* standard than the
accuracy level you wish to achieve.

You need to understand that as a basic fact, chucko.
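For what it's worth, the error-stacking argument can be put in numbers. A quick sketch (illustrative only - the function names and the choice of worst-case vs. root-sum-square combination are mine, not from any standard library):

```python
# Illustrative only: two common ways to combine the tolerance of a
# calibration standard with the tolerance of the device under test.
# The 0.5% figures below are the hypothetical values argued about.

def worst_case_error(standard_tol, dut_tol):
    # Worst case: the standard is off by its full tolerance AND the
    # device drifts by its full tolerance in the same direction.
    return standard_tol + dut_tol

def rss_error(standard_tol, dut_tol):
    # Root-sum-square: the usual combination when the two errors are
    # independent and random.
    return (standard_tol ** 2 + dut_tol ** 2) ** 0.5

print(worst_case_error(0.5, 0.5))      # 1.0 percent, not 0.5
print(round(rss_error(0.5, 0.5), 3))   # 0.707 percent
```

Either way, a 0.5% standard cannot confer 0.5% accuracy on a 0.5% device, which is why the thread keeps coming back to wanting a standard several times finer than the instrument being checked.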

MassiveProng
Mar 3, 2007, 8:50:26 PM
On Sat, 03 Mar 2007 05:22:58 GMT, ehsjr <eh...@bellatlantic.net> Gave
us:


Modern instruments are so accurate, and hold their setup so well,
that opening one up and tweaking it without a professional
calibration standard on hand is ludicrous in the extreme.

No matter how smart one is, if one has an instrument and wants to
test its accuracy, one should go somewhere that a recently
calibrated instrument is available to EXAMINE one's own
instrument against.

NONE should be "adjusted" at all ever if the variance is too small
to warrant it, and even pro calibrators follow this creed. If at all
possible, their main task is to VERIFY an instrument's accuracy
WITHOUT making ANY adjustment. ANY that DO need adjustments are
typically marked "defective" and require a factory inspection/repair.

I speak from experience, so I don't care what the ToolTard thinks
about his capacity for the task, he is a fucking retard if he tries it
without first checking his gear against known good gear.

It really is THAT SIMPLE.

Anthony Fremont
Mar 3, 2007, 9:44:19 PM

The "basic fact" here is that we were talking about adjusting a 3% scope
with a .03% meter. Now that the numbers are back where they belong, please
proceed to restate your case. The scope's vertical sensitivity could easily
be adjusted to within 3% using said meter, now can't it? Just like Keith
says......


MassiveProng
Mar 3, 2007, 10:09:03 PM
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
<cl...@clintsmc.demon.co.uk> Gave us:

>So you don't know how to access the service menu and make changes to the

>setup of your boob tube.


Sorry, you dumbfuck, but your assuming that all TVs have this
capability proves even further how little you know about it.

MassiveProng
Mar 3, 2007, 10:10:41 PM
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
<cl...@clintsmc.demon.co.uk> Gave us:

>Good for you, please explain how the OP was going to use a DVD to
>calibrate his test equipment.


Stupid shit. The suggestion I posed mine against was some twit
suggesting WWV and a 1kHz tone, which is about as old hat as it gets.

You should really learn to read ENTIRE threads before you mouth off,
jackass.

MassiveProng
Mar 3, 2007, 10:11:24 PM
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
<cl...@clintsmc.demon.co.uk> Gave us:

>No, you don't, all the adjustments are done via menu now.

Wrong again, dumbass. You'd like to think that your guess is
correct, but it is not, dipshit.

MassiveProng
Mar 3, 2007, 10:12:15 PM
On Sat, 3 Mar 2007 23:56:14 +0000, Clint Sharp
<cl...@clintsmc.demon.co.uk> Gave us:

>I think, you just rant. Please get it right. Maybe you could use that

>DVD to calibrate your anger response, maybe you could eBay it and your
>home audio system to pay for some anger management?


Fuck you, you fucking retard. Meet up with me, and I'll show you
how I manage it.

MassiveProng
Mar 3, 2007, 10:17:08 PM
On 3 Mar 2007 17:20:54 -0800, "Too_Many_Tools"
<too_man...@yahoo.com> Gave us:

>Good comments Ed.
>
>I want to thank everyone else who has offered *positive* comments
>also.

I want you to leave the group and never return, you top posting
Usenet RETARD!


>
>Like I said, I think this is a need for anyone who has equipment at
>home.

Like I said, anyone as dumb as you are, regardless of your "tool
count", should not be futzing with perfectly good instruments.

You are simply too fucking stoopid to do it correctly.

>TMT

Yes, YOU!

Learn about top posting, asswipe, and how it is frowned upon in
Usenet, or are you just another pants down past the asscrack, gang boy
retard?

MassiveProng
Mar 3, 2007, 10:21:44 PM
On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
<spam...@nowhere.com> Gave us:

>MassiveProng wrote:
>> On 2 Mar 2007 15:09:30 -0800, "David L. Jones" <alt...@gmail.com>
>> Gave us:
>>
>>> Which is why you do it for each range and then spot check it to see
>>> that there is no funny business. Perfectly valid technique for home
>>> calibration of a scope vertical scale.
>>>
>>> Dave :)
>>
>> It doesn't matter how many "places" you "spot check" it, you are not
>> going to get the accuracy of your comparison standard on the device
>> you intend to set with it. What you do is take the basic INaccuracy
>> of the device needing to be set, and add to it the basic INaccuracy of
>> the standard to which you are setting it. You CANNOT get any closer
>> than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
>> to make the scope that accurate. You need a *finer* standard than the
>> accuracy level you wish to achieve.
>>
>> You need to understand that as a basic fact, chucko.
>
>The "basic fact" here is that we were talking about adjusting a 3% scope
>with a .03% meter.

Nope. READ HIS replies. He was talking about using a 3% meter.

> Now that the numbers are back where they belong, please
>proceed to restate your case.

Fuck you. Read HIS criteria, dipshit, don't impose yours. Remember,
it was ME that stated that the cal device had to be ten times more
accurate than the target to be cal'd. So fuck off.

> The scope's vertical sensitivity could easily
>be adjusted to within 3% using said meter, now can't it? Just like Keith
>says......

That is NOT what the retarded bastard said, you retarded bastard.

The Real Andy
Mar 3, 2007, 11:14:56 PM

Ahhh, who actually uses a scope to make accurate measurements?

Jim Yanik
Mar 3, 2007, 11:15:20 PM
"Anthony Fremont" <spam...@nowhere.com> wrote in
news:12ukcke...@news.supernews.com:

Actually, one CAN calibrate an instrument to a greater accuracy than its
specified accuracy - for a short time - it's called a transfer standard.
Of course, there are limits to how much greater accuracy you can
achieve, based on resolution and repeatability.

For ordinary cals,your standard should be at least 4x better than the DUT.
10x is great.
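That 4x/10x rule of thumb is often phrased as a test uncertainty ratio (TUR). A one-liner sketch (the helper name is made up for illustration, not a standard API):

```python
# Test uncertainty ratio: how much tighter the standard's uncertainty
# is than the tolerance of the device under test. Rule of thumb: at
# least 4, ideally 10.

def tur(dut_tolerance, standard_uncertainty):
    return dut_tolerance / standard_uncertainty

# e.g. checking a 3% scope against a 0.03% DMM:
ratio = tur(3.0, 0.03)
print(ratio)          # 100.0
print(ratio >= 4.0)   # True - comfortably better than the 4:1 minimum
```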

Phil Allison
Mar 3, 2007, 11:28:20 PM

"The Real Andy"


> Ahhh, who actually uses a scope to make accurate measurements?


** Anyone who needs to.

Low frequency (1 to 30 Hz), single-shot, asymmetrical or pulse waves and
high frequencies are all "grist for the mill", even with a CRT-based
scope.

Shame what happens with a DMM used on the same.

....... Phil


David L. Jones
Mar 4, 2007, 2:08:30 AM
On Mar 4, 12:38 pm, MassiveProng
<MassivePr...@thebarattheendoftheuniverse.org> wrote:
> On 2 Mar 2007 15:09:30 -0800, "David L. Jones" <altz...@gmail.com>

LMAO!
If I use a 0.5% accurate meter to adjust something, then the accuracy
of that adjusted device at that point in time at that adjusted value
*becomes* 0.5%. The device that was adjusted only gets its accuracy
figure of 0.5% *after* the adjustment. The 0.5% of the device does NOT
get added to the 0.5% of the meter in this particular case!

Dave :)

MassiveProng
Mar 4, 2007, 3:04:08 AM
On Sun, 04 Mar 2007 14:14:56 +1000, The Real Andy
<there...@nospam.com> Gave us:


I guess the same idiots that claim they can calibrate one with a 3%
meter.

Also, if you do NOT know how to make accurate measurements with
scopes, you should be in some other industry.

MassiveProng
Mar 4, 2007, 3:15:04 AM
On 3 Mar 2007 23:08:30 -0800, "David L. Jones" <alt...@gmail.com>
Gave us:

>On Mar 4, 12:38 pm, MassiveProng
><MassivePr...@thebarattheendoftheuniverse.org> wrote:
>> On 2 Mar 2007 15:09:30 -0800, "David L. Jones" <altz...@gmail.com>
>> Gave us:
>>
>> >Which is why you do it for each range and then spot check it to see
>> >that there is no funny business. Perfectly valid technique for home
>> >calibration of a scope vertical scale.
>>
>> >Dave :)
>>
>> It doesn't matter how many "places" you "spot check" it, you are not
>> going to get the accuracy of your comparison standard on the device
>> you intend to set with it. What you do is take the basic INaccuracy
>> of the device needing to be set, and add to it the basic INaccuracy of
>> the standard to which you are setting it. You CANNOT get any closer
>> than that. So, a 0.5% meter, and a 0.5% scope cannot be used together
>> to make the scope that accurate. You need a *finer* standard than the
>> accuracy level you wish to achieve.
>>
>> You need to understand that as a basic fact, chucko.
>
>LMAO!
>If I use a 0.5% accurate meter to adjust something, then the accuracy
>of that adjusted device at that point in time at that adjusted value
>*becomes* 0.5%.

Absolutely incorrect!

If you do that, the MINIMUM error is 0.5%. In the worst case it is
that value plus the error of the device you think you set.

How can you not understand that basic fact?

> The device that was adjusted only gets its accuracy
>figure of 0.5% *after* the adjustment.

Absolutely INCORRECT!

The error of a device is NOT tied to how it got set or what it got
set with, dipshit; it is tied to the precision of the circuits the
device is based upon.

> The 0.5% of the device does NOT
>get added to the 0.5% of the meter in this particular case!

Wanna bet?

ehsjr
Mar 4, 2007, 3:25:08 AM

I surely meant tens of mA.

> I build stuff with PICs as
> you know, and some of it is designed to run on batteries and needs to go for
> long periods of time unattended. The current draw for a 12F683 running at
> 31kHz is 11uA, sleep current is 50nA. If I could only measure current to
> "tens of mA", I'd never know if the PIC was setup right for low current draw
> and I certainly couldn't have any idea of expected battery life. I wouldn't
> even know if it was sleeping until it ate thru some batteries in a few days
> instead of six or eight months. I think I have a need to measure fractions
> of a uA.

You may, but not accuracy below the tens of _mA_ digit.
When you need accuracy below tens of mA, you measure
voltage across a resistance. It doesn't make a lot of
sense to look for your meter to be accurate to 8 decimal
places for your .00000005 amp reading.

Here's how you do it with accuracy at the tens of _mV_ digit:

For 11 uA, put a 10K .01% resistor in series with
the supply and measure .11 volts across it. The voltage
would range from 0.109989 to 0.110011. Keep only
2 decimal places. Your computed current, worst case,
would be off by 1 uA.

For 50 nA, use a 2 meg 1% resistor and measure .10
volts across it. The voltage would range from .099
to .101 taking the 1% into account. Throw out the
last digit. Your current computation would be off
worst case, by 5 nA.

All of that with a voltmeter accurate to 2 decimal places.
I don't know why you would

>
>
>>*want*. Do you even trust your DMM on an amps setting
>>for those measurements, or do you measure the current
>>indirectly? How about ohms? Would you trust any
>>DMM, regardless of who calibrated it, to measure
>>down in the milliohm numbers?
>>
>>To me, the design of the circuit being measured has
>>to take care of all of that crap. If it is so
>>poorly designed that a 10 mV departure from nominal
>>(that is missed by my inaccurate meter) will keep
>>it from working, that suggests other problems.
>>Yes, the home "lab" person wants extreme accuracy
>>to as many decimal places as he can get. But when does
>>he ever really need it?
>
>
> When he needs it he needs it, what can I say?

I asked, looking for concrete cases. Your case
with the PIC is an excellent example of when a
person needs to know about really small currents.
It definitely fits into the difference I had in mind
between "needs" and "wants". But it does not mean he
needs accuracy out to 8 decimal places. He needs it to
2 decimal places, as was shown. Three decimal places
would be nice. :-)
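The shunt-resistor arithmetic worked above can be sketched as follows (illustrative: `current_bounds` is a made-up helper, and unlike the hand calculation it folds the meter's 0.01 V resolution into the worst case along with the resistor tolerance):

```python
# Worst-case current bounds when inferring current from the voltage
# across a precision shunt resistor, per the examples above.

def current_bounds(v_reading, v_resolution, r_nominal, r_tol):
    # Lowest current: lowest possible voltage over highest resistance.
    # Highest current: highest possible voltage over lowest resistance.
    v_lo, v_hi = v_reading - v_resolution, v_reading + v_resolution
    r_lo, r_hi = r_nominal * (1.0 - r_tol), r_nominal * (1.0 + r_tol)
    return v_lo / r_hi, v_hi / r_lo

# ~11 uA through a 10K 0.01% resistor, meter read to 0.01 V:
lo, hi = current_bounds(0.11, 0.01, 10e3, 0.0001)
print(round(lo * 1e6, 2), round(hi * 1e6, 2))   # 10.0 12.0 (uA)

# ~50 nA through a 2 meg 1% resistor, meter read to 0.01 V:
lo, hi = current_bounds(0.10, 0.01, 2e6, 0.01)
print(round(lo * 1e9, 1), round(hi * 1e9, 1))   # 44.6 55.6 (nA)
```

Note that in both cases the dominant term is the meter's last-digit resolution, not the resistor tolerance, which is why a 1% resistor is good enough for the nanoamp case.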

> Do I really "need" a new DSO?

I have no opinion on that, and it would be irrelevant
if I did. I don't know what your situation is.

> Well I've managed to get by all this time without one, so maybe you think I
> don't really "need" one. I see it like this though, I don't get a lot of
> time to tinker anymore. I'd like to spend it more productively. Instead of
> fumbling around and trying to devise silly methods to make my existing
> equipment do something it wasn't designed to (like going off on a tangent to
> build a PIC circuit that will trigger my scope early so I can try to see
> some pre-trigger history).
>
>
>>None of this is to argue against having the best
>>instrumentation you can afford, or references to
>>check it against, or paying for calibration and so
>
>
> I don't know if I really agree with that. ;-)

Well, you're free to argue against having the best
instrumentation you can afford, or having references
to check it against or getting it calibrated or
whatever, if that's how you feel. I tend to err on
the side of wanting the best even when it is
not the best fit for what I really need.

Ed

MassiveProng
Mar 4, 2007, 3:29:14 AM
On Sun, 04 Mar 2007 08:25:08 GMT, ehsjr <eh...@bellatlantic.net> Gave
us:

>


>You may, but not accuracy below the tens of _mA_ digit.
>When you need accuracy below tens of mA, you measure
>voltage across a resistance. It doesn't make a lot of
>sense to look for your meter to be accurate to 8 decimal
>places for your .00000005 amp reading.


ALL handheld meters read the voltage across a precision shunt
resistor for current readings. I am not talking about inductive
probes - just standard current measurement.

The Real Andy
Mar 4, 2007, 3:32:57 AM
On Sun, 04 Mar 2007 00:04:08 -0800, MassiveProng
<Massiv...@thebarattheendoftheuniverse.org> wrote:

My point exactly.

>
> Also, if you do NOT know how to make accurate measurements with
>scopes, you should be in some other industry.

More to the point, if you don't understand the concept of error, then
you should be in another industry.

MassiveProng
Mar 4, 2007, 3:44:55 AM
On Sun, 04 Mar 2007 18:32:57 +1000, The Real Andy
<there...@nospam.com> Gave us:

Good thing I never made that claim.

>> Also, if you do NOT know how to make accurate measurements with
>>scopes, you should be in some other industry.
>
>More to the point, if you don't understand the concept of error, then
>you should be in another industry.

That is about the gist of what I have been trying to tell them.

Some dope thinking he can adjust his meter accurately with a damned
drifty voltage reference chip should have his head examined, not his
instruments!

The Real Andy
Mar 4, 2007, 4:06:22 AM
On 2 Mar 2007 12:14:20 -0800, "Too_Many_Tools"
<too_man...@yahoo.com> wrote:

>On Mar 2, 1:46 am, MassiveProng
><MassivePr...@thebarattheendoftheuniverse.org> wrote:
>> On 1 Mar 2007 19:28:47 -0800, "Too_Many_Tools"
>> <too_many_to...@yahoo.com> Gave us:
>>
>> >Laugh....laugh....laugh....emptying trash cans is NOT working in cal
>> >labs and QA.
>>
>> Said the utter retard that needed to ask in a BASIC electronics
>> groups about something which he should already know if he planned to
>> attempt such a procedure.
>>
>> Nice try, retard boy. Too bad you are wrong.... again.
>>
>> >Hey MiniPrick....you done with the homework assignment yet?
>>
>> That of calling you the retarded fuckhead that you are? Sure...
>> done.
>>
>> >TMT, the total Usenet retard
>>
>> Yep... that'd be you. Your nym is more correct than you'll ever
>> know. You're a jack-of-no-trades.
>>
>> You're a real piece of shit... errr... work, there, bub.
>>
>> My first advice was spot on. To make a proper cal, the source has
>> to be ten times better than the accuracy you wish to claim for the
>> instrument.
>>
>> NONE of the circuits given in this thread are good enough. ALL of
>> those IC chips drift with T so much that calling them a cal source is
>> ludicrous. So are you if you think I don't know quality assurance, and
>> proper procedure.
>>
>> You ain't it.
>
>So it sounds like you are having a problem finding two brain cells
>MiniPrick...try harder.
>
>No more of your excuses....SHOW us how great you are.
>
>Laugh...laugh...laugh....
>
>TMT

At the end of the day, why do you need to calibrate your instruments?
Do you need to do it, or is it just for self-satisfaction? Are you
trying to prove a point, or do you need traceable calibration? What
are you trying to achieve?

Anthony Fremont
Mar 4, 2007, 4:10:34 AM
MassiveProng wrote:
> On Sat, 3 Mar 2007 20:44:19 -0600, "Anthony Fremont"
> <spam...@nowhere.com> Gave us:
>
>> MassiveProng wrote:
>>> On 2 Mar 2007 15:09:30 -0800, "David L. Jones" <alt...@gmail.com>
>>> Gave us:
>>>
>>>> Which is why you do it for each range and then spot check it to see
>>>> that there is no funny business. Perfectly valid technique for home
>>>> calibration of a scope vertical scale.
>>>>
>>>> Dave :)
>>>
>>> It doesn't matter how many "places" you "spot check" it, you are
>>> not going to get the accuracy of your comparison standard on the
>>> device you intend to set with it. What you do is take the basic
>>> INaccuracy of the device needing to be set, and add to it the basic
>>> INaccuracy of the standard to which you are setting it. You CANNOT
>>> get any closer than that. So, a 0.5% meter, and a 0.5% scope cannot
>>> be used together to make the scope that accurate. You need a
>>> *finer* standard than the accuracy level you wish to achieve.
>>>
>>> You need to understand that as a basic fact, chucko.
>>
>> The "basic fact" here is that we were talking about adjusting a 3%
>> scope with a .03% meter.
> Nope. READ HIS replies. He was talking about using a 3% meter.

I believe we were talking about scopes only being about 3% accurate in the
vertical. You are the one that pulled that garbage out of the air about
leaving the scope 6% off. You should try reading what people write instead
of what you wish they wrote.

>> Now that the number are back where they belong, please
>> procede to restate your case.
>
> Fuck you. Read HIS criteria, dipshit, don't impose yours. Remeber,
> it was ME that stated that the cal device had to be ten times more
> accurate than the target to be cal'd. So fuck off.

Are you really that incompetent? I AM THE ONE that stated that I could use
a .03% meter to adjust it. It was in my very first post in this thread. Now
stop lying.
