[snip... snip...]
BR><The 30 frames-per-second rate was chosen, originally, to avoid beats with the
BR><60 Hz power line frequency, and the 25 frame rate in Europe was to
BR><match the 50 Hz lines there. The actual frame rate in the US is now
BR><about 29.97, and if there is line interference, you will notice that
BR><it always creeps up the screen. I do not know if the system in
BR><Europe is exactly 50. The shift to 29.97 is a result of the
BR><mathematics involved in color interleave... but that is the subject
BR><of another discussion.
okay, so please start the discussion...
BR>--
BR> harvard\ spool.cs.wisc.edu!astroatc!vidiot!brown
BR>Vidiot ucbvax!uwvax!astroatc!vidiot!brown
BR> rutgers/ INTERNET:vidiot!brown%astroa...@spool.cs.wisc.edu
BR> br...@wi.extrel.com
later days...
leo d.
l...@barney.sbe.csuhayward.edu
* SLMR 2.1a * Did you hear the one about the one-armed fisherman???
Walden Puddle |
Newark, California USA | Cynic: Someone who smells the
510-795-7660 | flowers, then looks for the casket.
rhosoft.com |
=============
: [snip... snip...]
: BR><The 30 frames-per-second rate was chosen, originally, to avoid beats with the
: BR><60 Hz power line frequency, and the 25 frame rate in Europe was to
: BR><match the 50 Hz lines there. The actual frame rate in the US is now
: BR><about 29.97, and if there is line interference, you will notice that
: BR><it always creeps up the screen. I do not know if the system in
: BR><Europe is exactly 50. The shift to 29.97 is a result of the
: BR><mathematics involved in color interleave... but that is the subject
: BR><of another discussion.
: okay, so please start the discussion...
Okay.
3.5795454545 MHz is the colorburst frequency. It was defined in the
original design for "compatible color" and is cast in stone.
So:
3.5795454545 MHz / 455 = 7.867132866 kHz, and 7.867132866 kHz / 262.5 = 29.97003 Hz
I forget the name for the 455 divisor, but it is also cast in stone.
The 262.5 is the number of lines divided by two (525/2).
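Just to check the arithmetic, here is a short Python sketch (the names are
mine, nothing official):

    f_sc = 5e6 * 63 / 88      # 3579545.4545... Hz color subcarrier
    half_line = f_sc / 455    # 7867.1329 Hz, half the 15734.27 Hz line rate
    f_frame = half_line / 262.5
    print(f_frame)            # 29.97002997... frames per second

Dividing by 455 and then by 262.5 is the same as dividing the 15734 Hz line
rate by 525, which is why the result comes out as the frame rate.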
Your best bet is to go to the library and read a technical book on NTSC.
But, it all boils down to compatibility with B&W receivers. They just
couldn't stick in any old number for the color sub-carrier. Weird artifacts,
like dot crawl, occurred that way. Also, the color sub-carrier needed
to fit in between the luminance frequency energy.
When you read the book, all will fall into place.
--
harvard\ spool.cs.wisc.edu!astroatc!vidiot!brown
Vidiot ucbvax!uwvax!astroatc!vidiot!brown
rutgers/ INTERNET:vidiot!brown%astroa...@spool.cs.wisc.edu
br...@wi.extrel.com
<br...@vidiot.UUCP (Vidiot) wrote from 're: ntsc vs pal'
<
<[snip... snip...]
<
<BR><The 30 frames-per-second rate was chosen, originally, to avoid beats with the
<BR><60 Hz power line frequency, and the 25 frame rate in Europe was to
<BR><match the 50 Hz lines there. The actual frame rate in the US is now
<BR><about 29.97, and if there is line interference, you will notice that
<BR><it always creeps up the screen. I do not know if the system in
<BR><Europe is exactly 50. The shift to 29.97 is a result of the
<BR><mathematics involved in color interleave... but that is the subject
<BR><of another discussion.
<
<okay, so please start the discussion...
The problem with keeping NTSC video at the same clock rate(s) as its
monochrome predecessor concerns what happens when it's actually broadcast
(sent out over the air). Harmonics of the horizontal drive and of the
color subcarrier fell close enough to each other to cause interference.
The vertical drive was changed from 60 Hz to 59.94 Hz (0.1% less) to place
these harmonics exactly in between each other. This makes the frame rate
29.97 Hz. Unfortunately, when there is hum in the video it now crawls
slowly upward (8.3 seconds from bottom to top), and thus it is
more noticeable.
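A quick back-of-the-envelope check of that crawl figure, in Python (assuming
the hum is 120 Hz full-wave rectifier ripple, the usual case):

    f_field = 60000 / 1001       # 59.9400599... Hz NTSC field rate
    ripple = 120.0               # Hz power-supply ripple in the video
    beat = ripple - 2 * f_field  # 0.1199 Hz
    print(1 / beat)              # ~8.34 s for the hum bar to cross the screen

With plain 60 Hz hum the beat is half as fast, about 16.7 seconds per trip.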
Billy Y..
But it doesn't explain why the color subcarrier is what it is. Where did
the 455 come from?
That is why I suggested that the original poster read a technical book on
NTSC, which should be available in the local library.
><: BR><The 30 frames-per-second rate was chosen, originally, to avoid beats with the
><: BR><60 Hz power line frequency, and the 25 frame rate in Europe was to
><: BR><match the 50 Hz lines there. The actual frame rate in the US is now
><: BR><about 29.97, and if there is line interference, you will notice that
><: BR><it always creeps up the screen. I do not know if the system in
><: BR><Europe is exactly 50. The shift to 29.97 is a result of the
><: BR><mathematics involved in color interleave... but that is the subject
><: BR><of another discussion.
><
><: okay, so please start the discussion...
In order to put this into non-technical terms, or at least non-energy spectrum
terms, think of the following:
The NTSC picture is made up of two sequential fields of 262-1/2 lines,
interleaved. The color signal frequency is chosen to make the color
information occur out of phase on each successive line, and also out of
phase on each successive frame, reducing the visible effects of the 3.58
MHz signal.
The basic frequency, the one "defined", is the aural carrier offset of 4.5
MHz. Choosing exactly 3.58 MHz for color results in another product at 920 kHz
(4.50 - 3.58) which would be very visible in the picture. Changing the
chroma signal frequency allows this 920 kHz signal to also occur out of phase
on each successive frame, or field, I'm not sure which, but then in order
to keep the first condition, the horizontal scan rate has to be reduced
a little as well. If the original 525 lines are to be kept, it is
then necessary to reduce the vertical rate a little as well, thus 59.94
instead of 60.
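Richard's numbers drop straight into a few lines of Python (my restatement,
not anything out of the standard):

    f_sound = 4.5e6          # aural carrier offset, fixed for B&W compatibility
    f_h = f_sound / 286      # line rate as an exact submultiple: 15734.2657 Hz
    f_sc = 227.5 * f_h       # subcarrier at an odd multiple of half the line rate
    print(f_sc)              # 3579545.4545... Hz
    beat = f_sound - f_sc    # 920454.5 Hz, the "920 kHz" product
    print(beat / (f_h / 2))  # 117.0 -- an odd number, so the beat interleaves too

Since 286 - 227.5 = 58.5 lines, the beat lands on an odd multiple of half the
line rate, just like the subcarrier itself.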
All of this essentially to make monochrome look better!
Thank Goodness ATV (HDTV) doesn't have to be compatible....
Richard
><
>Okay.
>3.5795454545 MegaHertz is the colorburst frequency. It was defined in the
>original design for "compatable color" and is cast in stone.
>So:
>3.5795454545 megaHertz / 455 = 7.867132866 kHz / 262.5 = 29.97003
I find it interesting that this colourburst frequency of 3.5795
MHz is right smack in the amateur radio 80 meter band. It seems
that any and all ham operators can legally send Morse code at that
frequency at the legal power limit of 1 kilowatt output and screw up all of
the colour TVs in the cities. How come this doesn't appear to be
such a problem?
ma...@efn.org n0gth
: Primarily because there are so many TV sets spewing garbage out on that
: frequency that nobody would ever be heard if they were to use it. The
: advantage, though, is that you can tear the colorburst oscillator out of
: a TV set, and skew it a little bit up or down the band and have a nice
: QRP rig.
Actually, I have found just the opposite to be true.
The colorburst frequency is not only cast in stone; it is extremely accurate.
It is more accurate as a frequency reference than WWV. This is provided
that you are tuned to a network-supplied program.
If you try to receive 3.57954545 MHz near a TV, you will hear almost nothing.
In fact, the only way to take advantage of the precision of the colorburst is
to lock an oscillator to the 15.734 kHz signal radiated by the very strong
magnetic deflection circuit.
I know about this first hand. I wish there *were* a bit of 3.57954545 MHz
leakage from a TV; it would make calibrating a lot easier.
Is this still true?
I have no direct knowledge, but... many years ago (mid-70's if I remember
right) one of the hobby electronics mags (I think it was Radio-Electronics) had
an article for a frequency standard derived from a color tv. Soon afterward a
letter appeared in the letter column (where else :-), written by an engineer at
one of the better-equipped stations in L.A. He stated that even network-
supplied programs taken from a live feed usually go through a time-base
corrector at the local station, and that this breaks the "chain of
traceability" back to the network's precision frequency standard.
(of course, anything that the local station taped from a satellite feed for broadcast
later is completely divorced from any standards at the network.)
Also, at that time it was stated that the networks used rubidium-clock
frequency standards, which are secondary standards: They're awfully good but
they still have to be calibrated against something better. NIST (the folks who
run WWV) uses cesium-beam clocks, which are primary standards, needing no
calibration for frequency. Have the networks since upgraded to cesium-beam
clocks? And, given that the local stations probably haven't, does it matter
anyway? Even if they have, they're still "only" as good as NIST's clocks, so
why should one over-the-air signal be better than another? (propagation
changes on shortwave, maybe?)
--- Jamie Hanrahan, Kernel Mode Systems, San Diego CA
Internet: j...@cmkrnl.com (JH645) Uucp: uunet!cmkrnl!jeh CIS: 74140,2055
>
>Also, at that time it was stated that the networks used rubidium-clock
>frequency standards, which are secondary standards: They're awfully good but
>they still have to be calibrated against something better. NIST (the folks who
>run WWV) uses cesium-beam clocks, which are primary standards, needing no
>calibration for frequency. Have the networks since upgraded to cesium-beam
>clocks? And, given that the local stations probably haven't, does it matter
>anyway? Even if they have, they're still "only" as good as NIST's clocks, so
>why should one over-the-air signal be better than another? (propagation
>changes on shortwave, maybe?)
From what I remember, a Rubidium clock has better short term accuracy than
a Cesium clock. The Cesium one is more accurate in the long term. So a
Rubidium clock that is compared with a cesium one, say using phase
comparison with WWVB, is as accurate as these clocks get - about 1
part in 10^12.
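To put "1 part in 10^12" on a human scale, one line of Python:

    print(1e-12 * 86400)   # 8.64e-08 -> such a clock gains or loses ~86 ns/day

That is, less than a tenth of a microsecond of wander per day.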
Rajiv
aa9ch
r-d...@nwu.edu
: Is this still true?
I received E-Mail from lar...@net.com (reply bounced) stating pretty much
the same thing you stated above. He mentioned things like TBC/frame synch.
processors in the loop *and* the effect of doppler from the satellite.
If this is the case, I doubt it matters whether we are talking rubidium or
cesium-beam; the color burst on a network program would be traceable to
an inexpensive quartz reference at the local TV station.
The article in Radio Electronics appeared 3-5 years ago, if I remember
correctly.
> an article for a frequency standard derived from a color tv. Soon afterward a
> letter appeared in the letter column, written by an engineer at
> one of the better-equipped stations in L.A. He stated that even network-
> supplied programs taken from a live feed usually go through a time-base
> corrector at the local station, and that this breaks the "chain of
> traceability" back to the network's precision frequency standard.
>
Correct. TV stations are not locked to atomic time.
Rajiv AA9CH writes:
> From what I remember, a Rubidium clock has better short term accuracy than
> a Cesium clock. The Cesium one is more accurate in the long term. So a
> Rubidium clock that is compared with a cesium one, say using phase
> comparison with WWVB, is as accurate as these clocks get - about 1
> part in 10^12.
A really *good* Rubidium clock like the HP 5065 *does* have better short
term stability (1 to 100 seconds averaging time) than even the best
Cesium standard (the HP5071). However, it will drift about a part in
1E11 per month. On the other hand, low cost "mini" Rubidium standards
do not have such great short term stability and they may drift a part
in 1E10 or more per month. The accuracy of a Rubidium depends on when
was the last time it was calibrated against a Cesium standard or GPS.
People who want even better short term stability (like JPL) use a
Hydrogen maser locked to a Cesium. H Masers have 100's of times
better short term stability than the best Rubidium, but they have long
term drift due to the teflon container aging.
By the way, there is no requirement for the colorburst frequency to
have extremely high absolute accuracy. It is merely necessary for it
to have exactly the right *ratio* to the sound carrier to avoid
interference. The fact that they couldn't move the frequency of the
sound carrier (because of the backward compatibility requirement) is
why the field rate had to be lowered by 0.1%. (The color burst also
must have a certain ratio to the field rate, as was previously stated).
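The 0.1% figure falls right out if you try it in Python (the move from
285.71 to 286 is my paraphrase of the usual account; the arithmetic is just
a check):

    print(4.5e6 / 15750)    # 285.714... -- sound offset is not an integer
                            # number of monochrome line periods
    f_h = 4.5e6 / 286       # so the line rate moved: 15734.2657 Hz
    print(f_h / 262.5)      # 59.9401 Hz field rate
    print(1 - f_h / 15750)  # 0.000999... -- there's the 0.1% reduction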
The network rubidium clocks never had anything to do with this problem.
They were used to allow switching between feeds without a *phase* jump.
They could have run the whole network 1 ppm high, for example, without
a problem, as long as the whole network was the same.
Rick Karlquist N6RK
rka...@scd.hp.com
No, and it never was except in some very special cases. Because of the
way the terrestrial telco microwave distribution was done, the reference
phase changed during the course of the day.
>I have no direct knowledge, but... many years ago (mid-70's if I remember
>right) one of the hobby electronics mags (I think it was Radio-Electronics) had
>an article for a frequency standard derived from a color tv. Soon afterward a
>letter appeared in the letter column (where else :-), written by an engineer at
>one of the better-equipped stations in L.A. He stated that even network-
>supplied programs taken from a live feed usually go through a time-base
>corrector at the local station, and that this breaks the "chain of
>traceability" back to the network's precision frequency standard.
That's correct, though the device is actually called a frame synchronizer.
The broadcast subcarrier is referenced to the station master sync generator,
and that's usually a simple crystal controlled oscillator. The FCC tolerance
on subcarrier is +/- 10 Hz so a crystal reference is good enough.
>(of course, anything that the local taped from a satellite feed for broadcast
>later is completely divorced from any standards at the network.)
>
>Also, at that time it was stated that the networks used rubidium-clock
>frequency standards, which are secondary standards: They're awfully good but
>they still have to be calibrated against something better. NIST (the folks who
>run WWV) uses cesium-beam clocks, which are primary standards, needing no
>calibration for frequency. Have the networks since upgraded to cesium-beam
>clocks? And, given that the local stations probably haven't, does it matter
>anyway? Even if they have, they're still "only" as good as NIST's clocks, so
>why should one over-the-air signal be better than another? (propagation
>changes on shortwave, maybe?)
The networks have abandoned the rubidium references and use crystal
oscillators today, just like the local stations. With the change from
telco microwave distribution to satellite distribution, there is enough
doppler that a tight reference is worthless anyway. Geosync satellites
really aren't precisely geosync. They describe small figure 8s in their
position boxes, and this introduces enough phase variation through path
length changes, and enough frequency error through doppler, that you can
watch the subcarrier vector rotate one way then the other if you reference
the vectorscope to the uplink signal while watching the downlink signal.
There's an error of several hertz either way that varies over the course of
the day.
Gary
--
Gary Coffman KE4ZV | You make it, | gatech!wa4mei!ke4zv!gary
Destructive Testing Systems | we break it. | uunet!rsiatl!ke4zv!gary
534 Shannon Way | Guaranteed! | emory!kd4nc!ke4zv!gary
Lawrenceville, GA 30244 | |
# In a discussion of use of color burst signal as a timebase, Jamie Hanrahan
# <j...@cmkrnl.com> wrote:
# >
# >Also, at that time it was stated that the networks used rubidium-clock
# >frequency standards, which are secondary standards: They're awfully good but
# >they still have to be calibrated against something better. NIST (the folks who
# >run WWV) uses cesium-beam clocks, which are primary standards, needing no
# >calibration for frequency. Have the networks since upgraded to cesium-beam
# >clocks? And, given that the local stations probably haven't, does it matter
# >anyway? Even if they have, they're still "only" as good as NIST's clocks, so
# >why should one over-the-air signal be better than another? (propagation
# >changes on shortwave, maybe?)
#
# From what I remember, a Rubidium clock has better short term accuracy than
# a Cesium clock. The Cesium one is more accurate in the long term. So a
# Rubidium clock that is compared with a cesium one, say using phase
# comparison with WWVB, is as accurate as these clocks get - about 1
# part in 10^12.
Use of a time base with this sort of accuracy was necessary back in the pre-
frame shaker (synchronizer) days, and short term accuracy would have been a
primary goal, but with modern hardware (along with the profound cheap-
skatedness that's become so much a part of network television these days)
I'd be surprised if anyone's off-the-air subcarrier approached this kind
of performance. The FCC says plus or minus 10 Hz is OK here.
Billy Y..
Nope. With satellite delivery of all network programming, all local stations
run the satellite-received signal through their FrameStore TBC.
>Also, at that time it was stated that the networks used rubidium-clock
>frequency standards, which are secondary standards: They're awfully good but
>they still have to be calibrated against something better. NIST (the folks
who
>run WWV) uses cesium-beam clocks, which are primary standards, needing no
>calibration for frequency. Have the networks since upgraded to cesium-beam
>clocks?
In the days before satellite distribution and frame synchronizers the
networks did provide a good frequency reference. NBS (which later became
NIST) even published pamphlets showing simple modifications to a TV set
and monthly corrections to the three network standards. The reason you
could calibrate to a given level of accuracy quicker using color subcarriers
than using WWV had nothing to do with the frequency standards used. It
was a result of the errors introduced by HF propagation of WWV. (You're
doing quite well if you can get a part in 10^7 using HF WWV.) As the path
length changes (which it does constantly) phase errors are introduced.
Using WWVB at 60 kHz improves things by a couple of orders of magnitude,
but it takes a good local standard and about 24 hours of averaging to
start getting close to the accuracy of the transmitted frequencies. The
network color frequencies cut this down to about fifteen minutes.
Alas, no more.
--
Gordon R. Smith, K7HFV gsm...@rahul.net
Salt Lake City, Utah
>The problem with keeping NTSC video at the same clock rate(s) as its
>monochrome predecessor concerns what happens when it's actually broadcast
>(sent out over the air). Harmonics of the horizontal drive and of the
>color subcarrier fell close enough to each other to cause interference.
This was corrected by having the ratio between the color subcarrier and
the horizontal sync at 455/2 or 227.5. This does not explain the choice
of 3.5795454545... MHz (which BTW is defined as 5*63/88 MHz). The subcarrier
could have been 3.583125 MHz exactly and this problem would be solved.
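Both numbers check out in Python (my check, nothing more):

    print(5e6 * 63 / 88)  # 3579545.4545... Hz, the actual subcarrier
    print(227.5 * 15750)  # 3583125.0 Hz -- what the subcarrier would have been
                          # had the 15750 Hz monochrome line rate been kept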
>The vertical drive was changed from 60 Hz to 59.94 Hz (0.1% less) to place
>these harmonics exactly in between each other. This makes the frame rate
>29.97 Hz. Unfortunately, when there is hum in the video it now crawls
>slowly upward (8.3 seconds from bottom to top), and thus it is
>more noticeable.
Again, the solution was in choosing the right ratios.
--
Phil Howard, KA9WGN | Don't put off until next month...
<p...@netcom.com> | ...your right to buy ONE HANDGUN this month!
> Also, at that time it was stated that the networks used rubidium-clock
> frequency standards, which are secondary standards: They're awfully good but
> they still have to be calibrated against something better. NIST (the folks who
> run WWV) uses cesium-beam clocks, which are primary standards, needing no
> calibration for frequency. Have the networks since upgraded to cesium-beam
> clocks? And, given that the local stations probably haven't, does it matter
> anyway? Even if they have, they're still "only" as good as NIST's clocks, so
> why should one over-the-air signal be better than another? (propagation
> changes on shortwave, maybe?)
>
Colorburst transmit frequency is required to be +/- 10 Hz (a 20 Hz window,
about 5.6E-6 or 5.6 ppm of the 3.58 MHz subcarrier; ppm means parts per
million).
This requires a good ovenized oscillator (and that isn't cheap). Rubidium
oscillators go for about $20,000 I think. Cesium beam clocks are > $200,000.
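The conversion is one line of Python (my arithmetic):

    f_sc = 5e6 * 63 / 88    # 3.5795 MHz subcarrier
    print(10 / f_sc * 1e6)  # ~2.79 ppm each way, ~5.6 ppm across the window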
The clocks that NIST uses are the best in the world. They have about 10 of them
that are all averaged together.
WWV, however, loses a lot to its method of transmission and to propagation
effects. Received accuracy (if you have a stable enough PLL to track it
without further loss of accuracy) is about 1E-7 (0.1 ppm) for frequency and
1 ms for timing.
Even to keep this accuracy would cost you at least $1000. Stratum 3
oscillators used in non-central office telephone equipment are 4.7 ppm and
cost at least $2000.
Since the colorburst crystal in your TV is > 100ppm, any PLL that uses that
crystal to lock onto an external source cannot be any better than 100ppm.
WWV is not the problem. Unless you are in TV broadcasting I don't think you
would need better than WWV accuracy.
Incidentally, NIST was working on a computer system where you could request
time and frequency by modem. It would figure out the delay of the telephone
path and compensate for it. Some manufacturers (True Time was one of them)
were working on clock sources that are locked to GPS (the Global Positioning
System) with Stratum 3 accuracy and better timing accuracy.
-Cliff
---
-----------------------------------------------------------------------
Clifton Powers Let No Man Have Two Fibers Until
pow...@aur.alcatel.com all Men Have One.
Alcatel Network Systems -Steven Hornung, BTRL, July 1987
Video Technology Where's my Fiber?
-----------------------------------------------------------------------
>Colorburst transmit frequency is required to
>be +/- 10 Hz (a 20 Hz window, about 5.6 ppm of the 3.58 MHz subcarrier;
>ppm means parts per million).
>This requires a good ovenized oscillator (and that isn't cheap).
No it doesn't. Any decent AT cut crystal oscillator will hold +/- 5 ppm over
any reasonable indoor temperature range (10 to 50 degrees C).
That ought to run you about 10 bucks.
> Rubidium Oscillators
>go for about $20,000 I think. Cesium Beam clocks are > $200,000.
Rubidium standards start at less than $2000.
Cesium standards cost from $30,000 to $70,000.
> Stratum 3 oscillators used in non-central office telephone
> equipment are 4.7ppm and cost at least $2000.
No they don't. Stratum *2* oscillators are the ones that cost around $2000.
And they are far more accurate than 4.7 ppm.
>and compensate for it. Some manufacturers (True Time
>was one of them) were working on clock sources that are locked to
>GPS (the Global Positioning System)
>with Stratum 3 accuracy and better timing accuracy.
A good GPS system should be able to achieve
Stratum 1 accuracy (1 part in 10^11),
not just Stratum 3 accuracy (worse than 1 part in 10^6).
Rick Karlquist N6RK
Precision Time and Frequency R&D
HP Santa Clara Division
rka...@scd.hp.com
: This was corrected by having the ratio between the color subcarrier and
: the horizontal sync at 455/2 or 227.5. This does not explain the choice
: of 3.5795454545... MHz (which BTW is defined as 5*63/88 MHz). The subcarrier
: could have been 3.583125 MHz exactly and this problem would be solved.
: --
: Phil Howard, KA9WGN | Don't put off until next month...
: <p...@netcom.com> | ...your right to buy ONE HANDGUN this month!
5 * 63 / 88 - How about that!
This has been running for some years now. Last I knew, the phone number was
1-303-494-4774 . ... still seems to be. It answers at 1200 or 300 bps only.
The delay compensation works if your computer echoes all characters sent by
NIST. With most modems, synchronization should be possible to within about
+/- 10 msec of the correct time. No frequency standard is provided by this
service.
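The delay compensation is the same symmetric-path idea NTP uses; here is a
toy Python sketch of the principle (an illustration only, not the actual
NIST protocol -- query_server is a hypothetical stand-in):

    import time

    def clock_offset(query_server):
        """Estimate local clock error from one round trip, assuming the
        outbound and return delays are equal."""
        t_send = time.time()
        server_time = query_server()     # server's timestamp, as a float
        t_recv = time.time()
        one_way = (t_recv - t_send) / 2  # half the measured round trip
        return (server_time + one_way) - t_recv

The NIST service measures the round trip by having your machine echo its
characters back, then advances its time code by half of it.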
I couldn't find an exact figure to back this up, but quoting "NBS Special
Publication 432" (National Bureau of Standards, which is now the
National Institute of Standards and Technology (NIST)) (#432 supersedes
publication 236) for WWV (Fort Collins, Colorado) & WWVH (Kauai,
Hawaii): "Accuracy and Stability: The time and frequency broadcasts
are controlled by the primary NBS Frequency Standard in Boulder,
Colorado. The frequencies as transmitted are accurate to within one
part in 100 billion (1x10^11) at all times. Deviations are normally
less than one part in 1000 billion (1x10^12) from day to day. However,
changes in propagation medium (causing Doppler effect, diurnal shifts,
etc.) result in fluctuations in the carrier frequencies AS RECEIVED by
the user that may be much greater than the uncertainty described above.".
A binary coded decimal (BCD) time code is also transmitted on a 100Hz
subcarrier by WWV & WWVH.
The audio portion of these broadcasts can be heard via telephone (but
not the RF carriers). At (303)499-7111 for WWV and (808)335-4363 for WWVH.
The accuracy received anywhere in the contiguous 48 states is 30ms or
better.
Now if you want real accuracy ( :-) ), try WWVB (Fort Collins,
Colorado). This is a BCD time code only (1 bit per second!) on a
60 kHz radio carrier. And to quote publication 432 again: "The frequency
of WWVB is normally within its prescribed value to better than 1 part in
100 billion (1x10^11). Deviations from day to day are less than 5 parts
in 1000 billion (5x10^12). Effects of the propagation medium on received
signals are relatively minor at low frequencies; therefore, frequency
comparisons to better than 1 part in 10^11 are possible using appropriate
receiving and averaging techniques.".
From publication 432: "Frequency Calibration Service Using Network
Television: For those users who require only frequency calibrations,
an alternative to the radio broadcasts is available. This service provides
a means of calibrating oscillators traceable to NBS. It gives the user
the option of calibrating his oscillator quickly at very low cost, with
modest accuracy, or of expending more time and money for higher accuracy.".
"The service is very reliable because the networks use extremely
stable rubidium or cesium oscillators to generate the 3.58 MHz color
subcarrier frequency which is transmitted with all color programs. The
color signal is then used as a transfer standard. Any oscillator that
has a frequency of 10/N MHz, where N is any integer from 1 to 100, can
be calibrated.".
"If a user wants to make a calibration, he compares the color signal
coming from the network centers in New York City (or Los Angeles for
those on the West Coast) with his local oscillator. NBS monitors the same
network signals and publishes the difference between the network oscillators
and the NBS Frequency Standard in the monthly NBS Time and Frequency
Services Bulletin. A user then knows two things: (1) the difference
between his oscillator and the network oscillators (by measurement)
and (2) the difference between the networks and NBS (by publication).
With this information, he can easily compute the difference between
his oscillator and NBS. Thus, his calibration is traceable to the NBS
Frequency Standard.".
"NBS has developed two methods for making these frequency calibrations.
Equipment is commercially available for both methods.".
"Color Bar Comparator Method: The color bar comparator is a simple
circuit that connects to a standard color television set. It produces
a colored bar on the screen that changes color or moves across the
screen at a rate that depends on the frequency difference between the
user's oscillator and the TV network signal. By timing these changes
with a stopwatch and referring to the data published by NBS, an oscillator
can be rapidly calibrated to an accuracy of 1 part in 1 billion (1x10^9).".
"Digital Offset Computer Method: The second method, using a digital
offset computer, provides an automatic means of calibrating high-
quality crystal or atomic oscillators. It compares a signal from the
user's oscillator with the TV color signal and displays the frequency
difference on the TV screen as parts in 100 billion (parts in 10^11).
If measurements are averaged over about 15 minutes, a calibration
accuracy of one part in 100 billion can usually be achieved.".
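The stopwatch arithmetic behind the color bar method is simple (my own
formulation of the idea, not NBS's exact procedure): if the bar pattern
takes T seconds to cycle once, your oscillator is off by one subcarrier
cycle in T seconds.

    f_ref = 5e6 * 63 / 88   # network-referenced subcarrier, Hz
    T = 300.0               # say the bar cycles once in 300 seconds
    print(1 / (T * f_ref))  # ~9.3e-10, about 1 part in 10^9

That is right at the "1 part in 1 billion" the publication quotes; the
digital offset computer just automates the comparison and averages longer.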
NBS time can also be received via the GOES (Geostationary Operational
Environmental Satellite) satellites of the National Oceanic and Atmospheric
Administration (NOAA).
> Even to keep this accuracy would cost you at least $1000. Stratum 3
> oscillators used in non-central office telephone equipment are 4.7 ppm and
> cost at least $2000.
>
> Since the colorburst crystal in your TV is > 100ppm, any PLL that uses that
> crystal to lock onto an external source cannot be any better than 100ppm.
I am building a self-resetting clock which uses WWVB and should have
an accuracy of around 1 ms (which is far better than a clock needs :-) ).
I have a current version which has been running for over 10 years and which,
as I remember, is accurate to around 1/256 of a second. So the important
factor for accuracy while NOT receiving the WWVB signal is the STABILITY
of the crystal (not the frequency accuracy).
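The free-run error is easy to bound in Python (the 2 ppm crystal figure is
my assumption, for illustration only):

    crystal_ppm = 2.0   # assumed free-run stability of the crystal
    hours = 24          # time between WWVB resets
    print(crystal_ppm * 1e-6 * hours * 3600)  # ~0.17 s accumulated per day

So holding 1/256 s (about 4 ms) takes either a much more stable crystal or
resets a good deal more often than once a day.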
For more information on receiving WWVB, see Don Lancaster's articles
in Radio Electronics: July 1972 (pages 54-58), August 1972 (pages 60-62),
August 1973 (pages 48-51 - this starts the WWVB part), and September
1973 (pages 98,101). I only used the design for the receiving antenna
and preamp (which are not my areas of expertise) and threw out the rest
of the design and replaced it with a microprocessor (note the dates on
the articles :-) ). I am lucky that I live in Colorado, so I didn't need
to use any of the fancier receiving methods which Don Lancaster describes
in his articles.
> Incidentally, NIST was working on a computer system where you could request
> time and frequency by modem. It would figure out the delay of the telephone
> path and compensate for it. Some manufacturers (True Time was one of them)
(303)494-4774 (1200 baud, 8 data bits, 1 stop bit, no parity).
Did everybody make it this far? :-)
-------------------------------------------------------------------------------
| Neil Collins coll...@spot.colorado.edu |
-------------------------------------------------------------------------------
Unfortunately publication 432 is out of date here. The nets don't
work this way anymore, and haven't since the late 1970s, as has been
noted previously in this thread. Using broadcast colorburst will only
give you a reference that's the local crystal oscillator at the local
broadcast outlet. It's likely no more accurate than your own crystal
oscillator, about 4 ppm. NIST should circulate a retraction of this
technique because it's likely still misleading folks. (They may have
issued one, but I haven't seen it.)
See NIST Special Publication 432 (Revised 1990); the inside cover
says "Supersedes NBS Spec. Publ. 432 dated September 1979."
Oh, I can't find any mention of TV color signal calibration at all
there. See the 1979 version (NIST was NBS then), pages 10 & 11,
section 7, titled "Digital Frame Synchronizers and Television
Techniques".
This section explains that a frame synchronizer replaces the
original accuracy of the network feed with the accuracy of the
local station reference. It also says that only a few stations
have them, but this will change if the cost of frame
synchronizers drops...
I'd guess a bit of RAM and some timing circuits cost a bit less
now than in 1979. So this implies that TV color carriers are
useless now as a frequency standard. Which probably explains why
this information was dropped from Pub 432.
You're right, the new Pub 432 should probably have made a point
of "retracting" this bit of information. Instead they just
dropped all mention of it.
PS: The old Pub 432 does mention that there may be a special case
where good color references may be available. The old pub
claims that ABC stations in Los Angeles and New York are
co-located with their network studios and use the network
cesium references. Anyone know if this is still true? Or
has their cesium reference outlived its lifetime and not
been replaced?
The Hewlett-Packard Cesium Beam Standard is US$46,000, plus $5,000 for the
clock display and standby battery (sheesh) and $8,500 for the high-performance
cesium beam tube (improves accuracy from +/- 3E-12 to +/- 2E-12, and
improves short-term stability). I know this because, as a sufferer from
compulsive time fetishism, it's on my shopping list for when I win the
lottery. The 20ms accuracy of my Heathkit clock is OK for now, but I'd
much rather KNOW what time it is than have WWV TELL me what time it is.
Their rubidium standard is US$42,000, plus about $10,000 in accessories that
I couldn't live without. Its short-term stability is about 10 times better
than the cesium beam standard (5e-13 over 100s, as opposed to 5e-12 over 1s).
It's just the thing for netting parties, so maybe this could be a club
purchase.
Just for comparison, they also have a quartz frequency standard for $9,500.
Its short-term stability is about two orders of magnitude worse than that
of the cesium beam standard, but of course its accuracy depends on the
standard with which it's calibrated.
The quartz standard also claims very high spectral purity, saying that
spectra less than 1 Hz wide can be obtained when the 5 MHz output of the
standard is multiplied to 10 GHz. Perhaps that poor New York repeater
owner with the 243 MHz spur should consider one of these :-).
-- Bruce Toback
Internet: bto...@netcom.com
Packet: kn...@kc7y.az.usa.na
My PAYING daytime job is as a design engineer for ABC New York. WABC TV is
adjacent to the network facilities, and I believe they get their sync reference
from the network. They did the last time I looked, which was some years ago.
I'll post here if this is untrue. ABC Network uses a Rubidium standard for
sync, BUT, the backup, if the Rubidium croaks, is the crystal oscillator in the
GVG sync generator which is normally locked to the Rubidium. Unfortunately,
the casual viewer has no way of knowing if the Rubidium reference is up.
Just about every little two-bit local station has frame synchronizers these
days; I think the least expensive ones are only a few thousand dollars now.
If so, their colorburst frequency is set by their local reference, and not
the network. Also some cable systems use processing which destroys the burst
integrity. Therefore, don't count on this way of calibrating your frequency
counter....
We used to have a Cesium standard on loan from NBS, backed up by TWO Rubidiums,
which we owned as the reference. After the widespread use of frame syncs, the
NBS took their cesium back, and finally the Rubidiums wore out. We talked
about just using the crystal oscillator in the GVG sync generator as the
reference all the time, and in fact have run that way from time to time for
several days to weeks, but we also use the sync as the reference for our clocks
and time-of-day time code (also encoded in the vertical interval and used by
Nielsen for ratings measurements), so we decided to spring for the more
accurate Rubidium standard, and bought a new one.
Interestingly, another inaccuracy (really a jitter of sorts) in the frequency
of the colorburst can be caused by doppler effects as the satellite
drifts slightly in its orbital position. Almost all TV signals are
transmitted to affiliates by satellite these days. Some day, this may migrate
more to fiber, but for now it's mostly satellite.
John
--
*******************************************************************************
John H. Schmidt, P.E. |Internet: sch...@auvax1.adelphi.edu
Technical Director, WBAU |Phone--Days (212)456-4218
Adelphi University | Evenings (516)877-6400
Garden City, New York 11530 |Fax-------------(212)456-2424
*******************************************************************************
: My PAYING daytime job is as a design engineer for ABC New York. WABC TV is
: adjacent to the network facilities, and I believe they get their sync reference
: from the network. They did the last time I looked, which was some years ago.
: I'll post here if this is untrue. ABC Network uses a Rubidium standard for
: sync, BUT, the backup, if the Rubidium croaks, is the crystal oscillator in the
: GVG sync generator which is normally locked to the Rubidium. Unfortunately,
: the casual viewer has no way of knowing if the Rubidium reference is up.
: Just about every little two-bit local station has frame synchronizers these
: days; I think the least expensive ones are only a few thousand dollars now.
: If so,
: their colorburst frequency is set by their local reference, and not the
: network. Also some cable systems use processing which destroys the burst
: integrity. therefore, don't count on this way of calibrating your frequency
: counter....
That brings up another interesting question:
Are these frame synchronizers located in the signal path such that
they are always inline? Assuming the answer is yes, this means that
*everything* passes through them. What type of video codecs do these
devices employ? Are their effects visible enough so that we vidiots
with our 32" monitors would be able to see their nasty artifact trails?
In other words, what sort of digitization of video is going on in these
frame synch boxes? As good as D2? Almost D1?
Yes there are frame syncs inline all the time. Sometimes there are
multiple frame syncs in the chain. You can sometimes notice that
lip sync is slightly off because of the multiple field delays through
the frame syncs. The typical frame sync samples composite video with
8 bit samples at 4X subcarrier. There are visible artifacts in some
pictures. D2 is no better. In fact it's usually worse because of the
error masking it uses to make up for tape dropouts. FEC fixes a lot of
them, but not all. D1 is considerably better because it stores component
video streams on separate tracks, but it is confined almost exclusively
to post rooms and the end product is transferred to composite format for
intermediate storage and playback. Then the whole picture is crammed
through a 10 year old transmitter with aging tubes. All your 32 inch
monitor does is give you a *bigger* look at all the various transmission
errors. NTSC looks best, by design, on a 19 inch screen that doesn't have
too much resolution.
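Gary's sampling numbers pin down how much memory a frame sync needs; a quick
Python check (my arithmetic):

    f_sc = 5e6 * 63 / 88    # subcarrier, Hz
    fs = 4 * f_sc           # 14318181.8 Hz sample clock (4x subcarrier)
    f_frame = 30000 / 1001  # 29.97003 Hz
    samples = fs / f_frame
    print(samples)          # 477750.0 (= 910 samples/line x 525 lines)
    print(samples / 1024)   # ~467 KB at 8 bits per sample

Which also suggests why frame synchronizers got cheap once half a megabyte
of RAM stopped being exotic.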
: Yes there are frame syncs inline all the time. Sometimes there are
: multiple frame syncs in the chain. You can sometimes notice that
: lip sync is slightly off because of the multiple field delays through
: the frame syncs. The typical frame sync samples composite video with
: 8 bit samples at 4X subcarrier. There are visible artifacts in some
: pictures. D2 is no better. In fact it's usually worse because of the
: error masking it uses to make up for tape dropouts. FEC fixes a lot of
: them, but not all. D1 is considerably better because it stores component
: video streams on separate tracks, but it is confined almost exclusively
: to post rooms and the end product is transferred to composite format for
: intermediate storage and playback. Then the whole picture is crammed
: through a 10 year old transmitter with aging tubes. All your 32 inch
: monitor does is give you a *bigger* look at all the various transmission
: errors. NTSC looks best, by design, on a 19 inch screen that doesn't have
: too much resolution.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Amen! -
It looks even better, perhaps best, on a properly set-up 13 inch
broadcast monitor.
I wish the folks creating the letter-box movies with the 20:5 aspect
ratio had a better understanding of what NTSC is and is not.
I see these things and think my deflection circuitry has crapped out!
I tried J&R Music World; they list an AIWA for $380 and another for $500.
B&H has some professional stuff for $1850 (too much for me).
Also, where is the FAQ for this newsgroup?
Thanks,
Navaneeth
--
-----------------------------------------------------------------------------
Navaneeth Chakravarti, x456 Cambridge Technology Partners
e-mail: nc...@ctp.com Voice: (517) 372-8400
chak...@egr.msu.edu (617) 374-8456
Please remember that the TV you have has to play what the video (VCR) puts
out. That means that you either have to have a multi-system TV, the video
has to "convert" its output to one of the systems your TV supports, or it
has to put out the video standard of that country and you obtain a TV set
there.
I ASSUME that you know this, but considering how that word breaks down, I
want to be sure you knew.
I have seen people coming here and making these mistakes.
Shalom from Jerusalem,
Azriel 4X1PI
: Deviations from day to day are less than 5 parts
: in 1000 billion (5x10^12).
5 parts in 1000 billion is not 1 part in 5x10^12, it's
5 parts in 10^12, or one part in 2x10^11.