
Why 59.94 Hz?

Bob Myers

Dec 4, 1993, 6:14:40 PM
I recently received some mail asking where the NTSC 59.94 Hz field rate
came from in the first place. Thinking that this might be a topic
of general interest, I've decided to post a short discussion of this
here - hope no one minds!

Before the NTSC color encoding system was added to the U.S. TV standard,
television WAS at 60.00 Hz; it was set at this rate to match the power
line frequency, since this would make interference from local AC sources
less objectionable (the "hum bars" would be stable in the displayed
image, or - if the TV rate wasn't exactly locked to the line - at least
would move very slowly). Actually, in some early systems, the TV vertical
rate WAS locked to the AC mains!

A problem came up, though, when trying to add the color information. The
FCC had already determined that it wanted a color standard which was fully
compatible with the earlier black-and-white standard (there were already
a lot of TV sets in use, and the FCC didn't want to obsolete these and
anger a lot of consumers just to add color!) Several schemes were proposed,
but what was finally selected was a modification of a pixel-sequential
system proposed by RCA. In this new "NTSC" (National Television System
Committee) proposal, the existing black-and-white video signal would
continue to provide "luminance" information, and two new signals would be
added so that the red, green, and blue color signals could be derived
from these and the luminance. (Luminance can be considered the weighted
sum of R, G, and B - in the NTSC weighting, Y = 0.299R + 0.587G + 0.114B -
so only two more signals are needed to provide sufficient information
to recover full color.) Unfortunately, there was not enough
bandwidth in the 6 MHz TV channels (which were already allocated) to add
in this new information and keep it completely separate from the existing
audio and video signals. The possibility of interference with the audio
was the biggest problem; the video signal already took up the lion's
share of the channel, and it was clear that the new signal would be placed
closer to the upper end of the channel (the luminance signal is a
vestigial-sideband AM signal, with the low-frequency information located
close to the bottom of the channel; the audio is FM, with the audio carrier
4.5 MHz up).
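To put some numbers on that layout, here's a quick sketch in Python
(anachronistic for the 1950s, obviously, but handy for seeing where
things sit; the 1.25 MHz picture-carrier offset is the standard U.S.
value, and the chroma subcarrier figure is the one derived at the end
of this post):

    # Carrier positions within a 6 MHz U.S. TV channel, in MHz above
    # the lower channel edge.
    PICTURE = 1.25                # vestigial-sideband AM luminance carrier
    CHROMA = PICTURE + 3.579545   # color subcarrier (derived below)
    SOUND = PICTURE + 4.5         # FM audio carrier, 4.5 MHz above picture

    for name, f in (("picture", PICTURE), ("chroma", CHROMA),
                    ("sound", SOUND)):
        print("%-8s carrier: %.6f MHz above channel bottom" % (name, f))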

Due to the way amplitude modulation works, both the luminance and
the color ("chrominance") signals tend to appear, in the frequency domain
(what you see on a spectrum analyzer), as a sort of "picket fence" pattern.
The pickets are located at multiples of the line rate up and down from the
carrier for these signals. This meant that, if the carrier frequencies were
chosen properly, it would be possible to interleave the pickets so that
the luminance and chrominance signals would not interfere with one another
(or at least, not much; they could be separated by using a "comb filter",
which is simply a filter whose characteristic is also a "picket fence"
frequency spectrum). To do this, the color subcarrier needed to be at
an odd multiple of one-half the video line rate. So far, none of this
required a change in the vertical rate.
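A quick numerical way to see the interleaving (a Python sketch - the
rates are the final NTSC values derived below, used here just for
illustration): the luminance "pickets" sit at integer multiples of
the line rate, so a subcarrier at an odd multiple of half the line
rate lands exactly halfway between two of them.

    import math

    f_line = 4500000.0 / 286   # NTSC line rate, Hz (derived below)
    f_sc = 455 * f_line / 2    # subcarrier: odd multiple (455) of f_line/2

    n = math.floor(f_sc / f_line)   # luma picket just below the subcarrier
    print("subcarrier : %.1f Hz" % f_sc)
    print("gap below  : %.1f Hz" % (f_sc - n * f_line))
    print("gap above  : %.1f Hz" % ((n + 1) * f_line - f_sc))
    print("f_line / 2 : %.1f Hz" % (f_line / 2))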
But it was also clearly
desirable to minimize interference between the new chroma signal and
the audio (which, as mentioned, is an FM signal with a carrier at 4.5 MHz
and 25 kHz deviation). FM signals also have sidebands (which is what made
the "picket fence" pattern in the video signals), but the mathematical
representation isn't nearly as clean as it is for AM. Suffice it to
say that it was determined that to minimize chroma/audio mutual interference,
the NTSC line and frame rates could either be dropped by a factor of
1000/1001, or the frequency of the audio carrier could be moved UP a like
amount. There's been (and was then) a lot of debate about which was the
better choice, but we're stuck with the decision made at the time - to move
the line and field/frame rates. This was believed to have less impact on
existing receivers than a change in the audio carrier would.
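Where does the 1000/1001 figure itself come from? Keeping the sound
carrier at 4.5 MHz while making it an exact integer multiple (286) of
the new line rate - so the chroma/sound beat would interleave as well -
forces exactly that shift. A sketch of the arithmetic (Python's
Fraction type keeps the ratios exact; the variable names are mine):

    from fractions import Fraction

    f_sound = 4500000              # audio carrier offset, Hz (held fixed)
    f_line_bw = Fraction(15750)    # black-and-white line rate, Hz

    m = round(f_sound / f_line_bw)       # 4.5e6/15750 = 285.71..., so m = 286
    f_line_color = Fraction(f_sound, m)  # new line rate: 4.5 MHz / 286

    print(m)                             # 286
    print(f_line_color / f_line_bw)      # 1000/1001
    print(float(f_line_color))           # 15734.265734...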

So, now we can do the math.

We want a 525 line interlaced system with a 60 Hz field rate.

525/2 = 262.5 lines/field. 262.5 x 60 Hz = 15,750 Hz line rate.

This is the rate of the original U.S. black-and-white standard.

We want to place the color subcarrier at an odd multiple of 1/2
the line rate. For technical reasons, we also want this multiple to
be a number which is fairly easy to generate from products of small
factors. 455 (= 5 x 7 x 13) was selected, and

15,750 Hz x 455/2 = 3.583125 MHz

This would've been the color subcarrier frequency, but now we apply the
1000/1001 correction to avoid interference with the audio:

60 x 1000/1001 = 59.94005994005994.......

The above relationships still apply, though:

262.5 x 59.94... = 15,734.265+ Hz

15,734.265+ x 455/2 = 3.579545+ MHz

And so we now have derived all of the rates used in the current standard.
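
If you'd like to check the whole chain at once, here's a short
verification script (Python again; Fraction keeps the ratios exact,
and the names are mine):

    from fractions import Fraction

    lines_per_field = Fraction(525, 2)        # 262.5 (interlaced)
    f_line_bw = lines_per_field * 60          # 15,750 Hz (B&W standard)
    f_sc_bw = f_line_bw * Fraction(455, 2)    # 3,583,125 Hz (uncorrected)

    k = Fraction(1000, 1001)                  # the NTSC correction
    field_rate = 60 * k                       # 59.94005994... Hz
    f_line = lines_per_field * field_rate     # 15,734.2657... Hz
    f_sc = f_line * Fraction(455, 2)          # 3,579,545.4545... Hz

    print("field rate : %.10f Hz" % float(field_rate))
    print("line rate  : %.4f Hz" % float(f_line))
    print("subcarrier : %.4f Hz" % float(f_sc))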


Bob Myers KC0EW Hewlett-Packard Co. |Opinions expressed here are not
Advanced Systems Div. |those of my employer or any other
my...@fc.hp.com Fort Collins, Colorado |sentient life-form on this planet.
