
The suitability of RGB


Great Hierophant

Jul 27, 2003, 5:09:37 AM
To many, RGB is the holy grail of image quality when it comes to classic gaming.
It provides the best image quality of any color encoding system. It does not
rely on approximations of chrominance and luminance, nor does it compress the
signal and recombine it algebraically later. However, in the quest for RGB
there are many unfortunate drawbacks. First and most important is the
horizontal scan rate. As I write this, I am looking at an RGB capable
monitor. A VGA monitor uses RGB signals to designate the color. But the
drawback is that the monitor is not capable of horizontal scan frequencies
lower than 30kHz, and the best computer monitors can often go as high as
130-140kHz. An RGB monitor suitable for classic gaming purposes must have a
horizontal scan frequency of roughly 15-16kHz. I see no inherent difficulty in
constructing a monitor that covers horizontal scan frequencies from 15.75kHz
to 130kHz. Of course, there is a great difficulty in PC hardware, which will
double-scan low resolutions and/or squeeze them into a 480-line resolution.
It turns out that classic gaming video does not look its best this way. A
non-VGA RGB monitor will not do this. Instead, it will display progressive
video at 50-60 fields per second for lower resolutions or interlace for higher
resolutions, the resolution cutoff depending on the horizontal scan frequency
of that monitor. Europeans who have SCART capable TVs do not have this
problem. They have their own problems, which involve ports for 50Hz PAL
systems. The US has the problem of finding good low horizontal scan rate RGB
monitors and RGB capable consoles.
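
To make the numbers concrete, here is that scan rate arithmetic as a quick
Python sketch (the NTSC figures are the standard published ones, not
measurements of any particular monitor):

ntsc_h = 15734.26    # Hz, NTSC horizontal scan rate (~15.75kHz)
vga_h = 2 * ntsc_h   # Hz, baseline VGA is exactly double (~31.5kHz)

for name, h in (("15kHz RGB", ntsc_h), ("VGA", vga_h)):
    lines = h / 59.94        # lines drawn per 59.94Hz field
    period_us = 1e6 / h      # time spent on each scanline, in microseconds
    print(f"{name}: {lines:.1f} lines/field, {period_us:.1f}us per line")

# 15kHz RGB: 262.5 lines/field, 63.6us per line
# VGA: 525.0 lines/field, 31.8us per line -- hence the double scanning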

If a console natively outputs an RGB signal, then that is the signal everyone
should always seek to output to their display device of choice. Every console
since the Sega Genesis does so in one way or another. [Any console that can do
VGA output (Dreamcast, Gamecube) supersedes this when it can do it.] Of the
earlier systems, the Sega Master System can do RGB, as can the Colecovision with
a hardware modification. The Atari **00 systems do not output in RGB but in
chroma/luma values, NTSC or PAL color signals. So does the NES, and swapping
its PPU for the Playchoice 10's RGB PPU will lead to palette differences and
some graphical glitches. In these earlier systems, converting to RGB will lead to
loss of the classic feel, abandonment of the methods employed by the programmers
to take advantage of the NTSC system, and inaccuracies. In essence, any console
that can should, and those that cannot shouldn't. What should be done is that
classic consoles should be upgraded not to RGB but to composite video out, so as
to avoid RF hell. That way, the signal will be good quality without sacrificing
the basic principles of classic console video production.

GH



Halo-

Jul 27, 2003, 10:11:38 AM
On Sun, 27 Jul 2003 05:09:37 -0400, "Great Hierophant"
<great_hi...@hotmail.com> wrote:

> Europeans who have SCART capable TVs
>do not have this problem. They have their own problems, which involve ports for
>50Hz PAL systems.

Most European TVs support RGB SCART at both 50Hz and 60Hz, and as the
PAL colour coding does not affect RGB, RGB SCART output is ideal.
We can use US games with an RGB SCART and it'll work fine.

>In these earlier systems, converting to RGB will lead to
>loss of the classic feel, abandonment of the methods employed by the programmers
>to take advantage of the NTSC system, and inaccuracies. In essence, any console
>that can should, and those that cannot shouldn't. What should be done is that
>classic consoles should be upgraded not to RGB but to composite video out, so as
>to avoid RF hell. That way, the signal will be good quality without sacrificing
>the basic principles of classic console video production.

What about the PAL system? You've stated above that games took
advantage of the NTSC system, whereas PAL, I assume, got lazy ports. This
means that all these NTSC advantages are lost. Surely if RGB is
possible on old systems it will allow PAL gamers to experience an
image _closer_ to the one originally intended on NTSC systems, and thus
is a good thing.

If you can output RGB on a console as well, it will probably allow you
to have multiregion consoles, and the problems of a PAL/NTSC cart on an
NTSC/PAL system, as on the 2600, should be gone. Can anyone confirm this?

Also, in other words, are you saying that emulation is bad
because it produces a better picture which was never intended and
thus is actually a worse image?

Clay Cowgill

Jul 27, 2003, 3:30:05 PM
"Great Hierophant" <great_hi...@hotmail.com> wrote in message
news:3f23a08a$1...@corp.newsgroups.com...

> To many, RGB is the holy grail of image quality when it comes to classic gaming.
> It provides the best image quality of any color encoding system. [...]

> An RGB monitor suitable for classic gaming purposes must have a horizontal
> scan frequency of roughly 15-16kHz.

Simple enough-- just buy a "standard res" monitor normally used for an
arcade game. Native 15.75kHz scan rate, available from 13-33" in CRT
format; some are switchable to medium res (~24kHz sweep) and VGA (~31.5kHz).
Many high-end video projectors and flat panel TV's also support 15kHz scan
rates now with RGB in. (Unlike computer monitors, video projectors and
HDTV's still need to show standard NTSC video with 63.5us lines, so keeping
the 'slow' scan rate in there is required and often it still works with the
RGB inputs.)

-Clay


Ocelot

Jul 27, 2003, 6:16:41 PM
"Great Hierophant" <great_hi...@hotmail.com> wrote in message news:<3f23a08a$1...@corp.newsgroups.com>...
> To many, RGB is the holy grail of image quality when it comes to classic gaming.
> It provides the best image quality of any color encoding system.

> If a console natively outputs an RGB signal, then that is the signal everyone
> should always seek to output to their display device of choice. Every console
> since the Sega Genesis does so in one way or another.

I have to disagree here. Some systems output RGB natively internally
but don't support it at the video connector. The N64 is a prime
example, and the RGB that you get off the internals is weak and needs
boosting.

What you are suggesting is throwing money and hardware hacking at a
problem. RGB monitors generally fall into the 'not cheap' category.
You can get a C= 1080, 1084, 1084S, or 2002 for a decent price, but
for anything larger you'd have to either use an arcade monitor (and
those don't come in nice looking cabinets unless you leave it in the
coinop) or a professional broadcast monitor (e.g. Sony PVM), which are
definitely not cheap. Importing a multistandard TV set that can do
PAL-60 is pretty cost prohibitive as well.

Before anyone starts throwing this kind of money at their gaming setup
they should look at inexpensive ways to improve picture quality with
their existing signals. I suspect that more than 95% of consumer
televisions are not properly calibrated for colour/tint,
contrast/brightness and sharpness. The increase in picture quality is
staggering once a set is calibrated.


> In these earlier systems, converting to RGB will lead to
> loss of the classic feel, abandonment of the methods employed by the programmers
> to take advantage of the NTSC system, and inaccuracies.

I've always been a big proponent of choosing your video output method
wisely, not by system but by the *game* being used. Best example --
Second or third generation Playstation 1 and Saturn games used to
'cheat' to get transparencies and fog by dithering. Under composite
and RF the muddiness and dot-crawl (and to a large extent how the TV is
calibrated) help to blend the rows of dots together and it fools your
eye into thinking it's transparent. Under RGB and Y/C (S-video) these
dots resolve themselves into just that -- rows of dots, and the effect
of transparency is lost. This is not true for *all* PS1 or Saturn
games... so I let the game choose the display (as well as the video
standard -- If I'm playing a PAL PS1 game on an NTSC PSX then RGB is
the way to go, otherwise you lose the colour signal).
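
To illustrate the blending, a little Python sketch (the intensity values are
hypothetical, not taken from any real game):

# Averaging adjacent dithered pixels -- roughly what composite smearing
# does -- lands on the intended 50% blend; a sharp RGB path keeps the dots.
fog, scenery = 200, 40            # two 8-bit intensities being "blended"
dither_row = [fog, scenery] * 4   # alternating-pixel dither pattern

# crude stand-in for composite blur: average each pixel with its neighbour
blurred = [(a + b) // 2 for a, b in zip(dither_row, dither_row[1:])]

print(dither_row)            # [200, 40, 200, 40, ...] visible dots over RGB/Y/C
print(blurred)               # [120, 120, ...] the blend the artist wanted
print((fog + scenery) // 2)  # 120, the true 50% alpha-blend target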

There is nothing inherently wrong with Y/C. It offers a quantum leap
in terms of quality over RF and Composite. The corresponding jump
from Y/C to Y/U/V (Component) and RGB is *far* less dramatic than what
you'd think, and nearly imperceptible on a monitor under 20" (and
truth be told, the only reliable way I can tell the difference between
Y/C and Y/U/V on a properly calibrated 27" monitor is with test
patterns).
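
To see why, note that Y/C and Y/U/V are just algebraic rearrangements of RGB.
A Python sketch using the generic BT.601 weights (an illustration, not any
particular console's encoder); the transform itself round-trips exactly, so
the visible differences between hookups come from how much chroma bandwidth
each connection preserves, not from the math:

def rgb_to_yuv(r, g, b):
    # luma is a weighted sum; chroma carries the scaled colour differences
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

def yuv_to_rgb(y, u, v):
    r = y + v / 0.877
    b = y + u / 0.492
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

print(yuv_to_rgb(*rgb_to_yuv(255, 128, 0)))   # ~(255.0, 128.0, 0.0)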

Then there are the things that can go wrong or complicate RGB setups.
Sync issues are a huge headache if you don't know what the monitor and
console are capable of. The PS1 doesn't output standard sync, it lets
the SCART device strip it from one of the other signals. In a homebrew
RGB setup you'll have to get the proper sync yourself. This is not a
plug-in-and-go solution for the casual gamer. S-video and a properly
calibrated television are a satisfactory compromise in this situation.

I've been down the RGB road before and it's pretty easy to be an 'RGB
snob' if you have the hardware. OTOH, if you can get 90% of the
quality that RGB gives you with an S-video hookup and a properly
calibrated display device that you can buy locally -- it's very hard
to justify paying possibly 3x as much to get that last 10% (and
arguably that last bit may not even be detectable by the average
person).


I suppose the same arguments that apply to audiophilia can apply here
-- there will always be someone who will find that something is 'not
good enough' to their discerning ear (or in our case eye).

Regards,
Rob S.

Great Hierophant

Jul 27, 2003, 8:03:31 PM

"Halo-" <mu...@nospam.freeplayuk.co.uk> wrote in message
news:gmm7ivg0655ojd89h...@4ax.com...

> On Sun, 27 Jul 2003 05:09:37 -0400, "Great Hierophant"
> <great_hi...@hotmail.com> wrote:
>
> > Europeans who have SCART capable TVs
> >do not have this problem. They have their own problems, which involve ports for
> >50Hz PAL systems.
>
> Most European TVs support RGB SCART at both 50Hz and 60Hz, and as the
> PAL colour coding does not affect RGB, RGB SCART output is ideal.
> We can use US games with an RGB SCART and it'll work fine.

That I am very glad to hear; it makes me want to seriously consider a European
TV the next time I can afford one.

> >In these earlier systems, converting to RGB will lead to
> >loss of the classic feel, abandonment of the methods employed by the programmers
> >to take advantage of the NTSC system, and inaccuracies. In essence, any console
> >that can should, and those that cannot shouldn't. What should be done is that
> >classic consoles should be upgraded not to RGB but to composite video out, so as
> >to avoid RF hell. That way, the signal will be good quality without sacrificing
> >the basic principles of classic console video production.

> What about the PAL system? You've stated above that games took
> advantage of the NTSC system, whereas PAL, I assume, got lazy ports. This
> means that all these NTSC advantages are lost. Surely if RGB is
> possible on old systems it will allow PAL gamers to experience an
> image _closer_ to the one originally intended on NTSC systems, and thus
> is a good thing.

Perhaps, but I believe that most current PAL TVs can decode an NTSC signal.
While there were many lazy ports to the PAL system, I'm sure some games took
advantage of that system's peculiarities.

> If you can output RGB on a console as well, it will probably allow you
> to have multiregion consoles, and the problems of a PAL/NTSC cart on an
> NTSC/PAL system, as on the 2600, should be gone. Can anyone confirm this?

In theory, yes. But there are many barriers, including European games designed
for PAL, region protection, and consoles with different hardware in NTSC and PAL
countries.

> Also, in other words, are you saying that emulation is bad
> because it produces a better picture which was never intended and
> thus is actually a worse image?

For some consoles, yes. For RGB capable consoles, the image, if properly
emulated, is so close to the true image displayed on an RGB monitor that arguing
over differences is not important. What becomes important in these instances is
accounting for the horizontal scan frequency. For NTSC-only consoles, which
include every important console from the Fairchild to the NES, with the
exception of the Colecovision, it produces an inaccurate picture. The early
computer systems especially suffer. The Atari 2600 sometimes does tricks that
are difficult to replicate on a VGA monitor. The NES's color palette varies
from emulator to emulator, and proper emulation of it requires exact timing
with an NTSC device.

Lately, I have been seeing very accurate palettes in NES and 2600 emulators. Of
course, accurate is a debatable characterization, because NTSC is less accurate
than PAL at reproducing colors. TVs with poorly tuned circuitry
can show something other than what was intended, as can adjustments to the color
controls of every TV. This cannot be avoided on NTSC monitors. Also, NTSC
colors tend to blend into one another, while a high resolution RGB display will
show pixels in a rigid way, no matter what color they border. (One of the great
things about CRT monitors, in my opinion.)

But concerning the physical dimensions of the frame, there are differences. A
2600 will not generally take up the whole TV screen. There will be borders, even
in NTSC, because the beam is displaying 262.5 lines at 60Hz and (accounting for
overscan, say 240 visible lines) the 2600 image usually occupies only 192 of
those lines. Horizontal blanking ensures that there will be
a sizeable left-hand border on the screen. The NES works differently. The NES
uses 240 lines, and on a perfectly flat projector screen they will all be
visible. On a CRT TV, the frame enclosing the glass obscures the top and bottom
lines, and the best result is that the top and bottom 8 lines are not visible,
for an image of 224 lines. On my excellent TV, using the NES's composite out, an
extra two or three pixels are also blotted out, and the scanlines are more
diagonal than on a good RGB monitor. As for the horizontal, the NES stretches
the image enough that it covers the whole screen, and the frame obscures the
edges of the image. It is distorted because it is running on a 4:3 TV screen
and the image is 1.06:1. But it does not look bad at all. An emulator's
stretched NES image looks disastrous on a VGA monitor. I think this is so
because the screen is merely stretching 256 pixels to 320 pixels or a multiple
thereof. Unfortunately, the screen cannot show fractions of pixels, so it looks
poor.
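
A quick Python sketch of that nearest-neighbor stretch (a toy model, not any
emulator's actual scaler) shows where the unevenness comes from: at a ratio of
320/256 = 1.25, every fourth source pixel ends up twice as wide as its
neighbors:

src_w, dst_w = 256, 320
widths = [0] * src_w
for x in range(dst_w):
    widths[x * src_w // dst_w] += 1   # which source pixel covers this column

print(widths[:8])   # [2, 1, 1, 1, 2, 1, 1, 1] -- uneven pixel widths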

Here I am at a disadvantage. I have never seen a console that can output RGB do
so through a SCART connection or to an RGB computer monitor like the Commodore
1084. For those systems, I would imagine that the viewer would have a much
better chance of seeing the whole image onscreen without it being blocked by
borders, especially if the monitor has horizontal and vertical screen size
controls. Otherwise, he will have the glass enclosure cut off some pixels, as
arcade monitors show.

The true difficulty here is as it has always been: upscanning. This is taking a
15kHz horizontal frequency (which most classic consoles use) and converting it
to the 30kHz horizontal frequency of VGA monitors. For all classic 2-D consoles,
the usual maximum resolution is 240 lines at 50 or 60Hz. For a VGA monitor, the
minimum is 480 lines at 50 or 60Hz. Most upscan converters simply do line
doubling, like emulators in full screen modes. Each line is repeated to give the
image some size; otherwise, it will be a small image in the middle of the
screen. On a 15kHz RGB monitor the beams are widened a bit, and each high
resolution interlaced frame becomes two low resolution progressive fields. In
my view, the difficulty is that both methods bring their own scanlines to the
picture. VGA has double the scanlines of RGB in fullscreen resolution mode, and
those scanlines are visible unless the screen resolution is put to 1024x768,
where the eye can no longer perceive scanlines. But at that resolution, either
the game frame is in a very small window or the pixel sizes are quadrupled! In
640x480, what emulators do to simulate the scanlines is darken every second
line to the point that the second line is not displayed. The problem here is
that while this may give an accurate picture by eliminating the doubled
scanlines, it looks nothing like what a TV or RGB monitor can produce.
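
In code terms, the two approaches look something like this -- a toy Python
sketch on made-up per-line brightness values, not any emulator's actual
filter:

lines = [100, 150, 200]   # three hypothetical 240p line intensities

# plain line doubling, as upscan converters and fullscreen emulators do
doubled = [v for v in lines for _ in range(2)]

# the emulator 'scanlines' trick: double, then darken every second line
scanlined = [v if i % 2 == 0 else v // 2 for i, v in enumerate(doubled)]

print(doubled)     # [100, 100, 150, 150, 200, 200]
print(scanlined)   # [100, 50, 150, 75, 200, 100]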

John

Jul 27, 2003, 9:10:14 PM

"Great Hierophant" <great_hi...@hotmail.com> wrote in message
news:3f24685e$1...@corp.newsgroups.com...

>
> "Halo-" <mu...@nospam.freeplayuk.co.uk> wrote in message
> news:gmm7ivg0655ojd89h...@4ax.com...
> > On Sun, 27 Jul 2003 05:09:37 -0400, "Great Hierophant"
> > <great_hi...@hotmail.com> wrote:
> >
> > > Europeans who have SCART capable TVs
> > >do not have this problem. They have their own problems, which involve
ports
> for
> > >50Hz PAL systems.
> >
> > Most European TVs support RGB SCART at both 50hz and 60hz, and as the
> > PAL colour coding does not affect RGB so RGB SCART output it is ideal.
> > We can use US games with an RGB SCART and it'll work fine.
>
> That I am very glad to hear, and makes me want to seriously consider a
European
> TV the next time I can afford one.

It also seems that the Euro TVs generally have PAL50, PAL60 and NTSC support
built in. I cannot for the life of me think why the US market knobble their
TVs so they don't accept PAL. Surely it's just a bit of decoding software on a
chip these days?


neogeoman

Jul 27, 2003, 10:49:44 PM
> Every console since the Sega Genesis does so in one way or another.

This is not true. The 3DO doesn't output RGB at all, nor does the PC-FX
or the FM Towns Marty, and most models of N64 cannot be modified (though
some Euro shops claim to offer this mod, no one's ever actually seen it).

> Simple enough-- just buy a "standard res" monitor normally used for an
> arcade game.

I have found this to be less than optimal. Arcade monitors tend to
display their best image when the input signal is exceptionally bright,
which most consoles don't do, and the result is an image with less
contrast and less vivid colour than you'd achieve on a consumer monitor.

> There is nothing inherently wrong with Y/C. It offers a quantum leap
> in terms of quality over RF and Composite. The corresponding jump
> from Y/C to Y/U/V (Component) and RGB is *far* less dramatic than what
> you'd think

This is absolutely the case. S-video allows for incredible looking
games, and if it's all you have, you shouldn't feel like you're missing
out on much. If you're making the move and need to select S-video OR
RGB, try and shoot for RGB -- or better, a solution that supports both.

Lawrence.

Great Hierophant

Jul 27, 2003, 11:39:29 PM

"Clay Cowgill" <cl...@yahoo.com> wrote in message
news:1LVUa.141858$GL4.36776@rwcrnsc53...

An interesting idea, except: should one use consoles with such display
devices? For plasma screens and rear projection screens, the answer is clearly
no, due to burn-in potential. That leaves the standard CRTs and LCD screens.
The former almost never has a VGA input, and the latter takes a digital signal.
A high end front projector is a very strange way to play a game.

Great Hierophant

Jul 27, 2003, 8:45:02 PM

"Ocelot" <rob_o...@hotmail.com> wrote in message
news:36249131.0307...@posting.google.com...

[parody mode on]

Compromise is WEAK. In the DARWINIAN STRUGGLE that is gaming, anything less than
the BEST is SLAVERY! A gamer who NICKELS AND DIMES his way through the
GREATEST activity that man has ever devised for his own amusement is neither man
nor GAMER. WOE to ye that have strayed from the TRUE FAITH. For the GAMER,
there can only be PERFECTION. Advocating anything less is HERESY.

[parody mode off]

> > In these earlier systems, converting to RGB will lead to
> > loss of the classic feel, abandonment of the methods employed by the programmers
> > to take advantage of the NTSC system, and inaccuracies.
>
> I've always been a big proponent of choosing your video output method
> wisely, not by system but by the *game* being used. Best example --
> Second or third generation Playstation 1 and Saturn games used to
> 'cheat' to get transparencies and fog by dithering. Under composite
> and RF the muddiness and dot-crawl (and to a large extent how the TV is
> calibrated) help to blend the rows of dots together and it fools your
> eye into thinking it's transparent. Under RGB and Y/C (S-video) these
> dots resolve themselves into just that -- rows of dots, and the effect
> of transparency is lost. This is not true for *all* PS1 or Saturn
> games... so I let the game choose the display (as well as the video
> standard -- If I'm playing a PAL PS1 game on an NTSC PSX then RGB is
> the way to go, otherwise you lose the colour signal).

I had no idea that consoles as late as the Playstation relied on NTSC weaknesses
to enhance their display qualities, especially since the Playstation has
hardware support for fog and transparencies. I hope none of the great games use
this.

> There is nothing inherently wrong with Y/C. It offers a quantum leap
> in terms of quality over RF and Composite. The corresponding jump
> from Y/C to Y/U/V (Component) and RGB is *far* less dramatic than what
> you'd think, and nearly imperceptible on a monitor under 20" (and
> truth be told, the only reliable way I can tell the difference between
> Y/C and Y/U/V on a properly calibrated 27" monitor is with test
> patterns).
>
> Then there are the things that can go wrong or complicate RGB setups.
> Sync issues are a huge headache if you don't know what the monitor and
> console are capable of. The PS1 doesn't output standard sync, it lets
> the SCART device strip it from one of the other signals. In a homebrew
> RGB setup you'll have to get the proper sync yourself. This is not a
> plug-in-and-go solution for the casual gamer. S-video and a properly
> calibrated television are a satisfactory compromise in this situation.
>
> I've been down the RGB road before and it's pretty easy to be an 'RGB
> snob' if you have the hardware. OTOH, if you can get 90% of the
> quality that RGB gives you with an S-video hookup and a properly
> calibrated display device that you can buy locally -- it's very hard
> to justify paying possibly 3x as much to get that last 10% (and
> arguably that last bit may not even be detectable by the average
> person).

How does the user know that his television is properly calibrated? What can he
do to get the right levels of tint, color, brightness, contrast, and sharpness?
Will the factory preset levels do?

So the Y/C S-Video connection has 90% of the quality of an RGB connection? The
S-Video connection still outputs an NTSC signal, with Y and C separated for a
cleaner signal. Then I wonder whether the component connection looks much worse
than analog uncompressed RGB.

Clay Cowgill

Jul 28, 2003, 2:08:18 AM
"Great Hierophant" <great_hi...@hotmail.com> wrote in message
news:3f249af2$2...@corp.newsgroups.com...

>
> An interesting idea, except: should one use consoles with such display
> devices? For plasma screens and rear projection screens, the answer is clearly
> no, due to burn-in potential.

Well, they're certainly more prone to burn-in than say a regular front-view
CRT, but if you're not a store running the same game for hundreds of hours
on end I doubt you'd burn anything in.

If there are any rear-projectors using DLP or similar stuff they'd be safe--
there's no 'tube' to burn-in (usually a spinning wheel for color and the
micro-mirrors in the DLP do the rest all as reflective light).

> That leaves the standard CRTs and LCD screens. The
> former almost never has a VGA input, and the latter takes a digital signal. A
> high end front projector is a very strange way to play a game.

Newed "HDTV ready" TV's are coming with RGB inputs much more frequently.
Component too-- I'm not sure how hard a Component->RGB conversion is. Front
projection LCD's are getting pretty nice. (Look at the ones from Piano for
home/portable use.) There are a lot of front-projection LCD's for less than
a good Sony XBR tube set now...

-Clay


Clay Cowgill

Jul 28, 2003, 2:12:59 AM
"neogeoman" <newsg...@NOSPAMgamesx.com> wrote in message
news:MPG.198f1e329...@news3.asahi-net.or.jp...

> > Simple enough-- just buy a "standard res" monitor normally used for an
> > arcade game.
>
> I have found this to be less than optimal. Arcade monitors tend to
> display their best image when the input signal is exceptionally bright,
> which most consoles don't do, and the result is an image with less
> contrast and less vivid colour than you'd achieve on a consumer monitor.

In general you'd want a newer monitor with switchable input levels-- older
arcade monitors rely on ~3V P-P signals which you're not going to get out of
any internal RGB path without a video amp. Newer monitors support the 1V
P-P outputs from more PC-based arcade boards (gambling games in particular),
so using that with a console would be pretty good.
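
As a rough sketch of the level mismatch (the 0.7V figure below is an
assumption -- the common unamplified analog RGB level -- not a measurement of
any particular console):

arcade_level = 3.0    # volts peak-to-peak, older arcade monitor input
console_level = 0.7   # volts peak-to-peak, typical unbuffered console RGB

print(f"video amp gain needed: ~{arcade_level / console_level:.1f}x")  # ~4.3x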

The "right" answer is to use a video amp though-- not many RGB outputs in
home consoles are buffered and tapping directly off them can yield noise and
interference since they're not meant to drive cables. One exception there
would be anything that used those Yamaha "all in one" graphics engine chips
(YGV608/618? if my memory serves) as their RGB outputs were suitable for
pushing pretty ugly loads...

Peter de Vroomen

Jul 29, 2003, 12:38:55 PM
> I cannot for the life of me think why the US market knobble their TVs so
> they don't accept PAL. Surely it's just a bit of decoding software on a chip
> these days?

I cannot even understand why they chose NTSC. Over here, we call NTSC 'Never
The Same Color', i.e. because there is no colorburst, there is no way to
calibrate colors, which means that the colors on *every* TV set differ. And
that simply to get 60Hz instead of a 50Hz refresh rate. I can't see the
difference, they both flicker. Even my monitor flickers at 67Hz (standard
VGA 640x480 refresh frequency). You need at least 75Hz to have a
flicker-free picture.

I guess the people who chose NTSC instead of PAL were old, had bad eyes
but had wallets big enough to catch money thrown at them from here or
there...

PeterV


Luis Vasquez

Jul 29, 2003, 2:31:49 PM

"Peter de Vroomen" <pet...@ditweghaluh.jaytown.com> wrote in message
news:3f26a320$0$49110$e4fe...@news.xs4all.nl...
Oh crap, not another NTSC vs. PAL argument. The reason why it works at 60Hz
is so that it matches the frequency of our power lines. Same reason why PAL
works at 50Hz. Of course, nowadays the power frequency doesn't matter due
to advances in technology.
They both have their advantages. I prefer NTSC because that's what I grew up
with.


Clay Cowgill

Jul 29, 2003, 5:52:43 PM
"Peter de Vroomen" <pet...@ditweghaluh.jaytown.com> wrote in message
news:3f26a320$0$49110$e4fe...@news.xs4all.nl...
> I cannot even understand why they chose NTSC.

Backwards compatibility was a big deal when going to NTSC color broadcast
standard in the states-- they picked a clever (if not "perfect") method to
implement color using only the existing B/W bandwidth and timing. The fact
that it's held up as long as it has is pretty amazing...

> Over here, we call NTSC 'Never
> The Same Color', i.e. because there is no colorburst, there is no way to
> calibrate colors, which means that the colors on *every* TV set differ.

Well, there is colorburst in NTSC-- at the beginning of each scanline, like PAL
IIRC... Phase difference accounts for color change on the line. TV to TV
color control is impossible, however, since all TV's have color/hue/brightness
controls of some sort or other with which the user can tweak the output without
any reference standard. Broadcast standards here impose strict phase and timing
calibration for signals coming from broadcasters, but home game systems and
the like don't have the compensated, temperature controlled colorburst
generators that a TV station does.

(The Atari 2600 is barely "NTSC-like" if you look at the signals. Amazing
that TV's can even latch onto it in color if you ask me...)

Most TV's shipped in the US are not calibrated for "proper" display from a
reference standpoint-- "default" settings tend to crank up the color and
brightness because people like the look of brighter/more colorful TV's in the
showrooms...

> And
> that simply to get 60Hz instead of a 50Hz refresh rate. I can't see the
> difference, they both flicker.

If you grew up watching 60Hz TV, 50Hz PAL is immediately noticeable to most
people over here. "What's wrong with that TV?" or "Why is that flickering?"
is a common question. ;-)

> I guess the people who chose NTSC instead of PAL were old, had bad eyes
> but had wallets big enough to catch money thrown at them from here or
> there...

Probably less conspiracy theory and more cost... 60Hz power here, different
frequency standards in use, and at the time of adoption of the broadcast
standard, color TV's didn't exist.

-Clay


metallik

Jul 29, 2003, 7:25:37 PM
> that simply to get 60Hz instead of a 50Hz refresh rate. I can't see the
> difference, they both flicker. Even my monitor flickers at 67Hz (standard
> VGA 640x480 refresh frequency). You need at least 75Hz to have a

Standard 640x480 VGA is 60hz. 320x200 VGA is 72hz.

I once coaxed my PC monitor into a 50Hz refresh using a C-64 emulator.
It was like watching a strobe light. Standard PAL televisions must have
had some seriously long-persistence phosphor.

> I guess the people who chose NTSC instead of PAL were old, had bad eyes

Does it have anything to do with the 60Hz cycle of AC power here?
Regardless, I prefer goofy color over epilepsy-inducing refresh.
S-video fixes the color problem anyway.

Kev

Jul 29, 2003, 9:22:39 PM
metallik <meta...@fuse.removethis.net> wrote in
news:3f27000c$0$12988$a046...@nnrp.fuse.net:

Yep. 60Hz refresh was selected because it wouldn't "beat" with the 60Hz
power. This "beating" would show up as a moving "wave" on the picture
that would crawl up/down depending on the difference in frequency between
the two sources. The power supplies in old TV's were pretty shitty and
usually unregulated, so they wiggled a bit with the AC line frequency,
which would cause ugly video. The PAL standard was set at 50Hz for the
same reason-- power over in Europe is 50Hz.

However, these days, the signals are 100% derived from the colourburst,
so it is no longer 60Hz but 59.94 or something like that (I can't
remember the exact value right now and I'm too lazy to look it up).
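
Actually, it's easy enough to derive from the colourburst. A quick Python
check, using the standard NTSC definitions (3.579545MHz subcarrier, line rate
2/455 of it, 262.5 lines per field):

burst = 3579545                 # Hz, NTSC colour subcarrier
line_rate = burst * 2 / 455     # horizontal rate is defined as 2/455 of it
field_rate = line_rate / 262.5  # a field is 262.5 lines

print(line_rate)    # 15734.26... Hz, the familiar ~15.75kHz
print(field_rate)   # 59.94... Hz, the "not quite 60Hz" field rate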

Great Hierophant

Jul 29, 2003, 9:28:35 PM

"metallik" <meta...@fuse.removethis.net> wrote in message
news:3f27000c$0$12988$a046...@nnrp.fuse.net...

> > that simply to get 60Hz instead of a 50Hz refresh rate. I can't see the
> > difference, they both flicker. Even my monitor flickers at 67Hz (standard
> > VGA 640x480 refresh frequency). You need at least 75Hz to have a
>
> Standard 640x480 VGA is 60hz. 320x200 VGA is 72hz.

VGA does not go above 70Hz.

> I once coaxed my PC monitor into a 50hz refresh using a C-64 emulator.
> It was like watching a strobe light. Standard PAL televisions must have
> had some seriously long-persistience phosphor.

Standard 720x350 MDA is 50Hz. All VGA monitors can sync down to it.

metallik

Jul 30, 2003, 6:30:42 PM
>>Standard 640x480 VGA is 60hz. 320x200 VGA is 72hz.
>
> VGA does not go above 70Hz.

Perhaps it was 70Hz then... I just checked standard ASCII and it was 70,
so the 72 is mistaken.

Bruce Tomlin

Jul 30, 2003, 7:15:39 PM
In article <3f23a08a$1...@corp.newsgroups.com>,
"Great Hierophant" <great_hi...@hotmail.com> wrote:

> scan frequency of roughly 15-16kHz. I see no inherent difficulty in
> constructing a monitor that covers horizontal scan frequencies from 15.75kHz
> to 130kHz. Of course, there is a great difficulty in PC hardware, which will
> double-scan low resolutions and/or squeeze them into a 480-line resolution.
> It turns out that classic gaming video

FYI, if you can find an old DEC VT240 color monitor, they use NTSC sync
rates. (and BNC plugs) I used one with a Sega Genesis for a while.
