What exactly was the point of limiting the IIgs to a maximum of
640x200? The pixels are so irregularly shaped that the screen images
look horrendous. At this point my only guess is that memory cost too
much for the extra resolution to be worthwhile.
Now, the 320x200 mode is just fine and dandy. That mode is quite nice.
But 640x200 totally blows. The text looks terrible, unless you make it
double wide to compensate for the unnatural skinniness that 640x200
imposes on everything.
This reminds me of something that's always bothered me... how is it
that the image appears to really only have 200 lines, when NTSC is
really 525 lines? I realize the edges are blank and unused to prevent
overscan, but still, at 200 scanlines tall you'd have a squashed
letterbox in the middle and 162 blank scanlines above and below.
It would seem that those 200 image scanlines are double-thick, yet
when you look at the graphics on an NTSC monitor you can't tell that a
pixel is two scanlines tall. Even with the old hires 280x192 graphics,
a pixel appears to be made up of a single scanline, with a definite
separation from the next pixel above and below but no visible split in
the pixel itself. Is this really a "normal" NTSC resolution at all?
-Mr. Boffo
Email: mister...@hotmail.com
> So anyway I've been trying out stuff like Hyperstudio for the IIgs,
> and the one thing which keeps coming to mind is: "Boy, this screen is
> just godawful!"
>
> What exactly was the point of limiting the IIgs to a maximum of
> 640x200?
The screen resolution of every Apple II is designed to be compatible
with standard television, but without interlace (which would cause
flicker).
This sets the horizontal scan frequency (about 15.75 kHz), refresh rate
(50 or 60 Hz depending on locality) and hence the number of lines of
vertical resolution (192 or 200, depending on video mode).
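The arithmetic works out neatly. A quick Python sketch using the standard NTSC numbers (the exact 15.734 kHz figure is from the NTSC spec, not from the post):

```python
# Rough sketch: scanlines per non-interlaced frame from NTSC timing.
h_scan_hz = 15_734     # NTSC horizontal scan frequency, ~15.75 kHz
field_rate_hz = 59.94  # NTSC field rate, ~60 Hz

lines = h_scan_hz / field_rate_hz
# ~262.5 scanlines per 60 Hz frame; of those, the Apple II
# uses 192 or 200 for the visible image, the rest for blanking/border.
print(f"scanlines per 60 Hz frame: {lines:.1f}")  # 262.5
```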
To get a higher vertical resolution requires either
(a) Interlacing the image at half the scan rate. This would produce
excessive flicker on a computer monitor.
(b) Using a higher-frequency monitor. This would eliminate compatibility
with standard television.
The second part of the equation is the address space required for the
screen image. The IIgs Super Hi-res graphics mode uses 32KB, and due to
hardware implementation details, has to reside within a particular
memory bank, which only has about 40KB of address space potentially
available for the video buffer.
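For what it's worth, the sizes work out exactly: both Super Hi-res modes pack to 160 bytes per scanline, and adding the scanline-control bytes and the sixteen 16-entry palettes (two bytes per entry, per the IIgs memory map) fills a 32 KB buffer precisely. A quick check:

```python
# Both Super Hi-res modes need the same pixel storage:
bytes_320 = 320 * 200 * 4 // 8   # 320 mode: 4 bits per pixel
bytes_640 = 640 * 200 * 2 // 8   # 640 mode: 2 bits per pixel
assert bytes_320 == bytes_640 == 32000   # 160 bytes per line either way

scb = 256                 # one scanline-control byte per line, padded to 256
palettes = 16 * 16 * 2    # 16 palettes x 16 entries x 2 bytes each
print(bytes_320 + scb + palettes)  # 32768 bytes = exactly 32 KB
```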
The same video chip can support interlaced video (640x400) by using two
32KB buffers in adjacent banks (which means that the video memory is no
longer linear), but this is only supported on Apple's Video Overlay
Card, not in the IIgs implementation.
Apple could have chosen to support higher resolutions by doing a
significantly different hardware implementation, or by reducing the
number of colours available for each pixel. The system they ended up
using was a good compromise for colour range without introducing
significant compatibility issues.
> This reminds me of something that's always bothered me... how is it
> that the image appears to really only have 200 lines, when NTSC is
> really 525 lines?
NTSC is 525 lines, but it is interlaced with two fields, each of which
displays 262.5 lines. There are 59.94 fields per second, or 29.97
complete frames per second.
The IIgs generates non-interlaced video with frames approximately at the
field rate of NTSC, i.e. about 60 frames per second. The vertical
resolution available is that of the NTSC field (262.5 lines), not the
NTSC frame (525 lines).
The original Apple II video modes use 192 lines, though I'm not sure why
this particular value was chosen, except as a side effect of the memory
map chosen for the video modes.
For the IIgs Super Hi-res mode, the revised memory map allowed enough
room for a further eight lines of video.
> It would seem that those 200 image scanlines are double-thick, yet
> when you look at the graphics on an NTSC monitor you can't tell that a
> pixel is two scanlines tall. Even with the old hires 280x192 graphics,
> a pixel appears to be made up of a single scanline, with a definite
> separation from the next pixel above and below but no visible split in
> the pixel itself. Is this really a "normal" NTSC resolution at all?
If you had a "normal" NTSC signal, there would be another 200 scan lines
interleaved with the 200 existing lines, but half a frame later in time.
The total 525 scanlines are split between the two fields, so that one gets
262 and one 263 lines, and these are interleaved. So there are really only
62 "spare" lines above and below the image in a IIgs 200-line mode.
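Spelling out the arithmetic (one NTSC field, not the full 525-line frame, is the right divisor):

```python
lines_per_field = 262   # one NTSC field (the other field gets 263)
image_lines = 200       # IIgs Super Hi-res
spare = lines_per_field - image_lines
# 62 lines of border and blanking in total, not the ~162 per side
# you'd get by (wrongly) subtracting 200 from the 525-line frame.
print(spare)  # 62
```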
I do agree that it looks awful. I prefer modes which use a correct aspect
ratio so that the pixels are square. The HGR 280x192 isn't too bad on most
monitors. Even 320x200 looks squat, as it should be 320x240 for square
pixels.
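A quick way to see the aspect-ratio point: compute the pixel aspect ratio each mode gets when stretched across a 4:3 display. This is a sketch that assumes the image fills the full 4:3 area, which real overscan borders make only approximately true:

```python
def pixel_aspect(width, height, display_aspect=4/3):
    """Pixel width/height ratio when the mode fills a display
    of the given aspect. 1.0 means square pixels; below 1.0
    means tall, skinny pixels."""
    return display_aspect / (width / height)

for w, h in [(280, 192), (320, 200), (640, 200), (320, 240)]:
    print(f"{w}x{h}: {pixel_aspect(w, h):.2f}")
# 280x192 comes out near 0.91 (close to square), 320x200 at 0.83,
# 640x200 at a very skinny 0.42, and 320x240 exactly 1.00.
```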
Simon.
"Mr. Boffo" <mister...@hotmail.com> wrote in message
news:3a46b23b...@news.wi.centuryinter.net...
>Mr. Boffo <mister...@hotmail.com> wrote:
>
>> So anyway I've been trying out stuff like Hyperstudio for the IIgs,
>> and the one thing which keeps coming to mind is: "Boy, this screen is
>> just godawful!"
>>
>> What exactly was the point of limiting the IIgs to a maximum of
>> 640x200?
>
>The screen resolution of every Apple II is designed to be compatible
>with standard television, but without interlace (which would cause
>flicker).
>
>This sets the horizontal scan frequency (about 15.75 kHz), refresh rate
>(50 or 60 Hz depending on locality) and hence the number of lines of
>vertical resolution (192 or 200, depending on video mode).
>
>To get a higher vertical resolution requires either
>
>(a) Interlacing the image at half the scan rate. This would produce
>excessive flicker on a computer monitor.
Ah, yes, I recall seeing this on an Amiga, where the display was an
NTSC monitor and the higher resolutions required 60 Hz interlacing. A single
horizontal line has terrible flicker unless it's made two lines
thick, so there's no real gain from the extra resolution. (sigh)
>The same video chip can support interlaced video (640x400) by using two
>32KB buffers in adjacent banks (which means that the video memory is no
>longer linear), but this is only supported on Apple's Video Overlay
>Card, not in the IIgs implementation.
Hee, it's an Apple product, so who cares about bank linearity? Anyone
who has ever dealt with the hell that is double-hires and double-lores
graphics knows that Apple is no stranger to bizarre video bank
arrangements. :-)
-Mr. Boffo
Email: mister...@hotmail.com
There were special monitors with long-persistence phosphors which solved the
flicker problem. Unfortunately, they then caused any movement on the screen
to smear, which is why those interlaced modes were very rarely used.
When the Mac II and IBM VGA were introduced in 1987 they used hi-res RGB
colour monitors capable of 480 lines. But the cost! Even in 1990 a cheap VGA
card and monitor cost US$500. In 1987 it was several times that.
p.s. It's summer here, over the last fortnight the temperatures have been in
the 18-22C range, but now it's raining heavily, so maybe the fire will be
lit after all :-)
Merry Christmas, happy holidays!
--
Roger Johnstone, Invercargill, New Zealand
Apple II - Future Cop:LAPD - Warcraft II
http://homepages.ihug.co.nz/~rojaws
______________________________________________________________________
from the Bottom episode "Holy"
Richie:Now look, we've got guests coming remember, so I'd better get on
with my turkey.
Eddie:What're ya going to do with it?
Richie:Well, it's the season of good will and peace on earth so I thought
I'd chop both its feet off, rip out its innards, strip it, shove an
onion up its arse, and bung it in a very hot place for four hours
until it's completely burnt.
While 640 x 200 is far from ideal, the resulting display is not all that
horrible. Really, the biggest limitation is on the number of colors per
palette-- just 4.
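The 4-colour limit falls out of the pixel format: each 640-mode pixel carries only 2 bits, and the hardware picks one of four 4-entry groups out of the 16-entry palette based on the pixel's horizontal position. A sketch of the indexing — the group-selection order here (`x % 4`, groups in ascending order) is an assumption; check the IIgs hardware reference for the exact mapping:

```python
def palette_index(pixel_value, x):
    """Map a 2-bit 640-mode pixel at column x to a 16-entry palette index.
    Assumes groups cycle as x % 4 in ascending order; the real
    hardware's group order may differ."""
    assert 0 <= pixel_value < 4
    group = x % 4
    return group * 4 + pixel_value

# Any single column can only reach 4 of the 16 palette entries,
# which is why dithering adjacent columns is used to fake more colours:
print(sorted({palette_index(v, 0) for v in range(4)}))  # [0, 1, 2, 3]
print(sorted({palette_index(v, 1) for v in range(4)}))  # [4, 5, 6, 7]
```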
> This reminds me of something that's always bothered me... how is it
> that the image appears to really only have 200 lines, when NTSC is
> really 525 lines? I realize the edges are blank and unused to prevent
> overscan, but still, at 200 scanlines tall you'd have a squashed
> letterbox in the middle and 162 blank scanlines above and below.
>
> It would seem that those 200 image scanlines are double-thick, yet
> when you look at the graphics on an NTSC monitor you can't tell that a
> pixel is two scanlines tall. Even with the old hires 280x192 graphics,
> a pixel appears to be made up of a single scanline, with a definite
> separation from the next pixel above and below but no visible split in
> the pixel itself. Is this really a "normal" NTSC resolution at all?
>
....
No; it is a plain non-interlaced RGB setup. The reason you do not see a
separation between lines is dot size. Dot pitch for the CRT is 0.37 mm, a good
deal larger than modern SVGA CRT ratings of 0.25 mm or smaller. Lines on the IIgs
monitor are fat enough so that 200 lines fill most of the screen with no
obvious separations.
Rubywand
Yes, there is flicker, but that comment is insane. You clearly know
nothing of interlaced video.
bp
--
"Heisenberg may have slept here."
>Television uses an interlaced display, where every second scanline is sent
>out in any particular frame. The Apple II sends out the same image for two
>consecutive frames, so that two adjacent scanlines actually will contain the
>same data. This is why the Apple's VBI (vertical blanking interrupt) is
>only 25 Hz not 50 Hz.
Nope. The Apple VBI rate is 50/60 Hz.
For alternate field scan lines to display interlaced--as opposed to
superimposed--they must all start one-half line time later relative to
the vertical sync pulse. This timing is not used by Apple video, so all
fields are superimposed, with half the lines and a 50/60 Hz refresh rate.
(Frequency depends on PAL/NTSC, of course.)
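That half-line offset is tiny in absolute terms, which is why interlace is purely a matter of sync timing. A sketch with NTSC numbers:

```python
line_time_us = 1e6 / 15_734        # one NTSC scanline, ~63.6 microseconds
half_line_us = line_time_us / 2    # odd-field offset for true interlace, ~31.8 us
# A source that omits this ~31.8 us offset (like Apple video) gets
# superimposed fields instead of interleaved ones.
print(f"line: {line_time_us:.2f} us, half-line offset: {half_line_us:.2f} us")
```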
-michael
Email: mjm...@aol.com
Home page: http://members.aol.com/MJMahon/