This claims to allow a CGA signal to be output to an NTSC- or PAL-compatible monitor. It's designed for maintenance and modification of older arcade video cabinet systems.
The $200 price tag seems insane, but the actual circuit board doesn't seem too complicated.
I would love to be able to use 80 Col color on my old Commodore 128s.
My 1084 is gone, and finding a CGA monitor these days isn't so easy.
I know others have looked at methods to get VGA out and failed.
Perhaps we're working too hard. Maybe going to NTSC or PAL first will
get us an acceptable signal. It's only 640 x 200, or 640 x 400
interlaced. With that, a simple switch box and an NTSC-to-VGA adapter
would do the job.
Does anyone have more information on this board? It would be nice to drive the
cost down a bit.
script:
>http://www.converters.tv/signals/cga_to_vga.html
At $85 US, their RGB to composite converter is summat more reasonable.
salaam,
dowcom
To e-mail me, add the character zero to "dowcom". i.e.:
dowcom(zero)(at)webtv(dot)net.
--
http://community.webtv.net/dowcom/DOWCOMSAMSTRADGUIDE
MSWindows is television... Linux is radar.
Cheers
Fotios
I thought CGA was 16-color RGBI, with a two-level "I" signal. When I
connect my old Zenith CGA monitor to my 128, I see 16 colors.
Brian
--
Cheers
Fotios
I agree the price is insane. Maybe if the board had no solder mask the
price would go down a lot.
I also don't see how this can work with CGA. As I've mentioned, there
really is nothing in common between the RGBI timings and what a color TV
expects. You really need more than a handful of resistors and a tiny
IC to fix the inherent, fundamental mismatch between the two. It's
like trying to listen to 101.3 WXCL when the radio is tuned to 97.8.
First, look at a typical horizontal line in NTSC. Now look at a
horizontal line in RGBI. Only about 2/3 of the line has actual pixels
in it. Out of the 1016 clock cycles for a line, only 640 are pixels.
This is why the 128's 80 column output is so much narrower than the 40
column display. NTSC lines are much fuller, the front and back porches
much shorter.
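The two-thirds figure can be sanity-checked in a couple of lines, using the numbers quoted above (exact VDC register settings vary, so treat this as a rough sketch):

```python
# Active-pixel fraction of an 80-column RGBI line, per the figures above.
DOT_CLOCKS_PER_LINE = 1016   # total dot clocks in one horizontal line
VISIBLE_PIXELS = 640         # dot clocks that carry actual pixels

active = VISIBLE_PIXELS / DOT_CLOCKS_PER_LINE
print(f"active portion of the line: {active:.1%}")   # about 63%, i.e. roughly 2/3
```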
If you change the 16MHz crystal in the 128 to a 14.31818MHz crystal,
and you can predict the behavior of the rest of the machine (I can't,
I'd have to look), you'd have a much better chance of encoding RGBI
into blurry NTSC.
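For what it's worth, that crystal value isn't arbitrary: 14.31818 MHz is exactly four times the NTSC colour subcarrier frequency, so a dot clock derived from it stays phase-related to the chroma signal. A one-line check:

```python
# 14.31818 MHz = 4 x the NTSC colour subcarrier (3.579545 MHz).
NTSC_SUBCARRIER_HZ = 3_579_545
CRYSTAL_HZ = 14_318_180

assert CRYSTAL_HZ == 4 * NTSC_SUBCARRIER_HZ
```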
I'm not even sure if you can program the VDC to output the correct
sync frequency with a 14MHz crystal.
It's something to try to finally lay this horse to rest.
Most just deal with CGA's first 8 colors. You can probably do yourself a
better deal by converting the RGBI into analog RGB first, and then scan
doubling it. How do you do it? There is a board in RGBI monitors like the
1902A that takes the digital RGBI (4 digital bits) and makes the analog R, G &
B lines.
Basically, remake that card external to the monitor and add a scan
doubler, and it ought to work. CGA was originally 8 colors and was then
extended to 16 colors on all but the earliest of the cards. The biggest
problem is these guys don't tie the Intensity line into each of the R, G
and B DACs independently. I is tied in, and the DACs are basically 2-bit
DACs: one bit is the 'color' bit and the other is the 'intensity' bit.
However, the intensity bit is tied to ALL of the R, G, B DACs at the
same time. If you separated the Intensity bit, or say added two more
bits from the user port, you could theoretically increase the color
options to 64 colors, but there will be some limits caused by the CIA not
being synced with the VDC, so it is all timing. Your best timing is character
clock timing, that is about it.
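As a sketch of what those shared-intensity DACs produce, here is the usual RGBI decode in a few lines of Python. The 2/3 colour + 1/3 intensity weighting is the common convention, and the dark-yellow-to-brown tweak is something many monitors did; both are assumptions for illustration, not something stated in this thread:

```python
# Decode the four digital RGBI bits into 8-bit analog-equivalent levels.
# Colour bit contributes 2/3 (0xAA), the shared intensity bit 1/3 (0x55).
def rgbi_to_rgb(r, g, b, i, brown_fix=True):
    out = [0xAA * c + 0x55 * i for c in (r, g, b)]
    # Many RGBI monitors halved green on plain dark yellow -> brown.
    if brown_fix and (r, g, b, i) == (1, 1, 0, 0):
        out = [0xAA, 0x55, 0x00]
    return tuple(out)

# The full 16-colour palette, index bits = IRGB:
palette = [rgbi_to_rgb((n >> 2) & 1, (n >> 1) & 1, n & 1, n >> 3)
           for n in range(16)]
```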
So in theory, you could increase the intensity options externally, from the
user port for example. This was part of the concept presented to
A7yvm.... since RGBI is so digital. Basically, the RGBI lines (with the
exception of the clock lines (syncs)) are basically not much different from any
of the I/O on the C64/128.
"Brian Ketterling" <twee...@no-potted-meat-products-peoplepc.com> wrote in
message news:b3QJh.9784$PL....@newsread4.news.pas.earthlink.net...
"Mangelore" <fot...@commodore128.org> wrote in message
news:koRJh.11039$8U4....@news-server.bigpond.net.au...
That is new to me. Are there any sources that support your claim that
there were 8 color CGA adapters?
> Most just deal with CGA's first 8 colors.
Really?
The only variation I'm aware of is how CGA monitors treated the intensity
bit in combination with the black and yellow colors.
> You probably can do oneself a
> better deal to convert the RGBI into Analog RGB, first - and then scan
> double it. How do you do it? There is a board in your RGBI monitors like
> 1902A that takes the digital RGBI (4 digital bits) to make the Analog R,G, &
> B lines.
>
> Basically, remake that card but external of a monitor and add a scan
> doubler - and it ought to work. CGA originally was 8 colors and then CGA was
> extended further to 16 colors with many of the cards but the most earliest
> of them.
CGA introduced by IBM supported 16 colors from the start (1981). I
really wonder where you get your information from.
Really?
Those converters "Mangelore" is talking about most likely take analog
RGB, and with analog RGB there is no point in having an "intensity" input.
> If you find out that those DACs are more than 1 bit - then you can tie the
> Intensity lines to the R,G & B DACs and voila - 16 colors. Just takes a few
> wires. However, it is likely all integrated into a single chip and you would
> have to look at the specs of the chip itself - if it has an Intensity input
> pin and tie it there and voila. Those cheap ________, doesn't think about
> that.
As far as the DACs are concerned, a couple of resistors suffice. It
doesn't make sense to make a one- or two-bit DAC IC.
Turning RGBI into analog RGB (with all 16 colors) requires no more than
a few resistors.
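To illustrate the resistor approach, here is a toy model of one gun: the colour bit and the shared intensity bit summed through two resistors into a 75-ohm terminated input. All values are my own guesses for illustration, not a tested design from the thread:

```python
# Model one gun of a resistor-only RGBI -> analog RGB stage.
V_TTL = 5.0                                  # digital high level
R_COLOR, R_INTENSITY, R_LOAD = 470.0, 1000.0, 75.0   # illustrative values

def gun_voltage(color_bit, intensity_bit):
    # Node voltage where the two summing resistors meet the 75-ohm load.
    g1, g2, gl = 1 / R_COLOR, 1 / R_INTENSITY, 1 / R_LOAD
    i_in = V_TTL * (color_bit * g1 + intensity_bit * g2)
    return i_in / (g1 + g2 + gl)

# Four distinct levels per gun: off, intensity only, colour only, both.
levels = [gun_voltage(c, i) for c in (0, 1) for i in (0, 1)]
```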
Hi Rick,
The converter board I have uses a surface-mounted Analog Devices chip
with only RGB inputs. I've tried wiring up the Intensity line using a
few resistors and wires directly to the RGB inputs of the chip, but the
results are poor. While it does now output 16 colours to the screen,
there are significant levels of brightness bleeding across the screen at
times. I've also tried using a few diodes, without luck.
Cheers
Fotios
IBM's first color card in 1981 always had 16 colors available in text mode
(I still have one). In graphics mode, there were two palettes of four colors
each available. There was also a hidden mode that allowed 16 colors at
160 x 200 resolution, but it wasn't supported by BASIC so it wasn't used
widely.
Tom Lake
As for why a CGA2VGA converter wouldn't support all 16 colors: in most
programs, usually only 8 colors are displayed at a time, so maybe
it would be necessary to test the converter with a program that displays all
16 colors.
As for the VDC, that's exactly what happens in most programs. Oh well.
"Tom Lake" <tl...@twcny.rr.com> wrote in message
news:45fc68f9$0$17177$4c36...@roadrunner.com...
Basically the DAC board is a set of three 2-bit DACs, and the Intensity line
is tied to all three DACs. Intensity is used to give a second intensity
level for R, G, B, but not independently unless you add two more "Intensity"
lines from elsewhere. That isn't built into the VDC or CGA card, but
you could in theory fake it by feeding in extra bits from a data bus, since
in a technical sense they are just bits to be converted to real color.
Digital R, G & B lines are just bits, not color. It only becomes color after
it is converted to analog AND sent out through the raster guns, or however the LCD works.
"Patrick de Zeester" <inv...@invalid.invalid> wrote in message
news:45fc251f$0$13343$e4fe...@dreader16.news.xs4all.nl...
Maybe there is an actual Intensity line. I need more specifics.
"Mangelore" <fot...@commodore128.org> wrote in message
news:hyZKh.12437$8U4....@news-server.bigpond.net.au...
They're supposed to be CGA to VGA converters (see Christian Lott's post
earlier in the thread), so they should take digital RGBI. I must have
missed it on the website, though -- I didn't see anything about how many
colors they output, or what kind of DACs they use.
Brian
--
They use an Analog Devices ad9985 chip.
Cheers
Fotios
Cheers
Fotios
OK, this chip was designed for analog RGB input and basically scan doubling,
not digital input. So you would naturally convert the intensity to analog, then
tie it into the R, G, & B lines after the R, G & B have been converted to
analog. Otherwise, you need to do it differently.
The chip was designed for capturing ANALOG RGB and hooking it to a digital
flat panel. In effect, the converter is doing the opposite of what you need:
converting analog RGB for a digital LCD panel.
So what you should have done is use three 2-bit DACs (like those inside
the DAC board in the 1902A). Basically each DAC has two digital inputs and one
analog output. You would tie the Intensity line to the high bit of
each of the DACs, for example, and tie the digital Red, Green & Blue lines
from the CGA / VDC video output to the other inputs, then convert to analog.
You don't want to be mixing a digital intensity signal into an analog red,
green or blue; it gets messy. I think that is the sort of issue you had. The
same goes for feeding analog Intensity into the digital inputs of the DACs:
it gets messy because the analog Intensity is not clean. In this case that
means the resistor/capacitor circuit that makes up each DAC circuit. You
don't want to place an analog signal on a digital input.
Whatever the case, you would then have a clean conversion. In your case,
you home-brewed your DAC circuit with resistors and capacitors, etc., with
basically two digital inputs (bits in) and one analog output, in three
circuits. But you made a fourth "DAC" with the intensity and fed it into
the digital inputs of each of your "Red-DAC", "Green-DAC" and "Blue-DAC"
circuits, thus getting a messy input. This is my theory on what you wrote.
I'm probably scaring myself, now.
"Mangelore" <fot...@commodore128.org> wrote in message
news:cH5Lh.12819$8U4....@news-server.bigpond.net.au...
Joseph Fenn wrote:
>
> On Sun, 18 Mar 2007, Rick Balkins wrote:
> Well, when Rick and Brian and the others working on this project finally
> reach a compromise decision that really works, why don't you guys
> market the necessary box and shame the creators of the original
> C=VGA group from England, who never did reach fruition with the
> NTSC version. They succeeded with PAL for the Europeans but never
> quite made it with the C128 and NTSC. So I will cross my fingers
> and see if your efforts reach fruition.
> Joe (aka kilroy)
>
script:
>… Plus, I don't have access to an NTSC
>C128 for testing. …
AFAIK, CGA/RGBI is universal, neither PAL nor NTSC.
The British effort that J. Fenn constantly refers to apparently tried to
convert composite to VGA. That was my initial impression, anyway. (Two
or more years ago.)
salaam,
dowcom
To e-mail me, add the character zero to "dowcom". i.e.:
dowcom(zero)(at)webtv(dot)net.
--
That's what I thought but the title of the thread made me think
otherwise. Anyway, I managed to also hook up an Amiga 500. Initial
results look great. I'll be updating the following thread with more test
results, screen shot links, etc....
http://landover.no-ip.com/128/viewtopic.php?id=453
Cheers
Fotios
Kind of like the LCD screens on clocks. Only thing is the power input from
the wall outlet.
"bud" <dow...@webtv.net> wrote in message
news:21102-45F...@storefull-3112.bay.webtv.net...
As for the Amiga, it has both digital and analog RGB lines. So, if you want
the analog RGB, you plug the Amiga's analog RGB lines directly into pins
43, 48, 53 of the AD9985. Maybe put a resistor or diode inline, but basically
direct. You want to connect as directly as possible. This device was
actually designed more for an Amiga or VGA/SVGA PC than for a Commodore
128 or CGA PC.
That is the difference. The device is primarily designed to convert analog
RGB to a digital signal. Whereas, when plugging a C128 into it, you have to first
convert the RGBI to analog RGB just to go back to digital again; not in the
standard RGBI configuration, but one meant for flat-panel monitors or projectors.
In modern terms, the C128/CGA is already 'digitized' video.
"Mangelore" <fot...@commodore128.org> wrote in message
news:aonLh.13140$8U4....@news-server.bigpond.net.au...
That's not a bad idea. I'll need someone who has a C128 and Amiga.
> As for the Amiga, it has both digital and analog RGB lines. So, if you want
> the analog RGB, you plug the Amiga's analog RGB lines directly to pin
> 43,48,53 of the AD9985. Maybe inline a resistor or diode. But basically
> direct. You basically want to plug directly as possible. This device was
> actually designed more suited for an Amiga or VGA/SVGA PC then a Commodore
> 128 or CGA PC.
>
I got the Amiga working with it yesterday. I connected the analog (not
digital) outputs to the board. The range of colours was limited when
using the digital ones.
Anyway, as per the AD chip datasheet, the board includes a small analog
input circuit that terminates and couples the inputs. I hooked the Amiga
video output to the input of that circuit. It only includes a small
inductor, capacitor and resistor.
It's all good!
I'm your man! I've got several NTSC 128s and a PAL Amiga that runs here in the States on an NTSC power supply. I can test both PAL and NTSC analog modes for you, as I use both.
Either this thread has become confused, or I have. The original post
referred to a converter that will accept CGA and convert it to either NTSC
or PAL video. Joe's Great White Whale was supposed to accept both 40-column
(NTSC or PAL) and 80-column Commodore video and convert either to VGA. I
think they also wanted to accommodate the video of other "classic" computers.
Brian
--
Yep, you're exactly right, Brian! Except you forgot the part
about "they succeeded in making the device for PAL usage, but never
could conquer the problem with NTSC displays (for the C128 mode)",
that is.
Joe
"Payton Byrd" <plb...@bellsouth.no.spam.net> wrote in message
news:lhCLh.19279$Wc....@bignews3.bellsouth.net...
However, this thread began to evolve into a wider topic of converters suited
for C128 80-col. video. I am addressing Mangelore's part, and appear to
have provided an interesting input that seems to actually help. That is
surprising in itself.
"Brian Ketterling" <twee...@no-potted-meat-products-peoplepc.com> wrote in
message news:SqJLh.14446$tD2....@newsread1.news.pas.earthlink.net...
Well, I didn't forget the NTSC problem, but I was referring to their intent.
I guess you could substitute "intended" for "supposed" in my sentence above.
I wasn't following Commodore news for a while, and I guess I missed the
point when the C=VGA project spun to a stop. Are you saying that 40 column
NTSC worked when it was coming from 64's and 128's in 64 mode, but not when
it was coming from 128's in 128 mode? That seems implausible (I hope
VIC-II-generated video doesn't have a secret embedded signal to tell the
monitor what mode the computer is in ;-)! ).
Brian
--
Buck up, Rick -- your technical inputs are sure to find their mark now and
then :) .
Personally, though, I wouldn't mind a "for-dummies" synopsis of Mangelore's
project.
Brian
--
Thanks Payton,
I've sent you an e-mail with further details.
Cheers
Fotios
"Brian Ketterling" <twee...@no-potted-meat-products-peoplepc.com> wrote in
message news:1f4Mh.129690$_73....@newsread2.news.pas.earthlink.net...
Hi Brian,
It's just a device that allows you to connect either the C128 RGBI 80
column video port, or an Amiga 500/600/1200 video port, to a VGA monitor.
Cheers
Fotios
> It's just a device that allows you to connect either the C128 RGBI 80
> column video port, or an Amiga 500/600/1200 video port, to a VGA
> monitor.
It's an interesting device though. Right now I for one have nothing to
connect the RGBI from my 128 to and I don't really want to get an old
CRT just for that.
As rumor has it, you can use current Dell displays like the 2407WFP to
connect an Amiga, since they accept the low sync rates via VGA. The
monitor has composite, S-Video and component inputs too, so it makes
sense. Haven't tried it yet though.
It'll probably look like this, at best:
http://upload.wikimedia.org/wikipedia/en/5/59/CGA_CompVsRGB_Text.png
If your definition of "work" is whatever is barely legible in a fuzzy
rainbow of shifting colors, more power to you, I guess. I'd point out
that this picture on Wikipedia looks too good to be true; I don't know
how they took it. As you can see, white text turns into a multi-color
brew. I'd have expected worse. I also expect it to be shimmering
constantly on a live screen. No way that's staying constant.
The only way to get CGA properly encoded into an NTSC signal is with
some sophisticated signal processing. That's why you have to pay for
it: there's no demand for it. VGA to NTSC and NTSC to VGA are just as
hard, but almost everyone has a use for those, so they're mass produced
and thus cheap.
You have to read in a line with 16MHz pixel rate, then apply various
algorithms to limit the bandwidth and encode the colors in ways that
are compatible with the NTSC dot clock, and the limited bandwidth of
the I and Q sub-carriers. Oh not easy at all.
CGA to VGA is easier because you don't have to care about a tight
relationship between clock frequencies: the colors are not encoded,
they're just voltages on three wires, as opposed to phasors buried in
one wire. You just clock the data in at one speed and toss it out at
twice the speed, assuming you can get a clean clock signal.
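That clock-in-at-1x, clock-out-at-2x idea is just line doubling. A toy version (pure illustration; real hardware does this with a line-buffer FIFO clocked at twice the input dot clock):

```python
# Line doubling: each 15 kHz input scanline is emitted twice at the
# 31 kHz output rate, so a VGA monitor sees a stable picture.
def line_double(frame):
    out = []
    for scanline in frame:
        out.append(scanline)       # the original line
        out.append(scanline[:])    # the same pixels, played again
    return out

frame = [[0, 1, 2, 3], [4, 5, 6, 7]]   # two tiny scanlines
doubled = line_double(frame)           # four lines out
```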
Now if only someone could tell me why I can't get the LUT to work on
my project... My VGA output is purple, but it's sharp! (More or less!)
http://dfpresource.org/VGA_noLUT.jpg
The blurriness here is just JPG compression artifacts.
Anyways, I hope people read that Wiki page; although it's vague on a
few things, it should clear things up.
I'm interested in the CGA to VGA and NTSC to VGA... the CGA to VGA+
board would make a nice internal upgrade for my 128D... and it would
round it out nicely to be able to route the composite through to the LCD
monitor as well.
I understand why they're pricey, but that they're available at all is
the really attractive part... the price, however, is somewhat less
attractive... the CGA-VGA board prices out at $141 USD, the
NTSC/PAL-SXGA one I'd like is $185 USD, and the NTSC-DVI converter I'd
prefer, given a choice, is $275 USD. LOL, and that doesn't even include
shipping from Australia.
Those prices are reasonable. It's a lot of work to design one of
these. I mean, if I released my design as-is, you'd end up paying about
$33 for just the PCB, toss in another $60 for parts, and then you have
to solder it, test it, put it in a case (another $20), add a power
supply, and that's if you build it yourself and have the equipment to
do so.
Of course, you're looking at a lot more capital for a batch of these converters...
BTW, I did buy one of the RGB to VGA converters from www.converters.tv.
In a word: Sux0r.
Problem is, they SAY it can do RGBI, but it can't. So, now I have to build a
RGBI to RGBA converter to feed to this thing.
-Jay
Yes, I've used one of their converters. They are expensive and don't
support the I in RGBI, so you only get 8 colours.
I'm working on a solution that will be more economical and support RGBI.
It's very simple to integrate the Intensity signal. Just a few resistors
and diodes do the trick.
Hah. People say a lot of things about RGBI don't they?
A quick RGBI-to-analog-RGB converter means getting 3 DACs suitable for video.
You send the syncs in for clocking. The DACs should be 2 bits of resolution
each. Intensity would be tied to each. EGA just took three separate Intensity
lines, one for each of the R, G and B lines, getting you 64 colors.
Of course, once you have 15 kHz analog RGB, the next step is scan doubling.
Then you should have it viewable on VGA monitors.
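The arithmetic behind that 64-color claim: a separate intensity bit per gun makes each channel a 2-bit value, and 2 bits times 3 guns gives 64 combinations. A sketch with the usual 1/3 + 2/3 level weighting (my assumption, not quoted EGA documentation):

```python
# 2 bits per gun (primary bit worth 2/3, secondary bit worth 1/3)
# yields four levels per channel and 4**3 = 64 colours total.
LEVELS = [0x00, 0x55, 0xAA, 0xFF]
ega_palette = [(LEVELS[r], LEVELS[g], LEVELS[b])
               for r in range(4) for g in range(4) for b in range(4)]
```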
"Mangelore" <fot...@commodore128.org> wrote in message
news:mn2Oh.2289$M....@news-server.bigpond.net.au...
a> It'll probably look like this, at best:
a> http://upload.wikimedia.org/wikipedia/en/5/59/CGA_CompVsRGB_Text.png
Yeah, RGBI to composite is a no-go. I haven't really followed the
discussion, but isn't what we want RGBI to VGA? That can be done with
(relatively) simple scandoubling.
a> Now if only someone could tell me why I can't get the LUT to work
a> on my project... My VGA output is purple, but it's sharp! (More or
a> less!)
a> http://dfpresource.org/VGA_noLUT.jpg
That looks exactly like the DTV prototype that Jeri sent me for PAL
testing. I don't remember exactly what the problem was, but it was
related to the colour carrier inversion. You're not working with PAL
though, are you?
--
___ . . . . . + . . o
_|___|_ + . + . + . Per Olofsson, arkadspelare
o-o . . . o + Mage...@cling.gu.se
- + + . http://www.cling.gu.se/~cl3polof/
Hi,
RGBI -> composite is a meme that just won't die, because people toss
around terms like "NTSC" without really checking what it means. It's
not their fault; I think back then people called anything with an hsync
close to 15.7 kHz and a vsync close to 60 Hz "NTSC", just like people call
a 50 Hz vsync "PAL". NTSC and PAL should only refer to how colors are
encoded into a single signal.
But a monitor can accommodate some variation in the syncs. An NTSC color
demodulator is not as forgiving, and jamming any old signal in there
can only result in confetti. I'm not aware of how PAL works, but I'm
sure it's not a random signal either...
> That looks exactly like the DTV prototype that Jeri sent me for PAL
> testing. I don't remember exactly what the problem was, but it was
> related to the colour carrier inversion. You're not working with PAL
> though, are you?
No, we should stop talking about broadcast video terms once we're in
the RGB/VGA range.
The problem is how I decided to handle the RGBI->RGB conversion
digitally. I wanted to use the software look-up table (LUT) feature on
the AL250 chip. It *seemed* so easy when reading the datasheet... :)
I can email you my worksheet in open office format. Basically I
connect RGBI to R, RGBI to B and RGBI to B. I'm talking about the
digital RGB inputs on the AL250. This gives me 16 discrete values on
each primary color. I could then re-map any input nybble to any 8 bit
primary color.
So let's say I've got the "color" 1010 on the RGBI input, that's half
red + half blue on a RGBI monitor. I just map it as 0A0A0Ah on the
AL250's RGB input. Actually it's 505050h because the 250 doesn't have
the lowest bits, they are mapped internally as 0. So the input nybble
1010 ends up as 01010000 the "R" channel of the 250 internally. Green
is different because of the 565 format. (Or something like that. I
don't have the thing in front of me right now)
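If I follow that mapping correctly, it can be sketched like this. This is my reading of the post's own example (nybble 1010 landing as 0x50 on red), not the AL250 datasheet, so treat the helper name and the bit placement as assumptions:

```python
# Map a 4-bit input value into an 8-bit channel register whose low bits
# the chip forces to zero: 5 significant bits for red/blue, 6 for green
# (565 format). Reproduces the example above: 0b1010 -> 0b01010000.
def channel_register(nybble, significant_bits):
    return nybble << (8 - significant_bits)

red   = channel_register(0b1010, 5)   # 0x50, as in the post
green = channel_register(0b1010, 6)   # 0x28: green differs (6 bits)
```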
The big plan was to give users access to a RS-232 port on my PIC so
they can write their own LUTs on the fly and have different palettes
to suit their whims. 5 shades of green, 3 shades of blue and 8 shades
of red would be no problem for example. You could even have 15 new
LUTs that could be accessed by tapping the unused 250 inputs to the
user port. A new palette on every line like the IIgs... But that's
another story.
I use my spreadsheet to map those inputs to my outputs. I generate a
XML file containing the I2C commands to program the 250's registers.
I'm using a USB->I2C thingy that reads XML.
According to the 250 datasheet, all I have to do is tell it that red
LUT data is coming in, and keep writing red LUT data.
command: write to red LUT
command: write LUT address
command: write value
command: write LUT address
command: write value
command: write LUT address
command: write value
etc....
This resulted in diddly squat until I poked around the other registers
of the chip. But the damn data sheet is so terse it's open to
interpretation. So I tried this
command: write to red LUT
command: write LUT address
command: write value
command: write to red LUT
command: write LUT address
command: write value
command: write to red LUT
command: write LUT address
command: write value
etc...
Got canary yellow instead of purple. So WTF you know??
Anyways, I believe this is all solvable. Either I made a mistake I can't
see, the chip's faulty, I need a magic sequence of commands, or
something's wrong with my hardware. I just ordered the official 251
eval board from Averlogic. Another $250 expense. Gah....
The *big* *BIG* problem with my design is that I got so entrenched in
thinking in "video" terms like line-locked generator, genlock, PLL,
dot clock, etc... I couldn't see the forest for the trees. I made the
same mistake as everyone else. :)
The fundamental requirement IMHO is to regenerate a local 16MHz dot
clock in-phase with the incoming RGBI data, ultimately locking my box
to the 128's 16MHz crystal. Otherwise you get metastability problems.
(Or locking the incoming data to the local 16MHz)
The RGBI->VGA problem is essentially *not* a video problem, it's a
clock-domain crossing problem. Once I thought of it this way, I
decided to get rid of the whole VCO/PLL/divider approach since it
obviously won't work. (When the VDC gets re-programmed, how do I know
what to put in the PLL divider to get 16MHz back?)
And there is a lot of jitter on my VCO; the app note I got late in
the game basically says I couldn't use the genlock chip anyway under the
conditions of this project. I can email you this elusive app note. So
even if I built a local frequency meter on my PIC, I'd still have the
same noise and sensitivity issues in the VCO.
The wide tuning range of the VCO implies very careful layout and lots
of analog work. Not so much fun. I'd need 500uV of noise or less on
the VCO's supply.
Once you think of it as a clock-domain problem, you can just use a
crystal or resonator oscillator and use a data-synchronizer to get the
RGBI data re-clocked to the LO on my board.
Problem is that data synchronizers don't typically work with the same
input and output frequencies. (AFAIK, so there's a lot of reading planned
for this weekend!)
But that's OK, since I have now brainstormed a lead-lag phase detector
in a state machine that can be implemented in a PIC. So for any HSYNC
that comes in, I don't care about the divider, just the lead or lag for
that line, and I correct my LO for that line. I'll have complete
software control of the up and down pulses, even a "charge pump" of
sorts by varying the width of the up/down signals.
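A per-line lead-lag (bang-bang) detector can be sketched as a tiny state machine. Everything below is illustrative pseudocode in Python, not the actual PIC firmware, and assumes a higher control value means a faster LO:

```python
# Per incoming HSYNC: compare its arrival against the local oscillator's
# expected edge and nudge the LO control one step up or down. No divider
# needed; only the sign of the error on each line matters.
def lead_lag(hsync_t, lo_t, control, step=1):
    if lo_t < hsync_t:        # LO edge came early: LO is fast, slow down
        return control - step
    if lo_t > hsync_t:        # LO edge came late: LO is slow, speed up
        return control + step
    return control            # in phase: hold

# Three lines of "video": LO drifts toward the HSYNC timing.
control = 0
for hs, lo in [(100, 103), (100, 101), (100, 100)]:
    control = lead_lag(hs, lo, control)
```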
Since my LO will then be a VCXO of some kind, the Q is much higher
than a VCO's apparent Q, so sensitivity to temperature, time and
noise diminishes drastically.
I just need to find a hyper-abrupt varactor to fit into a 5V or 3.3V
system and read up on filters and state machines. I also need a few
silicon delay elements...
Anyways I'm just rambling on.
**************************************************************************
* Ham since 1937 HiSchool Sophomore ex W9ZUU, KP4EX, W4FAG, KH6ARG KH6JF *
* WW2 Vet since Sep 1940 to just After VJ day. US Signal Corps AACS *
**************************************************************************
> No, we should stop talking about broadcast video terms once we're in
> the RGB/VGA range.
Yeah: 15.7 kHz hsync RGB, 15.7 kHz hsync RGBI, and VGA/SVGA, etc. (31 kHz+
hsync RGB). In these formats I couldn't care less about NTSC or PAL. I only
care about how the power cycling rate affects the vsync. "NTSC" and "PAL"
get used as shorthand for power cycling rate standards, but really there is
both 50 Hz and 60 Hz PAL, so let's not worry about using the words "PAL" or
"NTSC"; they're meaningless even with analog RGB. I would only care about the
50/60 Hz power cycling that the vsync is (approximately) synced with.
> The problem is how I decided to handle the RGBI->RGB conversion
> digitally. I wanted to use the software look-up table (LUT) feature on
> the AL250 chip. It *seemed* so easy when reading the datasheet... :)
>
> I can email you my worksheet in open office format. Basically I
> connect RGBI to R, RGBI to B and RGBI to B. I'm talking about the
Right here I see the problem: "RGBI to B", twice. Shouldn't all the RGBI
lines be tied to each of the R, G and B lines of the AL250's digital input?
The reason everything looks red, blue or purple toned is that Blue is tied
in twice, but where is the green? An idea to think about. Once Green is tied
in, it will affect the tones and allow your yellows, greens, browns and some
other colors to appear, including your greys.
Unless this is just a typo and I have to re-think and look at the rest more
closely. Or I am missing something and not following.
> digital RGB inputs on the AL250. This gives me 16 discrete values on
> each primary color. I could then re-map any input nybble to any 8 bit
> primary color.
>
> So let's say I've got the "color" 1010 on the RGBI input, that's half
> red + half blue on a RGBI monitor. I just map it as 0A0A0Ah on the
> AL250's RGB input. Actually it's 505050h because the 250 doesn't have
> the lowest bits, they are mapped internally as 0. So the input nybble
> 1010 ends up as 01010000 the "R" channel of the 250 internally. Green
> is different because of the 565 format. (Or something like that. I
> don't have the thing in front of me right now)
>
> The big plan was to give users access to a RS-232 port on my PIC so
> they can write their own LUTs on the fly and have different palettes
> to suit their whims. 5 shades of green, 3 shades of blue and 8 shades
> of red would be no problem for example. You could even have 15 new
> LUTs that could be accessed by tapping the unused 250 inputs to the
> user port. A new palette on every line like the IIgs... But that's
> another story.
A cool way to increase the number of colors, and with clever coding, nice
things could happen. Nice for games on the 128.
Hmmm.....
Joe,
Enough with this NTSC/PAL stuff. The VDC is NEITHER NTSC nor PAL. It is simply
its own video architecture format. Basically, the VDC output is really just
a digital output bus. It isn't much different from the serial bus, except
that it is output-only AND it is a 4-bit bus with a couple of clock pins.
Then the video hardware device (aka the monitor) then takes the 4 bits and
converts it to analog R,G and B. The 4 digital bits are labeled,
R(Red),G(Green),B(Blue) & I(Intensity).
The monitor has a DAC board inside consisting of THREE 2-bit DACs. You might
think that it will do 64 colors, assuming there are 6 bits of DAC
resolution (2-bit DAC + 2-bit DAC + 2-bit DAC = 6 bits). You would only be HALF
right.
The problem is that the Intensity line (the 4th of the digital bits
coming in) is tied to the high bit of all three DACs' digital inputs.
Ultimately the output becomes 3 analog signals whose intensity steps are
locked together. If there were two more bits, they could be intensity 2 and 3,
or you could have the Intensity lines labeled Intensity-Red, Intensity-Green
and Intensity-Blue. Then you would tie only one of the Intensity lines to each
of the DACs; each DAC would have a separate Intensity line. Voila, you now
have primitive EGA.
If you want to go far enough, add 8 bits from, say, the User Port, plus the
one Intensity line built in, and you now have 9 Intensity lines, 8 coming from
the User Port. (You'll have some limits, but that is another issue.) You can
then upgrade the DACs to 4-bit DACs and put 3 of the 9 "Intensity"
lines plus the digital "red" line on the digital input of DAC-"Red", the next
3 Intensity lines plus the digital "green" line on DAC-"Green", and the
last 3 Intensity lines plus the digital "blue" line on DAC-"Blue". (Here, I
label the 3 four-bit DACs.)
They'll then produce 3 analog signals with 16 levels of intensity each. The
CRT itself accepts three analog inputs, one for each of the three raster guns
that produce the red, green and blue beams, which mix through the close
proximity of the beams as they converge on a point, giving you the colors you
see. I've oversimplified the overall process and omitted how the hsync and
vsync lines come into play.
However, there are possible snags with the approach, such as the possible
need for a buffer on the User Port input to hold the bits as needed. You
won't be making changes to the User-Port-sourced "Intensity" bits as fast as
the built-in intensity bit. Luckily, however, the Intensity bit and the
R, G, B bits are changed and sent at the character clock rate, not the dot
clock. As you can tell, the R, G, B, I lines are modified at 1 or 2 MHz and
buffered. I believe you can only change the colors at the character clock
rate; that is a video chip limit.
Anyway, the RGBI is digital and has nothing to do with PAL or NTSC.
The issue with C=VGA is TOTALLY unrelated. The issue there is likely
composite video to RGB/RGBI conversion. The biggest issue was likely decoding
the NTSC video signal from the NTSC C64's VIC-II chip and making a clean
analog RGB signal.
The process might entail digitizing the NTSC signal into digital RGB at
24-bit resolution; that is, decoding the analog color info and converting it
to 24-bit digital RGB. (Note: Intensity is not there, as it is really
meaningless; it is really just the luminance of R, G and B anyway.)
Then it is converted to analog RGB operating on a 31 kHz hsync,
regardless of the 50 Hz or 60 Hz power cycle rate (vsync).
NTSC or PAL is meaningless on the RGB side of things. The trouble was likely
in extracting the color info and converting it into RGB. PAL television
video was probably easier to extract. Converting it into RGB typically
involves some sort of LUT, choosing the RGB values that represent
the colors best, like in VICE.
The problem was probably in the extraction of the color info.
I can't say for sure. I'm just hypothesizing.
Then again, there may be nothing wrong with the 80-col. video mode in either
PAL or NTSC; there may be an issue with the 40-col. video. Then again, Alan
Bairstow is not the designer. Besides, IIRC the hardware designer/developer
is in the US, and it is obviously working in the US.
The main chip in the converter could be auto-defaulting to PAL television
input because the power running the converter operates at 50 Hz in PAL
regions, unless he is using a power converter for the 50/60 Hz. There are so
many factors that can be involved, but there is ABSOLUTELY no reason why
there should be any problems.
Hell, you can hook a C128D from the US to a 1902A monitor from Germany, IN
GERMANY, provided that you have a power converter for the C128D so you can
plug it into a European power outlet. (Note: the German 1902A would be
plugged directly into the European power outlet.)
There is ABSOLUTELY no problem viewing the 80-col. mode. It is your 40-col.
mode that won't work right.
Same deal with taking an IBM XT from the US to Germany and plugging in a CGA
monitor from Germany. As long as the power issues are dealt with for the PC,
there is no problem with the video. Same SHIT! Same video format! That is why
it is called RGBI (sometimes called CGA, and sometimes called digital RGB).
If Alan Bairstow is reading c.s.c., I am pretty sure that he has read my
message!
"Joseph Fenn" <jf...@lava.net> wrote in message
news:Pine.BSI.4.64.07...@malasada.lava.net...