On Tue, 7 Jun 2022 at 13:07:14, NY <m...@privacy.invalid> wrote (my
responses usually FOLLOW):
>"J. P. Gilliver (John)" <
G6...@255soft.uk> wrote in message
>news:tXSbS6N7CzniFw+i@a.a...
>>>line frequency affect colour decoding? I presume the PAL crystal in a
>>>TV is nowhere near as tightly controlled as the studio reference
>>
>> True - it is (or was) about the commonest frequency made, but even so
>>cheap crystals can have a tolerance as poor as 500 ppm, but ..
>>
>>>frequency, and PAL has to be able to cope with that, hopefully
>>
>> ... that's what the colour burst at the start of every line is for -
>>to give the local oscillator in the set a frequency, and more
>>important a phase, reference. So the set only needs to hold the
>>reference for one line - 64 μs (less actually because of sync pulse
>>and porches).
>
>The problem comes when the colour is encoded at one frequency and
>decoded at another (because of tolerances in the receiver crystals).
>The colour burst gets the local oscillator into the correct phase
>initially, but it then drifts out of phase because the frequency is
>slightly different. And being double-sideband modulation, you don't get
>a frequency shift (as with SSB), you get periodic fading. I suppose in
>a really bad case, where the beat period is less than a line period,
>you'd start to see variations in saturation across each line.
>
Let's do the sums (I haven't until now). 4.43... MHz, with a pretty
grotty crystal - let's say 500 ppm out, which is 0.0005. Hmm, over 2 kHz
out; I didn't think it would be as much as that. (Maybe TV crystals are
better than that; that's just the worst tolerance I can ever remember
seeing quoted for a crystal oscillator.) But assuming that, how far out
could it get in 60 μs (64 less the sync pulse - not sure about porches)? About
one-seventh of a cycle. Yes, I guess that would start to show hue
distortion. Can't say I've ever seen it, though; maybe crystals were
better than that; if they were 100 ppm or better, that gets us down to
about 1/35 - about 10 degrees - which I think wouldn't show. Plus, if
the colour burst gave a shove to frequency as well as phase ... I don't
know if it did (or was even long enough to do so effectively; probably
not).
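The sums above can be checked with a few lines of code (a sketch only - the 60 μs active-line figure and the ppm values are the rough assumptions used above, not measured numbers):

```python
# Rough check of the phase-drift sums above (illustrative figures only).
F_SC = 4_433_618.75    # PAL subcarrier, Hz
ACTIVE_LINE = 60e-6    # approximate active line time assumed above, seconds

def phase_drift_deg(ppm):
    """Phase error accumulated over one active line for a given
    crystal tolerance, in degrees."""
    f_err = F_SC * ppm * 1e-6           # frequency error, Hz
    return f_err * ACTIVE_LINE * 360.0  # cycles -> degrees

for ppm in (500, 100):
    print(f"{ppm} ppm: {phase_drift_deg(ppm):.1f} degrees per line")
```

Which comes out at roughly 48 degrees per line at 500 ppm (the "one-seventh of a cycle" above) and under 10 degrees at 100 ppm.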
>
>>>producing saturation rather than hue errors. I realise that before
>>>colour, TVs synced to the mains, so tuners had to cope with a
>>>variation in field rate of 50 +/- 0.5 Hz.
>>
>> I think some Baird Televisors used the mains, but for the frame/field
>>sync., no, that's what the sync pulses are for - even system A ("405"
>>lines) had sync. pulses. (Think about it: how did battery TVs work!)
>>Colour sets didn't do sync. any differently to monochrome sets - they
>>certainly didn't divide down the colour subcarrier, or multiply up to
>>it. Yes, the frequency of the subcarrier was _selected_ to have a
>>very precise relationship to the line and field pulses, but that was
>>to minimise artefacts (such as dot patterns, especially on B/W sets)
>>- sets did not do the divisions/multiplications inherent in that
>>selection. (You only have to look at what home VCRs did to the colour
>>signals, and sets still worked with the output from those!)
>
>When did TV studios stop being synchronised to the mains and start to
>synchronise to a crystal? I thought it continued well into 240- and
>405-line days, and it was mainly the introduction of colour that
I think they kept an _eye_ on the relationship, if only to avoid pulsing
when using artificial lighting (even incandescents pulse somewhat). But
I think it was more a matter of nudging the station master oscillator to
avoid that, rather than actually _deriving_ it from the mains.
Anyone here know?
>prevented it because anything other than an exact line/frame rate would
>destroy the careful f(sub-carrier) = f(line) * (n+1)/2 + 25 Hz
>relationship, so you'd get variable dot-patterning which the exact
>relationship was designed to minimise.
>
I imagine that relationship was derived by getting both from a master
oscillator, for the reason you describe: if there _was_ any tweaking
done, it would retain the relationship (i.e. the line/frame structure
would be tweaked by the same amount the subcarrier was, so the
relationship would remain fixed).
>
>VCRs (well, VHS anyway) were a lot less tolerant of non-compliant
>waveforms than TVs were. My first computer (Transam Wren - CP/M3) had a
>625/25 RGB output for driving an RGB monitor, but had no composite
>output. Colours could only be seen as shades of amber on the amber
>screen that was built in. So I made a PAL converter, using a PAL
>encoder IC and a PAL 4.43 MHz crystal (*). This drove the TV's BNC
>input and produced fairly good results, subject to the poor HF response
>of the average TV and therefore the blurring of fine horizontal detail.
>I've no idea whether the timing of the various components (line/frame
>rate, front/back porch, equalising pulses) was correct. I never got a
>chance to try the RGB output on an RGB monitor, but I did try feeding
>the composite output through a VCR. Feeding it live (ie PAL input
>remodulated onto an RF output) showed noticeable tearing and some
>shimmering of colour. Recording and playing back was more amusing: the
>VCR's head struggled to sync and colours flickered randomly. Head-sync
>problems suggest signal timing errors. Loss of colour and alternating
>correct/complementary colours suggests an error in the PAL frequency.
I'm not surprised at the colour problems, given the number of stages the
colour (difference) signal went through in a domestic VCR. I'm a bit
surprised at the head sync problems; I'd have _thought_ a computer would
generate more precise video than, say, many video cameras of the era (it
wasn't unknown for them to use two completely independent and not
connected oscillators for line and field/frame!). If the head was
"motorboating" that probably didn't help the colour timing either.
>
>My PAL decoder was a great idea, and at least on a TV that had baseband
encoder (-:
>input (BNC or SCART) you could see the colours. The ability to record
>to tape was one of those "let's see what happens" experiments rather
>than something that was needed. It's a shame that typical PC monitors
>expected US TV timings (640x480) or else computer-standard timings for
>800x600, 1024x768. I did try the Wren to the monitor on my first
>IBM-type PC but it wouldn't sync. No doubt a monitor designed for a BBC
>Micro would have worked perfectly - but would have been useless for PC
>signals.
At the standard resolutions, yes. Some of the graphics cards of those
days (I don't know if still) could be directly programmed in terms of
frequencies - I did meet one 405-line enthusiast who, rather than use
one of the converter boxes popular at the time (and maybe still) among
that fraternity, generated a system A signal direct from her computer (I
think from Linux). But yes, DOS and Windows probably only put out the US
TV standard at the time. (I wonder if it was even interlaced.)
>
>The ultimate in VCRs producing weird signals was the more recent VHS
>machines which were designed for PAL but which could play back a tape
>that had been recorded on NTSC. The line/frame rate was 525/30 which
>some TVs would sync with, at the expense of a very loud relay clonking
>as it changed over, and reduced picture height causing the aspect ratio
>to be wrong. But the colour was decoded correctly. Apparently those
>VCRs were designed to produce a hybrid output: 525/30 but with
>pseudo-PAL encoding of colour so a UK TV would be able to decode it. When I
Or PAL-M as I think it was called (and I think Brazil and possibly a few
others even broadcast it).
It confused some sets, too: some had a line/field rate detector
(presumably crude, but more than sufficient), and if they received
525/60, assumed they were getting NTSC and switched the colour decoding
accordingly - so if you fed them 525/60 PAL, they didn't decode it!
(I presume the 525/60 PAL those machines produced used the 4.43...
subcarrier European PAL sets were expecting, so the careful calculations
re avoiding dot crawl/patterning didn't work.)
>went over to stay with my sister who was living in the US at the time,
>I recorded a bit of US TV on their multi-standard VCR (PAL/NTSC/SECAM)
>and brought it back to see what my PAL-only equipment made of it,
>because I'd heard rumours of partial compatibility. It was a one-way
>process: most "modern" PAL VCRs and TVs could sync with NTSC, but
>virtually no NTSC equipment could do PAL - an example of the US "the
>world stops at America's borders" syndrome ;-) My sister was warned to
Like the "World Series" baseball (is it? Or American football? I don't
follow either).
>buy her multi-standard equipment in the UK before setting off, because
>it would not be available for love nor money in the US.
>
At least things have improved slightly since then: I don't know how
compatible digital TV standards are between US and Europe, but now at
least in Folkestone the odd time I can get French TV, it decodes fine,
whereas previously, even ignoring the PAL/SECAM question, the different
sound separation meant I couldn't get both sound and vision. (I do
remember back in the day displaying their 819-line on a 405 set though!
Just a curiosity - no sound and you got two tall thin side-by-side
pictures.)
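Going back to those line/field-rate detectors a few paragraphs up: the logic would have been something like this (a purely illustrative sketch - the 2 Hz tolerance is my invention, not anything a real set documented):

```python
# Illustrative sketch of the crude standard detector described above:
# classify by field rate alone - which is exactly why 525/60 pseudo-PAL
# fooled such sets (the field rate says NTSC, the colour encoding is PAL).
def guess_standard(field_rate_hz):
    if abs(field_rate_hz - 50) < 2:
        return "PAL/SECAM (625/50)"
    if abs(field_rate_hz - 60) < 2:
        return "NTSC (525/60)"
    return "unknown"

print(guess_standard(50))      # PAL/SECAM (625/50)
print(guess_standard(59.94))   # NTSC (525/60) - pseudo-PAL lands here too
```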
>
>
>(*) A project in an electronic magazine. I forget whether I had to use
>resistors to derive the luminance signal from the correct proportions
>of RGB, or whether the IC did that itself. Mains hum was a problem: I
>found that I had to drive the circuit from a battery because a
>supposedly smoothed PSU caused a lot of ripple in the picture. This was
>before the days of switched-mode power supplies everywhere: nowadays a
>mobile phone charger that generated USB-compliant power would probably
>have been fine. A bridge rectifier with smoothing capacitors was
>probably not good enough ;-)
Yes, "smoothed" still has _some_ ripple. A linear regulator might have
helped - the ubiquitous 78 series in those days, probably.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)Ar@T+H+Sh0!:`)DNAf
"If god doesn't like the way I live, Let him tell me, not you." - unknown