Did they make digital TVs compatible from the US to Europe to Asia to
Australia, etc.?
I think they should have. If not, is it only the 50 versus 60
vertical scan rate that was the problem?
I don't think I've read anything about this.
The following gives an indirect answer...
http://en.wikipedia.org/wiki/Digital_terrestrial_television
...which appears to be "no". There is no law of nature that prohibits a
multi-voltage, multi-standard receiver, but there is a law of economics --
there's little or no demand for one, as it would be useful only to people
who travelled a lot.
As for a single-inventory non-portable "universal" receiver... It would cost
more than a set that received only the local standard, so, again, you have
economics working against a multi-standard receiver.
The North and South American standard is NTSC, which transmits 30 frames
per second, while PAL, used in Europe, is 25 frames per second. The
switch to digital didn't affect that.
Digital is neither NTSC nor PAL. Those are exclusively analogue. It's rather
annoying that DVDs are labelled as NTSC and PAL when what they're referring
to is a region.
--
*He who laughs last, thinks slowest.
Dave Plowman da...@davenoise.co.uk London SW
To e-mail, change noise into sound.
It wasn't just a 60 Hz/50 Hz scan-rate issue. NTSC is 525 lines (480 of
picture), versus 625 lines (576 of picture) for PAL. They also use
different methods of modulating the color in the signal. SECAM is
similar to PAL, but the color encoding is different yet again.
It depends a bit on how you will view the signals. The basic HD formats
(720p, 1080i) seem to be the same everywhere, so connecting an HD
receiver (satellite or cable or similar) or something like BluRay or
upconverting DVD would be somewhat universal.
Many electronics these days have universal power supplies, and can
handle 110-220V@50-60Hz.
The hard part is if you want to use an antenna. Frequencies and even the
way the digital signal is modulated vary from country to country,
not to mention the differences in SD format.
--
If there is a no_junk in my address, please REMOVE it before replying!
All junk mail senders will be prosecuted to the fullest extent of the
law!!
http://home.comcast.net/~andyross
Digital TV has its own formats and standards. It is NOT a "digitization" of
NTSC or PAL.
Nevertheless, European TV is still 25 fps, and US TV is still 30 fps, is
it not? Or am I more confused than normal today?
> Nevertheless, European TV is still 25 fps, and US TV is
> still 30 fps, is it not? Or am I more confused than normal
> today?
The latter, probably. Check with the Wikipedia article to get an idea of
what the actual formats are.
Who are these "they"?
Sort of. Multisystem TVs were common in the 1980s. There were only four systems
of video, although there were lots of ways to transmit them.
They were NTSC (60 Hz, 3.57 MHz color carrier), 50 Hz PAL, 60 Hz PAL, and 50 Hz
SECAM. There was also 405-line UK TV (dropped in the early 1980s) and
NTSC 4.43 (the same signal, with the color carrier moved to make playback
equipment cheaper).
I still have a 1985 Sharp TV set that will play both NTSC versions, all PAL
versions, and SECAM from anywhere except France. I had a 14-system VCR that
would play and record French SECAM, and a different TV set to play it on.
My kids use a 21-inch 4:3 CRT that is similar, except that it does not
have a French tuner. It added component and S-video instead.
> Did they make digital tvs compatible from the US to Europe to Asia to
> Australia, etc?
I also have had VCRs that included digital TV standards converters. They
were multisystem VCRs with the conversion feature added on top.
But digital TV was not needed; analog TVs played the signals fine. It was
just a matter of adding the correct hardware.
> I think they should have. If not, is it only the 50 versus 60
> vertical scan rate that was the problem?
The color carrier. NTSC used a phase-modulated color carrier at 3.57 MHz. PAL
used a similar carrier at 4.43 MHz. To fix a problem noticed in NTSC signals,
the BBC adopted the practice (which was in the proposed NTSC spec but
dropped to save money) of alternating the phase every other line, hence
the name PAL (Phase Alternating Line).
TV sets which would lock on 50 Hz or 60 Hz signals as appropriate were not
a technical issue, and by 1980 almost all sets made would do so anyway.
SECAM used a different decoding method, but those chips were easily found,
and it was common to see TV sets and VCRs that would play/record SECAM signals
broadcast using PAL over-the-air standards. Eastern Europe (the Warsaw Pact
countries), most Arab countries, China, and the USSR used some form of SECAM-
encoded signals with PAL frequencies.
The French used a different channel spacing, and AM sound, which made
their SECAM signals impossible to tune without the correct tuner. It also made
Eastern European TVs worthless in France and vice versa.
> I don't think I've read anything about this.
You must either have had your head under a rock, or live in the US and never
have traveled out of there.
Note that I had several multisystem TV sets, VCRs (Beta and VHS), and even
a portable combination AM/FM/SW receiver and TV set that looked like a
Star Trek tricorder, all purchased in the 1980s in Philly.
Geoff.
--
Geoffrey S. Mendelson N3OWJ/4X1GM
Those who cannot remember the past are condemned to misquote it.
> Who are these "they"?
Akai, Sony, Toshiba, JVC, NEC, Hitachi, Sharp, Panasonic (National), and
Memorex (Radio Shack's house brand) are just the TVs and VCRs I've owned.
AFAIK, the TV systems are STILL incompatible;
Europe uses different broadcast modulation schemes and different frequency
assignments.
--
Jim Yanik
jyanik
at
localnet
dot com
The frequencies which suit small densely populated countries close
together might well not suit a large one with large distances between
centres of population.
--
*Starfishes have no brains *
** False argument.
The video signal that is digitised varies in the number of lines and fields
per second.
PAL is synonymous with 50 fields per second.
NTSC is synonymous with 60 fields per second.
"NTSC"-badged DVDs, when played on most DVD players, come out as "PAL 60"
video, where the number of lines is correct but the field rate is 60 Hz.
The TV set in use must be able to cope with this.
.... Phil
I assume in this case you are talking about digital TV. It all depends upon
how you look at it. I don't know about the pre-war 405-line English system,
which was finally shut down in the 1980s. However, the 525-line US system and
the 625-line English/French systems were basically the same: a "flying spot"
of light, zero volts being white and about one volt being black. The scanning
speed was the same; the US system had fewer lines because it scanned 60 times
a second, the English/French 50.
A DC synchronization ("sync") pulse was included to keep everything
together, so if the signal got scrambled, the TV would bring it back together
quickly.
Those rates were chosen because the studio lights were arc lights that flashed
on and off at the power-line rate, so the TV cameras had to be synchronized
to them or you would get moving black stripes across the screen.
The RCA system for compatible color TV (compatible with black and white)
used 1/4 of the color information, based on the fact that your eye only sees
about that much. The color information was encoded on a phase-modulated
3.57 MHz subcarrier, which at the time was beyond the picture information, but
still within the transmitted signal.
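As a sketch of why 3.57 MHz in particular (my own arithmetic, using the standard 455/2 relation, which the post does not mention): the subcarrier is locked to the line rate as an odd multiple of half the line frequency, so the chroma dot pattern inverts on successive lines and is less visible on monochrome sets.

```python
# Colour NTSC slows the nominal 15750 Hz line rate by a factor of 1000/1001;
# the subcarrier is then 455/2 times the line rate.
line_rate = 15750 * 1000 / 1001    # ~15734.27 Hz
subcarrier = line_rate * 455 / 2   # ~3.579545 MHz

print(round(subcarrier))           # 3579545
```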
The original RCA system alternated the phase of the carrier every line,
so that it would fix itself if there was a transmission or synchronization
problem. To save money, the National Television Standards Committee (NTSC),
which chose the standard, dropped the alternating phase.
When the BBC adopted its 625-line system to replace the 405, it used a
modification of the original RCA system with a 50 Hz field rate (25 Hz frame
rate), which gave it 625 lines. Because there was more modulation, 3.57 MHz
was still inside the picture, so they moved the color subcarrier up to
4.43 MHz. As an "in your face", they called the system PAL, Phase Alternating
Line, to differentiate it from the NTSC choice.
The French used a different color encoding system called SECAM, which was also
based on the RCA system (1/4 color, 4.43 MHz color carrier) but designed
to be totally incompatible, so that you could not watch French TV in England
and vice versa.
NTSC stands for National Television Standards Committee, PAL for Phase
Alternating Line, and SECAM is a French acronym for what could be loosely
translated as "system of transmitting color TV".
Although the frame rates were different, and the color carriers at different
frequencies, the information was basically the same, and pretty much encoded
the same way. So it was pretty easy, but expensive, to build multisystem
TVs.
Except for the people in the Channel Islands, or on the coasts of England
or France, there was no cross-border reception of signals, so no one would
buy them anyway.
As the 1960s progressed and TV spread throughout the world, variations of
NTSC, PAL and SECAM were adopted either because the standards fit the
former colonial powers that ran the countries, or because they did not fit the
country next door. So the UK used PAL, the French SECAM, Germany PAL (but
modified so that the tuners would not work with UK signals), East Germany
used SECAM (but modified to use the cheaper West German tuners), and so on.
So there were many ways of encoding the video, but it all came down to a
number between 0 and 1 for brightness plus 1/4 of the color information.
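To put a number on "1/4 of the color information" (a sketch of my own, using the 4:2:0 chroma subsampling that later digital formats standardised, which the post does not mention): each colour-difference plane carries a quarter as many samples as the brightness plane.

```python
# For an illustrative 720x576 (PAL DVD) frame under 4:2:0 subsampling,
# each chroma plane is half the width and half the height of the luma.
w, h = 720, 576
luma_samples = w * h                  # 414720
chroma_samples = (w // 2) * (h // 2)  # 103680 per chroma plane

print(chroma_samples / luma_samples)  # 0.25
```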
In the early 1980s, satellite TV became a problem. Multisystem TV sets
existed, and once you put a signal up, there was no way to stop someone from
receiving it if they could see your signal. In the US, the requirement for
a federal license for a satellite dish was dropped, and in many places there
never was one.
HBO was the leader of the movement to prevent people watching these signals
and pushed for a way of encrypting satellite video. What they did was to
embrace the original MPEG-1 video standard, which was then encrypted using
the US DES (Data Encryption Standard). DES was chosen because it was illegal
to export DES chips from the US, which made it illegal to export HBO
receivers.
The MPEG-1 standard was simply a digital compression scheme based on taking the
two relevant bits of information, brightness and color, combining them, and
applying various mathematical compression algorithms. In the end, though, what
went in was very much the same INFORMATION as in an analog TV signal, because
that's what they had coming in and that's what they wanted coming out.
The MPEG-1 standard included various other things, such as the ability to
have more than one video program, more than one audio channel per program,
and several different digital audio compression choices, from none up to
what later became MP3 (a shortened form of its full name).
Over the years there have been improvements to the MPEG-1 standard, leading to
MPEG-2 (aka MP2), which is used on DVDs. DVDs, for those who don't know,
are MPEG-2 video streams represented in flat files, with some extra indexing
information.
In some places there was a short flirtation with encoding MPEG-1 signals
on CDs (Video CDs). Commodore made a version of the Amiga called the CDTV,
using the Philips system, and I think there was a competing Sony one.
VCDs took off eventually because video tapes and players and later DVDs
were taxed over 200% in some countries, but computers with CD drives
were not. :-)
There are many compression techniques in use, but the ones used for TV
transmission still work very much the same way, with the light level and color
information being the same as it was in the RCA system.
The data transmitted is still almost universally MPEG transport streams,
with different compression and encoding methods. Because some countries
still have TV sets that flash 60 times a second and others 50, the
frame rates of 25 and 30 have been kept, but are really meaningless. There
really are three rates in use: 24 (film), 25 (used for film and video) and
30 (video). TV sets just play them, and whatever decoder box or disk
player you use converts them to the national standard that is expected of it.
What are loosely called the MPEG-4 standards have no fixed frame rate per se:
a frame changes only when the information on the screen changes. So a
live-action sporting event may have the full 25 or 30 frames per second, but a
shot of two people watching a sunset in silence may only have 10 or 12.
As for over the air, there are three currently used systems of digital
TV. It's up to each country to decide which standard is used, and I'm sure
politics matters. The most common is DVB-T (Digital Video Broadcasting,
Terrestrial), which has been in use in the EU for a long time now.
It's relatively simple, cheap to produce, and unencumbered by expensive
patents.
The US uses a system called ATSC (Advanced Television Systems Committee),
which is different from DVB-T, although it does basically the same thing.
Compared to the DVB-T system, which is much older, it uses more sophisticated
chips, with more expensive patent licenses.
DVB-T and ATSC tuners are incompatible. My guess is that was done so that
US manufacturers could get a financial incentive for choosing that system,
in terms of licensing fees, instead of fighting cheap knock-offs from China.
There are companies that manufacture dual DVB-T/ATSC tuner chipsets; they
are targeted at laptops but will eventually find their way into pocket TVs
for travelers.
The third system, which I mentioned, is Japanese in origin and is incompatible
with the other two. I know nothing about it, except that a few South Asian
countries have chosen it.
So, if you are still reading, the answer is basically that while the INFORMATION
has not changed since the early 1950s, the way of encoding, compressing and
transmitting it has changed, but that does not make it inaccessible.
Just as you could buy a multisystem analog TV or VCR to cross borders, as it
were, you can still do so digitally. Since the videos transmitted are basically
the same (MPEG transport streams) worldwide, it's just a matter of a tuner chip
if you go (signal-wise) from country to country, and if you receive your
signals by another method (over the internet, from a recording, etc.), then
they are pretty much the same.
> NTSC stands for National Television Standrds Comittee, PAL for Phase
> Alternating
> Line, and SECAM is a French acronym for what could be loosely translated
> as
> system of transmitting color TV.
** Everyone knows that NTSC stands for:
" Never Twice the Same Color"
and SECAM =
" Something Essentially Contrary to the American Method "
.... Phil
As stupid as always. VITS took care of that over 30 years ago. That
was long before you had your last coherent thought.
--
You can't fix stupid. You can't even put a band-aid on it, because it's
Teflon coated.
The real problem was not that the NTSC system did not have the autocorrection
that was in the original design and used in the PAL system. The real problem
was that there was a knob on the TV set that could make everything change
color.
Even with the early 1960s transmission errors, and differences between
the actual colors of various sources, if the color control was set and
left at "about right", it always would have been a watchable picture.
The problem was that almost no one had any clue of how to adjust it properly,
and most were set and left in a very wrong position, while others were
being constantly misadjusted.
All of the TV magazines, science mags, etc had articles on how to properly
adjust your TV set, and I'm sure that for everyone who read and followed
them, there were 10 times the people who didn't.
It was really bad in areas where there were many TVs, such as a department
store. For some strange reason, the cheap TVs were never adjusted properly and
the expensive ones always were. :-)
** Most any TV set has internal adjustments for colour quality as well as
the usual external ones. However, each maker has their own ideas of how to
set the colour balance ( or colour temp) of a screen - possibly to be
technically accurate OR to look " nice " to most viewers.
Means that a row of different TVs in a shop all look different.
Baffles the brains of nearly all potential customers who insist on the
totally specious notion that they can immediately decide which is the best
by just comparing them with their eyeballs.
A similar nonsense goes on with stereo speakers and other bits of audio gear
too.
You have got NO hope WHATEVER of convincing anyone that merely looking at
a pic on a screen or listening to a pair of speakers is NO WAY to tell how
good either is.
.... Phil
Don't arc lights work on DC?
But I don't think that's correct. For it to work, TV would have to be
mains locked. It was in the very early days, but later was pulse generator
locked with no direct reference to mains other than being nominally the
same frequency. Mains lock was really just to make receiver design simpler.
The only type of light I've seen which gives problems flicker wise on a TV
camera is fluorescent. Before high frequency ballasts became available,
the work round was to use them in groups of three - from different phases.
--
*No husband has ever been shot while doing the dishes *
> But I don't think that's correct. For it to work, TV would have to be
> mains locked. It was in the very early days, but later was pulse generator
> locked with no direct reference to mains other than being nominally the
> same frequency. Mains lock was really just to make receiver design simpler.
You sort of danced around it. In the very early days is when the frequency was
set. Once set it stayed.
That really was the point of my very long explanation. A long time ago someone
decided to fix the scan rate and representation of data. The actual information
used in all the TV systems was the same; it was just used with incompatible
frame rates, encoding systems and transmission systems, mostly for political
reasons. TV sets that could receive, decode and play any and all signals
existed.
The reason that everyone did not have a universal TV set was that the price
was kept lower with single-system sets, and countries like the UK, which made
a substantial income from the TV licence, did not want you watching TV from
France or the Republic of Ireland for free.
From a technology point of view, it was obvious that the digital TV standards,
MPEG and so on, were designed with existing TV sets in mind. If not, they
would not have been a continuation of the old limited national standards with
their horrible color encoding choice (1/4 of the resolution that the
monochrome signal had), and would instead have gone with the more extensible,
accurate and easily compressible RGB system used in computer data.
Actually, the sync pulses keep the horizontal and vertical scanning in the
receiver at the same frequency and phase as the transmitted signal.
> Those rates were chosen because the studio lights were arc
> lights and flashed on and off at the power line rate, so the TV
> cameras had to be syncronized to them or you would get moving
> black stripes across the screen.
This might have been a consideration, but the principal concern was "hum
bars" in the receiver. Modern power supplies are sufficiently well-filtered
that this isn't a concern.
> The RCA system for compatible color TV (compatible with black
> and white), used 1/4 of the color information based on the fact
> that your eye only sees about that much.
Actually, it's more like 1/3.
> The color information was encoded on a phase modulated 3.57MHz
> subcarrier, which at the time was beyond the picture information, but
> still within the transmitted signal.
Actually, it was within the picture (luminance) information. NTSC has always
had a potential video bandwidth of 4.2 MHz.
> The original RCA system, alternated the phase of the carrier every line,
> so that it would fix itself if there was a transmssion or syncrhonization
> problem. To save money, the National Television Standards Commitee
> (NTSC) which chose the standard, dropped the alternating phase.
Actually, it was dropped because it didn't seem possible at the time to
design a reasonably priced receiver that would take full advantage of this
feature (in particular, the elimination of the Hue control). Also, the US
distribution system didn't have problems with non-linear phase, so PAL had
little practical advantage.
Also, the original proposal used red and blue color-difference signals,
rather than the more-efficient I and Q. The original NTSC proposal was
virtually identical to PAL. (If you don't believe this, I have a copy of
"Electronics" magazine that confirms it.)
> The French used a different color encoding system called SECAM,
> which was also based on the RCA system (1/4 color, 4.43mHz color
> carrier) but designed to be totally incompatible so that you could not
> watch French TV in England and vice versa.
SECAM stands for "séquentiel couleur à mémoire" (sequential color with memory).
SECAM was actually adopted because the French were idiots. They wanted a
system that was relatively easy to record on videotape. Unfortunately, it
made the receiver more-complex and expensive. A classic example of lousy
engineering.
Though that might be the common opinion, it is, of course, untrue. There is
nothing inherently unstable or inaccurate about NTSC.
Actually, the real problem was that the networks didn't give a damn about
getting the color right.
This changed (I think) sometime in the late 70s. I've owned a number of
color TVs since then (want me to list them?), and don't remember even once
having touched the Hue control (incorrectly called the Tint control on most
sets).
It's significant, though, that if the average [censored] is given free hand
to adjust the Hue control, flesh tones almost always wind up on the green
side.
There are specific standards for color temperature and color accuracy. Any
"good" set should have a user-selectable setting for 6500D. Many sets
have essentially perfect primaries ("perfect" in that they meet the
standards). Most sets have slightly "off" tracking, however.
Left to my own devices, I tend to set the color temperature rather high --
9000K or so. This is perhaps because "noon daylight" looks yellow to me.
The principal problem is that the out-of-the-box settings almost always have
the brightness and contrast jacked way up, so the naive viewer will be
impressed. This is roughly equivalent to "the louder speaker sounds better".
** You have got to be the most ignorant wanker on the planet.
.... Phil
>> Though that might be the common opinion, it is, of course,
>> untrue. There is nothing inherently unstable or inaccurate
>> about NTSC.
> ** You have got to be the most ignorant wanker on the planet.
When was the last time you adjusted the Hue control on an NTSC receiver?
That's not a rhetorical question.
The over-the-air signals were also spaced differently than PAL, and instead
of FM audio like everyone else in the world, they used AM. So even if
you could manipulate your TV tuner into picking up the video signal, and did
not mind watching it in black and white, there was no sound.
The rest of the world that did adopt SECAM used the PAL over the air
channel spacing and audio carriers, so that a PAL VCR could record/play the
signals with very little modification if any at all and a PAL TV could play
them in black and white, with audio.
The system was called MESECAM (Middle East SECAM, because many Arab countries
adopted it). I think the Warsaw Pact countries, Soviet Union and China (PRC)
also did, but the Soviet VCRs ran at a different speed than the regular ones.
There was also NTSC 4.43, which was a 60 Hz NTSC signal with the color
subcarrier at 4.43 MHz. It was developed as a cheap way of adding NTSC
capability to multisystem VCRs and TV sets, but was never broadcast over the
air.
That's why I said that the OP must either have spent the last 30 years under
a rock or lived in the US. In the US no one cared, everything was NTSC or
converted to it for sale, while elsewhere in the world everyone was trying to
get multisystem TV sets and VCRs.
You could buy them in the US too, but only in stores that catered to
foreigners, visitors and sailors on leave.
>> Did they make digital TVs compatible from
>> the US to Europe to Asia to Australia, etc?
>
>The following gives an indirect answer...
>
>http://en.wikipedia.org/wiki/Digital_terrestrial_television
>
>...which appears to be "no". There is no law of nature that prohibits a
>multi-voltage, multi-standard receiver, but there is a law of economics --
>there's little or no demand for one, as it would be useful only to people
>who travelled a lot.
The reason I care is the opposite of that. There are only two
DVDRs with hard drives for sale in the US, and one is cheaper than the
one I have, which itself is inferior in design. The other may be
better or not. However, there are other models for sale in Australia,
and probably other parts of the world. I want to buy one from
Australia and use it here.
>As for a single-inventory non-portable "universal" receiver... It would cost
>more than a set that received only the local standard, so, again, you have
>economics working against a multi-standard receiver.
What I had in mind wasn't** a multi-standard receiver but their
adopting one standard for the whole world, something they didn't do
with B&W or color TV, for understandable reasons.
From reading the first few replies, I guess the reason there is no
single standard now is so that digital TV would play on analog
televisions, and that making a set-top box or digital-to-analogue
converter which would also change the frame rate was considered hard.
**OTOH, I have a broken DVD player that plays both NTSC and PAL DVDs, and
the girl who gave it to me said it cost 40 dollars. It even has a
button on the remote to change from NTSC to PAL and back. So the part
that handled the second format couldn't have been more than 5 dollars,
maybe 10, right? Maybe much less. Doesn't that mean it would cost
no more to include that in TVs?
(Strangely, it does refer to needing matching regions, but gives no
indication on the box, on the player, or in the manual of what region it
is. My friend said it played the US and Europe and Japan, regions 1
and 2.)
>In article <prestwhich-D2346...@mx02.eternal-september.org>,
> Smitty Two <prest...@earthlink.net> wrote:
>> The North and South American standard is NTSC, which transmits 30 frames
>> per second, while PAL, used in Europe, is 25 frames per second. The
>> switch to digital didn't affect that.
>
>Digital is neither NTSC or PAL. Those are exclusively analogue. It rather
>annoys that DVDs are labelled as NTSC and PAL when what they're referring
>to is a region.
If that is the case, how is it possible I have a DVD that is PAL, but
all regions?
(I bought it by mistake, didn't notice the PAL, can't play it on my
DVD player**, but can on the computer. **The DVD player in the other
thread is broken.)
There are three possibilities for the video encoded on DVDs: NTSC film
(24000/1001 frames per second), PAL (film and video, 25 frames per second),
and NTSC video (30000/1001 frames per second).
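Those exact rates are awkward fractions; a quick sketch (my own, assuming the standard 1001-denominator NTSC rates) makes the usual rounded figures explicit:

```python
from fractions import Fraction

# The three DVD video rates mentioned above, as exact fractions.
rates = {
    "NTSC film":  Fraction(24000, 1001),  # ~23.976 fps
    "PAL":        Fraction(25, 1),        # 25 fps
    "NTSC video": Fraction(30000, 1001),  # ~29.97 fps
}
for name, r in rates.items():
    print(f"{name}: {float(r):.3f} fps")
```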
The video encoding is based upon the source material.
Region encoding has absolutely nothing to do with the source material; it has
to do with an encryption method used to limit sales in various countries.
DVD players sold in the US will automatically play the video on the disk as
NTSC, unless they use an HDMI output, in which case they may just pass it
through and let the TV set figure out how to play it, or convert it anyway.
The same goes for DVD players sold in PAL countries, although most of them
have a setup option to either convert everything to PAL, convert it to NTSC,
or leave it the way it is for a multisystem TV.
But those were the analog signals. When they went to digital, why
didn't they stop using PAL or stop using NTSC? That is my point.
What tied them to both PAL and NTSC at the same time?
Regional pride?
Or was it because they wanted current analog TVs to be able to receive
digital signals that went through a set-top digital-to-analog
converter, and some TVs wanted 50 cycles and others 60, so if the
air-borne signal were the same, it couldn't be converted to one of 50
or 60?
> To fix a problem noticed in NTSC signals
>the BBC adopted the practice (which was in the proposed NTSC spec but
>dropped to save money) of alternating the phase every other line, hence
>the name PAL (Phase Alternating Line).
>
>TV sets which would lock on 50Hz or 60Hz signals as appropriate were not
>a technical issue and by 1980 almost all made would anyway.
>
>SECAM used a different decoding method, but those chips were easily found,
>and it was common to see TV sets and VCRS that would play/record SECAM signals
>broadcast using PAL over the air standards. Eastern Europe (Warsaw Pact
>countries), most Arab countries, China, and the USSR used some form of SECAM
>encoded signals with PAL frequencies.
>
>The French used a different channel spacing, and AM sound, which made
>their SECAM signals impossible to tune with the correct tuner. It also made
>Eastern European TVs worthless in France and vice versa.
>
>> I don't think I've read anything about this.
>
>You must either have had your head under a rock, or live in the US and never
>have traveled out of there.
Please see my question higher up.
What exactly do you want? Here in Israel you can buy a DVB-T set top
box for 300 NIS ($75 US) with one having been on sale a few weeks ago for
99 NIS.
It has a USB port, but no storage, which allows you to plug in a disk drive
or USB memory stick and record off the air. If your program provider supplies
EPG (electronic programming guide) data, you can use it to set the
device to record programs.
The recordings are raw MPEG TS (transport stream) files, which you can
use a PC to convert into something useful. You have to unplug the drive from
the unit and plug it into your PC; there is no PC-to-device connection.
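One property of those raw TS files worth knowing (a sketch under my own assumptions, not something described in the post): MPEG transport streams are fixed 188-byte packets, each starting with the sync byte 0x47, which makes a quick sanity check on a recording easy.

```python
SYNC_BYTE, PACKET_SIZE = 0x47, 188

def looks_like_ts(data: bytes, npackets: int = 5) -> bool:
    """Heuristic: do the first few 188-byte packets start with 0x47?"""
    if len(data) < PACKET_SIZE * npackets:
        return False
    return all(data[i * PACKET_SIZE] == SYNC_BYTE for i in range(npackets))

# Illustration with fabricated data (not a real recording):
fake_ts = (bytes([SYNC_BYTE]) + bytes(PACKET_SIZE - 1)) * 5
print(looks_like_ts(fake_ts))     # True
print(looks_like_ts(bytes(940)))  # False (all zeros, no sync bytes)
```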
I think the devices are single-threaded: you can only watch one program,
or record one program, or play one recording at a time.
You can also buy a USB tuner stick for 99 NIS that plugs into a PC and
lets you use the PC as a TV set or PVR. They all come with software included,
which IMHO sucks; you can buy a program off the internet called DVBViewer,
which is pretty good, for 15 euros. Note that you will need around a 2.6 GHz
(or the equivalent multi-core) PC to properly decode and record 720p or better
video.
You just need to be careful that the device you are buying supports the
compression standard used for the video, besides the transmission standard.
Israel's service is very new, so they decided to use H.264 video encoding
and AAC (aka MP4A) audio encoding. Not all of the boxes on the market could
decode them, nor could some of the programs for the USB sticks.
> You sort of danced around it. In the very early days is when the
> frequency was set. Once set it stayed.
Sadly not when locked to mains as that frequency drifts. By rather a lot
in electronic terms.
> That really was the point of my very long explanation. A long time ago
> someone decided to fix the scan rate and representation of data. The
> actual information used in all the TV systems was the same, it was just
> used with incompatable frame rates, encoding systems and transmission
> systems mostly for politcal reasons. TV sets that could receive, decode
> and play any and all signals existed.
I think you're reading in the political bit. Different countries had
settled on different mains frequencies rather before such things mattered
much.
> The reason that everyone did not have a universal TV set was because the
> price was kept lower with single system sets and countries like the UK,
> which made a substansial income from the TV license did not want you
> watching tv from France or the Republic of Ireland for free.
That is total nonsense. The TV licence is needed in the UK just to
operate a TV receiver - regardless of where the progs are transmitted
from. And they were single channel sets originally, because only the BBC
transmitted TV and only the one channel. Not many in the UK would have
been interested in French language broadcasts. ;-)
> From a technology point of view, it was obvious that the digitial TV
> standards MPEG and so on were designed with existing TV sets in mind. If
> not they would not have been a continuation of the old limited national
> standards with their horrible color encoding choice (1/4 of the
> resolution that the monochrome signal had) and instead gone with the
> more extensible, accurate and easily compressable RGB system used in
> computer data.
Think you're well into hindsight. When the UK PAL system was finalised
(1960?), computers were some esoteric device in a lab. But in any case, a
major priority of any colour TV system then was that it be easily
receivable on a monochrome-only set - and not make that set more expensive
to produce.
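The "1/4 of the resolution" remark quoted above lives on in the digital standards as 4:2:0 chroma subsampling. A toy sketch of the idea (illustrative only; the function name is made up, not any broadcaster's actual code):

```python
import numpy as np

# 4:2:0 subsampling: full-resolution luma, but only one chroma sample
# per 2x2 pixel block -- i.e. 1/4 as many color samples as brightness samples.
def subsample_420(chroma_plane: np.ndarray) -> np.ndarray:
    """Average a full-resolution chroma plane over 2x2 blocks."""
    h, w = chroma_plane.shape
    return chroma_plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

plane = np.arange(16.0).reshape(4, 4)
small = subsample_420(plane)
print(plane.size, small.size)  # 16 4 -> a quarter of the samples survive
```

The same trade was baked into analogue NTSC and PAL: the eye resolves brightness detail far better than color detail, so the color channel gets a fraction of the bandwidth.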
--
*There are two sides to every divorce: Yours and shit head's*
Why would the rest of the world want multi-standard TVs? You might if you
lived within reception distance of another country with a language you
understood well and it used a different system - but how often does this
happen?
In the UK, PAL VHS would playback NTSC tapes on a PAL TV for many a year.
Bit of a kludge, but it worked well enough for the poor quality of VHS.
--
*Why are they called apartments, when they're all stuck together? *
** Go fuck yourself - asshole.
NTSC inherently suffers from sensitivity to phase shift in the sub carrier
during transmission and reception that cause colour changes on the screen -
particularly so when changing channel.
PAL does not.
Hence the famous acronym as quoted by me.
Go fuck yourself.
.... Phil
** The fact that folk ALL have TV sets and VCRs that work with those
standards ???
You trolling bloody IDIOT !!
..... Phil
Because they could. :-)
Seriously, the digital standards were developed to keep the old systems
in play, even if they were no longer needed.
> What tied them to both PAL and ntsc at the same time?
>
> Regional pride?
>
> Or was it because they wanted current analog tvs to be able to receive
> digital signals that went through a set-top digital to analog
> converter, and some tvs wanted 50 cycle and others 60 cycle, so if the
> air-borne signal was the same, it couldnt' be converted to one of 50
> or 60?
>
It really did not matter. Maybe in 1983, when digitally encrypted HBO satellite
receivers were designed, but in 2005, when the US conversion started, it was
simple enough to use anything they wanted and produce NTSC or PAL or computer
RGB output, or all three, on a set top box.
The actual encoding is not PAL or NTSC anyway. H.264, which is the current
standard for high-end compression, does not have a fixed frame rate. I mentioned
that in a previous posting.
With a fast enough decoder chip you can take any resolution and frame rate
and put out anything else. My Western Digital TV Live unit will take
almost any compressed video file up to 1080p60 (1920x1080, 60 frames a second)
and put it out on the fly, with audio in sync, as 480i60 (standard NTSC)
or 576i50 (standard PAL) in composite, 480p60 or 576p50 in component,
or digital over HDMI, with several choices in between.
Why you could not slap an ATSC or DVB-T or the Japanese standard tuner
chip (or all three) on it instead of a USB port or ethernet is more of a
matter of product placement than anything else.
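The zoo of output modes listed above is easier to compare as raw pixel throughput. A rough sketch (the timing numbers are standard video formats; the table and function are made up for illustration):

```python
# name -> (width, height, field/frame rate in Hz, interlaced?)
FORMATS = {
    "480i60":  (720, 480, 60, True),    # standard-definition NTSC
    "576i50":  (720, 576, 50, True),    # standard-definition PAL
    "720p60":  (1280, 720, 60, False),
    "1080p60": (1920, 1080, 60, False),
}

def pixel_rate(name: str) -> int:
    """Pixels per second a decoder must produce for the given mode.
    Interlaced modes deliver only half a frame (one field) per tick."""
    w, h, rate, interlaced = FORMATS[name]
    frames_per_second = rate // 2 if interlaced else rate
    return w * h * frames_per_second

print(pixel_rate("1080p60") // pixel_rate("480i60"))  # 12 -> HD is ~12x the work
```

That factor of twelve is why the posting earlier in the thread warns that a circa-2.6 GHz PC is needed to decode 720p or better.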
That's almost irrelevant. When the UK went to digital TV broadcasts (was
that around 2000 with Sky's digital terrestrial service?) there was no
need to continue to support PAL. After all, much of their material
was NTSC anyway. They were encoding the signals in one place, so there was
no restriction on what equipment was used except cost, and on the set end
they could have used anything they wanted.
I expect they chose PAL because it was the existing standard, and they could
buy subassemblies cheaply.
However, ATSC was completely different. It was supposed to be a new standard,
not a re-hashing of an old one. There was no need to keep NTSC compatibility
as long as it could be created in set top boxes.
Note that there were and still are two other incompatible digital TV standards
in use in the US. The cable companies use one of their own, and the DBS
companies use a different one. Since there are two competing DBS companies,
each using their own incompatible encryption, you could say there are four
incompatible ones.
They all use some sort of MPEG TS transmission, but the streams can not
be read with the other company's devices.
Actually there are more differences between PAL and NTSC color
encoding than the alternation of the phase:
1) NTSC I and Q color difference, PAL R-Y, B-Y
2) Different primaries, especially green. PAL had a smaller color
gamut.
3) Different color bandwidth for different colors. NTSC had 1.3
MHz for I and 0.5 MHz for Q. PAL was equal for R-Y and B-Y.
4) Excellent interleaving of chroma-luminance frequency
components, which was largely destroyed by the phase alternation.
As a note, much of the advantage of points 2), 3) and 4) was lost
on early sets which just used 0.5 MHz bandwidth for decoding both
chroma components and bandwidth limiting the luminance signal to
minimize chroma-luma crosstalk. Also most sets did not use the
NTSC primary phosphors so a lot of the advantages of NTSC were
lost for a few decades. When integrated circuits became
available, dual bandwidth chroma decoders started appearing as
well as comb filters to separate the luminance and chroma
signals. More accurate phosphors were also gradually used in
sets. The result was a major improvement in picture quality with
the original 1953 broadcast standards. No such receiver
improvement was possible with the PAL system. Regarding VITS,
that was introduced, but very few sets used it.
David
>>>> Though that might be the common opinion, it is, of course,
>>>> untrue. There is nothing inherently unstable or inaccurate
>>>> about NTSC.
>>> ** You have got to be the most ignorant wanker on the planet.
>> When was the last time you adjusted the Hue control on an NTSC receiver?
> ** Go fuck yourself - asshole.
> NTSC inherently suffers from sensitivity to phase shift in the sub-
> carrier during transmission and reception that cause color changes
> on the screen -- particularly so when changing channels.
> PAL does not.
> Hence the famous acronym as quoted by me.
> Go fuck yourself.
Wouldn't it be nice if you actually knew what you were talking about?
Both NTSC and PAL use subcarrier phase to convey hue. (The amplitude is
roughly the saturation.) Both systems are sensitive to non-linear phase
errors.
Because PAL alternates phase between lines, the non-linear color errors are
in opposite directions, and the eye tends to average them out -- at the
expense of saturation. (Complementary colors sum to white.) High levels of
non-linear phase can produce visible "saturation banding" on a PAL set, just
as they can cause "color banding" on an NTSC set.
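The averaging argument above can be checked with toy numbers, modelling chroma as a phasor (angle ~ hue, magnitude ~ saturation); the 20-degree error is an arbitrary assumption, not a measured figure:

```python
import cmath, math

error = math.radians(20)           # assumed non-linear phase distortion
line_a = cmath.exp(1j * error)     # hue pushed one way on this line...
line_b = cmath.exp(-1j * error)    # ...and the other way on the next (PAL)
avg = (line_a + line_b) / 2        # what the eye (or delay line) averages

print(cmath.phase(avg))            # 0.0 -> the hue error cancels
print(abs(avg))                    # cos(20 deg) ~ 0.94 -> saturation drops
```

The reduced magnitude is exactly the "at the expense of saturation" point: the two tilted phasors sum toward the correct hue but with a shorter vector.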
PAL was adopted in Europe because European distribution systems suffered
from relatively high levels of non-linear phase. The American distribution
system did not, so abandoning phase alternation was not a major loss.
The wildly inaccurate reverse acronym was based on sloppy engineering in the
studio -- nothing inherent in NTSC.
I doubt that any American member of this group has adjusted the Hue control
on their NTSC set for at least 30 years.
That isn't immediately clear to me. How badly would phase alternation affect
the frequency components of the subcarrier?
You left out 3.5. The I and Q primaries' color and bandwidth are based on
how the eye actually perceives color. NTSC not only transmits more color
information, but uses the available bandwidth more effectively.
> As a note, much of the advantage of points 2), 3) and 4) was lost
> on early sets which just used 0.5 MHz bandwidth for decoding both
> chroma components and bandwidth limiting the luminance signal to
> minimize chroma-luma crosstalk.
Actually, most early sets (at least RCA) had full-bandwidth color. RCA
continued to offer such sets for two or three years. I suspect many current
sets using digital processing are full-bandwidth, but there's no easy way to
know which is which.
> When integrated circuits became available, dual bandwidth chroma
> decoders started appearing...
Not that I'm aware of. Such sets require a second delay line, which runs up
the cost.
> as well as comb filters to separate the luminance and chroma
> signals.
Correct.
> More accurate phosphors were also gradually used in
> sets. The result was a major improvement in picture quality with
> the original 1953 broadcast standards. No such receiver
> improvement was possible with the PAL system.
Oh? Why?
It happened all over Europe, all of the time. But the big draw was
"foreign films".
> In the UK, PAL VHS would playback NTSC tapes on a PAL TV for many a year.
> Bit of a cludge, but it worked well enough for the poor quality of VHS.
No, they would not. They had to be kludged to do it in the first place, and
often were. The TV sets had to be capable of syncing at 60 fields per second
instead of 50, the video speeds of the recorders had to be modified, and
the NTSC color signals inverted every other line.
Those VCRs were actually multisystem VCRs with EXTRA circuitry to convert
NTSC to PAL (by the line inversion). What they lacked was the 3.58 MHz color
subcarrier circuitry, and they may have had NTSC 4.43.
How odd to come across you in a group other than SCJM!
Then the OP should've been to alt.corp.akai, alt.corp.sony,
alt.corp.toshiba, alt.corp.jvc, alt.corp.nec, alt.corp.hitachi,
alt.corp.sharp, alt.corp.panasonic , etc.
Not to sci.electronics. *repair*
> "Geoffrey S. Mendelson"
>
>
>> NTSC stands for National Television Standards Committee, PAL for Phase
>> Alternating
>> Line, and SECAM is a French acronym for what could be loosely
>> translated as
>> system of transmitting color TV.
>
>
> ** Everyone knows that NTSC stands for:
>
> " Never Twice the Same Color"
>
> and SECAM =
>
> " Something Essentially Contrary to the American Method "
>
>
>
> .... Phil
And PHIL = PLEASE HELP I'M LOST!!
...heh
--
Live Fast, Die Young and Leave a Pretty Corpse
Correct.
Oh? Why?
1) They were stuck with the smaller color gamut because of the
color primary choices used in the encoding.
2) They could not use wide bandwidth decoders because the chroma
encoding was equal bandwidth.
3) Comb filtering in PAL is not nearly as effective, since the
chroma components are 'smeared' out rather than tightly
interleaved between the main luminance components. The phase
alternation and the 25 Hz offset of the chroma carrier in PAL
(look up Hanover bars) kill the effective use of comb filters.
Your point 3.5 is well taken. Regarding the second delay line,
the extra delay needed in the I channel was just a simple lumped
component all pass filter that could be fabricated at very low
cost. I also remember the time when early VCRs actually included
the NTSC pre-distortion phase compensator that was part of the
broadcast standard to compensate for the nonlinear delay of the
IF stages in the receivers. The theory was that you pay only once
in the broadcast encoder rather than in every TV set. I actually
bought a few of these on the replacement part market to use in
other video projects for about $1.00 each. It was a passive
module with three leads containing a few inductors and
capacitors. I installed one in an RF modulator I had and they sure
eliminated the chroma smear and sharpened up the luminance. It is
interesting that even with SAW IF filters which could have been
made with uniform group delay, they are fabricated to reproduce
the delay characteristics of the older tuned inductor-transformer
IF amplifiers.
David
G'day mate,
Take it somewhere else, eh?
Thanks, cocksucker.
Never heard of Hanover bars. (Though I've lived in PA, I've never been in
any, either.) I didn't realize PAL had this basic problem.
> Your point 3.5 is well taken. Regarding the second delay line,
> the extra delay needed in the I channel was just a simple lumped
> component all pass filter that could be fabricated at very low
> cost. I also remember the time when early VCRs actually included
> the NTSC pre-distortion phase compensator that was part of the
> broadcast standard to compensate for the nonlinear delay of the
> IF stages in the receivers. The theory was that you pay only once
> in the broadcast encoder rather than in every TV set.
Which is one of the problems with SECAM. Transmitting only one color signal
per line simplifies encoding and recording (at the studio) at the expense of
a more-expensive receiver.
> I actually
> bought a few of these on the replacement part market to use in
> other video projects for about $1.00 each. It was a passive
> module with three leads containing a few inductors and
> capacitors. I installed one in an RF modulator I had and they sure
> eliminated the chroma smear and sharpened up the luminance. It is
> interesting that even with SAW IF filters which could have been
> made with uniform group delay, they are fabricated to reproduce
> the delay characteristics of the older tuned inductor-transformer
> IF amplifiers.
This, also, is new to me. I'd always assumed there was no correction in one
part of the system for errors in another.
> That's almost irrelevant. When the UK went to digital TV broadcasts (was
> that around 2000 with Sky's digital terrestrial service?)
No. Sky doesn't broadcast terrestrial signals in the UK. Satellite and
cable only.
Terrestrial digital started in '98 with a consortium including the BBC and
ITV.
> there was no no need to continue to support PAL. After all much of their
> material was NTSC anyway.
So you think they should have gone to NTSC? Why would the UK replace a
better, newer system with an older, inferior one?
Digital was in addition to the UHF PAL service - with it carrying all the
same channels and more.
> They were encoding the signals in one place,
> so there was no restriction on what equipment was used except cost, and
> on the set end they could of used anything they wanted.
> I expect they chose PAL because it was the existing standard, and they
> could buy subassemblies cheaply.
PAL has nothing to do with any digital transmission. Some of the
originating sources may still have been PAL at some point though.
STBs had a PAL output for use with sets with no line input.
> However ATSC was compeltely different. It was supposed to be a new
> standard, not a re-hashing of an old one. There was no need to keep NTSC
> compability as long as it could be created in set top boxes.
That applies to any STB. What goes in is irrelevant provided it will
interface with the domestic TV.
> Note that there were and still are two other incompatible digital TV
> standards in use in the US. The cable companies use one of their own,
> and the DBS companies use a different one. Since there are two
> competing DBS companies, each using their own incompatible encryption,
> you could say there are four incompatible ones.
So the US is in a bit of a mess? ;-)
> They all use some sort of MPEG TS transmission, but the streams can not
> be read with the other company's devices.
That's business politics for you.
--
*The longest recorded flight of a chicken is thirteen seconds *
>>>>>> ** Everyone knows that NTSC stands for:
>>>>>> "Never Twice the Same Color"
>
>>>>> Though that might be the common opinion, it is, of course,
>>>>> untrue. There is nothing inherently unstable or inaccurate
>>>>> about NTSC.
>
>>>> ** You have got to be the most ignorant wanker on the planet.
>
>>> When was the last time you adjusted the Hue control on an NTSC receiver?
>
>> ** Go fuck yourself - asshole.
>
>
>> NTSC inherently suffers from sensitivity to phase shift in the sub-
>> carrier during transmission and reception that cause color changes
>> on the screen -- particularly so when changing channels.
>> PAL does not.
>> Hence the famous acronym as quoted by me.
>> Go fuck yourself.
>
>
> Wouldn't it be nice if you actually knew what you were talking about?
** Go fuck yourself - you stinking, autistic asshole.
> PAL was adopted in Europe because European distribution systems suffered
> from relatively high levels of non-linear phase. The American distribution
> system did not, so abandoning phase alternation was not a major loss.
** Absolute pack of lies.
NTSC inherently suffers from sensitivity to phase shift in the sub-
carrier during transmission and reception that cause color changes
on the screen - particularly so when changing channels.
> The wildly inaccurate reverse acronym was based on sloppy engineering in
> the
> studio -- nothing inherent in NTSC.
** Significant phase shifts occur during propagation and in domestic antenna
systems.
Go fuck yourself - you stinking, autistic asshole.
..... Phil
** In case you are still unaware - the DTV coding system used in the USA is
quite different from that used in Europe and most places including
Australia.
Look it up on Wiki - you trolling, fucking PITA idiot.
.... Phil
When I was at TEK, I used to have a chart with all the world's TV systems and
their differences.
I tossed all that stuff when I was laid off; didn't have room for all the
stuff I'd have kept if I could. I repaired and calibrated TEK NTSC and PAL
video test equipment. I did a little bit of digital video, and -one- SECAM
unit, so I won't claim any expertise with SECAM.
--
Jim Yanik
jyanik
at
localnet
dot com
> Michael A. Terrell wrote:
>> As stupid as always. VITS took care of that over 30 years ago.
>
> The real problem was not that the NTSC system did not have the
> autocorrection that was in the original design and used in the PAL
> system. The real problem was that there was a knob on the TV set that
> could make everything change color.
>
> Even with the early 1960's transmission errors, and differences
> between the actual colors of various sources, if the color control was
> set and left at 'about right", it always would have been a watchable
> picture.
>
> The problem was that almost no one had any clue of how to adjust it
> properly, and most were set and left in a very wrong position, while
> others were being constantly misadjusted.
>
> All of the TV magazines, science mags, etc had articles on how to
> properly adjust your TV set, and I'm sure that for everyone who read
> and followed them, there were 10 times the people who didn't.
Which really didn't matter, as the program sources varied widely in color
accuracy.
>
> It was really bad in areas where there were many TVs, such as a
> department store. For some strange reason, the cheap TV's were never
> adjusted properly and the expensive ones always were. :-)
>
> Geoff.
>
*VIRS* was the VITS signal meant for autocorrection,but it wasn't used much
IIRC.
VIRS = vertical interval reference signal
VITS = vertical interval test signals.
>> The real problem was not that the NTSC system did not have
>> the autocorrection that was in the original design and used in
>> the PAL system. The real problem was that there was a knob
>> on the TV set that could make everything change color.
>
> Actually, the real problem was that the networks didn't give a damn
> about getting the color right.
They got their video from a number of different sources, who also didn't put
much effort into correct color.
>
> This changed (I think) sometime in the late 70s. I've owned a number
> of color TVs since then (want me to list them?), and don't remember
> even once having touched the Hue control (incorrectly called the Tint
> control on most sets).
Likely the addition of VIRS circuitry.
>
> It's significant, though, that if the average [censored] is given free
> hand to adjust the Hue control, flesh tones almost always wind up on
> the green side.
>I doubt that any American member of this group has adjusted the Hue control
>on their NTSC set for at least 30 years.
True. US receivers use the VIR (Vertical Interval Reference) on line
20 for chroma phase correction to automagically correct both static
and differential phase errors. I think this started in about 1980.
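The idea behind that VIR correction can be sketched as follows (a toy model, not receiver firmware: the set measures how far the known reference phasor has rotated and counter-rotates the picture chroma by the same amount; the function and variable names are invented for illustration):

```python
import cmath

def correct_chroma(chroma, ref_received, ref_nominal=1 + 0j):
    """Undo a static phase error using a known reference phasor."""
    error = cmath.phase(ref_received) - cmath.phase(ref_nominal)
    return [c * cmath.exp(-1j * error) for c in chroma]

# Simulate a 0.3 rad static phase shift hitting both the picture chroma
# and the transmitted reference equally, then correct it back out.
shift = cmath.exp(1j * 0.3)
picture = [1 + 1j, 0.5 - 0.2j]
fixed = correct_chroma([c * shift for c in picture], ref_received=shift)
print(all(abs(a - b) < 1e-12 for a, b in zip(fixed, picture)))  # True
```

A real receiver does this per field with the line-20 reference burst; differential (level-dependent) phase needs the reference measured at more than one amplitude, which is why VIR carries chroma at a defined luminance level.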
In a past life, when I was doing video, it meant "Now That Seems
Crazy", "Nobody Thinks Such Crap", or "Nail Through Some Coax".
--
Jeff Liebermann je...@cruzio.com
150 Felker St #D http://www.LearnByDestroying.com
Santa Cruz CA 95060 http://802.11junk.com
Skype: JeffLiebermann AE6KS 831-336-2558
Yes, it does, which is why VITS was developed in the '70s, like Terrell
pointed out. Hue issues in the US were non-existent for the last 30+
years. Then we turned the whole analog mess off after running digital
for 10+ years.
G²
WHO CARES? Analog is thankfully gone.
G²
Thanks for that. I knew I was using the wrong term, but I haven't
worked in a broadcast station since '85. The CBS affiliate in Madison
WI had the Tektronix VIRS corrector for the incoming network feed. One
of the engineers modified it to compensate for blacks below setup. The
only FCC citation the station got in 30 years was from the black level
on the CBS show 'The Price Is Right' when they spin the wheel.
G²
>
>"mm"
>>
>> The reason I care is the opposite of that. There are only two
>> DVDR-with-harddrives for sale in the US, and one is cheaper than the
>> one I have, which itself is inferior in design. The other may be
>> better or not. However there are other models for sale in Australia,
>> and probably other parts of the world. I want to buy one from
>> Australia and use it here.
>
>
>** In case you are still unaware - the DTV coding system used in the USA is
>quite different from that used in Europe and most places including
>Australia.
Do I trust the word of a jackass?
>Look it up on Wiki - you trolling, fucking PITA idiot.
Do I take advice from jackasses?
Kerplunk.
>
>.... Phil
>
>
What error message. I'll bet it was "incorrect disk", or something related
to region code, not because it was PAL.
The problem was that there were too many places in the system to
adjust the phase, and no way to match the phase of multiple sources
outside a single studio. The coaxial & microwave relays used by TV
networks needed amplifiers and correction circuits at regular
intervals. Every location required the careful adjustment of all
parameters so a usable signal was available at the other end. A friend
of mine worked AT&T Long Lines back then and told me what a PITA it was to
keep the system working properly. Not only was there a master E-W feed,
but most of it could be rerouted around an outage, even if the phasing
didn't match. That was the reason AT&T was able to quickly piece
together a nationwide feed to all network TV stations, no matter which
network, on the day JFK was killed.
> Even with the early 1960's transmission errors, and differences between
> the actual colors of various sources, if the color control was set and
> left at 'about right", it always would have been a watchable picture.
>
> The problem was that almost no one had any clue of how to adjust it properly,
> and most were set and left in a very wrong position, while others were
> being constantly misadjusted.
>
> All of the TV magazines, science mags, etc had articles on how to properly
> adjust your TV set, and I'm sure that for everyone who read and followed
> them, there were 10 times the people who didn't.
>
> It was really bad in areas where there were many TVs, such as a department store.
> For some strange reason, the cheap TV's were never adjusted properly and the
> expensive ones always were. :-)
One of the problems with the cheap TVs was that people would play
with the settings. Some people liked everyone to look like they were
wearing clown makeup. Or, as one idiot put it when I told him not to
touch one of our TVs, "If I'm buying a color TV, I want all the color I
can get!" :(
--
You can't fix stupid. You can't even put a band-aid on it, because it's
Teflon coated.
William, Phil is a mentally ill Aussie who rarely takes his
medicine. Just ignore him.
Pathetic Halfwit Infecting Lambs.
Early TVs often had a faint hum bar in the vertical. By being locked
to the line frequency, it was fixed to one location, and most people
never saw it.
Why? Dish or Direct supply all the equipment and install it, just
like the various CATV companies.
> > They all use some sort of MPEG TS transmission, but the streams can not
> > be read with the other company's devices.
>
> That's business politics for you.
So, you think someone should be able to use one company's equipment to
steal service from another?
Really? Entire chipsets were made to use it, and they reduced the
cost to build new TVs. Just because it wasn't etched on the CRT's
face doesn't mean it wasn't used.
There were no 'digitally encrypted HBO satellite receivers' in 1983.
An external 'VideoCipher II' was used with receivers on a small list
that were tested to work with the 'VideoCipher II'. Most commercial
grade C-band receivers had a low pass filter in the video amplifier that
prevented them from working. The interesting thing was that the cheaper
equipment that was barely better than consumer grade made up most of that
list. United Video Cablevision in Cincinnati, Ohio was one of the
systems picked to do field testing before the system went live. I
modified all our Collins-Rockwell receivers to work with the
'VideoCipher II' test units. They freaked out when I sent them the test
data and told them what hardware I was using. BTW, the test unit serial
number was 16.
It wasn't until combo consumer grade receivers were built that the
'VideoCipher II' was changed into a plug-in module so it could be
replaced or upgraded as the security software changed.
Also, note that the original 'VideoCipher' was full digital
scrambling built for the military, while the 'VideoCipher II' digitized
the audio and inverted the sync on the video. VC units cost over a
million dollars each. HBO wanted a way to turn off the feed to CATV
systems who were late, or didn't even try to pay their bills. A well
known MSO in the early '80s was over six months behind on everything
except their payroll and utility bills. HBO wanted to make them catch
up, and stay that way.
> The actual encoding is not PAL or NTSC anyway. H.264, which is the current
> standard for high-end compression, does not have a fixed frame rate. I mentioned
> that in a previous posting.
>
> With a fast enough decoder chip you can take any resolution and frame rate
> and put out anything else. My Western Digital TV Live unit will take
> almost any compressed video file up to 1080p60 (1920x1080, 60 frames a second)
> and put it out on the fly, with audio in sync, as 480i60 (standard NTSC)
> or 576i50 (standard PAL) in composite, 480p60 or 576p50 in component,
> or digital over HDMI, with several choices in between.
>
> Why you could not slap an ATSC or DVB-T or the Japanese standard tuner
> chip (or all three) on it instead of a USB port or ethernet is more of a
> matter of product placement than anything else.
>
> Geoff.
> --
> Geoffrey S. Mendelson N3OWJ/4X1GM
> Those who cannot remember the past are condemned to misquote it.
And digital TV is a waste of time.
>>I doubt that any American member of this group has adjusted the Hue
>>control
>>on their NTSC set for at least 30 years.
>
> True. US receivers use the VIR (Vertical Interval Reference) on line
> 20 for chroma phase correction to automagically correct both static
> and differential phase errors. I think this started in about 1980.
** So fucking what ??????????????????
NTSC color started in the USA in the early 1950s.
The famous irreverent NTSC acronym way predates 1980.
You stupid, fucking cunthead.
.... Phil
> Why? Dish or Direct supply all the equipment and install it, just
> like the various CATV companies.
> > > They all use some sort of MPEG TS transmission, but the streams can
> > > not be read with the other company's devices.
> >
> > That's business politics for you.
> So, you think someone should be able to us one company's equipment to
> steal service from another?
It's what the OP apparently wants. A universal TV.
--
*Why do overlook and oversee mean opposite things? *
I'm not sure when the UK came off mains lock - somewhere like the late
'50s. And there must have been older TVs still in use when this happened,
as even some very early single channel ones were converted when ITV
started in the mid '50s. And I can't remember rolling hum bars being
common.
--
*It ain't the size, it's... er... no, it IS ..the size.
Is that a serious question or just a troll?
I never said anything about stealing. There are economies of scale in
service, support and repair realized by using the same equipment with the
same standards.
As for preventing theft or signal piracy, there are standards in place for
external decryption add ons for satellite and cable TV receivers. They
range from a simple memory chip with encryption keys on it, to custom
decryption hardware.
The form factor is a credit card sized smart card, like the one used for
GSM SIMs (subscriber ID modules) in the early phones.
Using standard hardware and transmission methods allows a customer to buy
the exact receiver they want, have it installed in the location and setup
they want and get the support options they want.
The program provider sends them a decryption card which they insert in the
receiver and then watch the programs they pay for. Since the interface
standard is an open one, anyone can build receivers and program providers
are free to choose the encryption method they want without being wedded to
a particular receiver.
These devices exist not only as parts of a receiver, but as an add on for
home theater PCs. I have seen them sold for Windows and Mac computers.
> And digital TV is a waste of time.
Certainly in the UK the ability to cram in more 'choice' at the expense of
technical quality is very noticeable.
So 'digital' gets the blame rather than those who control it.
--
*Some people are only alive because it is illegal to kill.
Why does a universal TV imply theft? He just wants the ability to buy
whatever program material HE wants, and not be subject to the whims of the local
licensee of a studio to determine if they want to sell it or not.
I'll give you an example. If you are a Star Trek fan, watch a movie
called "Galaxy Quest". Even if you hate Star Trek, you'll love it.
Imagine an episode of Star Trek with Tim Allen, Alan Rickman, Sigourney
Weaver, and equivalent quality writing.
Never shown in Israel in the theaters, never imported as a DVD, except in
the stores that imported zone 1 (US) DVDs. Around five years after release,
it made it to cable TV.
Is wanting to buy a DVD of it theft?
In the early 1980's, the woman I was dating loved a movie called "Children
of Paradise". Long considered THE BEST FRENCH FILM, it was ignored in the US
except in "art houses" and rare in them. In order to get her a copy I had to
buy one in PAL in the UK or SECAM in France (it's black and white, not much
difference between them). Then it had to be converted to NTSC. In those days,
it was done by aiming a camera at a TV screen. :-(
Don't worry, in case anyone cares a BD rip is now floating around the internet.
:-(
>> True. US receivers use the VIR (Vertical Interval Reference) on line
>> 20 for chroma phase correction to automagically correct both static
>> and differential phase errors. I think this started in about 1980.
??? How can the reference signal correct a differential phase error?
> ** So fucking what ??????????????????
> NTSC color started in the USA in the early 1950s.
> The famous irreverent NTSC acronym way predates 1980.
> You stupid, fucking cunthead.
The point being that the problems with NTSC had nothing to do with the
design of the system, but the failure of the networks to establish high
standards of image and signal quality. As these were gradually put into
place, the supposed "inherent problems" with NTSC gradually disappeared.
This WAS NOT due to the use of VIR on consumer receivers. VIR was primarily
to catch and correct problems along the signal chain.
The lie that PAL is somehow inherently superior to NTSC refuses to die. NTSC
is the "better" system. Period.
> Why does a universal TV imply theft? He just wants the ability to buy
> whatever program material HE wants, and not be subject to the whims of
> the local licensee of a studio to determine if they want to sell it or
> not.
So no different to cable companies or DVDs etc. They also want to control
who can watch their copyrighted material.
--
*Artificial Intelligence is no match for Natural Stupidity *
Yes and it became a non-issue over 30 years ago
>
> You stupid, fucking cunthead.
>
> .... Phil
For someone as brilliant as you in electronics and audio, when it
comes to American TV, you're one of the most ignorant blowhard
buffoons I've run across.
Happy New Year to you.
G²
I only have one DVD player, a Philips, and it wouldn't play it. But
I've seen it once, via the computer, and once is enough.
I only brought it up because if the PAL DVD is "all regions", then it
seems to me when a DVD is labelled PAL, they aren't (necessarily?)
referring to a region. Unless *ALL* PAL DVDs are region-free?
Someone gave me the other DVD player because it was broken. The one
that does both PAL and NTSC is a Coby DVD-224, and looking for
info, I see that it's only 40 dollars, sometimes given as a premium,
and often breaks early. I doubt I can fix it, but I'll look inside.
> Meat Plow wrote:
>>
>> On Sun, 09 Jan 2011 14:21:01 +1100, Phil Allison wrote:
>>
>> > "Geoffrey S. Mendelson"
>> >
>> >
>> >> NTSC stands for National Television Standards Committee, PAL for
>> >> Phase Alternating Line, and SECAM is a French acronym for what
>> >> could be loosely translated as "system of transmitting color TV".
>> >
>> >
>> > ** Everyone knows that NTSC stands for:
>> >
>> > " Never Twice the Same Color"
>> >
>> > and SECAM =
>> >
>> > " Something Essentially Contrary to the American Method "
>> >
>> >
>> >
>> > .... Phil
>>
>> And PHIL = PLEASE HELP I'M LOST!!
>
>
> Pathetic Halfwit Infecting Lambs.
LOL
--
Live Fast, Die Young and Leave a Pretty Corpse
No, jackass. The 'National Television Standards Committee' was
created for monochrome TV, long before any color television was
developed.
> The famous irreverent NTSC acronym way predates 1980.
As well as color TV, sheep shagger.
PLL circuits allowed the signals to remain in sync.
> > ** So fucking what ??????????????????
> > NTSC color started in the USA in the early 1950s.
> > The famous irreverent NTSC acronym way predates 1980.
> > You stupid, fucking cunthead.
>
> The point being that the problems with NTSC had nothing to do with the
> design of the system, but the failure of the networks to establish high
> standards of image and signal quality. As these were gradually put into
> place, the supposed "inherent problems" with NTSC gradually disappeared.
> This WAS NOT due to the use of VIR on consumer receivers. VIR was primarily
> to catch and correct problems along the signal chain.
>
> The lie that PAL is somehow inherently superior to NTSC refuses to die. NTSC
> is the "better" system. Period.
He'll have to look in another universe, then. :)
The US TVs had sync that matched the line frequency. Some early
sync generators simply multiplied the line frequency by 525, then divided
by 2 for the horizontal frequency.
Even when they built crystal controlled generators, the line
frequency remained close enough that it would take minutes or hours to
roll through a frame. Also, by the '50s the power supplies were better
filtered. The hum bars were faint but visible on older TVs, and one of
the first signs of trouble was when they became more pronounced. A lot of
US monochrome TVs were transformerless, and used a voltage doubler in
the power supply: pairs of 300 uF 160 volt electrolytics, where some
early TVs had only 8 or 16 uF of filtering.
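The frequency relationships in that post can be checked with a little
arithmetic. A minimal sketch (the 4.5 MHz / 286 derivation is the standard
NTSC color relationship, added here for illustration, not something stated
in the thread):

```python
# Illustrative arithmetic only: how a sync generator derives the
# horizontal rate from the field rate and line count, and how color
# NTSC shifted the field rate to ~59.94 Hz.

LINES_PER_FRAME = 525
FIELDS_PER_FRAME = 2  # interlaced

# Monochrome: horizontal rate locked to the 60 Hz power-line frequency.
mono_field_rate = 60.0
mono_h_rate = mono_field_rate * LINES_PER_FRAME / FIELDS_PER_FRAME
print(f"monochrome horizontal rate: {mono_h_rate:.0f} Hz")  # 15750 Hz

# Color NTSC: horizontal rate derived from the 4.5 MHz sound carrier
# divided by 286, which pulled the field rate down to ~59.94 Hz.
color_h_rate = 4.5e6 / 286
color_field_rate = color_h_rate * FIELDS_PER_FRAME / LINES_PER_FRAME
print(f"color horizontal rate: {color_h_rate:.2f} Hz")  # ~15734.27 Hz
print(f"color field rate: {color_field_rate:.4f} Hz")   # ~59.9401 Hz
```

The same "multiply the line frequency by 525, divide by 2" relationship the
poster describes falls straight out of the first calculation.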
Sigh. Economy of scale would make it insane to build the equipment to
process the competitor's signals.
> As for preventing theft or signal piracy, there are standards in place for
> external decryption add ons for satellite and cable TV receivers. They
> range from a simple memory chip with encryption keys on it, to custom
> decryption hardware.
>
> The form factor is a credit card sized smart card, like the one used for
> GSM SIMs (subscriber ID modules) in the early phones.
>
> Using standard hardware and transmission methods allows a customer to buy
> the exact receiver they want, have it installed in the location and setup
> they want and get the support options they want.
Buy what? The signal provider provides the hardware. The only
thing you buy is the service. Is that so hard to understand? I can
call Brighthouse, and they will turn on the existing drop and connect
their hardware. I can call Dish or DirecTV. They will install their
antenna and receiver.
> The program provider sends them a decryption card which they insert in the
> receiver and then watch the programs they pay for. Since the interface
> standard is an open one, anyone can build receivers and program providers
> are free to choose the encryption method they want without being wedded to
> a particular receiver.
>
> These devices exist not only as parts of a receiver, but as an add on for
> home theater PCs. I have seen them sold for Windows and Mac computers.
Maybe where you live. You can buy a TV tuner card here, and connect
it in place of the TV for any of the other services available in the
US. This isn't Israel.
It doesn't matter who controls it when there is no longer an OTA
signal available. I lost all OTA service after the change.
>> ??? How can the reference signal correct a differential phase
>> error?
> PLL circuits allowed the signals to remain in sync.
Things will remain in sync but that does NOT fix a differential
phase error in the transmission path. The VIR would adjust the
phase at a mid luminance point, but the differential error still
existed above and below that level.
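That distinction can be shown numerically. In this toy model (the linear
error-vs-luminance curve and all its constants are assumptions purely for
illustration), a single correction derived at mid luminance zeroes the
static error there but leaves a residual at other luminance levels, which
is exactly the differential phase component:

```python
# Toy model: differential phase means the chroma phase error varies
# with luminance. One static correction from a mid-luminance reference
# (as with VIR) cannot remove the level-dependent part.

def phase_error(luma, k=0.05):
    """Hypothetical transmission-path phase error in degrees,
    growing linearly with luminance (0..100 IRE)."""
    return k * luma

vir_luma = 50                       # reference measured near mid luminance
correction = phase_error(vir_luma)  # single static correction applied

for luma in (10, 50, 90):
    residual = phase_error(luma) - correction
    print(f"{luma:3d} IRE: residual phase error {residual:+.1f} deg")
# mid-luminance residual is zero; dark and bright areas still show error
```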
>Even when they built crystal controlled generators, the line
>frequency remained close enough that it would take minutes or hours to
>roll through a frame. Also, by the '50s the power supplies were better
>filtered. The hum bars were faint, but visible on older TVs, and one of
>the first signs of trouble when they became more pronounced. A lot of
>US monochrome TVs were transformerless, and used a voltage doubler in
>the power supply. Pairs of 300 uF 160 volt electrolytics, where some
>early TVs had 8 or 16 uF filtering.
Color TV changed the field rate to 59.94 Hz and a 120 Hz hum bar
would roll through the picture in under 15 seconds. Even
monochrome broadcasts adhered to the new frame rate.
David
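As a back-of-envelope check of that "under 15 seconds" figure: 120 Hz
full-wave-rectified ripple beats against twice the ~59.94 Hz field rate,
and the reciprocal of the beat frequency is the roll period. A minimal
sketch:

```python
# Check the hum-bar roll claim: at the NTSC color field rate, 120 Hz
# power-supply ripple beats slowly against twice the field rate, so a
# hum bar drifts through the picture over several seconds.

field_rate = 4.5e6 / 286 * 2 / 525   # ~59.9401 Hz (NTSC color)
ripple = 120.0                        # full-wave rectified 60 Hz mains

beat = abs(ripple - 2 * field_rate)   # ~0.12 Hz
roll_period = 1.0 / beat
print(f"hum bar roll period: {roll_period:.1f} s")  # ~8.3 s, under 15 s
```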
I started working in broadcast TV in '76 at the CBS affiliate in
Madison WI, WISC-TV. The incoming signal was a terrestrial microwave
link. There were no differential gain or phase errors on that signal.
When they put up test signals during non-network times, they looked
like the signals from the local test generators: S/N >60 dB and flat
within a fraction of a dB. At the time Madison was market 103 or so, so
it wasn't because we were the 'big boys'. The signals in the mid-west
were generally excellent - at least for CBS.
G²
So it's our fault that your network wasn't capable of doing proper
video conversion? Did you ever stop to think that they just didn't give
a damn, and making it look bad made their other crap look better?
Sigh. It wasn't a couple of components. A video proc amp that used VIR
weighed 30 pounds and filled at least seven inches of rack space. The
entire signal was analyzed and corrections were made. Try a cross fade
with signals from two cities with no visible artifacts. It took some
time to match a pair of framestores & proc amps, but done properly no one
noticed.
> >Even when they built crystal controlled generators, the line
> >frequency remained close enough that it would take minutes or hours to
> >roll through a frame. Also, by the '50s the power supplies were better
> >filtered. The hum bars were faint, but visible on older TVs, and one of
> >the first signs of trouble when they became more pronounced. A lot of
> >US monochrome TVs were transformerless, and used a voltage doubler in
> >the power supply. Pairs of 300 uF 160 volt electrolytics, where some
> >early TVs had 8 or 16 uF filtering.
>
> Color TV changed the field rate to 59.94 Hz and a 120 Hz hum bar
> would roll through the picture in under 15 seconds. Even
> monochrome broadcasts adhered to the new frame rate.
DUH! I was a TV broadcast engineer at three US TV stations. By the
time NTSC was modified for color compatibility the TVs were better
designed.
People who never worked in the industry have no clue. They generally
used the cheapest imported TV they could find, then bitched about it.
If they ever saw the video from a TK-46 with a set of new Plumbicons on
a $7,000 studio monitor, they would shoot their digital TVs.
> NTSC color started in the USA in the early 1950s.
Monochrome compatible NTSC color was proposed in 1950 and approved in
1953. That was the 2nd attempt, as the first NTSC committee conjured an
incompatible system in 1941 that was generally rejected, as most of the
manufacturers decided to delay introduction of consumer TV sets until
after the war was over. There was also an attempt at standardization
in the 1930's. Light reading:
<http://www.ntsc-tv.com>
> The famous irreverent NTSC acronym way predates 1980.
You might want to read what I scribbled. VIR hue correction started
in about 1980.
> You stupid, fucking cunthead.
"Obscenity is the currency of a bankrupt vocabulary"
--
Jeff Liebermann je...@cruzio.com
150 Felker St #D http://www.LearnByDestroying.com
Santa Cruz CA 95060 http://802.11junk.com
Skype: JeffLiebermann AE6KS 831-336-2558
"Jeff Liebermann = Radio Ham Jerk Off "
>>I doubt that any American member of this group has adjusted the Hue
>>control
>>on their NTSC set for at least 30 years.
>
> True. US receivers use the VIR (Vertical Interval Reference) on line
> 20 for chroma phase correction to automagically correct both static
> and differential phase errors. I think this started in about 1980.
** So fucking what ??????????????????
NTSC color started in the USA in the early 1950s.
The famous irreverent NTSC acronym way predates 1980.
You stupid, ILLITERATE, ASD fucked cunthead.
.... Phil
I have enjoyed the Sommerwerck, Liebermann, Mendelson, Terrell
discussions. I started working on color TVs in 1955 when I was in
college. Vacuum tubes galore, and two console cabinets just to put a
color picture on a 10" tube. But it was color, and the amount of power
was not a major consideration. I bought my first color TV in 1958, a
1957 set on sale with a 21" round picture tube that drew about 375
watts. The vertical scan was 59.9 Hz, as I recall, and when the power
supply caps started failing you got a slow-moving hum bar moving up
the screen. The color variations were due to various network sources
having slightly different phases for the color subcarrier, and sets
that had very pure theoretical demodulators. Then sometime in the
early 1960's someone had the idea of having any signals close in phase
to the phase of "ideal" skin tones of WASPs move closer to the "ideal"
skin tone, making it less necessary to constantly adjust the "hue"
control. It distorted colors, of course, but people looked more
"normal" and it was a good selling point for a number of years.
Want to bet? What I've seen of PAL on multi-standard TVs & VCRs was
a sick joke. A man who owned a bunch of Greek restaurants in Lake
County, FL imported the pair, and his relatives sent him a steady
stream of PAL tapes. They all looked like shit. They were commercial
tapes, not recorded OTA.
Also, they developed better chroma demodulator circuits in the '60s.
Each one could be identified by the overall image. Sylvania was the
easiest; it had a faint blue tinge.
>The reason that everyone did not have a universal TV set was because the
>price was kept lower with single-system sets, and countries like the UK,
>which made a substantial income from the TV license, did not want you
>watching TV from France or the Republic of Ireland for free.
I ran into something like that when I "visited" Israel in the early
1970's. There were literally dozens of TV antennas on every apartment
rooftop. Few wanted to watch the official Israeli TV channels. What
they wanted was TV from Jordan, Syria, and Egypt, which required much
larger yagi antennas. Incidentally, much of the Arab broadcasting was
either in English, or with Hebrew subtitles or Hebrew overdubbing.
In my innocence, I went to the Ministry of the Post to inquire about
setting up a cable TV system that would hopefully get rid of the
antenna clutter. I was detained for questioning as some manner of
spy, saboteur, or total idiot. I should have guessed that there was a
reason that Israel didn't have a cable TV system. The government
certainly wasn't interested in an improved method of watching Arab
broadcasting.
A short while later, I repeated the mistake by attempting to establish
a land mobile radio business. At the time, private radio
communications were not exactly appreciated in an essentially wartime
economy just after the 1973 Yom Kippur War.
Politics always trumps technology.
--
Jeff Liebermann je...@cruzio.com
150 Felker St #D http://www.LearnByDestroying.com
Santa Cruz CA 95060 http://802.11junk.com
Skype: JeffLiebermann AE6KS 831-336-2558