Is there a way to determine a program's HD resolution?
For example: I am watching a soccer tournament on HDTV cable. I have
my cable box and my HD projector both set at the highest resolution.
However this broadcast doesn't have the resolution that I have
previously seen on other HD soccer events.
Can I determine the resolution of this particular broadcast?
What information is available about this particular broadcast?
thanks,
Ron
Depending on your equipment, you may be able to get a readout of the
scan rate (e.g., 1080i, 720p, or 480i), but that doesn't really tell
you much. The only way to determine the highest frequencies present
in the received signal (the "resolution") would be to use a spectrum
analyzer.
I've been looking for a used spectrum analyzer I can afford, but
haven't seen anything suitable for less than $3,000.
/Chris, AA6SQ
I don't know where you got your information from, but a spectrum
analyzer will NOT let you determine the resolution of an HD signal.
S.A.'s are great. I can't tune any transmitter or RF device without one.
However, they show NOTHING about the content (bits) of a signal.
I don't even believe my $45K Agilent 8VSB/QAM unit will do this.
Ron,
I can determine the picture resolution for a digital
TV transmission under these conditions:
Cable TV from Time Warner
Cable box is an SA3250HD
3250HD menu options selected: 1080i, 720p, 480p wide, 480p,
480i wide, & 480i
2nd cable menu set to 'Auto' DVI
DVI cable from the cable box to a Sony HDTV
HDTV is a 34HS510 CRT
As I switch digital cable stations using DVI,
I get on-screen format text of: 1080i, 720p, 480p, or 480i.
The wide text option has yet to show... need FOX?
I'm not sure which 'brain box' puts these format
labels on the lower right of the TV screen for
5 seconds.
In principle a spectrum analyzer will give you the (horizontal)
resolution of a TV signal. You simply hook the luma component
output of the receiver (STB) to the input of the spectrum analyzer.
We are talking the video baseband signal here, of course.
Then you simply look to see where, at what frequency, the actual
information in the signal stops and pure noise begins.
But IN GENERAL, it simply does not work. This is because there
is so little in a picture that is actually in focus and actually
has a big amplitude .... except the sync ... that you can't
see it over the noise from the rest of the picture.
I've done this with our two (sorry, only $38,000) HP (Agilent)
spectrum analyzers and it was hard work to find a scene
that worked.
What you really need to do is get a really good frame grab
of a picture with lots of in-focus (and this is important)
detail. Then you rummage around for a line with lots of
such detail, isolate out that one line, chop out the
section with detail, and do a spectrum analysis by FFT.
You can of course do this with a vertical cut too.
This works, though I've only done it
on single line scope traces.
As an alternative, you can look at a frame grab ...
or even a digital scope trace of a single line ... and
measure some risetimes in pixel times.
Of course, you have to be careful to distinguish between
detail from a camera or telecine and detail from a
computer-generated logo, which can be ... and are,
in my local NBC Olympics presentation ... wildly
different in resolution.
This works.
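If you want to try the frame-grab version on a computer, a rough
sketch in Python/numpy of the one-line FFT idea might look like this
(the file name, line number, and pixel range are just placeholders;
you still have to hunt for a line with real in-focus detail):

import numpy as np
from PIL import Image

# Load a frame grab and reduce it to luma (grayscale).
frame = np.asarray(Image.open("frame_grab.png").convert("L"), dtype=float)

row = 540                     # pick a scan line with lots of in-focus detail
line = frame[row, 600:1100]   # chop out just the section that has the detail

# Remove the DC level and window the chopped section so its edges
# don't smear energy across the whole spectrum.
line = line - line.mean()
line = line * np.hanning(line.size)

spectrum = np.abs(np.fft.rfft(line))
freqs = np.fft.rfftfreq(line.size)   # in cycles per pixel

# Multiply cycles/pixel by the pixel clock (74.25 MHz if the grab is at
# the full 1080-line raster) to read the axis in MHz, then look for where
# real content gives way to the noise floor.
for f, a in zip(freqs, spectrum):
    print(f"{f * 74.25:6.2f} MHz   {20 * np.log10(a + 1e-12):7.1f} dB")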
Doug McDonald
I once picked up an old klystron-based Polarad at a junk place for $90.
It worked well, from 0 to 40 GHz!
Kinda lost it in the hurricane, though.
--
Dave Oldridge+
ICQ 1800667
A false witness is worse than no witness at all.
Untrue. You don't look at the RF (modulated signal), but at the
analog video after detection (component out). That also has a
spectrum, with the highest frequencies corresponding to the highest
transmitted detail. As a subsequent poster has noted, it isn't easy
even then -- you need an analyzer that will integrate over a long
enough period to capture enough bits of signal to register.
Unfortunately, it's been years since I worked at a place with access
to such gear, and I'm not willing to spend a year's salary to
purchase my own ...
/Chris, AA6SQ
No, that doesn't tell you the resolution, only the transmitted scan rate.
An SD picture upconverted to 1080i will show as 1080i, but will
have only the resolution of the original SD source, not HD resolution.
There is *no* inexpensive way to determine how much resolution is
available in the transmitted signal, unless the station happens to
choose to transmit a calibrated test pattern. HDNet does, or used to do,
this sometimes at around 5 am.
Of course, your eyes will give you a good idea -- the resolution of
the USC-VT game on ESPN at the moment looks like quality HD to me.
/Chris, AA6SQ
There is no point in reading the specs. It is possible to have a signal in the
highest resolution mode, but if the source was upconverted to this mode,
then the picture quality will be the same as the original source. It is
impossible to add something that is not present in the first place.
I have seen many HDTV and digital broadcasts where the picture quality is
not very good. This is because the source they came from was not of good
quality.
This is the same concept as having a very high-end sound system that costs
tens of thousands of dollars, and then putting on a CD that was
recorded from an old 45 RPM record with many scratches on it.
Or you can take your HDTV set and connect it to a standard CATV normal
resolution cable system. You will definitely see the difference there! But
the set will only resolve the best signal content that it can receive.
--
Jerry G.
==========================
"Chris Thomas" <cth...@mminternet.com> wrote in message
news:MPG.1b9a32c45...@news.mminternet.com...
This is only partially true. Upconverting with a good scaler can reduce
jaggies that would appear on a true 480 scan line picture. I know it is
"fake" and not real resolution, but the picture can look (and really *be*)
better after up-scaling.
You can also do slight edge enhancement with the extra pixels and make the
picture appear to be better. Too much and it looks like crap again, but
a very little (one or two pixels or so) will make it look better to most
eyes.
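For the curious, here is a rough sketch of that kind of mild edge
enhancement, done as an unsharp mask in Python (numpy/scipy); the
radius and amount are purely illustrative, not anybody's production
settings:

import numpy as np
from scipy.ndimage import gaussian_filter

def mild_edge_enhance(luma, radius=1.0, amount=0.3):
    # Unsharp mask: add back a small fraction of the high-pass detail.
    # Keep the radius at a pixel or two and the amount small, or it
    # starts to look like crap again.
    blurred = gaussian_filter(luma.astype(float), sigma=radius)
    return np.clip(luma + amount * (luma - blurred), 0, 255)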
--
Jeff Rife | Sam: What d'ya say to a beer, Normie?
SPAM bait: |
Ask...@usdoj.gov | Norm: Hi, sailor...new in town?
sp...@ftc.gov |
Still think you're blowing smoke. I'm willing to try it, if you post a
procedure. Spectrum Analyzers cannot pick out bits.
If you're right, I will post an apology.
Me don't thinks so.
They JUST ain't that FAST.
>
>
> Still think you're blowing smoke. I'm willing to try it, if you post a
> procedure. Spectrum Analyzers cannot pick out bits.
>
> If you're right, I will post an apology.
>
> Me don't thinks so.
>
> They JUST ain't that FAST.
I've already replied!!! It is NOT EASY to do this with a spectrum
analyzer!!
We are not talking ANYTHING about bits here ... we are talking
about an analog signal ... the baseband luma signal coming
out of the component output of the set-top box.
There is of course no problem with the "fastness" of a spectrum
analyzer .... in terms of frequency, the absolute max
frequency is going to be in the 30 MHz range, which is total peanuts
for a typical spectrum analyzer.
The typical spectrum analyzer, of course, scans frequency. For this
purpose you would set it to a 300kHz or so bandpass, and let it
scan. Modern spectrum analyzers will average scans, so you let it
average a while until the noise in the spectrum due to the changing
video goes away, or at least sort of goes away. At this point
a naive view is that you just look at the screen. If you did this on
a signal without sync pulses that would be true. But the luma
on component carries the sync ... and it has sharp edges. So
you have to do an average on a blank screen (tune to a blank channel
and hope there is no message box on the screen) and subtract
the spectrum of that.
Once all this is done, normally you will see a spectrum
that tapers out into mud at the higher frequencies and you
won't be able to REALLY tell where the actual spectral content
of the picture stops.
I've done it often, believe me.
***********************
On the other hand, if you get a digital oscilloscope with
a really long memory, set it to go up to 50 MHz or so and
fill it up with a scene that looks like it has a lot of
sharp content. Transfer to a computer. Go looking for sharp
edges that are not sync. Align the midpoints of a bunch of these
in time (all going up or all going down, of course) and add them
up. Take a length of points equal to about 5 or 10 times the
risetime and calculate the spectrum from the Fourier transform.
That will give you a good idea of what is going on. You will
note a big difference between edges in a real picture and
edges from logos and bugs ... the latter are sharper.
An alternate computation method is to find a bunch of
stuff with sharp looking edges, calculate the complex FFT,
take the absolute value, and add that up in the frequency domain.
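In rough Python/numpy terms (the function name, window length, and
threshold are placeholders; capturing the trace and skipping the sync
is up to you), the scope-capture version might look something like:

import numpy as np

def edge_spectrum(samples, fs, n_win=512, rel_threshold=0.1):
    # samples: one long luma record from the scope (sync edges excluded),
    # fs: sample rate in Hz.
    x = np.asarray(samples, dtype=float)
    swing = x.max() - x.min()
    # Find the steep rising edges -- the candidates for real picture detail.
    edges = np.where(np.diff(x) > rel_threshold * swing)[0]
    acc = np.zeros(n_win)
    count = 0
    for i in edges:
        if n_win // 2 <= i < x.size - n_win // 2:
            seg = x[i - n_win // 2 : i + n_win // 2]
            acc += seg - seg.mean()   # crude alignment: stack the edges
            count += 1
    if count == 0:
        raise ValueError("no usable edges found")
    avg = acc / count
    spec = np.abs(np.fft.rfft(avg * np.hanning(n_win)))
    freqs = np.fft.rfftfreq(n_win, d=1.0 / fs)
    return freqs, spec

Plot spec against freqs and look for where the stacked edges dive into
the noise; a logo edge will hold up to a noticeably higher frequency
than anything that came off a camera.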
Doug McDonald
Thanks for the information. It is unfortunate that one can't get this
information easily. It would be very useful.
If the FCC required that all HD broadcasts identify the "source"
resolution momentarily at the beginning of each broadcast (during the
title/episode #), would that be a bad thing?
Maybe in another 5 years or so...
again thanks for setting me straight,
Ron
Chris Thomas <cth...@mminternet.com> wrote in message news:<MPG.1b9ac45bb...@news.mminternet.com>...
Thanks, Doug. You make a lot of the points I was going to. And I
KNOW it ain't easy! My first real job (a couple of careers ago) was
as a transmitter tech for a local TV station. The head of the
engineering dept was an old curmudgeon who was fanatical about the
signal quality. So I was tasked to give him a weekly report of
"effective" video BW. This was beaucoup years ago, and spectrum
analyzers were nothing like today. But I had the advantage that I
could, when we went off the air at night, transmit test patterns of
my choice. I also did stuff like listening to the demod'd video with
a comm receiver - a poor man's s.a. But I digress...
If anyone else wants to try what we're talking about, here's a lower
tech way: Draw a test pattern of equally spaced vertical black and
white lines, tack it on the wall, and point your NTSC camcorder at
it. Hook an o'scope to the composite output of the camera. You will
see (mostly) a square wave, along with glitching caused by the horiz
and vert sync and blanking. Or, take your NTSC DVD player and play
one of the DVDs (AVIA et al) with test patterns. Pick a test pattern
with vertical lines. Look at the composite or component video out
with an o'scope. You will see a similar square wave.
Now zoom the camcorder out (this is harder with the DVD!) so the
lines become closer together. The square wave will increase in
frequency and degrade, becoming more like a sine wave (ask Nyquist
why). At some point, you will get a sine-ish wave of .707 the
amplitude of the square wave (-3 db point). The frequency of this
sine wave is approximately the video bandwidth. (This is much easier
to describe than to do, because of noise, sync pulses, etc. It may
depend on having the right o'scope - sample and hold helps a lot.)
Now, you can mathematically change this time-domain view into a
frequency-domain view by applying a Fourier transform. This is
trivial because we're just talking about a square wave through a low
pass filter, but I'll assume everybody else's calculus is rusty too.
So plug in your spectrum analyzer (which does the transformation for
you!) in place of the o'scope. You will see some low frequency
components (60 Hz vertical, 15 kHz horizontal, plus LOTs of
harmonics), and a big blip of energy around 3.5 - 5 MHz. That energy
is the test pattern pixels. (Frequency depends on whether we're
talking B&W or color - don't want to date myself here, but I said this
was a while ago.)
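If you want to see the square-wave-into-sine-wave behavior numerically
before dragging out any hardware, a little Python/numpy toy (the 1 MHz
bar frequency is made up) shows where the energy sits:

import numpy as np

fs = 27e6                      # sample rate for the toy signal
t = np.arange(0, 2e-3, 1 / fs)
f_bars = 1e6                   # black/white bar frequency (illustrative)

bars = np.sign(np.sin(2 * np.pi * f_bars * t))   # ideal square-wave bars
spec = np.abs(np.fft.rfft(bars * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

# A square wave puts its energy at the fundamental and the odd harmonics.
# A channel that rolls off just above f_bars keeps only the fundamental,
# which is why the bars turn into a sine wave on the scope.
for k in (1, 3, 5, 7):
    i = np.argmin(np.abs(freqs - k * f_bars))
    print(f"{k}x ({k * f_bars / 1e6:.0f} MHz): {spec[i]:8.1f}")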
Doing the same test with HD video and a max-res test pattern, you
would see the energy around 30 MHz. Note that the max freq depends
on which color output you're looking at, assuming you use the
component out. At least I think it does. Also notice that this
frequency is NOT directly related to the 19.4 Mb/s ATSC payload
(squeezing video with ~30 MHz of detail into a 19.4 Mb/s stream is one
reason why ATSC uses MPEG encoding.)
If you get rid of the test pattern and look at real video, instead of
a nice peak at one frequency, you see a lot of low frequency video
energy, tailing off at higher frequencies, until it reaches the
noise floor. As Doug says, it's hard/next to impossible to tell
where the highest frequency video is -- it's at best a judgement
call.
But, other than getting stations/networks to transmit a test pattern
in the wee hours, I don't know any other way to determine what
resolution you're receiving. And my opinion is that relatively few
operators these days care enough about video quality to bother doing
that.
/Chris, AA6SQ (plus a Phone I commercial somewhere in the files)
OK. so if you've done this often, post a procedure. I am more than
willing to borrow my S.A. from work ( Tek 494 ). Sorta old but good..
>
>
> But, other than getting stations/networks to transmit a test pattern
> in the wee hours, I don't know any other way to determine what
> resolution you're receiving.
As I pointed out, that's not necessary ... nor, indeed, with
MPEG encoders that may change the highest frequency
depending on the content, is it really even meaningful.
The method I described using oscilloscope capture
and computer processing works, however. The trick is to
pick the proper part of the picture.
Doug McDonald
>>
>>Once all this is done, normally you will see a spectrum
>>that tapers out into mud at the higher frequencies and you
>>won't be able to REALLY tell where the actual spectral content
>>of the picture stops.
>>
>>I've done it often, believe me.
>>
>
> OK. so if you've done this often, post a procedure.
That's what I did. Just do it ... I assure you, it's
not gonna work no matter how carefully you do it,
without a very special picture to work with. If you do
try it, I suggest "Life with Bonnie".
Doug McDonald
http://zone.ni.com/devzone/conceptd.nsf/webmain/0B04C09D4A44C78186256C3F007F8B02?opendocument&Submitted&node=175341_US
This person does a fair job of explaining it, too, with some diagrams and
I've always liked the BW = 0.35/risetime formula.
You can get the maximum signal risetime off a very fast scope sweep,
positive-going trigger, trigger level at the 10% point.
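(As a worked example with made-up numbers: a measured 10-90% risetime
of 10 ns gives BW = 0.35 / 10 ns = 35 MHz.)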
"Sal"
"Sal M. Onella" <salmo...@food.poisoning.org> wrote in message
news:ZVSYc.226536$Oi.161235@fed1read04...
While we are at it, let's just do a wavelet transform and look at the
coefficients...
Come on, people, the guy asked a simple question, and the simple answer is
no, there is no way to know the originating resolution. What you get is the
result of a number of intervening compromises.
Leonard
> "Doug McDonald" <mcdo...@scs.uiuc.edu> wrote in message
> news:cgttrf$93p$1...@news.ks.uiuc.edu...
>
>>An alternate computation method is to find a bunch of
>>stuff with sharp looking edges, calculate the complex FFT,
>>take the absolute value, and add that up in the frequency domain.
>>
>
I tried the oscilloscope - computer FFT yesterday on my
own actual STB and Jay Leno.
Hints: you need some sort of apodization. I first took 2048
samples, then made them into 4096 samples by attaching a reversed
copy of the first 2048, resulting in a symmetric pattern. Then
I subtracted the average of this from each sample, multiplied by
a single half-cycle sine apodization function, and then re-calculated
the average and subtracted it off. Then FFT and calculate the
absolute value.
You might also try not doubling the sample, just subtracting off the
average and a straight line between the two ends (so there is no
step at the join point if you make it into a circle.) I have not
tried this.
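For anyone who wants to try it, the steps above translate into numpy
roughly like this (a sketch only):

import numpy as np

def mirrored_sine_window_spectrum(samples):
    # A rough numpy rendering of the steps described above: mirror the
    # 2048-sample record into a symmetric 4096-sample pattern, remove
    # the mean, apply a single half-cycle sine apodization, remove the
    # mean again, then take the magnitude of the FFT.
    x = np.asarray(samples[:2048], dtype=float)
    sym = np.concatenate([x, x[::-1]])
    sym = sym - sym.mean()
    window = np.sin(np.pi * np.arange(sym.size) / sym.size)
    sym = sym * window
    sym = sym - sym.mean()
    return np.abs(np.fft.rfft(sym))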
Results: this was 1080i output from the box. It appears the analog
risetime (1/e time) of the box is about 9 nsec, or 3 dB down at 18 MHz.
There is very large output on many samples up to 14 MHz. There are
occasional samples with fairly discrete peaks out to 22 MHz ... the
most prominent being 22 MHz, which is three times a fundamental of
about 7 MHz from stripes in a building in the set behind Leno. There
is some information out to about 25 MHz and noise out to 30 MHz, but
it is weak.
Doug McDonald
> Hints: you need some sort of apodization. I first took 2048
> samples, then made them into 4096 samples by attaching a reversed
> copy of the first 2048, resulting in a symmetric pattern. Then
> I subtracted the average of this from each sample, multiplied by
> a single half-cycle sine apodization function, and then re-calculated
> the average and subtracted it off. Then FFT and calculate the
> absolute value.
I confess no understanding of apodization. I see it appears to be
related to a Fourier series, which I do understand.
In all this, I am reminded of a story about a problem on a Physics test:
"Describe the use of a barometer to approximate the height of a building."
A few interesting answers:
1) Drop the barometer off the roof and time the fall.
H = one-half g(t-squared)
2) Tell the building manager, "I'll give you this nice barometer if you
tell me how tall your building is."
3) And the most popular answer was to lower the barometer on a string and
measure the string.
> Sal M. Onella wrote:
>
>> "Doug McDonald" <mcdo...@scs.uiuc.edu> wrote in message
>> news:cgttrf$93p$1...@news.ks.uiuc.edu...
>>
>>>An alternate computation method is to find a bunch of stuff with sharp
>>>looking edges, calculate the complex FFT, take the absolute value, and
>>>add that up in the frequency domain.
>>>
>>>
>>
>
> I tried the Oscilloscope - computer FFT yesterday on my actual own STB and
> Jay Leno.
>
<snip>
> There is very large output on many samples up to 14 MHz. There are
> occasional samples with fairly discrete peaks out to 22 MHz ... the most
> prominent being 22 MHZ which is three times a fundamental of about 7 MHz
> from stripes in a building in the set behind Leno. There is some
> information out to about 25 MHz and noise out to 30MHz, but it is weak.
>
Leno takes a 38 Mb/s 4:2:2 backhaul to the broadcasters before being
decoded and re-encoded to 19 Mb/s for broadcast. It would be interesting to see
how HDNet's live feed looks. They encode to 19 Mb/s at the production truck
and pass the MPEG stream all the way to the home STB before it is ever
decoded. (Their master control is in the MPEG domain.)
>For example: I am watching a soccer tournament on HDTV cable. I have
>my cable box and my HD projector both set at the highest resolution.
>However this broadcast doesn't have the resolution that I have
>previously seen on other HD soccer events.
Apparently the event was filmed using SDTV equipment and upconverted
to HDTV for broadcasting.
One way to look at the problem is to partially decode the MPEG-2 stream
and get the I-picture DCT coefficients. The highest DCT components
(representing the highest spatial frequencies in horizontal, vertical
and diagonal direction) should always be zero in an upconverted
signal.
Depending on what coefficients are present, you might be able to
create a heuristic algorithm to determine the lowest resolution used
during the processing chain.
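As a rough illustration of the idea in Python (this sketch works on the
decoded I-picture luma rather than on the coded coefficients themselves,
which would require hooking into a real MPEG-2 decoder, but the
heuristic is the same):

import numpy as np
from scipy.fft import dctn

def high_freq_fraction(luma, cutoff=5):
    # For each 8x8 block of the decoded I-picture luma, take the 2-D DCT
    # and measure how much of the AC energy sits in coefficients whose
    # horizontal or vertical index is >= cutoff.  For material upconverted
    # from SD, that fraction should stay very close to zero.
    h = luma.shape[0] - luma.shape[0] % 8
    w = luma.shape[1] - luma.shape[1] % 8
    total_ac = 0.0
    total_high = 0.0
    for y in range(0, h, 8):
        for x in range(0, w, 8):
            c = dctn(luma[y:y + 8, x:x + 8].astype(float), norm="ortho")
            c[0, 0] = 0.0                      # drop the DC term
            e = c ** 2
            total_ac += e.sum()
            total_high += e[cutoff:, :].sum() + e[:cutoff, cutoff:].sum()
    return total_high / max(total_ac, 1e-12)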
Paul
LCD and Plasma technology (when Plasma is real HD :) is typically 720p
native, although Hitachi claims all their plasma displays are
1080i(?).
Ignore manufacturer statements such as this one from Pioneer's website:
"High-Definition, Defined If you want to enjoy high-definition
programming in all its splendor, your TV must have a minimum native
resolution of 720p. The "p" stands for progressive, which means that
all the pixel rows are shown at once, instead of half at a time like
interlaced."
The above statement is completely bullshit. Although 720p is
considered to have a slight edge over 1080i, it's a highly subjective
debate that comes down to the picture quality vs performance.
Pioneer's statement is also very misleading because 85% of the HD
content broadcasted today is sent out in 1080i. This accounts for all
of the cable/satellite channels (except ESPN HD) being broadcasted and
CBS, NBC, WB. ESPN HD, ABC and FOX (the latter of which rarely
broadcast in HD) are the only networks electing 720p as their native
format.
The most important thing is to make sure whatever set you buy can do
both and that it supports either 720p or 1080i native. Many Plasma
displays can really only do 480p native and they upconvert to 720p.
I have no idea what "the technology inside is interlaced" even means.
Do you?
--
Thor Lancelot Simon t...@rek.tjls.com
But as he knew no bad language, he had called him all the names of common
objects that he could think of, and had screamed: "You lamp! You towel! You
plate!" and so on. --Sigmund Freud
Also, I've yet to find a DLP-based set that does 720p native.
Regarding LCD, I know it is always natively progressive scan. You'll
never find an LCD display that does 1080i native.
t...@panix.com (Thor Lancelot Simon) wrote in message news:<cij5de$f3s$1...@panix5.panix.com>...
> Not that hard. CRT and DLP rear-projection and CRT direct-view sets are
> always 1080i native (because the technology inside is interlaced); some
> models offer a 720p to 1080i auto converter built in (this is why you'll
> notice some models claim to support both).
INCORRECT!!! DLP RP sets are NOT 1080i. They are 720p.
>
> LCD and Plasma technology (when Plasma is real HD :) is typically 720p
> native, although Hitachi claims all their plasma displays are
> 1080i(?).
Also incorrect. Some RP-LCD is 720p, others (Sony) are 768p ...
a major mistake unless they simply leave 48 lines blank
when displaying 720p material.
>
> Ignore manufactuer statements such as this one from Pioneer's website:
>
> "High-Definition, Defined If you want to enjoy high-definition
> programming in all its splendor, your TV must have a minimum native
> resolution of 720p. The "p" stands for progressive, which means that
> all the pixel rows are shown at once, instead of half at a time like
> interlaced."
>
That statement is correct. When they say "minimum" they
are including 1080 (i or p) in the statement.
> The above statement is completely bullshit. Although 720p is
> considered to have a slight edge over 1080i, it's a highly subjective
> debate that comes down to the picture quality vs performance.
> Pioneer's statement is also very misleading because 85% of the HD
> content broadcasted today is sent out in 1080i.
> This accounts for all
> of the cable/satellite channels (except ESPN HD)
You don't know this for most channels. The cable and
satellite companies can use any resolution they wish unless
the provider has more power and has a rigid contract that
demands 1080x1920. It is generally thought that they
ARE cheating.
> being broadcasted and
> CBS, NBC, WB. ESPN HD, ABC and FOX (the latter of which rarely
> broadcast in HD) are the only networks electing 720p as their native
> format.
FOX is broadcasting quite a lot in HD. Especially sports.
Doug McDonald
>
> Also, I've yet to find a DLP based set that does 720p native.
>
The Samsungs do .... what on earth are you thinking???
Doug McDonald
>> LCD and Plasma technology (when Plasma is real HD :) is typically 720p
>> native, although Hitachi claims all their plasma displays are
>> 1080i(?).
>
>Also incorrect. Some RP-LCD is 720p, others (Sony) are 768p ...
>a major mistake unless they simply leave 48 lines blank
>when displaying 720p material.
I don't see a problem with upconverting from 720p to 768p as long as
the signal is from a sampled natural source, such as a camera or film
scanner. The signal source should limit the spatial frequencies to below
the Nyquist frequency (fs/2) anyway, so the upsampling is safe.
However, if computer graphics is going to be displayed, which contains
frequencies at or above the Nyquist frequency, band-limiting the
input to below fs/2 before upconversion will degrade the image. Thus,
when an LCD is used as a computer monitor, e.g. for text processing, the
native resolution should be used to eliminate the need for band
limiting.
For 1080i signals, downconverting to 768p should give about 7 %
better resolution than downconverting to 720p. However, both
conversions will suffer from deinterlacing artifacts.
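(That 7 % is just 768 / 720 ≈ 1.067, i.e. about 7 % more lines survive
the downconversion from 1080.)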
Paul
Doug McDonald <mcdo...@scs.uiuc.edu> wrote in message news:<cik650$2h2$1...@news.ks.uiuc.edu>...
> JamesMason wrote:
> >
> > LCD and Plasma technology (when Plasma is real HD :) is typically 720p
> > native, although Hitachi claims all their plasma displays are
> > 1080i(?).
>
> Also incorrect. Some RP-LCD is 720p, others (Sony) are 768p ...
> a major mistake unless they simply leave 48 lines blank
> when displaying 720p material.
>
>
768p IS NOT a DTV resolution; it's not acknowledged by the ATSC as a
valid HD format. No one broadcasts in 768p, and no one ever will. So the
Sony is ALWAYS upconverting content to its "native" 768p. The
original poster's response that "in general" all LCD and Plasmas are
720p native holds true. I believe his point was that LCD and Plasma are
naturally progressive scan.
> >
> > Ignore manufactuer statements such as this one from Pioneer's website:
> >
> > "High-Definition, Defined If you want to enjoy high-definition
> > programming in all its splendor, your TV must have a minimum native
> > resolution of 720p. The "p" stands for progressive, which means that
> > all the pixel rows are shown at once, instead of half at a time like
> > interlaced."
> >
>
> That statement is correct. When they say "minimum" they
> are including 1080 (i or p) in the statement.
>
How do you come to that conclusion? I checked the statement he's
referring to at Pioneer's website. The statement appears to be a stab at
1080i, or at least an attempt to promote 720p as a superior standard
(which would make good marketing sense for Pioneer, since they no longer
produce CRT rear-projection sets).
Regardless, the statement is untrue. 720p is better for action
(sports), 1080i is higher resolution. Those statements may be up for
debate, but what's not debatable is that 1080i is the preferred format
for broadcasters and cable companies. The race isn't even close. If
your set does 720p natively it's having to convert a 1080i signal the
majority of the time, which means unfortunately on sets that do 720p
native you aren't able to "... enjoy high-definition programming in
all its splendor" as the Pioneer web site claims.
It is misleading.
> > of the cable/satellite channels (except ESPN HD)
> You don't know this for most channels. The cable and
> satellite companies can use any resolution they wish unless
> the provider has more power and has a rigid contract that
> demans 1080x1920. It is generally thought that they
> ARE cheating.
>
True, the cable companies can choose to use whatever format they wish.
They can even switch from 1080i to 720p if they desire, but they
don't. Here's a rundown of what the stations broadcast in:
Cable----
HDNet - 1080i
HDNet Movies - 1080i
INHD - 1080i
INHD2 - 1080i
Discovery HD - 1080i
HBO HD - 1080i
Showtime HD - 1080i
TMC HD - 1080i
Stars HD - 1080i
TNT HD - 1080i
Major Networks----
NBC HD - 1080i
WB HD - 1080i
CBS HD - 1080i
PBS HD - 1080i
The above stations always broadcast in 1080i. If you own an HDTV that
does 720p native, the signal is being converted (downgraded) to support
your set. Regardless of what the cable provider is sending out, this
is what the signal originates in. For Comcast and Time Warner cable,
everything is sent out in its original format, be it 720p or 1080i.
The ONLY networks broadcasting in 720p are:
ABC, UPN, FOX and ESPN HD.
>
> > being broadcasted and
> > CBS, NBC, WB. ESPN HD, ABC and FOX (the latter of which rarely
> > broadcast in HD) are the only networks electing 720p as their native
> > format.
>
> FOX is broadcasting quite a lot in HD. Especially sports.
>
>
And it makes sense that FOX and ESPN would select 720p. It's a
superior format for action.
>Regardless, the statement is untrue. 720p is better for action
>(sports), 1080i is higher resolution. Those statements may be up for
>debate, but what's not debatable is that 1080i is the preferred format
>for broadcasters and cable companies. The race isn't even close. If
>your set does 720p natively it's having to convert a 1080i signal the
>majority of the time, which means unfortunately on sets that do 720p
>native you aren't able to "... enjoy high-definition programming in
>all its splendor" as the Pioneer web site claims.
The real question is how many display devices are capable of
correctly displaying a true 1080i resolution.
For instance, for a direct view CRT, measure the screen height in
millimeters and divide it by 1080. If the result is smaller than the
CRT pitch (should be available from the manufacturer's
specifications), the CRT is not capable of displaying the full 1080
resolution.
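As a worked example: a 34-inch 16:9 tube has a screen height of roughly
16.7 inches, about 423 mm, and 423 mm / 1080 ≈ 0.39 mm, so any tube with
a pitch coarser than about 0.4 mm cannot fully resolve 1080 lines.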
Even if the CRT shadow mask pitch is sufficiently small, due to the
electron beam focus, some electrons will spill over to nearby pixels
(which in fact helps to reduce Moire) reducing the effective
resolution.
So in practice, the achievable vertical resolution is close to 540
pixels. In fact many manufacturers don't even bother to try to
interlace the two fields but instead shamelessly write both fields on
top of each other, resulting in a 540 pixel vertical resolution
regardless of whether the original source is 1080i or "upconverted" 720p
(effectively down converted to 540 :-).
On a 720p or 768p display, a 720p signal will always be displayed at
that resolution, as will down-converted stationary (or film based)
1080i images. For moving details in 1080i images, depending on the
quality of the scaler/deinterlacer, the true resolution may remain at
720/768 or drop to 540 (or, if the deinterlacer is bad and too
aggressive, even cause deinterlace artifacts).
The only real advantage of these 540-line displays at 1080i is the
elimination of digital deinterlace artifacts, but they may still suffer
from some of the general interlace artifacts.
The display quality for 720p or 768p (or other computer format) native
resolution displays depends very much on the quality of the
scaler/deinterlacer.
Paul
> Regardless, the statement is untrue. 720p is better for action
> (sports), 1080i is higher resolution. Those statements may be up for
> debate, but what's not debatable is that 1080i is the preferred format
> for broadcasters and cable companies.
That depends on the meaning of the word "is".
> The race isn't even close. If
> your set does 720p natively it's having to convert a 1080i signal the
> majority of the time, which means unfortunately on sets that do 720p
> native you aren't able to "... enjoy high-definition programming in
> all its splendor" as the Pioneer web site claims.
Sure they are: 720p is sufficiently superior that you lose
little if anything converting from 1080i. You lose nothing
in the vertical direction, of course, since 720 (the vertical
resolution of 720p) is greater than 540 (the maximum vertical
resolution of 1080i except in absolutely static scenes,
when it is, as used, about 700). You do lose possible resolution
horizontally (1280 vs 1440), but this is small. Note the
use of present tense. There is no signal source for TV that
actually even uses 1440, let alone 1920 ... there are no
common cameras that do an actual 1920, and while telecines
are capable of it ... you won't notice it on film, except
old 70mm movies and Imax.
Now of course 1080p is a different matter entirely, it
really IS 1 1/2 times the spatial resolution of 720p.
The companies using 1080i are making a mistake, pure and simple.
>
> Cable----
> HDNet - 1080i ...reportedly correct
> HDNet Movies - 1080i ..... who knows
> INHD - 1080i ditto
> INHD2 - 1080i ditto
> Discovery HD - 1080i ditto
> HBO HD - 1080i ditto
> Showtime HD - 1080i ditto
> TMC HD - 1080i ditto
> Stars HD - 1080i ditto
> TNT HD - 1080i ditto
>
> Major Networks----
> NBC HD - 1080i true
> WB HD - 1080i true
> CBS HD - 1080i true
> PBS HD - 1080i true
>
> The above stations always broadcast in 1080i. If you own a HDTV that
> does 720p native the signal is being converted (downgraded)
You do not know what ACTUAL resolution the cable and
satellite channels are using, except that reportedly Mark Cuban
requires HDNet to be non-downgraded. It ALMOST CERTAINLY
is much less than true 1080i.
> to support
> your set. Regardless of what the cable provider is sending out, this
> is what the signal orginates in.
> The ONLY networks broadcasting in 720p are:
> ABC, UPN, FOX and ESPN HD.
And their pictures are at least as good.
Now all of this would change if the 1080 people
were 1080p AND THAT WAS ACTUALLY FULLY USED. But they
are NOT USING .... NOBODY ... the full 1080x1920 resolution.
At BEST it is 540x1440 except for fully static images, which
could be roughly 700x1440. Anything higher vertically and
you would see hopelessly horrendous interlace artifacts.
The fact that I see very few interlace artifacts on a 1080i HDTV means
that they ARE NOT TRANSMITTING ANYWHERE NEAR 700 lines vertically.
Doug McDonald