A CRT TV should give you better picture quality than an LCD TV, but
picture quality isn't necessarily the reason why people buy LCD TVs.
How sure are you that when you get a digital signal you are not still in
a marginal area? Your choice for digital may have to be via satellite,
either a subscription service, 'Free to Air', or Freesat. A TV with an
inbuilt Freeview tuner may not be the best option.
--
Alan
news2006 {at} amac {dot} f2s {dot} com
Most people would agree that a really good CRT - correctly set up - looks
better than an LCD or plasma.
HOWEVER - it is very often not the case in real life. CRTs can go out of
focus; it is very challenging indeed to achieve perfect beam focus across
the entire screen. The electronics are sophisticated and it is quite common
for a TV to gradually drift out of focus as the years pass.
Secondly, convergence. Perfect convergence across the entire screen is,
again, immensely challenging to achieve, and it is very common for a CRT TV
to show convergence errors after some time in service.
The great thing about LCD and plasma is that you don't need to worry about
any of these.
On the other hand, most would agree that an LCD picture is less rich and
dynamic than a CRT picture, and in particular they are still prone to
"glowing" blacks.
They've made great strides in this area, but LCDs show blacks by blocking
off the light from the backlight. There is always some light leakage, hence
the glowing greys. A few manufacturers have introduced hacks by varying the
intensity of the backlight dynamically, but the bottom line is, it's an
inherent weakness in the technology.
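To make the dimming 'hack' concrete, here's a rough sketch of the
principle in Python (the figures are illustrative guesses, not any
manufacturer's actual algorithm):

    # Global dynamic backlight dimming, in principle: dim the lamp on
    # dark frames, then scale the pixel drive back up to compensate.
    def dim_frame(pixels, leakage=0.01):
        # pixels: luminance values in 0.0..1.0 for one frame
        backlight = max(max(pixels), 0.1)  # never turn the lamp fully off
        compensated = [min(p / backlight, 1.0) for p in pixels]
        # Displayed luminance = backlight level x panel transmission;
        # the leakage term is what causes the 'glowing' blacks.
        return [backlight * (leakage + (1 - leakage) * p) for p in compensated]

    # On a dark frame the black level falls from 0.01 to 0.001, because
    # the leakage itself is scaled down along with the lamp:
    print(dim_frame([0.0, 0.05, 0.1]))

The catch, of course, is that a single bright object in a dark scene
forces the lamp back up, and the blacks glow again.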
Plasma avoids this problem, but has its own shortcomings. They use a lot
of energy, for one thing. The other issue is that if you don't sit well
back you can clearly see the pixel 'mesh'.
In summary, then: most people would agree that a perfectly set up CRT gives
a better picture than an LCD or plasma. But most CRTs are running with
focus and convergence problems, which make the picture look fuzzy compared
with an LCD or plasma.
Don't think for a moment that LCDs and plasma took over from CRT because
they are better! They took over for three good reasons:
1/ Cheaper. After decades of manufacture, there is no more room for driving
down the manufacturing costs of CRTs, and frankly they are expensive to
make. LCDs and plasma screens are also expensive to make, but both offer
great potential for ongoing reductions in manufacturing costs. That means
more profit for Sony, et al.
2/ Bigger. It is a real challenge to make a CRT work satisfactorily above
about 36". There is no inherent limitation in the size of LCDs and plasma
screens, and the demand for high quality, large screens meant a move away
from CRT technology was required.
3/ Flat. Most people prefer the dead flat viewing screen an LCD or plasma
can offer. More importantly, CRTs are extremely deep compared with LCD and
plasma, especially in the larger sizes. Bulky TVs were becoming less
acceptable to owners.
I would compare the picture on your particular TV with those you see in the
shops, and then decide if you want to switch. There's certainly no rush:
LCD and plasma are improving all the time, and OLED screens will be coming
along in the next few years.
SteveT
[snip good well made points]
I would add that a major problem (challenge :-) ) with any flat panel
display, LCD, Plasma, or OLED, is that of temporal resolution. TV is,
and will probably remain, for the most part, interlaced. Flat panel
displays de-interlace the signal, so that it can be displayed
progressively. This plays havoc with moving objects, in particular fast
moving scrolling credits which are unreadable in many cases.
When looking at flat panel displays, concentrate on their ability to
render fast moving objects well.
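If it helps to visualise the problem, here is a toy Python sketch of the
two classic de-interlacing strategies (real sets use motion-adaptive
mixtures of these, and the details are proprietary):

    # Build a progressive frame from two interlaced fields, two naive ways.
    def weave(odd_field, even_field):
        # Interleave the fields: full vertical detail on static shots,
        # but 'combing' on anything that moved between the two fields.
        frame = []
        for o, e in zip(odd_field, even_field):
            frame += [o, e]
        return frame

    def bob(field):
        # Line-double one field: no combing, but only half the vertical
        # resolution, which is what makes fine moving detail such as
        # scrolling credits go soft or unreadable.
        frame = []
        for line in field:
            frame += [line, line]
        return frame

    odd, even = ["o1", "o2"], ["e1", "e2"]
    print(weave(odd, even))  # ['o1', 'e1', 'o2', 'e2']
    print(bob(odd))          # ['o1', 'o1', 'o2', 'o2']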
: A CRT TV should give you better picture quality than an LCD TV, but
: picture quality isn't necessarily the reason why people buy LCD TVs.
It depends what you mean by "picture quality" - CRT TVs do not have the
resolution for HD-TV (as sold in this country!)
Having said that I agree that people buy LCD and Plasma TVs also to
reclaim room space and to move up to larger screen sizes.
<< the most helpful post I've seen on USENET in ages >>
> SteveT
Just saying "nice work"
BugBear
>Secondly, convergence.
>The great thing about LCD and plasma is that you don't need to worry
>about any of these.
Sadly, 99.9 percent of the general public would not notice if the focus
or convergence was slightly out :)
Well it's fine on our old B&O and no doubt it will stay that way for some
time yet.
The main problem is that the digital pix aren't as good as the existing
analogue ones, but then again sometimes the analogue pix we see nowadays
aren't as good either, them having been nadgered before transmission..
So I'd hang onto your old TV for as long as you like. Assuming it's got a
SCART socket, and almost all have, you can always hang an external
terrestrial TV box on it when conversion does take place, or go Freesat at
any time...
--
Tony Sayer
If you mean resolution as in the subject then yes a decent LCD is likely
to have better resolution than a cheap CRT.
--
Dave Plowman da...@davenoise.co.uk London SW
To e-mail, change noise into sound.
They had them about 25 years ago at IBC.
>(as sold in this country!)
Of course. The usual story - "We could but we don't".
Rod.
--
Virtual Access V6.3 free usenet/email software from
http://sourceforge.net/projects/virtual-access/
[snip loads of good stuff]
I think you've covered everything except geometry. Most CRTs have
noticeable geometry problems, and cheap ones tend to be appalling.
My last CRT was a low-end Toshiba. Tall buildings would go all
wibbly-wobbly as the camera panned past them. On static shots, the
background would bend and sway as things moved around in the
foreground...
The low-end Samsung LCD I replaced it with (for its thinness) has some
pretty unnerving artifacts too, of course. I particularly like the way
that things which ought to be rigidly fixed together, appear to move
independently. Walls and ceilings on interior shots for example, or
noses and eyes on full-face closeups. I haven't yet worked out if this
is down to the TV, or if it's an MPEG artifact. I'm sure I never
noticed it on DTT on the CRT though.
Cheers,
Colin.
And they'll still have them at this year's too!
--
Mark
Please replace invalid and invalid with gmx and net to reply.
Thank you! <blush>
SteveT
the main reason someone would think the picture was worse is due to size.
if there was a 65 inch crt tv available, SD channels would look strange on
that too.
people expect big screen lcd and plasma to somehow look as clear as their 28
inch crt.
--
Gareth.
that fly...... is your magic wand....
>> It depends what you mean by "picture quality" - CRT TVs do not have the
>> resolution for HD-TV
>
> They had them about 25 years ago at IBC.
>
>>(as sold in this country!)
>
> Of course. The usual story - "We could but we don't".
why would they make them? - the public certainly wouldn't buy them.
>Until recently I have assumed that the picture quality of an LCD
>screen was streets ahead of a CRT screen, but one or two comments here
>have made me wonder if that assumption is correct. Can someone
>enlighten me?
As you have a low end CRT, a decent LCD would be a huge improvement. I
dropped CRTs two years ago and wouldn't dream of going back.
--
Andrew, contact via http://interpleb.googlepages.com
Help make Usenet a better place: English is read downwards,
please don't top post. Trim replies to quote only relevant text.
Check groups.google.com before asking an obvious question.
>
> why would they make them? - the public certainly wouldn't buy them.
There's market manipulation going on. I was in Kazakhstan in June.
Browsing in their equivalent of Comet/Currys, only half the TVs on sale were
LCD/plasma, and they were the same models seen here (at about the same price too).
The other half were CRTs. Almost all 4:3, in sizes up to 36 inch. 'Quality
brands': Sony, Panny, etc.
you really think if half the tvs on sale in uk shops were still CRT they
would continue to sell?
you can buy one still in the uk if you want - but it's no coincidence they
are the cheap ones.
my last CRT was a loewe aconda - a top of the range model.
it does look nice but it just seems soft somehow compared to my lcd.
maybe it's just what you're used to.
> you really think if half the tvs on sale in uk shops were still CRT they
> would continue to sell?
Not any more, no. People are buying large screen flat panels because they're
the latest 'must have' fashion item. Living rooms haven't suddenly doubled in
size, so why have the screen sizes? There's no point at all in having a 40 inch
screen unless you're going to double your viewing distance, or go HD.
An alarming number of people who are actually equipped with HD are apparently
only connecting their Sky HD boxes via SCART, so they're not even watching HD
when they could be! It shouldn't come as any surprise; look at what's been
going on for the last 10 years with 16:9 TV sets being fed with 4:3 CCO (centre
cut-out) material and then having the image stretched to fill the screen.
Don't get me wrong, I've got a 40 inch LCD, but I bought it because I've also
got a PS3 (i.e. a Blu-ray player) and a Sky HD box. I'm well aware of the zero
market for CRT. I had to give away my 32 inch CRT.
Only a few months ago I bought an excellent 36 inch Toshiba (with 100Hz,
digital surround, etc) for under £200 second-hand; it cost over £1600 new
3-4 years ago. I could've bought a new but very sh*t LCD for a little more
than that. Very glad I didn't, but I have the space. I have no desire to pay
the subscription for HD or get Freesat until 90% of programmes are broadcast
in HD and it is the de facto standard. Hopefully by then the latest flat
panel TVs will be on a par with today's (or should I say yesterday's) CRTs
at a reasonable price point.
As with cameras, resolution alone does not equal image quality.
Z
Agreed, though for a great deal of typical television material, it
needn't be very noticeable. TV pictures tend to be moving, and not to be
dominated by static straight lines as in computer programs.
> My last CRT was a low-end Toshiba. Tall buildings would go all
> wibbly-wobbly as the camera panned past them. On static shots, the
> background would bend and sway as things moved around in the
> foreground...
This can often be helped simply by backing off the contrast a little.
Many people have their TV sets in brightly lit areas and overdrive them
to try and compete with daylight.
> The low-end Samsung LCD I replaced it with (For its thinness) has some
> pretty unnerving artifacts too of course. I particularly like the way
> that things which ought to be rigidly fixed together, appear to move
> independently. Walls and ceilings on interior shots for example, or
> noses and eyes on full-face closeups. I haven't yet worked out if this
> is down to the TV, or if it's an MPEG artifact. I'm sure I never
> noticed it on DTT on the CRT though.
I've been watching DTT on a CRT display for several years, and the
effect you describe is noticeable. It seems to be an effect of too much
bit-rate reduction in the digital transmission system. I think the
system must decide how much compression to give various picture areas
based partly on brightness, because a particularly annoying
manifestation is when someone's face is side-lit, and they move
slightly. The two parts of the person's face that are at different
brightness levels are probably being given different amounts of
processing, thereby ending up with different amounts of delay, so when
the person moves their face, one part lags behind the other. As this
deficiency is inherent in the system, there's probably no cure, so we're
stuck with it forever, or at least until one of our great-grandchildren
invents analogue television and the world thinks it's new.
The public will apparently buy what they are told to want, or what they are
told is fashionable, or in some cases what they are told is superior simply
because it is old. There isn't usually a problem making the public want
whatever it is most profitable to sell.
Some members of the public like the idea of messing about with gramophone
turntables and a recording medium that must be handled with surgical care,
even though we have CDs and solid state media that are less bother.
And some extol the virtues of thermionic valve amplifiers which are large,
inefficient, microphonic, and deteriorate in performance as the valves age.
Why, some people even like log fires, vintage cars, steam engines, mechanical
typewriters and many other things that have better modern equivalents. I'm
sure it would only take the right kind of marketing to make people believe
in the superior performance of cathode ray tubes. I predict a revival in a
few years time - after everyone has bought flat screens of course.
On Sat, 09 Aug 2008 11:04:00 +0100, Roderick Stewart
<rj...@escapetime.removethisbit.myzen.co.uk> wrote:
[snip a lot of stuff that is SO true]
> Most people would agree that a really good CRT - correctly set up - looks
> better than an LCD or plasma.
These days I would question 'most people'.
Also, it's a biased statement based on an unequal comparison: a 'really
good' one against an unqualified, and therefore run of the mill, other.
However, the rest of the post is pretty fair ..
> On the other hand, most would agree that an LCD picture is less rich and
> dynamic than a CRT picture
Totally disagree.
> and in particular they are still prone to
> "glowing" blacks.
>
> They've made great strides in this area, but LCDs show blacks by blocking
> off the light from the backlight. There is always some light leakage, hence
> the glowing greys. A few manufacturers have introduced hacks by varying the
> intensity of the backlight dynamically, but the bottom line is, it's an
> inherent weakness in the technology.
Yes, this is true, but to view dark scenes the brightness on the average
CRT has to be adjusted as well, so it's six of one and half a dozen of
the other.
> In summary, then: most people would agree that a perfectly set up CRT gives
> a better picture than an LCD or plasma.
No, as above.
> 3/ Flat. Most people prefer the dead flat viewing screen an LCD or plasma
> can offer.
Yes this is a real plus.
> I would compare the picture on your particular TV with those you see in the
> shops, and then decide if you want to switch. There's certainly no rush:
> LCD and plasma are improving all the time, and OLED screens will be coming
> along in the next few years.
Good advice.
My own views have been often expressed in this ng, so I'll save on
column inches and just link.
http://tinyurl.com/5srngy
... standing in for ...
http://www.cemh.eclipse.co.uk/JavaJive/AudioVisualTV/ChooseTV/ChooseTV.html
Also, AFAIAA I am the only person to publish results of an attempt at
an impartial experimental comparison, albeit I'm the first to admit
that the results would be more useful if I'd had access to better
equipment than that available at my home.
http://tinyurl.com/5ccryd
... standing in for ...
http://www.cemh.eclipse.co.uk/JavaJive/AudioVisualTV/CRTvsLCD/CRTvsLCD.html
> AFAICT this is totally untrue on both my LCDs ...
Really? Well, then they're the first models in the world not to. Perhaps you
can let us all know their exact model numbers, so we can check your assertion?
Of course you could turn up the bitrate, so it doesn't have to decide.
Or just use a newer standard, like MPEG4.
Andy
> On Sat, 09 Aug 2008 15:54:48 +0100, Mark Carver
> <mark....@invalid.invalid> wrote:
>> Really? Well, then they're the first models in the world not to. Perhaps you
>> can let us all know their exact model numbers, so we can check your assertion?
> Panasonics TX-15LT2, TX-22LT2
Right, I've just had a very interesting chat with a guru within my
company who deals with such matters.
The short answer is that no one knows for sure how the processing is
performed. You'd have to ask each individual manufacturer how they do
it. You're unlikely to receive answers, because just as is the case with
my company, the information is company confidential. Indeed I can't even
find out how our consumer monitors do it, because I'm from the broadcast
division.
One thing is certain however, he has witnessed tests of many different
types of screen all being fed with exactly the same signal. There are
huge differences in temporal resolution, so the processing techniques
are very different from mfr to mfr.
Apologies for supplying dodgy information in previous posts.
BTW, you were planning to do some CRT/LCD tests of your own at one
time, using rather more pro equipment than I can lay my hands on. Did
you not have time or something? I would have been interested in the
results ...
On Mon, 11 Aug 2008 10:17:23 +0100, Mark Carver
<mark....@invalid.invalid> wrote:
>
> Right, I've just had a very interesting chat with a guru within my
> company who deals with such matters.
[snip]
I did, and there were artefacts on the LCDs not evident on the CRTs,
interestingly even with the source camera set to progressive. However,
as predicted, I couldn't capture the effects in any recordable form. In
other words I think it's something that needs to be witnessed in 'real
life'.
Pity, published results from pro equipment might have moved things
forward significantly.
When I first saw this thread, my heart sank, thinking: "Here we go
again!", but I thought it a very reasonable discussion this time.
I don't know if I've just finally managed to plonk them all, or
perhaps my limited experimental offering has actually accomplished
something in cleaning up the debate, but I saw no stupid posts of the
"All LCDs are crap! All CRTs are perfect!" type that used to be so
common.
Or, human nature being what it is, will the above paragraph generate a
flood of them in response? Oh gawd, I hope not ...
I am sitting in front of a CRT easily capable of 1600x1200*; however most
domestic TVs (and certainly a down market Matsui) are probably quite low
resolution with quite large phosphor dots - giving perhaps 768 x 576 if you
are lucky.
OTOH a decent 37" or 42" LCD screen can have true 1920x1080 resolution, with
every pixel individually correct in position and intensity, and fairly good on
colour.
CRTs run into problems as they scale up - a 36" widescreen weighed ~80kg
IIRC, and goodness knows how much a huge 39 to 42 inch 4:3 set I once saw
weighed (or cost!).
The screen has to be curved (on the outside or the inside) and they can't be
abutted in multi displays.
* it will display 2048x1536, but not in my view truly at that definition due
to aperture grille limitations.
Bill
Do you remember, many, many years ago, that bloke
who directed some workmen standing around Piccadilly
to dig up the road. Before they were stopped, they
had this enormous hole...
Well ..with some ladders, a screwdriver and your
authoritative expertise... *;'))
;))
Bill ZFC
--
Adoption InterLink UK with -=- http://www.billsimpson.com/
Domain Host Orpheus Internet -=- http://www.orpheusinternet.co.uk/
I looked at the Loewe CRTs when buying my LCD. I was comparing to my
Sony CRT fed by STB RGB. The Loewe CRT looked just like the LCD, with
lots of PAL artifacts, smearing, shadow etc, when the Sony was pixel
perfect. They seem to use the same crap digital processing as used by
the LCD, ie convert to PAL then display convert. Unfortunately many
other CRTs did this too, losing all the benefits of RGB, but it
apparently reduces complaints from people who want to adjust the 'colour'
and can't do it on pure RGB.
At the time I thought the Loewe LCD was the best in class, but all were
disappointing due to the display conversion (interlace to progressive and
scaling), which is pretty similar to the problems introduced by PAL.
Ironic that just as we eliminated PAL via DTV STB-RGB TV, we
introduced a new way of messing up the picture.
Incidentally, one of the biggest benefits of HDTV is the lack of scaling
and sometimes the lack of interlace conversion. Of course it can be
done properly (as with PAL), but the equipment costs £1000s.
In fairness the biggest attractor for LCD is the space saving; you
suddenly get a whole new bit of living room. I was also watching a 25"
4:3 and moved to a 32" 16:9, so the height stayed about the same. Don't
get an oversized screen; you will just notice more problems.
My Loewe now seems to be dying after only about 5 years; there is
greyness in the top of the picture (dust?) and the sound gets a
background buzz sometimes.
--
Tony
We have a Panasonic CRT IDTV which we bought about 8 years ago. The
picture quality compared to friends' LCDs is incredible. It is so good.
The LCD TV in the holiday rental we had last week was pathetic. I
messed with the settings as much as I could until it was acceptable. I
wouldn't buy one yet though. I'll wait until our CRT breaks.
--
John
My 14-year-old 34" Sony KV-S3412U is staying put [1] until it breaks
permanently [2] or the performance of LCDs on SD material gets good
enough. (Or until HD broadcasts become dominant).
However, at present the majority of my viewing is SD. The 3412 makes a
very good job of SD so I know SD, for all of its rather variable defects,
can be displayed well enough.
Today's market for TVs remains fixated on HD quality. Nevertheless I
do see ongoing improvements in SD material displayed on HD panels so I
expect a time will come soon to retire the 3412. But not yet.
[1] At 79 kg, Newton's first law applies in spades, anyway. There's a
lot of glass in a 3412.
[2] As usual with the Sony AE-2 chassis I have had twice now to re-solder
the line output chip leads to keep it going.
--
John Phillips
Bill
>We have a Panasonic CRT IDTV which we bought about 8 years ago. The
>picture quality compared to friends' LCDs is incredible.
>...
Presumably this was a side by side comparison with them both set up for
optimum picture quality, i.e. all fancy settings in the respective menus
unchecked, brightness, contrast and colour set properly and on the same
programme material?
--
Alan White
Mozilla Firefox and Forte Agent.
Twenty-eight miles NW of Glasgow, overlooking Lochs Long and Goil in Argyll, Scotland.
Webcam and weather:- http://windycroft.gt-britain.co.uk/weather
Oh yes, it must be a babel fish plug-in in Lars' newsreader converting the
correct English spelling of my name to the continental variant.
How quaint.
>> Marc Carver wrote:
>> LCD, Plasma, or OLED, is that of temporal resolution.
>> TV is, and will probably remain in the main part interlaced.
>>
> For SDTV true, but the plan is that HD should be broadcast
> in 720p mode - ASAP - at least on Freeview.
Oh yes, I'd forgotten that. Of course the pictures will have been down
converted from an interlaced 1080 line production environment, which won't help.
Things will improve when 1080p studio production becomes the norm. Much easier
to down convert to 1080i for D-Sat/cable *and* 720p for DTT.
My next door neighbour has a CRT set (Sony widescreen) with a dreadful
picture.
Your point being?
--
*A fine is a tax for doing wrong. A tax is a fine for doing well*
Dave Plowman da...@davenoise.co.uk London SW
To e-mail, change noise into sound.
Indeed. And tellys had brightness and contrast controls long before PAL
was invented (even if contrast was sometimes actually RF gain), and both
are possible with RGB signals too.
> but of course the all-analogue chipsets it employs will be obsolete now.
> Setmakers want all processing to be digital because digital is so much
> better(!)
Yep. Everything will be adjusted in little jumpy steps. Probably through
an on-screen menu that obscures the picture you're trying to adjust.
Rod.
--
Virtual Access V6.3 free usenet/email software from
http://sourceforge.net/projects/virtual-access/
I think you've got that wrong! The geometry is dependent on the
electronics around it - the same CRT in two different receivers can show
excellent results in one and diabolical results in the other. (And that
is not just the geometry!)
Presumably the same is true of LCD and Plasma panels but I'm sticking
with my (old fashioned) CRT display for the moment - I'll worry about
flat screen devices and their performance when it bites the dust!
Terry
Depth of field?
>Roderick Stewart <rj...@escapetime.removethisbit.myzen.co.uk> wrote:
>
>>> All that said, I would NOT trade the set for a plasma or LCD, for
>>> standard definition viewing. No way! In all other respects the set
>>> blows away every flat panel I've seen. Colour accuracy, richness,
>>> depth of field, blacks, response time, image clarity, etc.
>>
>>Depth of field?
>
How does your TV restore information that's lost by the camera? Magic?
--
Dave Farrance
Thank you. I know what it means. What I didn't understand was how your
choice of television display could have any effect on it.
Sorry, I didn't make myself plain here.
Agreed about the geometry. What I should perhaps have said is that,
geometry excluded, presumably the performance of specific LCD and Plasma
panels also varies from excellent to diabolical dependent on the quality
of the accompanying electronics in different receivers ...
Terry
Holographic TV?
--
Max Demian
I might guess we were somehow talking at cross-purposes here, and that you
understood "depth of field" to mean something different, except that the
picture in the link you gave is a clear demonstration of exactly what I
understand by the expression. It's the range of distances from the camera
lens within which objects are in focus. It's determined by the aperture
setting of the lens, and you can't change it after the picture has been
taken. If objects within a certain range are in focus, then objects
outside that range will not be in focus, the fine detail from them will
not have been captured by the camera, and the display will be unable to
show detail that doesn't exist. Some objects will be sharper than others
depending on their distances from the lens, and this will be an
unalterable property of the picture no matter what kind of screen you use
to display it.
Unless you really did mean something different of course, in which case
"depth of field" wasn't the best expression to use because it already has
its well understood optical meaning.
Is your day job in politics by any chance? :-)
I've been puzzling over this.
The only thing I can think of that could affect the depth of
field as seen in a picture is the picture being shown slightly
out of focus or at reduced resolution. This would reduce the
difference in clarity between the in-focus and out-of-focus
parts of the picture.
>The only thing I can think of that could affect the depth of
>field as seen in a picture is the picture being shown slightly
>out of focus or at reduced resolution. This would reduce the
>difference in clarity between the in-focus and out-of-focus
>parts of the picture.
Aha, that's true. So the poorer the TV's focus then the greater the
apparent depth-of-field. That must have been what Mr Signal was trying
to explain - that his TV had a poorly focused picture. Obvious, isn't
it?
--
Dave Farrance
Surely with a CRT the contrast ratio is determined by the peak
focussed white / the ambient light reflected off the unlit phosphor.
In most households I think that is a pretty low ratio.
The LCD has an advantage here in that the switched-off screen under
the same ambient light is darker than a CRT; the viewed contrast
ratio is determined by the ambient light until that gets below the TFT
black level.
The BrightSide pictures appear to have been taken in total darkness.
The basic principle, of a coarse modulated light source behind the
picture, still limits at peak brightness divided by light reflected
from the screen, so it will be similar to any other LCD viewed in
room light.
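Putting some entirely guessed numbers on that (the cd/m^2 figures are
plucked out of the air just to show the shape of the thing):

    # Effective contrast in a lit room: reflected ambient light adds to
    # both the brightest and the darkest thing the screen can show.
    def effective_contrast(peak_white, native_black, reflected_ambient):
        return (peak_white + reflected_ambient) / (native_black + reflected_ambient)

    print(effective_contrast(100, 0.05, 5))  # CRT in room light: ~21:1
    print(effective_contrast(250, 0.50, 1))  # LCD in room light: ~167:1
    print(effective_contrast(100, 0.05, 0))  # same CRT in darkness: 2000:1

The exact figures don't matter; the point is that above a certain
ambient level the reflected light swamps the native black of either
technology.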
HB
I don't doubt that what you describe will reduce the clarity of any
picture, but it remains that "depth of field", with its usual meaning, is
an optical property of the lens and nothing you can do with a picture
that has already been taken will alter it. If the most sharply focused
objects in a picture are those within a particular range of distances
from the lens, then exactly the same ones will still be the most sharply
focused ones, even if an imperfect display makes everything look less
sharp.
Interesting. One of you says it's increased, one of you says it's
decreased, and I don't see how it can be affected at all.
And each of you thinks your own assertion is "obvious", one of the most
overworked and misused words in the English language.
Given that depth of field is the range of distances over which objects are
in focus, then uniformly blurring the resultant image at a later stage so
that *nothing* looks properly in focus could be said to have "reduced" that
range - but only by reducing it to zero, which seems a bit of a semantic
cheat. You could just as easily reduce everything to zero by turning the
contrast down to zero and declare anything to be equal to anything.
>In article <14iia4t5jmcnojh0b...@4ax.com>, Signal wrote:
>>Dave Farrance <DaveFa...@OMiTTHiSyahooANDTHiS.co.uk> wrote:
>> >Peter Duncanson <ma...@peterduncanson.net> wrote:
>> >>The only thing I can think of that could affect the depth of
>> >>field as seen in a picture is the picture being shown slightly
>> >>out of focus or at reduced resolution. This would reduce the
>> >>difference in clarity between the in-focus and out-of-focus
>> >>parts of the picture.
>> >
>> >Aha, that's true. So the poorer the TV's focus then the greater the
>> >apparent depth-of-field. That must have been what Mr Signal was trying
>> >to explain - that his TV had a poorly focused picture. Obvious, isn't
>> >it?
>>
>> You've got it backwards. Blur focus and you *reduce* apparent DOF. Try
>> it in any image editor..
>
>Interesting. One of you says it's increased, one of you says it's
>decreased, and I don't see how it can be affected at all.
>
>And each of you thinks your own assertion is "obvious", one of the most
>overworked and misused words in the English language.
>
>Given that depth of field is the range of distances over which objects are
>in focus, then uniformly blurring the resultant image at a later stage so
>that *nothing* looks properly in focus could be said to have "reduced" that
>range - but only by reducing it to zero, which seems a bit of a semantic
>cheat. You could just as easily reduce everything to zero by turning the
>contrast down to zero and declare anything to be equal to anything.
Ho hum. You can be so literal sometimes, Rod. The "obvious" was just
being jokey, given Mr Signal's attitude.
I was elaborating Peter Duncanson's comment that poor focus of the TV
would reduce the *difference* in clarity between the in-focus and
out-of-focus parts of the picture -- thus increasing the *apparent* depth
of field of what was left. You would normally judge the depth of field
by comparing the sharpest point to the point that was *relatively*
noticeably less sharp than that -- so if just the sharpest point was
rendered much less sharply then you would accept a greater depth as being
of comparable sharpness. See what I'm getting at now?
--
Dave Farrance
It would be more to the point to capture the outputs of identical
pictures being displayed by the two technologies, as I have tried to
do - good luck, you'll need it, it's not easy.
Nothing else is going to be convincing.
On Mon, 18 Aug 2008 16:29:17 +0100, Signal <sig...@lineone.net> wrote:
>
> OK I've simulated CRT and LCD type images in the way I described. If
> the first image doesn't reproduce DOF more keenly to you, we'll have
> to agree to disagree. I think the results are pretty indicative.
>
> www.wavefinder.dsl.pipex.com/dofc.jpg
> www.wavefinder.dsl.pipex.com/dofd.jpg
Yes, I can see that a lot hinges on the use of that word "apparent". Depth
of field as I understand it remains an optical property of a particular
lens configuration, and it can be specified and objectively measured, but
further degradation of an image, even though it doesn't alter the depth of
field, may make it less apparent what it was.
I'm sorry if you think that's being too literal, but when three people can
form three different opinions about the same situation based on different
understandings of the words, I'd suggest there is some value in precision.
Out of focus = less detail = less information = easier to compress?
On Mon, 18 Aug 2008 16:12:12 +0100, Signal <sig...@lineone.net> wrote:
>
> I'd speculate that CRTs may effectively dither, or de-noise, parts of
> digital images which are prone to compression pixelation artifacts.
> Out of focus areas are likely to be more susceptible to this as they
> are harder to compress.
>Dave Farrance <DaveFa...@OMiTTHiSyahooANDTHiS.co.uk> wrote:
>Yes but if you de-focus an entire image, the sharp parts become
>noticably blurred.. whilst the effect is less pronounced on the
>already out of focus parts. Yet, as *everything* is now out of focus,
>you can throw the concept of depth of field out the window.. there is
>NO depth of field to speak of. (Slaps forehead... :)
I think that you've got your argument-by-extremes inverted there,
nevertheless I see that you understand the issue and that depth of field
is essentially a camera property. If you argue that your TV is notable
for clear reproduction of a camera's capture ability, then when showing
The X-Factor, it would be equally meaningful to say that your TV has
excellent Cheryl Cole rendering.
--
Dave Farrance
>Cheryl's camel toe does provide quite a lot of relief, considering the
>contrast between the flat areas and the ridges.. ...
Comedy relief...
> so yes I agree with
>you when you say that sufficient depth of field accurately reproduced
>is a major plus for CRT.
... and more of the same, eh?
> You've honed in on quite a good point here..
>however I have to say, despite being an avid viewer of Cheryl Cole,
>this *is* rather specific. Good rendering of depth of field is a wider
>reaching concept, which can apply to all manner of TV viewing.
--
Dave Farrance
>Dave Farrance <DaveFa...@OMiTTHiSyahooANDTHiS.co.uk> wrote:
>
>>>>>>I was elaborating Peter Duncanson's comment that poor focus of the TV
>>>>>>would reduce the *difference* in clarity between the in-focus and
>>>>>>out-of-focus parts of the picture -- thus increasing the *apparent* depth
>>>>>>of field of what was left. You would normally judge the depth of field
>>>>>>by comparing the sharpest point to the point that was *relatively*
>>>>>>noticeably less sharp than that -- so if just the sharpest point was
>>>>>>rendered much less sharply then you would accept a greater depth as being
>>>>>>of comparable sharpness. See what I'm getting at now?
>>>>>
>>>>>Yes but if you de-focus an entire image, the sharp parts become
>>>>>noticably blurred.. whilst the effect is less pronounced on the
>>>>>already out of focus parts. Yet, as *everything* is now out of focus,
>>>>>you can throw the concept of depth of field out the window.. there is
>>>>>NO depth of field to speak of. (Slaps forehead... :)
>>>>
>>>>I think that you've got your argument-by-extremes inverted there,
>>>>nevertheless I see that you understand the issue and that depth of field
>>>>is essentially a camera property. If you argue that your TV is notable
>>>>for clear reproduction of a camera's capture ability, then when showing
>>>>The X-Factor, it would be equally meaningful to say that your TV has
>>>>excellent Cheryl Cole rendering.
>>>
>>>Cheryl's camel toe does provide quite a lot of relief, considering the
>>>contrast between the flat areas and the ridges.. ...
>>
>>Comedy relief...
>
>Apparently.
>
>You were being serious then?
Relative to the 43rd dimension of nonsense? Maybe. ;-)
--
Dave Farrance
Link posted up thread
> >Nothing else is going to be convincing.
>
> I thought the second picture resembled LCD quite nicely.
Like I said, agree to disagree ...
> Here's a page which has examples of
> transmission and upscaling artifacts which are not dissimilar.
> http://www.highdefinitionblog.com/?page_id=101
But
a) AFAICT they're actual screen shots
b) They're faults in the signal, not one or other type of TV
c) As such they should occur on both types of TV - if one or other
type of TV didn't show these faithfully, then it would be a *less*
good TV because it wasn't reproducing faithfully what was being fed to
it.
Exactly ...
>, but picture compression mainly revolves
> around splitting an image into small sections, then looking for
> similar adjacent pixels and making them the same.
... which is exactly what blurred focus creates, so it should compress
better ...
> The net result being
> compression is concentrated where there is less detail and variation.
... exactly what I said, and exactly the opposite of what you
originally said!
> Thus.. 'averaging out'.. aka pixelation.
>
> I think this is why when you see that horrible macroblock stuff in
> really poor digital, it occurs mostly in shadows and that.
It occurs simply when, despite compression, the bitrate is inadequate to
update rapidly moving detail, which necessitates losing detail, thus
causing the compression squares in things like seascapes, waterfalls,
wildfires, shoals of fish, flocks of birds, and the fly swarms around
footballers (those that wash, they might be real in the case of the
others).
It really is as simple as that.
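The underlying principle is easy to demonstrate: the less variation in
the picture, the fewer bits it needs. A trivial illustration with a
lossless compressor (MPEG is lossy and transform-based, but the same
principle decides where its bits go):

    import zlib, random

    random.seed(1)
    busy = bytes(random.randrange(256) for _ in range(4096))  # detailed texture
    flat = bytes([128]) * 4096                                # featureless area

    print(len(zlib.compress(busy)))  # ~4 KB: detail barely compresses
    print(len(zlib.compress(flat)))  # a few dozen bytes

When that detail is also moving, every frame needs those bits afresh,
and that is when the encoder runs out and starts throwing detail away
in visible blocks.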
The cure is to bin all the junk channels like the shopping ones, so
releasing bandwidth back to the mainstream channels that people
actually want to watch. If the mainstream channels had half, perhaps
even a quarter, of the bandwidth available under analogue, we would never
see such things.
But sadly, it'll never happen ...
No, you have directly contradicted yourself ...
On Mon, 18 Aug 2008 16:12:12 +0100, Signal <sig...@lineone.net> wrote:
>
> Out of focus areas are likely to be more susceptible to this as they
> are harder to compress.
On Mon, 18 Aug 2008 20:53:50 +0100, Signal <sig...@lineone.net> wrote:
>
> The net result being
> compression is concentrated where there is less detail and variation.
[snip]
> This is what it says in wikipedia :
>
> Macroblocking
But that is a different phenomenon from that which you seemed to be
discussing ...
When I dub a poor quality VHS tape via my Panasonic DVD Recorder, I
might get macroblocking where parts of the signal cannot be read from
the tape; that part of the picture will appear as bright green blocks
in my authoring program.
> Thus.. 'averaging out'.. aka pixelation.
>
> I think this is why when you see that horrible macroblock stuff in
> really poor digital, it occurs mostly in shadows and that.
Perhaps macroblock wasn't the correct choice of term by you - at least
not if I accept the Wikipedia definition, and I have no cause to
query it.
The, if I may coin a term, macro-pixelation that I described and to
which you seemed to refer has the causes, and cure, that I also
described.
> I suggested dither.. sometimes additional "noise" leads to better
> results.
Dither isn't noise added *after* a defect/limitation in the process/comms
chain has affected the results.
For 'dither' to correct/prevent/reduce the effects of a process or channel
limitation it has to be used before/during the process whose defects it is
intended to deal with. Not afterwards.
The above seems a confusion on your part similar to using DOF (an effect
caused by a limitation of the camera/image/sensor) as if it were an effect
in the display.
Or is your point that if you deliberately ruin the image quality in other
ways, then defects that show in an unruined image are less noticeable, and
this is 'better'? If so, then you presumably regard a lack of signal as
delivering an ideal result. :-)
Slainte,
Jim
--
Change 'noise' to 'jcgl' if you wish to email me.
Electronics http://www.st-and.ac.uk/~www_pa/Scots_Guide/intro/electron.htm
Armstrong Audio http://www.audiomisc.co.uk/Armstrong/armstrong.html
Audio Misc http://www.audiomisc.co.uk/index.html
Yes, but surely this is of minor importance ...
Jaggies are a particular case of step values, which will occur with
any attempt to represent digitally an analogue world. If you zoom in
far enough in any graphics program you will see them in any straight
lines that are not exactly vertical or horizontal.
Their intrusiveness depends on sample size relative to other aspects
of the digitisation and the way that the samples are used, here that
means resolution compared with things like screen size, viewing
distance, etc.
The most significant contribution will be either shooting originally
in SD, or the conversion from HD to SD (not the other way round), which
lowers the resolution, increases the sample size by a factor of 1.875
vertically and 2.667 horizontally, and correspondingly increases the
size of the jaggies.
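Where those factors come from (active picture sizes):

    hd_w, hd_h = 1920, 1080  # HD production
    sd_w, sd_h = 720, 576    # SD transmission (PAL active picture)

    print(hd_h / sd_h)  # 1.875     vertical increase in sample size
    print(hd_w / sd_w)  # 2.666...  horizontal

Each SD sample thus covers getting on for two HD lines' worth of
picture, so the smallest possible staircase step is correspondingly
bigger on screen.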
Because SD & HD resolutions are not exact multiples of each other,
both down and up scaling will also introduce some further rounding
errors which won't help. Without investigating further, I can't say
exactly how significant these would be compared with the actual loss
of resolution from the downscaling itself, but on general principle
would guess that relatively they would be less important.
Jaggies will *always* be more noticeable on larger screens compared
with smaller ones (HD screens tend to be larger than SD ones), and in
SD compared with HD, regardless of whether or not the SD was
originally shot in HD and has been downscaled, and/or has been
upscaled (back) to HD, though, as suggested, these latter will
doubtless introduce their own contribution.
Jaggies are therefore much more a problem in the source signal than an
artifact introduced by a particular type of TV.
> I'd like to know why LCDs handle SD worse than Plasma.. any ideas?
I can't comment specifically on any LCDs or Plasmas that you may have
seen, I can only say that if TVs are truly comparable ...
* Showing the same source connected in the same way
* Same resolution
* Same size
* Same post-signal processing (which most here seem agreed should
always be disabled as far as possible anyway)
* Same quality of manufacture
... then, as with compression artifacts, regardless of display type
they should show all such source signal flaws about the same.
In the past, others have tried to claim that CRTs showed fewer jaggies
than LCDs, but, although at the time I never bothered to go through
the fag of capturing photographs as I did later, my own limited
investigations comparing jaggies on a CRT and an LCD suggested that,
like so many of the then much-hyped flaws of LCDs, this one too was a
myth:
http://tinyurl.com/5lvhmm
... standing in for ...
http://groups.google.com/group/uk.tech.digital-tv/tree/browse_frm/thread/79d29213846ee51b/62e708e140626c65?rnum=41&_done=%2Fgroup%2Fuk.tech.digital-tv%2Fbrowse_frm%2Fthread%2F79d29213846ee51b%2F5fbd459b89cfe950%3Flnk%3Dst%26q%3D%26#doc_5fbd459b89cfe950
What commonly seems to happen to produce these myths is that someone
buys a new TV and because ...
* It's bigger than their old one;
* Their old one was probably an aging CRT which had got out of
focus, etc;
* Being new, they're looking at their TV critically for the first
time in ages;
* They haven't bothered to look in the menu and discover how best to
set it up;
* Etc;
... they then notice all sorts of flaws in the *source material* that
they've never noticed before, and blame them on the TV rather than the
source material. Hence myths build up about the superiority of one
(old) type of display technology over another (newer) one.
I strongly suspect that in making the above claim you likewise are not
being careful to compare like with like under carefully controlled
conditions.
> >c) As such they should occur on both types of TV - if one or other
> >type of TV didn't show these faithfully, then it would be a *less*
> >good TV because it wasn't reproducing faithfully what was being fed to
> >it.
>
> I suggested dither.. sometimes additional "noise" leads to better
> results.
But BTAIM, a TV that introduced it would still be a less good TV which
it wouldn't be a good idea to buy.
> Java Jive <ja...@evij.com> wrote:
>
> Disingenuous of you to snip this...
>
> "Unless you're quibbling over "harder to compress"? That was an error
> and I'll take it back.."
>
> Aside from this non consequential error (which I'm man enough to admit
> to!) that you're getting so hot and bothered over, everything else
> I've said has been consistent.
My objection was that you were actually trying to brush aside a
significant error of understanding, as revealed by the initial attempt
to defend it which led to mutual contradiction between your mails, as
though it were a 'non consequential error' like a typo.
> you claimed these
> areas are *easier* to compress.
They are indeed 'easier' in the sense it should be possible to
compress them more than areas of higher detail. More exactly, the
level of lossless compression achievable, as in lossless audio
compression, and the level of loss in lossy compression, as here,
depends intimately on what is being compressed.
> Quit being an ass...
Quit trying to cover your own errors of understanding first by
claiming they were mere slips, and then by being insulting.
> That'll teach you for making assumptions.
Perhaps the entire sub-thread above, in which you have been criticised
by others than myself for fundamental errors of understanding, will
teach you for shooting off your mouth, but I am not hopeful.
So unhopeful that I welcome you to my plonk folder ...
Don't suppose you know where I can get into it on a Panasonic TX-29AD?
Or for that matter, how to use it? My set has had appalling vertical
linearity since I bought it. Sadly when the dealer's engineer called
for more information my wife told him she had no idea what I was on
about, and they didn't fix it!
Andy
Most CRTs are running in the native resolution of the broadcast
pictures. Most LCDs aren't - they tend to have some silly native
resolution like 1440x900 (that's a Samsung LE19R 86BDX, just because I
found it first in Google, not for any other reason). As a result
there's some pretty clever software needed to upscale the 576-line PAL
picture to the 900-line screen - and if you don't get it right, jaggies
galore.
Why they use such silly sizes is beyond me. For that screen you have to
upscale 720, downscale 1080, and it isn't even the right aspect ratio!
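To put numbers on the silliness (using the 1440x900 panel above, and
assuming the usual source resolutions):

    panel_w, panel_h = 1440, 900
    sources = {"PAL SD": (720, 576), "720p": (1280, 720), "1080i": (1920, 1080)}

    for name, (w, h) in sources.items():
        print(name, panel_w / w, panel_h / h)

    # PAL SD: 2.0 x 1.5625   - source lines land between panel rows
    # 720p:   1.125 x 1.25   - upscaled by different amounts per axis
    # 1080i:  0.75 x 0.8333  - downscaled, again non-uniformly

Nothing maps cleanly, and the unequal horizontal and vertical factors
are the aspect ratio problem in numerical form: scale uniformly and you
get bars, scale per-axis and you get distorted geometry.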
Andy
I expect some of them were first devised for computer use, where exact
aspect ratio often doesn't matter very much, and it's easy to set the
output of a typical graphics card to match the physical pixel resolution of
the display.
Another consequence of this is that although the same pixel resolution can
be used for computer displays anywhere in the world, television pixel
resolutions are historically related to national broadcast standards, and
so there are several different ones. This means that no matter what
physical resolution you choose for a display, it's impossible for the same
one to be optimum for all broadcast systems, so in a world where video
recordings are sold internationally, there will always be some that don't
look their best on some displays. Thus the only way to avoid the need for
different display devices for different programme material is global
standardisation of television picture resolution. Let me know when this has
happened and I might consider buying a flat panel display.
You're right, computers are pretty flexible. But every desktop display
I've ever bought has been 4:3, 16:9 or 16:10. 1440x900 is... err...
4.8:3 or 14.4:9. A downright silly number.
<snip>
> Thus the only way to avoid the need for
> different display devices for different programme material is global
> standardisation of television picture resolution. Let me know when this has
> happened and I might consider buying a flat panel display.
err.. correct me if I'm wrong but hasn't the whole world gone to the
dual standards of 1920x1080 and 1280x720, both at 16:9?
Andy
There's also 1280x1024, which is 5:4, which is equally silly in the context of
television, but equally irrelevant when displaying a word processor.
> <snip>
> > Thus the only way to avoid the need for
> > different display devices for different programme material is global
> > standardisation of television picture resolution. Let me know when this has
> > happened and I might consider buying a flat panel display.
>
>
> err.. correct me if I'm wrong but hasn't the whole world gone to the
> dual standards of 1920x1080 and 1280x720, both at 16:9?
Don't we also have to cope with 720x576 and 640x480? In any case, even if TV
material is made in only two resolutions, they're both going to be on sale
everywhere in the form of disc recordings, and a flat panel display with a
physical pixel structure can only be right for one of them.
>You're right, computers are pretty flexible. But every desktop display
>I've ever bought has been 4:3, 16:9 or 16:10. 1440x900 is... err...
>4.8:3 or 14.4:9. A downright silly number.
Or 1.6:1. As are 1680x1050 and 1920x1200 monitors. The ratio seems to
be gaining in popularity. I think it's OK. They might well have been
thinking of the Golden Ratio, 1.618:1, considered to be aesthetically
pleasing. For a computer display, there will be small black bars top and
bottom for widescreen videos and thicker bars for 'scope videos -- but
you get a little extra height to help with office apps compared to 16:9.
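Quick check:

    for w, h in [(1440, 900), (1680, 1050), (1920, 1200)]:
        print(w, h, w / h)       # 1.6 in every case, i.e. 16:10

    print((1 + 5 ** 0.5) / 2)    # golden ratio: ~1.618, close but not equal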
--
Dave Farrance