
Apple GPUs now 2 generations out of date?


Jon Harrop

unread,
Mar 28, 2008, 6:11:24 PM3/28/08
to

I just received an e-mail from nVidia launching their 9-series graphics
cards. Is the highest-spec GPU available to a Mac running Mac OS still only
a 7 series?

--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u

Steve de Mena

unread,
Mar 28, 2008, 8:16:56 PM3/28/08
to
Jon Harrop wrote:
> I just received an e-mail from nVidia launching their 9-series graphics
> cards. Is the highest-spec GPU available to a Mac running Mac OS still only
> a 7 series?

There's an 8000 series available for the Mac Pro ("early 2008" model)
but nothing newer.

I bought my Mac Pro in 2006 and Apple has released NO updated graphics
card for it since that time. (I had to remove my buggy ATI x1900XT and
buy a used crappy nVidia card to replace it).

I guess George Graves would just call me a "Wintroll" for mentioning
this, rather than admit to the possibility that Apple is not flawless
and has diverted their attention to iPhones and the like.

Steve

Sandman

unread,
Mar 28, 2008, 8:48:36 PM3/28/08
to
In article <J7ednbhMBt_kF3Da...@giganews.com>,

Steve de Mena <st...@stevedemena.com> wrote:

> > I just received an e-mail from nVidia launching their 9-series graphics
> > cards. Is the highest-spec GPU available to a Mac running Mac OS still only
> > a 7 series?
>
> There's an 8000 series available for the Mac Pro ("early 2008" model)
> model but nothing newer.

"Newer" being the "just launched" 9000... :-D


--
Sandman[.net]

Mitch

unread,
Mar 28, 2008, 9:07:03 PM3/28/08
to
In article <13uqr4i...@corp.supernews.com>, Jon Harrop
<use...@jdh30.plus.com> wrote:

> I just received an e-mail from nVidia launching their 9-series graphics
> cards. Is the highest-spec GPU available to a Mac running Mac OS still only
> a 7 series?

No, the three cards available FROM APPLE are:
ATI Radeon HD 2600 XT with 256MB of GDDR3 memory
NVIDIA GeForce 8800 GT with 512MB of GDDR3 memory
NVIDIA Quadro FX 5600 with 1.5GB of GDDR3 memory

http://www.apple.com/macpro/specs.html

But those are NOT the only cards available to Mac users, because, as
always, Mac users can upgrade their cards to the newest models, too.

Apple doesn't update every hardware generation just because a new GPU
comes out; they test and decide whether the spec change will confuse or
annoy the market, too. They may offer many of those as optional
upgrades through the store, but that's not really necessary for people
who want the biggest and best all the time, either -- because those
people will just buy at a discount supplier and install themselves!

Alan Baker

unread,
Mar 28, 2008, 9:13:45 PM3/28/08
to
In article <J7ednbhMBt_kF3Da...@giganews.com>,
Steve de Mena <st...@stevedemena.com> wrote:

> Jon Harrop wrote:
> > I just received an e-mail from nVidia launching their 9-series graphics
> > cards. Is the highest-spec GPU available to a Mac running Mac OS still only
> > a 7 series?
>
> There's an 8000 series available for the Mac Pro ("early 2008" model)
> model but nothing newer.
>
> I bought my Mac Pro in 2006 and Apple has released NO updated graphics
> card for it since that time. (I had to remove my buggy ATI x1900XT and
> buy a used crappy nVidia card to replace it).

Apple doesn't sell video card upgrades for earlier Mac systems. This is
not news.

>
> I guess George Graves would just call me a "Wintroll" for mentioning
> this, rather than admit to the possibility that Apple is not flawless
> and has diverted their attention to iPhones and the like.
>
> Steve

--
Alan Baker
Vancouver, British Columbia
"If you raise the ceiling four feet, move the fireplace from that wall
to that wall, you'll still only get the full stereophonic effect if you
sit in the bottom of that cupboard."

Jon Harrop

unread,
Mar 28, 2008, 9:26:26 PM3/28/08
to
Steve de Mena wrote:
> Jon Harrop wrote:
>> I just received an e-mail from nVidia launching their 9-series graphics
>> cards. Is the highest-spec GPU available to a Mac running Mac OS still
>> only a 7 series?
>
> There's an 8000 series available for the Mac Pro ("early 2008" model)
> model but nothing newer.

Early 2008 model of the Mac Pro, yes. The graphics card in it was already
out of date though.

> I bought my Mac Pro in 2006 and Apple has released NO updated graphics
> card for it since that time. (I had to remove my buggy ATI x1900XT and
> buy a used crappy nVidia card to replace it).

Yes, I particularly like this story:

http://forums.macrumors.com/showpost.php?p=4740679&postcount=21

where a loyal Apple customer was duped by Apple into paying for a product
that did not yet exist, prompting him to write to Steve Jobs himself to
explain that he was sick of waiting for yesteryear's graphics cards to work
in his $4,000 Mac Pro, only to receive a personal response from Mr Jobs
advising him to "calm down" and continue to wait.

We evaluated moving into the Mac market at the end of last year. The lame
graphics cards swung our decision not to waste money on a Mac Pro (twice
the cost for half the performance!). We ended up buying a Mac Mini, which
is a nice toy but completely useless for business. Turns out the software
development environments available for Mac OS X are out of the dark ages.
Despite Apple's best efforts to push their Macs into technical computing:

http://www.apple.com/science/

Most scientists and engineers continue to use Linux and Windows. The
comp.sys.mac.scitech usenet group has been in decline for a decade and now
has roughly zero posts per month:

http://groups.google.com/group/comp.sys.mac.scitech/about

Sounds like Apple have milked their desktop users dry and are now moving on
to phone users...

Jon Harrop

unread,
Mar 28, 2008, 9:34:26 PM3/28/08
to
Mitch wrote:
> But those are NOT the only cards available to Mac users, because, as
> always, Mac users can upgrade their cards to the newest models, too.

The last time I checked this out, installing a current graphics card
prevented Mac OS X from booting. You could run modern graphics hardware in
Mac Pro hardware but you have to install a different OS, like Windows or
Linux.

Has that changed?

Timberwoof

unread,
Mar 28, 2008, 11:18:34 PM3/28/08
to
In article <13ur717...@corp.supernews.com>,
Jon Harrop <use...@jdh30.plus.com> wrote:

> Mitch wrote:
> > But those are NOT the only cards available to Mac users, because, as
> > always, Mac users can upgrade their cards to the newest models, too.
>
> The last time I checked this out,

When?

> installing a current graphics card

Which one?

> prevented Mac OS X from booting.

Which version?

> You could run modern graphics hardware in
> Mac Pro hardware but you have to install a different OS, like Windows or
> Linux.
>
> Has that changed?

In what line of work is a superduper graphics card necessary? My Mac
Cube has an ATI Rage128 Pro. My G4 1.4GHz DP has the cool card that came
with it. I can't tell the difference between the graphics cards. What
am I missing out on?

--
Timberwoof <me at timberwoof dot com> http://www.timberwoof.com
"When you post sewage, don't blame others for
emptying chamber pots in your direction." —Chris L.

Steve de Mena

unread,
Mar 28, 2008, 11:58:46 PM3/28/08
to
Mitch wrote:
> In article <13uqr4i...@corp.supernews.com>, Jon Harrop
> <use...@jdh30.plus.com> wrote:
>
>> I just received an e-mail from nVidia launching their 9-series graphics
>> cards. Is the highest-spec GPU available to a Mac running Mac OS still only
>> a 7 series?
>
> No, the three cards available FROM APPLE are:
> ATI Radeon HD 2600 XT with 256MB of GDDR3 memory
> NVIDIA GeForce 8800 GT with 512MB of GDDR3 memory
> NVIDIA Quadro FX 5600 with 1.5GB of GDDR3 memory
>
> http://www.apple.com/macpro/specs.html
>
> But those are NOT the only cards available to Mac users, because, as
> always, Mac users can upgrade their cards to the newest models, too.

Nope.

None of those cards you listed will work in my Mac Pro.

So much for the "upgradeability" of my Mac. :(


> Apple doesn't update every hardware generation just because a new GPU
> comes out; they test and decide whether the spec change will confuse or
> annoy the market, too. They may offer many of those as optional
> upgrades through the store, but that's not really necessary for people
> who want the biggest ad best all the time, either -- because those
> people will just buy at a discount supplier and install themselves!

NO video card upgrades available for my Mac Pro (from Apple or 3rd
parties) since it came out in 2006.

Steve

Steve de Mena

unread,
Mar 29, 2008, 12:02:43 AM3/29/08
to
Timberwoof wrote:
> In article <13ur717...@corp.supernews.com>,
> Jon Harrop <use...@jdh30.plus.com> wrote:
>
>> Mitch wrote:
>>> But those are NOT the only cards available to Mac users, because, as
>>> always, Mac users can upgrade their cards to the newest models, too.
>> The last time I checked this out,
>
> Wen?
>
>> installing a current graphics card
>
> Which one?
>
>> prevented Mac OS X from booting.
>
> Which version?
>
>> You could run modern graphics hardware in
>> Mac Pro hardware but you have to install a different OS, like Windows or
>> Linux.
>>
>> Has that changed?
>
> In what line of work is a superduper graphics card necessary? My Mac
> Cube has an ATI Rage128 Pro. My G4 1.4GHz DP has the cool card that came
> with it. I can't t ell the difference between the graphics cards. What
> am I missing out on?

Well I can give you a simple example:

When I bought my Mac Pro in mid 2006 I bought the higher-end ATI
x1900XT. That card has problems with Leopard and I had to find a used
nVidia card (that was the default card when my Mac Pro came out).

Visually you can stand across the room and tell that the cheaper card
is more washed out and doesn't have as much "punch" as the ATI card.

This is bothersome in a plain email and browsing environment.

Steve

Alan Baker

unread,
Mar 29, 2008, 12:05:22 AM3/29/08
to
In article <6tadnQtx46D-InDa...@giganews.com>,

Steve de Mena <st...@stevedemena.com> wrote:

<cough>bullshit<cough>

There are things that differentiate video cards, but "washed out"
colours and "punch" aren't among them. They all deliver video signals to
the same standard.

Timberwoof

unread,
Mar 29, 2008, 1:08:06 AM3/29/08
to
In article <6tadnQtx46D-InDa...@giganews.com>,
Steve de Mena <st...@stevedemena.com> wrote:

http://lowendmac.com/macpro/mac-pro-2006.html says these cards were
available:
• nVidia GeForce 7300 GT with 256 MB RAM in double-wide 16-lane PCIe slot
• ATI Radeon X1900XT with 512 MB RAM optional (add $350)
• nVidia Quadro FX 4500 with 512 MB RAM optional (add $1,650)

> Visually you can stand across the room and tell that the cheaper card
> is more washed out and doesn't have as much "punch" as the ATI card.

I guess if you connect the cheaper card to the cheaper monitor, yes. But
you're using a card with digital connections, yes? So I don't believe
you.

> This is bothersome in a plain email and browsing envronment.

Turn up the contrast on your monitor. Use the Mac OS's built-in monitor
calibration software.

Steve de Mena

unread,
Mar 29, 2008, 2:13:06 AM3/29/08
to

Neither Apple nor any 3rd party has released any graphics cards
compatible with my Mac Pro since I bought it in 2006. Those cards you
list above are the ones that were available when I bought my Mac Pro.
The 2nd one is the one that came with it; the nVidia 7300GT is the
used one I got to replace it, and it sucks (just read the Apple Mac
Pro discussion forums). The $1,650 card is not a serious option, but
you're welcome to buy one for me.

The new nVidia card in the 2008 Mac Pros would work in mine *if*
nVidia created 32-bit EFI firmware for it. Although Apple describes it
as being for "PCI Express 2.0" Mac Pros, it would work fine in my Mac
Pro with a 32-bit EFI ROM. I am very disappointed they have not come
out with this card for my Mac Pro. It makes me question why I
invested so much in this system.

>> Visually you can stand across the room and tell that the cheaper card
>> is more washed out and doesn't have as much "punch" as the ATI card.
>
> I guess if you connect the cheaper card to the cheaper monitor, yes. But
> you're using a card with digital connections, yes? So I don't believe
> you.
>
>> This is bothersome in a plain email and browsing envronment.
>
> Turn up the contrast on your monitor. Use the Max OS's built-in monitor
> calibration software.

I have the Mac Pro connected both ways to my Dell 24" (2407) monitor.
With the ATI card there was no visual difference using DVI or VGA
inputs. I prefer to use analog as I have a TrippLite switchbox with 4
other VGA systems connected. The Mac Pro looks better with the Dell
set to the DVI input (which I am using 75% of the time now) but it is
noticeably inferior to the ATI card I had.

I'd also like to buy an Apple 30" monitor, but their 4 year old $1799
monitor is horribly out of date.

Steve

George Graves

unread,
Mar 29, 2008, 2:14:56 AM3/29/08
to
On Fri, 28 Mar 2008 17:16:56 -0700, Steve de Mena wrote
(in article <J7ednbhMBt_kF3Da...@giganews.com>):

No, I call you a Wintroll because that's what you are. The fact that you are
correct occasionally or even always doesn't alter your motives.

--
Obama '08 = Osama '09

Steve de Mena

unread,
Mar 29, 2008, 2:36:46 AM3/29/08
to
Timberwoof wrote:

> Turn up the contrast on your monitor. Use the Max OS's built-in monitor
> calibration software.

There is no contrast control on my Dell 2407 24" monitor.

I ran through the calibration, and adjusting the gamma improved the
appearance of the image on the VGA input somewhat. But it's still
blurrier than the ATI x1900XT image was.

Steve

Steve de Mena

unread,
Mar 29, 2008, 2:39:51 AM3/29/08
to

I better not mention I was in the Apple Store again today for a Genius
Bar appointment. My thin Aluminum keyboard died a month or two ago
and I just got around to bringing it in. They gave me a brand new
replacement with no problems. I like the new thin aluminum keyboards.

Steve

Snit

unread,
Mar 29, 2008, 3:06:28 AM3/29/08
to
"Steve de Mena" <st...@stevedemena.com> stated in post
xZCdndDuSqLifnDa...@giganews.com on 3/28/08 11:36 PM:

> Timberwoof wrote:
>
>> Turn up the contrast on your monitor. Use the Max OS's built-in monitor
>> calibration software.
>
> There is no contrast control on my Dell 2407 24" monitor.

Sounds odd. According to the Dell website the "Dell(TM) 2407WFP Flat Panel
Monitor" and the "Dell(TM) 2407WFP-HC Flat Panel Monitor" both have contrast
controls... which monitor do you have that does not?

<http://support.dell.com/support/edocs/monitors/2407WFP/en/setup.htm>
<http://support.dell.com/support/edocs/monitors/2407WFPH/en/setup.htm>

> I ran through the calibration and adjusting the Gamma improved the
> appearence of the image on the VGA input somewhat. But its still
> blurrier than the ATI x1900XT image was.

No offense, but my guess is you know very little about your monitor or how
to set it up if you cannot find the contrast setting... but I will gladly
admit to an error if you can point to what monitor you have that lacks that
setting.


--
Look, this is silly. It's not an argument, it's an armor plated walrus with
walnut paneling and an all leather interior.

Alan Baker

unread,
Mar 29, 2008, 5:57:00 AM3/29/08
to
In article <xZCdndDuSqLifnDa...@giganews.com>,

Steve de Mena <st...@stevedemena.com> wrote:

"Blurrier"????

Oh, please.

MuahMan

unread,
Mar 29, 2008, 7:15:09 AM3/29/08
to

"Snit" <use...@gallopinginsanity.com> wrote in message
news:C4133884.B0B7D%use...@gallopinginsanity.com...

> "Steve de Mena" <st...@stevedemena.com> stated in post
> xZCdndDuSqLifnDa...@giganews.com on 3/28/08 11:36 PM:
>
>> Timberwoof wrote:
>>
>>> Turn up the contrast on your monitor. Use the Max OS's built-in monitor
>>> calibration software.
>>
>> There is no contrast control on my Dell 2407 24" monitor.
>
> Sounds odd. According to the Dell website the "Dell? 2407WFP Flat Panel
> Monitor" and the "Dell(TM) 2407WFP-HC Flat Panel Monitor" both have
> contrast
> controls... which monitor do you have that does not?
>
> <http://support.dell.com/support/edocs/monitors/2407WFP/en/setup.htm>
> <http://support.dell.com/support/edocs/monitors/2407WFPH/en/setup.htm>
>
>> I ran through the calibration and adjusting the Gamma improved the
>> appearence of the image on the VGA input somewhat. But its still
>> blurrier than the ATI x1900XT image was.
>
> No offense, but my guess is you know very little about your monitor or how
> to set it up it you cannot find the contrast setting... but I will gladly
> admit to an error if you can point to what monitor you have that lacks
> that
> setting.
>

Contrast is disabled when connecting via DVI or HDMI.

Snit

unread,
Mar 29, 2008, 10:44:28 AM3/29/08
to
"MuahMan" <mua...@cumcast.net> stated in post
P5mdnaLBEMVAuXPa...@comcast.com on 3/29/08 4:15 AM:

Ah, with the 2407WFP you are correct:

NOTE: When using DVI source, the contrast adjustment is not available.

Though that is not the case for the 2407WFPH.

I, clearly, stand corrected and offer Steve an apology - my mistake. There
is no reason to think you do not know the settings on your monitor.


--
Try not to become a man of success, but rather try to become a man of value.
--Albert Einstein

Jon Harrop

unread,
Mar 29, 2008, 10:42:38 AM3/29/08
to
Alan Baker wrote:
> They all deliver video signals to the same standard.

Of course they don't. Low-end cards can't even drive high-end displays for a
start.

Jon Harrop

unread,
Mar 29, 2008, 10:54:05 AM3/29/08
to
Timberwoof wrote:
> In article <13ur717...@corp.supernews.com>,
> Jon Harrop <use...@jdh30.plus.com> wrote:
>> Mitch wrote:
>> > But those are NOT the only cards available to Mac users, because, as
>> > always, Mac users can upgrade their cards to the newest models, too.
>>
>> The last time I checked this out,
>
> Wen?

Just over a year ago.

>> installing a current graphics card
>
> Which one?

At the time, any 8000-series nVidia card (which was their current range but
is now out of date).

>> prevented Mac OS X from booting.
>
> Which version?

IIRC, this was just as Leopard was being released.

>> You could run modern graphics hardware in
>> Mac Pro hardware but you have to install a different OS, like Windows or
>> Linux.
>>
>> Has that changed?
>
> In what line of work is a superduper graphics card necessary?

Well, we specialize in scientific visualization so we need lots of RAM on
the graphics card (>256MB) and fast 3D but you need fast 2D for anything
these days if you're using a modern OS and have a big display like my
1920x1200 or better.

> My Mac
> Cube has an ATI Rage128 Pro. My G4 1.4GHz DP has the cool card that came
> with it. I can't t ell the difference between the graphics cards. What
> am I missing out on?

The programmable graphics pipelines in the current range of cards (and the
older 8000-series) make a big difference for any applications that can use
them, particularly games.

Jon Harrop

unread,
Mar 29, 2008, 10:54:10 AM3/29/08
to
Steve de Mena wrote:

> Mitch wrote:
>> Apple doesn't update every hardware generation just because a new GPU
>> comes out; they test and decide whether the spec change will confuse or
>> annoy the market, too. They may offer many of those as optional
>> upgrades through the store, but that's not really necessary for people
>> who want the biggest ad best all the time, either -- because those
>> people will just buy at a discount supplier and install themselves!
>
> NO video card upgrades available for my Mac Pro (from Apple or 3rd
> parties) since it came out in 2006.

Yes. Mitch's misinformation is really quite serious.

It looks as if Apple are deliberately trying to deceive their customers into
believing that a Mac can run decent graphics hardware, or that you can
upgrade from their lame top-of-the-range offering to something modern. In
reality, they've got complete lock-in: if you don't pay double or triple
the price for Apple's own hardware it will probably never work.

If you do upgrade then you must ditch Mac OS X, of course. Which begs the
question: would Apple actually care? I mean, they've already got your
money.

What are the odds they will ditch Mac OS X altogether now? I mean, it's no
secret that Jobs' last keynote brought nothing to the table: widely reputed
to have been the worst keynote ever. They've basically stopped work on it.

Jon Harrop

unread,
Mar 29, 2008, 10:56:11 AM3/29/08
to
Steve de Mena wrote:
> I better not mention I was in the Apple Store again today for a Genius
> Bar appointment. My thin Aluminum keyboard died a month or two ago...

Yes, the Apple keyboard on our Mac Mini is incredibly unreliable.

This begs the question: why do you still use Apple?

With the benefit of hindsight, I'm very glad I didn't waste money on
anything more than a Mac Mini...

Steve Carroll

unread,
Mar 29, 2008, 11:41:16 AM3/29/08
to
In article <P5mdnaLBEMVAuXPa...@comcast.com>,
"MuahMan" <mua...@cumcast.net> wrote:

You must be wrong... the IT 'teacher' knows all;)

--
"Apple is pushing how green this is - but it [Macbook Air] is
clearly disposable... when the battery dies you can pretty much
just throw it away". - Snit

Hasta La Vista

unread,
Mar 29, 2008, 2:34:36 PM3/29/08
to

"Steve Carroll" <troll...@TK.com> wrote in message
news:trollkiller-4D85...@newsgroups.comcast.net...

My Chimei CMV 946D monitor has a contrast control available in its menu
while I'm using DVI output. The box even indicates I'm using DVI.

Maybe Snit knows a bit more than you give him credit for?

Snit

unread,
Mar 29, 2008, 2:40:09 PM3/29/08
to
"Hasta La Vista" <noe...@all.to.me> stated in post
z4KdnaxRxbNdFnPa...@comcast.com on 3/29/08 11:34 AM:

Well, assuming the monitor is a 2407WFP then it does not have a contrast
control when using DVI, at least according to Dell. From the link, above:

NOTE: When using DVI source, the contrast adjustment is not available.

When this was pointed out to me I openly acknowledged my error and
apologized - as I said I would.

Steve has repeatedly lied about me being an "IT" teacher - it is not a mere
mistake on his part being that he has been corrected repeatedly. He also
puts the word "teacher" in quotes as though that were in question, even
though he has gone to my school's websites and checked to see if I really am
the computer teacher I say I am (of course I am!). He is simply repeatedly
lying about me and trying to antagonize me.


--
"For example, user interfaces are _usually_ better in commercial software.
I'm not saying that this is always true, but in many cases the user
interface to a program is the most important part for a commercial
company..." Linus Torvalds <http://www.tlug.jp/docs/linus.html>

Steve de Mena

unread,
Mar 29, 2008, 2:59:03 PM3/29/08
to

Here is the manual for my Dell 2407 WFP.

http://support.dell.com/support/edocs/monitors/2407WFP/en/index.htm

Please show me where the contrast setting is.

Note:

"NOTE: When using DVI source, the contrast adjustment is not available."

Steve

Snit

unread,
Mar 29, 2008, 3:05:39 PM3/29/08
to
"Steve de Mena" <st...@stevedemena.com> stated in post
8qadnREL3vrlDHPa...@giganews.com on 3/29/08 11:59 AM:

If you did not see my other post to MuahMan:

-----


Ah, with the 2407WFP you are correct:

NOTE: When using DVI source, the contrast adjustment is not available.

Though that is not the case for the 2407WFPH.

I, clearly, stand corrected and offer Steve an apology - my mistake. There
is no reason to think you do not know the settings on your monitor.

-----

Fair enough?


--
I know how a jam jar feels...
... full of jam!

Hasta La Vista

unread,
Mar 29, 2008, 3:06:53 PM3/29/08
to

"Steve de Mena" <st...@stevedemena.com> wrote in message
news:8qadnREL3vrlDHPa...@giganews.com...

My monitor seems to have the contrast control available for DVI. Looks
like Dell isn't necessarily the best choice. :-D

Steve de Mena

unread,
Mar 29, 2008, 3:07:26 PM3/29/08
to
Hasta La Vista wrote:

> My Chimei CMV 946D monitor has a contrast control available in its menu
> while I'm using DVI output. The box even indicates I'm using DVI.
>
> Maybe Snit knows a bit more than you give him credit for?

Maybe not.
http://support.dell.com/support/edocs/monitors/2407WFP/en/index.htm

Dell 2407 WFP

"NOTE: When using DVI source, the contrast adjustment is not available."

Steve

Snit

unread,
Mar 29, 2008, 3:13:14 PM3/29/08
to
"Hasta La Vista" <noe...@all.to.me> stated in post
S8adnfdY1-3MDnPa...@comcast.com on 3/29/08 12:06 PM:

It is a bizarre thing to not have contrast controls... but it does look like
the 2407WFP does not. Just weird.


--
One who makes no mistakes, never makes anything.

Alan Baker

unread,
Mar 29, 2008, 3:27:25 PM3/29/08
to
In article <13usltf...@corp.supernews.com>,
Jon Harrop <use...@jdh30.plus.com> wrote:

> Alan Baker wrote:
> > They all deliver video signals to the same standard.
>
> Of course they don't. Low-end cards can't even drive high-end displays for a
> start.

Low end cards cannot provide the necessary *number* of signals per
second. That has nothing to do with the issue at hand.
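
(To put rough numbers on "signals per second", assuming a 60 Hz refresh
and ignoring blanking intervals: 1920 x 1200 x 60 is about 138 million
pixels per second, which fits within single-link DVI's ~165 MHz pixel
clock, while 2560 x 1600 x 60 is about 246 million pixels per second
and needs dual-link DVI -- which is why only some cards can drive a 30"
display.)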

Hasta La Vista

unread,
Mar 29, 2008, 3:32:36 PM3/29/08
to

"Snit" <use...@gallopinginsanity.com> wrote in message
news:C413E113.B0CB5%use...@gallopinginsanity.com...

An excellent post, Snit. Many here could learn by your example. Well
done.

Hasta La Vista

unread,
Mar 29, 2008, 3:34:51 PM3/29/08
to

"Steve de Mena" <st...@stevedemena.com> wrote in message
news:hZCdne_hP4LsDnPa...@giganews.com...

Okay, you're right. But he did admit his mistake and apologized. A rare
sight in this group, from what I've seen of it.

Snit

unread,
Mar 29, 2008, 3:42:05 PM3/29/08
to
"Hasta La Vista" <noe...@all.to.me> stated in post
VOydnSShv-vFBHPa...@comcast.com on 3/29/08 12:32 PM:

>>> Here is the manual for my Dell 2407 WFP.
>>>
>>> http://support.dell.com/support/edocs/monitors/2407WFP/en/index.htm
>>>
>>> Please show me where the contrast setting is.
>>>
>>> Note:
>>> "
>>> NOTE: When using DVI source, the contrast adjustment is not available."
>>>
>>> Steve
>>
>> If you did not see my other post to MuahMan:
>>
>> -----
>> Ah, with the 2407WFP you are correct:
>>
>> NOTE: When using DVI source, the contrast adjustment is not available.
>>
>> Though that is not the case for the 2407WFPH.
>>
>> I, clearly, stand corrected and offer Steve an apology - my mistake. There is
>> no reason to think you do not know the settings on your monitor.
>> -----
>>
>> Fair enough?
>
> An excellent post, Snit. Many here could learn by your example. Well
> done.

Thanks... and the apology is sincere...


--
Teachers open the door but you must walk through it yourself.

Snit

unread,
Mar 29, 2008, 3:42:57 PM3/29/08
to
"Hasta La Vista" <noe...@all.to.me> stated in post
hv2dnULlSoJDBHPa...@comcast.com on 3/29/08 12:34 PM:

How long until someone holds my apology against me? :)


--
Is Swiss cheese made out of hole milk?

Alan Baker

unread,
Mar 29, 2008, 3:43:29 PM3/29/08
to
In article <hv2dnULlSoJDBHPa...@comcast.com>,

Certainly such admissions are rare from you...

Hasta La Vista

unread,
Mar 29, 2008, 4:03:45 PM3/29/08
to

"Alan Baker" <alang...@telus.net> wrote in message
news:alangbaker-4546A6.12432829032008@[74.223.185.199.nw.nuvox.net]...

> In article <hv2dnULlSoJDBHPa...@comcast.com>,
> "Hasta La Vista" <noe...@all.to.me> wrote:
>
>> "Steve de Mena" <st...@stevedemena.com> wrote in message
>> news:hZCdne_hP4LsDnPa...@giganews.com...
>> > Hasta La Vista wrote:
>> >
>> >> My Chimei CMV 946D monitor has a contrast control available in its
>> >> menu
>> >> while I'm using DVI output. The box even indicates I'm using DVI.
>> >>
>> >> Maybe Snit knows a bit more than you give him credit for?
>> >
>> > Maybe not.
>> > http://support.dell.com/support/edocs/monitors/2407WFP/en/index.htm
>> >
>> > Dell 2407 WFP
>> >
>> > "NOTE: When using DVI source, the contrast adjustment is not
>> > available."
>> >
>> > Steve
>>
>> Okay, you're right. But he did admit his mistake and apologized. A
>> rare
>> sight in this group, from what I've seen of it.
>
> Certainly such admissions are rare from you...

You don't even know who you're talking to, so how can you say what you did,
Gimp?

Alan Baker

unread,
Mar 29, 2008, 4:04:42 PM3/29/08
to
In article <cLGdnRSLXLE4PXPa...@comcast.com>,

"Hasta La Vista" <noe...@all.to.me> wrote:

> "Alan Baker" <alang...@telus.net> wrote in message
> news:alangbaker-4546A6.12432829032008@[74.223.185.199.nw.nuvox.net]...
> > In article <hv2dnULlSoJDBHPa...@comcast.com>,
> > "Hasta La Vista" <noe...@all.to.me> wrote:
> >
> >> "Steve de Mena" <st...@stevedemena.com> wrote in message
> >> news:hZCdne_hP4LsDnPa...@giganews.com...
> >> > Hasta La Vista wrote:
> >> >
> >> >> My Chimei CMV 946D monitor has a contrast control available in its
> >> >> menu
> >> >> while I'm using DVI output. The box even indicates I'm using DVI.
> >> >>
> >> >> Maybe Snit knows a bit more than you give him credit for?
> >> >
> >> > Maybe not.
> >> > http://support.dell.com/support/edocs/monitors/2407WFP/en/index.htm
> >> >
> >> > Dell 2407 WFP
> >> >
> >> > "NOTE: When using DVI source, the contrast adjustment is not
> >> > available."
> >> >
> >> > Steve
> >>
> >> Okay, you're right. But he did admit his mistake and apologized. A
> >> rare
> >> sight in this group, from what I've seen of it.
> >
> > Certainly such admissions are rare from you...
>
> You don't even know who you're talking to, so how can you say what you did,
> Gimp?

Sorry, Edwin, but your protestations are getting awfully thin.

BTW, why "Gimp"?

Hasta La Vista

unread,
Mar 29, 2008, 4:10:20 PM3/29/08
to

"Snit" <use...@gallopinginsanity.com> wrote in message
news:C413E9D1.B0CDF%use...@gallopinginsanity.com...

I see what you mean. Sooner or later somebody like Alan will say something
like "why should we believe the guy who couldn't even read a monitor
manual," in response to a discussion that has nothing to do with monitors.
:-D

Steve Carroll

unread,
Mar 29, 2008, 4:10:27 PM3/29/08
to
In article <z4KdnaxRxbNdFnPa...@comcast.com>,

"Hasta La Vista" <noe...@all.to.me> wrote:

He obviously doesn't know about Steve De Mena's monitor... the one Snit
incorrectly guessed Steve knows very little about.

Steve Carroll

unread,
Mar 29, 2008, 4:12:48 PM3/29/08
to
In article <C413DB19.B0CA6%use...@gallopinginsanity.com>,
Snit <use...@gallopinginsanity.com> wrote:

So, in other words... you mouthed off again about someone else's
knowledge without actually knowing what you were talking about.

Hasta La Vista

unread,
Mar 29, 2008, 4:16:37 PM3/29/08
to
I'm not protesting anything. I'm giving you the facts, Gimp.


Alan Baker

unread,
Mar 29, 2008, 4:22:04 PM3/29/08
to
In article <SNKdnRuVQemtP3Pa...@comcast.com>,

"Hasta La Vista" <noe...@all.to.me> wrote:

Do the wounds still sting that much, Edwin?

If only you'd actually read the things you'd claimed to have read. If
only (on some occasions) you'd read your own words; you wouldn't have
stepped in it even deeper.

While looking for one of the earlier discussions with you about Avid
discontinuing their high-end products for Mac, I happened across the
classic where you ended up finally admitting that Pixar used Final Cut
and Macs, but insisting that you'd never been shown the info before.

Only I showed you a post where you replied to just that information.
But, as you often do, you didn't actually read what I posted and went
off on a tangent.

Snit

unread,
Mar 29, 2008, 4:23:40 PM3/29/08
to
"Hasta La Vista" <noe...@all.to.me> stated in post
SNKdnRuVQemtP3Pa...@comcast.com on 3/29/08 1:10 PM:

Steve Carroll has already attacked me for my mistake and my admission of
error... even with the apology... and yet he has not even acknowledged the
mistakes and outright lies of his he has made in regards to me in the same
discussion.

Same old same old...

Steve Carroll

unread,
Mar 29, 2008, 4:28:30 PM3/29/08
to
In article <C413F35C.B0CF1%use...@gallopinginsanity.com>,
Snit <use...@gallopinginsanity.com> wrote:


Your mistake was that you tried to lift yourself up by knocking De Mena
down... you failed... as you always do when you try this tactic.

Timberwoof

unread,
Mar 29, 2008, 5:01:50 PM3/29/08
to
In article <13usmit...@corp.supernews.com>,
Jon Harrop <use...@jdh30.plus.com> wrote:

> Timberwoof wrote:

<snip>

> > In what line of work is a superduper graphics card necessary?
>
> Well, we specialize in scientific visualization so we need lots of RAM on
> the graphics card (>256Mb) and fast 3D but you need fast 2D for anything
> these days if you're using a modern OS and have a big display like my
> 1920x1200 or better.

Therefore in your line of work, you need Linux with a superduper
graphics card. Probably not Windows. And as a person with access to
scientists and upon whom scientific thinking has presumably rubbed off,
you should realize that *your* need for a superduper graphics card is
not everyone's.

> > My Mac
> > Cube has an ATI Rage128 Pro. My G4 1.4GHz DP has the cool card that came
> > with it. I can't t ell the difference between the graphics cards. What
> > am I missing out on?
>
> The programmable graphics pipelines in the current range of cards (and the
> older 8000-series) make a big difference for any applications that can use
> them, particularly games.

What kind of a difference do they make?

The graphics-intensive work I do is Photoshop and video ... that doesn't
need OpenGL-style graphics, just a place to put the pixels from the
framebuffer.

--
Timberwoof <me at timberwoof dot com> http://www.timberwoof.com
"When you post sewage, don't blame others for
emptying chamber pots in your direction." —Chris L.

Timberwoof

unread,
Mar 29, 2008, 5:04:06 PM3/29/08
to
In article <13usmmr...@corp.supernews.com>,
Jon Harrop <use...@jdh30.plus.com> wrote:

> Steve de Mena wrote:
> > I better not mention I was in the Apple Store again today for a Genius
> > Bar appointment. My thin Aluminum keyboard died a month or two ago...
>
> Yes, the Apple keyboard on our Mac Mini is incredibly unreliable.
>
> This begs the question: why do you still use Apple?
>
> With the benefit of hindsight, I'm very glad I didn't waste money on
> anything more than a Mac Mini...

Because even though Steve's keyboard from Apple has turned out to be
unreliable, all the keyboards I ever got with my Macs have worked fine.
Are you suggesting that one unreliable type of keyboard from Apple is a
reason to ditch the brand?

--
Timberwoof <me at timberwoof dot com> http://www.timberwoof.com
"When you post sewage, don't blame others for

emptying chamber pots in your direction." æ°—hris L.

Timberwoof

unread,
Mar 29, 2008, 5:05:40 PM3/29/08
to
In article <xZCdndDuSqLifnDa...@giganews.com>,

Steve de Mena <st...@stevedemena.com> wrote:

> Timberwoof wrote:
>
> > Turn up the contrast on your monitor. Use the Max OS's built-in monitor
> > calibration software.
>
> There is no contrast control on my Dell 2407 24" monitor.
>

> I ran through the calibration and adjusting the Gamma improved the
> appearence of the image on the VGA input somewhat. But its still
> blurrier than the ATI x1900XT image was.

Stop, stop, stop! You can't blame poor image quality on the card if you
use a second-rate monitor. And you can't complain that VGA image quality
is inferior to direct digital.

Well, I suppose that *you* can.

Steve de Mena

unread,
Mar 29, 2008, 6:13:31 PM3/29/08
to
Timberwoof wrote:
> In article <xZCdndDuSqLifnDa...@giganews.com>,
> Steve de Mena <st...@stevedemena.com> wrote:
>
>> Timberwoof wrote:
>>
>>> Turn up the contrast on your monitor. Use the Max OS's built-in monitor
>>> calibration software.
>> There is no contrast control on my Dell 2407 24" monitor.
>>
>> I ran through the calibration and adjusting the Gamma improved the
>> appearence of the image on the VGA input somewhat. But its still
>> blurrier than the ATI x1900XT image was.
>
> Stop, stop, stop! You can't blame poor image quality on the card if you
> use a second-rate monitor. And you can't complain that VGA image quality
> is inferior to direct digital.
>
> Well, I suppose that *you* can.
>

The Dell UltraSharp 2407 is not a second-rate monitor.

And as I said before:

1) The ATI card (from either input) looked better than this card (at
either input)

2) VGA image quality was not "inferior" with the ATI x1900xt card

Basically the nVidia GeForce 7300 GT card on a Mac Pro sucks. (Read
the Mac Pro forum on Apple's site if you need more opinions). Both
of my PCs, via VGA, look better. (nVidia GeForce 9600 GT and nVidia
GeForce 6200 TurboCache).

Steve

Tim Murray

unread,
Mar 29, 2008, 7:00:47 PM3/29/08
to
On Sat, 29 Mar 2008 00:02:43 -0400, Steve de Mena wrote:
> Visually you can stand across the room and tell that the cheaper card
> is more washed out and doesn't have as much "punch" as the ATI card.

I'm stealing a line from Alan and coughing out a bullshit.

A digital value for a certain color is the same across video cards. Generally
I see you as pretty technical, so I don't think you'd say that without
reason. If you can enlighten me as to how that's possible, then I'm all ears.

Hasta La Vista

unread,
Mar 29, 2008, 7:32:49 PM3/29/08
to
You're a gimp.

Alan Baker

unread,
Mar 29, 2008, 7:42:47 PM3/29/08
to
In article <sZudnUkm6qg4THPa...@comcast.com>,

"Hasta La Vista" <noe...@all.to.me> wrote:

> You're a gimp.

Why?

John

unread,
Mar 29, 2008, 7:43:31 PM3/29/08
to

"Steve de Mena" <st...@stevedemena.com> wrote in message
news:J7ednbhMBt_kF3Da...@giganews.com...
> Jon Harrop wrote:
>> I just received an e-mail from nVidia launching their 9-series graphics
>> cards. Is the highest-spec GPU available to a Mac running Mac OS still
>> only
>> a 7 series?
>
> There's an 8000 series available for the Mac Pro ("early 2008" model)
> model but nothing newer.
>
> I bought my Mac Pro in 2006 and Apple has released NO updated graphics
> card for it since that time. (I had to remove my buggy ATI x1900XT and buy
> a used crappy nVidia card to replace it).
>
> I guess George Graves would just call me a "Wintroll" for mentioning this,
> rather than admit to the possibility that Apple is not flawless and has
> diverted their attention to iPhones and the like.
>
> Steve

Macs are used to do real work, not gaming.

John

unread,
Mar 29, 2008, 7:44:41 PM3/29/08
to

"George Graves" <gmgr...@comcast.net> wrote in message
news:0001HW.C4132C70...@news.comcast.net...
> On Fri, 28 Mar 2008 17:16:56 -0700, Steve de Mena wrote
> (in article <J7ednbhMBt_kF3Da...@giganews.com>):

>
>> Jon Harrop wrote:
>>> I just received an e-mail from nVidia launching their 9-series graphics
>>> cards. Is the highest-spec GPU available to a Mac running Mac OS still
>>> only
>>> a 7 series?
>>
>> There's an 8000 series available for the Mac Pro ("early 2008" model)
>> model but nothing newer.
>>
>> I bought my Mac Pro in 2006 and Apple has released NO updated graphics
>> card for it since that time. (I had to remove my buggy ATI x1900XT and
>> buy a used crappy nVidia card to replace it).
>>
>> I guess George Graves would just call me a "Wintroll" for mentioning
>> this, rather than admit to the possibility that Apple is not flawless
>> and has diverted their attention to iPhones and the like.
>>
>> Steve
>
> No, I call you a Wintroll because that's what you are. The fact that you
> are
> correct occasionally or even always doesn't alter your motives.
>


Don't forget, George, that Steve is also technically incompetent.

John

unread,
Mar 29, 2008, 7:46:34 PM3/29/08
to

"Steve de Mena" <st...@stevedemena.com> wrote in message
news:6tadnQtx46D-InDa...@giganews.com...
> Timberwoof wrote:
>> In article <13ur717...@corp.supernews.com>,
>> Jon Harrop <use...@jdh30.plus.com> wrote:
>>
>>> Mitch wrote:
>>>> But those are NOT the only cards available to Mac users, because, as
>>>> always, Mac users can upgrade their cards to the newest models, too.
>>> The last time I checked this out,
>>
>> Wen?
>>> installing a current graphics card
>>
>> Which one?
>>> prevented Mac OS X from booting.
>>
>> Which version?
>>> You could run modern graphics hardware in
>>> Mac Pro hardware but you have to install a different OS, like Windows or
>>> Linux.
>>>
>>> Has that changed?
>>
>> In what line of work is a superduper graphics card necessary? My Mac Cube
>> has an ATI Rage128 Pro. My G4 1.4GHz DP has the cool card that came with
>> it. I can't t ell the difference between the graphics cards. What am I
>> missing out on?
>
> Well I can give you a simple example:
>
> When I bought my Mac Pro in mid 2006 I bought the higher-end ATI x1900XT.
> That card has problems with Leopard and I had to find a used nVidia card
> (that was the default card when my Mac Pro came out).

>
> Visually you can stand across the room and tell that the cheaper card is
> more washed out and doesn't have as much "punch" as the ATI card.
>
> This is bothersome in a plain email and browsing envronment.
>

You need a powerful card for email so it doesn't look "washed out". Total
Bullshit. You have once again demonstrated your technical incompetence.

Timberwoof

unread,
Mar 29, 2008, 8:23:31 PM3/29/08
to
In article <lSzHj.27644$r76....@bignews8.bellsouth.net>,
Tim Murray <no-...@thankyou.com> wrote:

In any video system, even a digital-input monitor, there's an amplifier
at the end of the signal path that has to provide a specific voltage or
current level for the pixel element. If it's calibrated wrong (with the
wrong gain or bias levels) then the image will look bad.

However, analog video signals are standardized so that 0.0 V is black
and 1.0 V is white. It's not rocket science to calibrate an AD converter
to provide that voltage range for the full input range. There's more
variation between different types of monitors than between video cards.
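
(As a concrete illustration, taking the 0-1 V range above and an 8-bit
value per channel: code 0 maps to 0.0 V, code 128 to roughly 0.5 V, and
code 255 to 1.0 V. A gain or bias error in that final stage shifts or
compresses every level, which is what reads as "washed out" -- and it
lives in the converter/amplifier calibration, not in which GPU computed
the pixel values.)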

The only way I'd accept that diagnosis is with two identical monitors
from the same production run, connected with an ABX switch, to the video
sources in question.

--
Timberwoof <me at timberwoof dot com> http://www.timberwoof.com
"When you post sewage, don't blame others for

emptying chamber pots in your direction." ‹Chris L.

Maverick

unread,
Mar 29, 2008, 9:12:31 PM3/29/08
to

I gathered that quite some time ago.

PeterBP

unread,
Mar 29, 2008, 10:35:04 PM3/29/08
to
Jon Harrop <use...@jdh30.plus.com> wrote:

> Steve de Mena wrote:
> > Jon Harrop wrote:
> >> I just received an e-mail from nVidia launching their 9-series graphics
> >> cards. Is the highest-spec GPU available to a Mac running Mac OS still
> >> only a 7 series?
> >
> > There's an 8000 series available for the Mac Pro ("early 2008" model)
> > model but nothing newer.
>

> Early 2008 model of the Mac Pro, yes. The graphics card in it was already
> out of date though.


>
> > I bought my Mac Pro in 2006 and Apple has released NO updated graphics
> > card for it since that time. (I had to remove my buggy ATI x1900XT and
> > buy a used crappy nVidia card to replace it).
>

> Yes, I particularly like this story:
>
> http://forums.macrumors.com/showpost.php?p=4740679&postcount=21
>
> where a loyal Apple customer was duped by Apple into paying for a product
> that did not yet exist, prompting him to write to Steve Jobs himself to
> explain that he was sick of waiting for yesteryears' graphics cards to work
> in his $4,000 Mac Pro only to receive a personal response from Mr Jobs
> advising him to "calm down" and continue to wait.
>
> We evaluated moving into the Mac market at the end of last year. The lame
> graphics cards swung our decision not to waste money on a Mac Pro (twice
> the cost for half the performance!).

CPU or gfx performance?

> We ended up buying a Mac Mini, which
> is a nice toy but completely useless for business.


What business exactly?

> Turns out the software
> development environments available for Mac OS X are out of the dark ages.

Xcode? That would have been true 4 years ago, but not today.

> Despite Apple's best efforts to push their Macs into technical computing:
>
> http://www.apple.com/science/
>
> Most scientists and engineers continue to use Linux and Windows.

How many, exactly?

> The
> comp.sys.mac.scitech usenet group has been in decline for a decade and now
> has roughly zero posts per month:
>
> http://groups.google.com/group/comp.sys.mac.scitech/about

And since when have usenet group posts been a useful indicator of
anything...?

>
> Sounds like Apple have milked their desktop users dry and are now moving on
> to phone users...

Sounds like you make a lot of unfounded conclusions.

--
regards , Peter B. P. http://macplanet.dk
Washington D.C.: District of Criminals

"I dont drink anymore... of course, i don't drink any less, either!

PeterBP

unread,
Mar 29, 2008, 10:35:05 PM3/29/08
to
Alan Baker <alang...@telus.net> wrote:

> In article <6tadnQtx46D-InDa...@giganews.com>,

> > Visually you can stand across the room and tell that the cheaper card
> > is more washed out and doesn't have as much "punch" as the ATI card.
> >

> > This is bothersome in a plain email and browsing envronment.
> >

> > Steve
>
> <cough>bullshit<cough>
>
> There are things that differentiate video cards, but "washed out"
> colours and "punch" aren't among them. They all deliver video signals to
> the same standard.

Sharpness of picture is not an issue with the DVI monitors of today. It
was with analogue monitors 10 years ago and with gfx cards with bad
RAMDACs.

PeterBP

unread,
Mar 29, 2008, 10:35:05 PM3/29/08
to
Jon Harrop <use...@jdh30.plus.com> wrote:

> Steve de Mena wrote:
> > Mitch wrote:
> >> Apple doesn't update every hardware generation just because a new GPU
> >> comes out; they test and decide whether the spec change will confuse or
> >> annoy the market, too. They may offer many of those as optional
> >> upgrades through the store, but that's not really necessary for people
> >> who want the biggest ad best all the time, either -- because those
> >> people will just buy at a discount supplier and install themselves!
> >
> > NO video card upgrades available for my Mac Pro (from Apple or 3rd
> > parties) since it came out in 2006.
>
> Yes. Mitch's misinformation is really quite serious.
>
> It looks as if Apple are deliberately trying to deceive their customers into
> believing that a Mac can run decent graphics hardware,

Decent according to what standards?

> or that you can
> upgrade from their lame top-of-the-range offering to something modern.

http://www.apple.com/macpro/technology/graphics.html

Is the 8800 not modern all of a sudden?

> In
> reality, they've got complete lock-in: if you don't pay double or triple
> the price for Apple's-own hardware it will probably never work.

Half a year ago there were threads here detailing how MP-comparable
offerings from Dell would cost you more than what a Mac Pro totals.

> If you do upgrade then you must ditch Mac OS X, of course. Which begs the
> question: would Apple actually care? I mean, they've already got your
> money.
>
> What are the odds they will ditch Mac OS X altogether now? I mean, its no
> secret that Jobs' last keynote brought nothing to the table: widely reputed
> to have been the worst keynote ever. They've basically stopped work on it.

Troll or outright moron detected.

PeterBP

unread,
Mar 29, 2008, 10:35:05 PM3/29/08
to
Steve de Mena <st...@stevedemena.com> wrote:

> Timberwoof wrote:
>
> > Turn up the contrast on your monitor. Use the Max OS's built-in monitor
> > calibration software.
>
> There is no contrast control on my Dell 2407 24" monitor.
>
> I ran through the calibration and adjusting the Gamma improved the
> appearence of the image on the VGA input somewhat. But its still
> blurrier than the ATI x1900XT image was.
>

> Steve

VGA - there is your culprit.

Use DVI. VGA is largely a thing of the past, for one very good reason.

PeterBP

unread,
Mar 29, 2008, 10:35:05 PM3/29/08
to
Jon Harrop <use...@jdh30.plus.com> wrote:

> Steve de Mena wrote:
> > I better not mention I was in the Apple Store again today for a Genius
> > Bar appointment. My thin Aluminum keyboard died a month or two ago...
>
> Yes, the Apple keyboard on our Mac Mini is incredibly unreliable.
>
> This begs the question: why do you still use Apple?

Rather begs the question of why you snipped away the fact that he got a
replacement on the spot?

PeterBP

unread,
Mar 29, 2008, 10:35:05 PM3/29/08
to
Jon Harrop <use...@jdh30.plus.com> wrote:

> Alan Baker wrote:
> > They all deliver video signals to the same standard.
>

> Of course they don't. Low-end cards can't even drive high-end displays for a
> start.

"driving high-end displays" is largely a question of what interface you
use (miniDVI, DVI or double DVI) and how much video ram you have, and
the latter is a non-issue for most gfx cards today.

Jon Harrop

unread,
Mar 29, 2008, 9:38:36 PM3/29/08
to
Timberwoof wrote:
> In article <13usmmr...@corp.supernews.com>,
> Jon Harrop <use...@jdh30.plus.com> wrote:
>> With the benefit of hindsight, I'm very glad I didn't waste money on
>> anything more than a Mac Mini...
>
> Because even though Steve's keyboard from Apple has turned out to be
> unreliable, all the keyboards I ever got with my Macs have worked fine.

Mine doesn't. I posted about it here and the only response I got was someone
else saying that theirs was broken in exactly the same way. I've tried my
keyboard on PCs, where it works fine, so it is a design flaw on the Apple side.

> Are you suggesting that one unreliable type of keyboard from Apple is a
> reason to ditch the brand?

No, I was referring to Apple's general contempt for their customers rather
than keyboards specifically.

--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u

PeterBP

unread,
Mar 29, 2008, 10:35:04 PM3/29/08
to
Steve de Mena <st...@stevedemena.com> wrote:

> Jon Harrop wrote:
> > I just received an e-mail from nVidia launching their 9-series graphics
> > cards. Is the highest-spec GPU available to a Mac running Mac OS still only
> > a 7 series?
>
> There's an 8000 series available for the Mac Pro ("early 2008" model)
> model but nothing newer.
>

> I bought my Mac Pro in 2006 and Apple has released NO updated graphics
> card for it since that time. (I had to remove my buggy ATI x1900XT and
> buy a used crappy nVidia card to replace it).
>

> I guess George Graves would just call me a "Wintroll" for mentioning
> this, rather than admit to the possibility that Apple is not flawless
> and has diverted their attention to iPhones and the like.
>
> Steve

Of course Apple is not flawless. Who is?

Alan Baker

unread,
Mar 29, 2008, 9:59:52 PM3/29/08
to
In article <13utsb9...@corp.supernews.com>,
Jon Harrop <use...@jdh30.plus.com> wrote:

> Timberwoof wrote:
> > In article <13usmmr...@corp.supernews.com>,
> > Jon Harrop <use...@jdh30.plus.com> wrote:
> >> With the benefit of hindsight, I'm very glad I didn't waste money on
> >> anything more than a Mac Mini...
> >
> > Because even though Steve's keyboard from Apple has turned out to be
> > unreliable, all the keyboards I ever got with my Macs have worked fine.
>
> Mine doesn't. I posted about it here and the only response I got was someone
> else saying that theirs was broken in exactly the same way. I've tried my
> keyboard on PCs so it is a design flaw with the Apple.

And based on your total of two reports, you happily declare the
keyboard's design to be flawed.

You say you're a scientist? Really?

>
> > Are you suggesting that one unreliable type of keyboard from Apple is a
> > reason to ditch the brand?
>
> No, I was referring to Apple's general contempt for their customers rather
> than keyboards specifically.

What contempt?

Jon Harrop

unread,
Mar 29, 2008, 9:50:37 PM3/29/08
to
PeterBP wrote:

> Jon Harrop <use...@jdh30.plus.com> wrote:
>> We evaluated moving into the Mac market at the end of last year. The lame
>> graphics cards swung our decision not to waste money on a Mac Pro (twice
>> the cost for half the performance!).
>
> CPU or gfx performance?

The CPUs are great but the GPUs are awful.

>> We ended up buying a Mac Mini, which
>> is a nice toy but completely useless for business.
>
> What business exactly?

Technical computing and scientific visualization.

>> Turns out the software development environments available for Mac OS X
>> are out of the dark ages.
>
> Xcode? That would have been true 4 years ago, but not today.

A glorified text editor is no good without languages and libraries. The best
the Mac currently has to offer is hand-me-downs from Linux. Nothing like the
innovation that Microsoft are doing with .NET, for example.

>> Despite Apple's best efforts to push their Macs into technical computing:
>>
>> http://www.apple.com/science/
>>
>> Most scientists and engineers continue to use Linux and Windows.
>
> How many, exactly?

Since we've ported our stuff to Mac and been pushing Mac-related business
the number of Mac users has risen to almost 5%. We were hoping for 25%.

>> The
>> comp.sys.mac.scitech usenet group has been in decline for a decade and
>> now has roughly zero posts per month:
>>
>> http://groups.google.com/group/comp.sys.mac.scitech/about
>
> And since when has usenet group posts been a useful indicator of
> anything...?

Since we discovered they correlate very strongly with sales.

>> Sounds like Apple have milked their desktop users dry and are now moving
>> on to phone users...
>
> Sounds like you make a lot of unfounded conclusions.

What do you use a Mac for?

Hasta La Vista

unread,
Mar 29, 2008, 10:04:23 PM3/29/08
to

"Timberwoof" <timberw...@inferNOnoSPAMsoft.com> wrote in message
news:timberwoof.spam-23...@nnrp-virt.nntp.sonic.net...

> In article <lSzHj.27644$r76....@bignews8.bellsouth.net>,
> Tim Murray <no-...@thankyou.com> wrote:
>
>> On Sat, 29 Mar 2008 00:02:43 -0400, Steve de Mena wrote:
>> > Visually you can stand across the room and tell that the cheaper card
>> > is more washed out and doesn't have as much "punch" as the ATI card.
>>
>> I stealing a line from Alan, and coughing out a bullshit.
>>
>> A digital value for a certain color is the same across video cards.
>> Generally
>> I see you as pretty technical, so I don't think you'd say that without
>> reason. If you can enlighten me as to how that's possible, then I'm all
>> ears.
>
> In any video system, even a digital-input monitor, there's an amplifier
> at the end of the signal path that has to provide a specific voltage or
> current level for the pixel element. If it's calibrated wrong (with the
> wrong gain or bias levels) then the image will look bad.
>
> However, analog video signals are standardized so that 0.0 V is black
> and 1.0 V is white. It's not rocket science to calibrate an AD converter
> to provide that voltage range for the full input range. There's more
> variation between different types of monitors than between video cards.
>
> The only way I'd accept that diagnosis is with two identical monitors
> from the same production run, connected with an ABX switch, to the video
> sources in question.

You described an analog input. Digital inputs aren't levels of current or
voltage to brightness level, i.e. they're not analogs of brightness of the
scene. The brightness of each pixel in a digital system is expressed by a
binary number, not a current or voltage level.

Jon Harrop

unread,
Mar 29, 2008, 10:01:06 PM3/29/08
to
Timberwoof wrote:
> In article <13usmit...@corp.supernews.com>,
> Jon Harrop <use...@jdh30.plus.com> wrote:
>> Well, we specialize in scientific visualization so we need lots of RAM on
>> the graphics card (>256Mb) and fast 3D but you need fast 2D for anything
>> these days if you're using a modern OS and have a big display like my
>> 1920x1200 or better.
>
> Therefore in your line of work, you need Linux with a superduper
> graphics card.

As a rule of thumb, I'd expect to pay the same for graphics cards as I do
for CPUs in my computers now. They are equally important to us.

> Probably not Windows.

Microsoft have certainly screwed the pooch when it comes to graphics
libraries but .NET is so good that it really offsets that. F# is by far the
best language we have found to develop in.

> And as a person with access to
> scientists and upon whom scientific thinking has presumably rubbed off,
> you should realize that *your* need for a superduper graphics card is
> not everyone's.

Sure. I'm not saying it's everyone's, but it is our requirement because it is
our target market's requirement. If they can't buy a decent-spec Mac then
there's no point in us trying to sell tools for them, of course.

>> > My Mac
>> > Cube has an ATI Rage128 Pro. My G4 1.4GHz DP has the cool card that
>> > came with it. I can't t ell the difference between the graphics cards.
>> > What am I missing out on?
>>
>> The programmable graphics pipelines in the current range of cards (and
>> the older 8000-series) make a big difference for any applications that
>> can use them, particularly games.
>
> What kind of a difference do they make?

You can perform arbitrary computation on the GPU and, therefore, accelerate
a wide variety of heavily-parallel numerical computations. For games, that
equates to very sophisticated per-pixel lighting and real-time simulations.
For other applications it can be anything from FFTs to linear algebra.

On numerical problems like FFTs and linear algebra, current graphics cards
are many times faster than any available CPU. For graphics-related stuff,
they are well over 10x faster.

> The graphics-intensive work I do is Photoshop and video ... that doesn't
> need OpenGL-style graphics, just a place to put the pixels from the
> framebuffer.

You could probably do a lot more hardware accelerated computation in that
context as well. nVidia are even shipping their new GPUs as a Tesla range
that is entirely for computation and doesn't even have graphics output!

We've only started delving into this recently but these GPUs are easy to
program (the Tesla even has a C compiler). I'd be surprised if the creators
of Photoshop and your video editing software weren't working to get this
functionality into their products because it offers such huge performance
gains. However, I'm not sure if this is possible on the Mac yet even though
PCs have been doing it for a few years now.
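
As a concrete (and purely illustrative) sketch of what I mean, here is the
kind of minimal data-parallel kernel you write for nVidia's C-style GPU
toolchain. The names and launch sizes below are made up, not taken from any
shipping product:

  /* Hypothetical SAXPY kernel: y <- a*x + y, one GPU thread per element. */
  __global__ void saxpy(int n, float a, const float *x, float *y)
  {
      int i = blockIdx.x * blockDim.x + threadIdx.x;
      if (i < n)
          y[i] = a * x[i] + y[i];  /* each thread handles one element */
  }

  /* Host-side launch (error handling omitted): spread n elements over
     blocks of 256 threads each.
     saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, d_x, d_y);                */

The same pattern scales from toy kernels like this up to FFTs and dense
linear algebra, which is where the big speedups over the CPU come from.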

Jon Harrop

unread,
Mar 29, 2008, 10:02:24 PM3/29/08
to
PeterBP wrote:
> "driving high-end displays" is largely a question of what interface you
> use (miniDVI, DVI or double DVI) and how much video ram you have, and
> the latter is a non-issue for most gfx cards today.

Not if you want max FSAA at full screen resolution. That was a great aspect
of our 512MB cards for 2D work when we first got them.
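
To put rough, illustrative numbers on that (assuming 32-bit colour): a
1920x1200 colour buffer is 1920 x 1200 x 4 bytes, roughly 9 MB, and 4x
multisampled FSAA turns that into roughly 37 MB before you count the
multisampled depth buffer, double buffering or any textures. That is why the
extra video RAM matters at high resolutions.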

Alan Baker

unread,
Mar 29, 2008, 10:16:58 PM3/29/08
to
In article <Nc6dnTtpy-u2aHPa...@comcast.com>,

"Hasta La Vista" <noe...@all.to.me> wrote:

Yes, Edwin.

That's why he said, "However, analog video signals"
^^^^^^^

As usual, your failure to read what is actually there leads you into
trouble.

Jon Harrop

unread,
Mar 29, 2008, 10:07:49 PM3/29/08
to
Alan Baker wrote:
> In article <13utsb9...@corp.supernews.com>,
> Jon Harrop <use...@jdh30.plus.com> wrote:
>> Timberwoof wrote:
>> > In article <13usmmr...@corp.supernews.com>,
>> > Jon Harrop <use...@jdh30.plus.com> wrote:
>> >> With the benefit of hindsight, I'm very glad I didn't waste money on
>> >> anything more than a Mac Mini...
>> >
>> > Because even though Steve's keyboard from Apple has turned out to be
>> > unreliable, all the keyboards I ever got with my Macs have worked fine.
>>
>> Mine doesn't. I posted about it here and the only response I got was
>> someone else saying that theirs was broken in exactly the same way. I've
>> tried my keyboard on PCs, where it works fine, so it is a design flaw
>> with the Apple.
>
> And based on your total of two reports, you happily declare the keyboard's
> design to be flawed.

As I said, there's nothing wrong with the keyboards: they work fine in other
machines.

> You say you're a scientist? Really?

Yes.

Alan Baker

unread,
Mar 29, 2008, 10:20:28 PM3/29/08
to
In article <13utu22...@corp.supernews.com>,
Jon Harrop <use...@jdh30.plus.com> wrote:

> Alan Baker wrote:
> > In article <13utsb9...@corp.supernews.com>,
> > Jon Harrop <use...@jdh30.plus.com> wrote:
> >> Timberwoof wrote:
> >> > In article <13usmmr...@corp.supernews.com>,
> >> > Jon Harrop <use...@jdh30.plus.com> wrote:
> >> >> With the benefit of hindsight, I'm very glad I didn't waste money on
> >> >> anything more than a Mac Mini...
> >> >
> >> > Because even though Steve's keyboard from Apple has turned out to be
> >> > unreliable, all the keyboards I ever got with my Macs have worked fine.
> >>
> >> Mine doesn't. I posted about it here and the only response I got was
> >> someone else saying that theirs was broken in exactly the same way. I've
> >> tried my keyboard on PCs, where it works fine, so it is a design flaw
> >> with the Apple.
> >
> > And based on your total of two reports, you happily declare the keyboard's
> > design to be flawed.
>
> As I said, there's nothing wrong with the keyboards: they work fine in other
> machines.

So where does the design flaw lie, Jon?

>
> > You say you're a scientist? Really?
>
> Yes.

One could not determine that from the "analysis" you've demonstrated so
far...

Timberwoof

unread,
Mar 29, 2008, 10:29:23 PM3/29/08
to
In article <13uttlf...@corp.supernews.com>,
Jon Harrop <use...@jdh30.plus.com> wrote:

> Timberwoof wrote:

> > What kind of a difference do they make?
>
> You can perform arbitrary computation on the GPU and, therefore, accelerate
> a wide variety of heavily-parallel numerical computations. For games, that
> equates to very sophisticated per-pixel lighting and real-time simulations.
> For other applications it can be anything from FFTs to linear algebra.
>
> On numerical problems like FFTs and linear algebra, current graphics cards
> are many times faster than any available CPU. For graphics-related stuff,
> they are well over 10x faster.
>
> > The graphics-intensive work I do is Photoshop and video ... that doesn't
> > need OpenGL-style graphics, just a place to put the pixels from the
> > framebuffer.
>
> You could probably do a lot more hardware accelerated computation in that
> context as well. nVidia are even shipping their new GPUs as a Tesla range
> that is entirely for computation and doesn't even have graphics output!
>
> We've only started delving into this recently but these GPUs are easy to
> program (the Tesla even has a C compiler). I'd be surprised if the creators
> of Photoshop and your video editing software weren't working to get this
> functionality into their products because it offers such huge performance
> gains.

That's pretty cool! All of it, really. And I get it. :-)


> However, I'm not sure if this is possible on the Mac yet even though
> PCs have been doing it for a few years now.

Since the hardware architecture is now pretty much the same as the rest
of the PC world, and since someone has undoubtedly written toolchains
for BSD and Linux to do this, I'm sure someone bright will write one to
let one do the same thing on a Mac under OS X.

--
Timberwoof <me at timberwoof dot com> http://www.timberwoof.com
"When you post sewage, don't blame others for

emptying chamber pots in your direction." æ°—hris L.

Steve de Mena

unread,
Mar 29, 2008, 10:30:11 PM3/29/08
to

Luckily, I don't need to enlighten you (or anyone) as to "how this is
possible".

Steve

Steve de Mena

unread,
Mar 29, 2008, 10:32:42 PM3/29/08
to
Maverick wrote:
> John wrote:

>> Don't forget George that Steve is also technically incompetent.
>
> I gathered that quite some time ago.

You gathered that because I called you out on your lies, like "Windows
XP is selling for less than Vista". I honestly thought you were going
to sit in the corner and cry when you finally realized you had been
caught in the lie and had no way out except to call me "dishonest".

Steve

Timberwoof

unread,
Mar 29, 2008, 10:37:07 PM3/29/08
to
In article <13utt1q...@corp.supernews.com>,
Jon Harrop <use...@jdh30.plus.com> wrote:

> >> Despite Apple's best efforts to push their Macs into technical computing:
> >>
> >> http://www.apple.com/science/
> >>
> >> Most scientists and engineers continue to use Linux and Windows.
> >
> > How many, exactly?
>
> Since we've ported our stuff to Mac and been pushing Mac-related business
> the number of Mac users has risen to almost 5%. We were hoping for 25%.

Here's an angle you probably didn't think of: Your Mac version may
enable more sales of other versions. If someone wants to buy a site
license for a company full of your users, and your competitors only work
on one OS, he may buy yours simply because a Mac version is available.
So your Mac version made Windows or Linux sales possible.

> >> The
> >> comp.sys.mac.scitech usenet group has been in decline for a decade and
> >> now has roughly zero posts per month:
> >>
> >> http://groups.google.com/group/comp.sys.mac.scitech/about
> >
> > And since when has usenet group posts been a useful indicator of
> > anything...?
>
> Since we discovered they correlate very strongly with sales.

Hm. Apple has been doing increasingly well over the past ten years, so I
think you should abandon that hypothesis.

Look at these numbers and compare them to history! The serious drop-off
happened in 1997, probably the darkest year in Apple history. I remember
MacWorld that year: very depressing. That newsgroup has not recovered
since. That's not Apple's fault.

So explain to me what all those laptops are that Steve Squyres and his
friends on the Mars Rover team use.

Timberwoof

unread,
Mar 29, 2008, 10:38:41 PM3/29/08
to
In article <Nc6dnTtpy-u2aHPa...@comcast.com>,
"Hasta La Vista" <noe...@all.to.me> wrote:

That's true at the graphics card's connector, yes. But you missed where
I wrote,

> > In any video system, even a digital-input monitor, there's an amplifier
> > at the end of the signal path that has to provide a specific voltage or
> > current level for the pixel element.

The final feed to a pixel is analog, not digital. So the same problems
apply.

Alan Baker

unread,
Mar 29, 2008, 10:41:02 PM3/29/08
to
In article <EI6dnRycjeapZnPa...@giganews.com>,

Steve de Mena <st...@stevedemena.com> wrote:

Translation "I just got caught in a big lie".

Steve de Mena

unread,
Mar 29, 2008, 10:41:05 PM3/29/08
to
PeterBP wrote:

>> There are things that differentiate video cards, but "washed out"
>> colours and "punch" aren't among them. They all deliver video signals to
>> the same standard.
>
> Sharpness of picture is not an issue with the DVI monitors of today. It
> was with analogue monitors 10 years ago and with gfx cards with bad
> RAMDACs.

The ATI x1900 XT card looked better. Live with it.

Steve

Steve de Mena

unread,
Mar 29, 2008, 10:42:49 PM3/29/08
to
PeterBP wrote:
> Jon Harrop <use...@jdh30.plus.com> wrote:
>
>> Steve de Mena wrote:
>>> Mitch wrote:
>>>> Apple doesn't update every hardware generation just because a new GPU
>>>> comes out; they test and decide whether the spec change will confuse or
>>>> annoy the market, too. They may offer many of those as optional
>>>> upgrades through the store, but that's not really necessary for people
>>>> who want the biggest and best all the time, either -- because those
>>>> people will just buy at a discount supplier and install themselves!
>>> NO video card upgrades available for my Mac Pro (from Apple or 3rd
>>> parties) since it came out in 2006.
>> Yes. Mitch's misinformation is really quite serious.
>>
>> It looks as if Apple are deliberately trying to deceive their customers into
>> believing that a Mac can run decent graphics hardware,
>
> Decent according to what standards?
>
>> or that you can
>> upgrade from their lame top-of-the-range offering to something modern.
>
> http://www.apple.com/macpro/technology/graphics.html
>
> Is the 8800 not modern all of a sudden?

Yes, but it won't work in my Mac Pro (contrary to what Alan Baker
initially thought).

Steve

Alan Baker

unread,
Mar 29, 2008, 10:42:58 PM3/29/08
to
In article <NOOdncpVLoReYHPa...@giganews.com>,

Steve de Mena <st...@stevedemena.com> wrote:

How could that possibly have happened, Steve?

Steve de Mena

unread,
Mar 29, 2008, 10:44:19 PM3/29/08
to
PeterBP wrote:
> Steve de Mena <st...@stevedemena.com> wrote:
>
>> Timberwoof wrote:
>>
>>> Turn up the contrast on your monitor. Use the Mac OS's built-in monitor
>>> calibration software.
>> There is no contrast control on my Dell 2407 24" monitor.
>>
>> I ran through the calibration and adjusting the Gamma improved the
>> appearance of the image on the VGA input somewhat. But it's still
>> blurrier than the ATI x1900XT image was.
>>
>> Steve
>
> VGA - there is your culprit.
>
> Use DVI. VGA is largely a thing of the past, for one very good reason.

The ATI looked great using the VGA input. Do I need to say that 10
more times for it to sink in?

I have 4 other systems connected to a TrippLite switchbox and it's a
pain in the ass to switch between VGA and DVI on the Dell 24" monitor.
You have to cycle through all 5 possible inputs to switch back and
forth.

Steve

Hasta La Vista

unread,
Mar 29, 2008, 10:45:14 PM3/29/08
to
"In any video system, even a digital-input monitor, there's an amplifier
at the end of the signal path that has to provide a specific voltage or
current level for the pixel element."

That's claiming a digital input uses an analog signal, Gimp.


Hasta La Vista

unread,
Mar 29, 2008, 10:49:41 PM3/29/08
to

"Alan Baker" <alang...@telus.net> wrote in message
news:alangbaker-98BFC3.19410229032008@[74.223.185.199.nw.nuvox.net]...

> In article <EI6dnRycjeapZnPa...@giganews.com>,
> Steve de Mena <st...@stevedemena.com> wrote:
>
>> Tim Murray wrote:
>> > On Sat, 29 Mar 2008 00:02:43 -0400, Steve de Mena wrote:
>> >> Visually you can stand across the room and tell that the cheaper card
>> >> is more washed out and doesn't have as much "punch" as the ATI card.
>> >
>> > I'm stealing a line from Alan, and coughing out a bullshit.
>> >
>> > A digital value for a certain color is the same across video cards.
>> > Generally I see you as pretty technical, so I don't think you'd say that
>> > without reason. If you can enlighten me as to how that's possible, then
>> > I'm all ears.
>>
>> Luckily, I don't need to enlighten you (or anyone) as to "how this is
>> possible".
>>
>> Steve
>
> Translation "I just got caught in a big lie".

You seem to spend most of your time translating posts into things that were
never said, Gimp.

Alan Baker

unread,
Mar 29, 2008, 10:57:32 PM3/29/08
to
In article <EKOdnSWM28AjY3Pa...@comcast.com>,

"Hasta La Vista" <noe...@all.to.me> wrote:

No, that's claiming that a digital input gets converted to an analog
voltage to actually drive the pixels. Since each pixel doesn't have its
own dedicated logic to interpret a signal of ones and zeros, this is
hardly surprising.

As usual, your lack of care in reading has come back to bite you.

Alan Baker

unread,
Mar 29, 2008, 10:58:48 PM3/29/08
to
In article <B_ednaAgQ8VUYnPa...@comcast.com>,

"Hasta La Vista" <noe...@all.to.me> wrote:

LOL

Like you tried to translate your claim that Pixar was an all PC shop
into me having to prove that they used Final Cut?

Hasta La Vista

unread,
Mar 29, 2008, 11:02:53 PM3/29/08
to

"Timberwoof" <timberw...@inferNOnoSPAMsoft.com> wrote in message
news:timberwoof.spam-61...@nnrp-virt.nntp.sonic.net...

I don't think I missed it.

>> > In any video system, even a digital-input monitor, there's an amplifier
>> > at the end of the signal path that has to provide a specific voltage or
>> > current level for the pixel element.
>
> The final feed to a pixel is analog, not digital. So the same problems
> apply.

No, the final feed is not analog. Of the three color elements that make up
a pixel, each receives a discrete level, defined by a number. An analog feed
to the pixel can take any level between minimum and maximum; there are no
fixed steps of gradation.

Hasta La Vista

unread,
Mar 29, 2008, 11:06:29 PM3/29/08
to

"Alan Baker" <alang...@telus.net> wrote in message
news:alangbaker-221D51.19584829032008@[74.223.185.199.nw.nuvox.net]...

WTF are you talking about, Gimp?

Hasta La Vista

unread,
Mar 29, 2008, 11:12:13 PM3/29/08
to

"Alan Baker" <alang...@telus.net> wrote in message
news:alangbaker-9E4D21.19573229032008@[74.223.185.199.nw.nuvox.net]...

> In article <EKOdnSWM28AjY3Pa...@comcast.com>,
> "Hasta La Vista" <noe...@all.to.me> wrote:
>
>> "In any video system, even a digital-input monitor, there's an amplifier
>> at the end of the signal path that has to provide a specific voltage or
>> current level for the pixel element."
>>
>> That's claiming a digital input uses an analog signal, Gimp.
>
> No, that's claiming that a digital input gets converted to an analog
> voltage to actually drive the pixels.

So the answer is yes, not no.

> Since each pixel doesn't have its
> own dedicated logic to interpret a signal of ones and zeros,

Yes it does. Each pixel has its own discrete range of steps that are
defined by a number, not a continuously varying analog level.

> this is hardly surprising.

That you got it so wrong, Gimp.

> As usual, your lack of care in reading has come back to bite you.

You're a gimp.

Alan Baker

unread,
Mar 29, 2008, 11:17:59 PM3/29/08
to
In article <jc-dnQsnKp58n3La...@comcast.com>,

By your own definition, the final feed *is* analog.

You defined "digital" in this context as binary numbers and "not a current
or a voltage level".

The fact that each binary number corresponds to a discrete voltage level
doesn't change the fact that it *is* a voltage level.

All a DVI system does is move where the change from binary
representation to discrete voltage takes place. In a VGA system, the
video card converts a pixel's binary representation into discrete
voltages which get put onto the output pins. From there they go to the
monitor, which converts the VGA-standard levels to those actually
necessary to drive its LCD pixel elements. In a DVI system, the binary
representation of the pixels' colours is what gets transmitted down the
wire and the monitor converts those binary numbers into discrete
voltages.
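
To make the distinction concrete, here is a toy sketch (purely illustrative,
C-style) of that last step in either system: a binary pixel code is turned
into one of a fixed set of drive levels. Whether that conversion happens on
the video card (VGA) or inside the monitor (DVI), the pixel element itself
is ultimately driven by a voltage:

  /* Illustrative only: map an 8-bit code (0-255) onto the nominal
     0.0 V (black) to 1.0 V (white) range mentioned earlier. The input
     is digital (256 discrete codes); the output is one of 256
     quantised voltage levels. */
  double code_to_volts(unsigned char code)
  {
      return code / 255.0;
  }

So code 0 gives 0.000 V, code 128 gives about 0.502 V, and code 255 gives
1.000 V.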

Alan Baker

unread,
Mar 29, 2008, 11:20:38 PM3/29/08
to
In article <TJSdnWoW5MSMmHLa...@comcast.com>,

"Hasta La Vista" <noe...@all.to.me> wrote:

> "Alan Baker" <alang...@telus.net> wrote in message
> news:alangbaker-9E4D21.19573229032008@[74.223.185.199.nw.nuvox.net]...
> > In article <EKOdnSWM28AjY3Pa...@comcast.com>,
> > "Hasta La Vista" <noe...@all.to.me> wrote:
> >
> >> "In any video system, even a digital-input monitor, there's an amplifier
> >> at the end of the signal path that has to provide a specific voltage or
> >> current level for the pixel element."
> >>
> >> That's claiming a digital input uses an analog signal, Gimp.
> >
> > No, that's claiming that a digital input gets converted to an analog
> > voltage to actually drive the pixels.
>
> So the answer is yes, not no.

No, the answer remains no.

>
> > Since each pixel doesn't have its
> > own dedicated logic to interpret a signal of ones and zeros,
>
> Yes it does. Each pixel has its own discrete range of steps that are
> defined by a number, not a continuously varying analog level.

No. Each actual pixel element can only take a voltage. That is not your
definition of "digital" in this context. Must I quote it again?

>
> > this is hardly surprising.
>
> That you got it so wrong, Gimp.

LOL

>
> > As usual, your lack of care in reading has come back to bite you.
>
> You're a gimp.

Why are you calling me that, Edwin? Have you been doing a little Usenet
"stalking"?

Alan Baker

unread,
Mar 29, 2008, 11:21:08 PM3/29/08
to
In article <m9ydnQsxfP8nnnLa...@comcast.com>,

LOL

Hasta La Vista

unread,
Mar 29, 2008, 11:33:30 PM3/29/08
to

"Alan Baker" <alang...@telus.net> wrote in message
news:alangbaker-FA3EA6.20210829032008@[74.223.185.199.nw.nuvox.net]...

WTF are you laughing about, Gimp?

Hasta La Vista

unread,
Mar 29, 2008, 11:36:52 PM3/29/08
to

"Alan Baker" <alang...@telus.net> wrote in message
news:alangbaker-34C24E.20203829032008@[74.223.185.199.nw.nuvox.net]...

> In article <TJSdnWoW5MSMmHLa...@comcast.com>,
> "Hasta La Vista" <noe...@all.to.me> wrote:
>
>> "Alan Baker" <alang...@telus.net> wrote in message
>> news:alangbaker-9E4D21.19573229032008@[74.223.185.199.nw.nuvox.net]...
>> > In article <EKOdnSWM28AjY3Pa...@comcast.com>,
>> > "Hasta La Vista" <noe...@all.to.me> wrote:
>> >
>> >> "In any video system, even a digital-input monitor, there's an
>> >> amplifier
>> >> at the end of the signal path that has to provide a specific voltage
>> >> or
>> >> current level for the pixel element."
>> >>
>> >> That's claiming a digital input uses an analog signal, Gimp.
>> >
>> > No, that's claiming that a digital input gets converted to an analog
>> > voltage to actually drive the pixels.
>>
>> So the answer is yes, not no.
>
> No, the answer remains no.

You're a gimp.

>>
>> > Since each pixel doesn't have its
>> > own dedicated logic to interpret a signal of ones and zeros,
>>
>> Yes it does. Each pixel has its own discrete range of steps that are
>> defined by a number, not a continuously varying analog level.
>
> No. Each actual pixel element can only take a voltage.

In discrete steps, not a continuously variable analog, Gimp.

> That is not your
> definition of "digital" in this context.

Yes it is, Gimp.


Alan Baker

unread,
Mar 29, 2008, 11:41:38 PM3/29/08
to
In article <ZYSdnYNdiJ5Fl3La...@comcast.com>,

"Hasta La Vista" <noe...@all.to.me> wrote:

> "Alan Baker" <alang...@telus.net> wrote in message
> news:alangbaker-34C24E.20203829032008@[74.223.185.199.nw.nuvox.net]...
> > In article <TJSdnWoW5MSMmHLa...@comcast.com>,
> > "Hasta La Vista" <noe...@all.to.me> wrote:
> >
> >> "Alan Baker" <alang...@telus.net> wrote in message
> >> news:alangbaker-9E4D21.19573229032008@[74.223.185.199.nw.nuvox.net]...
> >> > In article <EKOdnSWM28AjY3Pa...@comcast.com>,
> >> > "Hasta La Vista" <noe...@all.to.me> wrote:
> >> >
> >> >> "In any video system, even a digital-input monitor, there's an
> >> >> amplifier
> >> >> at the end of the signal path that has to provide a specific voltage
> >> >> or
> >> >> current level for the pixel element."
> >> >>
> >> >> That's claiming a digital input uses an analog signal, Gimp.
> >> >
> >> > No, that's claiming that a digital input gets converted to an analog
> >> > voltage to actually drive the pixels.
> >>
> >> So the answer is yes, not no.
> >
> > No, the answer remains no.
>
> You're a gimp.

Really? On what do you base that? Doing a little Usenet "stalking",
Edwin?

>
> >>
> >> > Since each pixel doesn't have its
> >> > own dedicated logic to interpret a signal of ones and zeros,
> >>
> >> Yes it does. Each pixel has its own discrete range of steps that are
> >> defined by a number, not a continuously varying analog level.
> >
> > No. Each actual pixel element can only take a voltage.
>
> In discrete steps, not a continuously variable analog, Gimp.

And that is not the definition of a digital signal.

>
> > That is not your
> > definition of "digital" in this context.
>
> Yes it is, Gimp.

Apparently, I do have to quote it:

"The brightness of each pixel in a digital system are expressed by a
binary number, not a current or voltage level."

--

Hasta La Vista

unread,
Mar 29, 2008, 11:45:38 PM3/29/08
to

"Alan Baker" <alang...@telus.net> wrote in message
news:alangbaker-BCB586.20175929032008@[74.223.185.199.nw.nuvox.net]...

Not by my definition, Gimp.

> You defined "digital" in this context as binary numbers and "not a current
> or a voltage level".

I defined digital as levels in discrete steps, and analog as a continuously
variable level that can be any value between minimum and maximum.

> The fact that each binary number corresponds to a discrete voltage level
> doesn't change the fact that it *is* a voltage level.

But it does make it digital instead of analog.

> All a DVI system does is move where the change from binary
> representation to discrete voltage takes place. In a VGA system, the
> video card converts a pixel's binary representation into discrete
> voltages which get put onto the output pins. From there they go to the
> monitor, which converts the VGA-standard levels to those actually
> necessary to drive its LCD pixel elements. In a DVI system, the binary
> representation of the pixels' colours is what gets transmitted down the
> wire and the monitor converts those binary numbers into discrete
> voltages.

You clearly don't understand what you're talking about, Gimp.

Hasta La Vista

unread,
Mar 29, 2008, 11:51:42 PM3/29/08
to

"Alan Baker" <alang...@telus.net> wrote in message
news:alangbaker-4628FA.20413729032008@[74.223.185.199.nw.nuvox.net]...

> In article <ZYSdnYNdiJ5Fl3La...@comcast.com>,
> "Hasta La Vista" <noe...@all.to.me> wrote:
>
>> "Alan Baker" <alang...@telus.net> wrote in message
>> news:alangbaker-34C24E.20203829032008@[74.223.185.199.nw.nuvox.net]...
>> > In article <TJSdnWoW5MSMmHLa...@comcast.com>,
>> > "Hasta La Vista" <noe...@all.to.me> wrote:
>> >
>> >> "Alan Baker" <alang...@telus.net> wrote in message
>> >> news:alangbaker-9E4D21.19573229032008@[74.223.185.199.nw.nuvox.net]...
>> >> > In article <EKOdnSWM28AjY3Pa...@comcast.com>,
>> >> > "Hasta La Vista" <noe...@all.to.me> wrote:
>> >> >

>> > No. Each actual pixel element can only take a voltage.
>>
>> In discrete steps, not a continuously variable analog, Gimp.
>
> And that is not the definition of a digital signal.

Yes it is, Gimp.

>>
>> > That is not your
>> > definition of "digital" in this context.
>>
>> Yes it is, Gimp.
>
> Apparently, I do have to quote it:

Quoting it didn't help you to understand it.

> "The brightness of each pixel in a digital system are expressed by a
> binary number, not a current or voltage level."

That depends on where in the system you're talking about, Gimp. It's binary
numbers at the input. It's discrete steps corresponding to the input
numbers at the output. Neither input nor output is analog.
