Is the PowerVR relying on AGP for more memory? An interesting fact is
that, so far, no manly video card (FireGL 4000 or GLint) is based on
AGP. What does this tell you about how much of an impact AGP has on a
well designed card?
I remember when the original PowerVR chip was the Voodoo killer, but now
that I've seen the PowerVR in action, there is no comparison. Big
claims with little or no delivery on the promises.
One thing sticks in my mind. This chip was engineered to be cheap so
that the average consumer could suck it up. The Voodoo2 was designed to
be the best sub-$1000 card/cards for 3D gaming. 3Dfx makes no excuses
about price; once you see the cards in action no excuses are necessary.
I was blown away on seeing Voodoo1 for the first time and I've always
been impressed with 3Dfx's great image quality. PowerVR has always been
a late starter with no real improvement over a 3Dfx product.
I'll wait until I see both cards in action, but I predict that the PVR2
will compare to the Voodoo2 the same way the PowerVR compared to
Voodoo1. If I'm wrong, I'll sell my V2 cards and pick up a PVR2.
Also, the site you pointed us to is obviously biased toward PVR2. It
carries as much weight in my purchasing decisions as a Packard Bell ad
in a magazine. So far, PowerVR has failed to deliver on all their
claims, while 3Dfx has yet to disappoint me.
Kai wrote:
> Look here:
>
> http://www.canadawired.com/~gvink/Global/PVRSG/Features.html
>
> http://www.canadawired.com/~gvink/Global/PVRSG/Comparison.html
>That's all well and good, but we're still talking about a card that
>doesn't exist. I read the whole Features article and I have to admit
>that their ideas sound good, but can you really eliminate all the
>Z-buffer memory and keep the performance and visual quality level at top
>notch?
The reason WHY it can be so fast and cheap is because it eliminates
the z-buffer...it doesn't need one. Its method of rendering allows for
higher-quality pixels, since you don't write pixels that you don't
see (i.e. you can spend time making prettier pixels instead of drawing lots
of ugly ones).
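To make that concrete, here's a toy sketch (my own model, not PowerVR's actual hardware pipeline) of why deferring shading until visibility is resolved saves work compared to a plain z-buffered renderer:

```python
# Toy illustration: a tile-based deferred renderer shades fewer pixels
# than an immediate-mode z-buffered renderer with overdraw.
# (Hypothetical model for discussion, not PVR's real silicon.)

def immediate_mode_shaded(fragments):
    """Z-buffered renderer: every fragment that passes the depth test
    at the moment it arrives gets shaded, even if overwritten later."""
    zbuf, shaded = {}, 0
    for pixel, depth in fragments:
        if pixel not in zbuf or depth < zbuf[pixel]:
            zbuf[pixel] = depth
            shaded += 1          # shading work that may be wasted
    return shaded

def deferred_shaded(fragments):
    """Deferred renderer: resolve visibility first, then shade
    exactly one fragment per covered pixel."""
    front = {}
    for pixel, depth in fragments:
        if pixel not in front or depth < front[pixel]:
            front[pixel] = depth
    return len(front)            # one shading pass per visible pixel

# Three opaque surfaces covering the same two pixels, submitted
# back-to-front (the worst case for immediate mode):
frags = [(0, 3.0), (1, 3.0), (0, 2.0), (1, 2.0), (0, 1.0), (1, 1.0)]
print(immediate_mode_shaded(frags))  # 6 shading operations
print(deferred_shaded(frags))        # 2 shading operations
```

Same final image, a third of the shading work in this worst case; that freed-up time is what the "prettier pixels" argument is about.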
>Is the PowerVR relying on AGP for more memory? An interesting fact is
>that, so far, no manly video card (FireGL 4000 or GLint) is based on
>AGP. What does this tell you about how much of an impact AGP has on a
>well designed card?
PVR can go up to 32MB of memory, so... It can texture from AGP (read +
write), but uses local memory when it's there.
>I remember when the original PowerVR chip was the Voodoo killer, but now
>that I've seen the PowerVR in action, there is no comparison. Big
>claims with little or no delivery on the promises.
Yeah, there was a butt-load of hype around the first generation. This
time NEC/VDO aren't hyping their product too early... It just happens
the followers are doing it for them.
The problem with PCX1 was mostly no bilinear filtering. With PCX2 it was
incomplete alpha blending modes. And in general it was very CPU
limited, due to ALL setup being done by the CPU.
PVRSG fixes all of that. All setup is done on-chip, quality and
features are totally DX6 and OpenGL compliant.
>One thing sticks in my mind. This chip was engineered to be cheap so
>that the average consumer could suck it up. The Voodoo2 was designed to
>be the best sub-$1000 card/cards for 3D gaming. 3Dfx makes no excuses
>about price; once you see the cards in action no excuses are necessary.
Yeah, I'm sure 3Dfx will do well in the high-end market... They'll have
a tougher time with Banshee, even if it's out a little earlier.
>I was blown away on seeing Voodoo1 for the first time and I've always
>been impressed with 3Dfx's great image quality. PowerVR has always been
>a late starter with no real improvement over a 3Dfx product.
"Always" = 2 products, so it's hard to extrapolate statistically or anything.
We'll see what happens.
>I'll wait until I see both cards in action, but I predict that the PVR2
>will compare to the Voodoo2 the same way the PowerVR compared to
>Voodoo1. If I'm wrong, I'll sell my V2 cards and pick up a PVR2.
You'll be wrong, but the difference is timing. V2 will have been out 7-8 months
by the time they start pumping out the PVRSG 2D/3D mid-range.
The big difference is V2's dual texturing units. The problem is that games
need dual texturing for only parts of the scene, not all of it... So the
extra TMU goes to waste a lot of the time, even for DX6 games.
Quake1/2 are exceptions... Quake3 will be back to normal (little
dual-texture use).
>Also, the site you pointed us to is obviously biased toward PVR2. It
>carries as much weight in my purchasing decisions as a Packard Bell ad
>in a magazine.
Well, this is obvious.
>So far, PowerVR has failed to deliver on all their
>claims while 3Dfx has yet to disappoint me.
Well, Voodoo Rush was a flop. We'll see how things fall in place, if
at all.
Problem here... I don't see the PowerVR2 card yet and I've heard the hype plenty of times before. Another problem: software companies have to want to support it. Right now, where is the support? 3Dfx...
Kai wrote in message <6ifuef$q93$1...@www.3dfx.com>...
Warren B. wrote in message <6ig78j$ri0$1...@www.3dfx.com>...
>Well, Voodoo Rush was a flop. We'll see how things fall in place, if
>at all.
What do you mean, a flop? From a sales standpoint or a technical one? The
Rush got a bad name at first because of the Hercules thing, but it turned
out to be a decent card. Almost as fast as VG. I think the misinformation
that was spread around spooked a lot of people.
The sub-standard 2D wasn't good, I'll admit that.
In order for 3Dfx to do anything w/Banshee, it should be:
Faster than Voodoo2
Full AGP 2X w/sidebands
Totally killer 2D
All the new 3D features that the other chip makers are boasting about now.
I am a little nervous (from a stockholders view) that Banshee has been in
development so long that it may be outdated before it comes out. I really
hope 3Dfx is on the ball with this one.
Tom
>Jon wrote in message <354e97df...@news.3dfx.com>...
>
>>Well, Voodoo Rush was a flop. We'll see how things fall in place, if
>>at all.
>
>What do you mean, a flop? From a sales standpoint or a technical one? The
>Rush got a bad name at first because of the Hercules thing, but it turned
>out to be a decent card. Almost as fast as VG. I think the misinformation
>that was spread around spooked a lot of people.
Even Forbes' analysis talks about how Banshee fixes all the problems
Rush had (and they note in parentheses that Rush was a failure).
>The sub-standard 2D wasn't good, I'll admit that.
The 3D had flaws due to the interface, the board was CHOCK full of chips, and it was too
expensive for what it offered.
>In order for 3Dfx to do anything w/Banshee, it should be:
>
>Faster than Voodoo2
>Full AGP 2X w/sidebands
>Totally killer 2D
>All the new 3D features that the other chip makers are boasting about now.
I agree.
>I am a little nervous (from a stockholders view) that Banshee has been in
>development so long that it may be outdated before it comes out. I really
>hope 3Dfx is on the ball with this one.
I'm nervous just as a sympathizer for 3Dfx, because they are a
kick-ass company.
We'll probably see how on the ball they are soon enough.
Method=?
All the problems are solved. This doesn't mean there aren't new
ones. But your bias is just as silly as the bias some PVR promoters
display.
> All the problems are solved. This doesn't mean there aren't new
> ones. But your bias is just as silly as the bias some PVR promoters
> display.
And how exactly do you know that all the problems are solved? Just
because PowerVR said they were? Only when final shipping silicon
reaches consumers will you or anyone else be able to make that claim.
What a lot of people (myself included) are saying is this: Videologic/
NEC have a *looong* way to go before they rebuild their reputation. The
first two versions of PowerVR didn't exactly threaten the first Voodoo
graphics chipset. Come to think of it, they didn't exactly threaten the
Rush, either, which I'm sure most people would agree isn't the
barnburner that the first Voodoo was. But based on the hype that
Videologic/NEC put out, you'd have thought that one of those first gen
PowerVR cards would have made you think you were playing Quake on an SGI
workstation. Hmm...not exactly, guys. Add to that the fact that Videologic
outright rigged their drivers to run faster in benchmarking (and lied
about it afterwards), and you have a company that I (and a lot of other
people) don't take seriously.
Will the PVRSG be faster than Voodoo2? One on one, probably. But,
PVRSG still won't be able to match the fill rate of an SLI V2 solution,
and won't be that much faster than a single card. Each will have its own
advantages and weaknesses. The lack of a native API will probably end
up being a drawback for the PVRSG...a lot of developers aren't too keen
on DX, which is what the PVRSG is designed to use (as well as OpenGL).
3Dfx may not have as many features as the PVRSG, but Voodoo2 has been
out now since the end of February....it'll have at the least a six-month
head start on the PowerVR card. In computer terms, that's one hell of a
head start. Just imagine how the market might have changed if AMD had
gotten the K6-2 out 6 months before *any* Pentium II ever hit the
shelves....
Besides, PVRSG has been in development for quite some time now,
something on the order of two years or so from what I've read. I doubt
that Voodoo2 has had nearly so much development time allotted to it. If a
company can't make a card that's top-notch after two years of
development, then every engineer working there should be fired and never
allowed near high-tech equipment again.
So, let me just say this: people in this NG may have a bias towards
3Dfx, but it is deserved, and rightfully so. Only 3Dfx has backed up
their claims with products to match. Videologic/NEC sure spews a lot of
crap, and thus far that's what their boards have been. Rendition has
made a great chipset that's fully the equal of the first Voodoo
chipset...too bad the first Voodoo is the baseline now, and not the top
dog. NVidia made a fast chip, and all they had to give up in return was
any kind of image quality. Intel and Real3D have made a good chip with
fantastic image quality, but lackluster speed. Only 3Dfx has made the
best balance of the two, and for that they deserve some bias/praise. I
don't think that's unreasonable.
irongen
P.S. Just on a sidenote: I was looking around today and came across
some interesting comments made by people who went to 3Dfx's stockholder
meeting a week or so ago. They discussed Banshee there....Greg Ballard
stated that the specs that were released on Banshee some time ago were
leaked, and were from early in the design stage. They have completely
changed since then. When asked about the upcoming PVRSG and Riva TNT,
Ballard went on to say that even if their performance is the same as
their respective press releases state (and we all know how the TNT is
being hyped), he said that he still wouldn't be worried...Banshee would
be BETTER. Chew on that.....
Christopher Cloud wrote in message <354C17...@earthlink.net>...
>Jon wrote:
>Will the PVRSG be faster than Voodoo2? One on one, probably. But,
>PVRSG still won't be able to match the fill rate of an SLI V2 solution,
>and won't be that much faster than a single card. Each will have its own
>advantages and weaknesses. The lack of a native API will probably end
>up being a drawback for the PVRSG...a lot of developers aren't too keen
>on DX, which is what the PVRSG is designed to use (as well as OpenGL).
PVRSG will have a native API in the form of PowerSGL. I think they keep
beating the drum about DirectX and OpenGL performance because that is where
the previous generation fell down. Also, like it or not, they (D3D and
OpenGL) are becoming far more popular. You say that a lot of developers
aren't too keen on DirectX support, but this is changing with every release
of DirectX. OpenGL is becoming pretty popular now too. It's true that there
are still a few Glide games around, but these are quite often also released
in PowerSGL format - F1RS, FIFA 98, everything by Activision (nearly). I
know that there are exceptions to this, so don't spout back a list of games
with only Glide support.
My point is, as other cards begin to reach the performance of Voodoo
and may actually have some advantages, native APIs may start to
decline. This is already happening, IMHO. I think the main culprit for this
is the Riva cards, which have no native API.
Andy
> PVRSG will have a native API in the form of PowerSGL.
I hadn't heard that. I knew the existing PowerVR cards had a native
API, but the last I had heard it wasn't going to be used for PVRSG,
since developers hadn't taken a shine to it.
> Also, like it or not, they (D3D and OpenGL) are becoming far more popular. You say that a lot of developers aren't too keen on DirectX support but this is changing with every release of DirectX.
Well, Microsoft still has a LOT of work to do with Direct3D. From what
I've read, D3D is an *extremely* clumsy interface to work with, which is
why developers don't like it, but absolutely love OpenGL, which has
extensive documentation. DX 6.0 may change this; we'll have to see.
But as for DX becoming more popular...sure, I believe that it is. But
as for it being more liked.....that's an entirely different story. Many
developers still don't like it. Just take a look at Dynamix. For
Starsiege (formerly Earthsiege 3), Direct3D is *so* bad that they won't
even support it.....it will be all OpenGL and native APIs instead. Talk
about a slap in the face to Microsoft....
Don't get me wrong. One unified standard is a fine and noble idea.
Microsoft just doesn't seem to know how to do it.
> OpenGL is becoming pretty popular now too. It's true that there
> are still a few glide games around but these are quite often also released in powerSGL format - F1RS, FIFA 98, everything by activision (nearly). I know that there are exceptions to this so don't spout back a list of games with only glide support.
I won't, but just so everyone is on the same page, there *are* games
with only Glide support. Far more games are ported to Glide than any
other API, excepting D3D. Glide is, by all accounts, just a pleasant
system to work with.
> My point is, as other cards are beginning to reach the performance of voodoo and may actually have some advantages, then native API's may start to decline.
Decline? Sure, it's to be expected. But go away completely? I doubt
it. So long as there are a variety of chipsets to choose from, we'll
always have native APIs. Every chip will have different capabilities,
and native APIs will be the only way to fully wring the best performance
out of them. It's the *only* way to allow each chipset to fully shine.
D3D is a jack of all trades, master of none interface. It will get the
job done, but it will never be the best overall solution.
>This is already happening IMHO. I think the main culprit for this is the Riva cards which have no native API.
I'm not sure how many people know this, but the Riva chipset did have a
native API in development for it. NVidia cancelled it and decided to
focus on D3D and OpenGL instead, but it may be resurrected for the TNT
chipset. It would have been interesting to see if this native API would
have addressed some of the image quality problems the Riva cards have.
irongen
>Jon wrote:
>
>> All the problems are solved. This doesn't mean there aren't new
>> ones. But your bias is just as silly as the bias some PVR promoters
>> display.
>
>And how exactly do you know that all the problems are solved? Just
>because PowerVR said they were? Only when final shipping silicon
>reaches consumers will you or anyone else be able to make that claim.
Because I know what the problems were:
1) Not all OpenGL (and DX6) alpha blending modes in hardware.
2) Used the CPU to do tile and plane setup, which is why PCX2 was VERY CPU dependent.
3) Difficult native API, DirectX incompatibility problems.
These are the 3 MAJOR problems PCX2 has. PVRSG solves all of these in
the sense that:
1) Supports all OpenGL and DX6 alpha blending modes... Well, hell, it
supports everything DX6 does.
2) Total tile and plane setup done in hardware, including translucency
sorting, which TNT and V2 don't do (probably not Banshee either... it's
just simpler with a tile arch). Setup allows up to 4 million tris/sec.
3) The native API is now infinite-plane agnostic. The hardware appears as
a normal arch to the programmer. DirectX6 solves 2D/3D overlapping
problems, and other such things you might do with normal archs but
can't with tile-based ones.
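For anyone wondering why hardware translucency sorting is a big deal: the standard "over" blend is order-dependent, so translucent surfaces drawn in the wrong order come out wrong. A minimal sketch (this is just the standard alpha blending math, nothing PVRSG-specific):

```python
# "Over" blending is order-dependent, which is why translucent
# surfaces must be sorted back-to-front before blending.

def over(src_color, src_alpha, dst_color):
    """Standard 'over' blend: src composited on top of dst."""
    return src_color * src_alpha + dst_color * (1.0 - src_alpha)

background = 0.0                 # black background (single channel)
layer1 = (1.0, 0.5)              # (intensity, alpha)
layer2 = (0.6, 0.5)

# layer1 drawn first, layer2 on top:
a = over(layer2[0], layer2[1], over(layer1[0], layer1[1], background))
# layer2 drawn first, layer1 on top:
b = over(layer1[0], layer1[1], over(layer2[0], layer2[1], background))

print(a, b)   # 0.55 vs 0.65 -> different images from the same surfaces
```

A card that sorts translucency in hardware gets this right regardless of submission order; on a card that doesn't, the game engine has to do the sorting itself.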
>What a lot of people (myself included) are saying is this: Videologic/
>NEC have a *looong* way to go before they rebuild their reputation.
Sure, I even said this already. That's irrelevant to the technical
aspects of the chips/boards.
>Add to the fact that Videologic
>outright rigged their drivers to run faster in benchmarking (and lied
>about it afterwards), and you have a company that I (and a lot of other
>people) don't take seriously.
This is a common misconception spread by 3Dfx zealots, who will never
let the idea die! Carl Muller explained this in detail before, but
here's a summary:
In DirectX2-5, some applications were incompatible with PVR's
architecture, due to the way they accessed certain things. They coded
the drivers to "do it the hard way" for some programs by their
filename. This made things go slower for those programs.
This doesn't mean they were cheating on everything else. The other
programs simply weren't as incompatible with PVR, or weren't incompatible at all.
DirectX6 solves all these potential problems.
>Will the PVRSG be faster than Voodoo2? One on one, probably. But,
>PVRSG still won't be able to match the fill rate of an SLI V2 solution
Careful. If you understand the PVRSG architecture, its "fillrate"
isn't equivalent to normal archs'. It processes only those pixels that are
visible. So if the depth complexity is 4 (say, for Quake3), then at
1024x768 at 60fps you'd need a fillrate of 189 Mpixels/sec on
V2... which is about what V2 SLI gives in raw fillrate (and remember,
Quake3 won't use dual TMUs for speed at all!).
So what about PVRSG? It only has a measly 100 Mpixels/sec. Well, it
only processes those pixels that are visible, and so, even given an extra full
screen's worth of transparent pixels, you'd only need 94 Mpixels/sec.
Yes, it's really magical like this. It simply neither renders nor
sends pixels you don't see to the framebuffer.
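Here's the arithmetic behind those figures, for anyone who wants to check it (the 4x depth complexity and the one extra screen of transparency are the assumptions above, not measured values):

```python
# Back-of-envelope check of the fill-rate figures in the post.
width, height, fps = 1024, 768, 60
visible = width * height * fps            # pixels actually seen per second
print(visible / 1e6)                      # ~47.2 Mpixels/sec

depth_complexity = 4                      # assumed average overdraw
brute_force = visible * depth_complexity  # what a z-buffered card must fill
print(brute_force / 1e6)                  # ~188.7, i.e. the ~189 figure

# A deferred renderer touches only visible pixels, plus any translucent
# layers it still must shade; assume one extra full screen of transparency:
deferred = visible * 2
print(deferred / 1e6)                     # ~94.4, i.e. the ~94 figure
```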
Anyways, you'll see. :)
>and won't be that much faster than a single card. Each will have its own
>advantages and weaknesses. The lack of a native API will probably end
>up being a drawback for the PVRSG...a lot of developers aren't too keen
>on DX, which is what the PVRSG is designed to use (as well as OpenGL).
PVR will seek exclusive games for use with their native API. With the
native API you have better access to volumetric effects, which ARE
VERY NICE.
>3Dfx may not have as many features as the PVRSG, but Voodoo2 has been
>out now since the end of February....it'll have at the least a six-month
>head start on the PowerVR card. In computer terms, that's one hell of a
>head start. Just imagine how the market might have changed if AMD had
>gotten the K6-2 out 6 months before *any* Pentium II ever hit the
>shelves....
I clearly stated that V2's major advantage (as well as Banshee's,
probably) is being out sooner.
>Besides, PVRSG has been in development for quite some time now,
>something on the order of two years or so from what I've read. I doubt
>that Voodoo2 has nearly so much development time alloted to it. If a
>company can't make a card that's top-notch after two years of
>development, then every engineer working there should be fired and never
>allowed near high-tech equipment again.
Well, then about 50 companies such as S3 should be given your desired
fate. Anyways, you'll see soon enough.
>So, let me just say this: people in this NG may have a bias towards
>3Dfx, but it is deserved, and rightfully so. Only 3Dfx has backed up
>their claims with products to match. Videologic/NEC sure spews a lot of
>crap, and thus far that's what their boards have been.
Again, people act as if there's some statistical basis for such
feelings, but we are only talking about 2 products here, not 1000. One can
hardly say much meaningful about the "thus farness" of things.
>Rendition has
>made a great chipset that's fully the equal of the first Voodoo
>chipset...too bad the first Voodoo is the baseline now, and not the top
>dog. NVidia made a fast chip, and all they had to give up in return was
>any kind of image quality. Intel and Real3D have made a good chip with
>fantastic image quality, but lackluster speed. Only 3Dfx has made the
>best balance of the two, and for that they deserve some bias/praise. I
>don't think that's unreasonable.
Of course they deserve praise, but that's meaningless to the future.
> When asked about the upcoming PVRSG and Riva TNT,
>Ballard went on to say that even if their performance is the same as
>their respective press releases state (and we all know how the TNT is
>being hyped), he said that he still wouldn't be worried...Banshee would
>be BETTER. Chew on that.....
I read the entire post by that person, and commented on it as well.
He later speaks with Scott Sellers, and in light of his comments
you'd better have a more tempered idea of Banshee's performance.
CEOs and friends are awfully zealous.
Christopher Cloud wrote in message <354C26...@earthlink.net>...
>I hadn't heard that. I knew the existing PowerVR cards had a native
>API, but the last I had heard it wasn't going to be used for PVRSG,
>since developers hadn't taken a shine to it.
This is untrue. Developers don't need to like it anyway as NEC will just
open their wallets :->
>I won't, but just so everyone is on the same page, there *are* games
>with only Glide support. Far more games are ported to Glide than any
>other API, excepting D3D. Glide is, by all accounts, just a pleasant
>system to work with.
Agreed
>D3D is a jack of all trades, master of none interface. It will get the
>job done, but it will never be the best overall solution.
Fair comment. But DirectX6 is more 'feature heavy' in a lot of areas than
Glide. DirectX6 will allow hardware support for bump mapping, for example.
I'm not saying that DirectX6 is better than native APIs. That would be
stupid. However, the difference between them is getting smaller all the
time. IMHO, some of the best-looking games at the moment are programmed in
D3D and OpenGL. Take Quake II, G-Police, Forsaken, and Incoming. I'm well aware
that these games would look better if programmed natively, but they haven't
been, and they still look amazing and are extremely smooth. I'm not sure if
I'm being clear here. I just don't think that companies will make the effort
of making many different native ports when DirectX6 will do a reasonable
job. Glide and, to some extent, PowerSGL are being used a lot at the moment,
but there will be many more major players by the end of the year. The market
won't be so clear-cut then.
>irongen
Best Regards,
Andy
>I am a little nervous (from a stockholders view) that Banshee has been in
>development so long that it may be outdated before it comes out. I really
>hope 3Dfx is on the ball with this one.
Errr... if it is postponed, doesn't it simply mean it is being
improved even further?
>As far as support goes, all of its features are fully supported under DirectX6.
But whether NEC has implemented them the same way as they will be
implemented in DX6 remains to be seen. Sure, PCX2 supported
"transparency", but not the same way most Direct3D games wanted.
>The reason WHY it can be so fast and cheap is becuase it eliminates
>the zbuffer...It doesn't need it.
How much does the z-buffer memory cost again? No, it does not add an
extra $100-$200 to the price. Why are you hyping up a vaporware
product about which you know nothing?
>It's method of rendering allows for
>higher quality pixels since you don't write pixels that you don't
>see(i.e. you can spend time making prettier pixels than drawing lots
>of ugly one's).
This method got PCX2 into lots of trouble. It made it very slow and
degraded its visual quality, and it had major problems running
several Direct3D titles, etc. etc. Sure, PVRNG will not have any of
these problems... AS IF you knew.
You must be one of those losers who kept claiming PCX1 and PCX2 would
kick 3Dfx's butt. How many times will it take for you to realize you
know nothing?
>Yeah, there was a butt-load of hype around the first generation. This
>time NEC/VDO aren't hyping their product too early...
You must be kidding. It's all the same bs all over again.
>Just happens the followers are doing it for them.
You must be one of them.
>The problem with PCX1 was mostly no bilinear filtering. With PCX2 it was
>incomplete alpha blending modes. And in general it was very CPU
>limited due to ALL setup being done by the CPU.
And it couldn't run lots of popular Direct3D titles successfully.
Enough said.
>PVRSG fixes all of that. All setup is done on-chip, quality and
>features are totally DX6 and OpenGL compliant.
You don't know that. All you know is what NEC is lying to you, and you
have already bought it.
>We'll see what happens.
Funny thing for you to say, since you already claim to know how good
PVRNG is.
>>I'll wait until I see both cards in action, but I predict that the PVR2
>>will compare to the Voodoo2 the same way the PowerVR compared to
>>Voodoo1. If I'm wrong, I'll sell my V2 cards and pick up a PVR2.
>
>You'll be wrong, but difference is timing. V2 will be out 7-8 months
>by the time PVRSG starts pumping out their 2d/3d mid-range.
What did you just say? Was it "we'll see what happens"?
Why are you contradicting yourself?
>Well, Voodoo Rush was a flop. We'll see how things fall in place, if
>at all.
Ah, and again. You just keep going back and forth with hyping up
vaporware and then saying "we'll see".
> >Add to that the fact that Videologic
> >outright rigged their drivers to run faster in benchmarking (and lied
> >about it afterwards), and you have a company that I (and a lot of other
> >people) don't take seriously.
>
> This is a common misconception spread by 3Dfx zealots, who will never
> let the idea die!
Sorry, but this is no misconception. Users reported that when various
D3D .exe programs were renamed D3Dtest.exe, the images would be screwed
up....in effect, Videologic had a special driver configuration
specifically for D3Dtest.exe, which was one of the more popular
benchmarks used at that time. When pressed about the issue, Videologic
at first denied it, but finally admitted that this was done to show a
true reflection of the card's capabilities, and that future driver
releases would yield this performance without any driver modifications.
I don't make this stuff up. This came from the Jon Peddie Report, so I
tend to give it just a little bit of weight and credit for accuracy.
;-) To me, this is lying. This is part of the reason that I am
extremely wary of Videologic's products. They're going to have to blow
everyone's socks off before I'd consider buying from them, and I don't
think that PVRSG is going to beat my SLI V2 rig, or even a single V2
card. Just my opinion, mind you.
>
> >and won't be that much faster than a single card. Each will have its own
> >advantages and weaknesses. The lack of a native API will probably end
> >up being a drawback for the PVRSG...a lot of developers aren't too keen
> >on DX, which is what the PVRSG is designed to use (as well as OpenGL).
>
> PVR will seek exclusive games for use with their native API. With the
> native API you have better access to volumetric effects, which ARE
> VERY NICE.
>
I'm sure they will be. After all, who's going to make an API and then
say, "Here you go...the visual effects in our API are no better than
D3D!" ;-)
> >Besides, PVRSG has been in development for quite some time now,
> >something on the order of two years or so from what I've read. I doubt
> >that Voodoo2 has nearly so much development time allotted to it. If a
> >company can't make a card that's top-notch after two years of
> >development, then every engineer working there should be fired and never
> >allowed near high-tech equipment again.
>
> Well, then about 50 companies such as S3 should be given your desired
> fate. Anyways, you'll see soon enough.
>
You mean S3 is still around? Wow. I thought that they *were* all fired
for foisting the Virge upon the unsuspecting public.
> >So, let me just say this: people in this NG may have a bias towards
> >3Dfx, but it is deserved, and rightfully so. Only 3Dfx has backed up
> >their claims with products to match. Videologic/NEC sure spews a lot
> >of crap, and thus far that's what their boards have been.
>
> Again, people act as if there's some statistical basis for such
> feelings, but we are only talking about 2 products here, not 1000. One can
> hardly say much meaningful about the "thus farness" of things.
>
Two products is still giving you a place to start from, and based on
what's been available PowerVR hasn't been able to do the job on 3Dfx.
> >Rendition has
> >made a great chipset that's fully the equal of the first Voodoo
> >chipset...too bad the first Voodoo is the baseline now, and not the top
> >dog. NVidia made a fast chip, and all they had to give up in return was
> >any kind of image quality. Intel and Real3D have made a good chip with
> >fantastic image quality, but lackluster speed. Only 3Dfx has made the
> >best balance of the two, and for that they deserve some bias/praise. I
> >don't think that's unreasonable.
>
> Of course they deserve praise, but that's meaningless to the future.
Agreed. But your statement prior to this one made it sound as though
you felt 3Dfx didn't even deserve any praise in the here and now, which
is just untrue.
> > When asked about the upcoming PVRSG and Riva TNT,
> >Ballard went on to say that even if their performance is the same as
> >their respective press releases state (and we all know how the TNT is
> >being hyped), he said that he still wouldn't be worried...Banshee
> >would be BETTER. Chew on that.....
>
> I read the entire post by that person, and commented on it as well.
> He laters speaks with Scott Sellers, and in light of his comments
> you'd better have a more tempered idea of Banshee's performance.
> CEO's and friends are awefully zeolous.
Agreed here, as well. But, I have a much easier time believing that
3Dfx will make a revolutionary product than either Videologic or NVidia
will. Their prior experience in producing excellent products speaks for
itself. Still, we'll just have to see what Banshee does when it comes
out. Just to note, though, I've heard that some vendors have seen
reference boards of the Banshee in action, and the buzz has been
extremely positive. Take that for what it's worth (which, admittedly,
isn't very much at this point in the game).
irongen
I'd like to add that I have had both the Voodoo 1 and PowerVR PCX2 in my
previous system (P233MMX).
Quite honestly, the Voodoo 1 blew away the PVR with Quake. But when I ran
Tomb Raider at 1024*768 on the PowerVR card it was amazing. The colour of
the water was a true blue, and the fogging effects under water were massively
superior to the 3dfx version. Ultimate Race was bloody gorgeous (800*600), as
was MechWarrior 2 (640*480).
The PowerVR with these 3 games wasted the 3dfx.
My point is, I agree it was not a 3dfx killer, but if it could have run all
games like this there would have been no contest. It is also of note that I
picked up this card for £80 with a great bundle (Tomb Raider 1, MechWarrior 2,
Ultimate Race, Wipeout, and Terracide).
Just think for a second: everyone is talking about SLI with Voodoo 2 to get
1024*768, but PowerVR had it in its first chip. The fact that NEC has a
partnership with Microsoft re: DirectX6 makes the PVRSG an exciting piece of
hardware to look out for.
rgds physman.
>It's true that there are still a few glide games around but these are quite often also released
>in powerSGL format - F1RS, FIFA 98, everything by activision (nearly).
I may be wrong, but I believe that F1RS is only done in Glide and
Direct3D - it wasn't ported to PowerSGL.
Marc.
In other words... I want the BEST product for the money, and if NEC makes it,
fine; if 3dfx makes it, fine. Making wild claims (true or not) coincidentally
the same month as your competitor releases his new product (V2), when yours is
still months away, smells, and you have to admit it is suspect at best.
Show me the money.... show me the product!
-JB
Jon wrote:
> On Sat, 02 May 1998 21:50:28 -0500, jrb531
> <"jrb531[remove-me]"@interaccess.com> wrote:
>
> >But they said the same thing about the pcx2 and it looks like crap compared
> >to the v1. Now they say that they are using the same method but this time
> >(unlike last time) it will look better? Hmmm prove it to me.... I don't
> >believe them. -JB
>
> Method=?
>
All joking aside, the thing that worries me about PVRSG is that other people
have seen it (other than NEC/Videologic). Now those people could have been
duped, but I think PVRSG may be pretty decent this time around.
We will see.
Tom
>jcmc...@uiuc.edu (Jon) wrote:
>
>>The reason WHY it can be so fast and cheap is because it eliminates
>>the zbuffer...It doesn't need it.
>
>How much does the z-buffer memory cost again? No, it does not add an
>extra $100-$200 to the price. Why are you hyping up a vaporware
>product about which you know nothing?
Nothing? I know quite a bit, indeed.
z-buffer memory doesn't cost much in and of itself... But by
eliminating the z-buffer using a tile architecture, you eliminate the
need for a lot of bandwidth. You don't need 1000-bit interfaces, just
64-bit. This is where the cost is an issue.
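Here's a minimal sketch of the tile idea (mine, not PowerVR's actual hardware; the tile size, the triangle representation, and the `shade` callback are all made up for illustration). The point is that the depth test happens in a tiny on-chip tile buffer, so there's no external z-buffer traffic, and each screen pixel gets textured at most once:

```python
# Hypothetical sketch of tile-based deferred rendering. Depth resolution
# for one tile happens entirely in a small on-chip buffer; texturing
# ("shading") only runs on the pixels that survive, so hidden pixels
# cost no texture bandwidth at all.

TILE = 32  # tile edge in pixels; purely illustrative

def render_tile(tile_tris, shade):
    """Resolve visibility for one tile on chip, then shade the winners."""
    # On-chip depth and winner buffers, sized for just this tile.
    depth = [[float("inf")] * TILE for _ in range(TILE)]
    winner = [[None] * TILE for _ in range(TILE)]
    # Pass 1: hidden-surface removal, no texturing yet.
    for tri in tile_tris:
        for x, y, z in tri["pixels"]:   # pixels this triangle covers
            if z < depth[y][x]:         # nearer than what's stored?
                depth[y][x] = z
                winner[y][x] = tri
    # Pass 2: texture/shade only the visible triangle at each pixel.
    out = [[None] * TILE for _ in range(TILE)]
    for y in range(TILE):
        for x in range(TILE):
            if winner[y][x] is not None:
                out[y][x] = shade(winner[y][x], x, y)
    return out
```

Overlap two triangles and only the nearer one gets shaded at the shared pixel; that's the "no wasted pixels" claim in a nutshell.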
I'm not hyping up a vaporware product, I'm speaking of the facts. You,
on the other hand, are a zealot incapable of dealing with facts,
suspicious of anything that is contrary to your own personal feelings.
>>Its method of rendering allows for
>>higher quality pixels since you don't write pixels that you don't
>>see (i.e. you can spend time making prettier pixels rather than drawing lots
>>of ugly ones).
>
>This method got PCX2 into lots of trouble. It made it very slow,
>degraded its visual quality, it had major problems with running
>several Direct3D titles, etc. etc. Sure, PVRNG will not have any of
>these problems... AS IF you knew.
I see you aren't very keen on this subject, so I'll try to be kind,
even though you are being an ass.
Deferred texturing never got PCX2 in trouble; all that got PCX2 in
trouble are the things I listed in another message. Go read.
AS IF I know? I do.
>You must be one of those losers who kept claiming PCX1 and PCX2 will
>kick 3Dfx's butt. How many times it will take for you to realize you
>know nothing?
Ah, I see you are TRULY a 3Dfx zealot. You are full of emotions, but
know about as much as a rat.
No, I knew Voodoo1 would kick PCX2's butt. I've had a Voodoo1 (2
different kinds) since they existed... Knew about Voodoo a year before
that. Have a V2SLI setup... Who the hell are you to judge?
Ah, talking to you is worthless, why do I do it...
>>Yeah, there was a butt-load of hype around the first generation. This
>>time NEC/VDO aren't hyping their product too early...
>
>You must be kidding. It's all the same bs all over again.
Your emotional attachments are entertaining, but have no place in this
discussion.
>>Just happens the followers are doing it for them.
>
>You must be one of them.
Oooooh. Got me there! You are so quick and bad! Riiiight.
>>Problems with PCX1 were mostly no bilinear filtering. PCX2's were
>>incomplete alpha blending modes. And in general it was very CPU
>>limited due to ALL setup being done by the CPU.
>
>And it couldn't run lots of popular Direct3D titles successfully.
>Enough said.
Enough said? Wow, very powerful you are. DX6 solves all these
problems. Enough said. (cute, isn't it)
>>PVRSG fixes all of that. All setup is done on-chip, quality and
>>features are totally DX6 and OpenGL compliant.
>
>You don't know that. All you know is what NEC is lying to you, and you
>have already bought it.
Nope, all I know are facts. Your inability to deal with facts is
interesting, but a matter of psychosis, not 3D hardware discussion.
>
>>We'll see what happens.
>
>Funny thing for you to say, since you already claim to know how good
>PVRNG is.
DUH, it's a comment letting (you) know you should not pass judgement
until you see for yourself.
However, YOU simply pass judgement all OVER the place...On ME, on
chips you know obviously very little about, etc. Again, you should
keep your emotional responses to yourself. They make you look very
stupid.
>>>I'll wait until I see both cards in action, but I predict that the PVR2
>>>will compare to the Voodoo2 the same way the PowerVR compared to
>>>Voodoo1. If I'm wrong, I'll sell my V2 cards and pick up a PVR2.
>>
>>You'll be wrong, but the difference is timing. V2 will be out 7-8 months
>>by the time PVRSG starts pumping out their 2d/3d mid-range.
>
>What did you just say? Was it "we'll see what happens"?
>Why are you contradicting yourself?
You must not be familiar with these expressions. It's ok, I
understand some foreigners don't know these things.
>>Well, Voodoo Rush was a flop. We'll see how things fall in place, if
>>at all.
>
>Ah, and again. You just keep going back and forth with hyping up
>vaporware and then saying "we'll see".
Oh, such a powerful argument... I've changed my ways and have aligned
myself with your awesome intellect. HEAR, HEAR!
>Jon wrote:
>>
>> On Sun, 03 May 1998 03:07:21 -0400, Christopher Cloud
>> <iro...@earthlink.net> wrote:
>>
>> >Jon wrote:
>>
>> > What a lot of people (myself included) are saying is this:
>> >Videologic/NEC have a *looong* way to go before they rebuild their
>> >reputation.
>>
>> Sure, I even said this already. That's irrelevant to the technical
>> aspects of the chips/boards.
>>
> No, it's not. The first two PowerVR cards didn't even meet their
>press release's design specs (fillrate, polygon throughput, etc), so
>until a shipping product comes to the market, no one should assume that
>the PVRSG will meet Videologic's claims this time around, either. Hype
>doesn't put pictures on your monitor. Everyone is accepting what
>Videologic has said thus far as gospel. Until we all see the board, the
>numbers thrown around are irrelevant.
I'm not talking about the numbers they are throwing around, I'm saying
that whatever their reputation is, that's irrelevant. The reputation
itself determines nothing about the technical aspects of the card.
One is emotional, the other is technical.
No one is accepting what Videologic claims about their performance or
quality... But this doesn't change HOW PVRSG works, or its technical
specifications.
But beyond that, several people have seen PVRSG, many of them active
3Dfx supporters (Alex Sharkey, Tom Pabst, etc.). These people don't go
ape over a demonstration just because they got free buttons. They
were impressed. This is enough to make me interested... Not being
interested means you have an emotional attachment to being against
VDO/NEC, which is fine, but I don't care.
>> >Add to the fact that Videologic
>> >outright rigged their drivers to run faster in benchmarking (and lied
>> >about it afterwards), and you have a company that I (and a lot of
>> >other people) don't take seriously.
>>
>> This is a common misconception spread by 3Dfx zealots, who will never
>> let the idea die!
>
>Sorry, but this is no misconception. Users reported that when various
>D3D .exe programs were renamed D3Dtest.exe, the images would be screwed
>up....in effect, Videologic had a special driver configuration
>specifically for D3Dtest.exe, which was one of the more popular
>benchmarks used at that time. When pressed about the issue, Videologic
>at first denied it, but finally admitted that this was done to show a
>true reflection of the card's capabilities, and that future driver
>releases would yield this performance without any driver modifications.
>
>I don't make this stuff up. This came from the Jon Peddie report, so I
>tend to give it just a little bit of weight and credit for accuracy.
>;-) To me, this is lying. This is part of the reason that I am
>extremely wary of Videologic's products. They're going to have to blow
>everything's socks off before I'd consider buying from them, and I don't
>think that PVRSG is going to beat my SLI V2 rig, or even a single V2
>card. Just my opinion, mind you.
Interesting how you deleted my own explanation, which is identical to
your own. I'm fully aware of the details.
They modified their drivers to AVOID normal functioning with programs
that weren't compatible with their hardware. This just makes sense to
me. I initially was skeptical, but it makes total sense. That's why
I say zealots won't let this die...They heard initial HYPE and now
they won't let go....They've put too much energy into negative
identifications with VDO/NEC.
>Agreed here, as well. But, I have a much easier time believing that
>3Dfx will make a revolutionary product than either Videologic or NVidia
>will. Their prior experience in producing excellent products speaks for
>itself.
Sure, but I've been keeping track of the technology. There are
indications (Tom's reactions, etc.) that show the product is likely to
be successful. That doesn't mean I'm not skeptical. And this discussion
started as a technical discussion; why it was reduced to emotional states is
probably due to people's lack of technical knowledge about PVRSG... All
they know is PCX2 sucks... VDO sucks... Blabla.
> Still, we'll just have to see what Banshee does when it comes
>out. Just to note, though, I've heard that some vendors have seen
>reference boards in action of the Banshee, and the buzz has been
>extremely positive. Take that for what it's worth (which, admittedly,
>isn't very much at this point in the game).
It means more than if you replace Banshee with PVRSG, but when Tom
Pabst (of www.tomshardware.com) and Alex Sharky (of
www.voodooextreme.com) talk about how badass PVRSG is... it makes you
go... HMM.
On Sun, 3 May 1998 17:49:16 +0100, "physman" <afs...@email.msn.com>
wrote:
> Interesting how you deleted my own explanation, which is identical to
> your own. I'm fully aware of the details.
>
> They modified their drivers to AVOID normal functioning with programs
> that weren't compatible with their hardware. This just makes sense to
> me. I initially was skeptical, but it makes total sense. That's why
> I say zealots won't let this die...They heard initial HYPE and now
> they won't let go....They've put too much energy into negative
> identifications with VDO/NEC.
>
>
This is where you and I differ as to what was actually done. You say
you understand why Videologic did what was done, and I still say they
deceived the public.
Videologic modified their drivers so that certain programs would behave
differently, to allow them to work with the PVR architecture. I don't
dispute that. BUT, when the drivers were modified to allow D3Dtest.exe
to function properly, it resulted in inaccurate benchmarks that
indicated the card was more capable than it truly was. Videologic then
attempted to justify this action, and even attempted to use the inflated
results of such benchmarks in their advertising campaign. Now, it would
have been different if they had come out and said "Look, we know that
the benchmarks were distorted by some changes we had to make to the
drivers. The next driver release will indicate true performance." I,
and many others, could have respected that. But that's not what they
did, and by (attempting) to use the benchmark results in advertising,
that's lying. They purposefully used false information in an attempt to
make the public buy their product. I say again, that's lying, pure and
simple. There's nothing else you can call it, no matter how hard you
think about it. And that's why I'll wait to see what the PVRSG is like
before I make any decisions about it one way or the other. Videologic
managed with that one act to totally destroy their reputation
with a great many potential customers, myself included. They have a
*lot* of work ahead of them to catch up to Voodoo2, let alone overtake
it.
Will PVRSG be a good card? I don't doubt that it will. I don't think
Videologic is so incompetent as to make three disappointing chipsets in
a row. S3 and ATI, yes, I seriously doubt their ability to make a
decent 3D chipset, but not Videologic. They have too many resources
available to them for them to screw up completely.
We'll just have to wait until the boards start shipping though, yes?
Even if Sharky Ross and Tom saw working hardware, they didn't see FINAL
silicon, and things can change from Beta to final silicon. If PVRSG
does everything Videologic claims it can, fine. It will raise the bar
such that Voodoo3 will *have* to be absolutely amazing for 3Dfx to stay
in the technology race. If not, then I'll sleep even better at night
knowing that my investment in Voodoo2 was worth every penny.
irongen
It may very well be that they will keep their promises "this" time, but past performance is a good measure of the future, and it should not be dismissed as
being "emotional" or biased.
Tom hates Intel and proclaimed that the K6 233MHz beats out the Intel 200MHz and that the K6 is a Pentium killer. The K6, while a good chip, is not a
"Pentium killer"; it performs equal (at the same speed, hey Tom!) or better in some areas and is now about the same price.
Now Tom (who has loved 3dfx boards like everyone else) suddenly praises the new NEC chip, which is months away from production, as the second coming,
and puts up very suspicious numbers comparing the 3dfx V2 with an unfinished board, and now states that this new NEC board has better graphics and
features! What is his motivation?
I think this is it:
How did Tom get one of the few demo NEC boards? How does Tom get most of his test products? Does he buy them and perform independent tests like a Consumer
Reports? Does he allow ads from the same products he reviews? Don't get me wrong... Tom does not have to do anything, and we do not have to read his page,
but some of these questions, or the answers to them, do make some of his reviews or statements suspect.
If he bashes any company or product he will not get any more demo hardware from them. If he gives a good review, will the company "reward" him? According
to Tom... people are just dropping hundreds of dollars in products in the mail to him daily... motherboards, CPUs, video cards... you name it. The only
product or company I have seen Tom really bash was Intel, after they tried to shut him down (his words), and right after this event he proclaims the K6 233
to be an Intel Pentium 200 killer and says there is no reason for anyone to ever buy another Intel chip.
Sure, he's quiet about the PII line as there are no alternative chips, but as soon as there are... guess what he will say?
Conclusion:
I don't trust NEC based on past lies NOT emotion.
I don't trust Tom based on past "suspicious conclusions" NOT emotion.
Your past reputation does matter (as much as these companies want you to forget) and is one way to keep companies somewhat honest. Just ask anyone who has
been burned by a company how quick they are to buy that company's next product, no matter how good it may be.
Look at Activision! I (and others) will NEVER buy another product from them after being burned and lied to for so long with product after product. This
new Battlezone is supposed to be great, but I will not buy it. Is this emotion? Maybe, but what checks do we have on any company if they can do anything
they want with no recourse?
-JB
>We'll just have to wait until the boards start shipping though, yes?
>Even if Sharky Ross and Tom saw working hardware, they didn't see FINAL
>silicon, and things can change from Beta to final silicon. If PVRSG
>does everything Videologic claims it can, fine. It will raise the bar
>such that Voodoo3 will *have* to be absolutely amazing for 3Dfx to stay
>in the technology race. If not, then I'll sleep even better at night
>knowing that my investment in Voodoo2 was worth every penny.
The shown silicon was 3D-only, with no hardware tile accelerator, no
hardware translucency accelerator, no AA, PCI slave mode, running at
66MHz. This means all setup was done on the CPU. The setup shown had a
P2-266 CPU.
Just think: even as this, they were "amazed" at the Quake2 benchmarks.
Beta boards will go out in June, if all goes well, shipping in August,
Volume in October.
Whether or not PVRSG is more bad-ass than V2...I'm perfectly happy
with my V2SLI, because I'm playing games NOW at 1024x768.
August->October is a long time in the 3D market!
Cheers,
Jose Moura
Regards, Juardis T.
Jon <jcmc...@uiuc.edu> wrote in article
<3550e1e5...@news.3dfx.com>...
HAVE FUN
FWIW I'm playing BZ on a P133 with a V1 card, with every single video option
on, cd-music, 640x480, etc.. it rocks. No slow downs like the Mercs engine.
And if you don't believe me, d/l the demo, or get your pirate copy from
where ever pirates get that stuff from.
--
Richard '[EXS]Slowtreme' Celeste
exxt...@warzone.com
Making Quake2 players extremely good
http://exxtreme.warzone.com
You have the right to remain silent. Anything you say will
be misquoted, then used against you.
-
Please send much spam to ki2...@worldnet.att.net
jrb531 <"jrb531[remove-me]"@interaccess.com> wrote in message
<354D4C64...@interaccess.com>...
You call it emotion... I call it principle. After the mercs disaster I
will never buy another Activision product. I am not alone on this
either. Activision lied to us and when we proved it they said so what. I
don't care how good battlezone is.... they will not get any of my
money. -JB
jrb531[remove-me]@interaccess.com wrote in article
<354E04...@interaccess.com>...
> You call it emotion... I call it principle. After the mercs disaster I
> will never buy another Activision product. I am not alone on this
> either. Activision lied to us and when we proved it they said so what. I
> don't care how good battlezone is.... they will not get any of my
> money. -JB
I couldn't agree more. Activision has lied to, stolen from, and cheated customers
more than any other company I can think of. I haven't bought an Activision
product since Mechwarrior II, when they screwed users on the Netmech patch.
Yes, it was finally released, but not until they'd already released an
addon that included it, which you had to buy to use it. Then, 3-4 months
later, they released the Netmech patch for the original MWII. Haven't
bought anything with Activision's name on it since then.
Oh God, can't I have a little fun? I post 1 humorous post out of 20
informative posts, and I get slammed.
I have V2SLI, I'm perfectly happy with it, the purchase of them has
already paid off, so PVRSG is irrelevant to that idea (unlike some
others who feel their purchase is threatened and so are overwhelmingly
biased).
OF COURSE I expect 3Dfx zealots here, but I'll attempt to show them
they shouldn't be so secure in their zealotry.
And next time, reply to an informative post, posting an informative
reply, instead of replying to a "troll" message, acting as Mr. Troll
watcher! :)
Sure, the patches were placed on the net, but who ever said that being on the
internet, or even owning a modem, was a condition for getting support for a
defective product? They will not accept returns like other software companies,
and if you call them they will not even send a patched CD!
I wonder how many people bought mercs 1.0, could not get it to work, could not
return it (most stores still do not allow returns), could not return it to
activision, could not get a patch from activision, and just put their $50 game
on the shelf to rot?
I'm sure these people are lining up to buy Battlezone! -JB
Anyway, I don't want you to lose out on one of the best games of the year.
If you are so hot on getting back at Activision, here you go:
alt.binaries.games had the full game posted on the NG.
FWIW the only thing that I'm peeved at is that after d/l'ing the Mercs 3D patch,
which played horribly, I went out and bought the 3Dfx version, thinking that
it really was 3Dfx (Glide) not D3D. So I bought the game twice. But this was
_MY_ mistake, not Activision's. And I sold off the old copy to someone with no
3Dfx card.
--
Richard '[EXS]Slowtreme' Celeste
exxt...@warzone.com
Making Quake2 players extremely good
http://exxtreme.warzone.com
You have the right to remain silent. Anything you say will
be misquoted, then used against you.
-
Please send much spam to ki2...@worldnet.att.net
>> jrb531 <"jrb531[remove-me]"@interaccess.com> wrote in message
>> >new Battlezone is supposed to be great but I will not buy it. Is this
>> >emotion....
>Slowtreme wrote:
>> BattleZone is awesome if you like anything from the Mechwarrior series and
>> any real-time strategy game. This time your emotion is causing you to lose
>> sight of the goal.
>
Juardis T.
<snip>