I'll make these points short and crisp so hopefully nobody will miss them.
We've worked with id for a long time on DirectX. Much of the architecture's
early design and philosophy was driven directly by input I got from John
almost three years ago now. Naturally it's disappointing to hear that one
of our earliest and most valued customers is dissatisfied, and we're going
to attempt to rectify that.
The vast bulk of John's complaint appears to reduce to the following:
"Direct3D is too hard to use/figure out." Yup, he's right. We've done a
woeful job of providing sufficiently robust documentation and sample code
of Direct3D Immediate mode. If one of the brightest minds in real time 3D
graphics can't figure it out, who can? Clearly we've fallen over on that
one. Many of the Direct3D titles shipping today, or shipping shortly have
been developed with assistance from our developers and our porting lab.
Clearly the API is not simple enough or well documented enough to lend
itself to easy learning and adoption, for which we apologize, and I won't
bore anybody with excuses. We've been aware of this shortcoming for a
while, and have a whole slew of new sample code, and documentation in the
works. In the meantime, id and any companies with major titles in
development have always had an open invitation to our porting lab, or to
talk to our engineers if they need to.
We're very comfortable that the Direct3D API can be strongly defended on
its technical merits to anyone who understands it and its design
philosophy. Direct3D's number one purpose in life is to provide a standard
driver model for consumer oriented 3D hardware acceleration. It is not
designed to be the ultimate software rendering engine; John will always
beat us. Its first design goal was to provide thin, fast access to
hardware acceleration. That's the number one takeaway I got from John in
those early meetings: "get your fat APIs out of my way and let me at the
iron". Direct3D absolutely does this; it may not be pretty, but it does
it. It has drivers for every major 3D hardware accelerator shipping,
OpenGL does not, and I'm pretty sure that by the year 2000, everybody is
going to have 3D hardware and John is going to want to be using that iron.
That hardware isn't going to look anything like an SGI pipeline, and OGL is
not likely to evolve fast enough as a real-time gaming API to meet John's
needs over time.
We're going to add the simpler 3D primitive APIs such as draw triangle to
D3D IM in the next release to make it easier for people doing quick ports
or evaluations to get up and running without the fuss, with the
understanding that these primitive APIs, though easier to use, may not
provide the versatility and fastest possible performance that tweaking D3D
execute buffers will. I'll elucidate further on this point in the random
part of this letter.
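The trade-off between a simple draw-triangle call and execute buffers can be sketched in miniature. Everything below is invented for illustration (the opcodes, the buffer layout, the `CountingDevice` stand-in); it is not the actual Direct3D structures, just a toy model of the difference between packing a command stream that the runtime must later decode and calling a draw routine directly.

```python
import struct

# Hypothetical opcodes for an execute-buffer-style command stream.
OP_TRIANGLE = 1
OP_END = 0

def build_execute_buffer(triangles):
    """Pack 2D triangles into a binary command stream: an opcode word
    followed by three (x, y) float pairs per triangle."""
    buf = b""
    for tri in triangles:
        buf += struct.pack("<I", OP_TRIANGLE)
        for (x, y) in tri:
            buf += struct.pack("<ff", x, y)
    return buf + struct.pack("<I", OP_END)

def execute(buf, device):
    """Decode the stream opcode by opcode before anything reaches the
    'hardware' -- the extra indirection the execute-buffer model adds."""
    off = 0
    while True:
        (op,) = struct.unpack_from("<I", buf, off)
        off += 4
        if op == OP_END:
            break
        if op == OP_TRIANGLE:
            verts = [struct.unpack_from("<ff", buf, off + 8 * i) for i in range(3)]
            off += 24
            device.draw_triangle(*verts)

class CountingDevice:
    """Stand-in for hardware: just counts triangles submitted."""
    def __init__(self):
        self.triangles = 0
    def draw_triangle(self, a, b, c):
        # A draw-triangle primitive API is this call, made directly.
        self.triangles += 1
```

The direct call and the decoded stream end up at the same `draw_triangle`; the argument is about how much machinery sits in between.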
Okay, those are the important points. The points I'm going to make next
fall more squarely into the category of opinion. I gather that some
might find my point of view on the subject of OGL vs. D3D interesting so
I'll make some further points on that subject and attempt to substantiate
them without getting into a "bit head debate".
Don't let the fact that they have the term "3D" in common confuse you. OGL
and D3D are designed for very different purposes. Forks and Spoons are
both silverware but it hardly makes sense to say that you don't need forks
because they're lousy for eating soup, or that spoons suck because you
can't eat spaghetti with them. Saying you have to always eat with a spoon,
or that you're only going to eat soup because you can't figure out how to
use a fork is a fair point, but kind of pitiful and should be fixed.
Microsoft aggressively supports D3D and OGL as 3D API standards without
playing favorites. We see no contradiction in doing so because the APIs
are designed to satisfy the needs of very different markets. Direct3D is a
real-time consumer multimedia API designed to support very fast access to a
diverse range of new hardware accelerators. OGL is an industrial strength
3D API designed to serve the needs of high end CAD, modeling, and
scientific visualization. It has no driver model because it was designed
to support a proprietary pixel accurate rendering pipeline that most
consumers won't find in a 200 dollar accelerator any day soon. Nor is
there much evidence that it's the kind of hardware consumers would want
anyway. Sure there may be some OGL drivers made for consumer iron, but
they're not likely to make as good a use of it for games as D3D is. D3D is
about direct access to a broad array of consumer 3D acceleration.
It's going to get very weird out there for 3D hardware. The problem of
making 3D fast is unsolved. I think we can all agree that it will be a
while before we see real-time Toy Story graphics rendered on our desktops.
There are dozens of hardware companies out there trying all kinds of exotic
technologies and formulations to find a way to squeeze a few more megapixels
out of a 50 dollar piece of silicon. Microsoft certainly doesn't want to
stand in the way of their progress, yet at the same time we don't want the
PC platform to get so fragmented with divergent 3D hardware designs that
nothing works properly. (If you think 2D video card driver problems are a
PITA, wait until you see what the 3D hardware does.) Thus the need for a
standard driver architecture that is flexible enough to accommodate
innovation, yet rigid enough to actually be a standard that gets used.
Think it's easy? Here are a few thought problems for you bit heads out
there to solve:
How does your real-time game balance its processing needs across the bus
to consistently get the fastest possible performance on hardware that may
vary in one or more of the following ways?
1. It has a 3D hardware accelerator which may or may not be slower than
the CPU at some 3D rendering tasks.
2. Some hardware goes 10x faster if you re-author your game to support
weird feature X. You have to decide to support that feature 18 months
before you know how popular that hardware accelerator really is.
3. Next year new PCs will ship with buses that go much faster than the
current hardware; there will be some new 3D hardware that takes advantage
of this, and a lot that doesn't. The equation for determining which
resources to offload from the CPU to the hardware to get the best
performance changes completely, so now how do you write your game?
By the way, the OS and OS version may also vary, and you don't know
exactly how and when it's taking CPU cycles from you.
Driver architecture "A" is very fast for 3D hardware shipping this Xmas,
but very slow for hardware shipping next Xmas. Architecture "B" will slow
down this Xmas's titles. If your architecture slows down this Xmas's games
nobody will use your API and it won't be a standard, if you slow down next
Xmas's games your API will be obsolete then, and everybody's new games will
have to write to something different. If they have to write to something
different you still don't have a standard, and you're faced with huge
driver upgrades and incompatibility problems. If you support both "A" and
"B" your architecture is too complicated, ships 9 months later, and nobody
can figure out how to use it, or write a proper driver for it. Which
architecture do you choose? There is no "C".
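One pragmatic answer to the load-balancing question, whatever the API, is to measure rather than guess: profile each available path on the user's actual machine and pick the cheapest. The cost functions below are stand-ins with made-up constants in place of real timings; only the decision procedure itself is the point.

```python
def software_cost(polygons):
    # Stand-in CPU path: no setup cost, but each polygon is expensive.
    return polygons * 20.0          # microseconds, made-up numbers

def hardware_cost(polygons):
    # Stand-in accelerator path: per-frame setup cost, cheap per polygon.
    return 500.0 + polygons * 2.0   # microseconds, made-up numbers

def choose_path(polygons, paths):
    """Pick the cheapest rendering path for THIS scene on THIS machine.
    In a real engine the costs would come from timing trial frames, not
    from fixed formulas."""
    return min(paths, key=lambda name: paths[name](polygons))
```

With these numbers a tiny scene favors the CPU (the accelerator's setup cost dominates) while a large one favors the accelerator, which is exactly the kind of crossover the thought problem is about.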
Are these questions anybody asked in designing OGL?
Does stripping down, or speeding up OGL a bit and touting it as a consumer
3D API demonstrate much comprehension of these issues?
How much relevant experience can anybody claim to have at solving these
problems? Does all the spoon experience out there translate directly to
chopsticks mastery? The consumer 3D HW market is so new that no real
experts exist yet, nobody knows how it's really going to evolve.
I dunno, but if you think you've got all the answers then I've got a few
questions I'm dying to put to you. In the meantime, keep an eye out for the
next release of Direct3D which will include the new draw triangle API, new
sample code and documentation. I encourage John and everybody else out
there to take another look at the next release, and see if it all doesn't
start to make more sense.
--
The opinions expressed in this message are my own personal views
and do not reflect the official views of Microsoft Corporation
Here's my two cents worth.
I love the idea of DirectX. Having someone provide me access to the metal
through a common, but very thin, layer of code makes my life theoretically
a lot easier than having to code to every damned piece of hardware that
comes down the pike.
And I want to use DirectX, so I can write hardware-accelerated real-time
rendering engines which allow me to use metal that I may not even have
heard of when I started coding my engine.
I just hate exactly one thing, and you've captured it right there.
I hate the documentation.
I hate the documentation that came with DirectDraw; I can't figure out
what the individual routines are supposed to do, or what the overall
gestalt of DirectDraw is supposed to be. Yeah, I know it allows me
'access to the video hardware', but between the clip lists and the
discussions of the HAL versus the HEL, I can't figure out which calls
I need to make or why I need to make them.
I hate the documentation that came with DirectSound. I can't easily
at a glance figure out which COM object I need, nor can I figure out
why I need it. The overall gestalt is also missing from the discussion;
yeah, there are all these calls that do useful things, but when do I
need them, what do I need them for, and what do I need to do before
I make the call?
The same also goes for Direct3D, DirectPlay, and DirectInput--the
documentation gives this general overview of why it's cool to use
that particular module, makes some vague noises about how "I should
initialize the widget before modularizing the dohicky", and then
gives an alphabetical list of a few dozen or a few hundred calls.
Bleeech.
Please, when you rewrite the documentation, treat me and other end-audience
users like total morons. Like idiots. Tell me what I need to call, and
*why* I need to call it. Introduce me to terms like Surface and Sound
Buffer and the like: treat me like I've never heard of these terms
before. (Because chances are, even the most expert reader in your
audience will have areas of knowledge they are lacking--even if it's
just how you are using the term.)
And tell me why I need to use a particular object. Give it to me in an
orderly fashion, and give lots of examples: don't just make some vague
noise about how I need an IDirectSoundBuffer object to do real-time
playback of sound without a concrete example *right* *there* in the
documentation, and with plenty of explanation as to *why*.
Vague noises only make vague, poorly written code.
And for God's sake don't make the mistake of simply filling out the
documentation with a bunch of tutorials. That never works: the tutorial
may get someone off the ground, but without solid documentation for
the 90% of the material that is not covered in the tutorial, most
people out there will only use carbon copies of the tutorial material
in their own games, rather than use the routines they *should* be using
for optimal performance.
Just my $0.03 worth. (Inflation.)
- Bill
--
William Edward Woody - In Phase Consulting - wo...@alumni.caltech.edu
http://www.alumni.caltech.edu/~woody
>* We've worked with id for a long time on DirectX. Much of the
>architectures early design and philosophy was driven directly by input I
>got from John almost three years ago now. Naturally it's disappointing
>to hear that one of our earliest and most valued customers is
>dissatisfied, and we're going to attempt to rectify that.
The implication being that D3D is John Carmack's fault? How much was
Carmack consulted on the latter part of D3D's design? It's pretty
insulting to pass the buck in such a manner.
>* We're very comfortable that the Direct3D API can be strongly defended
>on its technical merits to anyone who understands it, and it's design
>philosophy.
Yet no one has managed to do this, either in e-mail or on Usenet. How
about a single position piece stating why Microsoft feels that A.)
Direct3D is technically superior to OpenGL and B.) Direct3D is
technically better suited for games than OpenGL?
>It has drivers for every major 3D hardware
>accelerator shipping, OpenGL does not
Now wouldn't this be different if Microsoft had evangelized OpenGL to
IHVs as much as it had evangelized D3D?
>want to be using that iron. That hardware isn't going to look anything
>like an SGI pipeline
Ah-ha. So D3D is designed for Talisman, although D3D predates Talisman
by at least a year if not more?
>, and OGL is not likely to evolve fast enough as a
>real-time gaming API to meet Johns needs over time.
There's this thing called an "extension" that OpenGL has which, by
golly, Direct3D does not. So if you want to "evolve" your hardware, you
had better get real good at kissing Microsoft's ass to get them to add
support for your hardware, or you're out in the cold. And the likelihood
of them adding support for your hardware when you may be competing with
Talisman...
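For the record, the extension mechanism is mundane to use: at startup you query the implementation's extension list and branch on what you find. The string below is fabricated for illustration; a real one would come from `glGetString(GL_EXTENSIONS)`, which returns space-separated extension tokens.

```python
def has_extension(extension_list, name):
    """Exact-token membership test over a GL_EXTENSIONS-style string.
    Substring tests are a classic bug: 'GL_EXT_texture' would falsely
    match inside 'GL_EXT_texture_object'."""
    return name in extension_list.split()

# Fabricated example of what one implementation might report.
exts = "GL_EXT_texture_object GL_EXT_vertex_array GL_SGIS_multitexture"
```

An engine then picks its fast path per machine: use the vendor extension if the token is present, fall back to core calls if not.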
>* Don't let the fact that they have the term "3D" in common confuse you.
> OGL and D3D are designed for very different purposes. Forks and Spoons
>are both silverware but it hardly makes sense to say that you don't need
>forks because they're lousy for eating soup, or that spoons suck because
>you can't eat spagetti with them. Saying you have to always eat with a
>spoon, or that you're only going to eat soup because you can't figure
>out how to use a fork is a fair point, but kind of pitiful and should be
>fixed.
This is a crock. OpenGL and Direct3D solve the same problems, cute (and
inaccurate) analogies aside. You can write a game using OpenGL or
Direct3D, it's just easier with OpenGL.
>* Microsoft aggressively supports D3D and OGL as 3D API standards
>without playing favorites. We see no contradiction in doing so because
>the API's are designed to satisfy the needs of very different markets.
What design criteria, once again, make D3D so radically different from
OpenGL? It's curious that no one from Microsoft answers this question.
>modeling, and scientific visualization. It has no driver model because
>it was designed to support a proprietary pixel accurate rendering
Why don't you ask your OpenGL group about "It has no driver model". For
some reason I think they'd disagree with you on that one. There is no
"proprietary" aspect of the pixel accurate rendering pipeline -- it's a
specification that was agreed upon by a CONSORTIUM, not just one
company. Christ, Microsoft is a part of the ARB. Proprietary...shyeah,
right.
>pipeline that most consumers won't find in a 200 dollar accelerator any
>day soon.
That's funny, even the Virge has an OpenGL MCD available for it. As does
3DLabs, and 3Dfx has some support for OpenGL, and there are at least half
a dozen other companies with boards at the consumer price levels that do
or will imminently announce support for OpenGL.
> Nor is there much evidence that it's the kind of hardware
>consumers would want anyway.
Nor is there evidence to the contrary.
> Sure there may be some OGL drivers made
>for consumer iron, but they're not likely to make as good a use of it
>for games as D3D is. D3D is about direct access to a broad array of
>consumer 3D acceleration.
Based on what data do you make this rather remarkably stupid assertion?
>the 3D hardware does.) Thus the need for a standard driver architecture
>that is flexible enough to accommodate innovation, yet rigid enough to
>actually be a standard that gets used.
Right, so long as that innovation is dictated by Microsoft, correct? And
what do D3D and standards have to do with each other -- where's that D3D
reference implementation and conformance suite?
>3D rendering tasks. 2. Some hardware goes 10x faster if you re-author
>your game to support weird feature X. You have to decide to support
>that feature 18 months before you know how popular that hardware
>accelerator really is.
This is a gross exaggeration. You can often retrofit support for some
trivial feature in a matter of hours, days, or weeks. Carmack had to make
some changes to GL Quake to make it faster on some hardware that had poor
texture download performance. I don't think he had to plan 18 months in
advance for that.
>* Driver architecture "A" is very fast for 3D hardware shipping this
>Xmas, but very slow for hardware shipping next Xmas. Architecture "B"
>will slow down this Xmas's titles. If your architecture slows down this
>Xmas's games nobody will use your API and it won't be a standard, if you
>slow down next Xmas's games your API will be obsolete then, and
>everybody's new games will have to write to something different. If
>they have to write to something different you still don't have a
>standard, and you're faced with huge driver upgrades and incompatibility
>problems. If you support both "A" and "B" your architecture is too
>complicated, ships 9 months later, and nobody can figure out how to use
>it, or write a proper driver for it. Which architecture do you choose..
>there is no "C".
>* Are these questions anybody asked in designing OGL?
These are questions EVERYONE asked when designing OpenGL, and the answer
was real simple -- the API and driver model MUST be separate for the very
reasons you describe. YOU HAVE THE ANSWER RIGHT THERE. You're hitting
yourself over the head with the facts and not realizing it.
The API is SEPARATE from the driver model. An application that uses
OpenGL calls will NOT fail later when some new OpenGL driver model comes
into play. That's because OpenGL is an API specification, NOT a driver
spec.
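The separation being described is just indirection through a dispatch table: applications call a stable API layer, which forwards to whichever driver is installed underneath. The classes below are a toy model of that idea, not any actual ICD/MCD interface.

```python
class API:
    """Stable API layer: application code calls these entry points and
    never sees the driver underneath."""
    def __init__(self, driver):
        self._driver = driver
    def install_driver(self, driver):
        # The driver model can be swapped out entirely...
        self._driver = driver
    def draw_triangle(self, a, b, c):
        # ...while the application-facing call stays the same.
        return self._driver.draw_triangle(a, b, c)

class SoftwareDriver:
    def draw_triangle(self, a, b, c):
        return "software-rendered"

class AcceleratedDriver:
    def draw_triangle(self, a, b, c):
        return "hardware-rendered"
```

Swapping `SoftwareDriver` for `AcceleratedDriver` changes what happens underneath without touching a line of application code, which is the whole argument for keeping the API spec separate from the driver spec.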
>I dunno, but if you think you've got all the answers then I've got a few
>questions I'm dying to put to you.
Hey, I'm right here, throw 'em at me.
> In the mean time keep an eye out for
>the next release of Direct3D which will include the new draw triangle
>API, new sample code and documentation. I encourage John and everybody
>else out there to take another look at the next release, and see if it
>all doesn't start to make more sense.
And based on the quality of the work you guys have put out so far we're
supposed to just trust that things are going to get better? That good
documentation, robust DLLs, backwards compatibility, intuitive
interfaces, a plethora of good published work, ports to non-MS
platforms, etc. are going to erupt overnight just because you guys have
swept out the old team and replaced them with a new team? If this is
what you believe, then you either grossly underestimate the intelligence
of your target audience, or we now know what you're spending your stock
profits on.
If someone wants to forward the above to Alex St. John directly, I'll
take this offline (I don't have his e-mail).
Brian
--
+-----------------------------------------------------------------+
+ Brian Hook, b...@wksoftware.com +
+ WK Software, http://www.wksoftware.com +
+ Consultants specializing in 3D graphics hardware and software +
+ --------------------------------------------------------------- +
+ For a list of publications on 3D graphics programming, +
+ including Direct3D, OpenGL, and book reviews: +
+ http://www.wksoftware.com/publications.html +
+-----------------------------------------------------------------+
1) John's complaint reduces to the following: "Direct3D is too hard to
use/figure out."
2) We've done a woeful job of providing sufficiently robust documentation
and sample code of Direct3D Immediate mode.
3) Direct3D's number one purpose in life is to provide a standard
driver model for consumer oriented 3D hardware acceleration.
4) It [Direct3D] has drivers for every major 3D hardware accelerator
shipping, OpenGL does not.
5) Microsoft aggressively supports D3D and OGL as 3D API standards without
playing favorites.
6) Direct3D is a real-time consumer multimedia API designed to support
very fast access to a diverse range of new hardware accelerators.
7) OGL is an industrial strength 3D API designed to serve the needs of
high end CAD, modeling, and scientific visualization.
8) It has no driver model because it was designed to support a
proprietary pixel accurate rendering pipeline...
= - = - = - =
While 1) and 2) may be true, solving 2) does not solve 1).
4) is true because Microsoft told all the PC ISVs that Direct3D
is the solution.
5) is debatable. Microsoft does indeed support OpenGL, but it
surely hasn't made it easy for IHVs, having had 3 driver models
(ICD, 3D-DDI, MCD) in about as many years. Implementing all of
OpenGL yourself (i.e., an ICD) is not a trivial task, and is
about the only way to have had OpenGL on Windows from the
beginning to today. This isn't to say that implementing Direct3D
is a picnic either, given the state of the DDKs...
6) & 7): I still don't know what makes an API a consumer API vs.
an industrial-strength API. The OpenGL conformance test really only
requires that you don't have cracks, aren't wildly off in your
internal calculations, and don't stop in the face of garbage data
(conform divzero.c...). Certainly an OpenGL *implementation* can have
relatively low internal geometry precision and range, skip subpixel
displacement ("subpixel correction", "stepping on pixel centers") or do
it only in the y-direction, affinely interpolate texture coordinates
(not "perspective-correct"), select LODs per polygon, fast-path only
point-sampled filtering, only linear fog, only less-equal depth
comparison, etc...
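To make one of those shortcuts concrete, here is affine versus perspective-correct interpolation of a texture coordinate across a span whose endpoints sit at different depths. The numbers are invented; the point is that the cheap path and the correct path diverge, which is exactly the latitude a lenient implementation is exploiting.

```python
def affine_texcoord(s0, s1, t):
    """Cheap path: linear interpolation in screen space."""
    return s0 + (s1 - s0) * t

def perspective_texcoord(s0, w0, s1, w1, t):
    """Correct path: interpolate s/w and 1/w linearly in screen space,
    then divide back out at each pixel."""
    num = (s0 / w0) * (1 - t) + (s1 / w1) * t
    den = (1 / w0) * (1 - t) + (1 / w1) * t
    return num / den
```

At the midpoint of a span running from w=1 to w=4, the affine result is 0.5 while the correct answer is 0.2; that gap is why affine texturing visibly swims on polygons that recede in depth, yet both renderers can pass a lenient conformance bar.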
OpenGL 1.1 has a few weaknesses compared to Direct3D. It does
not support frame-oriented ("sort-first") architectures well, e.g.
Apple's scanline board, S-MOS/RSSI PIX, NEC/VideoLogic PowerVR.
It is a big API, and given the wide range of uses it is put to,
optimizing many paths takes a while.
I believe you are looking at 8) in the wrong way. 8) is true
because it is multiplatform, multi-vendor, "open". Many vendors
would not have signed on if SGI/OpenGL ARB _did_ dictate a specific
driver model. Microsoft is in a fairly unique position. Unlike
most non-PC vendors, it controls the operating system, but not the
hardware. Hence, they _need_ driver models for Windows IHVs to
write drivers for.
Sam Paik
--
408-749-8798 / pa...@webnexus.com
I speak for xyne KS since I AM xyne KS.
Resistance is not futile! http://www.be.com/
Thank you for an enlightening article. One thing I didn't see addressed
was D3D's lack of conformance testing. We need some assurance that
drivers do what they say they do, and do it in a consistent way.
By the way, the subject of the message says "Alex St John", but
the sender was "Philip Taylor". Who wrote this?
--
_ _ ___ _
|_) |_ |\ | | / \ |\ | Benton Jackson, Goat Rider
|_) |_ | \| | \_/ | \| Fenris Wolf Electronic Games
ben...@fenriswolf.com http://www2.bitstream.net/~benton
I understand it and its design philosophy. I've got one acronym for you:
PHIGS. Retained, editable data and command structure stores passed to a
rendering pipeline to "improve performance". API-needed data structures
co-mingled with my application data creating a great big bloated PIGS of
an app. Been there, hated it. Did I mention that PHIGS is an ISO
standard?
> it. It has drivers for every major 3D hardware accelerator shipping,
> OpenGL does not, and I'm pretty sure that by the year 2000, everybody is
> going to have 3D hardware and John is going to want to be using that iron.
> That hardware isn't going to look anything like an SGI pipeline, and OGL is
SGI did not invent this "pipeline". D3D seems to follow this pipeline of
polygon-based, world-defined, transformed, lit, rasterized graphics. How
will D3D survive the new "pipeline" (Talisman?) architecture?
> not likely to evolve fast enough as a real-time gaming API to meet Johns
> needs over time.
So you are going to "evolve" D3D to meet this need? Will it be backward
compatible?
> We're going to add the simpler 3D primitive API's such as draw triangle to
> D3D IM in the next release to make it easier for people doing quick ports
> or evaluations to get up and running without the fuss, with the
> understanding that these primitive API's though easier to use may not
> provide the versatility and fastest possible performance that tweaking D3D
> execute buffers will. I'll elucidate further on this point in the random
> part of this letter.
So I can evaluate something that I won't end up using if I want an
optimized program? Why?
>
> Don't let the fact that they have the term "3D" in common confuse you. OGL
> and D3D are designed for very different purposes. Forks and Spoons are
> both silverware but it hardly makes sense to say that you don't need forks
> because they're lousy for eating soup, or that spoons suck because you
> can't eat spagetti with them. Saying you have to always eat with a spoon,
> or that you're only going to eat soup because you can't figure out how to
> use a fork is a fair point, but kind of pitiful and should be fixed.
Most people use both spoons and forks to eat their meals, which makes
them tools in the toolset called silverware. Silverware can be used in
various combinations without modification to eat all sorts of food.
> Microsoft aggressively supports D3D and OGL as 3D API standards without
> playing favorites. We see no contradiction in doing so because the API's
> are designed to satisfy the needs of very different markets. Direct3D is a
> real-time consumer multimedia API designed to support very fast access to a
D3D supports sound and text and graphics seamlessly?
> diverse range of new hardware accelerators. OGL is an industrial strength
> 3D API designed to serve the needs of high end CAD, modeling, and
> scientific visualization. It has no driver model because it was designed
> to support a proprietary pixel accurate rendering pipeline that most
> consumers won't find in a 200 dollar accelerator any day soon. Nor is
> there much evidence that it's the kind of hardware consumers would want
> anyway. Sure there may be some OGL drivers made for consumer iron, but
Ever hear of Nintendo 64? It grossly outsells any D3D-based cards and
games. People seem to like its advanced graphics.
> they're not likely to make as good a use of it for games as D3D is. D3D is
> about direct access to a broad array of consumer 3D acceleration.
> It's going to get very weird out there for 3D hardware. The problem of
> making 3D fast is unsolved. I think we can all agree that it will be a
Bull. The solutions are ongoing and change as capabilities are added
through experimentation. How do people experiment with D3D? Only you
(Microsoft) can experiment. OpenGL allows extensions. Period.
> while before we see real-time Toy Story graphics rendered on our desktops.
I've got an SGI Octane sitting on my desktop doing realtime graphics
equal to or better than any scene from "Toy Story". Now.
> How does your real-time game balance it's processing needs across the bus
> to consistently get the fastest possible performance on hardware that may
My answer: send as little down the bus as possible! APP-CULL-DRAW!
Parallelism! But I've only written visual simulation apps that need to
respond to user interaction, draw as much of a scene as possible within
reasonable framerates, and update any app data that I need, not any
games :-)
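The CULL stage is the crux of that answer: reject whole objects against the view volume on the CPU so only potentially visible geometry crosses the bus to the DRAW stage. A minimal axis-aligned version, with made-up object records, might look like:

```python
def overlaps(lo, hi, vmin, vmax):
    """Axis-aligned bounding box vs. view-volume overlap test."""
    return all(hi[i] >= vmin[i] and lo[i] <= vmax[i] for i in range(3))

def cull(objects, vmin, vmax):
    """CULL: keep only objects whose bounds touch the view volume, so the
    DRAW stage sends as little down the bus as possible."""
    return [o for o in objects if overlaps(o["lo"], o["hi"], vmin, vmax)]
```

A real pipeline would use a six-plane frustum rather than a box, and would run APP, CULL, and DRAW as overlapping stages, but the bus-traffic argument is the same.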
> vary one or more of the following ways; 1. It has a 3D hardware
> accelerator which may or may not be slower than the CPU at some 3D
> rendering tasks. 2. Some hardware goes 10x faster if you re-author your
> game to support weird feature X. You have to decide to support that
> feature 18 months before you know how popular that hardware accelerator
> really is. 3. Next year new PC's will ship with buses that go much faster
> than the current hardware, there will be some new 3D hardware that takes
> advantage of this, and a lot that doesn't. The equation for determining
> which resources to off load from the CPU to the hardware to get the best
> performance changes completely, now how do you write your game? By the way
> the OS and OS version may also vary, and you don't know exactly how and
> when it's taking CPU cycles from you.
So I trust you to understand how to evaluate (let alone come up with)
this "equation"? I can probably characterize OS version performance
differences pretty quickly compared to actually building my app, and
retool the necessary sections.
> Driver architecture "A" is very fast for 3D hardware shipping this Xmas,
<Stuff deleted>
How does any of this conjecture relate to OpenGL?
> Are these questions anybody asked in designing OGL?
Support on 95% of all shipping hardware today (workstations, PC)
across many tens of graphics adapters available on all that hardware
says "Damn straight!" to me.
> Does stripping down, or speeding up OGL a bit and touting it as a consumer
> 3D API demonstrate much comprehension of these issues?
It's never been touted as a consumer 3D API, just a general 3D API that's
applicable across many fields, including consumer applications (games).
You are the one touting a need for a special 3D API because YOU understand
all the changes between hardware and software likely to appear in the next
5 years. No one else is as smart as YOU.
> How much relevant experience can anybody claim to have at solving these
> problems? Does all the spoon experience out there translate directly to
Multiple companies with years of experience and diverse hardware and
software requirements all converged on the same API.
> chopsticks mastery? The consumer 3D HW market is so new that no real
No mastery, but a better understanding of the intent than someone who
has only used his hands to eat.
> experts exist yet, nobody knows how it's really going to evolve.
>
> I dunno, but if you think you've got all the answers then I've got a few
Answers only produce new questions. My experience tells me that D3D is
an answer, but there are better ones.
Bob
------------------------------------------------------------------------
Robert A. Schmitt sch...@cineon.kodak.com
Digital Motion Imaging Cineon Post-Production Group
Eastman Kodak Company 716.726.5279, 716.253.9467 (FAX)
>I've gotten a lot of email lately about John Carmack's posting on the
>internet about Direct3D. As Microsofts dually appointed
>DirectRepresentative for this technology I think it's time I posted a
>response.
>
>I'll make these points short and crisp so hopefully nobody will miss them.
>
>We've worked with id for a long time on DirectX. Much of the architectures
>early design and philosophy was driven directly by input I got from John
>almost three years ago now. Naturally it's disappointing to hear that one
>of our earliest and most valued customers is dissatisfied, and we're going
>to attempt to rectify that.
>The vast bulk of Johns complaint appears to reduce to the following;
>"Direct3D is too hard to use/figure out." Yup, he's right. We've done a
>woeful job of providing sufficiently robust documentation and sample code
>of Direct3D Immediate mode. If one of the brightest minds in real time 3D
>graphics can't figure it out, who can?
I have to say this post makes my blood boil.
While I fully appreciate that Microsoft/their individual employees
have finally realised that Direct3D is a bit of a mess, and are
working to rectify this, I really cannot understand why they pander
to every whim of individual software developers.
What the fuck is John Carmack's problem? Direct3D may be difficult to
code, but it's not impossible, and just a week's work, especially with
support directly from Microsoft developers, should have been enough for
a 'genius' such as John to get Direct3D working. Instead id took the
(rather strange) route of implementing hardware support for one 3d
card only. While it's rather a nice implementation, 3dfx must be
rubbing their hands together with glee, and I wonder exactly how much
they're paying him - perhaps this has some sort of bearing on
Carmack's comments on Direct3D.
Possibly John gave up trying to use Direct3D because the Quake engine
technology is not compatible with it. It relies on multi-sided
concave polygons, a ridiculous lighting model, and platform specific
rendering tricks - none of which are good engine design if he had
taken into account the imminent arrival of Direct3D and 3d graphics
hardware.
Perhaps if Microsoft had worked more closely with some of the other
industry players, Direct3D wouldn't have been such a mess in the first
place.
Jim
>hardware acceleration. That's the number one take away I got from John in
>those early meetings.. "get your fat API's out of my way and let me at the
>iron". Direct3D absolutely does this, it may not be pretty, but it does
What?? As best I can tell, D3D absolutely *does not* do this. This
is 180 degrees from the truth. D3D "Immediate" Mode gets in the way by
forcing you to create these display lists (known as execute buffers)
that must then be decoded by the CPU and/or graphics hardware to do any
rendering. This is the antithesis of "let[ting] me at the iron."
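To make the objection concrete, here is a minimal sketch in C of what an execute-buffer style API forces on the caller. The struct and function names here are illustrative stand-ins, not the real Direct3D types: the point is only that the application must record commands into memory which a decode pass must later walk before anything is drawn.

```c
#include <assert.h>
#include <stddef.h>

/* Illustrative sketch only: hypothetical names, not the actual D3D structures. */
enum opcode { OP_TRIANGLE, OP_END };

struct vertex { float x, y, z; };

struct command {
    enum opcode op;
    struct vertex v[3];
};

/* The application must first build a command list in memory... */
static size_t record_triangle(struct command *buf, size_t n,
                              struct vertex a, struct vertex b, struct vertex c)
{
    buf[n].op = OP_TRIANGLE;
    buf[n].v[0] = a; buf[n].v[1] = b; buf[n].v[2] = c;
    return n + 1;
}

/* ...which the runtime must then walk and decode before anything is drawn.
   With a direct procedural API, this decode pass simply would not exist. */
static int execute(const struct command *buf, size_t n)
{
    int triangles = 0;
    for (size_t i = 0; i < n; i++)
        if (buf[i].op == OP_TRIANGLE)
            triangles++;        /* stand-in for the actual rasterization */
    return triangles;
}
```

The extra record-then-decode round trip through memory is the overhead the poster is objecting to.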
This has been discussed before, esp. in SGI's white paper on D3D, and I
will not rehash it here. If you intend to stand by this statement I
would like to see it defended against the aforementioned objections. If
you cannot do this then I, for one, can only assume that your claims are
100% FUD.
-----
I speak for no one but myself.
We are not paying him anything, and he did not write it just for 3Dfx -
he actually had it running on Intergraph machines first I believe. And
it runs on at least 4 different cards.
I agree - D3D does not just let you at the iron. It imposes too many
things on you, that are just wrong. Most of these come from being too
heavily influenced by software rendering.
Alex St. John also wrote:
>> It (OpenGL) has no driver model because it was designed
>> to support a proprietary pixel accurate rendering pipeline that most
>> consumers won't find in a 200 dollar accelerator any day soon.
Wrong - our (3Dfx) hardware is as pixel accurate as most GL engines from SGI.
And it now costs $219 at Frys.
Alex St. John also wrote:
>> Nor is there much evidence that it's the kind of hardware consumers
>> would want anyway.
Wrong - consumers want texture mapped, filtered, zbuffered, alphablended
graphics, just like OpenGL provides (as evidenced by glQuake).
Alex St. John also wrote:
>> D3D is about direct access to a broad array of consumer 3D acceleration.
Yes, but it's too imposing right now.
Alex St. John also wrote:
>> The problem of making 3D fast is unsolved.
I think we've done a good "first cut" job of it...
> The vast bulk of John's complaint appears to reduce to the following:
> "Direct3D is too hard to use/figure out." Yup, he's right. We've done a
> woeful job of providing sufficiently robust documentation and sample code
> of Direct3D Immediate mode.
Implying that Direct3D is too hard to use because the documentation is
poor is incorrect. It could also be that the interface is poor (I'm not
saying this is the case, but read on).
> We're very comfortable that the Direct3D API can be strongly defended on
> its technical merits to anyone who understands it and its design
> philosophy. Direct3D's number one purpose in life is to provide a standard
> driver model for consumer oriented 3D hardware acceleration. It is not
> designed to be the ultimate software rendering engine...
Actually the API is limited to the few features the Direct3D immediate mode
team thought would be important. In many ways Direct3D ensures that the
industry does not advance beyond those features common in 1995/96 because
its API doesn't allow for features that may be common in 2000 or 2005.
Curved surfaces could easily be supported through a simple API, relying
on software where hardware doesn't support the feature directly. Because,
however, the Direct3D immediate mode team decided the industry shouldn't have curved
surfaces, little incentive is given to hardware manufacturers to provide
this feature.
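The curved-surface point above can be sketched briefly. A minimal software fallback of the kind suggested, assuming nothing about any real API, could evaluate a quadratic Bezier via de Casteljau's algorithm and tessellate it into samples for the rasterizer; hardware that understood curves natively could skip the software path entirely:

```c
#include <assert.h>

/* Quadratic Bezier evaluation via de Casteljau: the kind of software
   fallback an API could supply where hardware lacks curved-surface support.
   One coordinate only, for brevity; a patch would do this per axis. */
static double bezier2(double p0, double p1, double p2, double t)
{
    double a = p0 + (p1 - p0) * t;   /* first level of interpolation */
    double b = p1 + (p2 - p1) * t;
    return a + (b - a) * t;          /* second level: point on the curve */
}

/* Tessellate the curve into n+1 evenly spaced samples. */
static void tessellate(double p0, double p1, double p2, double *out, int n)
{
    for (int i = 0; i <= n; i++)
        out[i] = bezier2(p0, p1, p2, (double)i / n);
}
```

The API-design argument is that exposing the curve (three control points) rather than the tessellation (many vertices) leaves room for future hardware to accelerate it.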
There is a big difference between a good 3D hardware accelerator and a
good Direct3D accelerator. We (DWI - Lost World PC team) could give numerous
other examples of functionality missing from the Direct3D API (and have
done so to Alex St John).
The irony here is that Direct3D certainly contains all the features to
accelerate Quake, but it certainly will not meet the needs of next
generation software or hardware.
Anyway, all the credit to Microsoft for taking a bullet in the chest
regarding documentation and the goals behind the software rendering portions
of Direct3D. The case you make for why Direct3D is important to game
developers versus OpenGL (hardware abstraction) is exactly why it is so
important that Direct3D should be forward looking, and not forever locked
into the vision of a few (and yes, isolated) engineers in 1994.
> We're going to add the simpler 3D primitive API's such as draw triangle to
> D3D IM in the next release to make it easier for people doing quick ports
> or evaluations to get up and running without the fuss, with the
> understanding that these primitive API's though easier to use may not
> provide the versatility and fastest possible performance that tweaking D3D
> execute buffers will. I'll elucidate further on this point in the random
> part of this letter.
What's wrong with this? There is no reason why the triangle primitive can't
write to the execute buffer instead of directly to hardware. Isn't this
the point of an abstraction layer? The draw triangle is a step in the right
direction, but is only half implemented.
> I dunno, but if you think you've got all the answers then I've got a few
> questions I'm dying to put to you.
Arrogant crap.
> In the mean time keep an eye out for the
> next release of Direct3D which will include the new draw triangle API, new
> sample code and documentation. I encourage John and everybody else out
> there to take another look at the next release, and see if it all doesn't
> start to make more sense.
In general, the next release of Direct3D is a step in the right direction --
as difficult as Direct3D may be (actually, it isn't; it's more of a
time-consuming nuisance than difficult), it certainly beats writing to every
ma and pa 3D accelerator's API. The hope is that:
1) The draw triangle primitives will be for real use, not an "API for
dummies."
2) That Microsoft will do its part to advance the state of the art, and not
simply "freeze" 3D technology in 1994.
Mine too, but not for the same reasons. This microserf is basically saying:
oh you poor hapless non-Microsoft software developers, we made it too hard
for you. How about we document it better and give you some vaporware sample
code to make it easy for you? Uncle Gatesy's here to save you. There,
there, now, feel better?
It looks as though they've missed Carmack's point. Or more probably, they
are hoping you've all missed his point (see below.)
> While I fully appreciate that Microsoft/their individual employees
> have finally realised that Direct3D is a bit of a mess, and are
> working to rectify this, I really cannot understand why they pander
> to every whim of individual software developers.
> What the fuck is John Carmack's problem? Direct3D may be difficult to
> code, but it's not impossible, and just a week's work, especially with
> support directly from Microsoft developers should have been enough for
> a 'genius' such as John to get Direct3D working.
Then you missed his entire point. John Carmack cannot afford to waste
time learning all the nuances of Direct 3D, especially when they have no
hard, demonstrable benefit over the much cleaner alternative:
OpenGL. John Carmack does more with his time than waste it just porting
DOOM and Quake to every platform under the sun. He's pushing the
envelope of 3D immersive game play. He can't do that and at the same
time figure out when to lock, and unlock his surfaces while trying to
second guess the most optimal way of arranging his execute buffers
(and writing reams and reams of code doing it.)
Of course Carmack can figure it out; the guy is not selectively stupid.
He's figured out something more important: it's not worth his time to
waste it playing with Direct 3D, he needs to play, to experiment, to
see things happen very quickly. OpenGL is a better sandbox, simple as
that.
> [...] Instead Id took the
> (rather strange) route of implementing hardware support for one 3d
> card only.
Huh? Quake has been directly ported to Rendition, and 3DFX.
Furthermore, if you have an OpenGL driver, you have a free port via
GLQuake, which only relies on the OpenGL32.DLL driver. So it ought
to work just fine on a Permedia, or an Intergraph Reactor (as
mentioned by Carmack himself in an earlier .plan)
> [...] While it's rather a nice implementation, 3dfx must be
> rubbing their hands together with glee, [...]
The 3DFX guys have much more than just Quake to keep their hands
warm.
> and I wonder exactly how much they're paying him - perhaps this has
> some sort of bearing on Carmack's comments on Direct3D.
Geez, as long as we're making irresponsible comments, just how much
is Microsoft paying you to try and shoot down his comments?
> Possibly John gave up trying to use Direct3D because the Quake engine
> technology is not compatible with it.
Huh? I find this very hard to believe. They are compatible with
OpenGL, but not Direct 3D?
> [...] It relies on multi-sided concave polygons,
I don't see how this hurts them w.r.t. a Direct 3D port.
> a ridiculous lighting model,
This may be, I have not reasoned out Quake's lighting model just
yet. Either way, Direct 3D supports lighting, the change couldn't
be that hard.
> and platform specific rendering tricks -
Only for the rasterization layer. Of course this disappears in their
OpenGL port, what makes Direct 3D special?
> none of which are good engine design if he had taken into account
> the imminent arrival of Direct3D and 3d graphics hardware.
They've known about 3D hardware for quite some time. Rendition
enabled versions of Quake were being demoed before Verite, 3DFX, or
even Quake itself was generally available.
> Perhaps if Microsoft had worked more closely with some of the other
> industry players, Direct3D wouldn't have been such a mess in the first
> place.
Well, they were working with Abrash himself on 3D DDI, but that got
scrapped after he left to go work for id. They also had some sort of
OpenGL effort some time ago, both of which I think most will agree
are far better alternatives to Direct 3D.
--
Paul Hsieh
qed "at" chromatic "dot" com
http://www.geocities.com/SiliconValley/9498
Graphics Programmer
Chromatic Research
What I say and what my company says are not always the same thing
Are you a telemarketer? Boy, do I have a URL for you to check
out!:
You have the facts sort of screwed up here. For starters, the first
implementation of Quake for hardware was for Rendition (VQUAKE, maybe
you've heard of it). Second, GLQuake wasn't implemented on 3Dfx until
AFTER it had been implemented on Intergraph and 3DLabs hardware. Third,
GLQuake is also available on Irix.
> Possibly John gave up trying to use Direct3D because the Quake engine
> technology is not compatible with it. It relies on multi-sided
> concave polygons, a ridiculous lighting model, and platform specific
> rendering tricks - none of which are good engine design if he had
> taken into account the imminent arrival of Direct3D and 3d graphics
> hardware.
"Imminent arrival"? Let's see, Quake was well under way before
RenderMorphics was acquired. As a matter of fact, early versions of
Quake and Quake levels existed at id when 3D-DDI was the name of the
game.
Given Quake's target audience and release date, and Direct3D's
immaturity even today, I don't think Carmack can be faulted for
targeting software only rendering for the original version of Quake.
With all due respect, this sounds more like jealousy than rigorous
analysis. Why can't some people respect Carmack's work without feeling
that they're somehow smaller at the same time?
> Perhaps if Microsoft had worked more closely with some of the other
> industry players, Direct3D wouldn't have been such a mess in the first
> place.
Perhaps if Microsoft had left 3D graphics APIs up to the big boys who
know how 3D graphics hardware and software is written this whole mess
could have been avoided.
I think you missed the point: John's problem was NOT with lack of tutorials
and sample code - it was with how things have to be done in Direct3D. The
interface is overly complex. When will Microsoft's API creators realize that
100 pages of sample code demonstrating how to do X does not solve the problem
of an unintuitive, byzantine interface. I don't intend to belittle MSoft's
efforts - certainly the programmer's job is made easier by the presence of
standard APIs - it is just that sometimes we wish that they were not so
sullied with unnecessary complications (like backwards compatibility, which is
good for the user but bad for the programmer).
> Direct3D's number one purpose in life is to provide a standard
> driver model for consumer oriented 3D hardware acceleration. It is not
> designed to be the ultimate software rendering engine, John will always
> beat us. It's first design goal was to provide thin fast access to
> hardware acceleration. That's the number one take away I got from John in
> those early meetings.. "get your fat API's out of my way and let me at the
> iron".
When he said "let me get at the iron" he was not talking about the "iron" of
your execute-buffered software renderer (which is pretty decent, I guess).
Tell me, how is building a list of commands in memory, to be executed at some
later time, going to the metal? If buffers make sense for efficiency reasons,
they can be incorporated at the DRIVER level, since the details that provide
the best speed vary across platforms (hardware, and your software renderer).
Exposing the internals of your software renderer may help its performance, as
Direct3D programs essentially become "to the metal" of your software renderer,
without having to go through a clean API abstraction. The simplest, cleanest
interface to the widest variety of platforms is a procedural one. There is no
reason to write to memory, and then to the renderer, when you can just plain
write to the hardware (this may vary per platform, but for 3Dfx, which has an
ON CARD buffer already, this is the most efficient way - again, this should be
decided by the driver, NOT BY THE API).
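The driver-level-buffering argument can be sketched in a few lines of C. All names here are hypothetical: the application sees only a procedural draw call, and whether commands are batched internally or sent straight through is a private decision of the driver behind it.

```c
#include <assert.h>

/* Hypothetical driver state: the app never sees these fields directly. */
struct driver {
    int batched;      /* nonzero: this driver queues internally, then flushes */
    int queued;       /* commands sitting in the driver's private buffer      */
    int submitted;    /* commands that have reached the hardware              */
};

/* The one procedural entry point the application calls. */
static void draw_triangle(struct driver *d)
{
    if (d->batched)
        d->queued++;          /* buffering is a driver implementation detail... */
    else
        d->submitted++;       /* ...or the call can go straight to the iron */
}

/* End-of-frame: a batching driver submits its private queue. */
static void flush(struct driver *d)
{
    d->submitted += d->queued;
    d->queued = 0;
}
```

Either backend satisfies the same API, which is the poster's point: the buffering policy belongs below the abstraction line, not above it.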
> Microsoft aggressively supports D3D and OGL as 3D API standards without
> playing favorites. We see no contradiction in doing so because the API's
> are designed to satisfy the needs of very different markets. Direct3D is a
> real-time consumer multimedia API designed to support very fast access to a
> diverse range of new hardware accelerators. OGL is an industrial strength
> 3D API designed to serve the needs of high end CAD, modeling, and
> scientific visualization. It has no driver model because it was designed
> to support a proprietary pixel accurate rendering pipeline that most
> consumers won't find in a 200 dollar accelerator any day soon.
3dfx?
Yes it's a rasterizer only, but think about the future. The API will be with
us a long time. OpenGL's approach allows future cards to accelerate much more
than just rasterization. If no consumer level cards do it today, this does
not mean that it will never be advantageous to allow it.
> Nor is
> there much evidence that it's the kind of hardware consumers would want
> anyway.
How did you come by this opinion?
> Sure there may be some OGL drivers made for consumer iron, but
> they're not likely to make as good a use of it for games as D3D is. D3D is
> about direct access to a broad array of consumer 3D acceleration.
"Direct access" = execute buffer?
(snipped - difficulty of optimizing to hardware / busses / OSs of different
relative capabilities)
> Are these questions anybody asked in designing OGL?
Do you really think they are that stupid? By the way, do you think caps bits
address the issue of what features to use?
> Does stripping down, or speeding up OGL a bit and touting it as a consumer
> 3D API demonstrate much comprehension of these issues?
What is the difference between a consumer 3D API and a non-consumer 3D API?
If you are referring to CosmoGL, I believe the intent was to fix up what used
to be a pretty inefficient software MS OpenGL and add a RAMP mode similar to
Direct3D, so as to get good performance on non-hardware accelerated games (or
at least on par with Direct3D RAMP). If this can be done, without cheating
and exposing your software renderer's internal data structures and calling it
an API, then SGI will have done game programmers who want to sell to
non-hardware accelerated users a big favor.
Regards,
Jon Graehl
How much did it cost you? How much does the typical consumer pay for a
PC?
Chris
----------------------------------------------------------------
Chris Marriott, SkyMap Software, U.K. e-mail: ch...@skymap.com
Creators of fine astronomy software for Windows.
For full details, visit our web site at http://www.skymap.com
Will we see public responses to this post or the points brought up in
followups? I can see no better way to defend your viewpoint/position
than to put it out for public debate. The Microsoft view of this whole
D3D flare-up has been completely absent and this is leading to
(almost) completely one-sided discussions. I would also like to hear a
response about who wrote this post: It's attributed to Alex St.John,
posted by Philip Taylor with an official-ish header and a disclaimer
footer - rather a confusing parentage :)
Looking forward to further position statements from Microsoft...
Cheers,
Stephen Wilkinson, Software Engineer, Interactive Creations Inc.
wi...@airmail.net RNDM in Warbirds http://www.icigames.com
"Programming is like pinball. The reward for doing it well
is the opportunity to do it again." (anon - Wizard's Bane)
And more to the point: the situation is certainly temporary. In all
likelihood it will be rectified by year's end.
> >want to be using that iron. That hardware isn't going to look anything
> >like an SGI pipeline
>
> ah-ha. So D3D is designed for Talisman, although D3D predates Talisman
> by at least a year if not more?
Right now, D3D is a joke as far as Talisman is concerned. It shares the
same implementation problems as OpenGL with respect to Talisman. In
theory, D3D Retained Mode should be good for an architecture like Talisman.
In practice, D3D Retained Mode is implemented on top of D3D Immediate
Mode, so there is no benefit. It is very apparent that in the past,
Microsoft has not been synchronizing their OpenGL, D3D, and Talisman
projects. I hope that this situation is changing as we speak.
> >, and OGL is not likely to evolve fast enough as a
> >real-time gaming API to meet John's needs over time.
Who's pushing the evolution: hardware vendors? Microsoft? Seems like John
Carmack is pushing the evolution. And everyone else who's deciding to do
"native" 3d drivers for their games, because D3D doesn't offer enough
performance. Not that the OpenGL situation is any better right now, but it
seems that OpenGL and D3D are in a similar boat as far as games vendors are
concerned.
> >* Don't let the fact that they have the term "3D" in common confuse you.
> > OGL and D3D are designed for very different purposes. Forks and Spoons
> >are both silverware but it hardly makes sense to say that you don't need
> >forks because they're lousy for eating soup, or that spoons suck because
> >you can't eat spaghetti with them. Saying you have to always eat with a
> >spoon, or that you're only going to eat soup because you can't figure
> >out how to use a fork is a fair point, but kind of pitiful and should be
> >fixed.
Give me a break. There are a lot of technically astute 3d people reading
these posts. Give us all a break.
> >* Microsoft aggressively supports D3D and OGL as 3D API standards
> >without playing favorites. We see no contradiction in doing so because
> >the API's are designed to satisfy the needs of very different markets.
You are correct that the markets are different. The technologies are not,
except that D3D has more design flaws than OpenGL. In practice, D3D has
pretty much failed to leverage the reduced requirements of its API, the
"cutting corners" as it were. Hence, this is mostly about Microsoft's
marketing.
> >modeling, and scientific visualization. It has no driver model because
> >it was designed to support a proprietary pixel accurate rendering
>
> Why don't you ask your OpenGL group about "It has no driver model". For
> some reason I think they'd disagree with you on that one.
Indeed, we've been working with the MS OpenGL Mini-Client Driver model
since June 1996.
Cheers,
--
Brandon J. Van Every | Free3d: old code never dies! :-)
| Starter code for GNU Copyleft projects.
DEC Graphics & Multimedia |
Windows NT Alpha OpenGL | vane...@blarg.net www.blarg.net/~vanevery
Well no, that's not fair. The MS OpenGL folks certainly know what they're
doing, and to date they have the fastest software-only OpenGL to prove it.
Remember, Microsoft is not some unified entity, it's more like a coral
reef. You have to consider that there's an OpenGL group, a Direct3D group,
a Talisman group, and who knows how many upper level managers floating
around somewhere. I understand that Microsoft has gone through some
internal restructuring recently with regards to its 3d efforts, and I hope
that brings some unity to the picture.
No way. As far as I remember, some of the scene files for Toy Story
frames were several 100 MBytes large. They use advanced lighting
effects, shadows, sophisticated anti-aliasing algorithms, etc. Every
leaf on the trees is a trimmed NURB, and there are *many* leaves on a
tree. All current hardware would be a few orders of magnitude too slow
to render something of that complexity and quality in real time.
--
Reto Koradi (k...@mol.biol.ethz.ch, http://www.mol.biol.ethz.ch/~kor)
| The problem is that you see Direct3D as an API, or 3d graphic engine.
| It's actually closer to being a driver spec. It is a low-level
| direct-access to 3d accelerator, like DirectDraw is a direct access
| mechanism to the 2D video buffer. That's why it's called 'Direct' 3D.
It never ceases to amaze me how eagerly some people swallow whatever
tripe the Microsoft Marketing Machine(tm) serves up.
--
Michael I. Gold Silicon Graphics Inc. http://reality.sgi.com/gold
And my mama cried, "Nanook a no no! Don't be a naughty eskimo! Save your
money, don't go to the show!" Well I turned around and I said, "Ho! Ho!"
:Bob Schmitt wrote:
:> I've got an SGI Octane sitting on my desktop doing realtime graphics
:> equal to or better than any scene from "Toy Story". Now.
:
:No way. As far as I remember, some of the scene files for Toy Story
:frames were several 100 MBytes large. They use advanced lighting
:effects, shadows, sophisticated anti-aliasing algorithms, etc. Every
:leaf on the trees is a trimmed NURB, and there are *many* leaves on a
:tree. All current hardware would be a few orders of magnitude too slow
:to render something of that complexity and quality in real time.
:--
:Reto Koradi (k...@mol.biol.ethz.ch, http://www.mol.biol.ethz.ch/~kor)
We all know that; it was obviously a troll.
--
Martijn Dekker
UvA - Math department Universiteit van Amsterdam
PFF - Computer Games
http://turing.fwi.uva.nl/~mdekker/pff/ (Xcogitate, Templar)
>j...@curved-logic.com said:
>> "Philip Taylor" <pta...@microsoft.com> wrote:
>>
>> >We've worked with id for a long time on DirectX. Much of the architecture's
>> >early design and philosophy was driven directly by input I got from John
>> >almost three years ago now. Naturally it's disappointing to hear that one
>> >of our earliest and most valued customers is dissatisfied, and we're going
>> >to attempt to rectify that.
>> >The vast bulk of John's complaint appears to reduce to the following:
>> >"Direct3D is too hard to use/figure out." Yup, he's right. We've done a
>> >woeful job of providing sufficiently robust documentation and sample code
>> >of Direct3D Immediate mode. If one of the brightest minds in real time 3D
>> >graphics can't figure it out, who can?
>>
>> I have to say this post makes my blood boil.
>
>Mine too, but not for the same reasons. This microserf is basically saying:
>oh you poor hapless non-Microsoft software developers, we made it too hard
>for you. How about we document it better and give you some vaporware sample
>code to make it easy for you? Uncle Gatesy's here to save you. There,
>there, now, feel better?
Oh, did I forget to mention that infuriated me too ;)
>> While I fully appreciate that Microsoft/their individual employees
>> have finally realised that Direct3D is a bit of a mess, and are
>> working to rectify this, I really cannot understand why they pander
>> to every whim of individual software developers.
>> What the fuck is John Carmack's problem? Direct3D may be difficult to
>> code, but it's not impossible, and just a week's work, especially with
>> support directly from Microsoft developers should have been enough for
>> a 'genius' such as John to get Direct3D working.
>
>Then you missed his entire point. John Carmack cannot afford to waste
>time learning all the nuances of Direct 3D, especially if they have no
>hard fast demonstrable benefit over the much cleaner alternative:
>OpenGL. John Carmack does more with his time than waste it just porting
>DOOM and Quake to every platform under the sun. He's pushing the
>envelope of 3D immersive game play. He can't do that and at the same
>time figure out when to lock, and unlock his surfaces while trying to
>second guess the most optimal way of arranging his execute buffers
>(and writing reams and reams of code doing it.)
John Carmack can afford almost anything he wants (especially if the
reports about him giving away his first Ferrari are true :). He can
afford to spend 3 years re-writing the software renderer 7 or 8 times
to get it right, he can afford to get 3rd party graphics card
manufacturers to build mini-drivers for his game.
The whole premise of id has been to rake in the cash from Quake; for
the sake of a week's work, they could have had an even bigger bank
account. And if he can't do it himself, why not get some other id
employee to do it?
>> [...] Instead Id took the
>> (rather strange) route of implementing hardware support for one 3d
>> card only.
>
>Huh? Quake has been directly ported to Rendition, and 3DFX.
>Furthermore, if you have an OpenGL driver, you have a free port via
>GLQuake, which only relies on the OpenGL32.DLL driver. So it ought
>to work just fine on a PerMedia, or a InterGraph Reactor (as
>mentioned by Carmack himself in an earlier .plan)
That's hardly a large sub-set of the 3d hardware available.
>> [...] While it's rather a nice implementation, 3dfx must be
>> rubbing their hands together with glee, [...]
>The 3DFX guys have much more than just Quake to keep their hands
>warm.
I guess so, the 3dfx is obviously the card of choice at the moment.
>> Possibly John gave up trying to use Direct3D because the Quake engine
>> technology is not compatible with it.
>Huh? I find this very hard to believe. They are compatible with
>OpenGL, but not Direct 3D?
>> [...] It relies on multi-sided concave polygons,
>I don't see how this hurts them w.r.t. a Direct 3D port.
It boils down to how badly current hardware/Direct3D deals with
anything other than triangles.
>> a ridiculous lighting model,
>
>This may be, I have not reasoned out Quake's lighting model just
>yet. Either way, Direct 3D supports lighting, the change couldn't
>be that hard.
>> and platform specific rendering tricks -
>
>Only for the rasterization layer. Of course this disappears in their
>OpenGL port, what makes Direct 3D special?
>
The static lights are pre-rendered into the texture maps. This is
thoroughly inefficient when you consider the limited amount of texture
RAM available; you'd be better off using the Gouraud shading on the
card. Also, the original Quake used 8-bit colour, and loads of LUT
cheats to do this, when that really doesn't apply to hardware
rendering. It wasn't designed with the hardware in mind.
>> none of which are good engine design if he had taken into account
>> the imminent arrival of Direct3D and 3d graphics hardware.
>
>They've known about 3D hardware for quite some time.
Then why design an engine which is so tied to specific hardware? The
OpenGL versions get away with it due to sheer brute force, not good
design.
>> Perhaps if Microsoft had worked more closely with some of the other
>> industry players, Direct3D wouldn't have been such a mess in the first
>> place.
I agree with Brian Hook's comment on this. Microsoft re-invented the
wheel with Direct3D, and made it square.
Jim
>With all due respect, this sounds more like jealousy than rigorous
>analysis. Why can't some people respect Carmack's work without feeling
>that they're somehow smaller at the same time?
Sure, I'm jealous - I wish the games I've written sold as many copies
as Quake :). Why though are Carmack's comments taken as gospel when
it comes to 3d hardware support, when he's taking a completely
different route than almost everyone else, and getting a lot of
personal attention from manufacturers and API writers?
I respect his game, but I don't think his comments are helping the
muddy waters clear on this subject. Perhaps the best thing that will
come out of it is just for that reason - that people listen to him.
Problem is, is he telling them the right thing?
>> Perhaps if Microsoft had worked more closely with some of the other
>> industry players, Direct3D wouldn't have been such a mess in the first
>> place.
>
>Perhaps if Microsoft had left 3D graphics APIs up to the big boys who
>know how 3D graphics hardware and software is written this whole mess
>could have been avoided.
I couldn't agree more.
Jim
Carmack doesn't want to use Direct3D. Direct3D needs support from
leading edge games to take off. Carmack doesn't want to give it that
support or he'll have to use it. It's simple. It's strategic
positioning.
I sure don't want to be writing code for an API that changes at
Microsoft's whim, and will probably never be seen on any other operating
system. No one should.
We need to promote other operating systems and show that anything
available for MS SlaveWare can work elsewhere also. When Mesa has
some solid hardware support, I'd bet that the Linux port of Quake will
take advantage of it. Not possible with Direct3D. We need to
promote the open standards or everyone is going to end up paying a lot
of money to MS just for the privilege of writing software.
I have no idea whether this is Carmack's motivation. It's the net
result. It's a result I strongly support, as do many other people and
companies.
>That's hardly a large sub-set of the 3d hardware available.
Huh? What hardware isn't supported by OGL or the Verite port that
could actually run Quake at useful speeds?
----
Mike Chapman - http://www.paranoia.com/~mike
> I sure don't want to be writing code for an API that changes at
> Microsoft's whim, and will probably never be seen on any other operating
> system. No one should.
I couldn't have said it better. I think this sums up a lot of the problems
people have with DirectX. It's about time someone put it as succinctly
as this!
Actually, these things aren't imposed by D3D, but by Windows. It isn't at
all clear in my mind whether the OpenGL path within Windows will be much
less involuted; unless, of course, we're only talking about full screen
operation.
Alberto.
I think what Mr. Gold wanted to say was that D3D 'appears' to be a
direct-access driver API for 3D hardware, when in fact the D3D driver spec
is really _not_ reflective of the way current 3D hardware operates
(D3DExecuteBuffers???). Future 3D hardware could be designed to optimize
the D3D spec.
The argument has been presented in this newsgroup many times that OpenGL is
as close to or closer to the hardware than D3D. In fact, current hardware
was _designed_ to optimize the OpenGL spec, so you would expect this.
The marketing aspect _is_ amazing.
Eric Powers
pow...@deltanet.com
Michael I. Gold <go...@asd.sgi.com.spam-free> wrote in article
<5e1gl0$c...@fido.asd.sgi.com>...
I didn't mean to imply that Microsoft's OpenGL team was incapable. I
meant that Microsoft should have left the DESIGN of 3D graphics APIs up
to people who have experience working in the field. MS OpenGL is a
great product written by very talented engineers. D3D IM was
a...learning experience.
This is just plain wrong. The pre-rendered light maps are only 1/16th the
resolution of the texture maps, and they are combined at run time, either
through a texture cache, or through alpha blending in glQuake, which gets
around the terrible dynamic-lighting texture re-download performance hits you
see in vQuake. Either way, texture cached or alpha blended, the scheme is
compatible with hardware acceleration (texture caching would be best suited
to a bus faster than PCI). Gouraud shading would require a LOT of
tessellation to get the same lighting detail as the light maps, and would
hurt performance MUCH more.
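For readers who haven't seen the scheme: here is a minimal sketch of the
texture-cache variant (in Python for illustration; the names and the 8-bit
intensity ranges are my assumptions, not id's actual code). One light sample
covers a 16x16 block of texture texels, and the two are multiplied together
at run time:

```python
# Sketch of Quake-style surface caching: a light map stored at 1/16
# the linear resolution of the texture is combined with the texture
# at run time by modulating each texel with the covering light
# sample. Names and 8-bit ranges are illustrative, not id's code.

def modulate(texel, light):
    """Scale an 8-bit texel by an 8-bit light value (255 = full bright)."""
    return (texel * light) // 255

def build_lit_surface(texture, lightmap):
    """texture: rows of 8-bit texels; lightmap: rows of 8-bit light
    samples, one sample per 16x16 block of texture texels."""
    return [
        [modulate(t, lightmap[v // 16][u // 16])
         for u, t in enumerate(row)]
        for v, row in enumerate(texture)
    ]
```

Because the combine step is a per-texel multiply, it maps naturally onto
hardware texture modulation or, as in glQuake, a second blended pass; nothing
about it forces a software rasterizer.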
Jon Graehl
gra...@usc.edu
Carmack is generally considered a visionary. Hell, when Doom came out
they were still doing ray casters, and even today people are all talking
about "Quake killers", yet not a SINGLE game has shipped with a level of
visual quality approaching Quake's. It's easy to get the magazine
articles and interviews (Prey, Unreal, etc.), but no one is shipping
games that get anywhere near Quake's level of immersion or speed.
Carmack has done a lot for the industry and has often implemented things
years before everyone else jumps onto the bandwagon. So when it came to
3D acceleration, people were definitely listening.
> I respect his game, but I don't think his comments are helping the
> muddy waters clear on this subject. Perhaps the best thing that will
> come out of it is just for that reason - that people listen to him.
> Problem is, is he telling them the right thing?
This is a common problem, and he has even admitted as such. I've said
this before, and I guess I'll say it again -- game developers often know
what they want, but they are thinking about THIS product or the NEXT
product. Hardware designers and API designers have to think YEARS
ahead. Carmack was quoted as saying he didn't need a Z-buffer, didn't
need an API, and wanted to write straight to the registers. He has
since retracted all those statements.
The onus is on IHVs and people like Microsoft and SGI to listen to game
developers' desires, but temper that with the fact that game developers
often don't want what they don't know about. When I was writing Glide,
there was a lot of feeling that only a few people would use it and the
rest would go straight to the registers. I was among those people ("use
Glide as sample source"). And that was just wrong. People SAID they
wanted it, but when crunch time came, developers want something that
works.
You can talk to developers about two ways they want things -- in theory,
and in practice. In theory, developers want control of EVERYTHING. In
practice, developers want things to work, period. Always try and
provide the latter. They'll bitch and moan about how they don't have
control, but when you GIVE them that control (Direct3D, register specs,
whatever), they won't use it, or they won't understand it, or they'll
complain about its complexity and won't support it because it's not time
efficient. That's just a fact.
The other thing to watch out for when talking to developers is that
often they don't understand what some things are good for. When I was
at 3Dfx and we first started talking to developers, they were saying how
they couldn't give up paletted video modes (!), or how they didn't need a
Z-buffer, or how alpha blending was useless, and who needed bilinear
filtering, etc. Comments like these usually stem from a lack of
education about the topics at hand.
So it's up to IHVs and the Big Guys to LISTEN to developers, but not
necessarily blindly follow everything the developers say. If you do
this, you end up with a nice theoretical API like Direct3D that everyone
hates, even though everyone wanted it to begin with. I am quite certain
that a lot of developers said they wanted control over EBs because,
well, most developers are control freaks. And now that they have this
control, they have to accept the responsibility that comes along with
it. And that's not nearly as much fun.
A RealityEngine-equipped SGI was $100k+ when first introduced in
1992(?). I bought one last year (called Nintendo 64) for US$199.
The original poster could not conceive of highly realistic rendering
capable machines available in the foreseeable future. I point to SGI's
mid-range machine as being able to do this type of rendering now, with
the belief that in the relatively near future these capabilities will
be single chip implementations available at commodity pricing.
Bob
------------------------------------------------------------------------
Robert A. Schmitt sch...@cineon.kodak.com
You are correct. I apologize for my unbridled enthusiasm for this new
machine and for making over-reaching claims. I'm just so amazed at the rate
of advancement in graphics hardware and this machine really does
perform for my apps. It rocks!
Alberto C Moreira <amor...@nine.com> wrote in article
<MPLANET.3304cdd...@desmond.nine.com>...
No, Execute Buffers have nothing to do with Windows. The reason we all get
to fill execute buffers now is because way back when RenderMorphics was
written, the internals of that particular library probably used execute
buffers. Then Microsoft buys the library, renames all of the functions, and
creates "Immediate Mode", which just bypasses the high-level part of the
library. Nothing to do with "Windows".
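For anyone who hasn't looked at one: a deliberately simplified model of what
an execute buffer is (hypothetical structures in Python; the real Direct3D
declarations are C structs packed into a single memory block). The shape of
the idea is a vertex pool followed by a tokenized instruction stream that
indexes into it:

```python
# Hypothetical, much-simplified model of an "execute buffer": a
# vertex pool plus a stream of (opcode, operands) tokens that index
# into it. This illustrates the shape of the idea only; it is not
# the actual Direct3D layout.

OP_EXIT = 0       # end of the instruction stream
OP_TRIANGLE = 1   # operands: three vertex-pool indices

def execute(vertices, instructions):
    """Walk the instruction stream and return the triangles it
    describes, as tuples of vertex-pool entries."""
    triangles = []
    for op, operands in instructions:
        if op == OP_EXIT:
            break
        if op == OP_TRIANGLE:
            i, j, k = operands
            triangles.append((vertices[i], vertices[j], vertices[k]))
    return triangles
```

The complaint upthread is that real hardware wants vertices and commands in
its own native formats and FIFO sizes, so a driver typically has to re-parse
and re-pack a buffer like this rather than hand it to the card as-is.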
greg
Some of these impositions are caused by Windows, some are caused by focusing
on software rendering. Both bother me.
Which other API are you suggesting using that runs across OS's, and
would not involve ANY re-coding to port your product ?
"Ryan Drake" <stil...@psu.edu> wrote:
------------------------------------------------------------------------------------------------------------------------------
Casey Charlton ca...@caseyc.demon.co.uk
------------------------------------------------------------------------------------------------------------------------------
>
>No, Execute Buffers have nothing to do with windows. The reason we all get
>to fill execute buffers now is because way back when rendermorphics was
>written, the internals of that particular library probably used execute
>buffers. Then Microsoft buys the library, renames all of the functions and
>creates "Immediate Mode" which just bypasses the high-level part of the
>library. Nothing to do with "Windows".
I don't think so.. with Applications running at Ring 3, the task switch to
Ring 0 and hardware access can increase the overhead significantly..
That's why most of the comm drivers are packet oriented, even for async..
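To see why packet-oriented drivers win here, consider a toy cost model (the
constants are invented for illustration; only their ratio matters): the
user-to-kernel transition cost gets amortized over every primitive in a
packet.

```python
# Toy model of ring 3 -> ring 0 overhead. The constants are invented
# for illustration; the point is only that one expensive transition
# can be amortized over a whole packet of cheap primitives.
TRANSITION_COST = 100   # cost of one user/kernel switch
PER_PRIM_COST = 5       # cost of processing one primitive

def cost_per_call(prims):
    """One kernel transition per primitive."""
    return prims * (TRANSITION_COST + PER_PRIM_COST)

def cost_batched(prims, batch_size):
    """One kernel transition per packet of batch_size primitives."""
    packets = -(-prims // batch_size)   # ceiling division
    return packets * TRANSITION_COST + prims * PER_PRIM_COST
```

With these numbers, batching 1000 primitives into packets of 100 cuts the
total cost by more than an order of magnitude versus one call per primitive.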
Wilbur
If John Carmack has any "genius," it's only being far-sighted enough to see
that Direct3D is annoying to work with, it wastes people's time, there's no
performance advantage, there's no reason to inflict Direct3D on anybody,
and it's just Microsoft's way of trying to get a proprietary 3D API. And
that's no genius.
The only technical excuse for Direct3D is the claim that it could
theoretically "cut corners" by reducing rendering accuracy requirements.
In practice, it really hasn't demonstrated this effectively. And the time
window to demonstrate this effectively is most definitely running out. By
winter 1997 all the 3D HW will have negated any "corner cutting" that D3D
might have once bought. Reducing accuracy is pretty much about reducing
rasterization precision, and by this winter that little 3d graphics problem
will be mercifully "solved" by the available hardware. Had D3D made it to
market much earlier in a highly usable, optimized form, then it would have
had an advantage. But it's pretty well squandered that advantage.
> > Instead Id took the
> > (rather strange) route of implementing hardware support for one 3d
> > card only. While it's rather a nice implementation, 3dfx must be
> > rubbing their hands together with glee, and I wonder exactly how much
> > they're paying him - perhaps this has some sort of bearing on
> > Carmack's comments on Direct3D.
Sierra is doing a very similar thing with "native" versions for Rendition
cards. In fact, they're making their own cards with Rendition chips to
support their games! Doesn't seem strange, it seems to be an intelligent
decision on the part of the games industry. If you don't like what D3D or
OpenGL is buying you, you get something else. I hope that by year's end
we'll be having discussions like "there's no point in writing a "native"
driver, the OpenGL's have caught up." As for D3D, hopefully it'll be
"D3D??!? Who cares!" At any rate, we'll see.
You "guess?" Why are you guessing? Have a look at SGI's analysis:
http://www.sgi.com/Technology/openGL/direct-3d.html
Parts of it are SGI-biased, but I'd say the discussion of the execute
buffer's flaws is "spot on."
Because there are SGI and DEC people sitting on this list who agree with
him, that actually know something about 3d hardware and its effect on 3d
API's? I think you're missing the point that Carmack is really
"downstream" from SGI's analysis of D3D's shortcomings. SGI did that work
much earlier. And frankly, SGI isn't just spewing marketroid bullshit
about this stuff.
> I respect his game, but I don't think his comments are helping the
> muddy waters clear on this subject. Perhaps the best thing that will
> come out of it is just for that reason - that people listen to him.
> Problem is, is he telling them the right thing?
Yes, he is. Stop thinking that it's only "Carmack! Carmack! Carmack!"
that's saying all of this.
Or to be more specific: it's amazing that Microsoft thought a large chunk of
disorganized execute buffer memory was a good thing to shove at a 3d card
that has a very limited supply of VRAM and/or register FIFOs.
The most amusing thing about this, is that Microsoft tried to go to the 3d
hardware vendors and get them to all redesign their hardware "in the
Microsoft Way." Guess what most of 'em said? "Sorry."
You'd think that Microsoft would have had the sense to design their 3d API
around extant 3d hardware. Instead, they just exposed the software-only
rendering internals of the RenderMorphics product, and tried to shove it
down everyone's throats as "what 3d hardware should look like."
Fortunately, reality intervened.
Actually there is a bug in the current SGI Quake. Run the hardware
version with "+gl_nomip 1", and from the console type
"gl_texturemode gl_linear_mipmap_nearest"; only the lightmaps will be
rendered. It is what I like to call "Escher Quake". It is actually a
rather useful mode for seeing just what your light sources are contributing
to a scene. Could be helpful for level builders.
Todd
P.S. Newsgroups were trimmed considerably.
[major snippage]
: Answers only produce new questions. My experience tells me that D3D is
: an
: answer, but there are better ones.
: Bob
: ------------------------------------------------------------------------
: Robert A. Schmitt sch...@cineon.kodak.com
: Digital Motion Imaging Cineon Post-Production Group
: Eastman Kodak Company 716.726.5279, 716.253.9467 (FAX)
Bob,
I don't want to address everything point-by-point in your reply.
Some of it I don't feel qualified to address.
However I do have almost 20 years experience in the PC industry that
pre-dates Microsoft as a player in it. Having worked closely with
Intel during most of that time I'm pretty sure that the combination
of Intel and Microsoft, who are virtually inseparable these days,
has a lot better handle on the evolution of the PC than does the
mind share between Eastman Kodak and Silicon Graphics Inc.
The PC industry has evolved now to nearly the point of maturity
of the auto industry. The vast majority of PC's are now commodity
sales by the 10 largest PC manufacturers in the world. I happen
to work at one of those, Dell Computer, and I can assure you that
Dell and the rest of the top ten develop their product lines around
what Intel and Microsoft develop... NOT around what Eastman Kodak,
Silicon Graphics, or Id Software develops.
You may not always agree with the direction that Intel or Microsoft
takes, I know I don't at any rate, but you're a fool if you don't
buckle down and work with it nonetheless. Like it or not, they
ARE the entities that control the evolution of this industry.
David Springer
--
*************** IGAMES INTERNET GAME LOBBY ****************
* *
* NOW SUPPORTING MICROSOFT DirectPlay 3 LOBBY STANDARD ! *
* *
* A real-time game lobby for the internet with many *
* exciting games and thousands of players. Game *
* developers, players, and ISP's can try it out at: *
* *
****************** http://www.igames.com ******************
Uh, no. Last I heard:
1) PowerPC NT is _over_.
2) NT still doesn't have released D3D support. (anyone?)
This limits you to X86 PCs running Win95.
| You may not always agree with the direction that Intel or Microsoft
| takes, I know I don't at any rate, but you're a fool if you don't
| buckle down and work with it nonetheless. Like it or not, they
| ARE the entities that control the evolution of this industry.
Conform!! Conform to standards of mediocrity!!
Sheep dip, anyone?
*giggle*
Sorry, I couldn't help myself. Especially since Microsoft is so terribly
guilty of this in so many other areas. Like Java (J++), internet applets
(ActiveX), or a Windows 95 user interface that even people working for
Microsoft don't particularly care for.
I think the reason why some people inherently bristle at Microsoft is
that many of their divisions are horribly guilty of saying "We're
Microsoft--we are better than you, and we know how to write software/
design hardware/create 3D engines/build operating systems better than
anyone else in the world."
I think it comes from being isolated from reality up there on the
various campuses in Seattle...
- Bill
(Who is not suggesting all of Microsoft is this way: just some of
the groups there.)
--
William Edward Woody - In Phase Consulting - wo...@alumni.caltech.edu
http://www.alumni.caltech.edu/~woody
Microsoft may be the 800 pound gorilla, but they don't always win
every market or battle they set out to win. Take the wordprocessor
market--Microsoft has wanted WordPerfect to go away for quite a
while now, but WordPerfect keeps hanging in there.
Or Microsoft's effort to unseat Adobe's preeminent position in the
graphic arts/typesetting/desktop publishing market. Which failed
so miserably a few years back that Microsoft stripped down their internal
desktop publishing group.
Just because Microsoft is the overall PC OS winner
and the software developer winner doesn't mean Microsoft is
dominant in every market.
In the area of Computer Graphics, Silicon Graphics is the 800 pound
gorilla. That's because, even with Microsoft's recent acquisition of
some of the most talented computer graphics researchers in the world,
Silicon Graphics still has more brainpower, has done more research, and
has perfected more graphics technologies than just about any corporation
out there, Evans & Sutherland notwithstanding.
I'm certain Dell and the rest of the top 10 PC manufacturers (excluding
Apple, which last I checked was still in that list) will take their
cues from Intel/Microsoft.
But unless Microsoft wants to get creamed in the graphics market in
the same way they got creamed in the desktop publishing market, they
better start taking a few of their cues from Silicon Graphics for
the higher-end stuff and ID Software (and others) for the lower-end
stuff.
I'm not so concerned with the 3D hardware: as long as it accelerates
*something* about the 3D pipeline (even if it's just the bitblit at
the end), and as long as there exists a universal driver set which
allows me to get to that hardware, I'm certain *someone* out there will
use it.
But they may not use the rest of the Direct3D pipeline Microsoft
has provided us.
- Bill
>Casey Charlton (ca...@caseyc.demon.co.uk) wrote:
>:
>: This means that your DirectX apps will
>: run on PC's, Macs, Dec Alphas, etc.
>
>Uh, no. Last I heard:
>
>1) PowerPC NT is _over_.
Reasonable bet - but mostly because PowerPCs are over (never really
started, actually, mainly because IBM 'forgot' to support them, and
Apple is too much of a bit-part player)
>2) NT still doesn't have released D3D support. (anyone?)
Note the word 'will' in the comment - look at least 12-18 months from
now.
>This limits you to X86 PCs running Win95.
Not exactly a small market.
It'd be a lot easier to conform if the MS documentation made any
sense. Given their inability to create consistent APIs and document
them in a competitive market, I'd hate to see what they do when
there's no viable competition.
And people talk about Linux being a hack?? I wonder if they've ever
coded for Windblows. Have a bug in your program? Watch the entire
system come down. Think you can read the MS-supplied documentation
and understand how functions work?? Guess again. You'll need to buy
a book or 6 from them for that. X may be tedious, but at least you
can look at the man page and *know* how it should be done, and
understand a basic philosophy of design that will be consistent in
every function.
It's not a bet, it's a fact:
Linkname: Microsoft pulls plug on NT development for PowerPC
Filename: http://www.pcweek.com/news/0203/07apower.html
: >This limits you to X86 PCs running Win95.
:
: Not exactly a small market.
True enough, but the point of your post was the highly suspect
proposition that developing for D3D leaves you in a better
cross-platform position than developing for OpenGL.
I have a hard time believing this would withstand informed analysis.
>In article <330360...@cineon.kodak.com>, Bob Schmitt
><sch...@cineon.kodak.com> writes
>>I've got an SGI Octane sitting on my desktop doing realtime graphics
>>equal
>>to or better than any scene from "Toy Story". Now.
>
>How much did it cost you? How much does the typical consumer pay for a
>PC?
I thought that.
Then I thought that by the time MS have an API worth using, Matrox will
probably be selling that capability for 125 USD!
J/.
--
John Beardmore
Bravo. The biggest problem I've seen with Microsoft continues to grow
larger and larger and more and more frustrating: it takes them forever
to get things done. As a result, those things that are not
Microsoft-spawn seem to speed past them. The Linux kernel has Java
binary support, because they can update it and ship it out. Mesa keeps
getting faster and faster, because they can update it and ship it out.
It's amazing that of the random work of a large diverse group of
undergrads, Linux has become a _far_ more stable platform for anything
than the OS from the tyrant of software corporations. Bureaucracy bogs
them down. The other part of that is the source of the bureaucracy.
Last I read, SGI was hesitant to release Cosmo OpenGL as an OPENGL32.DLL
for fear of offending Microsoft, and I'm now having more serious doubts
about whether we'll ever see it. That source of bureaucracy?
Microsoft's dedication to self-serving goals. The fact is, whenever the
computer industry as a whole reaches another milestone, we all benefit.
Too often, Microsoft's goals (D3D, in this case) are designed solely to
produce revenue for Microsoft, and, in fact, hinder the computer
industry (or more specifically, the 3D industry). It's like that Aesop
fable about the guy who keeps a chunk of gold buried in his backyard and
never spends it -- Microsoft is successful financially, but what good is
financial success if all you do is lord it over others and slow
everything down?
Ok, I'm done ranting. At any rate, I totally concur.
> I have no idea whether this is Carmack's motivation. It's the net
> result. It's a result I strongly support, as do many other people and
> companies.
Myself included -- I've been avoiding moving to Visual J++ as a Java
development platform because I just know that Microsoft embedded some
platform-dependent stuff in there, which only -totally- defies the whole
idea of Java. Damn, would it be nice if they'd rely on quality to bring
in success instead of monopoly.
Brian
: Bob,
David,
I consult to Kodak and other clients and do not
represent my client's views.
Do you have anything you want to discuss about 3D
graphics APIs or the comparison of OpenGL and Direct3D for
games programming?
Bob
------------------------------------------------------------------------
Robert A. Schmitt RESolution Graphics Inc.
Information Visualization Consulting r...@world.std.com
Mike,
Are you a Windows programmer? From the above "rant" I suspect not.
Personally I've been writing Windows software for many years, and find
it a VERY good platform to develop for. The on-line documentation of
Microsoft's Visual C++ compiler is absolutely superb - I also develop
for Unix (mainly Solaris and HP/UX) and their development documentation
is absolutely diabolical by comparison.
Windows NT is the most stable platform I've EVER developed for in my 20-
odd years in this business - my machine stays up for months at a time
and never, ever, crashes. If your NT box crashes, you've either got
faulty hardware or a dodgy device driver.
Regards,
Chris
----------------------------------------------------------------
Chris Marriott, SkyMap Software, U.K. e-mail: ch...@skymap.com
Creators of fine astronomy software for Windows.
For full details, visit our web site at http://www.skymap.com
: >creates "Immediate Mode" which just bypasses the high-level part of the
: >library. Nothing to do with "Windows".
: I don't think so.. with Applications running at Ring 3, the task switch to
: Ring 0 and hardware access can increase the overhead significantly..
: That's why most of the comm drivers are packet oriented, even for async..
I think Wilbur is correct. It probably becomes even more of an issue
with NT as the device drivers are more isolated by protection layers.
[snip]
: > > Instead Id took the
: > > (rather strange) route of implementing hardware support for one 3d
: > > card only. While it's rather a nice implementation, 3dfx must be
: > > rubbing their hands together with glee, and I wonder exactly how much
: > > they're paying him - perhaps this has some sort of bearing on
: > > Carmack's comments on Direct3D.
: Sierra is doing a very similar thing with "native" versions for Rendition
: cards. In fact, they're making their own cards with Rendition chips to
: support their games! Doesn't seem strange, it seems to be an intelligent
: decision on the part of the games industry. If you don't like what D3D or
: OpenGL is buying you, you get something else. I hope that by year's end
: we'll be having discussions like "there's no point in writing a "native"
: driver, the OpenGL's have caught up." As for D3D, hopefully it'll be
: "D3D??!? Who cares!" At any rate, we'll see.
Sierra supporting native mode for one card and Id supporting another
seems to be an *intelligent* decision ? A damn strange definition of
intelligent. Are game players supposed to have a carousel of graphics
cards now ? "Hey Mikey, let's play Quake ! ... Wait a minute Johnny
I was just playing Leisure Suit Larry, I need to swap out my graphics
card first !"
: | You may not always agree with the direction that Intel or Microsoft
: | takes, I know I don't at any rate, but you're a fool if you don't
: | buckle down and work with it nonetheless. Like it or not, they
: | ARE the entities that control the evolution of this industry.
: Conform!! Conform to standards of mediocrity!!
: Sheep dip, anyone?
If you want to continue beating your head against a brick wall
go right ahead. However, if you think the bleating about how much
it hurts is going to have any significant effect then you've made
it to the point where you've beaten yourself senseless.
: Microsoft may be the 800 pound gorilla, but they don't always win
: every market or battle they set out to win. Take the wordprocessor
: market--Microsoft has wanted WordPerfect to go away for quite a
: while now, but WordPerfect keeps hanging in there.
Yeah, Apple and OS/2 keep hanging in there too. I fail to see the
insight obtained by noting that there are a few die-hards who
stay with the minority products. Granted there are any number
of niches populated by people with undying brand loyalty. You can
even make money catering to them.
The bottom line is that the betting man always bets that Wintel will
win what they go after.
: Or Microsoft's effort to unseat Adobe's preeminent position in the
: graphic arts/typesetting/desktop publishing market. Which failed
: so miserably a few years back that Microsoft stripped down their internal
: desktop publishing group.
Ditto.
: Just because Microsoft is the overall PC OS winner
: and the software developer winner doesn't mean Microsoft is
: dominant in every market.
No, but this particular thread IS about the OS. Specifically and
exclusively about the Win 95 OS and its API/driver model to take
advantage of 3D hardware acceleration.
: In the area of Computer Graphics, Silicon Graphics is the 800 pound
: gorilla. That's because, even with Microsoft's recent acquisition of
: some of the most talented computer graphics researchers in the world,
: Silicon Graphics still has more brainpower, has done more research, and
: has perfected more graphics technologies than just about any corporation
: out there, Evans & Sutherland notwithstanding.
They're pretty clueless when it comes to commodity markets, Bill.
I grant their expertise in the high end. It takes a LOT more than
technical expertise to succeed in a commodity market. In fact I'd
say that in-house technical expertise probably has the least to
do with it. You can buy technical expertise cheaper and faster
than you can buy consumer brand recognition.
The average Joe Consumer doesn't know Silicon Graphics from Silicon
Implants. But they DO know Intel and Microsoft. They don't know
Sun Computer from Son of Sam nor do they know a Net PC from a Hair
Net. But they DO know an Intel Inside and Designed for Microsoft
Windows 95 logo on a computer means it will do all the things they
hear about people doing with PC's.
: I'm certain Dell and the rest of the top 10 PC manufacturers (excluding
: Apple, which last I checked was still in that list) will take their
: cues from Intel/Microsoft.
Re: Apple - better check on a daily basis, because they're going down fast...
: But unless Microsoft wants to get creamed in the graphics market in
: the same way they got creamed in the desktop publishing market, they
: better start taking a few of their cues from Silicon Graphics for
: the higher-end stuff and ID Software (and others) for the lower-end
: stuff.
We aren't talking about the graphics market. We're talking about the
games market. There's a huge difference between $100,000 high end
workstations with software prices to match and $2,000 commodity PC's
running $40 games. To exacerbate the difference even more the PC
and the game will both be obsolete in 18-24 months. Surely the game
will have been replaced (played out) if not the hardware too.
That same thing doesn't happen in the high end workstation market and
it doesn't even happen to that degree in the desktop publishing market.
You aren't comparing apples to apples.
: I'm not so concerned with the 3D hardware: as long as it accelerates
: *something* about the 3D pipeline (even if it's just the bitblit at
: the end), and as long as there exists a universal driver set which
: allows me to get to that hardware, I'm certain *someone* out there will
: use it.
Then don't slam the major mover in getting standards implemented in the
PC market. Intel and Microsoft are the only ones with the leverage to
get that done nowadays. Some might say the standards coming out of
them are "mediocre" and reek of conformity. The real story is that the
haphazardly-adhered-to "de facto" standards which have been the central
storyline in the evolution of the IBM "compatible" PC have done more
to stifle that evolution than to help it.
: >This limits you to X86 PCs running Win95.
: Not exactly a small market.
Not a shrinking market either...
: It'd be a lot easier to conform if the MS documentation made any
: sense. Given their inability to create consistent APIs and document
: them in a competitive market, I'd hate to see what they do when
: there's no viable competition.
: And people talk about Linux being a hack?? I wonder if they've ever
: coded for Windblows. Have a bug in your program? Watch the entire
: system come down. Think you can read the MS-supplied documentation
: and understand how functions work?? Guess again. You'll need to buy
: a book or 6 from them for that. X may be tedious, but at least you
: can look at the man page and *know* how it should be done, and
: understand a basic philosophy of design that will be consistent in
: every function.
Welcome to the world of PC's. You're beginning to find out why it
isn't a simple thing to write a robust application for a PC. You'd
have really hated it 6 or 8 years ago when it was REALLY bad. It's
a piece of cake today compared to back then... at least now there's
some serious and moderately successful attempts at creating AND
enforcing standards.
: : Bob,
: David,
: I consult to Kodak and other clients and do not
: represent my client's views.
: Do you have anything you want to discuss about 3D
: graphics APIs or the comparison of OpenGL and Direct3D for
: games programming?
Bob,
Some people think that technical aspects are not the only concern
when comparing graphics APIs.
Is your vision so narrow that you think there isn't room to discuss
other aspects of what does or does not contribute to the commercial
success of a product ?
Do you have anything OTHER than technical points to make ?
If not have the good grace to let others make them.
>Casey Charlton (ca...@caseyc.demon.co.uk) wrote:
>:
>: This means that your DirectX apps will
>: run on PC's, Macs, Dec Alphas, etc.
>
>Uh, no. Last I heard:
>
>1) PowerPC NT is _over_.
>
>2) NT still doesn't have released D3D support. (anyone?)
It's released in the DirectX3 distribution.. but not an official included
release in the delivery of the OS.. I'm doing my D3D research on NT 4.0
right now..
Wilbur
Well then David, you'd better get on the stick. Compaq just joined the
OpenGL Architecture Review Board (the ARB). Intel is already there, and of
course so is Digital. Where's Dell?
Seems to me that Id Software is making the right move in pursuing OpenGL
technology.
Cheers,
--
Brandon J. Van Every | Seattlites! The Northwest Cyberartists want YOU
| to help build their artsy-fartsy electro-tech
DEC Commodity Graphics | virtual worlds stuff! If interested, contact me.
I agree with you on the low-end stuff. Microsoft needs to get over this
artificial OpenGL / Direct3D dichotomy that they invented.
For high-end stuff, I don't see anything wrong with Microsoft's current
strategies. They provide the OS and SoftImage, we provide the cheap OpenGL
horsepower required to out price-perform an SGI box....
I'm not so sure about that. A large chunk of the PC market is relatively
"savvy" about what's out there, and SGI receives an awful lot of press via
the activities of Industrial Light and Magic. Jurassic Park, modifications
to Star Wars film footage....
Granted Microsoft is still much more prominent than all of that.
> Then don't slam the major mover in getting standards implemented in the
> PC market. Intel and Microsoft are the only ones with the leverage to
> get that done nowadays.
You seem to forget that Microsoft has a very active OpenGL group, and that
Intel is a member of the OpenGL ARB. Also, I don't see how Intel really
cares whether people use Direct3D or OpenGL. All they care about is
leveraging MMX and AGP, which can benefit any 'ole 3d API.
> Some might say the standards coming out of
> them are "mediocre" and reek of conformity. The real story is that the
> haphazardly adhered to "de-facto" standards which have been the central
> storyline in the evolution of the IBM "compatible" PC have done more
> to stifle that evolution than to help it.
But this is a straw man with respect to the evolution of OpenGL.
There's nothing "haphazard" about that standard. It has an industry
cross-section of vendors (the ARB) to evolve the standard; since it first
debuted in 1992 it has had 4 years to mature; and it leverages some 10-odd
years of SGI's experience with IrisGL before that. Small wonder that
Direct3D comes off as seriously half-baked by comparison. OpenGL may not
be perfect, but nowadays the number of entries on my OpenGL "gripe list"
is very small (thanks mainly to InterleavedArrays in OpenGL 1.1). And
they'll almost certainly get fixed by the ARB in due course.
"You disagree with me so you're ignorant."
>Personally I've been writing Windows software for many years, and find
>it a VERY good platform to develop for. The on-line documentation of
>Microsoft's Visual C++ compiler is absolutely superb - I also develop
>for Unix (mainly Solaris and HP/UX) and their development documentation
>is absolutely diabolical by comparison.
I completely disagree, and am speaking of the Win32 API documentation
only. Documentation aside, the Win32 API is horribly inconsistent and
not based on a sound, comprehensive, object-oriented design concept.
It's a hack. I'll leave it at that and for anyone else to discover or
not discover on their own. It's not at all worth arguing over.
If you're using Visual C++ (which I've never touched) and the MFC, you
may not be seeing much of the Win32 API in its raw state. Same goes
for people using OWL. I have used OWL and it certainly puts a pretty,
OOP face on a hacked API. You can buy nice development tools for just
about any OS.
>Windows NT is the most stable platform I've EVER developed for in my 20-
>odd years in this business - my machine stays up for months at a time
>and never, ever, crashes. If your NT box crashes, you've either got
>faulty hardware or a dodgy device driver.
I was talking about 95 actually, where any bug in your program may
bring down the system - immediately or more likely slowly.
Nice harangue. Too bad it's spoken from ignorance. But, that's never
stopped you before, has it?
The Rendition Verite (as found in Sierra's card) is the only 3D chip
to get a chip-specific version of Quake produced for it.
Only because the David Springers of the world keep preaching that
every developer should be a good little lemming and swallow any chocolate
covered turd that emerges from that "Huge Anal Orifice" in Redmond.
I'm starting to believe that Davey gets a check for every post
he sends to the newsgroups....
--
****************************************************************************
"We are being Flimflammed by Bill Gates and his partners. Look at
Windows '95. Thats a lot of flimflam you know." Ray Bradbury
****************************************************************************
email address mangled to avoid the spammers
Jim Riblett
jriblett at gate dot net
I was writing Windows programs long before MFC existed. I'm very well
aware of the Win32 API, and it seems OK to me.
: I'll make these points short and crisp so hopefully nobody will miss them.
I think you're the one who's missing the point.
: We've worked with id for a long time on DirectX. Much of the architectures
: early design and philosophy was driven directly by input I got from John
: almost three years ago now. Naturally it's disappointing to hear that one
: of our earliest and most valued customers is dissatisfied, and we're going
: to attempt to rectify that.
: The vast bulk of Johns complaint appears to reduce to the following;
: "Direct3D is too hard to use/figure out." Yup, he's right. We've done a
: woeful job of providing sufficiently robust documentation and sample code
: of Direct3D Immediate mode. If one of the brightest minds in real time 3D
: graphics can't figure it out, who can? Clearly we've fallen over on that
: one. Many of the Direct3D titles shipping today, or shipping shortly have
: been developed with assistance from our developers and our porting lab.
: Clearly the API is not simple enough or well documented enough to lend
: itself to easy learning and adoption, for which we apologize, and I won't
: bore anybody with excuses. We've been aware of this shortcoming for a
: while, and have a whole slew of new sample code, and documentation in the
: works. In the mean time id and any companies with major titles in
: development have always had an open invitation to our porting lab, or to
: talk to our engineers if they need to.
Developer: "Direct3D is HORRIBLY designed."
Microsoft: "We're coming out with better sample code!"
Developer: "Direct3D is HORRIBLY DESIGNED."
Microsoft: "We're coming out with better documentation!"
Developer: "DIRECT3D IS HORRIBLY DESIGNED."
Microsoft: "We'll help you at our PORTING LABS!!!"
Developer: *exasperated look*
Sometimes, I'm not sure if Microsoft speaks the same English language as
I (we) do.
: We're very comfortable that the Direct3D API can be strongly defended on
: its technical merits to anyone who understands it, and it's design
: philosophy. Direct3D's number one purpose in life is to provide a standard
: driver model for consumer oriented 3D hardware acceleration. It's is not
: designed to be the ultimate software rendering engine, John will always
: beat us. It's first design goal was to provide thin fast access to
: hardware acceleration.
Ummm, let's see.
300 microseconds on a P166 to execute an *empty* ExecuteBuffer?
That's 50,000 clock cycles to figure out it shouldn't do anything.
Predictably enough, it's even slower if some processing is required.
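That figure is easy to sanity-check. A quick back-of-the-envelope calculation, assuming a 166 MHz clock as on a P166:

```python
# Sanity check of the cycle count quoted above.
# A P166 runs at roughly 166 MHz, i.e. 166 million cycles per second.
clock_hz = 166_000_000
call_seconds = 300e-6          # 300 microseconds for an empty ExecuteBuffer

cycles = clock_hz * call_seconds
print(round(cycles))           # about 49,800 cycles, i.e. ~50,000
```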
: That's the number one take away I got from John in
: those early meetings.. "get your fat API's out of my way and let me at the
: iron". Direct3D absolutely does this, it may not be pretty, but it does
: it.
"Writing to iron" means writing to hardware registers.
It doesn't mean creating a display list to pass to Windows, which then
goes through a call gate to go from ring 3 to ring 0, calls the memory
manager to check if the pointers are okay, then calls the driver, which
then takes the data, munges it into device-specific format, then returns
to Windows, which then passes back through the call gate, and returns
back to your program, which seems to take 50,000 clock cycles.
: It has drivers for every major 3D hardware accelerator shipping,
: OpenGL does not, and I'm pretty sure that by the year 2000, everybody is
: going to have 3D hardware and John is going to want to be using that iron.
: That hardware isn't going to look anything like an SGI pipeline, and OGL is
: not likely to evolve fast enough as a real-time gaming API to meet Johns
: needs over time.
OpenGL is not a "real-time gaming API". It is a 3D graphics API.
: We're going to add the simpler 3D primitive API's such as draw triangle to
: D3D IM in the next release to make it easier for people doing quick ports
: or evaluations to get up and running without the fuss, with the
: understanding that these primitive API's though easier to use may not
: provide the versatility and fastest possible performance that tweaking D3D
: execute buffers will. I'll elucidate further on this point in the random
: part of this letter.
Developer: "Direct3D is too HORRIBLY designed."
Microsoft: "Okay, we'll give you more APIs!"
Usually, when people make programs simpler, they take things out of the
program, not put more things into it - unless you work at IBM, where they
gauge your productivity by KLOCs (thousands of lines of code).
: Okay those are the important points. The points I'm going to attempt to
: make next fall more strongly in the category of opinion. I gather that some
: might find my point of view on the subject of OGL vs. D3D interesting so
: I'll make some further points on that subject and attempt to substantiate
: them without getting into a "bit head debate".
: Don't let the fact that they have the term "3D" in common confuse you. OGL
: and D3D are designed for very different purposes. Forks and Spoons are
: both silverware but it hardly makes sense to say that you don't need forks
: because they're lousy for eating soup, or that spoons suck because you
: can't eat spagetti with them. Saying you have to always eat with a spoon,
: or that you're only going to eat soup because you can't figure out how to
: use a fork is a fair point, but kind of pitiful and shoul
I think it's Microsoft which needs to understand this concept.
Microsoft is forcing us to use things like the COM interface (spoons)
even though it's pretty terrible for a 3D API.
Microsoft is forcing us to use an ExecuteBuffer (another spoon)
even though it's horrible for most 3D games.
I could ramble on for pages, but you get the idea.
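The design difference being complained about can be sketched in miniature. The two toy classes below are hypothetical illustrations only; they do not reproduce the real Direct3D or OpenGL interfaces:

```python
# Toy illustration of the two API styles under debate. These classes are
# hypothetical and do not reproduce any real Direct3D or OpenGL interface.

class ExecuteBufferAPI:
    """Display-list style: build an opcode buffer, then submit it whole."""
    def __init__(self):
        self.buffer = []
        self.submitted = []

    def add_op(self, opcode, *args):
        self.buffer.append((opcode, args))

    def execute(self):
        # One big submit; the buffer is spent afterward.
        self.submitted.extend(self.buffer)
        self.buffer = []

class ImmediateAPI:
    """Primitive-call style: each call draws directly."""
    def __init__(self):
        self.submitted = []

    def draw_triangle(self, v0, v1, v2):
        self.submitted.append(("TRIANGLE", (v0, v1, v2)))

tri = ((0, 0), (1, 0), (0, 1))

eb = ExecuteBufferAPI()
eb.add_op("TRIANGLE", *tri)            # step 1: record the command
eb.execute()                           # step 2: submit the batch

im = ImmediateAPI()
im.draw_triangle(*tri)                 # one call, same result

assert eb.submitted == im.submitted == [("TRIANGLE", tri)]
```

Both paths end up with the identical triangle; the argument is over how much ceremony stands between the programmer and that result.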
: Microsoft aggressively supports D3D and OGL as 3D API standards without
: playing favorites. We see no contradiction in doing so because the API's
: are designed to satisfy the needs of very different markets. Direct3D is a
: real-time consumer multimedia API designed to support very fast access to a
: diverse range of new hardware accelerators.
300 microseconds, null ExecuteBuffer call.
: OGL is an industrial strength
: 3D API designed to serve the needs of high end CAD, modeling, and
: scientific visualization. It has no driver model because it was designed
: to support a proprietary pixel accurate rendering pipeline that most
: consumers won't find in a 200 dollar accelerator any day soon. Nor is
: there much evidence that it's the kind of hardware consumers would want
: anyway. Sure there may be some OGL drivers made for consumer iron, but
: they're not likely to make as good a use of it for games as D3D is. D3D is
: about direct access to a broad array of consumer 3D acceleration.
: It's going to get very weird out there for 3D hardware. The problem of
: making 3D fast is unsolved. I think we can all agree that it will be a
: while before we see real-time Toy Story graphics rendered on our desktops.
: There are dozens of hardware companies out there trying all kinds of exotic
: technologies and formulations to find a way to squeeze a few more megpixels
: out of a 50 dollar piece of silicon. Microsoft certainly doesn't want to
: stand in the way of their progress, yet at the same time we don't want the
: PC platform to get so fragmented with divergent 3D hardware designs that
: nothing works properly. (If you think 2D video card driver problems are a
: PITA, wait until you see what the 3D hardware does.) Thus the need for a
: standard driver architecture that is flexible enough to accommodate
: innovation, yet rigid enough to actually be a standard that gets used.
Microsoft should stop trying to support all the weird fringe 3D chipsets
out there. You can't support some of them at all (Nvidia NV1), and some are
just too weird to support well (SMOS PixelSquirt, NEC PowerVR).
Direct3D should provide good support for decent 3D cards only, really;
what it attempts to do now is provide horrible support for all 3D cards.
: Think it's easy? Here are a few thought problems for you bit heads out
: there to solve;
: How does your real-time game balance it's processing needs across the bus
: to consistently get the fastest possible performance on hardware that may
: vary one or more of the following ways; 1. It has a 3D hardware
: accelerator which may or may not be slower than the CPU at some 3D
: rendering tasks. 2. Some hardware goes 10x faster if you re-author your
: game to support weird feature X. You have to decide to support that
: feature 18 months before you know how popular that hardware accelerator
: really is. 3. Next year new PC's will ship with buses that go much faster
: than the current hardware, there will be some new 3D hardware that takes
: advantage of this, and a lot that doesn't. The equation for determining
: which resources to off load from the CPU to the hardware to get the best
: performance changes completely, now how do you write your game? By the way
: the OS and OS version may also vary, and you don't know exactly how and
: when it's taking CPU cycles from you.
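That first thought problem is, at bottom, a scheduling question. A minimal sketch of one naive answer (all rates, task names, and the greedy policy here are hypothetical, not anything D3D or OpenGL actually does):

```python
# Toy sketch of the balancing problem posed above: given measured rates
# (work units per second) for the CPU and a 3D accelerator, route each
# rendering task to whichever side currently finishes it sooner.
# All numbers and task names are hypothetical.

def schedule(tasks, cpu_rate, hw_rate):
    """Greedy split: assign each task to the side with less queued time."""
    cpu_time = hw_time = 0.0
    plan = {}
    for task, units in tasks.items():
        if cpu_time + units / cpu_rate[task] <= hw_time + units / hw_rate[task]:
            cpu_time += units / cpu_rate[task]
            plan[task] = "cpu"
        else:
            hw_time += units / hw_rate[task]
            plan[task] = "hw"
    return plan

tasks = {"transform": 1000, "rasterize": 1000}
cpu_rate = {"transform": 50_000, "rasterize": 5_000}    # CPU fast at transform
hw_rate = {"transform": 10_000, "rasterize": 100_000}   # card fast at fill

print(schedule(tasks, cpu_rate, hw_rate))
```

The split flips as soon as the hardware's relative rates change, which is exactly why a fixed answer baked into a driver architecture is hard to pin down.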
: Driver architecture "A" is very fast for 3D hardware shipping this Xmas,
: but very slow for hardware shipping next Xmas. Architecture "B" will slow
: down this Xmas's titles. If your architecture slows down this Xmas's games
: nobody will use your API and it won't be a standard, if you slow down next
: Xmas's games your API will be obsolete then, and everybody's new games will
: have to write to something different. If they have to write to something
: different you still don't have a standard, and you're faced with huge
: driver upgrades and incompatibility problems. If you support both "A" and
: "B" your architecture is too complicated, ships 9 months later, and nobody
: can figure out how to use it, or write a proper driver for it. Which
: architecture do you choose.. there is no "C".
: Are these questions anybody asked in designing OGL?
They didn't have to, because OpenGL doesn't specify a particular driver
model or architecture.
Look at your assertions three paragraphs prior.
: Does stripping down, or speeding up OGL a bit and touting it as a consumer
: 3D API demonstrate much comprehension of these issues?
: How much relevant experience can anybody claim to have at solving these
: problems? Does all the spoon experience out there translate directly to
: chopsticks mastery? The consumer 3D HW market is so new that no real
: experts exist yet, nobody knows how it's really going to evolve.
: I dunno, but if you think you've got all the answers then I've got a few
: questions I'm dying to put to you. In the mean time keep an eye out for the
: next release of Direct3D which will include the new draw triangle API, new
: sample code and documentation. I encourage John and everybody else out
: there to take another look at the next release, and see if it all doesn't
: start to make more sense.
It'll still be the same design.
: --
: The opinions expressed in this message are my own personal views
: and do not reflect the official views of Microsoft Corporation
Never managed that with NT 4.0. If the program has a bug, I get a
dialog box, giving me the option of looking at it in the debugger.
> Think you can read the MS-supplied documentation
> and understand how functions work?? Guess again. You'll need to buy
> a book or 6 from them for that. X may be tedious, but at least you
> can look at the man page and *know* how it should be done, [..]
After something like 6 years of Motif experience, and 1 month of
Win32 programming, I don't agree.
The online documentation of Visual C++ is almost excellent, it only
left me clueless (because it was either wrong or misleading) once or
twice. I have one book in addition (from Microsoft Press...), and I
wouldn't want to do without that.
On the other hand, using the first Motif releases, with only the
official documentation, was a true nightmare. Things got much better
after Motif got somewhat less buggy, but mainly with the Heller book,
only this made it possible to use Motif more or less efficiently.
I still find Motif to be at least as unpredictable as Windows, even
though I have much more experience using it.
From a technical point of view, Win32 is not beautiful, there are
many things that are better designed in Motif. But both can be used,
and I don't find one to be much easier than the other. I'd be (almost)
the last to say that Windows is technically superior to UNIX, but
it still deserves a fair judgement.
--
Reto Koradi (k...@mol.biol.ethz.ch, http://www.mol.biol.ethz.ch/~kor)
There is a difference between a 'niche market' and a 'minority
manufacturer.'
Suggesting brand loyalty can totally define a market overlooks the
fact that the computer industry is not a single monolithic industry
whose needs can be completely satisfied by a single software
manufacturer. There is no single monolithic market--there are many
different software markets, each with different needs, different
customers, and different players.
Microsoft just happens to be a player in most of them.
> : Or Microsoft's effort to upseat Adobe's preminent position in the
> : graphics art/typesetting/desktop publishing market. Which failed
> : so miserably a few years back that Microsoft stripped down their internal
> : desktop publishing group.
>
> Ditto.
Uh, hate to burst your Microsoft bubble, but Adobe has so thoroughly
kicked Microsoft's ass out of the desktop publishing market that Microsoft
doesn't even try anymore.
> : But unless Microsoft wants to get creamed in the graphics market in
> : the same way they got creamed in the desktop publishing market, they
> : better start taking a few of their cues from Silicon Graphics for
> : the higher-end stuff and ID Software (and others) for the lower-end
> : stuff.
>
> We aren't talking about the graphics market. We're talking about the
> games market. There's a huge difference between $100,000 high end
> workstations with software prices to match and $2,000 commodity PC's
> running $40 games. To exacerbate the difference even more the PC
> and the game will both be obsolete in 18-24 months. Surely the game
> will have been replaced (played out) if not the hardware too.
There may be a difference in the resulting market, there may be a
difference between workstations and game engines, but the software is
pretty much the same, regardless of it being in a $100,000 Onyx or a
$2,000 Dell PC.
> Then don't slam the major mover in getting standards implemented in the
> PC market. Intel and Microsoft are the only ones with the leverage to
> get that done nowadays. Some might say the standards coming out of
> them are "mediocre" and reek of conformity. The real story is that the
> haphazardly adhered to "de-facto" standards which have been the central
> storyline in the evolution of the IBM "compatible" PC have done more
> to stifle that evolution than to help it.
I am not 'slamming' Intel or Microsoft for getting a standard across.
In fact, I applaud it.
I'm just saying that before Microsoft goes off and tries to invent a
'standard' that people are supposed to use, they need to consider a
standard that people _will_ use. Which means taking into account the
work of Silicon Graphics, ID and others before Microsoft starts shoving
de facto standards down our throats.
Microsoft has a history of trying to create standards that people
wind up not using. It would be a shame if they did the same thing here.
: Well then David, you'd better get on the stick. Compaq just joined the
: OpenGL Architecture Review Board (the ARB). Intel is already there, and of
: course so is Digital. Where's Dell?
Where's Dell ? Creating value for stockholders by focusing on relevant
technologies. Dell was rated the number one stock in the world to own
in 1996. It creamed Intel, Microsoft, and Compaq in equity growth.
And Digital - they're a monumental flop in the PC market.
Anyhow - I've never said OpenGL is going to die nor that no one should
pay any attention to it. My claim is that it won't make it as a PC
game API. Quote me some press where Compaq and Intel are pushing it
as a game API and I'll concede some ground to you.
By the way... wasn't Intel pushing 3DR this time last year ? Intel
should stay out of the software business.
: Seems to me that Id Software is making the right move in pursuing OpenGL
: technology.
Maybe they did make the right move for Id - who knows - I don't know
who is paying them to make product endorsements (or denouncements).
My opinion continues to be that the story of OpenGL as a PC 3D *game*
API is "too little, too late".
At least you have the good grace to discard any credibility you
might have faked with your closing remark.
Cheers,
Angus.
'course, it never ceases to amaze me how people are so eager to
hand out the SAME Marketing Hype - Folks, do note he is FROM Silicon
Graphics. Do you think he has an agenda? BY THE WAY, I OWN (or have
owned) every Silicon Graphics machine since they started, so I am NOT
slamming their equipment (although it IS overpriced).
What I am saying is that it is humorous to see someone slam one company
for doing something while doing the same HIMSELF!!!
: I'm not so sure about that. A large chunk of the PC market is relatively
: "savvy" about what's out there, and SGI receives an awful lot of press via
: the activities of Industrial Light and Magic. Jurassic Park, modifications
: to Star Wars film footage....
You're wrong. You can't even say a large chunk of the PC market even
plays games and be correct. You know too many technophiles and can't
see the forest for the trees.
The entire PC game market is $650 million. I'd entertain an argument
that a majority of that market segment knows Silicon Graphics from
Silicon Implants but I wouldn't concede it without reliable data.
The average PC user - which is an appliance user - usually knows who
Intel and Microsoft are and that's about the extent of it. They'll
recognize Apple but would need someone to explain the difference
between an Apple and an IBM compatible PC and will come away still
not quite understanding the difference.
: Granted Microsoft is still much more prominent than all of that.
Orders of magnitude more well known... the name Bill Gates is probably
as well known, if not more so, than Bill Clinton.
: > Then don't slam the major mover in getting standards implemented in the
: > PC market. Intel and Microsoft are the only ones with the leverage to
: > get that done nowadays.
: You seem to forget that Microsoft has a very active OpenGL group, and that
: Intel is a member of the OpenGL ARB. Also, I don't see how Intel really
: cares whether people use Direct3D or OpenGL. All they care about is
: leveraging MMX and AGP, which can benefit any 'ole 3d API.
You seem to be forgetting that no one is arguing against OpenGL's continued
use as a high end 3D API and that the ARB members are there for that
reason rather than to push for its adoption as the game API of choice.
: > Some might say the standards coming out of
: > them are "mediocre" and reek of conformity. The real story is that the
: > haphazardly adhered to "de-facto" standards which have been the central
: > storyline in the evolution of the IBM "compatible" PC have done more
: > to stifle that evolution than to help it.
: But this is a straw man with respect to the evolution of OpenGL.
: There's nothing "haphazard" about that standard. It has an industry
: cross-section of vendors (the ARB) to evolve the standard; since it first
: debuted in 1992 it has had 4 years to mature; and it leverages some 10-odd
: years of SGI's experience with IrisGL before that. Small wonder that
: Direct3D comes off as seriously half-baked by comparison. OpenGL may not
: be perfect, but nowadays the number of entries on my OpenGL "gripe list"
: are very small (thanks mainly to InterleavedArrays in OpenGL 1.1). And
: they'll almost certainly get fixed by the ARB in due course.
It would be a straw man if I'd used it in that context. I used it to
point out that Microsoft driven standards are a good thing. But, I'll
tailor it for you...
What prevents a graphics card company from doing an OpenGL driver and
selling it without certification? Are they breaking a law? Can
someone force them to certify it?
Now - assuming that a certification suite for D3D is forthcoming, which
I've no reason to doubt will happen in the next 12 months, what prevents
the same graphics board company from doing a D3D driver with no
certification ?
Answer in both cases is nothing but market pressure. However, in
the second case the product won't get a "Designed for Microsoft
Windows" logo on it and NONE of the major PC manufacturers will sell
it.
Microsoft has tremendous financial leverage and if a company sells
any windows ready PC with a component that hasn't either passed
certification at an independent test lab or received a written waiver
for the failure from Microsoft then said company loses the right
to apply a "Designed for Microsoft Win 95" logo throughout the
ENTIRE COMPANY and the entire company also loses its discounts
on Microsoft products. For any one (except Apple) of the big
ten PC makers that's tens of millions (probably over 100 mil for
Compaq) of bottom line profit - GONE.
And nobody even wants to contemplate what the lack of a Microsoft
logo on their PC's will do to their sales... as if losing 25% of
your net profit isn't bad enough.
Note that the technical points of OpenGL or D3D do not come into
play. They don't NEED to come into play. Unless D3D was completely
and hopelessly unusable or vastly inferior in performance to some
OpenGL implementation on a PC it doesn't stand a chance of being
used contrary to how Microsoft wants it used. It has been
sufficiently proven that D3D is NOT unusable and NOT inferior
in performance. At worst it is more difficult to use and that
simply isn't enough to make it go away.
| Maybe they did make the right move for Id - who knows - I don't know
| who is paying them to make product endorsements (or denouncements).
There you go again. You seem to be arguing from a perspective that the
only thing which motivates people is money. Like Carmack needs any more
money! It's because of his financial independence that he can take this
"risky" stand for "what is right" as opposed to "what is profitable".
Not everyone shares your value system of "money first". I could leave
SGI tomorrow and make more money, but I'm staying because I love my job
and believe in the technology. I'm not paid to believe or evangelize.
You, on the other hand, admit to being motivated by money:
In <5a6r6q$9fp$1...@boris.eden.com> David Springer writes:
>> Yeah, my pointy little head is up Wintel's ass. My nose is plugged
>> (although the smell isn't really that bad) and my beady little
>> eyes are wide open.
>>
>> You know what I see ? Money - the inside of Wintel's ass is papered
>> in the filthy green shit. Since I don't figure I'm ever going to
>> own an asshole that shits money I figure the next best thing is
>> to hang out under one...
Please don't project your own nasty value system onto everyone else.
> No, Execute Buffers have nothing to do with Windows. The reason we all get
> to fill execute buffers now is because way back when RenderMorphics was
> written, the internals of that particular library probably used execute
> buffers. Then Microsoft buys the library, renames all of the functions and
> creates "Immediate Mode" which just bypasses the high-level part of the
> library. Nothing to do with "Windows".
Actually, user-to-kernel transitions are quite expensive, especially in NT.
Hence the concept of batching data words before sending them down to the
kernel driver. Hence execute buffers. Hence command word batching in the
OpenGL ICD and MCD. It's all the same issue, be it D3D or OpenGL.
Alberto.
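A toy cost model makes the batching point concrete. The cycle counts below are illustrative assumptions, not measurements of any real driver:

```python
# Toy cost model for batching commands across a user/kernel boundary.
# The numbers are illustrative assumptions, not measurements:
TRANSITION_CYCLES = 50_000   # fixed cost per user->kernel round trip
PER_COMMAND_CYCLES = 100     # marginal cost of processing one command

def cost_per_command(batch_size):
    """Average cycles per command when sent batch_size commands at a time."""
    return TRANSITION_CYCLES / batch_size + PER_COMMAND_CYCLES

for n in (1, 10, 100, 1000):
    print(n, cost_per_command(n))
```

Unbatched, the transition dominates (50,100 cycles per command); at 1,000 commands per batch it is nearly amortized away (150 cycles). That amortization is the rationale for execute buffers and for command batching in the OpenGL ICD/MCD alike.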
| What I am saying is that it is humorous to see someone slam one company
| for doing something while doing the same HIMSELF!!!
No, sorry, I'm not spreading lies. The notion that Direct3D IM is a driver
spec is just wrong. And despite your attempts at indirection, the fact
that SGI is imperfect does not change the fact that this assertion by the
Microsoft documentation is just plain wrong.
HEY EVERYBODY!!! LOOK AT MY RETURN ADDRESS!!! I WORK FOR SGI!!!
That doesn't alter the facts of this discussion.
> Or to be more specific: it's amazing that Microsoft thought a large chunk
> of disorganized execute buffer memory was a good thing to shove to a 3d
> card that has a very limited supply of VRAM and/or register FIFO's.
Every Windows graphics card has enough video memory to maintain, depending
on resolution and bit depth, a fair amount of offscreen memory plus the full
frame buffer. Virtually all Windows graphics cards have a visible frame
buffer, to take advantage of the DIB Engine. Someone coming into Windows
with a card that doesn't have a large linear-addressable video memory could
be considered non-standard, to say the least. As for FIFO, our Imagine 128
is one of the fastest Windows cards around, and it doesn't have a FIFO: it
has a VRAM-resident display list, and a zero-wait-state cache.
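A quick worked example of that memory arithmetic, using illustrative figures for a typical 2 MB card of the era (the resolutions and card size are assumptions for the sake of the example):

```python
# Worked example of the frame-buffer arithmetic implied above.
# Figures are illustrative for a typical 2 MB card of the era.
def frame_buffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

card_bytes = 2 * 1024 * 1024                     # 2 MB of video memory

for w, h, bpp in [(640, 480, 8), (800, 600, 16), (1024, 768, 16)]:
    fb = frame_buffer_bytes(w, h, bpp)
    print(f"{w}x{h}x{bpp}: {fb} bytes, {card_bytes - fb} left offscreen")
```

Even at 1024x768 in 16-bit color, the visible frame buffer takes 1.5 MB and still leaves half a megabyte of offscreen memory on a 2 MB card.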
When 3D came to be in the PC arena, people had been doing graphics boards
for a while, and many de facto standards had evolved. Most everyone in the
Windows graphics marketplace adheres to those standards.
> The most amusing thing about this, is that Microsoft tried to go to the 3d
> hardware vendors and get them to all redesign their hardware "in the
> Microsoft Way." Guess what most of 'em said? "Sorry."
Microsoft is doing something different: it's coming to the Windows graphics
designers and saying, "do 3D my way". Traditional 3D hardware vendors are at
a disadvantage here, because a Windows card must have superior 2D
performance and functionality besides being good at 3D; you can have 3D
performance, but you won't go anywhere without excellent Winbench scores and
video frame rates.
Those 3D developers who say "sorry" are risking being marginalized in the
rush for Windows market niches.
> You'd think that Microsoft would have had the sense to design their 3d API
> around extant 3d hardware. Instead, they just exposed the software-only
> rendering internals of the RenderMorphics product, and tried to shove it
> down everyone's throats as "what 3d hardware should look like."
> Fortunately, reality intervened.
Microsoft designed its 3D API around extant Windows standards. What they did
with Direct 3D was to plug it into DirectDraw, which was already a standard
- and software supported by many Windows graphics vendors - before D3D came
to be. Microsoft's path has been to evolve things within Windows, and their
approach to 3D has been the same. Even OpenGL has had to tilt slightly
towards Windows, with the concept of Rendering Context and what not. It is
likely - no, certain - that Windows OpenGL programs won't run anywhere else
without conversion.
When talking about D3D, or about OpenGL in a PC world, one cannot afford to
be 3D-centered; one must become Windows centered.
Alberto.
>James Shaw wrote:
>> I respect his game, but I don't think his comments are helping the
>> muddy waters clear on this subject. Perhaps the best thing that will
>> come out of it is just for that reason - that people listen to him.
>> Problem is, is he telling them the right thing?
>
>This is a common problem, and he has even admitted as such. I've said
>this before, and I guess I'll say it again -- game developers often know
>what they want, but they are thinking about THIS product or the NEXT
>product. Hardware designers and API designers have to think YEARS
>ahead. Carmack was quoted as saying he didn't need a Z-buffer, didn't
>need an API, and wanted to write straight to the registers. He has
>since retracted all those statements.
So you agree with my concerns that people may be getting the wrong
message? Perhaps John will have to retract his statements on Direct3D
when Microsoft finally get their act together and ditch the execute
buffer system, and replace it with/add a triangle drawing interface,
and proper access to the state flags.
I've never wanted to access the hardware at register level. Many of
my colleagues had their fingers burned when trying to do this with
individual hi-res graphics cards - they soon realised that register
bashing is thoroughly nasty. It may be the fastest way to drive an
individual card, but it doesn't drive a whole set of cards - it needs
reprogramming for each. Now we have APIs that help us do this - to
rubbish them because they are not simple to use is unfair, though
comments about their efficiency are fair game.
Jim
I don't understand that - The Verite chip-set is not the same as the
3dfx one (which has had an OpenGL mini-driver built for it).
Jim
At some point in time most of us will have to be taught the facts of life.
That is, we are talking about running 3D graphics, UNDER WINDOWS - which is
a Microsoft operating system. Furthermore, we're talking about running it
ON PCs - which are a consumer-level hardware standard fueled by Intel 386+
compatible CPUs.
Remember: Windows and PC. That's the baseline; the rest is fringe. Moreover, a
commodity-priced PC cannot fit two graphics chips or boards. Consequence:
your 3D card must also have excellent Windows 2D graphics characteristics.
No Windows 2D, and you're in a fringe market at best, or a dead duck in the
worst scenario.
Now, Windows has several standards, which evolved across the years, way
before 3D intersected PC space. PC people are not going to give up those
standards. Hence: any new hardware or software that wants to exist in PC
space must coexist with and adhere to those standards. And, surprise! Many of
these are Microsoft-driven and Microsoft-enforced.
Non-PC people first entering the PC world face several problems. Programming
in Windows involves a model that's different from programming in other
environments; no matter what advanced technology is introduced, it must
work well within the Windows environment. A less apparent but no less
stringent issue is that PC buyers must be willing to fork out their hard-
earned money. We're not talking about university computer centers, rich
film-making corporations, or high-power graphics startups; we're talking
about John Q. Public going to Lechmere's or CompUSA and spending his precious
salary on a machine to take home and use for his personal needs, and his
family's. In this environment, $2,000 is already a lot of money, and it must
buy a whole system: 16MB of memory plus a 2GB hard disk, CD-ROM, and the whole
caboodle. How much of that can your graphics card hog?
Now go back to your drawing board and design under these constraints: the
finished product must integrate seamlessly into Windows; it must show
competitive performance on existing Windows apps; it must be attractive from
a consumer standpoint; and it cannot cost much more than a couple hundred
bucks or so at the retail outlet. And to make things more complicated: in
order to sell your wares, you may have to get them approved by Microsoft. And
if you're not willing to do it, hold no illusions: someone else will, and
they'll kick you out of the market. And by the time you get there, you will
probably have realized that the battle between API A and API B is the least
you must worry about; you'll probably have to support both, and much more.
Mediocre? Stifling? Restrictive? Unfair?
Welcome to the PC world.
Alberto.
This is stretching the truth. To claim that D3D was designed around extant
hardware because MS belatedly bolted it into the 2D DirectDraw support is
misleading. There was some 3D hardware around then as well as a plethora of
3D experience which was simply ignored. D3D's design has far more to do with
what RenderMorphics could squeeze out of a software renderer 3 years ago
than with good hardware support, then or today.
> approach to 3D has been the same. Even OpenGL has had to tilt slightly
> towards Windows, with the concept of Rendering Context and what not. It is
> likely - no, certain - that Windows OpenGL programs won't run anywhere else
> without conversion.
OpenGL is independent of the window environment, the OpenGL parts won't
require porting, and if you use the GLUT utilities then you won't have to
port any of the graphics or windowing. For many applications GLUT won't
suffice but you could equally say that if I program for an X server using
Motif I may have to write some platform specific Motif code.
This is not OpenGL changing to suit the platform; these are platform-
specific implementation details. OpenGL is _only_ the 3D API.
>
> When talking about D3D, or about OpenGL in a PC world, one cannot afford to
> be 3D-centered; one must become Windows centered.
Yes, but a decent 3D API makes the most challenging part a lot easier and
completely portable.
Cheers,
Angus.
| In article <5e57rd$e...@fido.asd.sgi.com>, go...@asd.sgi.com.spam-free says...
| > spri...@eden.com (David Springer) writes:
| >
| > | You may not always agree with the direction that Intel or Microsoft
| > | takes, I know I don't at any rate, but you're a fool if you don't
| > | buckle down and work with it nonetheless. Like it or not, they
| > | ARE the entities that control the evolution of this industry.
| >
| > Conform!! Conform to standards of mediocrity!!
|
| At some point in time most of us will have to be taught the facts of life.
|
| That is, we are talking about running 3D graphics, UNDER WINDOWS - which is
| a Microsoft operating system. Further more, we're talking about running it
| ON PC'S - which are a consumer-level hardware standard fueled by Intel 386+
| compatible CPUs.
Sure but David seems to be arguing "you may turn off your brain and allow
Wintel to drive innovation. Resistance is futile. You WILL be assimilated."
I consider it a Good Thing that there are innovators in this industry and
strongly object to the NIH attitude prevalent at Microsoft.
| Remember: Windows and PC. That's the baseline; the rest is fringe. More, a
| commodity-priced PC cannot fit two graphics chips or boards. Consequence:
| your 3D card must also have excellent Windows 2D graphics characteristics.
| No Windows 2D, and you're in a fringe market at best, or a dead duck in the
| worst scenario.
What's your point? That a card which supports OpenGL cannot also support
2D? That D3D is somehow a better "fit" for cards designed to handle both?
This is just wrong.
| Now, Windows has several standards, which evolved accross the years, way
| before 3D intersected PC space. PC people are not going to give up those
| standards. Hence: any new hardware or software that wants to exist in PC
| space must coexist and or adhere to those standards. And, surprise! Many of
| these are Microsoft-driven and Microsoft-enforced.
No problem there. Microsoft has a fine implementation of OpenGL, and has
an MCD kit which allows 2D/3D cards to accelerate OpenGL.
I'm not suggesting that people go design hardware without consideration
for the market, driving technology and current interfaces. I'm saying
we don't all need to ask "how high?" when Microsoft barks out "jump!".
Also you are approaching this from a hardware standpoint, apparently
with the mistaken impression that the hardware requirements of D3D and
OpenGL are somehow fundamentally different. I suggest you look again.
| Microsoft designed its 3D API around extant Windows standards. What they did
| with Direct 3D was to plug it into DirectDraw, which was already a standard
| - and software supported by many Windows graphics vendors - before D3D came
| to be. Microsoft's path has been to evolve things within Windows, and their
| approach to 3D has been the same. Even OpenGL has had to tilt slightly
| towards Windows, with the concept of Rendering Context and what not. It is
| likely - no, certain - that Windows OpenGL programs won't run anywhere else
| without conversion.
1) OpenGL can fit into DirectDraw as easily as can Direct3D. I understand
this has already been done but not released.
2) OpenGL on all platforms supports the notion of a Rendering Context.
3) I have lots of OpenGL samples which run on Windows and UNIX without
modification. Obviously the window creation and event management code
differs, but I used GLUT so those differences are masked. There is no
fundamental difference in the way OpenGL is used between the platforms;
in fact most of the wgl interface has direct analogs in the glX interface.
And all the companies suffering under this should be working together
to put an end to it. Better to spend money now than suffer for years
to come.
This is quite true -- which makes me wonder about the long term
feasibility of ICD and MCD when it comes to obtaining maximum
performance. Ideally, a call to glVertex3f() will result in something
like:
mov edi, commandslotaddress
mov [edi], cmd_gl_vertex3f
mov [edi+VX], x
mov [edi+VY], y
mov [edi+VZ], z
But with ICD or MCD the above would get batched to minimize the
user/kernel transitions you mention.
Which means for ultimate performance we may end up seeing IHVs
implementing their own OpenGL32.DLL simply because they're not going to
have a choice in the matter for performance reasons -- they're going to
be banging up against a very real memory bandwidth and kernel transition
overhead barrier, and may end up having to go straight to the metal.
Now, if IHVs have to go this route because of WinNT's intrinsically
shitty driver architecture, then the whole PC industry is going to end
up paying the price, since the OGL driver model(s) currently in place
are going to be ignored.
And there are MANY MANY non-trivial issues related to implementing a pure
OpenGL32.DLL that will have to be addressed.
But that's the price you pay for working in a commodity space.
Brian
--
+-----------------------------------------------------------------+
+ Brian Hook, b...@wksoftware.com +
+ WK Software, http://www.wksoftware.com +
+ Consultants specializing in 3D graphics hardware and software +
+ --------------------------------------------------------------- +
+ For a list of publications on 3D graphics programming, +
+ including Direct3D, OpenGL, and book reviews: +
+ http://www.wksoftware.com/publications.html +
+-----------------------------------------------------------------+
> So you agree with my concerns that people may be getting the wrong
> message? Perhaps John will have to retract his statements on Direct3D
> when Microsoft finally get their act together and ditch the execute
> buffer system, and replace it with/add a triangle drawing interface,
> and proper access to the state flags.
I agree that sometimes people's opinions of game developers are a little
too extreme, but I do NOT disagree with John's opinions. My background
is simple -- I've used two of the premiere 3D graphics APIs out there
(D3D and OpenGL), worked on one of them (OpenGL), and architected a
third (3Dfx Glide).
> reprogramming for each. Now we have API's that help us do this - to
> rubbish them because they are not simple to use is unfair, though
> comments about their efficiency are fair game.
John's point is that D3D offers NOTHING significant over OpenGL in terms
of features or performance capabilities, and has significant usability
problems to boot. GLQuake was essentially written in a weekend and
ROCKS on 3Dfx. Hardware isn't getting any slower, and if GLQuake runs
as well as it does, I think that other titles should come over very
cleanly.
I don't know much about OpenGL or D3D, and this is the first time I've read
this newsgroup, so forgive my ignorance. The articles I've read seem to imply
that OpenGL is much easier to program, is supported by many more parties
than D3D, is stable, and supports technological advances more readily than
D3D. Also, I've read that NT has OpenGL built-in, as well as Windows 95b.
If that's the case, why does Microsoft feel the need to create their own
API?
The only reason I could see would be to implement technological advances
not made possible by other APIs. It seems as if Microsoft is setting up a
long-term strategy to muscle other computer hardware standards out of the scene.
Dejay Clayton, University of Delaware - Computer Science
dark...@udel.edu
The statement above is correct. The Verite is the only 3D chip to get
a chip-specific Quake port. The Quake port to OpenGL is NOT chip
specific, it runs on many different chips and boards.
Come on. An N64 isn't close to an RE. How many RealityEngines do you
know of that have 4 megs of RAM? An N64 is a great game machine, but to
compare it to an RE is insane.
Yes, BUT....
> If it is possible, then glBegin() could be used to lock into user space the
> command registers, glVertex() etc. would write directly to registers
> _without_ a kernel transition, and glEnd() would unlock the command
> registers.
It is my understanding that you cannot do this with a device driver
because there is, in fact, a transition from user space to kernel space
for each call. So effectively an ICD needs to take calls from
glVertex() and batch them up, and then turn around and enter driver
space with batched data to minimize this transition overhead.
The other problem is that you need a "rendering context" for
multitasking applications. When multiple applications are rendering at
the same time, a call to glVertex() can't just decompose into four register
writes -- you have to reset the state.
Now, a pure DLL can, in fact, map in memory directly under NT, but this
is not the same as the ICD or MCD driver models, and effectively means
that the IHV is implementing an OpenGL from scratch. I'm fairly certain
that Microsoft would frown upon this, but I don't think there is much
they can do. This method offers the least compatibility but the most
performance.
Brian
PS Note trimmed followups
The word is _silicone_ implants, and I just answered a post from a book
editor on CompuServe who was asking if a reference to "Silicone Graphics"
was correct:
"
>> Silicone Graphics.
This is one of the most annoying misspellings and mispronunciations I
regularly run across!
The name of the company (and the material, and the area in California) is
sans-"e", "e"-less, without the "e".
Silicon is what you build computers with.
Silicone is what you use to fill the breasts of the models which point at
your computers at trade shows!
"
------------------------------------------
Alex P. Madarasz, Jr. - mada...@erols.com
> Do you have anything OTHER than technical points to make ?
Maybe he hasn't. Or maybe he doesn't want to make them in a group
where technical points belong, like this one.
> If not have the good grace to let others make them.
But take them to alt.worship.wintel and trim your group list.
Ralf
--
"*Always* get a contract when working with a dark, omnipotent power."
Ralf Helbing, University of Magdeburg, Department of Computer Science
39106 Magdeburg, UniPlatz 2 Phone: +49 0391 67-12189
I was under the impression that DDSurface::Lock()/Unlock() on a primary or
secondary video frame buffer was doing just that.
If it is possible, then glBegin() could be used to lock into user space the
command registers, glVertex() etc. would write directly to registers
_without_ a kernel transition, and glEnd() would unlock the command
registers.
????
Eric Powers
pow...@deltanet.com
Brian Hook <b...@wksoftware.com> wrote in article
<3308D3...@wksoftware.com>...
Well, congrats that your focus has been good in the past. I think you'll
need to look closely at OpenGL in the future, though. Your competition
certainly is.
> And Digital - they're a monumental flop in the PC market.
Won't argue with you there. I work for Workstations, and I only do Alphas.
Although DEC will offer you either an Intel or an Alpha as you like, and
that's a reasonable strategy, my mission in life is to give you strong
reason to pick the Alpha. :-)
> Anyhow - I've never said OpenGL is going to die nor that no one should
> pay any attention to it. My claim is that it won't make it as a PC
> game API.
Well, we'll see. I think there's a whole host of reasons why D3D might not
make it as an API, simply because it's such a lousy API. I'll be very
interested to see what DirectX 5.0 has in store for us.
> Quote me some press where Compaq and Intel are pushing it
> as a game API and I'll concede some ground to you.
Neither one pushes "games APIs" per se. Compaq is almost surely getting
into OpenGL so it can go after CAD and animation markets and such. I don't
see why they'd care, in any event. They just sell hardware, not APIs.
> By the way... wasn't Intel pushing 3DR this time last year ? Intel
> should stay out of the software business.
Actually, 3DR was widely reported to be a pretty good technology. It's
more like Microsoft said "you can write all the 3DR you want, but we're not
going to support it because we're doing D3D." Microsoft owns the OS.
Microsoft wins.
> : Seems to me that Id Software is making the right move in pursuing
> : OpenGL technology.
>
> Maybe they did make the right move for Id - who knows - I don't know
> who is paying them to make product endorsements (or denouncements).
Their capital investments. Really, a lot of developers keep talking about
how D3D is "no big deal to use," but we haven't seen the end of their
design cycle vs. OpenGL design cycles, now have we? Id Software is
gambling that they can make better products faster with OpenGL. It's
risky, but then again, they're pretty much the "mindshare leaders" of the
games world and they have a track record for managing to stay at the
forefront. I don't know of any game that gets as much attention as Quake.
Probably this happens because they make ballsy decisions about their
capital investments, and they don't follow like sheep.
> My opinion continues to be that the story of OpenGL as a PC 3D *game*
> API is "too little, too late".
My opinion continues to be that the whole concept of a PC 3D "games" API is
a Microsoft marketroid invention, and nothing more. They've had their
chance to produce a genuinely distinctive technology for games, and they've
clearly blown it. By winter there will be nothing to stop people from
using OpenGL to write their games, D3D will fail to offer any technological
innovations, OpenGL will gradually erode D3D because it's easier to use,
etc.
Cheers,
--
Brandon J. Van Every | Seattlites! The Northwest Cyberartists want YOU
| to help build their artsy-fartsy electro-tech
DEC Commodity Graphics | virtual worlds stuff! If interested, contact me.
Windows NT Alpha OpenGL | vane...@blarg.net www.blarg.net/~vanevery
"Dejay Clayton" <dark...@udel.edu> wrote:
>
>I don't know much about OpenGL or D3D, and this is the first time I've read
>this newsgroup, so forgive my ignorance. The articles I've read seem to
>imply
Note the word imply - please also note the conflicting responses.
>that OpenGL is much easier to program, is supported by many more parties
>than D3D, is stable, and supports technological advances more readily than
>D3D. Also, I've read that NT has OpenGL built-in, as well as Windows 95b.
Debatable - but probably true - nonetheless...
>If that's the case, why does microsoft feel the need to create their own
>API?
Because OpenGL does not have the kind of games performance and market
share that people require NOW.
>The only reason I could see would be to implement technological advances
>not made possibly by other APIs. It seems as if Microsoft is setting up a
>long
>term strategy to muscle other computer hardware standards out of the scene.
Possibly, or maybe they wanted to get games developers developing for
Win95 - I don't see SGI offering to do the work and then giving away
the OpenGL drivers free of charge (as DirectX is).
>Dejay Clayton, University of Delaware - Computer Science
>dark...@udel.edu
------------------------------------------------
Casey Charlton ca...@caseyc.demon.co.uk
------------------------------------------------
>
> Ooops, did you put your foot into a flame war!!!!!!
And I will venture a toe. And for those who don't know what PEX/PHIGS are,
you might consider looking them up; they have bits that are uncannily
similar to D3D.
>
> "Dejay Clayton" <dark...@udel.edu> wrote:
[snip]
>
> >If that's the case, why does microsoft feel the need to create their own
> >API?
>
> Because OpenGL does not have the kind of games performance and market
> share that people require NOW.
Market share: at the beginning of 1996 PCs were already running OpenGL - how
many ran D3D? So MS created an API in a market where one already existed.
Performance: Quake sums up, from a gaming point of view, the lie that OpenGL
can't be used for games. Also, do people actually believe that anyone shells
out hundreds of thousands of dollars for a SLOW 3D API? Of course they don't.
So MS created a 3D API (not the first one they have created, BTW) even though
they already supported a different one, whose standard they didn't own. They
then pushed this as THE 3D API FOR GAMES, while OpenGL was presented as a
big-time API that wouldn't work for games.
Well, the number one selling game is in OpenGL and NOT in D3D (nor will it
be, apparently). If you can use OpenGL to design a real-time aircraft simulator
you can sure as hell get it to do a computer game aircraft simulation, there
are no reasons to prevent this other than marketing hype and the lemming
effect.
>
> >The only reason I could see would be to implement technological advances
> >not made possibly by other APIs. It seems as if Microsoft is setting up a
> >long
> >term strategy to muscle other computer hardware standards out of the scene.
>
> Possibly, or maybe they wanted to get games developers developing for
> Win95 - I don't see SGI offering to do the work and then giving away
> the OpenGL drivers free of charge (as DirectX is).
So how much do people pay for the mini-driver on the 3dfx? For the drivers
on the Permedia and FireGL cards? Nothing. So there must be another reason
then.
One 3D API is an open standard used throughout the industry for every
sort of application, from games to simulators and film animation software.
The other has a poorly documented interface, is designed by a company that
has already published failed 3D interfaces, uses concepts that were rejected
the first time around, and has a spec owned by one company.
--
Un Lupe en France | Cat 1, Cha, Cha, Cha -- NERC offical drinking song
----The above opinions rarely reflect my own and never my employer's------
Do not add me to mailing lists violations will be billed for time.
: >James Shaw wrote:
: So you agree with my concerns that people may be getting the wrong
: message? Perhaps John will have to retract his statements on Direct3D
: when Microsoft finally get their act together and ditch the execute
: buffer system, and replace it with/add a triangle drawing interface,
: and proper access to the state flags.
My bet is that the execute buffer stays because its need will be
obvious in a year or two. When NT and Win95 merge the overhead of
an application calling a device driver service is going to be large.
That overhead can be reduced to insignificance by batching up
service requests. Mark my words. You'll get DirectTriangle now
because of popular demand but won't be using it two years from now.
Only time (and then DejaNews) will tell.
> This is quite true -- which makes me wonder about the long term
> feasibility of ICD and MCD when it comes to obtaining maximum
> performance. Ideally, a call to glVertex3f() will result in something
> like:
>
> mov edi, commandslotaddress
> mov [edi], cmd_gl_vertex3f
> mov [edi+VX], x
> mov [edi+VY], y
> mov [edi+VZ], z
>
> But with ICD or MCD the above would get batched to minimize the
> user/kernel transitions you mention.
Ideally, one would set the above as one display list item and queue it into
offscreen memory. For example, in our Imagine 2 one can write either three
or four 32-bit chip registers with one 128-bit display list instruction. The
advantage of this approach is that successive registers occupy increasing
addresses in VRAM space, therefore if you send out two vertices, x[1] and
x[2] don't need to be written to the same register address. This increases
the probability of the PCI bus bursting, and allows the library to "render"
while the graphics engine is busy.
> Which means for ultimate performance we may end up seeing IHVs
> implementing their own OpenGL32.DLL simply because they're not going to
> have a choice in the matter for performance reasons -- they're going to
> be banging up against a very real memory bandwidth and kernel transition
> overhead barrier, and may end up having to go straight to the metal.
I believe IHVs will be itching to implement their own OpenGL library because
the current model is too restrictive. It is still sort of adequate for NT,
but Win95 is much more open, and there are, in my opinion, better ways to
integrate OpenGL functionality into the system.
If Microsoft is listening - and if they're willing to bite the bullet - what
we need in Win95 is a new GDI/Device Driver interface. Leave the 16-bit GDI
to legacy apps, give a Device Driver API to the 32-bit GDI, and define
OpenGL32.DLL as an extension of the 32-bit GDI, going through the same
Device Driver API. OpenGL32.DLL would be to 3D what the DIB Engine
is to 2D: a punting layer. I would really like to see a 32-bit device driver
API that includes the full OpenGL functionality; but that would probably
require that the 32-bit GDI be rewritten to use OpenGL as its rendering
layer. If things were arranged like that, I could write my OpenGL support as
a portion of my display driver; synchronization and sharing issues between
2D and 3D would be resolved because GDI and OpenGL coexist within the same
environment; and I could have fast deployment of my OpenGL driver, because,
like the current 2D Display Driver, I'd only need to implement a bare
minimum that's not handled by OpenGL32. Once I had it running, I'd add
functionality slowly, at my own speed. And having OpenGL32 at the same level
as the DIB Engine would guarantee a fast path from app to library.
If they don't do that, it is certain that IHVs will eventually replace
OpenGL32.DLL with their own stuff. People are already overwriting portions
of the 16-bit GDI, and it's only a matter of time before people start doing
that to the 32-bit GDI too. And if people are ready to bypass Windows
itself, they'll not be bashful about OpenGL32.DLL.
> Now, if IHVs have to go this route because of WinNT's intrinsically
> shitty driver architecture, then the whole PC industry is going to end
> up paying the price, since the OGL driver model(s) currently in place
> are going to be ignored.
I'd worry about Win95 - or the next one - before I'd worry about NT.
Because, I believe, that's where the money is. I have the feeling that NT
will preserve the current model, because there are a lot fewer people
programming for NT and it's a bit more involved to interface with the
system. But in Win95, where everything is open, chances are the model will
be subverted as competition heats up and IHVs require more performance.
> And there are MANY MANY issues related to implementing a pure
> OpenGL32.DLL that are non-trivial that will have to be addressed.
> But that's the price you pay for working in a commodity space.
Yes. But while the problems related to OpenGL 3D have by and large been
solved, the problem of fitting it within a Windows framework is still an
open one.
Remember the old Chinese curse: may you live in interesting times!
Alberto.