
VBE2.0 STINKS!


Jason Chong

Nov 9, 1996

Hi people, hope the subject caught your attention.

I just read the official VBE 2.0 specs. I think we are going back to the
old days of SVGA programming headaches. I thought VESA was supposed to
make SVGA access easier for coders, but it seems that so much information
needs to be retrieved before an application can actually start using an
SVGA mode. Worse still, VESA gave SVGA hardware manufacturers the option
of whether to implement certain functions or not. Imagine: an SVGA maker
can opt not to support the 640x480x256 modes. What's going to happen to
games like WarCraft II and the likes of it?

I mean, you have to dig so much information out of the parameter block
returned from the function before you can actually start using the modes,
and if the modes you want are not supported, then you are in shit. I may
be pro-DOS, but... VBE is getting out of hand. IT STINKS! VESA really
STINKS! Even DirectX is not that complicated. I hate to say this, but
looking at the incompetence of the VBE standard, I think DirectX seems to
be the lesser EVIL. And not all SVGA cards support VBE 2.0. It is going
to take some time, but I don't think it will make any difference, since
you need to make your app aware of everything returned from the VBE 2.0
services. Why the hell do coders have to go through so many ridiculous
functions just to set an SVGA mode? Why does VESA have to give hardware
manufacturers an option? Why the F**K must VBE 2.0 be so complicated? It
is so much trouble just to use a stupid video mode.

Face it, buddies: we either need better and easier standards, or we
switch to DirectX. VESA really sucks! You need to get the WinSize, the
WinGranularity and all the other stupid things. Why can't it just be as
simple as setting 320x200x256 with int 10h, AX=0013h? Why the F**K must
we consider which window is readable or writeable? Why not just make
windows A and B both readable and writeable, like VGA? How I wish using
SVGA were as simple as using VGA. Why do the stupid VBE 2.0 functions
have so many parameter blocks filled with disgusting info that needs to
be taken care of before a coder can actually start using the god-forsaken
mode? Why the F**K weren't linear frame buffers implemented years ago?

Anyway, please don't flame me. I am just a frustrated DOS coder, but I
would appreciate a logical and diplomatic answer to my questions. I
really hope to support VBE 2.0, but the way they made it... it's so
damned troublesome to use that I have to admit Win95 just isn't that
hateful after all. I think this could be where DirectX wins hands down.
Is DirectX as complicated to use as VBE 2.0?


Jason.

Michiel Ouwehand

Nov 10, 1996

>I just read the official VBE 2.0 specs. I think we are going back to the
>old days of SVGA programming headaches. I thought VESA was supposed to
>make SVGA access easier for coders, but it seems that so much information
>needs to be retrieved before an application can actually start using an
>SVGA mode. Worse still, VESA gave SVGA hardware manufacturers the option
>of whether to implement certain functions or not. Imagine: an SVGA maker
>can opt not to support the 640x480x256 modes. What's going to happen to
>games like WarCraft II and the likes of it?

They will exit with a message saying "VBE 640x480x256 not supported",
the people who bought the crummy video card will be very angry, they
will buy a different video card, and the problem is solved.

>I mean, you have to dig so much information out of the parameter block
>returned from the function before you can actually start using the modes,
>and if the modes you want are not supported, then you are in shit. I may
>be pro-DOS, but... VBE is getting out of hand. IT STINKS! VESA really
>STINKS! Even DirectX is not that complicated. I hate to say this, but
>looking at the incompetence of the VBE standard, I think DirectX seems to
>be the lesser EVIL.

But still, DirectX is a Win32-only API, while VBE brings a sort of
standard to DOS, which is quite cool.

BTW, DirectX also has a variable, driver-dependent set of video modes.
Card manufacturers may opt not to support 640x480x256 under DirectX
too, if that is what they like.

>And not all SVGA cards support VBE 2.0. It is going to take some time,
>but I don't think it will make any difference, since you need to make
>your app aware of everything returned from the VBE 2.0 services. Why the
>hell do coders have to go through so many ridiculous functions just to
>set an SVGA mode?

You don't. You pick the right video mode number, issue an interrupt, and
you're done. BTW, the SVGAKIT is available from SciTech Software, and it
does all the necessary work if you would like some more 'complicated'
behaviour.

DirectDraw requires quite some calls to get running correctly, though.

>Why does VESA have to give hardware manufacturers an option?
>Why the F**K must VBE 2.0 be so complicated? It is so much trouble just
>to use a stupid video mode.

I thought VBE 2.0 was quite straightforward.

>Face it, buddies: we either need better and easier standards, or we
>switch to DirectX. VESA really sucks! You need to get the WinSize, the
>WinGranularity and all the other stupid things. Why can't it just be as
>simple as setting 320x200x256 with int 10h, AX=0013h?

Because video memory might not be linearly addressable. If you are in
real mode and you set a 1024x768x256 mode, where do you want to keep
those 768 KB of video memory? Remember, you only HAVE a meg of
addressable memory.

If you aren't all that concerned about real mode any more, take VBE 2.0,
check the mode for a FLAT (linear) framebuffer, and you'll have a linear
pointer to your video memory.

Don't expect things which are physically impossible.

And still, granularities and sizes are easy to handle in your code.
There is also no way around them, because the window size and
granularity describe exactly how the video board works.

>Why the F**K must we consider which window is readable or writeable?

Because that is how the video board has been built. If you don't care
whether a window is readable or writeable, just set both.

>Why not just make windows A and B both readable and writeable, like VGA?

Again, this is not a matter of the VBE standard; it is hardware related.
VBE gives you the option of one read bank and one write bank when the
hardware supports it, which allows for very fast copying of video
memory. If you don't care about that, set both window A and window B to
the bank you need and you're done.

Don't make problems that aren't there.

>How I wish using SVGA were as simple as using VGA. Why do the stupid
>VBE 2.0 functions have so many parameter blocks filled with disgusting
>info that needs to be taken care of before a coder can actually start
>using the god-forsaken mode? Why the F**K weren't linear frame buffers
>implemented years ago?

They are implemented now, so why are you so angry? Do you have a
personal aversion to VBE? Don't you like standards? Have you just tried
to code VBE and it didn't work? WHAT THE HELL IS WRONG WITH YOU??

Anyway, you don't have to fill in many parameter blocks; mostly you
just wipe 'em out with 0's (that's what memset is for) and set one
value.

Linear frame buffers weren't standardized years ago because they must be
supported by the hardware, which they weren't years ago. VBE 2.0
supports them because now most - but still not all - video cards do.
VBE 1.2 couldn't standardize them because there was next to no use of
flat protected mode at that time, and the couple of exotic video boards
which did offer a flat framebuffer were buggy.

>Anyway, please don't flame me. I am just a frustrated DOS coder, but I
>would appreciate a logical and diplomatic answer to my questions. I
>really hope to support VBE 2.0, but the way they made it... it's so
>damned troublesome to use that I have to admit Win95 just isn't that
>hateful after all. I think this could be where DirectX wins hands down.
>Is DirectX as complicated to use as VBE 2.0?

Yes, and a lot worse. Take it from me: doing DirectX can really drive
you up the wall. It took me some time to get running, while VBE 2.0 took
me no more than a day or so. DirectX sucks compared to something as
plain and simple as VBE 2.0.

But still, DirectX is manageable too... Just wait until you get to the
really painful parts of Win32, like drawing a custom caption and
fiddling around with GDI's clip regions in window messages.

I program for DOS and Win32 (currently programming the DOS and Win32
versions of Jazz Jackrabbit 2). The DOS version uses VBE 2.0, and the
Win32 version uses DirectDraw, DispDIB and DIBSections for display.
There are harder things to do than VBE 2.0... Of all the video access
methods I've seen, only mode 13h is easier.

Michiel Ouwehand
