
glReadPixels() and BMP


V-man

Jul 29, 2000

I tried using glReadPixels() to get a screen capture, but the image's
colors look distorted.

glPixelStorei(GL_PACK_ALIGNMENT, 1);

glReadPixels(0, 0, (GLsizei)Width, (GLsizei)Height, GL_BGR_EXT,
GL_UNSIGNED_BYTE, pColorData);

pColorData is of type (unsigned char *) and has Width*Height*3 bytes of
memory allocated to it.

I tried different settings with glPixelStorei(), but all the results were
similar.

http://members.xoom.com/mr_carbone/view/Untitled.bmp
http://members.xoom.com/mr_carbone/view/Untitled2.bmp

(680K each image; the URLs are case-sensitive)

All I need is the data with 8 bits per component.

V-man
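
For what it's worth, the buffer size glReadPixels() needs depends on
GL_PACK_ALIGNMENT: each pixel row is padded up to a multiple of the
alignment. A minimal sketch of the stride arithmetic (the function name is
illustrative, not part of GL):

/* Bytes per row as glReadPixels packs them: the raw row size rounded
   up to the current GL_PACK_ALIGNMENT (1, 2, 4, or 8). */
int PackedRowSize(int width, int bytesPerPixel, int alignment)
{
    int raw = width * bytesPerPixel;
    return (raw + alignment - 1) / alignment * alignment;
}

/* With GL_PACK_ALIGNMENT = 1, a GL_BGR_EXT read needs exactly
   Width*Height*3 bytes; with the default alignment of 4, allocate
   PackedRowSize(Width, 3, 4) * Height bytes instead. */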


Charles E Hardwidge

Jul 29, 2000
Do everybody a favour next time, and use smaller images. I have found 10x10
is more than enough. A screen made up of a few squares and simple colours
might be easier on the eyes, and easier to examine in a hex editor.

Try dumping a raw image before getting clever. When you can do that, move
on to saving in a given image format.
Always write the image file in binary mode; so many people slip up on that
one...

Dump raw RGBA data to a file (alter as required):

CHAR *ScreenImageBuffer = (CHAR*)malloc(Width*Height*4);
FILE *OutputFile = fopen(Filename, "wb");

glReadPixels(0, 0, Width, Height, GL_RGBA, GL_UNSIGNED_BYTE, ScreenImageBuffer);
fwrite(ScreenImageBuffer, Width*Height*4, 1, OutputFile);
fclose(OutputFile);
free(ScreenImageBuffer);

--
Anti-spam: Replace nospam with blueyonder to reply

V-man

Jul 29, 2000
Forgive me if I made things difficult there. Having a hunch about what
could be going on, I decided to give it a try by "resetting things to 32
bit" instead of 24 bit. What I mean by that is I passed GL_RGBA instead of
GL_RGB to glReadPixels(), and also passed 32 instead of 24 bits to my "save
as BMP" function. You also seem to have GL_RGBA in your suggestion. Was
that a coincidence?

The BMP came out perfect. I attempted some tests at 10x10 and 30x30, but my
drivers crashed... never mind that.

So the real question is: how do I use glReadPixels()? That includes what to
do about glPixelStorei(). I'm running at 32 bpp. Frankly, I don't see what
the problem is with GL_RGB. Isn't it supposed to skip the alpha component?
Is it my drivers?

I would like to see some examples (Win32), or at least gain a better
understanding of OGL. I have a lot to learn, it seems.

V-man

Charles E Hardwidge

Jul 30, 2000
Just to double-check, I tested a screen dump routine at various bit depths.
Again, no problem with dumping a raw image in RGBA, BGRA, or RGB; RGB quite
correctly skipped the alpha channel. The only time I could replicate
anything like your screen dumps was when the ATI RAGE based card I am
currently using messed up a mode change (it doesn't seem to like
ChangeDisplaySettings to 16 bit for some reason; I have to set the desktop
manually). The screen looks a bit weird, to say the least, and a raw screen
dump is stripey, like yours. This is a driver fault at my end. In all other
instances, including an old Voodoo card I use, a screen grab with
glReadPixels was perfect. Unless I'm missing something (not beyond the
realms of possibility), I suggest you have a driver problem. I enclose a
small routine knocked up to test BGRA and RGB. Examine the raw output, just
in case the fault lies with your BMP routine.

1. Try setting the desktop size/bitdepth manually.
2. Try using the MS Generic OpenGL.
3. Try using a different card and ICD (if you have that option), although
option 2 should be sufficient.
4. If all of the above fails, send a detailed email to your manufacturer's
support people.

/* Example dump for BGRA and RGB */
VOID DumpImage(CHAR *Filename, INT Width, INT Height)
{
    FILE *OutputFile = fopen(Filename, "wb");

    if (SUPPORTS_EXT_bgra)
    {
        CHAR *ScreenImageBuffer = (CHAR*)malloc(Width*Height*4);

        glReadPixels(0, 0, Width, Height, GL_BGRA_EXT, GL_UNSIGNED_BYTE,
                     ScreenImageBuffer);
        fwrite(ScreenImageBuffer, Width*Height*4, 1, OutputFile);
        free(ScreenImageBuffer);
    }
    else
    {
        CHAR *ScreenImageBuffer = (CHAR*)malloc(Width*Height*3);

        glReadPixels(0, 0, Width, Height, GL_RGB, GL_UNSIGNED_BYTE,
                     ScreenImageBuffer);
        fwrite(ScreenImageBuffer, Width*Height*3, 1, OutputFile);
        free(ScreenImageBuffer);
    }

    fclose(OutputFile);
}

V-man

Jul 30, 2000

Setting glPixelStorei(GL_PACK_ALIGNMENT, 1) causes a distortion (on my
machine, anyway).

For those of you who haven't seen it, there is a link in my first post.
I went back to the default, which is GL_PACK_ALIGNMENT, 4.
I also had to modify my memory allocation accordingly.

Charles, and anyone else, could you try something for me?
Copy the improved code below.

void DumpImage(char *Filename, int Width, int Height)
{
    char *ScreenImageBuffer = (char*)malloc(Width*Height*3);
    FILE *OutputFile = fopen(Filename, "wb");

    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, Width, Height, GL_BGR_EXT, GL_UNSIGNED_BYTE,
                 ScreenImageBuffer);
    fwrite(ScreenImageBuffer, Width*Height*3, 1, OutputFile);
    fclose(OutputFile);
    free(ScreenImageBuffer);
}

The file created is .RGB or .raw (I don't know). I have a slightly different
version that makes it into a .BMP.

I tried MS GDI yesterday. You wouldn't believe how screwed up things got
on it. glReadPixels() was only returning the background color, and some
functions were failing... I don't know what that was about. My code seems
all right, unless there's something wrong with the NURBS stuff.

Thanks for the help,
V-man

Charles E Hardwidge

Jul 30, 2000
glPixelStorei(GL_PACK_ALIGNMENT, 1); produces no problems at my end, nor
should it.

glReadPixels only returning the background colour? Ugh.

It helps, when debugging, to narrow the problem down to as few things as
possible. This reduces the chance of being confused by irrelevancies and
increases the chance of finding the exact cause rapidly. Hence my example
only pushes the screen dump to a raw file. If an untested file-format
module were placed in the way, who knows where the fault would lie.

An example would be to draw four boxes on the screen, each one a different
colour. This keeps it simple. Apply the screen dump routine, with the
file-format routine of choice in the way. If the image comes out barfed and
the raw image is fine, it's your image-format code that's wrong. Unless
you've managed to knock your drivers into an undefined state, I cannot
imagine how even the most buggy of drivers would mess this up.

MS GDI? I think you mean MS Generic.
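
A minimal sketch of the four-box test pattern described above, assuming a
legacy immediate-mode GL context (the projection setup and coordinates are
illustrative):

/* Draw four solid-colour boxes filling the viewport, set up with a
   simple orthographic projection. Handy as a known-good test image. */
void DrawTestPattern(void)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, 2.0, 0.0, 2.0, -1.0, 1.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glBegin(GL_QUADS);
        glColor3f(1, 0, 0);  /* red, bottom-left */
        glVertex2f(0, 0); glVertex2f(1, 0); glVertex2f(1, 1); glVertex2f(0, 1);
        glColor3f(0, 1, 0);  /* green, bottom-right */
        glVertex2f(1, 0); glVertex2f(2, 0); glVertex2f(2, 1); glVertex2f(1, 1);
        glColor3f(0, 0, 1);  /* blue, top-left */
        glVertex2f(0, 1); glVertex2f(1, 1); glVertex2f(1, 2); glVertex2f(0, 2);
        glColor3f(1, 1, 1);  /* white, top-right */
        glVertex2f(1, 1); glVertex2f(2, 1); glVertex2f(2, 2); glVertex2f(1, 2);
    glEnd();
}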

William Brodie-Tyrrell

Jul 31, 2000

Stoned koala bears drooled eucalyptus spit in awe as Charles E Hardwidge...:

>glPixelStorei(GL_PACK_ALIGNMENT, 1); produces no problems at my end, nor
>should it.
>
>glReadPixels only returning the background colour? Ugh.

Make sure the ReadPixels is before the buffer swap - some cards won't
necessarily give you the buffer you expect with the previous frame in it, and
if you've drawn only one frame then the old front buffer definitely won't
have a useful picture in it.


William Brodie-Tyrrell

========================================================================
A child of five could understand this. Fetch me a child of five.
-- Groucho Marx
========================================================================
http://www.smug.adelaide.edu.au/~wfbrodie/

<wil...@cs.adelaide.edu.au> <wfbr...@smug.adelaide.edu.au>
========================================================================


Charles E Hardwidge

Jul 31, 2000
Hmmm, interesting. I've always put the glReadPixels after swapping buffers
out of habit/tidiness, without thinking about it much. Bearing in mind that
glReadPixels operates on the framebuffer, not the back buffer, I would have
thought this is where it should go. (Ages ago I put it before swapping
buffers when writing my first OGL app and wondered why it didn't work.)

So cards that don't return a correct result from glReadPixels after a
buffer swap don't exhibit correct behaviour, right? This brings to mind two
things: when grabbing the framebuffer one could assume worst-case behaviour,
or complain like stink to the manufacturers (this being a hobby horse of mine).

Robert A. Schmitt

Jul 31, 2000

Charles E Hardwidge wrote:
>
> Hmmm, interesting. I've always put the glReadPixels after swapping buffers
> out of habit/tidiness, without thinking about it much. Bearing in mind that
> glReadPixels operates on the framebuffer, not the back buffer, I would have
> thought this is where it should go. (Ages ago I put it before swapping
> buffers when writing my first OGL app and wondered why it didn't work.)

This is incorrect. glReadPixels will read from whatever buffer is currently
set as the read buffer in the state machine:

glReadBuffer (GL_BACK);
glReadPixels (...);

glReadBuffer (GL_FRONT);
glReadPixels (...);

By default, the read buffer is GL_FRONT in single-buffered configurations, and GL_BACK
in double-buffered configurations.

> So cards that don't return a correct result from glReadPixels after a
> buffer swap don't exhibit correct behaviour, right? This brings to mind two
> things: when grabbing the framebuffer one could assume worst-case behaviour,
> or complain like stink to the manufacturers (this being a hobby horse of mine).
>

In a double buffer scheme, the contents of the back buffer are "undefined" after
a buffer swap. Some cards will retain the pixels, some will not. You cannot depend
on the back buffer contents being correct once you swap.

Bob
--
Robert A. Schmitt
RESolution Graphics Inc.
ra...@home.com
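
A sketch of the capture order this implies for a double-buffered Win32
context; Render(), Width, Height, Buffer, and hDC are stand-ins for the
application's own names:

/* Draw, then capture while the back buffer is still defined, then swap. */
Render();                              /* application drawing code */
glReadBuffer(GL_BACK);                 /* explicit, though it's the default here */
glReadPixels(0, 0, Width, Height, GL_RGB, GL_UNSIGNED_BYTE, Buffer);
SwapBuffers(hDC);                      /* back buffer contents now undefined */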

V-man

Jul 31, 2000

That effect was on the MS implementation, and thanks for pointing out the
reason.

Frankly, it makes me sick. It completely disgusts me. Why in the world
should the back buffer be undefined after a swap?

V-man

V-man

Jul 31, 2000

Firstly, the reason my first attempt failed on MS GDI was that I was using
the unsupported GL_BGR_EXT.

After clearing that up, I found the same effect on both MS GDI and my
hardware implementation.
I painted the whole screen blue with a QUAD. On both implementations there
is some oddity when the width is not divisible by 4.

Check out
http://members.xoom.com/mr_carbone/view/screendump2.txt
http://members.xoom.com/mr_carbone/view/screendump.bmp

The txt seems fine, but the bmp seems screwed. I think this is a BMP
format problem, but...

I'm not exactly sure what's going on.

PS: I was able to get it to work with glPixelStorei(GL_PACK_ALIGNMENT, 2).

V-man
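
The width oddity is consistent with the BMP format itself: every scanline in
a BMP file must be padded to a multiple of four bytes, which happens to match
GL's default GL_PACK_ALIGNMENT of 4. A minimal sketch of a 24-bit BMP writer
with explicit padding (WriteBMP24 and the helper names are illustrative):

#include <stdio.h>
#include <stdlib.h>

static void PutU16(FILE *f, unsigned v) { fputc(v & 255, f); fputc((v >> 8) & 255, f); }
static void PutU32(FILE *f, unsigned long v)
{
    fputc(v & 255, f);         fputc((v >> 8) & 255, f);
    fputc((v >> 16) & 255, f); fputc((v >> 24) & 255, f);
}

/* Pixels must be BGR, bottom-up, with each row padded to 4 bytes --
   exactly what glReadPixels(GL_BGR_EXT) produces at pack alignment 4. */
void WriteBMP24(const char *Filename, int Width, int Height, const unsigned char *Pixels)
{
    int rowSize = (Width * 3 + 3) & ~3;          /* padded row length */
    unsigned long imageSize = (unsigned long)rowSize * Height;
    FILE *f = fopen(Filename, "wb");

    fputc('B', f); fputc('M', f);                /* BITMAPFILEHEADER */
    PutU32(f, 54 + imageSize);                   /* total file size */
    PutU32(f, 0);                                /* reserved */
    PutU32(f, 54);                               /* offset to pixel data */

    PutU32(f, 40);                               /* BITMAPINFOHEADER size */
    PutU32(f, (unsigned long)Width);
    PutU32(f, (unsigned long)Height);            /* positive = bottom-up */
    PutU16(f, 1);                                /* planes */
    PutU16(f, 24);                               /* bits per pixel */
    PutU32(f, 0);                                /* BI_RGB, no compression */
    PutU32(f, imageSize);
    PutU32(f, 2835); PutU32(f, 2835);            /* ~72 dpi, rarely checked */
    PutU32(f, 0); PutU32(f, 0);                  /* colours used/important */

    fwrite(Pixels, imageSize, 1, f);
    fclose(f);
}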


John Fischer

Jul 31, 2000
Hi,


I have more or less the same problem.

When I use simple OpenGL functions, I succeed in generating a BMP file with
everything in it.

But when I use glScale, glRotate, and so on, I get only the background color
as well.

Maybe glScale calls a swap-buffers function? Could you explain that to me?

Thanks a lot for your time.


John

Andrew F. Vesper

Jul 31, 2000
V-man wrote:

> That effect was on the MS implementation, and thanks for pointing out the
> reason.
>
> Frankly, it makes me sick. It completely disgusts me. Why in the world
> should the back buffer be undefined after a swap?

Sorry, V-man, but when you don't understand the reason for something, it
shouldn't make you sick. It should just make you ask questions.

OpenGL was designed to support a number of different implementations. In some
of those, a SwapBuffers looks like a true swapping of the front and back
buffers. (This is done by flipping a 'front/back buffer' bit per pixel; this
can be much faster than actually moving pixels.) On other systems, a
SwapBuffers simply moves the contents of the back buffer into the front
buffer. Here, the back buffer doesn't change.

In order to handle both implementations, the contents of the back buffer are
explicitly undefined.

If you really want the 'copy' semantics, you can use glCopyPixels.
--
Andy V (OpenGL Alpha Geek)
"In order to make progress, one must leave the door to the unknown ajar."
Richard P. Feynman, quoted by Jagdish Mehra in _The Beat of a Different Drum_.

Paul Martz's OpenGL FAQ: http://www.opengl.org/About/FAQ/Technical.html
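
A sketch of those 'copy' semantics: rather than relying on SwapBuffers to
preserve the back buffer, copy the back buffer to the front explicitly. This
assumes the current transforms map raster position (0,0) to the window's
lower-left corner:

/* Copy back buffer to front without a swap; the back buffer stays defined. */
glReadBuffer(GL_BACK);
glDrawBuffer(GL_FRONT);
glRasterPos2i(0, 0);                         /* destination corner for the copy */
glCopyPixels(0, 0, Width, Height, GL_COLOR);
glFlush();
glDrawBuffer(GL_BACK);                       /* restore for the next frame */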

V-man

Jul 31, 2000
Yes, it's guessable. But the docs say it's a hint only; it's all in the
hands of the drivers.

I'm more for "one rule for all time": no "maybes", "buts", or "how abouts".

V-man

On Tue, 1 Aug 2000, fungus wrote:

>
>
> V-man wrote:
> >
> > Frankly, it makes me sick. It completely disgusts me. Why in the world
> > should the back buffer be undefined after a swap?
> >
>
>

> Maybe you should look at the docs for PIXELFORMATDESCRIPTOR.
>
> Pixel formats have two settings you can select from - PFD_SWAP_COPY
> and PFD_SWAP_EXCHANGE.
>
>
> Can you guess what they do?
>
>
>
> --
> <\___/>
> / O O \
> \_____/ FTB.
>
>
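
A sketch of requesting that behaviour when choosing a Win32 pixel format;
only PFD_SWAP_COPY is the point here, and, as noted above, drivers may treat
it as a hint (hDC is a stand-in for the target window's device context):

/* Ask for a pixel format whose SwapBuffers copies back to front, leaving
   the back buffer intact. The other descriptor fields are the usual ones. */
PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
                 PFD_DOUBLEBUFFER | PFD_SWAP_COPY;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cDepthBits = 24;

int format = ChoosePixelFormat(hDC, &pfd);
SetPixelFormat(hDC, format, &pfd);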



Ruud van Gaal

Aug 3, 2000
On Mon, 31 Jul 2000 23:29:39 -0400, V-man
<v_me...@alcor.concordia.ca> wrote:

>Yes, it's guessable. But the docs say it's a hint only; it's all in the
>hands of the drivers.
>
>I'm more for "one rule for all time": no "maybes", "buts", or "how abouts".

Indeed, when I switched from an SGI Indy to the SGI O2, it took me almost a
year to discover that glXSwapBuffers() did a nearly-but-not-quite-ideal
swap. The O2's graphics hardware tiles its framebuffer, and there's a
process (with very high priority) which does the actual swapping, per
tile (!).
So you can actually see the swapping taking place if you're running some
high-priority stuff yourself.
Some tiles are swapped in the usual manner, front and back flipping, where
your old front is exactly the back buffer. But some tiles seem to be
copied?! So your back buffer is basically undefined, depending even on where
your window is and how big it is.
Very problematic.


Ruud van Gaal, GPL Rank +53.53
MarketGraph / MachTech: http://www.marketgraph.nl
Art: http://www.marketgraph.nl/gallery

Andrew F. Vesper

Aug 3, 2000
Ruud van Gaal wrote:

> Indeed, when I switched from an SGI Indy to the SGI O2, it took me almost a
> year to discover that glXSwapBuffers() did a nearly-but-not-quite-ideal
> swap. The O2's graphics hardware tiles its framebuffer, and there's a
> process (with very high priority) which does the actual swapping, per
> tile (!).
> So you can actually see the swapping taking place if you're running some
> high-priority stuff yourself.
> Some tiles are swapped in the usual manner, front and back flipping, where
> your old front is exactly the back buffer. But some tiles seem to be
> copied?! So your back buffer is basically undefined, depending even on where
> your window is and how big it is.
> Very problematic.

Very interesting, but if you follow the rules, it isn't problematic.

Ruud van Gaal

Aug 4, 2000
On Thu, 03 Aug 2000 22:06:22 -0400, "Andrew F. Vesper"
<ave...@wn.net> wrote:

>Ruud van Gaal wrote:
>
>> Indeed, when I switched from an SGI Indy to the SGI O2, it took me almost a
>> year to discover that glXSwapBuffers() did a
>> nearly-but-not-quite-ideal swap.

...


>Very interesting, but if you follow the rules, it isn't problematic.

That's right, but I came from the Amiga, where the back buffer was always
in a known state, and the Indy worked like that too, so I never checked. In
fact, it would seem *more* work to juggle with the back buffer than just to
do a hardware flip.
My Banshee again seems to do the flipping nicely for 2D work (in which you
often only have to update a small part of the screen, which is often
quicker than repainting the lot).
I've recently done a demo on a PC, with the same framework as I am using on
the SGI, and it works as well on the PC as on the SGI (well, the SGI is
about 100 times faster in the 2D area compared to the Banshee). So there
are really tricky times when the half-copied back buffer becomes obvious.
