glPixelStorei(GL_PACK_ALIGNMENT, 1);
glReadPixels(0, 0, (GLsizei)Width, (GLsizei)Height, GL_BGR_EXT,
GL_UNSIGNED_BYTE, pColorData);
pColorData is of type unsigned char * and has Width*Height*3 bytes of memory
allocated to it.
I tried different things with glPixelStore() but all the results were
similar.
http://members.xoom.com/mr_carbone/view/Untitled.bmp
http://members.xoom.com/mr_carbone/view/Untitled2.bmp
(680K each image, respect the capitals)
All I need is the data where each component is 8 bits.
V-man
Try dumping a raw image before getting clever. When you can do that, move
on to saving in a given image format.
Always write the image file in binary mode. So many people slip up on that
one...
Dump raw RGBA data to a file (alter as required):
CHAR *ScreenImageBuffer = (CHAR*)malloc(Width*Height*4);
FILE *OutputFile = fopen(Filename, "wb");
glReadPixels(0, 0, Width, Height, GL_RGBA, GL_UNSIGNED_BYTE, ScreenImageBuffer);
fwrite(ScreenImageBuffer, Width*Height*4, 1, OutputFile);
fclose(OutputFile);
free(ScreenImageBuffer);
The BMP came out perfect. I attempted some tests at 10x10 and 30x30, but my
drivers crashed... never mind that.
So the real question is: how do I use glReadPixels()? That includes what to
do about glPixelStore(). I'm running at 32bpp. Frankly, I don't see what
the problem is with GL_RGB. Isn't it supposed to skip the alpha component?
Is it my drivers?
I would like to see some examples (win32), or at least get a better
understanding of OGL. I have a lot to learn, it seems.
V-man
1. Try setting the desktop size/bitdepth manually.
2. Try using the MS Generic OpenGL.
3. Try using a different card and ICD (if you have that option), although
option 2 should be sufficient.
4. If all of the above fails, send a detailed email to your manufacturer's
support people.
/* Example dump for BGRA and RGB */
VOID DumpImage(CHAR *Filename, INT Width, INT Height)
{
    if(SUPPORTS_EXT_bgra)
    {
        CHAR *ScreenImageBuffer = (CHAR*)malloc(Width*Height*4);
        FILE *OutputFile = fopen(Filename, "wb");
        glReadPixels(0, 0, Width, Height, GL_BGRA_EXT, GL_UNSIGNED_BYTE, ScreenImageBuffer);
        fwrite(ScreenImageBuffer, Width*Height*4, 1, OutputFile);
        fclose(OutputFile);
        free(ScreenImageBuffer);
    }
    else
    {
        CHAR *ScreenImageBuffer = (CHAR*)malloc(Width*Height*3);
        FILE *OutputFile = fopen(Filename, "wb");
        glReadPixels(0, 0, Width, Height, GL_RGB, GL_UNSIGNED_BYTE, ScreenImageBuffer);
        fwrite(ScreenImageBuffer, Width*Height*3, 1, OutputFile);
        fclose(OutputFile);
        free(ScreenImageBuffer);
    }
}
For those of you who haven't seen it, I have a link in my first post.
I went back to the default, which is GL_PACK_ALIGNMENT, 4. I also had to
modify my memory allocation accordingly.
Charles, and anyone else, could you try something for me?
Copy the improved code below.
void DumpImage(char *Filename, int Width, int Height)
{
    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    char *ScreenImageBuffer = (char*)malloc(Width*Height*3);
    FILE *OutputFile = fopen(Filename, "wb");
    glReadPixels(0, 0, Width, Height, GL_BGR_EXT, GL_UNSIGNED_BYTE, ScreenImageBuffer);
    fwrite(ScreenImageBuffer, Width*Height*3, 1, OutputFile);
    fclose(OutputFile);
    free(ScreenImageBuffer);
}
The file created is .RGB or .raw (I don't know). I have a slightly different
version that makes it into a .BMP.
I tried MS GDI yesterday. You wouldn't believe how screwed up things got
on it. glReadPixels() was only returning the background color. Some
functions were failing... I don't know what that was about. My code seems
alright, unless there's something wrong with the NURBS stuff.
Thanks for the help,
V-man
glReadPixels only returning the background colour, ugh ?
It helps, when debugging, to narrow the problem down to as few things as
possible. This reduces the chance of being confused by irrelevances and
increases the chance of finding the exact cause rapidly. Hence my example
only pushing the screen dump to a raw file. If an untested file format
module were placed in the way, who knows where the fault would be.
An example would be to draw four boxes on the screen, each one a different
colour. This keeps it simple. Apply the screen dump routine, with the file
format routine of choice in the way. If the image comes out barfed, and the
raw image is fine, it's your image format code that's wrong. Unless you've
managed to knock your drivers into an undefined state, I cannot imagine how
even the most buggy of drivers would mess this up.
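Something along these lines (a sketch only; a current GL context and the
DumpImage() routine from earlier in the thread are assumed):

/* draw four quarter-screen boxes, each a different colour, then dump */
void DrawFourBoxTest(int Width, int Height)
{
    glViewport(0, 0, Width, Height);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, 2.0, 0.0, 2.0, -1.0, 1.0);   /* 2x2 grid of unit squares */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_QUADS);
        glColor3f(1.0f, 0.0f, 0.0f);                    /* bottom-left: red    */
        glVertex2f(0.0f, 0.0f); glVertex2f(1.0f, 0.0f);
        glVertex2f(1.0f, 1.0f); glVertex2f(0.0f, 1.0f);
        glColor3f(0.0f, 1.0f, 0.0f);                    /* bottom-right: green */
        glVertex2f(1.0f, 0.0f); glVertex2f(2.0f, 0.0f);
        glVertex2f(2.0f, 1.0f); glVertex2f(1.0f, 1.0f);
        glColor3f(0.0f, 0.0f, 1.0f);                    /* top-left: blue      */
        glVertex2f(0.0f, 1.0f); glVertex2f(1.0f, 1.0f);
        glVertex2f(1.0f, 2.0f); glVertex2f(0.0f, 2.0f);
        glColor3f(1.0f, 1.0f, 1.0f);                    /* top-right: white    */
        glVertex2f(1.0f, 1.0f); glVertex2f(2.0f, 1.0f);
        glVertex2f(2.0f, 2.0f); glVertex2f(1.0f, 2.0f);
    glEnd();
    glFinish();

    DumpImage("fourbox.raw", Width, Height);   /* screen dump routine from above */
}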
MS GDI, think you mean MS Generic.
>glPixelStorei(GL_PACK_ALIGNMENT, 1); produces no problems at my end, as it
>shouldn't do.
>
>glReadPixels only returning the background colour, ugh ?
Make sure the glReadPixels is before the buffer swap - some cards won't
necessarily give you back a buffer with the previous frame still in it, and
if you've drawn only one frame then the old front buffer definitely won't
have a useful picture in it.
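A minimal sketch of that ordering (illustrative only; RenderScene(), hDC,
Width/Height and the DumpImage() routine from earlier in the thread are
assumed):

RenderScene();                          /* draw the frame                      */
glFinish();                             /* make sure drawing has completed     */
DumpImage("frame.raw", Width, Height);  /* glReadPixels happens in here        */
SwapBuffers(hDC);                       /* only now present the frame (Win32)  */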
William Brodie-Tyrrell
========================================================================
A child of five could understand this. Fetch me a child of five.
-- Groucho Marx
========================================================================
http://www.smug.adelaide.edu.au/~wfbrodie/
<wil...@cs.adelaide.edu.au> <wfbr...@smug.adelaide.edu.au>
========================================================================
So cards that don't return a correct result from glReadPixels after a
buffer swap aren't exhibiting correct behaviour, right? This brings two
things to mind: when grabbing the framebuffer one could assume worst-case
behaviour, or complain like stink to the manufacturers (this being a hobby
horse of mine).
Charles E Hardwidge wrote:
>
> Hmmm, interesting. I've always put the glReadPixels after swapping buffers
> out of habit/tidiness, without thinking of it much. Bearing in mind that
> glReadPixels operates on the framebuffer, not the back buffer, I would have
> thought this is where it should go. (Ages ago I put it before swapping
> buffers when writing my first OGL app and wondered why it didn't work).
This is incorrect. glReadPixels will read from whatever buffer is currently
set as the read buffer in the state machine:
glReadBuffer (GL_BACK);
glReadPixels (...);
glReadBuffer (GL_FRONT);
glReadPixels (...);
By default, the read buffer is GL_FRONT in single-buffered configurations, and GL_BACK
in double-buffered configurations.
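For instance, a minimal sketch (my own illustration; Width and Height are
assumed) of forcing the back buffer as the read source before the swap:

unsigned char *Pixels = (unsigned char*)malloc(Width * Height * 3);
glReadBuffer(GL_BACK);                  /* explicitly read the back buffer */
glPixelStorei(GL_PACK_ALIGNMENT, 1);    /* tightly packed rows             */
glReadPixels(0, 0, Width, Height, GL_RGB, GL_UNSIGNED_BYTE, Pixels);
/* ... write Pixels out, then ... */
free(Pixels);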
> So, cards that don't return a correct result from glReadPixels after a
> buffer swap, don't exhibit correct behaviour, right? This brings to mind two
> things; when grabbing the framebuffer one could assume worst behaviour, or
> complain like stink to the manufacturers (this being a hobby horse of mine).
>
In a double buffer scheme, the contents of the back buffer are "undefined" after
a buffer swap. Some cards will retain the pixels, some will not. You cannot depend
on the back buffer contents being correct once you swap.
Bob
--
Robert A. Schmitt
RESolution Graphics Inc.
ra...@home.com
Frankly, it makes me sick. It completely disgusts me. Why in the world
should the back buffer be undefined after a swap?
V-man
Firstly, the reason why my first attempt failed on MS GDI was that I was
using the unsupported GL_BGR_EXT.
After clearing that, I found the same effect on both MS GDI and my
hardware implementation.
I painted the whole screen blue, with a QUAD. There is some oddity when
the width is not divisible by 4 on both implementations.
Check out
http://members.xoom.com/mr_carbone/view/screendump2.txt
http://members.xoom.com/mr_carbone/view/screendump.bmp
The txt seems fine, but the bmp seems screwed. I think this is a BMP
format problem, but..
I'm not exactly sure what's going on.
PS I was able to get it to work with glPixelStorei(GL_PACK_ALIGNMENT, 2)
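A sketch of the row arithmetic behind that (illustrative only, not from the
thread): with GL_PACK_ALIGNMENT at its default of 4, glReadPixels pads each
row out to a multiple of 4 bytes, and BMP rows must be padded the same way,
so the buffer has to be sized per padded row rather than per pixel row.

/* bytes glReadPixels writes per row, rounded up to the pack alignment */
int PaddedRowSize(int Width, int BytesPerPixel, int PackAlignment)
{
    int Row = Width * BytesPerPixel;
    return (Row + PackAlignment - 1) / PackAlignment * PackAlignment;
}
/* e.g. Width = 30, GL_RGB (3 bytes per pixel), alignment 4:
   30*3 = 90 bytes of pixel data, padded to 92 bytes per row,
   so the buffer (and each BMP row) needs Height * 92 bytes, not Height * 90. */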
V-man
I got more or less the same problem.
When I use simple OpenGL functions, I succeed in generating a BMP file with
everything in it.
But when I use glScale, glRotate, etc., I also get only the background color.
Maybe glScale calls a swap-buffer function? Could you explain that to me?
Thanks a lot for your time,
John
> That effect was on MS implemento, and thanks for pointing out the reason.
>
> Frankly, it makes me sick. It completely disgusts me. Why in the world
> should the back buffer be undefined after a swap?
Sorry, V-man, but when you don't understand the reason for something, it
shouldn't make you sick. It should just make you ask questions.
OpenGL was designed to support a number of different implementations. In some
of those, a SwapBuffers looks like a true swapping of the front and back
buffers. (This is done by flipping a 'front/back buffer' bit per pixel; this
can be much faster than actually moving pixels.) On other systems, a
SwapBuffers simply moves the contents of the back buffer into the front
buffer. Here, the back buffer doesn't change.
In order to handle both implementations, the contents of the back buffer are
explicitly undefined.
If you really want the 'copy' semantics, you can use glCopyPixels.
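A rough sketch of those copy semantics (illustrative only, not from the
thread; Width, Height and a current double-buffered context are assumed):
right after the swap, copy the newly presented front buffer back into the
back buffer.

void CopyFrontToBack(int Width, int Height)
{
    /* source: the front buffer just shown; destination: the back buffer */
    glReadBuffer(GL_FRONT);
    glDrawBuffer(GL_BACK);

    /* place the raster position at the window's lower-left corner */
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    glOrtho(0.0, Width, 0.0, Height, -1.0, 1.0);
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadIdentity();
    glRasterPos2i(0, 0);

    glCopyPixels(0, 0, Width, Height, GL_COLOR);   /* front -> back copy */

    glPopMatrix();
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
}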
--
Andy V (OpenGL Alpha Geek)
"In order to make progress, one must leave the door to the unknown ajar."
Richard P. Feynman, quoted by Jagdish Mehra in _The Beat of a Different Drum_.
Paul Martz's OpenGL FAQ: http://www.opengl.org/About/FAQ/Technical.html
I'm more for "one rule for all time". NO "maybe" "buts" or "how abouts"
V-man
On Tue, 1 Aug 2000, fungus wrote:
>
>
> V-man wrote:
> >
> > Frankly, it makes me sick. It completely disgusts me. Why in the world
> > should the back buffer be undefined after a swap?
> >
>
>
> Maybe you should look at the docs for PIXELFORMATDESCRIPTOR.
>
> Pixel formats have two settings you can select from - PFD_SWAP_COPY
> and PFD_SWAP_EXCHANGE.
>
>
> Can you guess what they do?
>
>
>
> --
> <\___/>
> / O O \
> \_____/ FTB.
>
>
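(For context, a sketch of how those flags get requested on Win32; my own
illustration, not from the thread, and as noted just below, the flag is only
a hint the driver may ignore.)

#include <windows.h>

BOOL SetupSwapCopyPixelFormat(HDC hDC)
{
    PIXELFORMATDESCRIPTOR pfd;
    int PixelFormat;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
                     PFD_DOUBLEBUFFER | PFD_SWAP_COPY;   /* hint: copy, don't exchange */
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;
    pfd.iLayerType = PFD_MAIN_PLANE;

    PixelFormat = ChoosePixelFormat(hDC, &pfd);
    if (PixelFormat == 0)
        return FALSE;
    return SetPixelFormat(hDC, PixelFormat, &pfd);
}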
>Yes, it's guessable. But the docs say it's a hint only. It's all in the
>hands of the drivers.
>
>I'm more for "one rule for all time". NO "maybe" "buts" or "how abouts"
Indeed, when I switched from an SGI Indy to the SGI O2, it took me almost a
year to discover that glXSwapBuffers() did a
nearly-but-not-quite-ideal swap. The O2's graphics hardware tiles its
framebuffer, and there's a process (with very high priority) which
does the actual swapping, per tile (!).
So you can actually see the swapping taking place if you're running
some high priority stuff yourself.
Some tiles are swapped in the usual manner; front & back flipping,
where your old front is exactly the back buffer. But some tiles seem
to be copied?! So your back buffer is basically undefined, depending
even on where your window is and how big it is.
Very problematic.
Ruud van Gaal, GPL Rank +53.53
MarketGraph / MachTech: http://www.marketgraph.nl
Art: http://www.marketgraph.nl/gallery
> Indeed, when I switched from an SGI Indy to the SGI O2, it took me almost a
> year to discover that glXSwapBuffers() did a
> nearly-but-not-quite-ideal swap. The O2's gfx hw tiles its
> framebuffer, and there's a process (with very high priority) which
> does the actual swapping, per tile (!).
> So you can actually see the swapping taking place if you're running
> some high priority stuff yourself.
> Some tiles are swapped in the usual manner; front & back flipping,
> where your old front is exactly the back buffer. But some tiles seem
> to be copied?! So your back buffer is basically undefined, depending
> even on where your window is and how big it is.
> Very problematic.
Very interesting, but if you follow the rules, it isn't problematic.
>Ruud van Gaal wrote:
>
>> Indeed, when I switched from an SGI Indy to the SGI O2, it took me almost a
>> year to discover that glXSwapBuffers() did a
>> nearly-but-not-quite-ideal swap.
...
>Very interesting, but if you follow the rules, it isn't problematic.
That's right, but I came from the Amiga where the backbuffer was
always in a known state, and the Indy worked like that too, so I never
checked. In fact, it would seem *more* work to juggle with the
backbuffer than just a hardware flip.
My Banshee again seems to do the flipping nicely for 2D work (in which
you often only have to update just a small part of the screen, which
is often quicker than repainting the lot).
I've recently done a demo on a PC, with the same framework as I am
using on the SGI, and it works as well on the PC as on the SGI (well,
the SGI is about 100 times faster in the 2D area compared to the
Banshee). So there are really tricky times when the
half-copied back buffer becomes obvious.