whereas this one won't:
void RenderScene()
{
    glClear( GL_COLOR_BUFFER_BIT );

    // Draw a single quad in immediate mode
    glBegin( GL_QUADS );
        glVertex2i( -100, -100 );
        glVertex2i(  100, -100 );
        glVertex2i(  100,  100 );
        glVertex2i( -100,  100 );
    glEnd();

    SwapBuffers( g_hDC );
}
To make sure that I wasn't actually insane, I went to nehe.gamedev.net and
downloaded his Lesson 1, which simply sets up an OpenGL window. To reproduce
the memory leak, all I had to do was comment out the glClear() call and
disable depth testing (to ensure that a quad was actually drawn each frame).
Sure enough, the process leaked about 8 KB/s. If I cleared the buffer OR
enabled depth testing in his example, the leak went away.
It's not a big deal to clear the color buffer each frame, I just *shouldn't
have to*.
I am running WinXP SP2 with an ATI FireGL 3100. I have tested versions
8.043.11 and 8.023.1 of ATI's drivers; the memory leak was the same with both.
Has anyone seen behavior like this before?
Thanks in advance,
Dan Nawrocki