GLFW with VS 8 and Windows Vista eats Memory

nobody wrote on Sunday, April 27, 2008:

Hi,

I really love GLFW. It is so compact and stable. Now I wanted to start a big project (a remake of an old retro game ^^) under Windows Vista with VS 8, so I compiled a static GLFW 2.6 library for it and started to code. Everything works fine, just as it does on Linux or Windows XP. But there is one strange behavior in my GLFW program: if I look at the Task Manager, I can see that my program eats memory. The memory usage increases by 2-4 KB every 1-2 seconds, and this doesn't stop! I compared this with an old program I wrote under VS 2003 and Windows XP, and that program works fine and does not show this behavior. Does anybody have an idea what I can do?
I don't want to fall back to GLUT :frowning:
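
To measure this more precisely than the Task Manager shows it, one could log the working set from inside the program itself. A minimal Windows-only sketch using the PSAPI call GetProcessMemoryInfo (linked with psapi.lib); calling it from the render loop, e.g. once per second, gives exact numbers:

#include <windows.h>
#include <psapi.h>
#include <stdio.h>

/* Sketch: print the current working set so the 2-4 KB steps can be
   measured exactly. Call this from the render loop, e.g. once per
   second. Link with psapi.lib. */
static void LogWorkingSet( void )
{
    PROCESS_MEMORY_COUNTERS pmc;
    if( GetProcessMemoryInfo( GetCurrentProcess(), &pmc, sizeof(pmc) ) )
        printf( "working set: %lu KB\n",
                (unsigned long)( pmc.WorkingSetSize / 1024 ) );
}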

Here is a sample program that I compiled and which shows this strange behavior:

#include <stdlib.h>
#include <stdio.h>
#include <math.h>
#include "GLFW/glfw.h"

void Draw( void )
{
    int width, height;
    double t;

    t = glfwGetTime();
    glfwGetWindowSize( &width, &height );
    height = height < 1 ? 1 : height;

    glViewport( 0, 0, width, height );
    glClearColor( 0.0f, 0.0f, 0.0f, 0.0f );
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );

    glMatrixMode( GL_PROJECTION );
    glLoadIdentity();
    gluPerspective(
        65.0,
        (double)width / (double)height,
        1.0,
        100.0
    );

    glMatrixMode( GL_MODELVIEW );
    glLoadIdentity();
    gluLookAt(
        0.0, 0.0, 10.0,
        0.0, 0.0, 0.0,
        0.0, 1.0, 0.0
    );

    glBegin( GL_TRIANGLES );
    glColor3f( 1.0f, 0.0f, 0.0f );
    glVertex3f( -5.0f, -4.0f, 0.0f );
    glColor3f( 0.0f, 1.0f, 0.0f );
    glVertex3f(  5.0f, -4.0f, 0.0f );
    glColor3f( 0.0f, 0.0f, 1.0f );
    glVertex3f(  0.0f,  4.5f, 0.0f );
    glEnd();
}

int main( int argc, char **argv )
{
    int ok;
    int running;

    glfwInit();

    ok = glfwOpenWindow( 640, 480, 8, 8, 8, 8, 24, 0, GLFW_WINDOW );
    if( !ok )
    {
        glfwTerminate();
        return 0;
    }

    glfwSetWindowTitle( "My OpenGL program" );
    glfwEnable( GLFW_STICKY_KEYS );

    do
    {
        Draw();
        glfwSwapBuffers();

        running = !glfwGetKey( GLFW_KEY_ESC ) &&
                  glfwGetWindowParam( GLFW_OPENED );
    }
    while( running );

    glfwTerminate();
    return 0;
}

More info:

Environment: VS 8
Full Optimization (/Ox)
Runtime Library: Multi-threaded (/MT)
Additional Library Directories: :\Users\xxx\Desktop\Retro2D\GLFW (this is where the static library glfw.lib and glfw.h are)
Additional Dependencies: glfw.lib opengl32.lib glu32.lib
Entry Point: mainCRTStartup
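
For reference, those settings correspond roughly to a command line like the sketch below. The drive letter of the library path and the /SUBSYSTEM:WINDOWS flag are assumptions (an explicit mainCRTStartup entry point is normally only needed for a Windows-subsystem build), and user32.lib/gdi32.lib are defaults that the IDE links implicitly:

cl /Ox /MT main.c ^
   /link /LIBPATH:C:\Users\xxx\Desktop\Retro2D\GLFW ^
         /SUBSYSTEM:WINDOWS /ENTRY:mainCRTStartup ^
         glfw.lib opengl32.lib glu32.lib user32.lib gdi32.lib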

My System:

Intel Core 2 Quad 2.4 GHz
2048 MB RAM (666 MHz)
Windows Vista
Graphics card: GeForce 8500 GT
Driver: 169.25_forceware_winvista_32bit_international_whql.exe

Thanks for any help

Greetz

fbrjogl

nobody wrote on Sunday, April 27, 2008:

It is not VS 8… it is VS 2008!

(Sorry for my bad English… I'm German ^^)

greetz

fbrjogl

shurcool wrote on Thursday, May 01, 2008:

I’ve looked at your code and see nothing wrong with it (that would generate that kind of behaviour).

I’ve tried to compile it with my own compiled version of glfw v2.6, and as I expected the memory usage stayed reasonably constant and didn’t increase like you’re describing.

The only thing I changed was your #include "GLFW/glfw.h" statement to #include <gl/glfw.h>, because I like to put the GLFW header and lib in the standard library locations:

C:\Program Files\Microsoft SDKs\Windows\v6.0A\Include\gl, and
C:\Program Files\Microsoft SDKs\Windows\v6.0A\Lib

respectively.

I doubt that makes a difference, though. The only thing I can think of is that there's something wrong with your compiled version of GLFW. Did you download the latest source code (v2.6 final)?

Maybe you should try the pre-compiled GLFW v2.6 binaries and link with those, see if it makes a difference.

There's definitely nothing wrong with GLFW in the sample you posted, though, so it's a local problem that you'll have to sort out.

shurcool wrote on Thursday, May 01, 2008:

I should add that I tried it on Windows XP with VS 2008 Express Edition, which corresponds to VS 9.0, as the OP pointed out above. VS 8 is Visual Studio 2005.

uzbeche wrote on Sunday, April 19, 2009:

I got this leak too. I compiled both GLFW 2.6 and the latest SVN checkout using MSVC 2008 SP1. Actually, the code itself is rather straightforward:

/*
 * GLFW basic application
 */

#include <iostream>
#include <cstdlib>
#include <GL/glfw.h>

using namespace std;

int main(int argc, char **argv)
{
    glfwInit();
    atexit(glfwTerminate);

    glfwOpenWindowHint(GLFW_WINDOW_NO_RESIZE, GL_TRUE);
    if ( !glfwOpenWindow(500, 500, 0, 0, 0, 16, 16, 0, GLFW_WINDOW) )
    {
        cerr << "Failed to create window" << endl;
        return 1;
    }

    glfwEnable(GLFW_STICKY_KEYS);
    glfwDisable(GLFW_AUTO_POLL_EVENTS);

    bool bRunning = true;
    while ( bRunning )
    {
        //glClear( GL_COLOR_BUFFER_BIT );
        glfwSwapBuffers();
        glfwPollEvents();
        bRunning = !glfwGetKey( GLFW_KEY_ESC ) &&
                   glfwGetWindowParam( GLFW_OPENED );
    }

    return 0;
}

If I comment out the GL calls, the leak is gone.
It seems like it could be related to GDI, but I couldn't figure out what's going wrong.
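
For reference, the control case looks like the sketch below: the same loop with the GL and swap calls removed, and glfwSleep added (my addition, so the loop doesn't spin):

    // Control case: no glClear and no glfwSwapBuffers; with this
    // loop the memory usage stays flat.
    bool bRunning = true;
    while ( bRunning )
    {
        glfwPollEvents();
        glfwSleep( 0.01 );  // yield instead of busy-waiting
        bRunning = !glfwGetKey( GLFW_KEY_ESC ) &&
                   glfwGetWindowParam( GLFW_OPENED );
    }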

elmindreda wrote on Monday, April 20, 2009:

Huh. Sounds like a problem, but I don’t have access to Vista so someone else will have to look into it.

uzbeche wrote on Monday, April 20, 2009:

I forgot to mention: I've encountered the same issue on XP SP3.

artblanc wrote on Monday, April 20, 2009:

Hi,

I created and compiled a project with the code provided by Uzbeche (with the glClear call uncommented and one extra glClearColor call).

I compiled the project in:

- Visual Studio 2008 Express Edition under Windows Vista.
- Visual Studio 2008 Professional Edition under Windows Server 2008.

When I run both executables on Windows Vista I get the leak, but when I run them on Windows Server 2008 I don't (the number doesn't change by even a KB).

I don't know yet what this means, but maybe you have an idea.

uzbeche wrote on Tuesday, April 21, 2009:

Compiling both GLFW and the test app with the debug heap enabled, as suggested on GameDev (see the thread here: http://www.gamedev.net/community/forums/topic.asp?topic_id=230532), didn't produce any leak reports with either the /MTd or the /MDd CRT.
It's worth noting that the same code (not only the test app) on top of the latest FreeGLUT snapshot doesn't produce any leaks. Considering artblanc's findings and the fact that no leaks were reported on other platforms, I tend to think it's somehow related either to the WGL code or to the MS OpenGL stack implementation.
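
For anyone repeating that experiment, the debug-heap check can be enabled at the top of main(); a minimal sketch using the standard <crtdbg.h> flags. Note that it only tracks CRT-heap allocations, which fits the fact that the growth here never showed up in its reports:

#include <crtdbg.h>

int main(int argc, char **argv)
{
    // Track allocations and dump anything still allocated at exit.
    // Only active with the debug CRT (/MTd or /MDd).
    _CrtSetDbgFlag(_CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF);

    // ... rest of the test program from above ...
    return 0;
}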

uzbeche wrote on Thursday, April 23, 2009:

The behavior is related to the NVIDIA OpenGL implementation.
Actually there's no leak: if the program is left running for 20-30 minutes, memory consumption stabilizes. The same issue was also described on GameDev: http://www.gamedev.net/community/forums/topic.asp?topic_id=465798

artblanc wrote on Thursday, April 23, 2009:

That makes sense to me. When I did the test on Windows Vista and Windows Server 2008, the computer with Windows Vista had an NVIDIA card; the one with Windows Server 2008 had an Intel video card.

We were attributing the "bug" to the platform or to the compiler, which is why I didn't mention anything about the graphics cards.

I have tested it again, and yes, it stops after a few minutes.

Perfect! No leaks!

:slight_smile:

robindegen wrote on Wednesday, September 23, 2009:

I have had the exact same problem with the simplest of OpenGL programs on my Windows XP machine. I never figured out why. Compiling the exact same code on Windows 7 (on the same machine), I had no problems at all.