OpenGL 3+ & Linux: animation stuttering

yabdallah wrote on Thursday, September 08, 2011:

I’m using GLFW for a Mac/Linux project.
I wrote the following mainloop:

    int running = GL_TRUE;
    do {
        glClearColor(0.752f, 0.800f, 0.432f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        animator->stage();
        model->draw();
        framecount++;
        showFPS();
        glfwSwapBuffers();
        running = !glfwGetKey(GLFW_KEY_ESC) && glfwGetWindowParam(GLFW_OPENED);
    } while (running);

I initialize the window with:

    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
    glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 2);
    glfwOpenWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
    glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    ...
    glfwOpenWindow(width, height, 0, 0, 0, 0, 0, 0, GLFW_WINDOW);
    ...
    glfwSwapInterval(1);

This loop works flawlessly on OS X 10.7 (AMD HD6xxx). On Linux (AMD HD4xxx
and HD5xxx), in contrast, the animation stutters. After some profiling it
turned out that the main-loop run time is imbalanced, taking up to 0.08
seconds per iteration (one VBO with a simple shader).

Is there a fix for this problem?

yabdallah wrote on Thursday, September 08, 2011:

Another observation:

Removing vsync (swap interval = 0) increases the framerate from 60 to
1600 fps.
But the frame time is still unstable (0.0015 to 0.012 seconds) and the
animation remains choppy.

Is it a problem to read glfwGetTime in quick succession?

elmindreda wrote on Thursday, September 08, 2011:

On a modern OS, frame times will never be exactly the same over a longer
period of time, as there are other processes running, taking up time and
purging your data from the caches.

Calling glfwGetTime in rapid succession should work as expected. It’s
basically just a call to gettimeofday. While this should and will be replaced
by clock_gettime, it should still work as it is.