Problem with double-buffering and Nvidia on Linux / wayland

First up, I don’t think this is really a problem with GLFW, but I am looking for some ideas. For example: is this only happening on my machine?

I wrote a very simple program that clears one of the framebuffers to “orange” and the other to “blue”, and then swaps between the two every 0.5 s. This works as expected on my notebook with Intel graphics, but not on my desktop with Nvidia:

#include <GLFW/glfw3.h>   /* pulls in the system OpenGL header by default */
#include <stdlib.h>

int main( int argc, char *argv[])
{
    if (!glfwInit())
        exit(EXIT_FAILURE);

    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

    GLFWwindow* window = glfwCreateWindow(640, 480, "Demo", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        exit(EXIT_FAILURE);
    }

    glfwMakeContextCurrent(window);
    glfwSwapInterval(0);

    glClearColor( 0.7, 0.3, 0.0, 0.0);   /* first buffer: orange */
    glClear(GL_COLOR_BUFFER_BIT);
    glfwSwapBuffers(window);

    glClearColor( 0.0, 0.3, 0.7, 0.0);   /* second buffer: blue */
    glClear(GL_COLOR_BUFFER_BIT);
    glfwSwapBuffers(window);

    glfwSwapInterval(30);   /* swap roughly every 30 vblanks (~0.5 s at 60 Hz) */

    while (!glfwWindowShouldClose(window))
    {
        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwDestroyWindow(window);

    glfwTerminate();
    exit( EXIT_SUCCESS);
}

On Nvidia I just see the “blue” buffer. That’s with all the drivers I have available: nouveau and the 470/515 proprietary drivers.

I am running: Ubuntu 22.04.01 LTS / Wayland / nouveau / Nvidia 3060 Ti

Welcome to the GLFW forum!

GLFW on Wayland uses EGL by default for context creation, and thus glfwSwapBuffers calls eglSwapBuffers.

According to the documentation, the EGL_SWAP_BEHAVIOR attribute would need to be set to EGL_BUFFER_PRESERVED to preserve the back buffer, and GLFW does not set this. The behaviour of buffer swapping is not consistent across APIs/OSs/drivers, so I do not think it would be useful for GLFW to set it, but you could use the native access interface to get the EGL display and surface and set the surface attribute yourself.
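
Something along these lines might work as a starting point (an untested sketch: it assumes the window’s context was created through EGL, which is the default on Wayland, and omits error handling):

#define GLFW_EXPOSE_NATIVE_EGL
#include <GLFW/glfw3.h>
#include <GLFW/glfw3native.h>

/* Sketch: ask EGL to preserve the back buffer across swaps.
   Per the EGL spec this only succeeds if the underlying EGLConfig
   was created with EGL_SWAP_BEHAVIOR_PRESERVED_BIT in EGL_SURFACE_TYPE. */
static int try_preserve_back_buffer(GLFWwindow* window)
{
    EGLDisplay display = glfwGetEGLDisplay();
    EGLSurface surface = glfwGetEGLSurface(window);

    if (display == EGL_NO_DISPLAY || surface == EGL_NO_SURFACE)
        return 0;   /* not an EGL-backed window */

    return eglSurfaceAttrib(display, surface,
                            EGL_SWAP_BEHAVIOR, EGL_BUFFER_PRESERVED) == EGL_TRUE;
}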

Generally OpenGL programs which wish to preserve rendered pixel data do so through render to texture.

Thanks, the information about EGL_SWAP_BEHAVIOR is very interesting, as I didn’t know that. Yet that can’t really be the problem here, as a swap should then reveal the destroyed framebuffer contents as some random garbage or black. Instead it’s just “blue”, as if it doesn’t really swap. I modified the example to use a single buffer, and this just shows black, which is even worse :) :

#include <GLFW/glfw3.h>
#include <stdlib.h>
#include <time.h>     /* nanosleep(), struct timespec */

int main(void)
{
   struct timespec delay = { 0, 500 * 1000L * 1000L};   /* 0.5 s */

   if (!glfwInit())
     exit(EXIT_FAILURE);

   glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
   glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
   glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
   glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
   glfwWindowHint(GLFW_DOUBLEBUFFER, GLFW_FALSE);

   GLFWwindow* window = glfwCreateWindow(640, 480, "Demo", NULL, NULL);
   if (!window)
   {
      glfwTerminate();
      exit(EXIT_FAILURE);
   }

   glfwMakeContextCurrent(window);

   for(;;)
   {
      glfwPollEvents();
      if( glfwWindowShouldClose(window))
         break;

      glClearColor( 0.7, 0.3, 0.0, 0.0);   /* orange */
      glClear(GL_COLOR_BUFFER_BIT);
      glFlush();
      nanosleep( &delay, NULL);

      glfwPollEvents();
      if( glfwWindowShouldClose(window))
         break;

      glClearColor( 0.0, 0.3, 0.7, 0.0);   /* blue */
      glClear(GL_COLOR_BUFFER_BIT);
      glFlush();
      nanosleep( &delay, NULL);
   }

   glfwDestroyWindow(window);

   glfwTerminate();
   exit( EXIT_SUCCESS);
}

One thing I found out is that I actually didn’t compile with GLFW_USE_WAYLAND anymore, due to a recently introduced misconfiguration in my build setup :(. So the code was actually using X11.

Now with GLFW compiled for Wayland, the window contents are complete trash, which is probably what one would expect. So I tried eglSurfaceAttrib( display, surface, EGL_SWAP_BEHAVIOR, EGL_BUFFER_PRESERVED); but it returns EGL_FALSE.

The EGL docs say:

EGL_BAD_MATCH is generated if attribute is EGL_SWAP_BEHAVIOR, value is EGL_BUFFER_PRESERVED, and the EGL_SURFACE_TYPE attribute of the EGLConfig used to create surface does not contain EGL_SWAP_BEHAVIOR_PRESERVED_BIT.

I tried patching this into src/egl_context.c, but this yields the error: 'EGL_SWAP_BEHAVIOR_PRESERVED_BIT' undeclared.

The single-buffer test code now fails, as no window is created, which is probably what one would expect in this situation if framebuffer content preservation is not set.

Oh well…

I’m guessing you mean it fails to compile because EGL_SWAP_BEHAVIOR_PRESERVED_BIT is undeclared? This is because GLFW places the required defines in the internal.h header (this is done to simplify building GLFW); you need to look up the values of the defines you need and add them there.
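
For reference, the token values from the Khronos EGL headers are listed below; adding whichever of these internal.h is missing should let it build (values copied from EGL/egl.h, EGL 1.4):

/* Token values as defined in the Khronos EGL headers (EGL 1.4). */
#define EGL_SWAP_BEHAVIOR_PRESERVED_BIT 0x0400
#define EGL_SWAP_BEHAVIOR 0x3093
#define EGL_BUFFER_PRESERVED 0x3094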

The behaviour of EGL without EGL_BUFFER_PRESERVED set, and the behaviour of GLX, is that “the contents of the color buffer are undefined”. The swap behaviour of some drivers/hardware is to copy the contents, so the results you see on GLX could be due to this. In general it is not safe to assume that undefined behaviour will do a specific thing.
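
If you want to see what a given driver actually reports rather than relying on undefined behaviour, you can query the surface. A small sketch, reusing the EGL display and surface obtained through the native access functions above:

#include <stdio.h>
#include <EGL/egl.h>

/* Sketch: report the surface's current swap behaviour. */
static void report_swap_behavior(EGLDisplay display, EGLSurface surface)
{
    EGLint behavior = 0;

    if (eglQuerySurface(display, surface, EGL_SWAP_BEHAVIOR, &behavior))
    {
        if (behavior == EGL_BUFFER_PRESERVED)
            printf("back buffer is preserved across swaps\n");
        else
            printf("back buffer contents are undefined after a swap\n");
    }
}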

I would again recommend looking into render to texture (using OpenGL framebuffer objects) if you want to preserve your rendered pixel content for future use, especially as you are using OpenGL 3.3, which fully supports this.
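
A minimal sketch of that approach (assuming a 3.3 core context and a function loader such as glad is already in place; names and error handling are just placeholders):

#include <glad/glad.h>   /* assumption: any GL 3.3 function loader will do */

/* Sketch: render into a texture-backed framebuffer object, then blit it to
   the window's back buffer every frame. The texture's contents survive
   buffer swaps, so previously rendered pixels can be reused. */
static GLuint fbo, color_tex;

static int create_render_target(int width, int height)
{
    glGenTextures(1, &color_tex);
    glBindTexture(GL_TEXTURE_2D, color_tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, color_tex, 0);

    return glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE;
}

static void present(int width, int height)
{
    /* Blit the offscreen texture to the default framebuffer,
       then call glfwSwapBuffers() as usual. */
    glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
    glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);
}

Drawing goes to the FBO while it is bound as GL_FRAMEBUFFER, so the window’s own back buffer never needs to be preserved.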

Rendering into textures is no problem per se, but I thought I could make things easier for myself by reusing the framebuffer contents, which I now can’t, unfortunately. Thanks for the help.