Broken vsync behavior in macOS Catalina

Hi,

I have some code that alternates between clearing the window to black and white, and prints the time passed between each iteration of the loop, like this:

#include <GLFW/glfw3.h>
#include <iostream>
#include <iomanip>

int main(int argc, char** argv)
{
    glfwSetErrorCallback([](int error, const char* description) {
        std::cout << "Error: " << description << "\n";
    });

    if (!glfwInit()) {
        return -1;
    }

    GLFWwindow* window = glfwCreateWindow(800, 600, "Game", nullptr, nullptr);
    if (!window) {
        glfwTerminate();
        return -1;
    }

    glfwMakeContextCurrent(window);

    glfwSetKeyCallback(window, [](GLFWwindow* win, int key, int scancode, int action, int mods) {
        if (key == GLFW_KEY_ESCAPE && action == GLFW_PRESS) {
            glfwSetWindowShouldClose(win, GLFW_TRUE);
        }
    });

    glfwSwapInterval(1);

    auto prev = glfwGetTime();
    int frame = 0;
    while (!glfwWindowShouldClose(window)) {
        int width, height;
        glfwGetFramebufferSize(window, &width, &height);
        glViewport(0, 0, width, height);

        // Alternate between black and white
        float val = (frame++ % 2) ? 1.0f : 0.0f;
        glClearColor(val, val, val, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glfwSwapBuffers(window);
        auto curr = glfwGetTime();
        std::cout
            << std::fixed << std::setprecision(4)
            << curr - prev << "s\n";
        prev = curr;

        glfwPollEvents();
    }

    glfwDestroyWindow(window);
    glfwTerminate();

    return 0;
}

I would expect the window to flicker consistently between black and white, and the output to be around 0.0167s (one 60 Hz refresh interval) each time, since vsync is enabled. That’s what happens most of the time, but every few seconds glfwSwapBuffers starts returning early on every other frame, and keeps doing so for a while. The result is that the screen appears “stuck” in one color for a number of frames, and the output looks like this:

...
0.0167s
0.0167s
0.0168s
0.0012s  <-- Here, something goes wrong
0.0153s
0.0016s
0.0152s
0.0015s
0.0152s
0.0018s
0.0150s
0.0015s
0.0154s
0.0020s
0.0146s
0.0019s
0.0149s
0.0019s
0.0147s
0.0019s
0.0148s
0.0016s
0.0152s
0.0021s
0.0145s
0.0022s
0.0147s
0.0021s
0.0147s
0.0022s
0.0142s
0.0168s <-- Here, we're good again
0.0166s
0.0168s
...

I’m using GLFW 3.3.2 and macOS 10.15.5, but I’ve also observed this problem on macOS 10.14-something.

The same thing happens when I try to use vsync in SFML and SDL (with OpenGL rendering).

Any ideas? Is it a bug in macOS, and if so, is there a workaround?

Have you tried against the latest GitHub code? There was an issue with macOS breaking vsync which has been resolved; the fix is expected in release 3.3.3.

I tried now after installing GLFW with

brew install glfw --HEAD

Same result. Also, reading the issue, it seems that bug was resolved in macOS last year, and I’m on a later version. I tried again with the latest Catalina (10.15.6), just to be sure.

Maybe I should mention that when I do a similar test with SDL using the Metal backend, vsync works fine. But with the OpenGL backend, SDL has the same problem as GLFW.
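(In case anyone wants to reproduce the Metal comparison: I didn’t change any rendering code to switch backends. SDL lets you pick the render driver via a hint, which can also be set from the environment. This assumes SDL 2.0.8 or later, where the "metal" driver exists; the binary name below is a placeholder.)

```shell
# Ask SDL's render API for the Metal backend instead of OpenGL.
# Equivalent in code: SDL_SetHint(SDL_HINT_RENDER_DRIVER, "metal");
SDL_RENDER_DRIVER=metal ./my_sdl_test
```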

Thanks for checking.

Do you see this issue when full screen?

This appears to be an OpenGL / OS-related issue, as it affects GLFW, SDL, and SFML, but I think it’s worth submitting an issue to GLFW on GitHub in case a potential workaround can be developed.

Yes, fullscreen made no difference. I also tested on a different computer, with macOS 10.14.6, and it has the problem in fullscreen as well.

I think it’s worth you opening an issue on GitHub for this, with as much detail about your system as possible (which Mac hardware you have, with which GPU, as well as the OS information you’ve given here).