How to force tearing with GLFW? (VSYNC OFF)

Hello,

I’m Mark, founder of Blur Busters.
I’m researching GLFW and evaluating its suitability as an alternative engine for my Tearline Jedi raster demo.

However, this demo mandates VSYNC OFF mode, because tearlines are essential to it. Right now it works with native Direct3D, native OpenGL, and C# MonoGame, on PC and Mac, on Intel/AMD/NVIDIA GPUs.

Successfully porting Tearline Jedi to GLFW requires gaining access to tearlines.

So I’d like to force VSYNC OFF mode, which doesn’t seem to work with GLFW.

How do I force VSYNC OFF tearlines with GLFW? GLFW refuses to let me enable tearing (which requires exclusive fullscreen mode, not windowed fullscreen).

(Also, there’s a Use Case #2 for needing to enable tearing mode: it’s the world’s lowest-lag mode, useful for tests, reaction-time research, and competitive gaming. Competitive gamers like it, much to some developers’ chagrin, but I understand why. Some high-end eSports gamers require it for the lowest input lag, since milliseconds can matter in top leagues. In “race to the finish line” events, tightly-paired gamers react to the same stimulus at the same time (see each other, react at the same time, draw simultaneously), and the lower-lag mode (e.g. VSYNC OFF) can win such events by a scant few milliseconds, even if the player cannot feel the millisecond. So that’s another reason many competitive players prefer VSYNC OFF for lower latencies. But that’s altogether another debate; I simply mention it as Use Case #2 for letting developers choose to intentionally enable tearing.)


Here’s a testcase which follows the GLFW docs for this feature but still doesn’t achieve tearing: https://drive.google.com/open?id=1omtZYjwNETlBRoY6q-o25ifIiNLmn6x3

Notice in glfw_minimal.cpp (a condensed sketch follows this list):
Line 46: swap interval = 0
Line 122: glFlush rather than glfwSwapBuffers
Line 138: double buffer off
Line 147: window dimensions that don’t match the monitor’s. According to the GLFW docs, this should force true exclusive fullscreen. It does change the monitor’s video mode, but there is no tearing.
Line 148: an additional attempt to set fullscreen using glfwSetWindowMonitor. Still no tearing.
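
For reference, the testcase boils down to roughly this (a condensed sketch, not the exact file; drawFrame() and the 800x600 mode here are just placeholders):

```c
// Condensed sketch of glfw_minimal.cpp's approach (illustrative, not the exact file).
// drawFrame() is a placeholder for the demo's actual rendering.
#include <GLFW/glfw3.h>

static void drawFrame(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    /* ... draw the moving test pattern here ... */
}

int main(void)
{
    if (!glfwInit())
        return 1;

    glfwWindowHint(GLFW_DOUBLEBUFFER, GL_FALSE);     // double buffer off (line 138)

    GLFWmonitor* monitor = glfwGetPrimaryMonitor();
    // Dimensions deliberately mismatched with the desktop mode, which per the
    // docs should force a true (exclusive) fullscreen video mode (line 147):
    GLFWwindow* window = glfwCreateWindow(800, 600, "tearline test", monitor, NULL);
    if (!window)
        return 1;

    glfwMakeContextCurrent(window);
    glfwSwapInterval(0);                             // swap interval = 0 (line 46)
    // Additional attempt to (re)set exclusive fullscreen (line 148):
    glfwSetWindowMonitor(window, monitor, 0, 0, 800, 600, GLFW_DONT_CARE);

    while (!glfwWindowShouldClose(window)) {
        drawFrame();
        glFlush();                                   // single-buffered, so flush
                                                     // instead of swap (line 122)
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```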

What appears to be happening is that GLFW is introducing a 1-frame delay that prevents tearing. It would be nice if we could remove this delay.

I haven’t taken a look at your code, as I don’t want to download a zip file from an unknown source, but glfwSwapInterval(0) works for me on GLFW examples such as boing.c. Note that this can depend on your GPU driver settings, as per the documentation for Buffer Swapping.

Have you verified that with swap interval 0 the frame time is no longer capped to your monitor refresh interval?

Yes, I forgot to note: in all my tests, I did set my Intel HD4000’s vsync setting to “Use application settings”, which disables the driver’s mandatory vsync.

We’re already able to uncap the framerate, but it doesn’t help. Rendering at 1000 fps, as my testcase does, still leaves a 1-frame delay under GLFW, and no tearing. The way I benchmarked this was by drawing a triangle at the mouse position while rendering at 1000 fps: the triangle’s delay behind the mouse is exactly one frame, no more, no less.
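
The benchmark loop amounts to roughly the following (a simplified sketch rather than the exact testcase code; the fullscreen size and triangle geometry are illustrative):

```c
/* Simplified sketch of the latency benchmark: draw a triangle at the cursor
   at an uncapped framerate, then eyeball how far it lags behind the pointer. */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(1920, 1080, "lag test",
                                          glfwGetPrimaryMonitor(), NULL);
    if (!window) return 1;
    glfwMakeContextCurrent(window);
    glfwSwapInterval(0);                      /* uncapped framerate */

    while (!glfwWindowShouldClose(window)) {
        double mx, my;
        int w, h;
        glfwGetCursorPos(window, &mx, &my);
        glfwGetWindowSize(window, &w, &h);
        /* window coords (origin top-left) -> NDC (origin center, y up) */
        float x = (float)(mx / w * 2.0 - 1.0);
        float y = (float)(1.0 - my / h * 2.0);

        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_TRIANGLES);                /* legacy GL, as in the GLFW examples */
        glVertex2f(x, y);
        glVertex2f(x + 0.05f, y - 0.1f);
        glVertex2f(x - 0.05f, y - 0.1f);
        glEnd();
        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```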

I compiled and ran boing.c. boing.c does not tear. At least on Windows, boing.c has no chance to tear because it sits in a window instead of fullscreen, which means it is not bypassing the compositor.

You can switch boing.c to fullscreen using ALT+ENTER.

On my Windows 10 system with an AMD GPU, I can see the occasional tear. To see it better, I added a timing while loop inside the main render loop to hold each frame at ~16 ms, which matches my refresh rate; with that in place I can see a crawling tearline fairly clearly on the ball in fullscreen.
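
Concretely, the delay I added was along these lines (a sketch of the idea, not the exact patch; 0.016 s matches my ~60 Hz refresh):

```c
/* Busy-wait until ~16 ms have elapsed since the last frame, so each frame
   spans roughly one refresh interval and the tearline crawls slowly enough
   to see. This goes inside boing.c's main render loop. */
static double lastFrame = 0.0;
while (glfwGetTime() - lastFrame < 0.016)
    ;  /* spin */
lastFrame = glfwGetTime();
```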

This was still using the same video mode as my desktop, where GLFW simply changes the window to remove WS_OVERLAPPEDWINDOW and sets it to HWND_TOPMOST. However, GLFW’s swap functions then use SwapBuffers rather than DwmFlush(). I also checked with a hardcoded resolution change on line 262 of boing.c.
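
Roughly speaking, that window change amounts to something like the following (a paraphrase for illustration only, not GLFW’s actual code; makeBorderlessTopmost is a made-up name):

```c
/* Rough paraphrase of borderless "fullscreen" at the desktop mode on Windows:
   strip the window decorations and raise the window topmost. This is NOT
   GLFW's exact code, just an illustration of the mechanism. */
#include <windows.h>

void makeBorderlessTopmost(HWND hwnd, int screenW, int screenH)
{
    LONG_PTR style = GetWindowLongPtr(hwnd, GWL_STYLE);
    style &= ~WS_OVERLAPPEDWINDOW;      /* remove caption, borders, etc. */
    style |= WS_POPUP;
    SetWindowLongPtr(hwnd, GWL_STYLE, style);
    SetWindowPos(hwnd, HWND_TOPMOST, 0, 0, screenW, screenH,
                 SWP_FRAMECHANGED | SWP_SHOWWINDOW);
}
```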

So for me, on my driver, I’m seeing tearing.

The 1-frame delay you mention is odd; I can’t see anything in GLFW that would do this other than the standard double buffering and SwapBuffers with a non-zero swap interval. Unless there is some underlying compositing, even a 1-frame delay should still show a tear when the framerate is not locked to the refresh rate, since it’s the timing of the swap that matters here.

Sadly, though I used to work at Intel on the HD4000 a long time back, I’m out of touch with what the OpenGL drivers are doing here; I may have a chance to take a look on an HD4000 at some point.


Excellent. I followed your steps, added a 16 ms delay, and fullscreened boing.c. Now I too see a crawling tearline, thanks. That’s enough for me to bisect what is happening.

EDIT: new users aren’t allowed to reply > 3 times, so this is my followup:
I fixed the problem and found the line that causes it, but not the underlying cause. glfwWindowHint(GLFW_DOUBLEBUFFER, GL_FALSE) is what prevents tearing. Removing it (and swapping with glfwSwapBuffers instead of glFlush) produces tearing, as desired. I assume this is caused by Windows; the GLFW line that enables double buffering looks fine to me: https://github.com/glfw/glfw/blob/master/src/wgl_context.c#L385
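
For anyone who hits the same wall, the configuration that tears for me boils down to this minimal sketch (the 800x600 mode is illustrative; the key points are exclusive fullscreen, swap interval 0, double buffering left on, and glfwSwapBuffers each frame):

```c
// Minimal sketch of the configuration that tears for me: leave double
// buffering ON (do NOT hint GLFW_DOUBLEBUFFER to GL_FALSE), go exclusive
// fullscreen, set swap interval 0, and swap normally.
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    // No GLFW_DOUBLEBUFFER hint here; the default (double buffering) works.
    GLFWmonitor* monitor = glfwGetPrimaryMonitor();
    GLFWwindow* window = glfwCreateWindow(800, 600, "tearing", monitor, NULL);
    if (!window)
        return 1;

    glfwMakeContextCurrent(window);
    glfwSwapInterval(0);                 // VSYNC OFF

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);
        /* ... render ... */
        glfwSwapBuffers(window);         // swap instead of glFlush()
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```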
