X Error BadMatch from RRSetCrtcGamma on Linux with latest NVidia driver

I got a bug report for my game saying that it crashes when starting in full-screen mode. I can't reproduce the issue myself, but I do have fairly good logs from the incident.

These are the last words of the executable:

X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  140 (RANDR)
  Minor opcode of failed request:  24 (RRSetCrtcGamma)
  Serial number of failed request:  212
  Current serial number in output stream:  213

The game was run on 64-bit Antergos Linux with an Intel CPU and a GeForce GTX 980 Ti graphics card, using the latest proprietary drivers (378.13).

Any ideas what might be causing this?

Are you using glfwSetGamma or glfwSetGammaRamp?

I use glfwSetGamma() if the game is launched full-screen.

At first glance, the error seems to be caused by GLFW trying to set a gamma ramp of a different size than the current one, which RandR prohibits but GLFW doesn't check for. If so, you can fix it on your end right away by building your own ramp of the current size and setting that with glfwSetGammaRamp, as in the sketch below. Check the implementation of glfwSetGamma for the details, as it's just a wrapper around glfwSetGammaRamp.
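A minimal sketch of that workaround in C, assuming GLFW 3's monitor API; the apply_gamma helper name is illustrative (not part of GLFW), and the exponent curve is an approximation of what glfwSetGamma computes internally:

```c
#include <GLFW/glfw3.h>
#include <math.h>
#include <stdlib.h>

/* Apply a gamma exponent by building a ramp that matches the monitor's
 * current ramp size, then setting it with glfwSetGammaRamp. */
static void apply_gamma(GLFWmonitor *monitor, float gamma)
{
    /* Query the ramp currently in use to learn its size
     * (e.g. 1024 on newer NVidia drivers, typically 256 elsewhere). */
    const GLFWgammaramp *current = glfwGetGammaRamp(monitor);
    if (!current || current->size < 2)
        return;

    unsigned int size = current->size;
    unsigned short *values = malloc(size * 3 * sizeof(unsigned short));
    if (!values)
        return;

    GLFWgammaramp ramp;
    ramp.size  = size;
    ramp.red   = values;
    ramp.green = values + size;
    ramp.blue  = values + size * 2;

    for (unsigned int i = 0; i < size; i++)
    {
        /* value = (i / (size - 1)) ^ (1 / gamma), scaled to 16 bits */
        float value = powf((float) i / (float) (size - 1), 1.0f / gamma);
        if (value > 1.0f)
            value = 1.0f;
        unsigned short v = (unsigned short) (value * 65535.0f + 0.5f);
        ramp.red[i] = ramp.green[i] = ramp.blue[i] = v;
    }

    /* GLFW copies the ramp data, so the local buffer can be freed afterwards. */
    glfwSetGammaRamp(monitor, &ramp);
    free(values);
}
```

Call it with a valid monitor after glfwInit, e.g. apply_gamma(glfwGetPrimaryMonitor(), 2.2f). Because the new ramp always matches whatever size the driver reports, it sidesteps the size mismatch that triggers the BadMatch.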

I'll fix this in master ASAP, but I won't have much coding time this week or next.

The BadMatch should be fixed now.

https://github.com/glfw/glfw/commit/66b16f1fc1cfb1ee68ccbbc09e56646e3ab1e6fe

This still leaves glfwSetGamma potentially producing ramps of invalid sizes on RandR, but the documentation specifies a static ramp size, so I'm not sure yet which choice breaks the least code.

Thank you for the help! I wasn’t expecting to find a solution this fast :smile:

According to the NVidia driver change log (http://www.nvidia.com/download/driverResults.aspx/101818/en-us), it now uses a 1024-entry gamma ramp, so this is almost certainly the cause of my error.