I'm not entirely sure what's going on here, but I'll try to explain with some code snippets. To be clear, this is not the framerate dropping on mouse input; it's the opposite: I only get the framerate I want while I'm moving the mouse.
This is on Windows, using GLFW 3.3, on an NVIDIA RTX 2070 if that matters. The application is a development build running under a debugger, in case that's relevant, and vsync is disabled with glfwSwapInterval(0).
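For context, the setup is roughly this (a minimal sketch; the window size and title are placeholders and error handling is trimmed):

#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit())
        return -1;

    // window size/title here are placeholders, not the real app's
    GLFWwindow* window = glfwCreateWindow(1280, 720, "app", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return -1;
    }

    glfwMakeContextCurrent(window);
    glfwSwapInterval(0); // vsync off; the loop below does all the frame pacing

    // ... main loop shown below runs here ...

    glfwTerminate();
    return 0;
}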
In my update loop, I calculate all the frame timings after updating, rendering, and polling for events, then try to sleep the thread with glfwWaitEventsTimeout to hold a steady framerate of, say, 100 fps:
update();
render();
glfwPollEvents();

// time since the previous measurement; prev_frame_time persists across frames
current_frame_time = glfwGetTime();
delta_time = current_frame_time - prev_frame_time; // this is the value I log below
prev_frame_time = current_frame_time;

// sleep off whatever is left of the frame budget
double desired_frametime = 1.0 / engine.fps_limit; // 0.01 s at 100 fps
if (delta_time < desired_frametime) {
    double delay = desired_frametime - delta_time;
    glfwWaitEventsTimeout(delay);
}
Logging delta_time shows some weird inconsistencies. If I just let the application run untouched, every other frame is slow:
[INFO] 0.012421
[INFO] 0.000877
[INFO] 0.013879
[INFO] 0.000822
[INFO] 0.015218
[INFO] 0.000836
[INFO] 0.014395
[INFO] 0.000888
[INFO] 0.014158
[INFO] 0.000853
[INFO] 0.014593
[INFO] 0.000847
But if I start waving the mouse around over the window, the frame times become consistent:
[INFO] 0.001996
[INFO] 0.001800
[INFO] 0.002243
[INFO] 0.001663
[INFO] 0.001982
[INFO] 0.002114
[INFO] 0.001891
[INFO] 0.002102
[INFO] 0.002079
[INFO] 0.002117
[INFO] 0.001741
[INFO] 0.002178
[INFO] 0.001988
[INFO] 0.001988
If I wave the mouse outside the window, the incorrect behavior persists; the frame times only even out while I keep the mouse moving in a steady motion over the window.
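One thing I might try next, since glfwWaitEventsTimeout is documented to return as soon as any event arrives: time the wait itself to see whether mouse motion is waking it up early. Just a diagnostic sketch, not in the real code:

double wait_start = glfwGetTime();
glfwWaitEventsTimeout(delay);
double waited = glfwGetTime() - wait_start;
// If waited is much smaller than delay on the fast frames, the wait
// is being cut short by incoming events instead of sleeping the
// full timeout.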