Migrating to 3.0: What to use instead of glfwSleep?

melekor wrote on Monday, June 17, 2013:

Let’s remove glfwGetTime and glfwSetTime too. What has time got to do with opening an OpenGL context? GLFW strives to be “minimalistic” and clean. If you need them just use a time library. It’s not that painful.

Clipboard handling? WTF has clipboard handling to do with GLFW? Most commercial games don’t use the clipboard either. Better junk it ASAP as the clipboard code is introducing unacceptable maintenance burden on the GLFW developers.

Joysticks? Not many commercial games actually use joysticks. At least not many I’ve played. I’m not convinced that your need for joystick input is common, so let’s get rid of it too. Use a library if you need it.

Repeat ad nauseam until no one uses GLFW.

vittoriolassini wrote on Monday, June 17, 2013:

If you need them just use a time library

Do you know any mature, lightweight, popular and maintained portable high-precision timing library?

Clipboard handling? WTF has clipboard handling to do with GLFW? Most commercial games don’t use the clipboard either

Actually, many commercial games do use clipboard handling (for example, in chat or login screens). It’s really useful for GUIs. And there is no “clipboard library”, so without this functionality you must write all the platform-specific code yourself. This is not the case with sleeping, because there are many mature, lightweight and maintained threading libraries (and with newer standards of C/C++ you don’t even need a separate threading library).

Joysticks? Not many commercial games actually use joysticks.

In this day and age of console gaming and multiplatform games there are literally TONS of games that use (or are even “optimized for”) joysticks. Again, do you know a good joystick library?

melekor wrote on Monday, June 17, 2013:

You didn’t get it again. I was attempting to demonstrate your way of thinking extended to its absurd conclusion.
The point is, even if there were libraries for those things, they still shouldn’t be removed from GLFW. Just like sleep shouldn’t have been removed. If those libraries exist in the future will you recommend GLFW remove that functionality then? This is nonsense.

and BTW, the whole “commercial games” thing is kind of a red herring anyways as (1) this isn’t the only target use case for GLFW and (2) I’m pretty sure most commercial games (at least AAA games) would never use GLFW anyways as it is not full featured enough. (Lack of a sleep function doesn’t help here)

vittoriolassini wrote on Monday, June 17, 2013:

You didn’t get it again. I was attempting to demonstrate your way of thinking extended to its absurd conclusion.

And I’ve demonstrated that my way of thinking doesn’t extend to any “absurd conclusion”, but to the current state of things.

If those libraries exist in the future will you recommend GLFW remove that functionality then?

If a really good timing library appears in the future, then I will not object to deprecating the current timing functions in some very, very distant, backwards-incompatible version of GLFW. I will not recommend that deprecation (because I use timing all the time), but it will not hurt me.
GLFW3 was not designed in a vacuum from some “way of thinking” alone. It was designed (as a backwards-incompatible, cleaned-up API) with existing software and typical use cases in mind, I believe.

and BTW, the whole “commercial games” thing is kind of a red herring anyways as (1) this isn’t the only target use case for GLFW and (2) I’m pretty sure most commercial games (at least AAA games) would never use GLFW anyways as it is not full featured enough. (Lack of a sleep function doesn’t help here)

The whole “commercial games” thing is not about GLFW, but about frame rate limiting with sleep. It is only meant to show on what grounds I think your needs are not common. To reiterate, I think that including some “out-of-focus, overlapping” functionality is only justified when there is sufficiently high demand for it and you can’t get it elsewhere without much pain. I believe glfwSleep() does not meet those conditions.
David, I understand your position fully. It’s just that I cannot agree with it.

This is going nowhere, so I’m done.

elmindreda wrote on Monday, June 17, 2013:

The suggested library is a single source file with a single header file, requiring no special configuration. It would have taken a fraction of the time that was spent creating this thread to integrate into a project. It’s bundled with GLFW 3 and used in its tests, so if you have the source tree available you can see this for yourself.

I currently have no plans to reintroduce a sleep function and this thread isn’t convincing anyone of anything. If anyone wants to change those plans, file a feature request and make a good argument for it.

That’s all I’m going to say here.

melekor wrote on Monday, June 17, 2013:

The suggested library is a single source file with a single header file, requiring no special configuration. It would have taken a fraction of the time that was spent creating this thread to integrate into a project.

Clearly, but I’ve been arguing this thread on principle for a while now. You know what would have taken even less time? Not removing core functionality on a whim, for zero benefit.

I currently have no plans to reintroduce a sleep function and this thread isn’t convincing anyone of anything. If anyone wants to change those plans, file a feature request and make a good argument for it.

In other words, please direct all complaints to the trash bin. Gotcha.

BTW: for anyone else affected by this debacle, here’s what I’m using:

void MySleep(double time) {
#ifdef _WIN32
    Sleep((DWORD)(time * 1000));              // windows.h
#else
    if (time == 0.0) {
        sched_yield();                        // sched.h
    } else {
        usleep((useconds_t)(time * 1000000)); // unistd.h
    }
#endif
}

Will work on Windows and 'nix until I can complete migrating my code to SDL 2.0.

mariojmartin wrote on Friday, October 04, 2013:

I am using the following code to limit the framerate

clock_t start_clock = clock();
...
display();
...
clock_t end_clock = clock();
clock_t sleep_time = MAX_FRAMERATE_MILLISECONDS
    + ((start_clock - end_clock) * 1000) / CLOCKS_PER_SEC;

if (sleep_time > 0){
    std::this_thread::sleep_for
        ( std::chrono::milliseconds( sleep_time ));
}

marcus256 wrote on Thursday, October 31, 2013:

I know that this is an old thread by now (I missed it), but I’ll give it a shot anyway. Perhaps it can clear up some things about the design philosophies in GLFW (old and new).

First of all, David, I’m sorry that you feel that way.

I guess I’m the one to blame for bringing the threading functionality into the 2.x line of GLFW in the first place, but here’s my reasoning (and I honestly don’t think that it’s insane)…

While threading support, including glfwSleep(), was an attempt at bringing cross-platform threading to the OpenGL development community back in the days when such solutions were not commonplace, I totally agree with the decision to drop it in 3.x.

GLFW 3.x is, as has been said over and over, a move toward simplification and re-adjustment to the current state and trends in the software & OpenGL development community, over a decade after GLFW first saw the light of day.

In particular, C11 and C++11 have finally brought proper, standardized threading libraries to us C & C++ developers, so there is really little point in keeping that in GLFW anymore.

If you look closely at those specifications, you’ll find that the Sleep functionality is part of the threading libraries: thrd_sleep() in <threads.h> (C11) and std::this_thread::sleep_for() in <thread> (C++11).

Now, since C11 and C++11 support was lacking a couple of years ago, I created TinyCThread and TinyThread++ for developers to use during the transitional period until compilers have caught up (because in my mind, the only sane way is to use standard libraries - and if your compiler is flawed, change compilers or work around it by using replacement libraries until it gets fixed).

I think that your particular problem is that you’re using the dreadful combination Visual Studio + C, which (as I understand it) has very poor support from Microsoft. They don’t seem to be interested in supporting anything newer than C90, other than subsets from newer C++ standards, which in my book reads “C is pretty low priority to us”.

My suggestion to anyone finding themselves in this unfortunate situation, rather than blaming GLFW for Microsoft’s poor C11 support, is to do some combination of the following:

A) If you want to stay with Visual Studio, perhaps move to C++ and reap the benefits of its C++11 support (yes, you can do std::this_thread::sleep_for() in VS 2012 and later).

B) If you want to stay with Visual Studio AND stay with its dated C support, start using TinyCThread and adopt the new standard way of doing threads in C11 (and no matter what you say - sleep() IS a threading function).

C) Nag on Microsoft to add support for C11 threads in Visual Studio (according to [1] I assume it’s only a matter of time, but they could probably do with some constructive user feedback).

[1] http://herbsutter.com/2012/05/03/reader-qa-what-about-vc-and-c99/

I need 3-5 and sometimes 30 FPS for non-gaming applications with interactive 3D graphics. They should run on inexpensive laptops without noise. I want to try GLFW3 instead of SDL2. Can anyone write a cross-platform equivalent of this code using GLFW3:

const float maxFPS = 3.f;

int main()
{
    /* ... */
    bool running = true;
    while (running)
    {
        float startTicks = SDL_GetTicks();
        // Drawing ...
        // Limit the FPS to the max FPS
        float frameTicks = SDL_GetTicks() - startTicks;
        if (1000.f / maxFPS > frameTicks)
        {
            SDL_Delay(1000.f / maxFPS - frameTicks);
        }
    }
    /* ... */
}

Hi @8Observer8,

Welcome to the GLFW forums.

If you are using C++ and can use C++11 you can use std::this_thread::sleep_for:
https://en.cppreference.com/w/cpp/thread/sleep_for

Alternatively you can use code similar to:

Note that the code above requires different headers depending on which function is used, namely:

  • usleep needs #include <unistd.h>.
  • nanosleep needs #include <time.h>.
  • Sleep needs #include <Windows.h> - you may want to define WIN32_LEAN_AND_MEAN and NOMINMAX to stop lots of definitions creeping in from the Windows header:
#ifdef _WIN32
    #define WIN32_LEAN_AND_MEAN
    #define NOMINMAX
    #include <Windows.h>
#endif

Hopefully that can get you started - if not let me know.

Cheers,

Doug.


I decided to make a Python version of the above SDL2 C++ example. Please help me write the equivalent code using GLFW3:

maxFPS = 5.0

def main():
    # ...
    running = True
    while running:
        startTicks = SDL_GetTicks()
        # Drawing ...
        # Limit the FPS to the max FPS
        frameTicks = SDL_GetTicks() - startTicks
        if 1000.0 / maxFPS > frameTicks:
            SDL_Delay(int(1000.0 / maxFPS - frameTicks))

This is a complete SDL2 example that can be run:

main.py

import ctypes
import sys

from OpenGL.GL import *
from sdl2 import *

maxFPS = 5.0
window: SDL_Window = None

def fatalError(message):
    print(message)
    if window:
        SDL_DestroyWindow(window)
    SDL_Quit()
    exit(-1)

def main():
    global window  # assign the module-level window so fatalError() can destroy it
    if SDL_Init(SDL_INIT_VIDEO) < 0:
        fatalError(SDL_GetError())

    # GL attributes must be set before the context is created
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1)

    window = SDL_CreateWindow(
        b"OpenGL, SDL2, Python",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        300, 300,
        SDL_WINDOW_OPENGL)
    if not window:
        fatalError(SDL_GetError())

    context = SDL_GL_CreateContext(window)
    
    glClearColor(0.65, 0.6, 0.85, 1.0)
    
    event = SDL_Event()
    running = True
    startTicks = 0.0
    frameTicks = 0.0
    while running:
        while SDL_PollEvent(ctypes.byref(event)) != 0:
            if event.type == SDL_QUIT:
                running = False
        startTicks = SDL_GetTicks()

        glClear(GL_COLOR_BUFFER_BIT)
        SDL_GL_SwapWindow(window)
        
        # Limit the FPS to the max FPS
        frameTicks = SDL_GetTicks() - startTicks
        if 1000.0 / maxFPS > frameTicks:
            SDL_Delay(int(1000.0 / maxFPS - frameTicks))
    SDL_GL_DeleteContext(context)
    SDL_DestroyWindow(window)
    SDL_Quit()
    return 0

if __name__ == "__main__":
    sys.exit(main())

This is the window using GLFW3:

import ctypes
import sys

from OpenGL.GL import *
import glfw

maxFPS = 5.0
window = None

def fatalError(message):
    print(message)
    glfw.terminate()
    exit(-1)

def main():
    if not glfw.init():
        fatalError("Failed to init GLFW")

    window = glfw.create_window(
        300, 300, "OpenGL, GLFW, Python", None, None)
    if not window:
        fatalError("Failed to create the GLFW window")
    glfw.make_context_current(window)

    glClearColor(0.65, 0.6, 0.85, 1.0)

    while not glfw.window_should_close(window):
        glfw.poll_events()
        glClear(GL_COLOR_BUFFER_BIT)
        glfw.swap_buffers(window)

    glfw.terminate()
    return 0

if __name__ == "__main__":
    sys.exit(main())

I think I should use glfw.get_time() instead of SDL_GetTicks(), and time.sleep() instead of SDL_Delay() and the removed glfwSleep().

import time

print("before")
time.sleep(1)
print("after")

Yes, the Python time.sleep() function can replace glfwSleep(). For timing you can either use the GLFW get-time function or Python’s own set of timing functions, though care must be taken as many of these return a float, which may not have enough precision for a realtime application once many seconds have passed.


The GLFW3 equivalent for the SDL2 example above:

import time

import glfw

maxFPS = 5.0

def main():
    # ...
    while not glfw.window_should_close(window):
        startTicks = glfw.get_time()
        # Drawing ...
        # Limit the FPS to the max FPS
        frameTicks = (glfw.get_time() - startTicks) * 1000
        if 1000 / maxFPS > frameTicks:
            time.sleep((1000 / maxFPS - frameTicks) / 1000)

main.py

import ctypes
import sys
import time

from OpenGL.GL import *
import glfw

maxFPS = 5.0
window = None

def fatalError(message):
    print(message)
    glfw.terminate()
    exit(-1)

def main():
    if not glfw.init():
        fatalError("Failed to init GLFW")

    window = glfw.create_window(
        300, 300, "OpenGL, GLFW, Python", None, None)
    if not window:
        fatalError("Failed to create the GLFW window")
    glfw.make_context_current(window)

    glClearColor(0.65, 0.6, 0.85, 1.0)

    startTicks = 0.0
    frameTicks = 0.0
    while not glfw.window_should_close(window):
        glfw.poll_events()
        startTicks = glfw.get_time()

        glClear(GL_COLOR_BUFFER_BIT)
        # Draw ...
        glfw.swap_buffers(window)
        # Limit the FPS to the max FPS
        frameTicks = (glfw.get_time() - startTicks) * 1000
        if 1000 / maxFPS > frameTicks:
            time.sleep((1000 / maxFPS - frameTicks) / 1000)
    glfw.terminate()
    return 0

if __name__ == "__main__":
    sys.exit(main())

Equivalent code in GLFW3 and SDL2:

GLFW3:

const float maxFPS = 3.f;

/**
 * Cross-platform sleep function for C
 * @param int milliseconds
 */
void sleep_ms(int milliseconds)
{
#ifdef _WIN32
    Sleep(milliseconds);                      /* Windows.h */
#elif _POSIX_C_SOURCE >= 199309L
    struct timespec ts;
    ts.tv_sec = milliseconds / 1000;
    ts.tv_nsec = (milliseconds % 1000) * 1000000;
    nanosleep(&ts, NULL);                     /* time.h */
#else
    usleep(milliseconds * 1000);              /* unistd.h */
#endif
}

int main()
{
    /* ... */
    while (!glfwWindowShouldClose(window))
    {
        double startTicks = glfwGetTime(); // glfwGetTime() returns seconds as a double

        // Drawing
        // ...
        glfwSwapBuffers(window);

        // Limit the FPS to the max FPS
        double frameTicks = (glfwGetTime() - startTicks) * 1000.0; // convert to milliseconds
        if (1000.0 / maxFPS > frameTicks)
        {
            sleep_ms((int)(1000.0 / maxFPS - frameTicks));
        }
    }
    /* ... */
}

SDL2:

const float maxFPS = 3.f;

int main()
{
    /* ... */
    bool running = true;
    while (running)
    {
        Uint32 startTicks = SDL_GetTicks(); // milliseconds since SDL_Init

        // Drawing
        // ...
        SDL_GL_SwapWindow(window);

        // Limit the FPS to the max FPS
        float frameTicks = (float)(SDL_GetTicks() - startTicks);
        if (1000.f / maxFPS > frameTicks)
        {
            SDL_Delay((Uint32)(1000.f / maxFPS - frameTicks));
        }
    }
    /* ... */
}