qy9
November 2, 2025, 9:46am
1
I want to add zooming via touchpad gestures to my project; by that I mean moving two fingers apart or closer together, like pinch-to-zoom on a phone or laptop.
I tried the events.c test program to check whether the library could detect it, but it didn't. I also went through the input guide and didn't see a relevant section, so maybe this is a feature request, unless there is already a way.
GLFW does not currently support touch input.
There is an old touch input PR:
master ← torkeldanielsson:touch
opened 06:22AM - 22 Jul 20 UTC
This PR proposes to add the generic API from the touch branch for receiving touch input.
There have been at least the following efforts, going back to at least 2013 (!), to add touch to GLFW:
- #42 The related touch branch was created by Quinten Lansu with additions (bananas!) from Camilla Löwy.
- #532 In 2015, linkmauve made an effort to get touch working on Wayland.
- #952 In 2017 Erik Sunden made a PR which (besides rebasing) proposed bundling touches before passing to callbacks.
- Related: #90 (trackpads, but #42 is mentioned in the discussions)
Now, I have a customer who needs touch to work in our application (voysys.se), so I need touch to work, at least in a fork of our own (which it now does). The platform for my project is Windows. I added a few hours to the work package to be able to make this PR with these changes to the upstream. (Whether it gets merged is of course an open question for the maintainers/community.)
After reading through the previous efforts on touch, above, I came to the conclusion that I like the callback API in the touch branch. It is based on WM_TOUCH, which is available from Windows 7. The effort in PR #532 to add Wayland support indicates that the approach will work for other platforms too. Raw touch points with id, state (press/move/release), and position are passed in. It is up to the user to add inertia, gestures, whatever. I think this is a clean, sane low-level API for touch and that it is the right approach. After all, who knows what people will want to do with the touches?
What I have done:
- Rebased the touch branch on the latest upstream/master
- Updated comments, defines, styling, etc. to be in line with what the current standards seem to be (I hope I got it right?)
- Added a "glfwTouchInputSupported" function to check for touch support (I'm open to removing this one - I took inspiration from the raw mouse support test function, but that actually works slightly differently in that it is a per-platform true/false thing. Touch support depends on the platform, the OS version, and also on whether the user actually has a touch screen. Right now this function answers the platform and OS-version questions, but it doesn't know if the user has a touch screen.)
- Used this to build a simple pan+zoom touch interface in our application
- (I squashed the individual commits, because rebasing was easier this way and they did not build individually anyway)
I have tested this now on my laptop, which has a touchscreen (Surface Book 2), and starting next week I will run tests on the target displays, which are external multi-touch monitors.
[2020-11-02] Update: this has been running and working well at the customer's site for a few months. Touch is the only input used, on two external touch monitors per PC. No issues have been reported.
qy9
November 2, 2025, 10:17am
3
That's fine, I'll use scroll motion then. Thank you for the quick response, and I hope it becomes possible one day, as pinch-to-zoom is common. I personally am not capable enough to contribute it to the project myself.