Inputlag.science – Repository of knowledge about input lag in gaming
by akyuu
The website notes that you can measure lag with an “expensive” high-speed camera setup.
My favorite trick, which I’ve used frequently (including in scientific publications on lag!), is to use the slo-mo camera on a smartphone. Phones will usually do anywhere from 120 to 240 fps. Set up the camera so it can see both your input (e.g. a side view of you pushing a button) and the display, record a video, and then pull it into a media player that supports frame-by-frame playback. You can then count the number of frames between you pushing the button (pressing it down far enough to electrically activate it) and the corresponding reaction on screen. This gives you a cheap and easy setup capable of measuring latency down to ~4 ms granularity, and a few repeated measurements can give you a very accurate picture of latency. Keep in mind that latency is a range (a statistical distribution), not a single number, so you need repeated measurements to understand the shape of the distribution.
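For example, a quick sketch of turning hand-counted frame deltas from such a recording into milliseconds and a distribution summary (the 240 fps figure and the frame counts below are placeholders, not real measurements):

    // Convert "frames between button actuation and on-screen reaction" into
    // milliseconds and summarize the distribution over repeated trials.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        const double camera_fps = 240.0;                           // slow-mo capture rate
        std::vector<int> frames = {9, 11, 10, 14, 9, 12, 10, 13};  // counted by hand, one per trial

        std::vector<double> ms;
        for (int f : frames) ms.push_back(f * 1000.0 / camera_fps);
        std::sort(ms.begin(), ms.end());

        double sum = 0;
        for (double v : ms) sum += v;
        std::printf("n=%zu  min=%.1f ms  median=%.1f ms  mean=%.1f ms  max=%.1f ms\n",
                    ms.size(), ms.front(), ms[ms.size() / 2], sum / ms.size(), ms.back());
    }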
If you’re developing a game, you can add a prominent frame counter on screen to be captured on the video, and add the frame counter to your log output. Then you can match up the video with your game’s events, after accounting for display latency.
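A minimal sketch of the logging side, independent of any particular engine (the loop below just stands in for your game loop, and frames.log is an arbitrary file name):

    // Per-frame log: frame index plus a monotonic timestamp. Render the same
    // index as on-screen text (engine-specific, omitted here) so each video
    // frame can be matched to a log line afterwards.
    #include <chrono>
    #include <cstdio>

    int main() {
        std::FILE* log = std::fopen("frames.log", "w");
        const auto start = std::chrono::steady_clock::now();

        for (long frame = 0; frame < 600; ++frame) {   // stand-in for the game loop
            // ... poll input, update, render, and draw `frame` as on-screen text ...
            const auto now = std::chrono::steady_clock::now();
            const double ms = std::chrono::duration<double, std::milli>(now - start).count();
            std::fprintf(log, "%ld %.3f\n", frame, ms);
        }
        std::fclose(log);
    }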
I don't know how scientifically valid this is (I hope very), but when a friend told me my USB hub/switcher would be introducing a lot of input lag, I bought a USB-to-Ethernet adapter and ran a few thousand pings to the router, first with the adapter plugged directly into the motherboard and then through the switcher. Unsurprisingly, there was no measurable difference in latency (I had to use a third-party tool because Windows' ping won't report anything finer than whole milliseconds by default).
I am aware that admitting to using Windows in these hallowed halls is a terrible sin, but the anecdote was too relevant to pass up and that's an important detail for anybody looking to repro.
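For anyone wanting to reproduce something like this without a dedicated pinger, here's a rough sketch of sub-millisecond round-trip timing. It uses POSIX sockets for brevity and assumes something on the target echoes UDP back, so it's not a real ICMP ping, just an illustration of the timing part (the router IP and echo port are placeholders):

    // Time UDP round trips with microsecond resolution. Note: recvfrom blocks
    // forever if nothing on the far end echoes the datagram back.
    #include <arpa/inet.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <chrono>
    #include <cstdio>

    int main() {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        sockaddr_in dst{};
        dst.sin_family = AF_INET;
        dst.sin_port = htons(7);                          // assumed UDP echo port
        inet_pton(AF_INET, "192.168.1.1", &dst.sin_addr); // assumed router address

        char payload[32] = "latency-probe";
        char reply[64];

        for (int i = 0; i < 1000; ++i) {
            auto t0 = std::chrono::steady_clock::now();
            sendto(sock, payload, sizeof(payload), 0,
                   reinterpret_cast<sockaddr*>(&dst), sizeof(dst));
            recvfrom(sock, reply, sizeof(reply), 0, nullptr, nullptr);
            auto t1 = std::chrono::steady_clock::now();
            double us = std::chrono::duration<double, std::micro>(t1 - t0).count();
            std::printf("%.1f us\n", us);
        }
        close(sock);
    }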
Check out “Game Feel: A Game Designer's Guide to Virtual Sensation” by Steve Swink.
To some extent, responsiveness is a perception based on expectations derived from the visual and acoustical elements of a game. So on top of engine-level optimizations, there are artistic tricks that can further improve the sense of responsiveness.
Think Kiki and Bouba, and how each one would move, race and fight.
I wish they included the window compositor as something that can introduce latency because I'd like to learn more about it.
When I switched from Windows to Linux on the same hardware, I noticed a lot of keyboard input latency when playing games, at least 150 ms. This only happens to me with niri; KDE Plasma (Wayland) feels identical to Windows, and so did Hyprland. I'm able to reproduce it on multiple systems when I have a 4K display running at 1:1 native scaling. On AMD cards, turning off v-sync helped reduce it but didn't remove it. With an NVIDIA card, turning off v-sync made no difference. I believe it's semi-related to that 4K display, because when I unplug it and use my 2560x1440 monitor, the lag is much less noticeable despite getting a solid 60 FPS with both monitors. All that to say, there's certainly a lot more than your input device, GPU and display playing a role.
If anyone played Quake on a dial-up connection with client side prediction turned off, that is the exact same feeling. It's pressing a key and then seeing the screen update X ms afterwards.
Windows' solution to this is exclusive fullscreen, which bypasses the compositor.
You can try Gamescope [1] from Valve; that's what the Steam Deck uses. I think it's a compositor designed to minimize latency while supporting the few things games need. Some compositors, like KDE Plasma's KWin, support a direct scanout mode, which is the same idea as Windows' exclusive fullscreen. You might need to look for something similar in niri.
Thanks, I have tried gamescope but it kills the performance of games for me. All games have a lot of stuttering when I use it. It also didn't reduce the input latency. Same hardware is liquid smooth on Windows.
As far as I know, niri enables direct scanout by default. It's an option you can disable if you want: https://niri-wm.github.io/niri/Configuration%3A-Debug-Option.... I don't have that set, which indicates direct scanout is enabled.
It's interesting because the latency is only when pressing keys on the keyboard. Mouse movement and button press latency feels as good as Windows, I can't perceive any delay. I tried 3 keyboards, it's all the same. I'm also not running anything like keyd or anything that intercepts keys. It's a vanilla Arch Linux system on both of the systems I tested.
You don't need exclusive fullscreen on Windows to bypass the compositor. Fullscreen borderless windows also bypass the compositor. And in newer Windows versions the compositor can be bypassed even in regular windows using hardware overlays.
Windows's desktop compositor DWM is actually very advanced, and I don't believe any Linux desktop compositor is anywhere close. It's one of the things I miss when leaving Windows.
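For context, the app-side prerequisite for that overlay/independent-flip path is a flip-model swapchain. A rough sketch of the relevant DXGI description (device and window creation omitted; the width/height/format values are placeholders):

    // App-side prerequisite for DWM's "independent flip" / hardware-overlay
    // promotion: a flip-model swapchain rather than the legacy blt model.
    #include <dxgi1_5.h>

    DXGI_SWAP_CHAIN_DESC1 MakeFlipModelDesc(UINT width, UINT height) {
        DXGI_SWAP_CHAIN_DESC1 desc = {};
        desc.Width = width;
        desc.Height = height;
        desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count = 1;                        // flip model forbids MSAA swapchains
        desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
        desc.BufferCount = 2;                             // flip model needs at least 2 buffers
        desc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_DISCARD;  // flip model, not the legacy blt model
        desc.Flags = DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING;  // allows Present(0, DXGI_PRESENT_ALLOW_TEARING)
        // Pass this to IDXGIFactory2::CreateSwapChainForHwnd(...) as usual.
        return desc;
    }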
I remember one time I was playing Fortnite 1v1s with a friend and just kept losing. Something felt severely off; my game inputs felt terrible. I couldn’t quite put my finger on what the issue was until I lost a couple rounds in a row. Turns out I had forgotten to set my refresh rate to 160 Hz, as I had just fresh-installed Windows so I could play with my friend. After that I genuinely won dang near every round. It is absolutely insane how, when you switch from 60 Hz to 120+, you don’t really notice anything, but switching back makes you feel like your device is defective.
I was never very good at FPS games, but during the pandemic I would often play with friends.
One day I loaded up a practice map in CS:GO where one of the challenges is shooting a fixed target after it turns green. If you don’t do it within 250 ms or so (nothing crazy in terms of human reaction time), you don’t score.
I was flabbergasted to see myself miss every single time. My friend even told me “dude, are you pretending? How are you so slow?”
So the next day I got a new mouse and, what do you know, I was actually responding in time and scored most of the time when the target went green. The old mouse just wasn’t registering clicks fast enough.
Of course, that didn’t translate into such a huge boost in actual gameplay, but it’s impressive how that made me consistently miss. Likely it had some crazy 50ms+ lag.
It always seemed a bit weird to me, because both the game's FPS and the monitor's refresh rate matter. Currently I only have 60 Hz monitors (four 1080p ones, mostly a work setup with some gaming), and when playing War Thunder there's a perceptible difference between the game being capped to 60 FPS (with RTSS or just running with VSync on) and letting it run at 120 FPS or more. The monitor doesn't even support VRR.
Even if your monitor is running at 60 Hz, you still benefit from a higher frame rate in-game, because the information you get from the screen is more recent.
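A very rough back-of-the-envelope model of why (the numbers and the model are simplifications, not measurements): if input is sampled at the start of each game frame, the newest completed frame available at a 60 Hz refresh carries older input when the game itself runs slower.

    // Simplified model: input sampled at frame start, frames complete at a
    // steady rate, and the display picks up the newest completed frame at each
    // refresh. Ignores render-ahead queues, scanout time and tearing.
    #include <cstdio>

    int main() {
        const double fps_values[] = {60.0, 120.0, 240.0};
        for (double fps : fps_values) {
            double frame_time_ms = 1000.0 / fps;
            // ~render time plus half a frame of waiting since the frame completed
            double avg_input_age_ms = 1.5 * frame_time_ms;
            std::printf("%5.0f fps: frame time %.2f ms, ~input age at refresh %.2f ms\n",
                        fps, frame_time_ms, avg_input_age_ms);
        }
    }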
Back when I got the first 120Hz monitor I used my old 60Hz LCD as a secondary monitor. Moving the mouse cursor across felt like it went from free and fluent to moving through molasses.
As you say, I hadn't noticed anything when I just had the 60Hz monitor.
Likewise with resolution. I briefly maintained a dual-monitor setup: one was a vertical 1080p 27-inch monitor and the other was 4K. The pixel density of the 1080p monitor was genuinely Paleolithic in comparison, so I couldn’t even use it because the difference was so obvious. But when it was my primary monitor it never occurred to me.
Input lag is one of those things you feel before you can explain it. Good to finally have a resource that breaks down the full chain — controller, engine, display — instead of just blaming the monitor like everyone does
The engine section is the part most developers seem to ignore. A locked 60 FPS doesn't mean 16 ms of latency, and the size of that gap surprised me.
I used to get into arguments all the time about how triple-buffering reduces latency, and I think it's because we lacked resources like this; people assume it adds the additional back buffer to a queue, when the traditional implementation "renders ahead" and swaps the most recently-completed back buffer. It's a subtle difference but significantly reduces the worst-case latency vs. a simple queue.
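This is essentially what Vulkan's mailbox present mode formalizes. A minimal sketch of preferring it when the driver offers it, falling back to the always-available FIFO mode (instance, device and surface creation omitted; ChoosePresentMode is just an illustrative helper name):

    // Prefer MAILBOX ("replace the pending image with the newest one") when
    // available, otherwise fall back to FIFO, which every Vulkan implementation
    // must support.
    #include <vulkan/vulkan.h>
    #include <vector>

    VkPresentModeKHR ChoosePresentMode(VkPhysicalDevice gpu, VkSurfaceKHR surface) {
        uint32_t count = 0;
        vkGetPhysicalDeviceSurfacePresentModesKHR(gpu, surface, &count, nullptr);
        std::vector<VkPresentModeKHR> modes(count);
        vkGetPhysicalDeviceSurfacePresentModesKHR(gpu, surface, &count, modes.data());

        for (VkPresentModeKHR mode : modes)
            if (mode == VK_PRESENT_MODE_MAILBOX_KHR)
                return mode;                    // lowest latency without tearing
        return VK_PRESENT_MODE_FIFO_KHR;        // guaranteed available, classic vsync queue
    }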
I think most people get their information from help blurbs in settings menus for PC games, which are often hilariously vague or incorrect.
1. It doesn’t help that on Windows, the “Triple buffering” option actually means forced three-frame FIFO buffering. So people have pre-established PTSD from that dreadfully laggy smoothing.
2. Triple buffering does not reduce latency compared to unsynced tearing. It’s a spatial vs. temporal tradeoff: whether to let frequency mismatches manifest as tearing or as jitter. For passive consumption of motion, losing temporal consistency in exchange for spatial cohesion is the better tradeoff, so triple buffering is appropriate. For active control of motion and its feedback, temporal consistency is absolutely critical, whereas spatial cohesion while in motion is far, far less important, so triple buffering is unacceptable in this use case.
I should have been more clear and contrasted with double-buffering, thanks.
It does increase input lag in the same manner VSync does: there is a wait before the information is sent to the screen, to avoid tearing.
If you want to minimize latency, you always want the most recent information available, which vsync or buffering does not provide. You trade that for tearing with those schemes.
Vulkan's presentation API makes this distinction explicit: VK_PRESENT_MODE_MAILBOX_KHR is the "replace if already queued" mode that actually reduces latency, while VK_PRESENT_MODE_FIFO_KHR is the pipeline-queue variant that adds frames ahead of time. OpenGL never standardized the difference,
so "triple buffering" meant whatever the driver implemented -- usually vendor-specific extension behavior that varied between hardware. The naming confusion outlived OpenGL's dominance because the concepts got established before any cross-platform API gave them precise semantics.I sometimes use an old laptop with Fedora, with full disk encryption. When booting up, there is a text input box for the password to unlock the disk. When I hit enter, the screen content "immediately" changes (continues booting). It often feels like I hit enter before I typed the last letter of my password, but then I see that it apparently worked. The latency of that input box is just so much lower than usual, that it confuses me.
I’ve noticed that too. I have kind of a wonky password that’s hard to input correctly, due to having to hold Shift for 2 characters in a row in the middle of the password. I’ll enter the password convinced it was wrong, just to be let in. It often feels like I clip the last letter too and still get let in. Fun.
One area of focus missing here is game streaming / remote play (Steam Link, Moonlight, etc. over a local network).
I've come to accept input lag, but mostly play games where it doesn't matter (simple platformers, turn-based games, etc.). I know Steam Link from my home desktop to my ~5-year-old smart TV is adding latency to my inputs, though I can't tell if it's from my router, desktop, or TV, but I've come to accept it for the convenience of playing on the couch (usually with someone watching next to me).
I know some blame is on the TV, as often if I just hard-reset the worst of the lag spikes go away (clearly some background task is hogging CPU). And sometimes the sound system glitches and repeats the same tone until I reset that. Still worth putting up with for the couch.
Build an SFF PC and have it by the TV :)
Ten years ago, I did some experiments building an "input lag measurement device" for touch screen devices with an Arduino. https://github.com/graup/feedback-delay-measurement The research was related to QoE in cloud gaming. At that time, some Android devices had insane lag (125 ms+), whereas Apple devices were already consistently around 30ms. I assume this isn't a problem anymore nowadays.
Much of this is not relevant to how modern games work. It is talking about DirectX 9/10, both 20+ year old APIs that are not used anymore.
Yeah, being into FPS gaming for quite a while when I was younger made me quite picky about my peripherals; it's astonishing to me how much crappy stuff average people tolerate.
Somebody should measure keyboard/mouse lag for various web site/browser/operating system combinations. That would be useful. There's probably a startup in doing that as a metric.
This would be easier to do now that LLMs can learn to navigate web sites. Less custom code.
Also useful - measure it for point of sale systems.
'Input lag' should really be called 'Output lag', as most of it usually comes from the display device and/or graphics pipeline, not input devices
I once heard a rumor going around the cs:go world that Linux had lower input latency by default than windows, which made it easier to bunny hop. Is there even a semblance of truth to this?
FWIW, there's also been quite a lot of research on latency in academia, which that page seems to completely ignore.
My group has been looking into that topic, too¹. One of our most interesting findings (IMHO) was that for many USB devices, input latency does not follow a normal distribution but that each device has its own distribution of latencies for input events, including funny gaps².
However, with gaming hardware supporting 1000+ Hz polling, the latency added by polling should be negligible nowadays.
¹) https://hci.ur.de/projects/latency
²) https://epub.uni-regensburg.de/40182/1/On_the_Latency_of_USB...
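To put the polling-rate point in rough numbers (the rates below are just examples):

    // Extra delay from USB polling alone: on average an input event waits half
    // a polling interval before the host sees it; worst case is a full interval.
    #include <cstdio>

    int main() {
        const double rates_hz[] = {125, 500, 1000, 8000};
        for (double hz : rates_hz) {
            double avg_ms = 0.5 * 1000.0 / hz;   // mean wait = half the interval
            double max_ms = 1000.0 / hz;         // worst case = a full interval
            std::printf("%6.0f Hz polling: avg %.2f ms, worst %.2f ms\n",
                        hz, avg_ms, max_ms);
        }
    }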
> quite a lot of research on latency in academia
I recall reading about a study years ago showing that while reaction times between a stimulus and, say, moving a finger are limited to around 150 ms, the participants could consistently time movements with an accuracy of less than 10 ms or so (I forget the exact number).
Which I assume explains why consistent input lag is much better than variable input lag.
As a musician, it’s amazingly noticeable if there’s even a 10ms delay between hitting the key and hearing the sound, or seeing the note displayed on the screen - once I really got into digital music production and recording, it really helped to set reasonable expectations for lag in gaming, between peripherals and response rate in displays and audio lag in headphones. I still (somewhat more superstitiously at this point) default to wired devices to this day, mostly out of concern for lag.
quite a few syntactical errors on this website. I’d suggest running it through an LLM and telling it to fix the mistakes without altering anything else!
Good to know it's human written
If this were a reliable indicator, then it would no longer be one.