Love your post. So, don’t take this as disagreement.
I’m always a little bewildered by frame rate discussions. Yes, I understand that more is better, but for non-gaming apps (e.g. “productivity” apps), do we really need much more than 60 Hz? Yes, you can get smoother fast scrolling with higher frame rate at 120 Hz or more, but how many people were complaining about that over the last decade?
I enjoy working on my computer more at 144Hz than 60Hz. Even on my phone, the switch from 60Hz to a higher frame rate is quite obvious. It makes the entire system feel more responsive and less glitchy. VRR also helps a lot in cases where the system is under load.
60Hz is actually a downgrade from what people were used to. Sure, games and such struggled to get that kind of performance, but CRT screens did 75Hz/85Hz/100Hz quite well (perhaps at lower resolutions, because full-res 1200p sometimes made text difficult to read on a 21 inch CRT, with little benefit from the added smoothness as CRTs have a natural fuzzy edge around their straight lines anyway).
There's nothing about programming or word processing that requires more than maybe 5 or 6 fps (very few people type more than 300 characters per minute anyway), but I feel much better working on a 60 fps screen than on a 30 fps one.
Everyone has different preferences, though. You can extend your laptop's battery life by quite a bit by reducing the refresh rate to 30Hz. If you're someone who doesn't really mind the frame rate of their computer, it may be worth trying!
It isn't equivalent, in the sense that the progressive scanout on CRTs resulted in near-zero latency with minimal image persistence, versus flat panels, which refresh globally, adding latency and worsening motion clarity. So it isn't really a "but", it's a "made even better by being rendered only one pixel/dot at a time".
Motion clarity yes, but it's zero latency in the least useful way possible, only true when you're rendering the top and bottom of the screen at different points in time. And scanout like that isn't unique to CRTs, many flat panels can do it too.
When rendering a full frame at once and then displaying it, a modern screen is not only able to be more consistent in timing, it might be able to display the full frame faster than a CRT. Let's say 60Hz, and the frame is rendered just in time to start displaying. A CRT will take 16 milliseconds to do scanout. But if you get a screen that supports Quick Frame Transport, it might send over the frame data in only 3 milliseconds, and have the entire thing displayed by millisecond 4.
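As a rough back-of-the-envelope sketch of those numbers (the 240 Hz-equivalent link timing is an assumption for illustration; actual QFT timings depend on the display and the link):

```c
#include <stdio.h>

int main(void) {
    /* A 60 Hz CRT scans the frame out over essentially the whole refresh
       period, so the bottom of the screen is drawn ~16.7 ms after the top. */
    double refresh_hz  = 60.0;
    double crt_scan_ms = 1000.0 / refresh_hz;

    /* With Quick Frame Transport the frame is sent at a much faster link
       timing (assumed here to be equivalent to a 240 Hz scanout), and the
       rest of the 60 Hz period is idle. */
    double qft_link_hz = 240.0;              /* assumed transport timing */
    double qft_xfer_ms = 1000.0 / qft_link_hz;

    printf("CRT scanout at %.0f Hz: %.1f ms top-to-bottom\n", refresh_hz, crt_scan_ms);
    printf("QFT transfer: %.1f ms, then %.1f ms idle until the next frame\n",
           qft_xfer_ms, crt_scan_ms - qft_xfer_ms);
    return 0;
}
```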
I never complained about 60, then I went to 144 and 60 feels painful now. The latency is noticeable in every interaction, not just gaming. It's immediately evident - the computer just feels more responsive, like you're in complete control.
Even phones have moved in this direction, and it's immediately noticeable when using one for the first time.
I'm now on 240Hz and the effect is greatly diminished, especially outside of gaming. But even then I notice it, although stepping down to 144 isn't the worst. 60, though, feels like ice on your teeth.
Did you use the same computer at both 60 and 144? I have no doubt that 144 feels smoother for scrolling and things like that. It definitely should. But if you upgraded your system at the same time you upgraded your display, much of the responsiveness would be due to a faster system.
I have a projector that can project 4K at 60Hz or 1080p at 240Hz, and I can really notice it just by moving the cursor around. I don't need to render my games anywhere near 240 fps to notice it either. Same with phones - moving from a Pixel 3 to a Pixel 5, scrolling through settings or the home screen was a palpable difference. The Pixel 3 now feels broken. It isn't; it just renders at 60 fps instead of 90.
Yes, same system, then again at 240Hz. Realistically, I think just about any modern GPU can composite at 240 fps, although I see what you mean if I had done an SSD upgrade or something, but I didn't.
> how many people were complaining about that over the last decade?
Quite a few. These articles tend to make the rounds when it comes up: https://danluu.com/input-lag/ and https://lwn.net/Articles/751763/ Perception varies from person to person, but going from my 144Hz monitor to my old 60Hz work laptop is so noticeable to me that I switched it from a composited Wayland DE to an X11 WM.
Input lag is not the same as refresh rate. 60 Hz is 16.7 ms per frame. If it takes a long time for input to appear on screen it’s because of the layers and layers of bloat we have in our UI systems.
Refresh rate directly affects one of the components of total input lag, and increasing refresh rate is one of the most straightforward ways for an end user to chip away at that input lag problem.
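As a rough illustration of just that one component (this assumes an input lands at a random point in the refresh interval and then waits for the next refresh; it ignores the application, compositor scheduling, and the panel's own response time):

```c
#include <stdio.h>

int main(void) {
    /* The display-refresh share of input lag: an event that arrives at a
       random moment waits, on average, half a frame for the next refresh,
       and up to a full frame in the worst case. */
    double rates_hz[] = {60.0, 120.0, 144.0, 240.0};
    for (int i = 0; i < 4; i++) {
        double frame_ms = 1000.0 / rates_hz[i];
        printf("%6.0f Hz: frame %6.2f ms, average wait %5.2f ms, worst case %6.2f ms\n",
               rates_hz[i], frame_ms, frame_ms / 2.0, frame_ms);
    }
    return 0;
}
```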
If our mouse cursors are going to have half a frame of latency, I guess we will need 60Hz or 120Hz desktops, or whatever.
I dunno. It does seem a bit odd, because who was thinking about the framerates of, like, desktops running productivity software, for the last couple decades? I guess I assumed this would never be a problem.
Mouse cursor latency and window compositing latency are two separate things. I probably did not do a good enough job conveying this. In a typical Linux setup, the mouse cursor gets its own DRM plane, so it will be rendered on top of the desktop during scanout right as the video output goes to the screen.
There are two things that typically impact mouse cursor latency, especially with regards to Wayland:
- Software rendering, which is sometimes used if hardware cursors are unavailable or buggy for driver/GPU reasons. In this case the cursor will be rendered onto the composited desktop frame and thus suffer compositor latency, which is tied to refresh rate.
- Atomic DRM commits. With atomic DRM commits, even hardware-rendered cursors can suffer additional latency. In this case, the added latency is not necessarily tied to frame times or refresh rates. Instead, it's tied to when during the refresh cycle the atomic commit is sent; specifically, how close to the deadline. I think in most cases we're talking a couple milliseconds of latency. It has been measured before, but I cannot find the source.
Wayland compositors tend to use atomic DRM commits, hence a slightly more laggy mouse cursor. I honestly couldn't tell you if there is a specific reason why they must use atomic DRM, because I don't have knowledge that runs that deep, only that they seem to.
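To make the two cursor paths concrete, here is a minimal libdrm sketch. The fd, CRTC, cursor plane, and property IDs are placeholders that would normally come from enumerating DRM resources; this only illustrates the two ioctl paths, not how any particular compositor is implemented:

```c
#include <stdint.h>
#include <stddef.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Placeholders: these would come from opening the DRM device and walking
   its resources, planes, and property lists. */
extern int      drm_fd;
extern uint32_t crtc_id;
extern uint32_t cursor_plane_id;
extern uint32_t prop_crtc_x, prop_crtc_y;  /* "CRTC_X"/"CRTC_Y" on the cursor plane */

/* Legacy hardware cursor: the position is updated immediately and picked up
   by the next scanout, independent of whatever frame the compositor is
   currently rendering. */
void move_cursor_legacy(int x, int y)
{
    drmModeMoveCursor(drm_fd, crtc_id, x, y);
}

/* Atomic path: the cursor position is just another plane property, latched
   together with the rest of the commit, so it can pick up extra latency
   depending on how close to the vblank deadline the commit lands. */
void move_cursor_atomic(int x, int y)
{
    drmModeAtomicReq *req = drmModeAtomicAlloc();
    drmModeAtomicAddProperty(req, cursor_plane_id, prop_crtc_x, (uint64_t)x);
    drmModeAtomicAddProperty(req, cursor_plane_id, prop_crtc_y, (uint64_t)y);
    drmModeAtomicCommit(drm_fd, req, DRM_MODE_ATOMIC_NONBLOCK, NULL);
    drmModeAtomicFree(req);
}
```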
A jumpy mouse shouldn't be related to refresh rate. The mouse driver and windowing system should keep track of the mouse position regardless of the video frame rate. Yes, the mouse may jump further per frame at a lower frame rate, but that should only happen when you move the mouse a long distance quickly. Typically, when you do that, you're not looking at the mouse itself but at the target. Then, once you're near it, you slow down and use fine motor skills to move the cursor onto the target. That movement is much slower, and frame rate won't matter much because the motion is so much smaller.
Initially I wrote “input device”, but since mouse movements aren’t generally a problem, I narrowed it to “keyboard”. ;) Mouse clicks definitely fall into the same category, though.
Essentially, the only reason to go over 60 Hz for desktop is for a better "feel" and for lower latency. Compositing latency is mainly centered around frames, so the most obvious and simplest way to lower that latency is to shorten how long a frame is, hence higher frame rates.
However, I do think that high refresh rates feel very nice to use even if they are not strictly necessary. I consider it a nice luxury.