I must admit, when I got my 144hz monitor I was excited, coming from a 60hz monitor. But even if a game runs at 144 fps I don't see much of a difference. Many people do, but I don't. It's a bit smoother, but not much.

But if a game runs at 30 fps, it's horrible. The Crew, for example, can be switched between 30 and 60 fps, and that's night and day!
Yeah, 144hz makes a significant difference for competitive FPS games (especially fast paced ones like Overwatch), but I hardly notice a difference when playing single player or PvE oriented games.
Hell, on some games (e.g. Borderlands 3 and CP2077) I actually prefer to play on my 60hz monitor since a smooth 60hz is much more enjoyable IMO than an inconsistent 100-144hz experience. My computer is admittedly pretty old though.
144hz in Overwatch feels like putting glasses on for the first time. My brain can actually track movement properly.
Most other games I barely notice the difference though
You can cap the fps in software, no need to switch monitors (see the sketch below).
Also personally I always notice the difference, even when scrolling webpages
Going back to 60, I notice an extreme difference.
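On the fps cap point: here's a minimal sketch in Python of what a software frame limiter boils down to, just to illustrate the idea. It's not how any particular game or driver actually implements it, and the 60 fps target is an arbitrary assumption:

```python
import time

# Toy frame limiter: after doing the frame's work, sleep away whatever
# is left of the frame budget so the next frame never starts early.
TARGET_FPS = 60                      # assumed cap, pick whatever you like
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.67 ms per frame at 60 fps

def run_capped(frames=120):
    next_deadline = time.perf_counter()
    for _ in range(frames):
        # ... simulate and render the frame here ...
        next_deadline += FRAME_BUDGET
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)    # real limiters use more precise waits

run_capped()
```

In-game or driver-level limiters do the same thing more accurately, which is why a capped 60 can feel steadier than an uncapped 100-144.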
Yeah, the difference is very noticeable once you get used to the higher frame rate.
Yes, many do. I’m just one of the unlucky ones. But at least I can see the difference between 1080p and 4k. It’s the little things in life…
Two things are important here:

1. The faster something on screen moves, the higher your framerate needs to be for a certain level of motion blur. A 2D point and click adventure at 30fps could have comparable motion blur to a competitive shooter at 180, for example.

2. Framerate is inversely proportional to frametime, which is what makes it harder to notice a difference the higher you go. From 30 to 60? That's an improvement of 16.67ms. 60 to 120 gains 8.33ms, 120 to 240 only improves by 4.17ms, and so on.
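If you want to see those diminishing returns laid out, here's a tiny Python sketch of the arithmetic above (nothing more than 1000 / fps) that prints the frametime at each rate and the gain from each doubling:

```python
# Frametime in milliseconds is 1000 / fps, so every doubling of the
# framerate halves the frametime -- and halves the possible improvement.
rates = [30, 60, 120, 240]

for low, high in zip(rates, rates[1:]):
    ft_low = 1000 / low      # e.g. 33.33 ms at 30 fps
    ft_high = 1000 / high    # e.g. 16.67 ms at 60 fps
    print(f"{low:>3} -> {high:>3} fps: "
          f"{ft_low:.2f} ms -> {ft_high:.2f} ms "
          f"(improvement {ft_low - ft_high:.2f} ms)")
```

That prints improvements of 16.67 ms, 8.33 ms and 4.17 ms for the three steps, the same numbers as above.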
Ah, something I want to add:
That’s only explaining the visual aspect, but frametimes are also directly tied to latency.
Some people might notice the visual difference less than the latency benefit. That’s the one topic where opinions on frame generation seem to clash the most, since the interpolated frames provide smoother motion on screen, but don’t change the latency.
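To put rough numbers on that clash, here's my own simplified back-of-the-envelope in Python (the 60 fps base rate and one interpolated frame per rendered pair are assumptions, and the extra buffering delay frame generation adds is ignored):

```python
# Simplified comparison: interpolated frames make the picture smoother,
# but inputs are only reflected in the rendered frames, so responsiveness
# stays at the base framerate. (Real frame generation also adds a small
# buffering delay, which is ignored here.)
rendered_fps = 60            # assumed base framerate
interpolated_between = 1     # assumed: one generated frame per rendered pair

displayed_fps = rendered_fps * (1 + interpolated_between)
smoothness_ms = 1000 / displayed_fps     # interval between frames you *see*
responsiveness_ms = 1000 / rendered_fps  # interval between frames that *react*

print(f"Looks like {displayed_fps} fps ({smoothness_ms:.2f} ms between displayed frames)")
print(f"Feels like {rendered_fps} fps ({responsiveness_ms:.2f} ms between input updates)")
```

So the picture moves like 120 fps while the controls still respond like 60 fps, which is exactly why people who are sensitive to latency rather than smoothness are the ones who dislike it.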
It’s super dependent on the game. Baldur’s Gate 3? 30 fps is more than enough. League of Legends? Yeah, I’ll take those 144hz, tho to be honest I don’t notice a big difference compared to 60 fps.