Why 60fps feels choppy on 144hz

Introduction

As an experienced gamer and lifelong computer user, terms like FPS and Refresh Rate are second nature to me. But for those newer to gaming, it can be confusing to hear them constantly with no explanation.

If you’ve ever wondered why your games stutter, why your TV series look weird, or if you just want to learn more about the relationship between things like FPS and Refresh Rate, then you’ve come to the right place!

Why 60fps feels choppy on 144hz

If you are watching content at a frame rate that doesn’t match what your monitor natively supports, a few visual artifacts can occur that will degrade your viewing experience. One of these is called judder.

Judder is caused by inconsistencies in how long each frame stays on screen. It can happen when watching 24fps video on a 60hz TV, and it can also happen when playing a 60fps game on a 144hz monitor.
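
To see the mismatch in numbers, here’s a tiny Python sketch (the function name `repeats_per_frame` is just something I made up for illustration). When the refresh rate isn’t an even multiple of the frame rate, some frames have to stay on screen for more refreshes than others:

```python
def repeats_per_frame(fps, hz, frames=8):
    """How many display refreshes each of the first few frames occupies."""
    pattern = []
    for i in range(frames):
        start = int(i * hz / fps)        # refresh where frame i first appears
        end = int((i + 1) * hz / fps)    # refresh where the next frame takes over
        pattern.append(end - start)
    return pattern

print(repeats_per_frame(24, 60))    # [2, 3, 2, 3, ...]    -> the classic 3:2 pulldown
print(repeats_per_frame(60, 144))   # [2, 2, 3, 2, 3, ...] -> uneven holds, i.e. judder
print(repeats_per_frame(60, 120))   # [2, 2, 2, 2, ...]    -> every frame held equally
```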

To understand judder and how to fix it, we should first clarify what some of these terms mean.

Refresh Rate

Refresh Rate, also known as Vertical Scan Rate (a term from when CRTs were common), is how often a display can draw a new image, measured in hertz (hz).

Back when CRTs were common, refresh rates increased with the size of the display. Small and medium TVs and monitors ran at 60hz, and only bigger CRTs required 72hz or more to avoid visible flickering.

This changed with the introduction of modern displays like LCDs, which are limited by the panel and its electronics rather than by screen size. Common refresh rates for these monitors and TVs start at 60hz and go up to 240hz, with 60, 75, 120 and 144 being the most common.
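
If you want to picture what those numbers mean in time, each refresh rate is just a fixed slice of a second. A quick back-of-the-envelope calculation (Python here, but it’s only division):

```python
# Time between refreshes for the common refresh rates mentioned above.
for hz in (60, 75, 120, 144, 240):
    print(f"{hz}hz -> a new image every {1000 / hz:.2f} ms")

# 60hz  -> 16.67 ms    75hz  -> 13.33 ms    120hz -> 8.33 ms
# 144hz -> 6.94 ms     240hz -> 4.17 ms
```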

Frame Rate

You are probably familiar with this term. Frame Rate, commonly abbreviated as FPS, is how many frames per second are displayed.

Frame rates can vary a lot. Standard films are shot at 24 frames per second. There are a few reasons 24 was chosen, the main one being that it was a good compromise between film cost and intelligible audio.

Nowadays 24fps is still convenient for things like VFX and frame-by-frame editing in film. Imagine having to work through more than twice as many frames for 60fps video, instead of just 24!

For live TV broadcasts, 30fps is the standard. This is mostly because sports look much better at 30fps than at 24fps; it can be difficult to perceive fast motion clearly at just 24 frames per second.

Then, with the advent of computers, 60fps became a viable alternative to 30fps. Computers had no problem displaying 60 frames per second, and it was perfect for desktop work and computer graphics.

With more modern displays like 4K and high refresh rate monitors, refresh rates above 60hz became viable even for the casual consumer. It’s common for gamers to play at 90, 120 or even 144 frames per second. This matters most to professional players who want that bit of extra smoothness in their games.

Screen Tearing

Screen tearing is a visual artifact that occurs when parts of multiple frames are displayed within a single screen draw.

To put it in simpler terms, screen tearing comes from a lack of synchronization between frame rate and refresh rate. It can happen whether your fps is much lower than your monitor’s refresh rate or much higher, because in either case the game can swap in a new frame while the display is still in the middle of drawing the previous one.
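
As a rough illustration (not how any real driver works, just the geometry of it), you can estimate where the tear line lands if you assume the display draws rows from top to bottom at a constant pace and the game swaps its frame buffer partway through that draw. The function name `tear_row` and the numbers below are purely hypothetical:

```python
def tear_row(swap_time_ms, refresh_hz=144, screen_height=1080):
    """Approximate row the scanout has reached when the new frame swaps in."""
    refresh_interval_ms = 1000 / refresh_hz                       # ~6.94 ms at 144hz
    progress = (swap_time_ms % refresh_interval_ms) / refresh_interval_ms
    return int(progress * screen_height)

# A frame that swaps in 3 ms into a 144hz refresh tears roughly 43% of the
# way down a 1080p screen.
print(tear_row(3.0))   # -> 466
```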

Screen tearing can be fixed with different software or hardware solutions. V-Sync is the most common and is offered by most games and graphics drivers. More modern solutions that employ a Variable Refresh Rate (VRR) are also available, like G-Sync from Nvidia or FreeSync from AMD.

Judder

This is a screen artifact that can occur both on monitors and TVs, but it’s a bit more common on TV sets than on monitors.

Judder occurs when your monitor’s refresh rate is not an even multiple of your game’s frame rate. In this case, 144hz does not divide evenly by 60fps (144 / 60 = 2.4), whereas a 120hz monitor would handle it perfectly (120 / 60 = 2).

When this happens, some frames are shown for longer than others, and that uneven pacing is what causes the choppiness in the image.
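
Put into milliseconds (a minimal sketch, assuming a perfectly steady 60fps feed), the mismatch looks like this:

```python
refresh_144 = 1000 / 144   # ~6.94 ms per refresh on a 144hz monitor
refresh_120 = 1000 / 120   # ~8.33 ms per refresh on a 120hz monitor

# On 144hz a 60fps frame is held for either 2 or 3 refreshes:
print(2 * refresh_144, 3 * refresh_144)   # ~13.89 ms vs ~20.83 ms -> uneven pacing
# On 120hz every 60fps frame is held for exactly 2 refreshes:
print(2 * refresh_120)                    # ~16.67 ms every time -> smooth pacing
```

That back-and-forth between roughly 14 ms and 21 ms holds is exactly the choppiness described above.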

One way to solve this is to use a variable refresh rate solution like G-Sync or FreeSync. Alternatively, you can change your monitor’s refresh rate for a specific game to a number that divides evenly by your desired fps. Thankfully, monitors do allow you to do this.

Alright, so does this mean that 60fps looks worse at 144hz than at 60hz?

No, not really, and if it does, something is misconfigured. You would need to use a modern variable refresh rate implementation like G-Sync or FreeSync, or old-school V-Sync, to fix it. Otherwise, the only other way to fix it is to run the monitor at a lower refresh rate that evenly matches the content you are consuming.

Make sure to configure everything properly when you install a new game, and don’t be afraid to google every single thing you don’t understand!

Do you need 120hz to see 120fps?

Yes. If you’ve been reading everything in this article, you already know that the only way to properly see high-fps content is with a high refresh rate monitor or TV.

So, if you wanted to play Counter-Strike: GO at 120fps, you would need a 120hz monitor to experience it properly. Otherwise, 120fps will look exactly the same as 60fps on a standard 60hz monitor!

Remember, while you can play something at 30fps on a 144hz monitor (to increase graphical fidelity, for example), you can’t really play a game at 120fps on a 60hz monitor: that monitor can only display 60 frames per second, so every other frame will be dropped.
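
A quick sketch of that last point, assuming the simplest possible behaviour (the display just shows whichever frame is current at each refresh):

```python
fps, hz = 120, 60
frames_rendered = list(range(fps))                       # frames 0..119 in one second
frames_shown = [int(i * fps / hz) for i in range(hz)]    # one frame per 60hz refresh

print(frames_shown[:6])                                  # [0, 2, 4, 6, 8, 10]
print(f"{len(frames_shown)} of {len(frames_rendered)} rendered frames reach the screen")
```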

Does 120fps look better than 60fps?

Yes, in general 120fps will look much smoother than 60fps. But the difference can be minimal depending on the type of content.

For example, if you’re playing Civilization 6 at 120fps, you won’t really see much of a difference.

Strategy games don’t really have a lot of fast movement. Meanwhile, e-sports games, which are highly competitive and require serious precision, will look much better at higher frame rates.

Is it better to lock fps or not?

This is actually a great question. Another common thing people do is limit their GPU’s output to their monitor’s maximum refresh rate. For example, if their monitor is 144hz, they cap the fps at 144 in-game or through their graphics driver.

The good thing about this method is that you don’t suffer from V-Sync’s usual input lag, and your GPU doesn’t overheat from drawing 300fps in an older game. The downside is that screen tearing can still occur, just as if you hadn’t limited anything.

Locking fps is something you should try first. If you don’t see any tearing, great; if you do, that’s when you should consider using V-Sync, preferably a modern implementation (G-Sync or FreeSync, for example).
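
For the curious, an fps limiter is conceptually very simple. Here’s a bare-bones sketch in Python (real limiters in games and drivers are far more precise, and `render_frame` is just a stand-in for whatever work your game does each frame):

```python
import time

def run_capped(render_frame, cap_fps=144, frames=1000):
    """Render frames, sleeping away whatever is left of each frame's time budget."""
    frame_budget = 1.0 / cap_fps                  # ~6.94 ms per frame at a 144 cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                            # do the actual rendering work
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)    # wait out the rest of the budget

# Example: "render" nothing at all, capped to 144 fps for about one second.
run_capped(lambda: None, cap_fps=144, frames=144)
```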

Do I need a modern 144hz monitor to be good at games?

No, not really. Pretty much every single game out there was developed with 60fps as the standard, and many have been played this way for decades.

As we mentioned in the Frame Rate section, 60fps has been the standard for a very long time. If you take a look at something like the Steam Hardware Survey, you will see that the majority of people still play at 1080p/60fps.

To get into the top 1-5% you need much more than a good monitor and a powerful computer. Most e-sports games are easy to play even with 5- to 10-year-old hardware.

Is 60fps good for LOL?

Yes, 60fps is more than good enough for League of Legends. Even though LOL is one of the more fast-paced MOBAs out there, it’s still perfectly playable at 60fps.

I always like to tell this story. Back in the day, Quas (a retired player who was on Team Curse and Team Liquid around 2013-2015) got to Challenger with almost 300ms of ping. How did he do it? By playing champions like Swain and Vayne, with no skillshots!

There will always be things that limit our potential, but 60fps is not one of them. I personally knew a lot of people who played LOL at 30fps and constantly dominated their matches.
