I’ve been a PC gamer for nearly thirty years, but I’ve never seriously considered gaming on a television until quite recently. Most of this, I think, is due to growing up in an era when TVs were flatly incompatible with PC outputs and vastly inferior to monitors in terms of resolution and overall image quality. Even before the advent of LCDs and flat-panel monitors, CRT-based computer displays outstripped the resolution of your average television.
Modern televisions are vastly more capable than the CRT-based sets of 20 years ago. Like PCs, they use HDMI, support the same resolutions (720p, 1080p, and 4K), and are often advertised as supporting refresh rates of 120-240Hz. While there are a handful of >60Hz monitors available on the market, these often command substantial premiums compared to regular old 60Hz displays. In theory, a 4K TV could make a great gaming display. The reality is considerably more complicated.
First, there’s the issue of input lag. Input lag is the delay between pressing a button on a controller, mouse, or keyboard and the moment the result of that action appears on-screen. While it’s often discussed as an issue that impacts console gaming, anyone considering an HDTV for PC gaming will have to contend with it as well. One of the problems with HDTV gaming is that input lag on even the best TVs is still higher than on top monitors. The fastest monitors add 9-10ms of input latency, while the best HDTVs are around 17-18ms.
To be clear, 17-18ms isn’t bad at all (it rates as “Excellent” on DisplayLag.com’s official ranking system), and if you aren’t playing high-speed FPS or RTS titles, you might not notice higher input lag at all. Civilization doesn’t exactly rely on fast-twitch gaming, after all. Plenty of TVs, however, don’t even clear the 40ms bar that DisplayLag qualifies as “Great.” Input lag can sometimes be improved by adjusting settings within the TV’s various submenus (often a “Game Mode”), but this varies by model and manufacturer. The vast majority of manufacturers don’t list or provide input lag information — it’s something you typically have to check at third-party sites like DisplayLag.
Next, there’s the issue of overscan. Overscan refers to the practice of not displaying the entire image on the actual screen. It’s a holdover from the pre-LCD era, when there was no way to guarantee that every television set would display content in precisely the same fashion. The solution was to zoom the final output slightly, pushing a small border of the image off the edges of the screen. Modern LCDs don’t have much use for overscan, but it’s still enabled by default on many displays. Whether or not you can disable it depends on what kind of TV you have — some older LCDs may not offer the option to disable overscan at all. Graphics cards from AMD and Nvidia can compensate for overscan in software by shrinking the output image, but this may result in less-than-ideal text and font rendering.
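That font-rendering penalty follows directly from the math of underscan compensation: the GPU rescales the full frame into a smaller region, so the image is no longer mapped 1:1 to the panel’s pixels. A minimal sketch of that arithmetic, assuming a hypothetical 5% underscan setting (real driver sliders vary):

```python
def underscanned_resolution(width, height, underscan_pct):
    """Effective pixel area after shrinking the image by underscan_pct
    on each axis so it fits inside the TV's overscan crop."""
    scale = 1 - underscan_pct / 100
    return round(width * scale), round(height * scale)

# A 1080p desktop squeezed by a 5% underscan no longer maps 1:1 to pixels.
w, h = underscanned_resolution(1920, 1080, 5)
print(w, h)  # 1824 1026
```

Because 1824x1026 doesn’t divide evenly into 1920x1080, every pixel of text gets resampled across fractional pixel boundaries — which is why disabling overscan on the TV itself is always preferable to compensating in the driver.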
Finally, while there are televisions that can actually accept and display a true 120Hz input, many advertised “120Hz” or “240Hz” figures refer to motion interpolation rather than the panel’s real input refresh rate, and this varies by manufacturer. This article from CNET explains the rules of thumb for a number of companies and how to determine exactly what the refresh rate is.
This last point is aspirational, but if you’ve spent any time with a monitor that supports Adaptive Sync (that’s the official VESA name of what AMD calls FreeSync) or G-Sync, you’re aware of how awesome the feature is for gaming, even when you’re playing at 60 FPS. The lower the frame rate, the more FreeSync / Adaptive Sync helps, since ensuring smooth frame delivery matters more the longer the gap between each frame. For example, a 30 FPS title delivers one frame every 33.3ms, while a 60 FPS title delivers one frame every 16.7ms, assuming perfectly consistent frame times.
I mention G-Sync since that’s Nvidia’s version of the same technology, though the company has never announced any interest in working with TV manufacturers to bring a G-Sync compatible panel to market. The AMD-backed VESA standard, Adaptive Sync, could theoretically be supported in future panels — but only if it’s added to the HDMI specification. Right now, the latest version of HDMI, 2.0, doesn’t support Adaptive Sync. HDMI 2.1 is still in the planning phase, which means TV sets that use this standard are still a few years away, best-case.
While supporting Adaptive Sync in HDMI 2.1 wouldn’t solve input lag or overscan issues, it would improve the overall gaming experience on HDTVs that also addressed these problems. Both PCs and consoles would benefit from the feature, and it might even be possible to activate on current consoles depending on exactly how Adaptive Sync was implemented and whether or not the GPUs inside the Xbox One and PlayStation 4 support it.
If you want to game on a large-screen TV, you’ll need to plan your purchase carefully, and we recommend Googling specific model numbers of displays you’re considering to see how others are getting on with the same hardware. Hopefully in the near future we’ll see HDTVs adopting standards like Adaptive Sync / FreeSync, as well as some manufacturers explicitly courting the PC gaming crowd. Given that we’re already seeing HDR support show up in early TVs and monitors, it’d be nice to see more cross-pollination between feature sets.