How to Buy a TV: What You Need to Know
Buying a TV is more complex than ever. There are new technologies, formats, and buzzwords to keep up with, and pricing is all over the place as budget-friendly companies try to unseat brands like LG and Samsung.
And if you’re looking for a TV specifically for gaming, different features are more important. We’re breaking down everything you need to know to make the wisest purchase!
Choosing a Display: OLED, QLED, and More
Currently, there are two dominant display technologies on the market: LED-LCD (including QLED) and OLED.
Understanding the differences will help you make the right decision. A simple rule of thumb is to match the display type to your viewing habits.
Most TVs on the market are LCD panels lit with an LED backlight. These include the cheapest new TVs, from brands like TCL and Hisense, all the way to LG’s NanoCell lineup and Samsung’s top-tier QLED sets.
Not all LED-lit panels are equal, however. Panels advertised as QLED use a Quantum Dot layer that improves the range and vibrancy of colors on the display. Of all the LCD panels on the market, QLED is as good as it gets.
The one drawback of panels that use traditional LED lighting is that they’re backlit: to display an image, a bright LED must shine through the many layers that make up the panel. This can result in poor black reproduction and potential light bleeding around the edges of the display.
The latest (and best) LED models use full-array local dimming (FALD) to dim select areas of the screen and improve black reproduction. This helps LCD panels get much closer to a “true” black. Since the dimming zones can be quite large, the technology isn’t perfect. This process often produces a “halo” effect around the edges of the dimming zones.
OLED is a completely different technology from QLED. These panels are self-emissive, which means each pixel produces its own light. There’s no LCD film, and no backlight shining through the “stack” of layers that make up the display. In fact, an OLED stack is incredibly thin.
This means OLED screens have “perfect” blacks because they can turn off pixels entirely. The result is a striking image with excellent contrast. On the other hand, OLED displays can suffer from poor near-black performance. Some models are prone to “black crush,” in which dark shadow details are lost.
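If you want to put rough numbers on that, contrast ratio is simply peak brightness divided by black level. Here’s a quick Python sketch using made-up (but plausible) figures rather than measurements from any particular model:

# Contrast-ratio comparison using illustrative (assumed) example values.
def contrast_ratio(peak_nits, black_nits):
    if black_nits == 0:
        return float("inf")  # pixels switched fully off: effectively unlimited contrast
    return peak_nits / black_nits

print(f"FALD LCD: {contrast_ratio(1000, 0.05):,.0f}:1")  # ~20,000:1 with assumed figures
print(f"OLED:     {contrast_ratio(700, 0.0)}:1")         # inf:1, since black pixels emit nothing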
OLEDs are also susceptible to burn-in under certain conditions.
OLED panels also tend to be a bit more expensive than traditional LED-lit screens because the technology is newer and costs more to manufacture. That said, LG’s flagship OLEDs, like the C9 and CX, usually fall within the same price bracket as Samsung’s flagship QLED displays.
But there’s also an outlier: Mini-LED. These panels still use traditional LCD technology, but with smaller LEDs. This means they can pack in many more dimming zones. The result is a far less pronounced halo effect and the same deep, inky blacks you might see on an OLED.
While Mini-LED TVs provide a great balance between price and image quality, they’re thin on the ground at the moment. TCL is currently the only company selling Mini-LED models on the U.S. market, although more are expected to land from Samsung and others in the near future.
RELATED: How to Buy a TV for Gaming in 2020
Brightness and Viewing Angles
Matching your display technology to your watching environment and habits is important. Since LCD (including QLED) sets use an LED backlight, they can get a lot brighter than OLED models. This is because OLEDs use organic compounds, the brightness of which is limited by heat output.
A QLED set might get twice as bright as an OLED, which makes it perfect for viewing in a very bright room. Conversely, if you enjoy watching movies in the dark or mostly at night, the superior black levels of an OLED will yield a better image. If you hate washed-out blacks, OLED is the way to go.
OLED displays also have excellent viewing angles, which makes them ideal for group watching. While some color-shifting can occur when watching off-axis, the image won’t dim substantially, even at extreme angles. This makes an OLED a great choice if not everyone in the room will be facing the screen directly.
Different LCD models use different coatings and panel types in an attempt to get around this. For example, LG’s NanoCells use IPS panels, which have excellent viewing angles, but poor contrast ratios.
On the other hand, VA panels, like those in Samsung’s QLEDs, suffer from poor off-axis viewing angles, but have the best contrast ratios and color reproduction.
If you have a large family or enjoy having friends over to watch sports or movies, make sure you consider viewing angles and the ambient light in the room before selecting a TV.
RELATED: TN vs. IPS vs. VA: What's the Best Display Panel Technology?
High Dynamic Range: The Future of Video
High Dynamic Range (HDR) is a leap forward in display technology. Dynamic range is the visible spectrum between the darkest blacks and the lightest lights, and it’s usually measured in stops. While a traditional standard dynamic range (SDR) TV has a range of about six stops, the latest HDR displays can exceed 20.
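Because each stop represents a doubling of light, the gap between six and 20 stops is enormous. A quick back-of-the-envelope calculation, using the figures quoted above:

# Each stop doubles the amount of light, so contrast grows as 2 ** stops.
sdr_stops, hdr_stops = 6, 20
print(f"SDR: roughly {2 ** sdr_stops:,}:1 contrast")  # 64:1
print(f"HDR: roughly {2 ** hdr_stops:,}:1 contrast")  # 1,048,576:1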
This means you get more details in the shadows and highlights, which makes for a richer image. HDR also incorporates a wider color gamut and much higher peak brightness. You’ll see more shades of colors, which results in less “banding” or grouping together of similar colors. You’ll also see flashes of brightness from objects like the sun, which creates a more realistic presentation.
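Banding comes down largely to bit depth: SDR content is typically mastered in 8-bit color, while HDR formats call for 10-bit or more. Here’s a small sketch of why those extra bits matter:

# Shades available per color channel at a given bit depth.
for bits in (8, 10, 12):
    shades = 2 ** bits
    print(f"{bits}-bit: {shades:,} shades per channel, {shades ** 3:,} colors in total")
# 8-bit:  256 shades per channel (typical SDR)
# 10-bit: 1,024 shades per channel (HDR10/HDR10+); Dolby Vision supports up to 12-bit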
HDR is a big deal, as most new movies and TV shows take advantage of it. Next-generation game consoles (like the Xbox Series X and S, and PlayStation 5) also place a heavy emphasis on HDR, although last-gen systems have been using it for years. If you watch a lot of movies or play games, you’ll want good HDR support.
First, it helps to understand the differences between the major HDR formats. Below are the most important features to note:
HDR10: This is basic, standardized HDR. Almost every TV on the market supports it. If you buy a movie with a “High Dynamic Range” sticker on the box, it almost certainly includes HDR10 support.
Dolby Vision: A superior HDR implementation, it uses dynamic metadata to aid the TV in producing the most accurate HDR picture on a frame-by-frame basis.
HDR10+: An open evolution of HDR10, it also includes dynamic metadata. This format is mostly found on Samsung TVs.
Hybrid Log-Gamma (HLG): This is a broadcast implementation of HDR that allows both SDR and HDR displays to use the same source. Additional data is provided for HDR-capable displays, so they receive a better image.
Setting aside HDR10 (the “default” HDR implementation that nearly everything supports), Dolby Vision enjoys much broader support than HDR10+. Streaming services, like Netflix, use it for almost all new content, and Microsoft has also committed to bringing Dolby Vision to gaming on the Xbox Series X and S in 2021.
RELATED: HDR Formats Compared: HDR10, Dolby Vision, HLG, and Technicolor
Fancy Features: The Devil’s in the Details
You can buy a great TV for around $600, but spending $1,200 won’t necessarily get you a TV that looks discernibly better. You might even spend more money and get a TV that somehow looks worse.
This is because TVs can differ quite wildly in terms of additional features. To avoid spending money on features you might never use, it’s worth taking the time to familiarize yourself with a few of them.
The image processor in your TV can massively affect the quality of the picture. A good image processor can take a murky 720p video and make it look presentable on a 4K display. A bad image processor, though, might handle 24p cinematic content very poorly, and introduce a distracting judder or stutter. Cheap sets might perform poorly in this area, but premium brands, like Sony, handle this well on their higher-end sets.
Some brands go a step further with features like black frame insertion (BFI), which inserts black frames at set intervals to make for smoother motion. This might be important to movie buffs, but it’s not something you should prioritize if you just want a TV to watch the news.
Connectivity is another area that can come at a premium. Most TVs include HDMI 2.0 ports, but the new 2.1 standard is slowly rolling out. Unless you want the very highest resolutions and frame rates (120Hz) on the PS5, Xbox Series, or a high-end PC, you don’t need HDMI 2.1.
A high-refresh-rate display allows you to view content at up to 120Hz—double that of most TVs on the market. However, unless the source (like a new console or graphics card) is providing an image of that quality, you have little need for a 120Hz display.
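If it’s easier to think in frame times than refresh rates, the same comparison looks like this:

# Time between frames at common refresh rates.
for hz in (60, 120):
    print(f"{hz}Hz -> a new frame every {1000 / hz:.1f} ms")
# 60Hz  -> 16.7 ms; 120Hz -> 8.3 ms, but only if the source can actually deliver 120fps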
Gaming features like FreeSync and G-Sync make playing games a more pleasant experience. They smooth out frame-rate drops, but they aren’t necessary for most people. Unless you know you need the feature because your hardware is compatible with it, you can discount it and save some money.
Both Sony and Microsoft’s latest consoles use HDMI VRR, so they don’t necessarily need these features.
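To illustrate what variable refresh rate actually buys you, the rough sketch below compares how a handful of hypothetical frame times would be presented on a fixed 60Hz display versus a VRR-capable one (the frame times are invented for illustration):

import math

render_times_ms = [14.0, 18.0, 21.0, 15.0]  # hypothetical GPU frame times
refresh_ms = 1000 / 60                      # fixed 60Hz refresh interval (~16.7 ms)

# Fixed refresh with v-sync: a late frame waits for the next refresh tick, causing judder.
fixed = [math.ceil(t / refresh_ms) * refresh_ms for t in render_times_ms]

# VRR: the display refreshes when each frame is ready, so presentation matches rendering.
vrr = render_times_ms

print("Fixed 60Hz:", [round(t, 1) for t in fixed])  # [16.7, 33.3, 33.3, 16.7]
print("VRR:       ", [round(t, 1) for t in vrr])    # [14.0, 18.0, 21.0, 15.0]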
One area that seems to have improved across the board on the latest TVs is software. While a TV you bought a decade ago probably has a slow or clunky interface, new smart TVs often use modern operating systems, like Android TV, LG’s webOS, Samsung’s Tizen, or Roku TV (as found on TCL sets).
You might want to try out the interface before you buy a TV just to make sure you like the OS you’ll be using for the next few years.
RELATED: What Is HDMI VRR on the PlayStation 5 and Xbox Series X?
Bad Sound: The Problem with Audio
Modern TVs often emphasize the form factor over almost everything else. This is how we got ultra-thin bezels, slim OLED screens, and flush wall mounting. The side effect of this is most TVs ship with subpar, downward-firing speakers that can’t fill a room with good audio.
There are exceptions: Sony’s OLEDs use the glass display as a sort of speaker and some TCL models include built-in soundbars. However, the majority—particularly those on the budget end of the spectrum—are probably going to be disappointing when it comes to sound.
To improve your experience, you might want to leave room in your budget for some audio hardware, too. You don’t necessarily have to break the bank on a Sonos Arc soundbar, unless you want a room-shaking, immersive experience from a tiny footprint on your entertainment unit.
Soundbars are designed to provide better-than-TV audio at a price point that won’t make you wince. Many support the latest standards, like eARC and Dolby Atmos, but those are secondary to the primary function: making up for the terrible integrated audio prevalent in TVs right now.
On Resolution: Stick with 4K
As 4K TVs and HDR support are now seeing widespread adoption, most people finally have a good reason to upgrade. So, why are manufacturers already trying to get you to buy an 8K set?
It’s true that some 8K sets (like Samsung’s high-end QLEDs) aren’t that expensive right now. Unfortunately, 8K just isn’t worth the investment yet. For some, 8K will never be worth it because the perceived jump in image quality is negligible, at best.
The jump from standard definition to HD was huge in terms of image quality, but from HD to 4K, things start to get a bit murkier. You have to be a certain distance from a TV to see the benefits of 4K, but there’s no denying the image is sharper and more detailed.
So, how about from 4K to 8K? As you might have guessed, this is a game of diminishing returns. While the difference is visible when you get much closer than what would be considered a reasonable viewing distance, overall, you’ll likely be underwhelmed.
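One rough way to quantify a “reasonable viewing distance” is the common one-arcminute visual-acuity rule of thumb (about 60 pixels per degree). The sketch below estimates how close you’d have to sit to a 65-inch screen before individual pixels become resolvable; the screen size and acuity figure are illustrative assumptions, not a formal standard:

import math

def pixel_visible_within_m(diagonal_in, horizontal_px):
    # Distance at which one pixel spans 1 arcminute (a common visual-acuity rule of thumb).
    width_in = diagonal_in * 16 / math.sqrt(16**2 + 9**2)  # width of a 16:9 screen
    pixel_in = width_in / horizontal_px
    distance_in = pixel_in / math.tan(math.radians(1 / 60))
    return distance_in * 0.0254  # convert inches to meters

for label, px in (("4K", 3840), ("8K", 7680)):
    print(f"65-inch {label}: pixels resolvable within ~{pixel_visible_within_m(65, px):.1f} m")
# 4K: ~1.3 m, 8K: ~0.6 m -- far closer than most living-room seating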
Then there’s the issue of content. While an 8K display will do a good job of upscaling 4K content, finding native 8K content is virtually impossible at the moment. YouTube supports it, but there’s no way to filter search results for it. Some streaming services don’t even offer 4K content yet, and many cable broadcasts are still chugging along in standard definition.
Netflix recommends a 25Mbps internet connection to stream 4K content, which is already heavily compressed. By that logic, you’d need at least 50Mbps for 8K, and you’d chew through considerably more data in the process.
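Back-of-the-envelope, those bitrates translate into the following data usage per hour (a rough estimate that ignores variable-bitrate encoding):

# Approximate data used per hour of streaming at a given bitrate.
def gb_per_hour(mbps):
    return mbps / 8 * 3600 / 1000  # megabits/s -> megabytes/s -> gigabytes per hour

print(f"4K at 25 Mbps: ~{gb_per_hour(25):.1f} GB per hour")  # ~11 GB
print(f"8K at 50 Mbps: ~{gb_per_hour(50):.1f} GB per hour")  # ~22 GB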
One day, 8K will be worth it because it will be the standard, just as 4K is now. There will be better reasons to upgrade your TV when that time comes. Let’s not forget how poor HDR implementations plagued early 4K TVs when they first came out. We’ve only had a few generations of truly great 4K TVs that provide a notably superior viewing experience over our old HD sets.
RELATED: 8K TV Has Arrived. Here's What You Need to Know
Read Reviews
As with any modern electronic product, independent reviewers hold the key to making an informed decision.
RTINGS is one of the best resources for buying a TV. The site applies a broad set of testing criteria to every TV it reviews, which provides an objective overview of each model’s strengths and weaknesses.
Apply your findings to your situation, your living room, and your viewing habits. There’s no single perfect TV for everyone. Just be sure to avoid the usual mistakes people make when buying a TV.
RELATED: 6 Mistakes People Make When Buying a TV