Key Takeaways
- 4K and UHD refer to a resolution with four times the pixels of 1080p, providing cleaner, more detailed images.
- 4K is a production standard and UHD is a display resolution, but the two terms are used interchangeably.
- Consider your viewing distance, 4K content availability, internet speed, and device compatibility before upgrading.
The vast majority of new TVs are marketed as 4K displays, also known as Ultra HD. So what makes 4K different from standard HD, and what do you need to know about upgrading?
It's All About Resolution
Commonly, 4K and UHD refer to a resolution that's a step up from 1080p (or "full HD"). A 4K UHD display has roughly four times the pixels of the previous generation, which creates a cleaner, more detailed image.
A 1080p high-definition TV can't take full advantage of 4K media since it doesn't have enough pixels. At the same time, to get the full benefit of a 4K display, the media you're watching also needs to be in 4K resolution.
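To put a number on that jump, here's the arithmetic behind the "four times the pixels" claim, written out as a tiny Python snippet purely for illustration:

```python
# Pixel counts for Full HD (1080p) and Ultra HD (2160p) displays.
full_hd = 1920 * 1080   # 2,073,600 pixels
ultra_hd = 3840 * 2160  # 8,294,400 pixels

print(f"Full HD:  {full_hd:,} pixels")
print(f"Ultra HD: {ultra_hd:,} pixels")
print(f"Ratio:    {ultra_hd / full_hd:.0f}x")  # exactly 4x
```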
Fortunately, 4K or Ultra HD content is everywhere, from movies and TV shows to the latest video games. You can also buy a 4K UHD monitor for your computer, giving you lots of screen real estate and excellent image quality. Your smartphone probably shoots video in 4K (some even manage 8K), even if the massive video files aren't worth it on a smaller display.
4K and UHD Are Different
Despite being used interchangeably by manufacturers, retailers, and consumers alike, 4K and Ultra HD (UHD) are technically different. While 4K is a production standard as defined by the Digital Cinema Initiatives (DCI), UHD is just a display resolution. Films are produced in DCI 4K, while most TVs have a resolution that matches UHD.
The 4K production standard specifies a resolution of 4096 x 2160 pixels, twice the width and height of the previous 2K standard of 2048 x 1080. As part of this production standard, 4K also specifies the type of compression to be used (JPEG 2000), the maximum bitrate (up to 250 Mbits per second), and the color depth and chroma sampling (12-bit, 4:4:4).
Ultra HD has a display resolution of 3840 x 2160 pixels, and it's used in the vast majority of modern TVs, even those advertised with an eye-catching "4K" label. Besides the number of on-screen pixels, there aren't any additional specifications. The real differences between the two formats are the width of the images and the aspect ratios.
A movie produced in 4K can use an aspect ratio of up to 1.9:1, although most filmmakers prefer 1.85:1 or 2.39:1. Video games rendered for consumer-level displays typically use the UHD aspect ratio of 1.78:1 to fill the screen.
This is why you'll continue to see letterboxing (black bars at the top and bottom of the screen) when you watch movies on your brand-new UHD television. Because UHD doesn't specify any additional standards, older televisions with 8-bit panels are advertised as UHD sets alongside new, 10-bit (and the future 12-bit) UHD displays.
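If you're curious how much of the screen those black bars actually take up, here's a rough sketch of the math (illustrative only, assuming the film is scaled to the full 3840-pixel width of a UHD panel):

```python
# Rough letterbox math for a 3840 x 2160 (16:9) UHD panel.
screen_w, screen_h = 3840, 2160

def letterbox_height(aspect_ratio):
    """Total height (in pixels) of the black bars when a film with the
    given aspect ratio is scaled to the full width of the screen."""
    image_h = round(screen_w / aspect_ratio)
    return max(0, screen_h - image_h)

for ratio in (1.78, 1.85, 2.39):
    bars = letterbox_height(ratio)
    print(f"{ratio}:1 film -> {bars} px of letterboxing "
          f"({bars / screen_h:.0%} of the screen)")
```

A 2.39:1 film leaves roughly a quarter of a 16:9 screen black, which is why even native 4K movies don't fill your UHD TV edge to edge.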
To make matters worse, Ultra HD is also used for so-called 8K content. Labeled as "8K UHD" (as opposed to 4K UHD), this refers to content with a resolution of 7680 x 4320 pixels. This leap in quality is enormous in terms of overall pixel count, but the benefits to most are limited. It will be a while before we see widespread content produced for this format.
Many manufacturers use the term "2160p" to describe regular UHD content, even though it isn't a formally defined production standard.
What About 1440p?
1440p sits between "Full HD" (1080p) and 4K or Ultra HD (2160p), and refers to the resolution 2560 x 1440. This resolution doesn't really apply to TVs, but rather to PC monitors, especially those aimed at gamers. 1440p has become a sweet spot in PC gaming: resolution is appreciably improved over 1080p, while mid-range machines can still deliver solid frame rates.
Some refer to 1440p as Quad HD or QHD. Others use the term "2K," even though that isn't strictly accurate. The 1440p label can also cover any resolution with a vertical resolution of 1440 pixels, including ultrawide and super ultrawide monitors.
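For a sense of scale, here's the same pixel arithmetic applied to the three common PC monitor resolutions (again, just illustrative):

```python
# Pixel counts for common PC monitor resolutions, relative to Full HD.
resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "1440p (QHD)":     (2560, 1440),
    "2160p (4K UHD)":  (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x Full HD)")
```

1440p works out to roughly 1.8 times the pixels of 1080p, which is why it improves image quality noticeably without demanding the GPU horsepower that native 4K does.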
Things to Consider When Upgrading to 4K
It's a great time to upgrade to a UHD TV capable of 4K playback, as the technology has matured considerably over the last five years. Not only are UHD displays now much cheaper, but they also come with more features. Many include 10-bit panels capable of displaying high-dynamic-range content, backed by powerful onboard image processors.
For the leap to be worth it, you'll need to consider how large you want your display to be and how far away you'll sit from it. According to RTINGS, the upgrade isn't worth it if you sit farther than six feet away from a 50-inch screen. You can't resolve the individual pixels from that distance, so you won't benefit from the increased resolution.
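If you want to sanity-check that rule of thumb, here's a rough sketch of the geometry. It assumes a flat 16:9 panel and the commonly cited figure of roughly 60 pixels per degree as the limit of what the eye can resolve; both the threshold and the simplified model are assumptions for illustration, not part of the RTINGS methodology:

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in, aspect=16 / 9):
    """Approximate pixels per degree of vision for a flat widescreen panel."""
    # Screen width derived from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)
    # Horizontal field of view the screen occupies, in degrees.
    fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return horizontal_px / fov_deg

# A 50-inch screen viewed from 6 feet (72 inches) away.
for label, px in (("1080p", 1920), ("4K UHD", 3840)):
    print(f"{label}: ~{pixels_per_degree(50, px, 72):.0f} pixels per degree")
```

At that distance a 1080p image already lands near the ~60 pixels-per-degree mark, which lines up with the advice that the extra resolution goes to waste if you sit any farther back.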
Another thing worth considering is what 4K content you'll actually watch on a new TV. Ultra HD Blu-rays provide the best at-home viewing experience, and there's a sizable catalog that's growing all the time. If you don't often buy expensive discs, though, you might be stuck streaming content instead.
This is where the speed of your internet connection can make or break your investment in a shiny new TV. Netflix claims its customers need an internet speed of 25 Mbits per second or better to stream Ultra HD. You can test your internet speed to find out how your display will fare. Remember, though, these speeds can dip considerably during busy periods (like when everyone's streaming Netflix simultaneously).
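To get a feel for what that 25 Mbits per second figure means in practice, here's some rough arithmetic (a ceiling rather than a real-world number, since streams are compressed and vary their bitrate constantly):

```python
# Data used by a stream running at a constant 25 Mbit/s.
bitrate_mbps = 25                       # megabits per second
mb_per_hour = bitrate_mbps / 8 * 3600   # megabytes per hour

print(f"{mb_per_hour:,.0f} MB per hour (~{mb_per_hour / 1000:.1f} GB per hour)")
# A two-hour movie at that constant rate would be roughly 22-23 GB.
```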
You'll also have to pay for a premium-level streaming subscription to access the highest-quality content. Netflix gates its UHD content behind a $22.99 monthly plan. That content includes Netflix Originals and many other movies and series not produced by the streaming giant. Unfortunately, a lot of movies that have UHD releases are still presented in HD on Netflix.
Do you have older HD streaming devices, like a Roku or Apple TV? These can pose an issue, as they're only capable of delivering a 1080p image. You'll need a Chromecast Ultra or Apple TV 4K if you want to take advantage of higher resolution and HDR playback. This is less of an issue for your TV itself, as long as it has a stable and responsive built-in OS, which many now do.
Remember that 4K shines on larger displays. Unfortunately, when you upgrade to a larger native UHD TV, any 1080p content will look worse. This will be less of a problem in the future, though, and there are some solutions.
Upscaling to Ultra HD
TVs place a heavy emphasis on upscaling, which takes lower-resolution content and scales it to fill a higher-resolution display. Remember, there are four times as many pixels on an Ultra HD display as there are on a regular Full HD television.
Upscaling means more than simply stretching an image. Modern TVs and playback devices process the image and attempt to reconstruct it to look its best at a higher resolution. This is done via a process known as interpolation, during which missing pixels are generated on the fly. The intent is to produce a smooth transition between contrasting areas of the image.
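Real TVs use far more sophisticated (and proprietary) processing, but a bare-bones sketch of bilinear interpolation shows the basic idea of inventing the missing pixels. The tiny grayscale grid and the 2x scale factor below are purely for illustration:

```python
def bilinear_upscale(pixels, scale):
    """Upscale a 2D grid of grayscale values by blending the four
    nearest source pixels for every new pixel (bilinear interpolation)."""
    src_h, src_w = len(pixels), len(pixels[0])
    result = []
    for y in range(src_h * scale):
        # Map the destination pixel back into source coordinates.
        sy = min(y / scale, src_h - 1)
        y0, y1 = int(sy), min(int(sy) + 1, src_h - 1)
        fy = sy - y0
        row = []
        for x in range(src_w * scale):
            sx = min(x / scale, src_w - 1)
            x0, x1 = int(sx), min(int(sx) + 1, src_w - 1)
            fx = sx - x0
            # Blend the four surrounding source pixels.
            top = pixels[y0][x0] * (1 - fx) + pixels[y0][x1] * fx
            bottom = pixels[y1][x0] * (1 - fx) + pixels[y1][x1] * fx
            row.append(round(top * (1 - fy) + bottom * fy))
        result.append(row)
    return result

# A tiny 2x2 "image" with bright and dark corners, upscaled 2x to 4x4.
for row in bilinear_upscale([[0, 255], [255, 0]], 2):
    print(row)
```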
As TVs become more powerful, better interpolation and upscaling techniques will be used. The NVIDIA Shield has some of the best upscaling technology on the market. It utilizes AI and machine learning to improve different parts of the image using different techniques.
If you upgraded to an Ultra HD TV and have noticed subpar performance with lower-resolution content, a Shield might be just what you need.
The Xbox Series X and PlayStation 5 game consoles output a 4K signal, though a true native 4K image is rare. Games instead rely on dynamic resolution scaling, adjusting the rendering resolution on the fly to hit performance targets. These games still look great on a 4K TV; just make sure you opt for one with a 120Hz refresh rate and plenty of HDMI 2.1 ports.
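Dynamic resolution scaling is essentially a feedback loop: if frames take too long, render fewer pixels; if there's headroom, render more. Here's a toy sketch of that loop; the frame times are made up and the controller is far cruder than anything a real engine ships:

```python
TARGET_MS = 16.7   # frame budget for 60 fps
scale = 1.0        # fraction of native 4K width/height being rendered

# Simulated frame times (in milliseconds) from an imaginary demanding scene.
frame_times = [15.0, 18.5, 21.0, 19.2, 16.1, 14.8, 13.9, 15.5]

for ms in frame_times:
    # Nudge the render scale toward the frame-time budget.
    if ms > TARGET_MS:
        scale = max(0.6, scale - 0.05)   # drop resolution to catch up
    elif ms < TARGET_MS * 0.9:
        scale = min(1.0, scale + 0.05)   # claw resolution back
    width, height = round(3840 * scale), round(2160 * scale)
    print(f"{ms:>4.1f} ms -> rendering at {width} x {height} ({scale:.2f}x)")
```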
NVIDIA developed Deep Learning Super Sampling (DLSS) to render games at lower resolutions and upscale them to 4K and beyond in real time, and AMD and Intel now offer similar technologies. These let you put your 4K display to good use while still hitting your desired frame rate, since they limit how many raw pixels the GPU has to render.
What About HDR?
High dynamic range (HDR) is also often advertised on movies and TVs, and it's an entirely different technology. While 4K is a production standard and UHD is a resolution, HDR is a loosely defined term that refers to a wider color gamut and higher peak brightness. It's the other big benefit of upgrading your old TV.
Brighter images deliver a more immersive viewing experience. HDR allows the image to really pop since it more closely imitates the difference between light and dark that we see in the "real world." Most TVs will support basic HDR10, while others support technologies that can deliver a more dynamic image like Dolby Vision and HDR10+.
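The exact numbers vary wildly from panel to panel, but some illustrative figures show why that extra brightness range matters. The nit values below are assumptions for the sake of the example, not measurements of any particular TV:

```python
import math

def stops(peak_nits, black_nits):
    """Dynamic range in photographic stops (each stop doubles the brightness)."""
    return math.log2(peak_nits / black_nits)

# Illustrative figures only: a typical SDR picture vs. a bright HDR one.
print(f"SDR (~100 nits peak, 0.1 nit black):   {stops(100, 0.1):.1f} stops")
print(f"HDR (~1000 nits peak, 0.05 nit black): {stops(1000, 0.05):.1f} stops")
```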
Modern game consoles like the Xbox Series X and PlayStation 5 both support some form of HDR, which can make just as much of an impact on the image as the jump in resolution from HD to Ultra HD. Xbox and Windows even support Auto-HDR which intelligently applies HDR to older games that do not explicitly support it.
While 1080p HDR can exist, HDR content wasn't widely produced during the "Full HD" age, so you won't find any televisions on the market that offer HDR at 1080p. The vast majority of 4K sets on the market do support HDR in some form, however.
Don't Worry About the Terminology
Whether it's called 4K or UHD doesn't matter. Your UHD TV is 4K-capable. The world has just adjusted to the nebulous terms thrown around by manufacturers and marketers.
Netflix might advertise a movie in Ultra HD, while Apple labels the same movie as 4K. Your TV doesn't care and will play both just fine.
Before you head out to buy that new set, though, be sure to check out these common mistakes people make when shopping for a TV.