While it seems pretty clear that the upcoming Apple TV will support 4K resolution – aka Ultra HD aka UHD aka UHDTV – we may have just received a hint that this won’t be the only improvement offered to the image quality.
A tweet from Rene Ritchie yesterday indicated that he’d been given a demo of High Dynamic Range (HDR) video, and the timing suggests that this is likely to have been driven by a tip from Apple about the new Apple TV …
Reminder: The next big leap isn’t 4K. It’s HDR. HDR video is *Amazeballs*. 4K is simply along for the ride. Enjoy 10-bit HECV [a typo for HEVC], friends.
If true, the impact on image quality could be dramatic – provided you have a television capable of displaying the full output.
Since it’s 4K resolution that has so far received all the attention in TVs and monitors, the terms could use a little decoding.
High Dynamic Range is a term originally applied to still photography. Normally, a camera sensor has a limited dynamic range, meaning it can’t simultaneously capture a shade very close to black and a shade very close to white. If you adjust the exposure to show shadow detail (very dark tones), then very bright areas will be washed out and simply appear white. Conversely, if you expose for the highlights, then the very bright areas will be shown correctly but the very dark ones will simply appear black.
HDR is a technology designed to overcome this problem, capturing all shades from very dark to very light. In still photography, the first way to achieve HDR was to shoot a series of photos of the same scene. At its simplest, you shoot three photos – one underexposed, one normal, one overexposed. Photo editing software then combines the three images in such a way that you can see all the shades present.
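The bracketing-and-merging approach described above can be sketched in a few lines. This is a minimal illustration, not any particular photo editor’s algorithm: it weights each bracketed exposure by how close each pixel is to mid-gray (a crude ‘well-exposedness’ measure, loosely in the spirit of exposure-fusion techniques) and blends them. All names and numbers here are illustrative assumptions.

```python
import math

def merge_exposures(exposures):
    """Blend bracketed exposures (lists of floats in 0..1) pixel by pixel.

    Each exposure is weighted by how close the pixel is to mid-gray,
    a crude 'well-exposedness' measure: shadow detail then comes mostly
    from the overexposed shot, highlight detail from the underexposed one.
    """
    merged = []
    for pixels in zip(*exposures):
        weights = [math.exp(-((p - 0.5) ** 2) / (2 * 0.2 ** 2)) for p in pixels]
        total = sum(weights)
        merged.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return merged

# Toy one-row "scene": under-, normally- and over-exposed views of a gradient.
scene = [i / 4 for i in range(5)]                  # 0.0 .. 1.0
under = [min(1.0, p * 0.5) for p in scene]
normal = scene
over = [min(1.0, p * 2.0) for p in scene]
merged = merge_exposures([under, normal, over])
```

Real tools use far more sophisticated weighting and tone-mapping, but the core idea is the same: no single exposure holds all the detail, so each pixel borrows from whichever shot captured it best.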
When HDR is overdone, it looks very unnatural. But done well, it produces a natural-looking image that allows us to see everything our eyes would have seen at the time.
HDR video works in a different way, but has the same aim. Here, the key is capturing and displaying more shades of color, which includes both darker and lighter tones. Here’s an example showing Dolby’s implementation:
The left side shows a standard TV image while the right side shows the HDR version. You can see a number of benefits. First, we can see more of the shadow detail. In the standard version, the dark areas of the building all mush together; in the HDR version, we can see more of the details.
The same is true in the brighter areas. In the standard image, the sky is washed out, but in the HDR version its colors are visible.
Finally, the color reproduction is better across the entire image.
The key to HDR video is what’s known as greater color depth. This is the ’10-bit’ part of Ritchie’s tweet. Standard video uses 8-bit color. With 8-bit color, each individual pixel element – a red, a green or a blue – can display up to 256 tones. Combine them, and you get a total of just over 16 million colors (256 × 256 × 256 = 16,777,216). With 10-bit color, each element can display 1,024 tones, for a total of more than a billion colors (1,024 × 1,024 × 1,024 = 1,073,741,824).
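The arithmetic is easy to check for yourself:

```python
def total_colors(bits_per_channel):
    """Distinct colors available for a given bit depth per channel (R, G, B)."""
    tones = 2 ** bits_per_channel   # tones per color element
    return tones ** 3               # independent R, G and B combinations

print(total_colors(8))    # 16777216   - "just over 16 million"
print(total_colors(10))   # 1073741824 - "more than a billion"
```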
What about the ‘HEVC’ part? This is High Efficiency Video Coding, which is about twice as efficient as current H.264 video encoding. That doubled efficiency can be used in one of two ways.
Either you can reduce the amount of data that needs to be streamed to display the same quality, or you can encode more data in the same bandwidth, providing greater quality at the same bit-rate – such as, for example, replacing 8-bit color with 10-bit color. For this reason, HEVC and 10-bit color are commonly discussed in tandem.
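Those two options can be put into rough numbers. The figures below are illustrative assumptions, not real-world measurements – a nominal 15 Mbps H.264 stream, HEVC at roughly half the bit-rate for the same quality, and the simple per-pixel raw-data cost of moving from 8-bit to 10-bit color (using the same R+G+B simplification as above):

```python
# Illustrative arithmetic only - real encoder savings vary by content.
h264_bitrate_mbps = 15.0                   # assumed bit-rate for an H.264 stream
hevc_same_quality = h264_bitrate_mbps / 2  # HEVC is ~2x as efficient

# Raw cost of deeper color, per pixel (R + G + B):
bits_8bit = 3 * 8      # 24 bits per pixel
bits_10bit = 3 * 10    # 30 bits per pixel
raw_increase = bits_10bit / bits_8bit      # 1.25x, i.e. +25%

# Even after paying the 25% raw-data premium for 10-bit color,
# the 10-bit HEVC stream still needs less bandwidth than 8-bit H.264:
hevc_10bit = hevc_same_quality * raw_increase
print(hevc_10bit, "<", h264_bitrate_mbps)  # 9.375 < 15.0
```

In other words, the extra data that 10-bit color demands is far smaller than the saving HEVC provides, which is why the two pair so naturally.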
HEVC encoding has another benefit. Today’s H.264 encoding divides each frame into 16×16-pixel blocks to look for differences between them. HEVC expands that pattern-matching area to blocks of up to 64×64 pixels, letting it encode large areas of subtle gradation more efficiently and spend the saved bits on accuracy elsewhere. Combined with 10-bit color, this reduces one issue commonly seen with today’s video: color banding, or posterization, where a smooth gradient breaks up into visible stripes.
Here’s an admittedly extreme example from Intel, with an 8-bit image on the left and 10-bit on the right.
So, put 10-bit HEVC together with 4K resolution and you’ll have truly stunning image quality.
Of course, to take advantage of this, the video needs to have been captured and encoded in 10-bit color to start with, and you need a TV capable of displaying it. But if you do, and Ritchie is indeed hinting about the Apple TV, the new model is going to be a massive upgrade.
If you don’t yet have an HDR-capable TV and are looking to buy one, HDR10 and Dolby Vision are the buzzwords to look for. These are two competing standards, so ideally you want both, but HDR10 currently has the lead.
Image: comparison of SDR and HDR in The Martian, from HDTVtest