High Dynamic Range, or HDR, is a technology that allows displays to show a wider range of colors, brighter whites, and deeper blacks. It enhances the contrast of images, making the bright parts brighter and the dark parts darker, resulting in a more lifelike picture.
However, there are multiple HDR versions or formats on the market, such as HDR10, HDR10+, Dolby Vision, and HLG. Telling these acronyms apart is not always easy, so this article will try to demystify the formats.
HDR Formats: An Overview
The commonly used HDR formats include HDR10, HDR10+, Dolby Vision, and HLG:
- HDR10 – HDR10 is the most common HDR format and an open standard used by a wide range of streaming services, including Netflix, Disney+, and Apple TV+. With 10-bit color depth, HDR10 can reproduce far more colors than traditional 8-bit images, offering a more detailed and vibrant picture (see the quick calculation after this list).
- HDR10+ – As an enhancement of HDR10, HDR10+ introduces dynamic metadata that adjusts brightness levels on a frame-by-frame basis. This dynamic approach results in an image with better detail and color representation, especially in scenes with rapid changes in light.
- Dolby Vision – This proprietary HDR format, developed by Dolby Labs, supports 12-bit color depth and a theoretical maximum brightness of a massive 10,000 nits. This format uses dynamic metadata, which means that each frame in a video can be individually adjusted for optimal brightness and color accuracy.
- HLG – HLG, or Hybrid Log-Gamma, is a unique HDR standard developed for live broadcast signals. Unlike other HDR formats, HLG doesn’t use metadata. Instead, it overlays an HDR information layer over a standard dynamic range (SDR) signal, ensuring compatibility with both SDR and HDR displays.
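To put the bit-depth figures above in perspective, here is a quick back-of-the-envelope calculation. It simply counts how many levels each channel can take and how many total colors that allows; the numbers are plain arithmetic, not format specifications.

```python
# Rough comparison of the color bit depths mentioned above.
# Levels per channel = 2 ** bits; total colors = levels ** 3 (one factor each for R, G, B).
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:,} levels per channel, {levels ** 3:,} total colors")
```

The jump from roughly 16.8 million colors at 8 bits to over a billion at 10 bits is what lets HDR render smooth gradients with far less visible banding.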
Each format has its own characteristics and advantages, making it suitable for different scenarios. With that overview in mind, let's dig a bit deeper into each one to understand their unique features and benefits.
Understanding HDR10
HDR10 is a popular choice because it is an open standard. It promises improved image quality with more dynamic and vibrant colors. HDR10 uses static metadata, meaning that the brightness values are set at the beginning and remain the same throughout the content. This approach ensures a consistent viewing experience but may not always provide the optimal brightness levels for every scene.
It also has a few limitations. The static metadata approach might not always result in the best image quality, particularly in scenes that would benefit from dynamic brightness adjustments. Moreover, HDR10 is limited to 10-bit color depth, which falls behind Dolby Vision’s 12-bit color support.
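To illustrate what "static metadata" means in practice, the sketch below models the handful of values an HDR10 stream carries once for the whole piece of content, such as the mastering display's luminance range and the MaxCLL/MaxFALL light levels. The class and the numbers are illustrative, not taken from any particular title.

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Set once per title and applied to every frame (illustrative values)."""
    mastering_display_max_nits: float  # peak luminance of the mastering display
    mastering_display_min_nits: float  # black level of the mastering display
    max_cll: int                       # Maximum Content Light Level, in nits
    max_fall: int                      # Maximum Frame-Average Light Level, in nits

# One set of values covers the entire movie, bright scenes and dark scenes alike.
movie_metadata = HDR10StaticMetadata(
    mastering_display_max_nits=1000.0,
    mastering_display_min_nits=0.005,
    max_cll=950,
    max_fall=400,
)
```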
HDR10+ Improvements
HDR10+ is another open standard and was developed to overcome the limitations of HDR10. It introduces dynamic metadata, allowing it to adjust brightness levels on a frame-by-frame basis. This dynamic approach results in an image with better detail and color representation in scenes with rapid changes in light, for example.
The format retains the 10-bit color support but raises the maximum brightness to 4,000 nits, resulting in better image quality compared to HDR10. However, HDR10+ content and hardware support are less widespread than HDR10 or Dolby Vision.
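To contrast this with the static approach above, here is a minimal sketch of the idea behind dynamic metadata: each scene (or frame) carries its own brightness hints, so the TV can tone-map a dark interior differently from a sunlit exterior. The structure and numbers are made up for illustration and do not follow the actual HDR10+ metadata syntax.

```python
# Illustrative per-scene metadata: each entry can be tone-mapped differently.
scene_metadata = [
    {"scene": "night interior",  "frames": (0, 1200),    "peak_nits": 120,  "avg_nits": 8},
    {"scene": "sunlit exterior", "frames": (1201, 3600), "peak_nits": 3800, "avg_nits": 350},
    {"scene": "fireworks",       "frames": (3601, 4200), "peak_nits": 4000, "avg_nits": 60},
]

for scene in scene_metadata:
    # A TV's tone mapper can use the per-scene peak instead of one global value.
    print(f"{scene['scene']}: tone-map against a {scene['peak_nits']} nit peak")
```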
Dolby Vision: Premium HDR
Dolby Vision is considered a premium HDR format because of its advanced capabilities. Unlike HDR10 and HDR10+, Dolby Vision supports up to 12-bit color depth and a theoretical maximum brightness of 10,000 nits. This allows it to display more colors and achieve higher brightness levels than other HDR formats.
Dolby Vision’s premium features come at a cost, however. First of all, it is a proprietary standard, which means manufacturers need to pay licensing fees to use it. This has led to somewhat limited adoption, with not all TVs supporting Dolby Vision. However, some Netflix content now uses the technology.
Hybrid Log-Gamma (HLG): Broadcast-Friendly HDR
HLG stands out from other HDR formats due to its unique approach to HDR. Developed by the BBC and NHK for broadcasting purposes, HLG doesn’t rely on metadata. Instead, it integrates HDR information into an SDR signal, ensuring compatibility with both SDR and HDR displays.
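For the curious, the "hybrid log-gamma" name refers to HLG's transfer curve: the darker half of the signal follows a conventional gamma-like square-root curve (which SDR displays render sensibly), while the brighter half switches to a logarithmic curve that carries the extra highlight range. Below is a sketch of that transfer function using the constants published in ITU-R BT.2100; treat it as illustrative rather than a reference implementation.

```python
import math

# HLG OETF constants from ITU-R BT.2100.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # ~0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e in [0, 1] to an HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)              # gamma-like part, SDR-compatible
    return A * math.log(12 * e - B) + C      # logarithmic part for highlights

# The curve reaches 0.5 at e = 1/12 and 1.0 at e = 1, compressing highlights.
for e in (0.01, 1 / 12, 0.25, 1.0):
    print(f"scene light {e:.3f} -> signal {hlg_oetf(e):.3f}")
```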
The downsides are that it can’t enhance the black levels of an image, limiting its ability to improve detail in shadows and night scenes. Moreover, HLG content is not as widely available as content in other HDR formats.
Is HDR Exclusive to 4K?
Contrary to popular belief, HDR is not exclusive to 4K resolution. While most 4K TVs support some form of HDR, HDR can also be implemented on TVs with lower resolutions, such as 1080p or 1440p. The two technologies, while often bundled together, serve different purposes: 4K refers to the number of pixels on the screen, while HDR pertains to the color and brightness range the display can reproduce.
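A quick calculation makes the distinction concrete: resolution only changes how many pixels you have, while HDR bit depth changes how many values each pixel channel can take. The figures below are standard resolution and bit-depth numbers, combined purely to show that the two are independent.

```python
# Resolution = how many pixels; bit depth = how many values per pixel channel.
resolutions = {"1080p": 1920 * 1080, "4K": 3840 * 2160}
bit_depths = {"8-bit SDR": 2 ** 8, "10-bit HDR": 2 ** 10}

for res_name, pixels in resolutions.items():
    for depth_name, levels in bit_depths.items():
        # Any resolution can be paired with any bit depth; they are independent axes.
        print(f"{res_name} ({pixels:,} pixels) with {depth_name} ({levels} levels per channel)")
```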
How to Watch HDR Content
To fully enjoy the benefits of HDR, you need a few things. First, you’ll need a TV that supports an HDR format. These formats are not cross-compatible, so a TV that only supports HDR10+ won’t be able to display the benefits of Dolby Vision. Additionally, you’ll need to find available content in your TV’s supported format. There’s plenty of content available in HDR10, but not so much in Dolby Vision and HDR10+.
If you use a streaming device, like an Apple TV or Roku, that device will also need to support your TV’s HDR format. The same is true for Blu-ray players and game consoles. In other words, to experience HDR at its best, your TV, content, and playback device all need to be on the same page.
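That "same page" requirement boils down to a simple intersection: a given format only works end to end if the TV, the playback device, and the content all support it. Here is a tiny sketch of that logic; the device and format sets are placeholders, not a real capability database.

```python
def usable_hdr_formats(tv: set[str], device: set[str], content: set[str]) -> set[str]:
    """Return the HDR formats that every link in the chain supports."""
    return tv & device & content

# Hypothetical example setup.
tv_formats = {"HDR10", "HDR10+", "HLG"}
streamer_formats = {"HDR10", "Dolby Vision"}
title_formats = {"HDR10", "Dolby Vision"}

print(usable_hdr_formats(tv_formats, streamer_formats, title_formats))
# {'HDR10'} - the title's Dolby Vision version goes unused because the TV lacks support.
```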
To try it out, YouTube hosts a variety of sample content that showcases HDR, so search for an HDR demo clip there.
The Future of HDR
As HDR technology continues to advance, we can expect to see more vibrant displays, as well as a greater variety of HDR content. It is nevertheless important to note that the quality of your HDR experience will always be determined by the capabilities of your TV and the quality of the HDR content you’re watching.