HDR: Dolby Vision, HDR10, HLG — What It Means for TV Viewers

How does HDR practically impact your TV viewing experience?

The number of TVs with 4K display resolution has exploded, and for good reason. Who doesn't want a more detailed TV image?

Ultra HD: More Than Just 4K Resolution

The 4K resolution standard is only one part of what is now referred to as Ultra HD. In addition to increased resolution, picture quality improves through increased light output and properly managed brightness and contrast levels, handled by a video processing system referred to as HDR.

What Is HDR?

HDR stands for High Dynamic Range.

During the content creation process for select material destined for theatrical or home video presentation, the full brightness and contrast data captured during filming is encoded into the video signal. When the content is delivered as a stream, a broadcast, or on a disc, that signal is sent to an HDR-enabled TV.

The TV decodes the information and displays the high dynamic range image based on its own brightness and contrast capability. If a TV isn't HDR-enabled (referred to as a standard dynamic range TV), it displays the images without the high dynamic range information.

Added to 4K resolution and a wide color gamut, an HDR-enabled TV, combined with properly encoded content, can display brightness and contrast levels close to what you see in the real world. This means bright whites without blooming or washout, and deep blacks without muddiness or crushing.

For example, if a scene has bright elements and darker elements in the same frame, such as a sunset, you see both the bright light of the Sun and the darker portions of the rest of the image with equal clarity, along with all the brightness levels in between.

Since there's a wide range from white to black, details not normally visible in both the bright and dark areas of a standard TV image are more easily seen on HDR-enabled TVs, which provides a more satisfying viewing experience.
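Dynamic range is often measured in photographic stops, where each stop doubles the amount of light. As a rough, hypothetical illustration (the nit figures below are examples, not measurements of any particular TV), the difference works out like this:

```python
import math

def stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops; each stop doubles the light."""
    return math.log2(peak_nits / black_nits)

# Hypothetical panels: a 100-nit SDR TV vs. a 1,000-nit HDR TV
print(round(stops(100, 0.1), 1))    # SDR: about 10 stops
print(round(stops(1000, 0.05), 1))  # HDR: about 14.3 stops
```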

How HDR Implementation Affects Consumers

HDR represents an evolutionary step in improving the TV viewing experience. Still, consumers face four main HDR formats that affect what TVs, related peripheral components, and content they should buy. These four formats are:

  • HDR10
  • Dolby Vision
  • HLG (Hybrid Log Gamma)
  • Technicolor HDR

Each format has its own special attributes.

HDR10 and HDR10+

HDR10 is an open, royalty-free standard incorporated into all HDR-compatible TVs, home theater receivers, Ultra HD Blu-ray players, and select media streamers.

HDR10 is considered more generic, as its parameters are applied equally throughout a specific piece of content. For example, an average brightness range is determined across an entire film.

During mastering, the brightest point and darkest point in a movie are marked. When the HDR content is played back, all other brightness levels are indexed to those points.
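Conceptually, HDR10's static metadata boils down to a few title-wide values. The sketch below is a simplified stand-in, not the actual bitstream syntax, though the field names echo the commonly cited MaxCLL and MaxFALL terms:

```python
from dataclasses import dataclass

@dataclass
class StaticHDRMetadata:
    """Simplified stand-in for HDR10's title-wide (static) metadata."""
    max_cll: int   # Maximum Content Light Level: the brightest pixel, in nits
    max_fall: int  # Maximum Frame-Average Light Level, in nits

# One record covers the whole film; every scene is indexed to these points.
movie = StaticHDRMetadata(max_cll=4000, max_fall=400)
```

A scene-by-scene format would instead carry a fresh record like this for each scene or frame.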

However, in 2017, Samsung demonstrated a scene-by-scene approach to HDR called HDR10+ (not to be confused with HDR+, which will be discussed below). Just as with HDR10, HDR10+ is royalty-free, but there are some initial adoption costs.

Although all HDR-enabled devices use HDR10, TVs and content from Samsung, Panasonic, and 20th Century Fox use HDR10 and HDR10+ exclusively.

Dolby Vision

Dolby Vision is the HDR format developed and marketed by Dolby Labs, which combines hardware and metadata in its implementation. The added requirement is that content creators, providers, and device makers need to pay Dolby a license fee for its use.

Dolby Vision is considered more precise than HDR10. Its HDR parameters can be encoded scene-by-scene or frame-by-frame, and playback adapts to the capabilities of the TV. In other words, playback is based on the brightness levels present at a given reference point, such as a frame or scene, rather than being limited to the maximum brightness level for the entire film.

On the other hand, because of the way Dolby structured Dolby Vision, any licensed and equipped TV supporting that format can also decode HDR10 signals, provided the TV manufacturer has enabled this capability. However, a TV that is only compliant with HDR10 can't decode Dolby Vision signals.

For this reason, many content providers that incorporate Dolby Vision encoding in their content also include HDR10 encoding, specifically to accommodate HDR-enabled TVs that aren't compatible with Dolby Vision.

When the content source includes only Dolby Vision and the TV is HDR10-compatible only, the TV ignores the Dolby Vision encoding and displays the image in standard dynamic range. In that case, you won't get the benefit of HDR.
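Put as playback logic, those compatibility rules work out to something like the following sketch (the format names and the set-based interface are illustrative, not taken from any real player):

```python
def pick_format(stream_formats: set, tv_formats: set) -> str:
    """Pick the best playback path the source and the TV have in common."""
    for fmt in ("dolby_vision", "hdr10"):  # prefer Dolby Vision when possible
        if fmt in stream_formats and fmt in tv_formats:
            return fmt
    return "sdr"  # no HDR format in common: standard dynamic range fallback

# A Dolby Vision-only source on an HDR10-only TV falls back to SDR:
print(pick_format({"dolby_vision"}, {"hdr10"}))           # -> sdr
# A source carrying both encodings plays in HDR10 on the same TV:
print(pick_format({"dolby_vision", "hdr10"}, {"hdr10"}))  # -> hdr10
```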

TV brands that support Dolby Vision include select models from LG, Philips, Sony, TCL, and Vizio. Ultra HD Blu-ray players that support Dolby Vision include select models from OPPO Digital, LG, Philips, Sony, Panasonic, and Cambridge Audio. Depending on the device's manufacturing date, Dolby Vision compatibility may only activate after a firmware update.

On the content side, Dolby Vision is supported through streaming on select content offered on Netflix, Amazon, and Vudu, as well as a limited number of movies on Ultra HD Blu-ray Disc.

Samsung is the only major TV brand marketed in the U.S. that doesn't support Dolby Vision. Samsung TVs and Ultra HD Blu-ray Disc players support only HDR10 and HDR10+.

Hybrid Log Gamma (HLG)

Hybrid log gamma is an HDR format designed for cable, satellite, and over-the-air TV broadcasts. It was developed by Japan's NHK and the UK's BBC and is license-free.

The main benefit of HLG for TV broadcasters and TV owners is that it's backward compatible. Bandwidth is at a premium for TV broadcasters, so transmitting separate SDR and HDR versions of a program isn't practical, and an HDR format such as HDR10 or Dolby Vision can't be viewed properly by owners of non-HDR TVs (including non-HD TVs).

HLG encoding, however, adds the extra brightness information to the current TV signal itself, without the need for separate metadata. As a result, the images can be viewed on any TV.
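The format's name hints at how this works: the lower part of its brightness curve follows the gamma-like shape SDR TVs already expect, while the upper part switches to a logarithmic curve that carries the extra highlight range. Here's a minimal sketch of that transfer curve, using the constants published in ITU-R BT.2100:

```python
import math

# HLG transfer-curve constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # about 0.55991073

def hlg_oetf(scene_light: float) -> float:
    """Map normalized scene light (0..1) to an HLG signal value (0..1)."""
    if scene_light <= 1 / 12:
        # Square-root segment: close to the gamma curve SDR TVs expect
        return math.sqrt(3 * scene_light)
    # Logarithmic segment: packs the extra highlight range into the signal
    return A * math.log(12 * scene_light - B) + C
```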

If your TV is an SDR set or an HDR TV without HLG support, it won't recognize the added HDR information, so you won't get the benefits of the added processing, but you will still get a standard SDR image.

Although HLG provides a way for both SDR and HDR TVs to be compatible with the same broadcast signals, it doesn't provide as accurate an HDR result as viewing the same content with HDR10 or Dolby Vision encoding, which significantly limits HLG's potential.

HLG compatibility is included on most 4K Ultra HD HDR-enabled TVs (except Samsung models) and home theater receivers beginning with the 2017 model year. So far, the BBC and DirecTV have been providing some programming using HLG.

Technicolor HDR

Of the four major HDR formats, Technicolor HDR is the least known and is only seeing minor use in Europe. Without getting bogged down in the technical details, Technicolor HDR is probably the most flexible solution, as it can be used in both recorded (streaming and disc) and broadcast TV applications. It can also be encoded using frame-by-frame reference points.

In addition, in a similar fashion to HLG, Technicolor HDR is backward compatible with both HDR- and SDR-enabled TVs. You get the best viewing result on an HDR TV, but an SDR TV can still benefit from increased quality, based on its color, contrast, and brightness capabilities.

Technicolor HDR signals can be viewed in SDR, making it convenient for content creators, content providers, and TV viewers. Technicolor HDR is an open standard that is royalty-free for content providers and TV makers to implement.

Tone Mapping

One of the problems in implementing the various HDR formats is that not all TVs have the same light-output characteristics. A high-end HDR-enabled LED/LCD TV might output as much as 1,000 nits of light, OLED and mid-range LED/LCD TVs may top out at 600 or 700 nits, and some lower-priced HDR-enabled LED/LCD TVs may only reach about 500 nits.

As a result, a technique known as tone mapping is used to address this variance. The brightness information encoded in a movie or program's metadata is remapped to the TV's capabilities: taking the TV's brightness range into account, adjustments are made to the peak brightness and all of the intermediate brightness levels, along with the associated detail and color. This way, the peak brightness encoded in the metadata isn't washed out when shown on a TV with less light-output capability.
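As a rough illustration of the idea, here's one commonly cited family of tone mapping curves, an "extended Reinhard" rolloff; the nit values are hypothetical, and actual TVs use their own proprietary curves:

```python
def tone_map(nits_in: float, content_peak: float = 4000.0,
             display_peak: float = 600.0) -> float:
    """Roll off highlights so the content's peak lands at the display's peak.

    The curve is nearly linear in the shadows and mid-tones, so that detail
    survives; only the brightest highlights are strongly compressed.
    """
    l = nits_in / display_peak        # luminance relative to the display peak
    lw = content_peak / display_peak  # content peak in the same units
    return display_peak * l * (1 + l / (lw * lw)) / (1 + l)

print(round(tone_map(100)))   # mid-tones are only mildly compressed: ~86
print(round(tone_map(4000)))  # the 4,000-nit peak maps exactly to 600 nits
```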

SDR-to-HDR Upscaling

Since HDR-encoded content isn't yet plentiful, several TV manufacturers make sure the extra money consumers spend on an HDR-enabled TV doesn't go to waste by including SDR-to-HDR conversion. Samsung labels its system HDR+ (not to be confused with HDR10+, discussed earlier), and Technicolor labels its system Intelligent Tone Management.

However, just as with resolution upscaling and 2D-to-3D conversion, HDR+ and SDR-to-HDR conversion don't provide as accurate a result as native HDR content. Some content may look washed out or uneven from scene to scene, but the feature provides another way to take advantage of an HDR-enabled TV's brightness capabilities, and it can be turned on or off as desired. SDR-to-HDR upscaling is also referred to as inverse tone mapping.
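As a sketch of how inverse tone mapping might stretch an SDR signal into a wider brightness range (the simple power curve and all values here are illustrative; shipping systems analyze the content itself):

```python
def expand_sdr(nits_in: float, sdr_peak: float = 100.0,
               hdr_peak: float = 600.0, power: float = 1.5) -> float:
    """Stretch SDR brightness into a display's HDR range.

    A power greater than 1 expands bright areas the most while leaving
    shadows close to their original levels.
    """
    t = nits_in / sdr_peak  # normalize the SDR signal to the 0..1 range
    return (t ** power) * hdr_peak

print(round(expand_sdr(10)))   # shadows barely move: 10 nits -> ~19 nits
print(round(expand_sdr(100)))  # SDR peak white: 100 nits -> 600 nits
```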

In addition to SDR-to-HDR upscaling, LG incorporates a system it refers to as Active HDR processing into a select number of its HDR-enabled TVs, which adds onboard scene-by-scene brightness analysis to both HDR10 and HLG content, improving the accuracy of those two formats.

Samsung's HDR+ also adjusts the brightness and contrast of HDR10-encoded content so that objects are more distinct.

The Bottom Line

The addition of HDR elevates the TV viewing experience. As format differences are resolved, and content becomes widely available across the disc, streaming, and broadcast sources, consumers will likely accept it just as they have previous advances.

Although HDR is currently being applied only in combination with 4K Ultra HD content, the technology is independent of resolution. This means it can be applied to video signals of other resolutions, whether 480p, 720p, 1080i, or 1080p. It also means that owning a 4K Ultra HD TV doesn't automatically mean it's HDR-compatible; the TV maker makes a deliberate decision to include it.

However, the emphasis by content creators and providers has been to apply HDR within the 4K Ultra HD platform. Non-4K Ultra HD TVs, DVD players, and standard Blu-ray Disc players are diminishing in availability, while 4K Ultra HD TVs are abundant, Ultra HD Blu-ray players are increasingly available, and ATSC 3.0 TV broadcasting is forthcoming. Given all that, the time and financial investment in HDR technology is best suited to maximizing the value of 4K Ultra HD content, source devices, and TVs.

Although there seems to be a lot of confusion at this stage of implementation, it will all sort itself out eventually. And even though there are subtle quality differences between the formats (Dolby Vision is considered to have a slight edge), all of the HDR formats provide a significant improvement in the TV viewing experience.

FAQ
  • How is HDR10 different from HDR10+?

    HDR10 is an older standard, and HDR10+ is the successor to the HDR10 standard. However, your specific TV and its unique HDR implementation will determine how good your HDR experience is.

  • Is it more important to have HDR or 4K?

    HDR and 4K are completely different technologies. HDR involves the brightness and contrast of a display, while 4K refers to a display's resolution. Higher resolutions and HDR both offer their own advantages.
