High-dynamic-range video (HDR video) describes video having a dynamic range greater than that of standard-dynamic-range video (SDR video). HDR video involves capture, production, content/encoding, and display. HDR capture devices and displays are capable of brighter whites and deeper blacks. To accommodate this, HDR encoding standards allow for a higher maximum luminance and use a bit depth of at least 10 bits (compared to 8 bits for non-professional and 10 bits for professional SDR video) in order to maintain precision across this extended range.
While technically "HDR" refers strictly to the ratio between the maximum and minimum luminance, the term "HDR video" is commonly understood to imply wide color gamut as well.
In 1991, the first commercial video camera was introduced that performed real-time capture of multiple images with different exposures and produced an HDR video image; it was developed by Hymatom, licensee of Cornuéjols.
Also in 1991, Cornuéjols introduced the principle of non-linear image accumulation (HDR+) to increase camera sensitivity: in low-light environments, several successive images are accumulated, increasing the signal-to-noise ratio.
Later, in the early 2000s, several scholarly research efforts used consumer-grade sensors and cameras. A few companies such as RED and Arri have been developing digital sensors capable of a higher dynamic range. RED EPIC-X can capture time-sequential HDRx images with a user-selectable 1-3 stops of additional highlight latitude in the "x" channel. The "x" channel can be merged with the normal channel in post production software. The Arri Alexa camera uses a dual gain architecture to generate an HDR image from two exposures captured at the same time.
With the advent of low-cost consumer digital cameras, many amateurs began posting tone mapped HDR time-lapse videos on the Internet, essentially a sequence of still photographs in quick succession. In 2010, the independent studio Soviet Montage produced an example of HDR video from disparately exposed video streams using a beam splitter and consumer grade HD video cameras. Similar methods have been described in the academic literature in 2001 and 2007.
Modern movies have often been filmed with cameras featuring a higher dynamic range, and legacy movies can be converted, though manual intervention may be needed for some frames (as when black-and-white films are converted to color). Also, special effects, especially those that mix real and synthetic footage, require both HDR shooting and rendering. HDR video is also needed in applications that demand high accuracy for capturing temporal changes in the scene. This is important in monitoring of some industrial processes such as welding, in predictive driver assistance systems in the automotive industry, in surveillance video systems, and other applications. HDR video can also be used to speed image acquisition in applications that need a large number of static HDR images, for example in image-based methods in computer graphics.
TV sets with enhanced dynamic range and upscaling of existing SDR/LDR video/broadcast content with reverse tone mapping have been anticipated since the early 2000s. In 2016, HDR conversion of SDR video was released to market as Samsung's HDR+ (in LCD TV sets) and Technicolor SA's HDR Intelligent Tone Management.
As of 2018, high-end consumer-grade HDR displays can achieve 1,000 cd/m2 of luminance, at least for a short duration or over a small portion of the screen, compared to 250-300 cd/m2 for a typical SDR display.
The Academy Color Encoding System (ACES) was created by the Academy of Motion Picture Arts and Sciences and released in December 2014. ACES is a complete color and file management system that works with almost any professional workflow, and it supports both HDR and wide color gamut (WCG). More information can be found at https://www.ACESCentral.com.
Video interfaces that support at least one HDR format include HDMI 2.0a, which was released in April 2015, and DisplayPort 1.4, which was released in March 2016. On December 12, 2016, HDMI announced that Hybrid Log-Gamma (HLG) support had been added to the HDMI 2.0b standard. HDMI 2.1 was officially announced on January 4, 2017, and added support for Dynamic HDR, which is dynamic metadata that supports scene-by-scene or frame-by-frame changes.
The Society of Motion Picture and Television Engineers (SMPTE) created a standard for dynamic metadata: SMPTE ST 2094 or Dynamic Metadata for Color Volume Transform (DMCVT). SMPTE ST 2094 was published in 2016 as six parts and includes four applications from Dolby, Philips, Samsung, and Technicolor.
Perceptual Quantizer (PQ), published by SMPTE as SMPTE ST 2084, is a transfer function that allows for the display of high dynamic range (HDR) video with a luminance level of up to 10,000 cd/m2 and can be used with the Rec. 2020 color space. PQ is a non-linear electro-optical transfer function (EOTF). On April 18, 2016, the Ultra HD Forum announced industry guidelines for UHD Phase A, which uses Hybrid Log-Gamma (HLG) and PQ transfer functions with a bit depth of 10-bits and the Rec. 2020 color space. On July 6, 2016, the ITU announced Rec. 2100, which uses HLG or PQ as transfer functions with a Rec. 2020 color space.
The PQ inverse EOTF is as follows:

E′ = ((c₁ + c₂·Y^m₁) / (1 + c₃·Y^m₁))^m₂

where Y is the displayed luminance normalized to 10,000 cd/m2 (Y = FD/10,000), E′ is the resulting non-linear signal value in the range [0, 1], and the constants are m₁ = 2610/16384 ≈ 0.1593, m₂ = 2523/4096 × 128 = 78.84375, c₁ = 3424/4096 = 0.8359375, c₂ = 2413/4096 × 32 = 18.8515625, and c₃ = 2392/4096 × 32 = 18.6875.
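As a minimal sketch, the PQ inverse EOTF and its forward counterpart can be implemented directly from the ST 2084 constants (function names are illustrative, not part of the standard):

```python
import numpy as np

# PQ constants from SMPTE ST 2084, as exact rational values
M1 = 2610 / 16384        # ~0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_inverse_eotf(luminance_cd_m2):
    """Map absolute display luminance (0..10,000 cd/m2) to a PQ signal in [0, 1]."""
    y = np.clip(np.asarray(luminance_cd_m2, dtype=float) / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def pq_eotf(signal):
    """Map a PQ signal in [0, 1] back to absolute display luminance in cd/m2."""
    e = np.clip(np.asarray(signal, dtype=float), 0.0, 1.0) ** (1.0 / M2)
    y = np.maximum(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1.0 / M1)
```

With these constants, a 100 cd/m2 input (typical SDR reference white) maps to a code value of roughly 0.51, illustrating how PQ concentrates precision in the darker part of the 0-10,000 cd/m2 range.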
The DisplayHDR standard from VESA is an attempt to make the differences in HDR specifications easier for consumers to understand. VESA defines a set of HDR levels.
| Tier | Minimum Peak Luminance (cd/m2) | Range of Color (Color Gamut) | Typical Dimming Technology | Maximum Black Level Luminance (cd/m2) | Maximum Backlight Adjustment Latency (Video Frames) |
|---|---|---|---|---|---|
| DisplayHDR 400 True Black | 400 | WCG* | Pixel-level | 0.0005 | 2 |
| DisplayHDR 500 True Black | 500 | WCG* | Pixel-level | 0.0005 | 2 |

*Wide Color Gamut
HDR10 Media Profile, more commonly known as HDR10, was announced on August 27, 2015, by the Consumer Technology Association and uses the wide-gamut Rec. 2020 color space, a bit depth of 10 bits, and the SMPTE ST 2084 (PQ) transfer function, a combination later also standardized in ITU-R BT.2100. It also uses SMPTE ST 2086 "Mastering Display Color Volume" static metadata to send color calibration data of the mastering display, as well as the static values MaxFALL (Maximum Frame Average Light Level) and MaxCLL (Maximum Content Light Level), encoded as SEI messages within the video stream. HDR10 is an open standard supported by a wide variety of companies, including monitor and TV manufacturers such as Dell, LG, Samsung, Sharp, VU, Sony, and Vizio, as well as Sony Interactive Entertainment, Microsoft, and Apple, which support HDR10 on their PlayStation 4, Xbox One, and Apple TV platforms, respectively. HDR10 most closely resembles VESA's DisplayHDR 1000.
HDR10+, also known as HDR10 Plus, was announced on April 20, 2017, by Samsung and Amazon Video. HDR10+ updates HDR10 by adding dynamic metadata that can be used to more accurately adjust brightness levels on a scene-by-scene or frame-by-frame basis. This function is based on SMPTE ST 2094-40 (Samsung's Application #4). HDR10+ is an open, royalty-free standard; it is supported by Colorfront's Transkoder and MulticoreWare's x265. A certification and logo program for HDR10+ device manufacturers will be made available with an annual administration fee and no per-unit royalty. An authorized test center conducts a certification program for HDR10+ devices.
On August 28, 2017, Samsung, Panasonic, and 20th Century Fox created the HDR10+ Alliance to promote the HDR10+ standard. HDR10+ video started being offered by Amazon Video on December 13, 2017. On January 5, 2018, Warner Bros. announced their support for the HDR10+ standard. On January 6, 2018, Panasonic announced Ultra HD Blu-ray players with support for HDR10+. On April 4, 2019, Universal Pictures Home Entertainment announced a technology collaboration with Samsung Electronics to release new titles mastered with HDR10+.
Dolby Vision is an HDR format from Dolby Laboratories that can be optionally supported by Ultra HD Blu-ray discs and streaming video services. Dolby Vision is a proprietary format, and Dolby SVP of Business Giles Baker has stated that the royalty cost for Dolby Vision is less than $3 per TV. Dolby Vision includes the Perceptual Quantizer (SMPTE ST 2084) electro-optical transfer function, up to 4K resolution, and a wide-gamut color space (ITU-R Rec. BT.2020 or ICtCp). Some Dolby Vision profiles allow for 12-bit color depth (as of 2018, only professional reference monitors and some projectors have this capability) and 10,000 cd/m2 maximum brightness (as of 2018, according to the Dolby Vision white paper, professional reference monitors, such as the Dolby Vision HDR reference monitor, are limited to 4,000 cd/m2 of peak brightness). It can encode mastering display colorimetry information using static metadata (SMPTE ST 2086) but also provide dynamic metadata (SMPTE ST 2094-10, Dolby format) for each scene. Makers of Ultra HD (UHD) TVs that support Dolby Vision include LG, TCL, VU, Sony, and Vizio. MulticoreWare's x265 encoder supports Dolby Vision as of version 3.0.
Hybrid Log-Gamma (HLG) is a royalty-free HDR standard jointly developed by the BBC and NHK. HLG is designed to be better suited for television broadcasting, where the metadata required for other HDR formats is not backward compatible with non-HDR displays, consumes additional bandwidth, and may also become out-of-sync or damaged in transmission. HLG defines a non-linear opto-electronic transfer function (OETF), in which the lower half of the signal values use a gamma curve and the upper half of the signal values use a logarithmic curve. In practice, the signal is interpreted as normal by standard-dynamic-range displays (albeit capable of displaying more detail in highlights), but HLG-compatible displays can correctly interpret the logarithmic portion of the signal curve to provide a wider dynamic range.
HLG is defined in ATSC 3.0, Digital Video Broadcasting (DVB) UHD-1 Phase 2, and International Telecommunication Union (ITU) Rec. 2100. HLG is supported by HDMI 2.0b, HEVC, VP9, and H.264/MPEG-4 AVC. HLG is supported by video services such as the BBC iPlayer, DirecTV, Freeview Play, and YouTube.
SL-HDR1 is an HDR standard that was jointly developed by STMicroelectronics, Philips International B.V., and Technicolor R&D France. It was standardized as ETSI TS 103 433 in August 2016. SL-HDR1 provides direct backwards compatibility by using static metadata (SMPTE ST 2086) and dynamic metadata (using the SMPTE ST 2094-20 Philips and ST 2094-30 Technicolor formats) to reconstruct an HDR signal from an SDR video stream that can be delivered using SDR distribution networks and services already in place. SL-HDR1 allows for HDR rendering on HDR devices and SDR rendering on SDR devices using a single-layer video stream. The HDR reconstruction metadata can be added to either HEVC or AVC using a supplemental enhancement information (SEI) message.
Rec. 2100 is a technical recommendation by ITU-R for production and distribution of HDR content using 1080p or UHD resolution, 10-bit or 12-bit color, HLG or PQ transfer functions, and wide color gamut using the Rec. 2020 or ICtCp color space.
UHD Phase A is a set of guidelines from the Ultra HD Forum for distribution of SDR and HDR content using Full HD 1080p and 4K UHD resolutions. It requires a color depth of 10 bits per sample, a color gamut of Rec. 709 or Rec. 2020, a frame rate of up to 60 fps, a display resolution of 1080p or 2160p, and either standard dynamic range (SDR) or high dynamic range using the Hybrid Log-Gamma (HLG) or Perceptual Quantizer (PQ) transfer functions. UHD Phase A defines HDR as having a dynamic range of at least 13 stops (2^13 = 8192:1) and WCG as a color gamut wider than Rec. 709. UHD Phase A consumer devices are compatible with HDR10 requirements and can process Rec. 2020 color space and HLG or PQ at 10 bits.
For consumer displays that have limited color volume (i.e., that do not provide the peak brightness/contrast and color gamut required by the standards), SMPTE defines metadata for describing the scenes as they appear on the mastering display. SMPTE ST 2086 "Mastering Display Color Volume Metadata Supporting High Luminance and Wide Color Gamut Images" describes static data such as MaxFALL (Maximum Frame Average Light Level) and MaxCLL (Maximum Content Light Level). SMPTE ST 2094 "Content-Dependent Metadata for Color Volume Transformation of High Luminance and Wide Color Gamut Images" includes dynamic metadata that can change from scene to scene. This includes ST 2094-10 (Dolby Vision format), Colour Volume Reconstruction Information (CVRI) in ST 2094-20 (Philips format), Colour Remapping Information (CRI) in ST 2094-30 (Technicolor format), and HDR10+ in ST 2094-40 (Samsung format).
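In simplified form, MaxCLL and MaxFALL can be computed from the content itself as sketched below. This assumes per-pixel luminance values are already available as arrays; in practice, encoders derive the per-pixel light level from the maximum of the linear R, G, and B components, and the function name here is illustrative:

```python
import numpy as np

def content_light_levels(frames):
    """Compute (MaxCLL, MaxFALL) for a sequence of video frames.

    `frames` is an iterable of 2-D arrays of per-pixel light level in cd/m2.
    MaxCLL  = the brightest single pixel anywhere in the content.
    MaxFALL = the highest per-frame average light level in the content.
    """
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        frame = np.asarray(frame, dtype=float)
        max_cll = max(max_cll, float(frame.max()))    # brightest pixel so far
        max_fall = max(max_fall, float(frame.mean())) # brightest frame average so far
    return max_cll, max_fall
```

Because these are single values for the whole program, they are carried as static metadata; the ST 2094 family refines the same idea with per-scene or per-frame (dynamic) values.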
The first version of the HEVC specification includes the Main 10 profile, which supports 10 bits per sample.
On April 8, 2015, The HDMI Forum released version 2.0a of the HDMI Specification to enable transmission of HDR. The Specification references CEA-861.3, which in turn references the Perceptual Quantizer (PQ), which was standardized as SMPTE ST 2084. The previous HDMI 2.0 version already supported the Rec. 2020 color space.
On November 17, 2016, the Digital Video Broadcasting (DVB) Steering Board approved UHD-1 Phase 2 with a HDR solution that supports Hybrid Log-Gamma (HLG) and Perceptual Quantizer (PQ). The specification has been published as DVB Bluebook A157 and will be published by the ETSI as TS 101 154 v2.3.1.
On January 2, 2017, LG Electronics USA announced that all of LG's SUPER UHD TV models now support a variety of HDR technologies, including Dolby Vision, HDR10, and HLG (Hybrid Log Gamma), and are ready to support Advanced HDR by Technicolor.