High Dynamic Range (HDR) is a video and image technology that improves the way light is represented. It overcomes the limits of the older standard format, known as Standard Dynamic Range (SDR). HDR can represent substantially brighter highlights, darker shadows, more detail at both ends of the luminance range, and more saturated colors than was previously possible. HDR does not increase a display's capabilities; rather, it makes better use of displays that have high brightness, contrast, and color capabilities. Not all HDR displays have the same capabilities, so HDR content will look different depending on the display used. HDR first emerged for video in 2014. HDR10, HDR10+, Dolby Vision and HLG are common HDR formats. Some still-picture formats also support HDR.
Earlier technologies improved image quality by increasing the quantity of pixels (resolution and frame rate). HDR improves the quality of each pixel.[1]
While CRTs are no longer used and modern displays are often far more capable, the SDR format is still based on, and limited to, the characteristics of CRTs. HDR overcomes those limits.[2]
The maximum luminance level that can be represented is limited to around 100 nits for SDR, while HDR formats can represent at least 1,000 nits and up to 10,000 nits, depending on the format used.[2][3] HDR also allows for lower black levels[4] and more saturated colors.[2] The most common SDR format is limited to the Rec. 709/sRGB gamut, while common HDR formats use Rec. 2100 color primaries, which is a Wide Color Gamut (WCG).[2][5]
Those are the technical limits of HDR formats. HDR content is often limited to a peak brightness of 1,000 or 4,000 nits and DCI-P3 colors, even when stored in a more capable format.[6][7] Display capabilities vary, and no current display can reproduce the full range of brightness and color that HDR formats can store.
HDR video involves capture, production, content/encoding, and display.
Main benefits:
Other benefits:
The increased maximum brightness can be used to brighten small areas without raising the overall brightness of the image, producing, for example, bright reflections off shiny objects, bright stars in a dark night scene, or bright and colorful fire, sunsets, and light-emitting objects.[8][9][10] Content creators can choose how they use the increased capabilities. They can also choose to stay within the limits of SDR even when the content is delivered in an HDR format.[10]
For creative intent to be preserved, video formats require content to be displayed and viewed according to the standards.
HDR formats that use dynamic metadata (such as Dolby Vision and HDR10+) can be viewed on any compatible display; the metadata allow the image to be adjusted to the display according to the creative intent.[5] Other HDR formats (such as HDR10 and HLG) require the consumer display to have at least the same capability as the mastering display. Creative intent is not guaranteed to be preserved on less capable displays.[11]
For optimal quality, content should be viewed in a relatively dark environment.[12][13] Dolby Vision IQ and HDR10+ Adaptive adjust the content according to the ambient light.[14][15]
The HDR capture technique used for years in photography increases the dynamic range captured by cameras. Such photos previously had to be tone mapped to SDR; they can now be saved in an HDR format (such as HDR10) and displayed across a wider range of brightness.
Earlier high-dynamic-range formats, such as raw and logarithmic formats, were intended only for storage. Before reaching the consumer, they had to be converted to SDR (i.e. a gamma curve). They can now also be converted into an HDR format (such as HDR10) and displayed with a wider range of brightness and color.
Since 2014, multiple HDR formats have emerged, including HDR10, HDR10+, Dolby Vision and HLG.[5][16] Some formats are royalty-free, others require a license, and some are more capable than others.
Dolby Vision and HDR10+ include dynamic metadata, while HDR10 and HLG do not.[5] Dynamic metadata are used to improve image quality on limited displays that cannot reproduce an HDR video as it was created and delivered, and they let content creators control how the image is adjusted.[17] When a less capable display is used and dynamic metadata are not available, the result depends on choices made by the display, and artistic intent might not be preserved.
HDR10 Media Profile, more commonly known as HDR10, is an open HDR standard announced on 27 August 2015 by the Consumer Technology Association.[18] It is the most widespread of the HDR formats.[19]
HDR10 is technically limited to a maximum of 10,000 nits peak brightness; however, HDR10 content is commonly mastered with a peak brightness of 1,000 to 4,000 nits.[6]
HDR10 is not backward compatible with SDR displays.
HDR10 lacks dynamic metadata.[20] On HDR10 displays that have a lower color volume than the HDR10 content (for example, a lower peak brightness capability), the HDR10 metadata provide information to help adjust the content.[5] However, the metadata are static (they remain the same for the entire video) and do not say how the content should be adjusted, so the decision is left to the display and the creative intent might not be preserved.[11]
Dolby Vision is an end-to-end ecosystem for HDR video, covering content creation, distribution and playback.[21] It is a proprietary solution from Dolby Laboratories that emerged in 2014.[22] It uses dynamic metadata and is technically capable of representing luminance levels up to 10,000 nits.[5] Dolby Vision requires the content creator's display to have a peak brightness of at least 1,000 nits.[7]
HDR10+, also known as HDR10 Plus, is an HDR video format announced on 20 April 2017.[23] It is the same as HDR10 but with dynamic metadata developed by Samsung added to it.[24][25][26] It is free for content creators, with a maximum annual license fee of $10,000 for some manufacturers.[27] It aims to be an alternative to Dolby Vision without the fees.[19]
HLG10, commonly referred to as HLG, is a video format that uses the HLG transfer function, a bit depth of 10 bits, and the wide-gamut Rec. 2020 color space.[28] The HLG transfer function is backward compatible with SDR video,[29][30][31] but the Rec. 2020 color space is not compatible with the SDR color space (Rec. 709).
| | HDR10 | HDR10+ | Dolby Vision | HLG10 |
|---|---|---|---|---|
| Developed by | CTA | Samsung | Dolby | NHK and BBC |
| Year | 2015 | 2017 | 2014 | 2015 |
| Cost | Free | Free (for content companies); yearly license (for manufacturers)[38] | Proprietary | Free |
| Metadata | Static (SMPTE ST 2086, MaxFALL, MaxCLL) | Dynamic | Dynamic (Dolby Vision L0, L1, L2 trim, L8 trim) | None |
| Transfer function | PQ | PQ | PQ (in most profiles);[39] HLG (in profile 8.1)[39] | HLG |
| Bit depth | 10 bit | 10 bit (or more) | 10 bit or 12 bit | 10 bit |
| Peak luminance (technical limit) | 10,000 nits | 10,000 nits | 10,000 nits | Variable |
| Peak luminance (contents) | No rules; 1,000–4,000 nits common[6] | No rules; 1,000–4,000 nits common[6] | At least 1,000 nits;[40] 4,000 nits common[6] | 1,000 nits common[41][42] |
| Color primaries (technical limit) | Rec. 2020 | Rec. 2020 | Rec. 2020 | Rec. 2020 |
| Color primaries (contents) | DCI-P3 (common)[5] | DCI-P3 (common)[5] | At least DCI-P3[40] | DCI-P3 (common)[5] |
| Backward compatibility | None | HDR10 | Depends on the profile used[39] | SDR (partially; see notes) |
| Notes | The PQ10 format is the same as HDR10 without the metadata[32] | | Technical characteristics of Dolby Vision depend on the profile used, but all profiles support the same Dolby Vision dynamic metadata.[39] | On SDR displays that do not support Rec. 2020 color primaries (WCG), HLG content using Rec. 2020 color primaries will show a desaturated image with visible hue shifts.[32] |
| Sources | [5][6][43] | [5][6][44][45] | [5][6][39][40][46][47] | [5][32][41][42] |
Display devices capable of greater dynamic range have been researched for decades, primarily with flat panel technologies like plasma, SED/FED and OLED.
TV sets with enhanced dynamic range and upscaling of existing SDR/LDR video/broadcast content with reverse tone mapping have been anticipated since the early 2000s.[48][49] In 2016, HDR conversion of SDR video was released to market as Samsung's HDR+ (in LCD TV sets)[50] and Technicolor SA's HDR Intelligent Tone Management.[51]
As of 2018, high-end consumer-grade HDR displays can achieve 1,000 cd/m2 of luminance, at least for a short duration or over a small portion of the screen, compared to 250-300 cd/m2 for a typical SDR display.[52]
Video interfaces that support at least one HDR Format include HDMI 2.0a, which was released in April 2015 and DisplayPort 1.4, which was released in March 2016.[53][54] On 12 December 2016, HDMI announced that Hybrid Log-Gamma (HLG) support had been added to the HDMI 2.0b standard.[55][56][57] HDMI 2.1 was officially announced on 4 January 2017, and added support for Dynamic HDR, which is dynamic metadata that supports changes scene-by-scene or frame-by-frame.[58][59]
As of 2020, no display is capable of rendering the full range of brightness and color of HDR formats.[28] A display is called an HDR display if it can accept HDR content and map it to its own display characteristics.[28] Thus, an HDR logo only provides information about content compatibility, not display capability.
Certifications have been created to give consumers information about the rendering capability of a screen.
The DisplayHDR standard from VESA is an attempt to make the differences in HDR specifications easier to understand for consumers, with standards mainly used in computer monitors and laptops. VESA defines a set of HDR levels; all of them must support HDR10, but not all are required to support 10-bit displays.[60] DisplayHDR is not an HDR format, but a tool to verify HDR formats and their performance on a given monitor. The most recent standard is DisplayHDR 1400 which was introduced in September 2019, with monitors supporting it released in 2020.[61][62] DisplayHDR 1000 and DisplayHDR 1400 are primarily used in professional work like video editing. Monitors with DisplayHDR 500 or DisplayHDR 600 certification provide a noticeable improvement over SDR displays, and are more often used for general computing and gaming.[63]
| | Minimum peak luminance (cd/m2) | Range of color (gamut) | Typical dimming technology | Maximum black level luminance (cd/m2) | Maximum backlight adjustment latency (video frames) |
|---|---|---|---|---|---|
| DisplayHDR 400 | 400 | sRGB | Screen-level | 0.4 | 8 |
| DisplayHDR 500 | 500 | WCG* | Zone-level | 0.1 | 8 |
| DisplayHDR 600 | 600 | WCG* | Zone-level | 0.1 | 8 |
| DisplayHDR 1000 | 1000 | WCG* | Zone-level | 0.05 | 8 |
| DisplayHDR 1400 | 1400 | WCG* | Zone-level | 0.02 | 8 |
| DisplayHDR 400 True Black | 400 | WCG* | Pixel-level | 0.0005 | 2 |
| DisplayHDR 500 True Black | 500 | WCG* | Pixel-level | 0.0005 | 2 |

*Wide Color Gamut, at least 90% of DCI-P3 in specified volume (peak luminance)
UHD Alliance certifications:
HDR is mainly achieved through the use of the PQ or HLG transfer functions.[2][3] Wide Color Gamut (WCG) is also commonly used alongside HDR, typically with Rec. 2020 color primaries.[2] A bit depth of 10 or 12 bits is used to avoid visible banding across the extended brightness range. Additional metadata are sometimes used to handle the variety in display brightness, contrast and color. HDR video is defined in Rec. 2100.[3]
HDR transfer functions:
Both PQ and HLG are royalty-free.[78]
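As an illustration, the PQ curve (SMPTE ST 2084) can be sketched in a few lines of Python; the constants below come from the specification, and luminance is absolute, in nits:

```python
# SMPTE ST 2084 (PQ) constants, as defined in the specification
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(signal: float) -> float:
    """Decode a non-linear PQ signal in [0, 1] to absolute luminance in nits."""
    p = max(signal, 0.0) ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits: float) -> float:
    """Encode absolute luminance (0 to 10,000 nits) to a PQ signal in [0, 1]."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2
```

A full-scale signal of 1.0 corresponds to 10,000 nits, and SDR reference white (100 nits) encodes to a signal of roughly 0.508, which is why PQ leaves about half of the code range for highlights above SDR white.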
Static HDR metadata give information about the whole video.
Those metadata do not describe how the HDR content should be adapted to HDR consumer displays that have a lower color volume (i.e. peak brightness, contrast and color gamut) than the content.[11][80]
The values of MaxFALL and MaxCLL should be calculated from the video stream itself (excluding black borders for MaxFALL), based on how the scenes appear on the mastering display; it is not recommended to set them arbitrarily.[81]
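As a sketch of that calculation, assuming each frame is available as an array of per-pixel light levels in nits (per CTA-861.3, a pixel's light level is the maximum of its R, G and B components), the function name is illustrative:

```python
import numpy as np

def compute_maxcll_maxfall(frames):
    """Derive static HDR metadata from a sequence of frames.

    Each frame is a 2-D array of per-pixel light levels in nits, as
    rendered on the mastering display (max(R, G, B) per pixel).
    Black borders should be cropped out before computing MaxFALL.
    """
    max_cll = 0.0   # Maximum Content Light Level: brightest pixel anywhere
    max_fall = 0.0  # Maximum Frame-Average Light Level
    for frame in frames:
        max_cll = max(max_cll, float(frame.max()))
        max_fall = max(max_fall, float(frame.mean()))
    return max_cll, max_fall
```

A single very bright pixel raises MaxCLL but barely moves MaxFALL, which is why both values are carried: one bounds highlights, the other bounds the sustained load on the display.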
Dynamic metadata are specific to each frame or each scene of the video.
The dynamic metadata of Dolby Vision, HDR10+ and SMPTE ST 2094 describe what color volume transform should be applied to content shown on displays whose color volume differs from that of the mastering display. The transform is optimized for each scene and each display, allowing creative intent to be preserved even on consumer displays with a limited color volume.
SMPTE ST 2094, or Dynamic Metadata for Color Volume Transform (DMCVT), is a standard for dynamic metadata published by SMPTE in 2016 in six parts.[82] It is carried in HEVC SEI, ETSI TS 103 433 and CTA 861-G.[83] It includes four applications:
ETSI TS 103 572: a technical specification published in October 2020 by ETSI for HDR signaling and carriage of ST 2094-10 (Dolby Vision) metadata.[84]
HDR is commonly associated with a wide color gamut (a system chromaticity wider than BT.709). Rec. 2100 (HDR-TV) uses the same system chromaticity as Rec. 2020 (UHDTV).[86][87] HDR formats such as HDR10, HDR10+, Dolby Vision and HLG also use Rec. 2020 chromaticities.
HDR content is commonly graded on a DCI-P3 display[5][88] and then stored in an HDR format that uses Rec. 2020 color primaries.
Chromaticity coordinates (CIE 1931) of the primary colors and white points:

| Color space | xR | yR | xG | yG | xB | yB | White point | xW | yW |
|---|---|---|---|---|---|---|---|---|---|
| Rec. 709[85] / sRGB | 0.64 | 0.33 | 0.30 | 0.60 | 0.15 | 0.06 | D65 | 0.3127 | 0.3290 |
| DCI-P3[89][90] | 0.680 | 0.320 | 0.265 | 0.690 | 0.150 | 0.060 | P3-D65 (Display) | 0.3127 | 0.3290 |
| DCI-P3[89][90] | 0.680 | 0.320 | 0.265 | 0.690 | 0.150 | 0.060 | P3-DCI (Theater) | 0.314 | 0.351 |
| DCI-P3[89][90] | 0.680 | 0.320 | 0.265 | 0.690 | 0.150 | 0.060 | P3-D60 (ACES Cinema) | 0.32168 | 0.33767 |
| Rec. 2020[87] / Rec. 2100[86] | 0.708 | 0.292 | 0.170 | 0.797 | 0.131 | 0.046 | D65 | 0.3127 | 0.3290 |
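The primaries in this table are enough to derive color-space conversion matrices. As a minimal numpy sketch (the helper name is illustrative), an RGB-to-XYZ matrix can be built from the (x, y) primaries and the white point:

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build a 3x3 RGB->XYZ matrix from chromaticity coordinates.

    primaries: ((xR, yR), (xG, yG), (xB, yB)); white: (xW, yW).
    """
    # XYZ of each primary as a column, scaled so Y = 1 for now
    m = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]).T
    xw, yw = white
    w = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])  # white point XYZ (Y = 1)
    s = np.linalg.solve(m, w)  # per-channel scale so RGB=(1,1,1) maps to white
    return m * s

# Rec. 709 and Rec. 2020 share the D65 white point (0.3127, 0.3290)
D65 = (0.3127, 0.3290)
M709 = rgb_to_xyz_matrix(((0.64, 0.33), (0.30, 0.60), (0.15, 0.06)), D65)
M2020 = rgb_to_xyz_matrix(((0.708, 0.292), (0.170, 0.797), (0.131, 0.046)), D65)
```

Chaining one matrix with the inverse of the other (e.g. `np.linalg.inv(M2020) @ M709`) then converts colors between the two gamuts, which is the core of the container conversions described above.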
- Rec. 709 and sRGB (SDR): https://handwiki.org/wiki/index.php?curid=2016345
- DCI-P3 (common HDR content): https://handwiki.org/wiki/index.php?curid=2018095
- Rec. 2020 and Rec. 2100 (HDR technical limit): https://handwiki.org/wiki/index.php?curid=2046801
Because of the increased dynamic range, HDR content needs a higher bit depth than SDR to avoid banding. While SDR uses a bit depth of 8 or 10 bits,[85] HDR uses 10 or 12 bits.[86] This, combined with the use of a more efficient transfer function (i.e. PQ or HLG), is enough to avoid banding.[91][92]
Rec. 2100 specifies the use of the RGB, the YCbCr or the ICTCP signal formats for HDR-TV.[86]
ICTCP is a color representation designed by Dolby for HDR and wide color gamut (WCG)[93] and standardized in Rec. 2100.[86]
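As a sketch, the PQ variant of the ICTCP encoding can be written as follows; the matrix coefficients are those given in Rec. 2100, but treat this as an illustration rather than a reference implementation:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_oetf(y):
    """PQ inverse EOTF: normalized luminance [0, 1] -> signal [0, 1]."""
    yp = np.power(np.maximum(y, 0.0), M1)
    return np.power((C1 + C2 * yp) / (1 + C3 * yp), M2)

# Rec. 2100: linear Rec. 2020 RGB -> LMS (includes the crosstalk matrix)
RGB_TO_LMS = np.array([[1688, 2146,  262],
                       [ 683, 2951,  462],
                       [  99,  309, 3688]]) / 4096

# Rec. 2100: PQ-encoded L'M'S' -> ICtCp
LMS_TO_ICTCP = np.array([[ 2048,   2048,    0],
                         [ 6610, -13613, 7003],
                         [17933, -17390, -543]]) / 4096

def rgb_to_ictcp(rgb_linear):
    """Linear Rec. 2020 RGB (normalized, 1.0 = 10,000 nits) -> (I, Ct, Cp)."""
    lms = RGB_TO_LMS @ np.asarray(rgb_linear, dtype=float)
    return LMS_TO_ICTCP @ pq_oetf(lms)
```

By construction, an achromatic input (R = G = B) yields Ct = Cp = 0, with all of the signal carried in the intensity component I; that separation of intensity from chroma is what makes ICTCP attractive for HDR tone mapping.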
IPTPQc2 with reshaping is a proprietary format from Dolby, similar to ICTCP.[94] It is used by Dolby Vision profile 5.[94]
Some Dolby Vision profiles use a dual-layer video composed of a base layer and an enhancement layer.[94][95] Depending on the Dolby Vision profile, the base layer can be backward compatible with SDR, HDR10, HLG, Blu-ray, or no video format.[94]
ETSI GS CCM 001 describes a Compound Content Management functionality for a dual layer HDR system.[96]
Rec. 2100 is a technical recommendation by ITU-R for production and distribution of HDR content using 1080p or UHD resolution, 10-bit or 12-bit color, HLG or PQ transfer functions, the Rec. 2020 wide color gamut and YCBCR or ICTCP as color space.[12][97]
UHD Phase A are guidelines from the Ultra HD Forum for distribution of SDR and HDR content using Full HD 1080p and 4K UHD resolutions. They require a color depth of 10 bits per sample, a color gamut of Rec. 709 or Rec. 2020, a frame rate of up to 60 fps, a display resolution of 1080p or 2160p, and either standard dynamic range (SDR) or high dynamic range using the Hybrid Log-Gamma (HLG) or Perceptual Quantizer (PQ) transfer functions.[98] UHD Phase A defines HDR as having a dynamic range of at least 13 stops (2^13 = 8192:1) and WCG as a color gamut wider than Rec. 709.[98] UHD Phase A consumer devices are compatible with HDR10 requirements and can process the Rec. 2020 color space and HLG or PQ at 10 bits.
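The 13-stop criterion is simply a base-2 logarithm of the contrast ratio, since each stop is a doubling of luminance; as a quick illustration (the function name is ours):

```python
import math

def dynamic_range_in_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops: log2 of the contrast ratio."""
    return math.log2(peak_nits / black_nits)

# A contrast ratio of 8192:1 (e.g. 8192 nits peak over a 1-nit black level)
# is exactly 2**13, i.e. 13 stops.
```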
UHD Phase B will add support for 120 fps (and 120/1.001 fps), 12-bit PQ in HEVC Main 12 (which will be enough for 0.0001 to 10,000 nits), Dolby AC-4, MPEG-H 3D Audio, and IMAX sound in DTS:X (without LFE). It will also add ITU's ICTCP and Color Remapping Information (CRI).
The following image formats are compatible with HDR (Rec.2100 color space, PQ and HLG transfer functions, Rec.2100/Rec.2020 color primaries):
Panasonic: Panasonic's S-series cameras (including the Lumix S1, S1R, S1H and S5) can capture photos in HDR using the HLG transfer function and output them in an HSP file format.[76][102][103] The captured HDR pictures can be viewed in HDR by connecting the camera to an HLG-compliant display with an HDMI cable.[76][102]
Canon: EOS-1D X Mark III and EOS R5 are able to capture still images in the Rec.2100 color space by using the PQ transfer function, the HEIC format (HEVC codec in HEIF file format), the Rec. 2020 color primaries, a bit depth of 10 bit and a 4:2:2 YCbCr subsampling.[74][104][105][106][107] The captured HDR pictures can be viewed in HDR by connecting the camera to an HDR display with an HDMI cable.[107] Captured HDR pictures can also be converted to SDR JPEG (sRGB color space) and then viewed on any standard display.[107] Canon refers to those SDR pictures as "HDR PQ-like JPEG". Canon's Digital Photo Professional software is able to show the captured HDR pictures in HDR on HDR displays or in SDR on SDR displays.[107][108] It is also able to convert the HDR PQ to SDR sRGB JPEG.[109]
Sony: Sony α7S III and α1 cameras can capture HDR photos in the Rec.2100 color space with the HLG transfer function, the HEIF format, Rec. 2020 color primaries, a bit depth of 10 bit and a 4:2:2 or 4:2:0 subsampling.[77][110][111][112] The captured HDR pictures can be viewed in HDR by connecting the camera to an HLG-compliant display with an HDMI cable.[112]
Qualcomm: The Snapdragon 888 mobile SoC allows the capture of 10-bit HDR HEIF still photos.[113][114]
In February and April 1990, Georges Cornuéjols introduced the first real-time HDR camera, combining two successively[115] or simultaneously[116] captured images.
In 1991, Hymatom, licensee of Cornuéjols, introduced the first commercial video camera that captured multiple images with different exposures in real time and produced an HDR video image from consumer-grade sensors.
Also in 1991, Cornuéjols introduced the principle of non-linear image accumulation, HDR+, to increase camera sensitivity:[117] in low-light environments, several successive images are accumulated, increasing the signal-to-noise ratio.
Later, in the early 2000s, several scholarly research efforts used consumer-grade sensors and cameras.[118] A few companies such as RED and Arri have been developing digital sensors capable of a higher dynamic range.[119][120] RED EPIC-X can capture time-sequential HDRx[121] images with a user-selectable 1–3 stops of additional highlight latitude in the "x" channel. The "x" channel can be merged with the normal channel in post production software. The Arri Alexa camera uses a dual gain architecture to generate an HDR image from two exposures captured at the same time.[122]
With the advent of low-cost consumer digital cameras, many amateurs began posting tone mapped HDR time-lapse videos on the Internet, essentially a sequence of still photographs in quick succession. In 2010, the independent studio Soviet Montage produced an example of HDR video from disparately exposed video streams using a beam splitter and consumer grade HD video cameras.[123] Similar methods have been described in the academic literature in 2001 and 2007.[124][125]
Modern movies have often been filmed with cameras featuring a higher dynamic range, and legacy movies can be converted, even if manual intervention is needed for some frames (as when black-and-white films are converted to color). Also, special effects, especially those that mix real and synthetic footage, require both HDR shooting and rendering. HDR video is also needed in applications that demand high accuracy when capturing temporal changes in a scene. This is important in the monitoring of some industrial processes such as welding, in predictive driver assistance systems in the automotive industry, and in surveillance video systems, among other applications. HDR video can also be used to speed up image acquisition in applications that need a large number of static HDR images, for example in image-based methods in computer graphics.
OpenEXR was created in 1999 by Industrial Light & Magic (ILM) and released in 2003 as an open source software library.[126][127] OpenEXR is used for film and television production.[127]
Academy Color Encoding System (ACES) was created by the Academy of Motion Picture Arts and Sciences and released in December 2014.[128] ACES is a complete color and file management system that works with almost any professional workflow, and it supports both HDR and wide color gamut (WCG). More information can be found at https://www.ACESCentral.com.[128]
In January 2014, Dolby Laboratories announced Dolby Vision.[16]
The first version of the HEVC specification includes the Main 10 profile, which supports 10 bits per sample.[129]
On 8 April 2015, The HDMI Forum released version 2.0a of the HDMI Specification to enable transmission of HDR. The Specification references CEA-861.3, which in turn references the Perceptual Quantizer (PQ), which was standardized as SMPTE ST 2084.[53] The previous HDMI 2.0 version already supported the Rec. 2020 color space.[130]
On 24 June 2015, Amazon Video was the first streaming service to offer HDR video using HDR10 Media Profile video.[131][132]
On 27 August 2015, Consumer Technology Association announced HDR10.[18]
On 17 November 2015, Vudu announced that they had started offering titles in Dolby Vision.[133]
On 1 March 2016, the Blu-ray Disc Association released Ultra HD Blu-ray with mandatory support for HDR10 Media Profile video and optional support for Dolby Vision.[134]
On 9 April 2016, Netflix started offering both HDR10 Media Profile video and Dolby Vision.[135]
On 6 July 2016, the International Telecommunication Union (ITU) announced Rec. 2100 that defines two HDR transfer functions—HLG and PQ.[12][97]
On 29 July 2016, SKY Perfect JSAT Group announced that on 4 October it would start the world's first 4K HDR broadcasts using HLG.[136]
On 9 September 2016, Google announced Android TV 7.0, which supports Dolby Vision, HDR10, and HLG.[137][138]
On 26 September 2016, Roku announced that the Roku Premiere+ and Roku Ultra will support HDR using HDR10.[139]
On 7 November 2016, Google announced that YouTube would stream HDR videos that can be encoded with HLG or PQ.[140][141]
On 17 November 2016, the Digital Video Broadcasting (DVB) Steering Board approved UHD-1 Phase 2 with a HDR solution that supports Hybrid Log-Gamma (HLG) and Perceptual Quantizer (PQ).[142][143] The specification has been published as DVB Bluebook A157 and will be published by the ETSI as TS 101 154 v2.3.1.[142][143]
On 2 January 2017, LG Electronics USA announced that all of LG's SUPER UHD TV models now support a variety of HDR technologies, including Dolby Vision, HDR10, and HLG (Hybrid Log Gamma), and are ready to support Advanced HDR by Technicolor.
On 20 April 2017, Samsung and Amazon announced HDR10+.[23]
On 12 September 2017, Apple announced the Apple TV 4K with support for HDR10 and Dolby Vision, and that the iTunes Store would sell and rent 4K HDR content.[144]
On 13 October 2020, Apple announced the iPhone 12 and iPhone 12 Pro series, the first smartphones that can record and edit video in Dolby Vision directly in the camera roll,[145] on a frame-by-frame basis. The iPhone uses the HLG-compatible profile 8 of Dolby Vision[146] with only the L1 trim.
The content is sourced from: https://handwiki.org/wiki/Engineering:High_Dynamic_Range_(display_and_formats)