HDR and the industry that cried “wolf”
A related, but smaller, color triangle has made its way into the UHD Alliance spec. To earn the Alliance's "Ultra HD Premium" sticker, a display must be able to reproduce at least 90 percent of the DCI-P3 color space.
Our old TV content was created with 100 Nits of brightness in mind, which is what our old CRT reference monitors were capable of putting out. In HDR, content will be created on mastering monitors capable of putting out 4,000 to 10,000 Nits.
Indeed, that's bloody dynamic!!!
A semi-new term you don't often read about is "bit depth." Everything we watch now is termed "8-bit," which defines (and limits) how many shades of each color we can see. 8-bit ITU TV started way back in 1982 with a range of 0 to 255, of which a mere 220 shades of any given color are actually usable. HDR (10-bit) starts with 1,024 shades. The combination of a wider color gamut and a greater bit depth is an extremely significant and easily visible improvement in color.
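For the number-minded, here is a rough sketch of where those shade counts come from. It assumes the standard video "legal" ranges (16 to 235 for 8-bit, 64 to 940 for 10-bit); the function name is purely for illustration, and the exact figures depend on whether a system uses the full or the video range.

```python
# A minimal sketch of the "shades per color" arithmetic, assuming the
# standard video legal ranges rather than the full code range.

def usable_levels(low: int, high: int) -> int:
    """Count the usable code values per color channel within a legal range."""
    return high - low + 1

eight_bit = usable_levels(16, 235)       # 220 shades per channel (8-bit video range)
ten_bit_full = 2 ** 10                   # 1,024 total code values at 10 bits
ten_bit_video = usable_levels(64, 940)   # 877 shades per channel (10-bit video range)

print(f"8-bit video range:  {eight_bit} shades per channel")
print(f"10-bit code values: {ten_bit_full}")
print(f"10-bit video range: {ten_bit_video} shades per channel")
```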
Lastly, you may not have heard of EOTF. It IS NOT a social disease. It stands for Electro-Optical Transfer Function, essentially the new name for what we used to call gamma. But now we have a special new "gamma" for HDR called the SMPTE ST 2084 EOTF (better known as PQ, the Perceptual Quantizer), which is prominent in nearly all of the various HDR proposals.
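For the curious, here is a small sketch of what that transfer function actually does: it maps a normalized signal value (think of a 10-bit code value scaled to 0.0 through 1.0) to an absolute brightness in Nits. The constants are the ones published in SMPTE ST 2084; the function name and the sample values printed at the end are just for illustration.

```python
# A minimal sketch of the SMPTE ST 2084 (PQ) EOTF: normalized signal in,
# absolute luminance in Nits (cd/m^2) out. Constants are from the standard.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK_NITS = 10000.0        # PQ is defined all the way up to 10,000 Nits

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal value (0..1) to luminance in Nits."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return PEAK_NITS * y

# A signal of 0.5 lands at roughly 92 Nits: most of the code values are
# spent on the dark end of the range, where our eyes are most sensitive.
if __name__ == "__main__":
    for v in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"signal {v:.2f} -> {pq_eotf(v):8.2f} Nits")
```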
Can't leave the HDR topic without stating the core specifications that really are the essence of the term. Two sets of numbers are cited to accommodate two different display technologies. LED/LCD displays need to be capable of more than 1,000 Nits peak brightness and less than 0.05 Nits black level. For OLED displays, it's more than 540 Nits peak brightness and less than 0.0005 Nits black level. This writer will opt for the blackest blacks. However, if a room has really bright ambient light, you may opt for the brightest whites.
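To see why the blackest blacks win on paper, divide each peak brightness by its black level to get the simple contrast ratio each path guarantees. This is just back-of-the-envelope arithmetic on the thresholds quoted above; the function name is illustrative.

```python
# A quick comparison of the two Ultra HD Premium paths by the contrast
# ratio implied by their brightness and black-level thresholds.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Return peak brightness divided by black level (both in Nits)."""
    return peak_nits / black_nits

lcd = contrast_ratio(1000, 0.05)      # LED/LCD path
oled = contrast_ratio(540, 0.0005)    # OLED path

print(f"LED/LCD minimum: {lcd:>12,.0f}:1")   # 20,000:1
print(f"OLED minimum:    {oled:>12,.0f}:1")  # 1,080,000:1
```

The OLED path guarantees a contrast ratio more than fifty times greater, which is why dark-room viewers tend to favor it, while a bright living room rewards the LED/LCD's raw peak brightness instead.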