HDR - Computer Definition
(1) (High Data Rate) A CDMA 3G technology from Qualcomm. See Qualcomm HDR.
(2) (High Dynamic Range for TV) A high contrast ratio between the brightest whites and darkest blacks on screen. OLED TVs are known for their high contrast ratio; however, backlight techniques employed on LCD TVs can create a higher dynamic range. For example, a common method is "local dimming," in which the backlight behind dark areas of the image is reduced in real time. Sony's X-tended Dynamic Range Pro dynamically reallocates the unused power from the LEDs behind dark areas to the LEDs behind bright areas.

HDR10 Vs. Dolby Vision
HDR10 is part of the Ultra HD Blu-ray (4K Blu-ray) standard. HDR10 supports 10-bit pixels, which means 1,024 shades for each red, green and blue subpixel (more than a billion colors). Dolby Vision films are mastered in 12-bit color (4,096 shades per subpixel and roughly 68 billion colors), and high-end Dolby Vision TVs support HDR10 as well. In addition, HDR10 content is mastered at up to 1,000 nits of brightness, whereas Dolby Vision supports up to 10,000 nits, although most displays cannot handle more than 4,000.

Both HDR10 and Dolby Vision use metadata that HDR-capable sets interpret and non-HDR TVs ignore. However, HDR10 metadata is static and applies to the entire program, whereas Dolby Vision metadata is dynamic and can change from scene to scene. See contrast ratio, LED TV, Dolby HDR, OLED, binary values and nit.
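The shade and color counts above follow directly from the bit depth per subpixel; a minimal sketch of the arithmetic (the function names here are illustrative, not part of any HDR standard):

```python
# Shades per channel and total colors for a given bit depth per subpixel.
def shades_per_channel(bits: int) -> int:
    # An n-bit value can represent 2**n distinct levels.
    return 2 ** bits

def total_colors(bits: int) -> int:
    # Three independent subpixels (red, green, blue) per pixel.
    return shades_per_channel(bits) ** 3

# HDR10: 10 bits per subpixel -> 1,024 shades, ~1.07 billion colors.
print(shades_per_channel(10), total_colors(10))
# Dolby Vision: 12 bits per subpixel -> 4,096 shades, ~68.7 billion colors.
print(shades_per_channel(12), total_colors(12))
```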
(3) (High Dynamic Range for photos) A photographic technique in which several shots are taken of the same high-contrast scene at different exposures. Highlights from the underexposed images and shadows from the overexposed frames are blended together to create a more natural look. See bracketing.
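The blending step can be sketched as a toy exposure fusion: each pixel from each frame is weighted by how close it is to mid-gray, so highlight detail is drawn from the darker frames and shadow detail from the brighter ones. This is a simplified illustration, not the algorithm any particular camera or editor uses:

```python
import numpy as np

def fuse_exposures(images: list[np.ndarray]) -> np.ndarray:
    """Blend a bracketed exposure stack (float arrays in [0, 1], same shape).

    Pixels near mid-gray (0.5) receive high weight via a Gaussian
    "well-exposedness" score, so blown highlights and crushed shadows
    contribute little to the result.
    """
    weights = [np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2)) for img in images]
    total = np.sum(weights, axis=0) + 1e-12  # guard against divide-by-zero
    return np.sum([w * img for w, img in zip(weights, images)], axis=0) / total
```

A real pipeline would first align the frames and apply tone mapping, but the weighted average captures the core idea of combining differently exposed shots.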