You’re shopping for a new monitor or laptop, scanning the spec sheet. Resolution? Check. Refresh rate? Check. Then you notice a growing chorus of models touting HDR. If you’re wondering what HDR is, how it works, and what it actually gets you, you’re in the right place. In this article, we give you the low-down on those three magic letters.
A quick explanation of HDR
HDR stands for “High Dynamic Range,” a label for the technologies used in image and video content, display panels, and graphics rendering that increase the difference between the maximum and minimum levels of light and color.
That difference, a.k.a. the dynamic range, is how many times higher the maximum value is compared to the minimum. For displays, think in terms of contrast: max brightness divided by min brightness is the monitor’s contrast ratio.
Why it matters: the human visual system can perceive a very large contrast range under the right conditions. HDR systems preserve fine detail that SDR can crush in shadows or blow out in highlights, and they expand the depth and range of colors. Static photos, films, and rendered graphics all look better on a high-quality HDR display.
There’s a cost angle, too. HDR-capable gear is often pricier than SDR (Standard Dynamic Range) because it leans on more complex components and moves more data through the pipeline. With that context, let’s dig into how it all works.
Understanding digital color
Computers and our favorite tech gadgets create images on screens by manipulating the values of color channels. Each channel represents one primary color – red, green, or blue – and every other color is made by combining these three, using what is known as the RGB color model.
This is a mathematical system that simply adds these three color values together, but by itself, it’s not much use. The model needs further information about how the colors are to be interpreted, to account for aspects such as the way the human visual system works, and the result is known as a color space. There’s a whole host of different combinations, but some of the most well-known color spaces are DCI-P3, Adobe RGB, and sRGB.
No such space can cover every color discernible by the human eye, and the set of colors a given space can represent is called its gamut. Gamuts are often displayed in what’s known as a CIE xy chromaticity diagram:
The gamuts of Adobe RGB vs sRGB
In our monitor reviews, you’ll always see references to these gamuts, with measurements of how much of the gamut gets covered by the display. But since color models, spaces, and gamuts are essentially all just a bunch of math, various systems are needed to convert the numbers into a correct physical representation of the image.
Going through all of those would require another article, but one of the most important ones is called the electro-optical transfer function (EOTF). This is a mathematical process that translates the electrical signals of a digital image or video into displayed colors and brightness levels.
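To make that less abstract, here’s a minimal Python sketch of one familiar transfer function – the piecewise sRGB EOTF – converting a normalized code value into linear light. It’s purely illustrative: real display pipelines involve far more than this one formula.

```python
def srgb_eotf(code_value: float) -> float:
    """Convert a normalized sRGB code value (0.0 to 1.0) into linear light.

    The curve is close to a simple 2.2 gamma, but the official definition
    is piecewise: a short linear segment near black, then a power function.
    """
    if code_value <= 0.04045:
        return code_value / 12.92
    return ((code_value + 0.055) / 1.055) ** 2.4

# A mid-gray code value of 128/255 comes out at roughly 21.6% of the
# display's maximum luminance, not 50% -- which is exactly the kind of
# perceptual adjustment an EOTF encodes.
print(srgb_eotf(128 / 255))   # ~0.216
```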
Most of us never need to worry about any of this – we just plug in our monitors and TVs, watch movies, and play games without giving it much thought. Professional content creators, however, will devote plenty of time to calibrating their devices, to ensure that images and the like are displayed as accurately as possible.
The importance of color depth
The color channels that are processed in these models can be represented in a variety of ways, such as a normalized range of [0, 1] or a percentage value. However, since the devices handling all the math use binary numbers, color values are stored in this format, too.
The size of digital data is measured in bits and the amount used is often referred to as the color depth. The more bits that are used, the greater the number of different colors that can be created. The minimum standard these days is to use 8 bits for each channel, and you may sometimes see this written as R8G8B8 or just 888. A single bit provides two values (0 and 1), two bits results in 2 x 2 = 4 values, and so 8 bits give 2 x 2 x ... (8 times) = 256 values.
Multiply these together, 256 x 256 x 256, and you get 16,777,216 possible combinations of RGB. That might seem like an impossibly large number of colors, far more than you would ever need, and for the most part, it is! Hence, why this is the industry norm.
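If you want to reproduce that arithmetic for other bit depths, a few lines of Python will do it (a simple illustration, nothing more):

```python
def colors_for_bit_depth(bits_per_channel: int) -> tuple[int, int]:
    """Return (values per channel, total RGB combinations) for a bit depth."""
    per_channel = 2 ** bits_per_channel      # e.g. 2^8 = 256
    return per_channel, per_channel ** 3     # three channels multiplied together

for bits in (8, 10, 12):
    per_channel, total = colors_for_bit_depth(bits)
    print(f"{bits}-bit: {per_channel:,} steps per channel, {total:,} colors")

# 8-bit:  256 steps per channel, 16,777,216 colors
# 10-bit: 1,024 steps per channel, 1,073,741,824 colors
# 12-bit: 4,096 steps per channel, 68,719,476,736 colors
```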
For the moment, let’s focus on a single color channel. Above, you can see how the red channel changes, depending on the number of bits used. Notice how much smoother 8 bits is compared to the others? Viewed like this, it seems perfectly okay, and there appears to be no need for anything more.
However, once you start blending colors together, 8 bits isn’t quite enough. Depending on the image being viewed, it’s easy to spot marked regions where the color seems to jump from one value to another. In the image below, the colors on the left side have been simulated using 2 bits per channel, whereas the right side is standard 8 bits.
Image credit: Microsoft
While the top of the right side doesn’t look too bad, a close examination of the bottom (especially near the ground) amply demonstrates the issue. Using a higher color depth would eradicate this problem, although you don’t need to go really big – 10 or 12 bits is more than enough, as even at 10 bits, there would be 1,024 steps in the gradient of a color channel. That’s four times as many steps as at 8 bits.
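You can get a feel for where that banding comes from by quantizing a smooth gradient yourself. The sketch below is a deliberate simplification – no dithering, no gamma handling – but it shows how few distinct steps survive at low bit depths:

```python
def quantize(value: float, bits: int) -> float:
    """Snap a normalized channel value (0.0 to 1.0) to the nearest level
    available at the given bit depth."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

# A smooth 0-to-1 gradient sampled at 9 points, crushed down to 2 bits:
gradient = [i / 8 for i in range(9)]
print([round(quantize(v, 2), 3) for v in gradient])
# [0.0, 0.0, 0.333, 0.333, 0.667, 0.667, 0.667, 1.0, 1.0]
```

Only four distinct values remain, which is why the 2-bit half of the image above breaks into such obvious bands.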
Using a larger color depth becomes even more important when using a color space with a very wide gamut. sRGB was developed by Hewlett Packard and Microsoft over 20 years ago, but is still perfectly suitable for the displays of today because they nearly all use 8 bits for color depth. However, something like Kodak’s ProPhoto RGB color space has a gamut so large that 16-bit color channels are required to avoid banding. Also note that although games render internally at very high precision, most HDR video formats and consumer delivery are mastered at 10-bit.
How displays make images
The majority of today’s computer monitors, TVs, and screens in tablets and phones all use one of two technologies to generate an image: liquid crystals that block light (a.k.a. LCD) or tiny diodes that emit light (LEDs). In some cases, it’s a combination of the two, using LEDs to create the light that the LCD then blocks out.
If you take a look at some of the best monitors you can buy at the moment, the majority of them have LCD panels inside. Break one of these apart and take a close look at the screen, and you might see something like this.
Here you can clearly make out the individual RGB color channels that comprise each pixel (picture element). In the case of this example, each one is actually a tiny filter, only allowing that color of light to pass through. Screens that use LEDs (such as the OLED panels in top-end monitors or expensive phones) don’t need to use them, as they make the light color directly.
No matter what technology is used, there is a limit to how much light can be passed through or emitted. This quantity is called luminance and is measured in nits (candelas per square meter). A typical monitor might have a peak luminance of, say, 250 nits – this would be achieved by having all of the pixels set to white (i.e. maximum values for each RGB channel).
However, the minimum value is somewhat harder to pin down. LCD screens work by using liquid crystals to block light from coming through but some light will always manage to sneak its way past the crystals. It might only be a tiny amount of luminance, perhaps 0.2 nits or so, but recall that the dynamic range is the ratio between the maximum and minimum values for something. When you see “contrast ratio,” it’s typically measured from full-screen white to full-screen black (static contrast). Modern HDR tests also evaluate how well local dimming controls stray light in smaller areas.
OLED panels are best for HDR, but certain LCD models are still very good
If the max is 250 and the min is 0.2, that’s a dynamic range of 250/0.2 = 1250. As it is very hard to lower the minimum luminance in LCD screens, manufacturers improve the dynamic range by increasing the maximum luminance.
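Expressed as code, the calculation is trivial – the figures below are illustrative examples, not measurements of any particular monitor:

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast: full-screen white luminance divided by full-screen black."""
    return peak_nits / black_nits

print(contrast_ratio(250, 0.2))      # 1250.0 -- a typical SDR LCD
print(contrast_ratio(1000, 0.2))     # 5000.0 -- brighter backlight, same light leakage
print(contrast_ratio(1000, 0.0005))  # 2,000,000.0 -- what aggressive local dimming aims for
```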
Screens that emit light, rather than transmit it through crystals, fare much better in this aspect. When the LEDs are off, the minimum luminance is so low that you can’t really measure it. This actually means that displays with OLED (Organic LEDs) panels, for example, theoretically have infinite contrast ratios!
HDR formats and certification
Let’s take a mid-range computer monitor from a few years ago, the Asus TUF Gaming VG279QM. This uses an LCD panel that’s lit from behind using rows of LEDs, and the manufacturer claims it to be HDR capable, citing two aspects: HDR10 and DisplayHDR 400.
The first one is a specification for a video format, created by the Consumer Technology Association, that sets out several technical aspects for the color space, color depth, transfer function, and other elements. In practice, today’s HDR production/distribution sits under ITU BT.2100 (either the PQ EOTF, a.k.a. SMPTE ST-2084, or HLG), uses the BT.2020 color space, and is typically 10- or 12-bit. HDR10 is the baseline, royalty-free profile; HDR10+ and Dolby Vision add dynamic metadata (per-scene/per-frame adjustments).
Where sRGB uses a relatively simple gamma curve for the transfer function, the HDR10 format uses one known as the Perceptual Quantizer (PQ) and is far more suitable for content with a high dynamic range. Likewise, the color space (ITU-R Recommendation BT.2020, shown below) for this format also has a wider gamut than sRGB and Adobe RGB.
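For the curious, the PQ curve is compact enough to write out in full. This sketch follows the constants published in SMPTE ST-2084 and maps a normalized signal value to absolute luminance in nits:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST-2084 (PQ) EOTF: map a normalized signal (0.0 to 1.0)
    to absolute luminance in nits (cd/m^2)."""
    m1 = 2610 / 16384          # ~0.1593
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875

    e = signal ** (1 / m2)
    return 10_000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(0.0))   # 0 nits: true black
print(pq_eotf(0.5))   # ~92 nits: half the signal is nowhere near half the light
print(pq_eotf(1.0))   # 10,000 nits: the format's theoretical ceiling
```

Notice how the curve allocates most of its code values to the darker end of the range, where our eyes are most sensitive.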
Additionally, the specification requires the color depth to be a minimum of 10 bits to avoid banding. The format also carries static metadata (information about the video as a whole, which lets the display adjust how it maps the signal) and supports 4:2:0 chroma sub-sampling when using compression.
There are multiple other HDR video formats (e.g. HDR10+, HLG10, PQ10, Dolby Vision) and they differ in terms of licensing cost, transfer function, metadata, and compatibility. The majority, though, use the same color space and depth.
The other HDR label (DisplayHDR 400) on our example monitor is another certification, this time by VESA (Video Electronics Standards Association). Where HDR10 and the others are all about content, this one covers the hardware displaying it.
As this table demonstrates, VESA requires monitors to meet certain requirements concerning luminance, color depth, and color space coverage before manufacturers can apply for certification. VESA’s program now spans DisplayHDR 400/500/600/1000/1400 for LCD/Mini-LED and separate “DisplayHDR True Black” 400/500/600/1000 tiers for emissive tech like OLED/QD-OLED. Higher tiers (1000/1400) have tightened requirements around local dimming and sustained contrast, so edge-lit designs no longer qualify.
A quick glance shows us that a DisplayHDR rating of 400 is the lowest one you can get and, in general, any monitor with this rating isn’t especially good at HDR, though if you’ve never experienced anything better, you may still find it perfectly okay. As a rule of thumb: look for DisplayHDR 1000/1400 on bright Mini-LED LCDs, or True Black 500+ on OLEDs, if you want impactful HDR.
The world of HDR formats and certification is rather messy. It’s perfectly possible to have a fantastic display that can show content using various HDR formats but isn’t certified by either VESA or the UHD Alliance (another standards body). Conversely, one can have a monitor with multiple certifications but isn’t especially good when showing high dynamic range material.
OLED displays still set the standard for perfect blacks and near-black detail, but the best Mini-LED LCDs now hit 1,000 to 4,000-nit peaks with thousands of local-dimming zones, making them spectacular in bright rooms. Both can deliver top-tier HDR; choose based on viewing environment and budget. Aim for high peak brightness, not “average luminance.” For punchy HDR: ≥ 1,000-nit peak on Mini-LED, or ~600-nit peak on OLED (where black levels carry the wow factor).
Movies and HDR
If you want to watch the latest films with high dynamic range, then you’re going to need three things – an HDR TV or monitor, a playback device that supports HDR formats, and the film in a medium that’s been HDR encoded. Actually, if you’re planning on streaming HDR content, then you’re going to need one extra thing, and that’s a decent internet connection.
We’ve already covered the first one, so let’s talk about devices that play the movies, whether it’s a Blu-ray player or a streaming service dongle. In the case of the latter, almost all of the latest devices from Amazon, Apple, Google, and Roku support various formats – only the cheapest models tend to miss out.
For example, the $40 Roku Streaming Stick 4K handles HDR10, HDR10+, HLG, and Dolby Vision. Given that most streaming services use either HDR10 or Dolby Vision, you’d be more than covered with that range of support.
If you prefer to watch films on physical media, then you’ll need to check the specifications of your Blu-ray device. Most recent 4K players will support it, but older ones probably won’t. Returning to streaming, the likes of Disney Plus, Netflix, and Prime Video do offer content that’s been encoded in one of the HDR formats, but more often than not, you’ll need to hunt through the various menus to find them; you’ll probably also need to be paying extra for it, too.
Streaming HDR on a PC still has moving parts: you need an HDR-capable display, an HDCP 2.2/2.3-compliant connection, and HEVC decoding. Netflix’s 4K/HDR support depends on specific apps/browsers (e.g., Edge on Windows, Safari on macOS), and the rules change – always check the current requirements. Windows 11 now lets you stream HDR video even with system-wide HDR disabled and adds per-app Dolby Vision controls. Before judging results, run the Windows HDR Calibration app.
It can be such a struggle to get it working properly that it’s often a lot easier to just stream using a dongle plugged into the HDMI port of your HDR-capable monitor. The good news: HDR does not require HDMI 2.1 – DisplayPort 1.4 or HDMI 2.0b with the right formats/DSC can carry HDR just fine.
HDR for gaming
In the early days of 3D graphics, everything – colors, lighting, shading – was calculated using 8-bit integer values per channel. The result went into a frame buffer with the same color depth, which was fine at the time but left a lot of visible banding and limited brightness. Modern GPUs do their rendering in 16- or 32-bit floating point, and frame buffers often match this precision. That shift is what makes true high dynamic range rendering possible in games.
One of the earliest public showcases for HDR rendering was Valve’s Half-Life 2: Lost Coast tech demo. It was rough around the edges – bloom everywhere and blinding exposure shifts – but it proved the tech worked. Below we can see how a typical Lost Coast frame looks with standard rendering (left) compared to HDR (right).
Fast-forward to today and nearly every major 3D engine renders internally in HDR, whether or not your monitor supports it. If it doesn’t, the image is tone-mapped down to SDR. What’s still surprisingly rare, though, is proper HDR output support in games – where the tone mapping is actually calibrated for HDR10 displays, respects luminance ranges, and gives you controls tailored to your display’s capabilities.
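To give a flavor of what “tone-mapped down to SDR” means, here’s a sketch of the classic (extended) Reinhard operator – one of the simplest approaches, and not what any particular engine ships. The white_point parameter is just an illustrative choice for which scene luminance should map to full white:

```python
def reinhard_tonemap(luminance: float, white_point: float = 4.0) -> float:
    """Compress an HDR luminance value (any non-negative float) into the
    0.0-1.0 range an SDR display can show. Scene values at or above
    `white_point` clamp to pure white."""
    mapped = luminance * (1 + luminance / white_point ** 2) / (1 + luminance)
    return min(1.0, mapped)

for hdr in (0.1, 1.0, 4.0, 50.0):
    print(f"scene luminance {hdr:>5} -> display value {reinhard_tonemap(hdr):.3f}")
# 0.1 -> 0.091, 1.0 -> 0.531, 4.0 -> 1.000, 50.0 -> 1.000
```

Real engines use far more sophisticated curves, but the principle is the same: squeeze an unbounded range of scene brightness into whatever the display can actually reproduce.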
Some games get it very right. Users in HDR-focused communities point to titles like Cyberpunk 2077, Forza Horizon 5, Horizon Forbidden West, Alan Wake 2, and Helldivers 2 as examples of genuinely great HDR implementations – where highlights pop, shadows stay rich, and color accuracy isn’t sacrificed for sheer brightness. Others… not so much. Many still offer only a generic “HDR On” toggle that crushes blacks or blows out skies, unfortunately.
Cyberpunk 2077 formats its frame buffer to match HDR10 standards and allows adjustment based on your display’s luminance. That level of control is still the exception, not the rule. HDR in games can be breathtaking – but only if the developers and your display both do the work.
If a game doesn’t support HDR, Windows 11 can try to do it for you. Auto HDR analyzes SDR content rendered via DirectX 11 or 12 and maps it into HDR. You can enable it via Settings > Display > HDR or the Windows Game Bar (Win + G). Combined with the Windows HDR Calibration app, you can set black levels, peak brightness, and tone-mapping targets more precisely.
It works surprisingly well for some games, especially open-world titles or games with strong lighting pipelines. For others, the effect is barely noticeable. The magic only happens if the game already uses high-precision rendering before tone mapping... simple RTS or 2D titles won’t suddenly become HDR masterpieces.
HDR isn’t yet the norm... but it’s getting there
HDR is still not universal in gaming. It’s not quite niche, but it’s nowhere near as standard as high refresh rates or 4K resolution. The best experience is still tied to more expensive OLED displays. And some players genuinely don’t care. Many users only see HDR as worthwhile once they’ve experienced it on a good display with good content. Cheap “HDR-ready” monitors can look washed out, dim, or worse than SDR.
That said, the floor is rising:
- You can now find decent HDR monitors around $500, though true HDR impact still starts above that.
- OLED burn-in concerns are less severe thanks to burn-in mitigation, MLA and QD-OLED tech, and longer warranties.
- Mini-LED displays with thousands of dimming zones are closing the gap on OLED in bright rooms.
- Console gaming (PS5 / Xbox Series X) treats HDR like a default, not an extra.
HDR isn’t just marketing anymore, but it isn’t fully democratized yet either. If you’ve only seen HDR on a budget “HDR400” monitor, you probably walked away unimpressed. If you’ve seen it on a proper OLED or high-end Mini-LED, with a game that uses HDR well, you already know it can feel like the jump from DVD to Blu-ray.
Prices are dropping, implementations are improving, and Windows is getting less painful about it. There will come a point in the near future when a good HDR monitor will fall into a range that covers every budget. But at least now you know exactly what HDR is!