What is an HDR monitor, do you need one, and what sets apart good ones from bad ones?
In this article, I’ll cover everything you need to know about HDR monitors to help you make an informed decision. Let’s get into it.
What is an HDR Monitor?
First, let’s answer the most important question: what is an HDR monitor, anyway?
An HDR (High Dynamic Range) monitor is a monitor that supports one or more of the current HDR display standards.
Compared to non-HDR displays, HDR monitors are capable of dimming and brightening their screens on a per-region basis, allowing for much greater contrast. Or, as the kids say, “brighter brights and darker darks”.
There are a few different HDR certifications out there, but the main ones to be aware of are:
- HDR10 — The de facto, open HDR standard. Supports 10-bit color depth and up to 1000 nits of brightness.
- HDR10+ — An updated version of HDR10. It keeps 10-bit color depth, but adds dynamic (per-scene) metadata and raises the maximum brightness to 10,000 nits, covering a much wider range of HDR-capable displays.
- Dolby Vision — Dolby’s proprietary HDR solution. Supports up to 12-bit color depth with a maximum brightness of up to 10,000 nits. Prior to the release of HDR10+, this was far and away the more capable HDR format.
- HLG (Hybrid Log-Gamma) — An HDR format developed by the BBC and NHK for broadcast TV. It packs SDR and HDR into a single backward-compatible signal with no metadata at all; non-HDR TVs simply display the SDR portion. Time will tell how far it spreads beyond broadcast.
- DisplayHDR — VESA’s tiered HDR certification standard. DisplayHDR 1000, for example, indicates a display that is properly certified for operation at up to 1000 nits of peak brightness. The True Black tiers go further by requiring near-perfect black levels (in practice, per-pixel dimming), which improves HDR performance in dark scenes.
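As a quick reference, the core specs from the list above can be boiled down to a small lookup table. This is just an illustrative Python sketch; the names and numbers come straight from the descriptions above.

```python
# Quick-reference table of the major HDR standards described above.
# Values are (color depth in bits, maximum brightness in nits).
HDR_STANDARDS = {
    "HDR10":        (10, 1_000),
    "HDR10+":       (10, 10_000),
    "Dolby Vision": (12, 10_000),
}

for name, (bits, nits) in HDR_STANDARDS.items():
    print(f"{name}: {bits}-bit color, up to {nits:,} nits")
```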
What Makes HDR Monitors Different From HDR TVs
So, what makes HDR Monitors different from HDR TVs?
Not that much, actually. They’re built with the same underlying HDR technologies, and often have even more in common than that.
However, not all implementations of HDR are made equal, and this is actually where we run into a major issue with most HDR monitors compared to HDR TVs.
A lot of monitor HDR simply isn’t very good. The reason is twofold: limited peak brightness is one problem, but the bigger issue is a lack of dimming zones.
Basically, TVs lend themselves to good HDR implementations, since there’s plenty of physical room for the extra backlight hardware that zone dimming and brightening requires. On certain TVs, like OLEDs, dimming can go all the way down to the individual pixel.
This is a lot harder to do on a monitor, especially a smaller one. Because of that, many PC monitor HDR implementations have very few dimming zones, which results in a less convincing HDR experience.
Fortunately, DisplayHDR grading can help you narrow down the better picks: the tier numbers make it easy to separate strong HDR monitors (DisplayHDR 1000 or True Black certified, for example) from lacking implementations (DisplayHDR 400, or no certification at all).
What Makes Display HDR Different From Photo HDR
So, HDR is a term being thrown around pretty heavily here, but anyone who knows photography is probably a little confused by now.
Photography has its own HDR. It also stands for High Dynamic Range, but it refers to a capture technique rather than a set of display technologies.
With a camera, good HDR means that you’re able to capture images with deep contrasts between light and dark. Photo HDR is done by taking the same photo multiple times, at different exposure levels, and then using post-processing in software to produce the final image.
This is the general process on both dedicated cameras and smartphone cameras, though smartphones handle it in a far more automated, user-friendly way.
Whereas HDR photography on a dedicated camera requires you to actually take several photos yourself, on a phone you can usually do all your HDR tweaking with a few taps in-app. (Granted, the phone software is still taking multiple photos in real time to achieve this, even if they aren’t all saved.)
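To make the multi-exposure idea concrete, here’s a deliberately simplified sketch of exposure fusion in Python with NumPy. The function name and the weighting scheme are my own illustration, not any camera’s actual pipeline: each pixel is weighted by how close it is to mid-gray in each exposure, then the exposures are blended.

```python
import numpy as np

def fuse_exposures(exposures):
    """Blend several exposures of the same scene into one image.

    exposures: list of float arrays in [0, 1], all the same shape.
    Each pixel is weighted by its "well-exposedness" (closeness to
    mid-gray), so shadows borrow detail from the bright shot and
    blown-out highlights borrow from the dark shot.
    """
    stack = np.stack(exposures)                      # (N, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / 0.08)   # peaks at mid-gray
    weights /= weights.sum(axis=0)                   # normalize per pixel
    return (weights * stack).sum(axis=0)             # weighted blend

# Three toy "exposures" of a tiny 2x2 grayscale scene:
under = np.array([[0.05, 0.10], [0.20, 0.30]])
mid   = np.array([[0.20, 0.40], [0.60, 0.80]])
over  = np.array([[0.60, 0.90], [0.98, 1.00]])
fused = fuse_exposures([under, mid, over])
```

Real pipelines (and libraries like OpenCV’s exposure-merging tools) also align the frames and preserve local contrast, but the core idea is the same weighted blend.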
What Are The Best Display Panel Types For HDR?
The best display types for HDR are OLED, IPS, and to a lesser extent, VA. Allow me to explain.
First up, the best-case scenario: OLED. OLED (Organic Light-Emitting Diode) panels allow for per-pixel dimming out-of-the-box, which makes it a natural companion for HDR (which already wants to dim regions of the display).
Besides great per-pixel dimming, the right OLED can also boast superb color accuracy and viewing angles right alongside it.
However, OLED is also the most expensive tech by a considerable margin, and even though OLED monitors are starting to reach the market, the selection is fairly limited for now. The reviews look promising, though!
Past OLED, we move on to IPS. IPS (In-Plane Switching) isn’t great for HDR’s darkest content, since IPS’s Achilles’ heel is its lack of per-pixel dimming (and the resulting backlight bleed).
However, IPS panels are superb for wide color gamut and color accuracy, which, when paired with a high-end HDR implementation, results in a superb picture.
IPS monitors are also the most popular among professionals, due to their wide viewing angles and generally great colors.
Finally, there’s VA. VA (Vertical Alignment) panels don’t have OLED or IPS color reproduction, viewing angles, or color accuracy.
However, it’s still relatively decent at those things, and it’s far better than IPS at dark scenes and deep contrast since it also supports per-pixel dimming.
For gamers and general consumers, VA displays offer a cheaper alternative to OLED and IPS with a taste of both technologies’ strengths.
If you’re a pro or looking for the best possible image quality, stick to IPS or OLED. Otherwise, consider a VA TV or VA monitor for decent HDR at a lower price.
Is a Bigger Screen Size Better For HDR?
Yes, generally speaking. That’s mainly because strong HDR tends to show up in larger TVs, which can implement the technology more easily than smaller monitors can.
Screen size doesn’t tell the whole story, especially with monitors: you’ll want to take a deeper look into any monitor you’re considering buying for its HDR implementation.
More on that below.
How Do DisplayHDR and DisplayHDR True Black Certification Work?
So, let’s take a moment to talk about VESA’s DisplayHDR certifications. There’s the main DisplayHDR Series, and there’s DisplayHDR True Black.
Let’s start with the main series first.
The main series starts at DisplayHDR 400 and ends at DisplayHDR 1400.
The number at the end refers to the peak brightness in nits, but each tier also requires a different feature set.
DisplayHDR 400 doesn’t even require local dimming, for example; that isn’t introduced until DisplayHDR 500, which also requires 10-bit processing.
With the regular DisplayHDR, you’ll mainly want to look out for displays that offer DisplayHDR 1000 certification or better. These displays will support a proper 1000+ nits of brightness for a vivid HDR image.
DisplayHDR True Black certification is a little different. At the time of writing, there are only three grades: 400, 500, and 600.
On top of that, those numbers still correspond to peak brightness, so what makes True Black a worthwhile contender?
Basically, DisplayHDR True Black certification demands such deep black levels that, in practice, every certified display is an OLED (or similar emissive panel) with per-pixel dimming as a baseline.
This means that while they can’t get as bright as DisplayHDR 1000 displays, they can still reach far better levels of dimness due to their per-pixel dimming.
This allows great contrast and color to be achieved with much less raw brightness, which is why the lower grades of 400, 500, and 600 are still considered competitive with DisplayHDR 1000!
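The shopping advice above boils down to a simple rule of thumb, sketched here in Python (the function is my own illustration, not part of any VESA tooling): treat any True Black tier, or DisplayHDR 1000 and up, as a strong HDR implementation, and DisplayHDR 400 or no certification as a weak one.

```python
def has_strong_hdr(certification):
    """Rule of thumb from this article: True Black at any tier, or
    DisplayHDR 1000 and up, signals a convincing HDR implementation."""
    if certification is None:
        return False
    cert = certification.strip()
    if cert.startswith("DisplayHDR True Black"):
        return True            # per-pixel dimming as a baseline
    if cert.startswith("DisplayHDR"):
        tier = int(cert.split()[-1])
        return tier >= 1000    # DisplayHDR 1000 and 1400
    return False

print(has_strong_hdr("DisplayHDR 400"))             # False
print(has_strong_hdr("DisplayHDR 1000"))            # True
print(has_strong_hdr("DisplayHDR True Black 500"))  # True
print(has_strong_hdr(None))                         # False
```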
Is HDR Needed For Productivity and Creation?
With all this talk of tech specifications, how useful is HDR for productivity and creation? Well, the answer is a bit complicated, but the short version is it depends on whether you’re working with HDR video content or not.
If you aren’t working with HDR video content (or game dev), your HDR photography will be perfectly fine on your non-HDR monitor, as long as its actual color accuracy and color balance is taken care of.
Some highly-accurate pro monitors will have HDR support and some won’t, but this by itself really doesn’t matter if you aren’t going to be making HDR video content or developing HDR video games.
So, “needed” is a strong word if you aren’t explicitly making HDR content.
However, I will say that a high-accuracy monitor that also supports a high grade of DisplayHDR should prove to be well worth your time. Focus on gamut and accuracy before HDR support if you’re shopping for work, though.
Of course, if you’re editing video for an HDR-enabled end result, such as HDR TV broadcasting or theatrical projection, then you should make sure to do it on an HDR-capable monitor.
Is HDR Needed For Gaming?
So, besides HDR video content, it turns out the other place where display HDR technology is seeing a lot of use is in gaming!
The best HDR support in gaming is actually found on current-generation consoles (gaming devices built explicitly for TVs), but HDR support on PC has been improving over time.
Windows 11 even adds an Auto HDR feature to work HDR support into games that didn’t previously have it, and the results are generally considered to be at least okay.
Generally speaking, the best results in HDR PC gaming come when HDR is integrated into the game engine itself, letting you calibrate and tweak it when you launch the game.
Auto HDR is nice, but actual HDR is better. Console HDR will generally work with minimal fuss, as well.
What Makes Display Types Different From One Another?
I talked a lot about IPS, OLED, and VA in this article, but didn’t go into too much detail outside of how their respective qualities impact an HDR viewing experience.
If you’re interested in learning more about those and other display panel types, head over to my full-length Display Panel Comparison, where I tackle each dominant technology and how it compares to the others in detail.
In general, though, keep your eyes on OLED if HDR is your main priority.
Does Curved or Flat Screen Matter?
The answer to this depends on a variety of factors.
If you’re using an IPS or OLED panel, the answer will largely come down to personal preference, as IPS and OLED are great at any viewing angle. However, for other panel types, curved or flat can make a difference to the final image’s viewability.
I’d consider heading over to Alex’s Curved Vs Flat Monitors Guide for a more detailed rundown of this question if it piques your interest.
Do You Have Any Monitor Recommendations?
As far as HDR Monitors currently available on the market go, pickings are pretty slim at the time of writing.
I’m looking forward to the day when I can point more people in the direction of OLED HDR monitors for the best of all worlds, but OLED is prohibitively expensive and has yet to see widespread adoption in the monitor space.
If you’re shopping for monitors with HDR in mind today, I’d focus on high-end 4K IPS monitors. While backlight bleed can be a problem, a high-quality IPS monitor with proper DisplayHDR 1000 support can turn around a pretty gorgeous-looking HDR picture.
If you’re mainly looking for professional work monitors, I’d recommend giving Alex’s Best Graphic Design Monitor guide a good look.
Over to You
And that’s it, at least for now!
I hope that this article answered any burning questions you might’ve had about HDR, and helped further illuminate what the display technology is, how it actually works, and how it’s impacted by other display technologies.
Leave a comment below and let me know: will you be on the lookout for an HDR monitor anytime soon, or will you be waiting for the technology to evolve? Alternatively, will you consider snagging an LG OLED TV for the best available HDR support?