High Dynamic Range, or simply HDR, is something video game developers like to tout. They promise advanced graphics, but in practice, getting a high-quality HDR picture on a Windows PC can be difficult. Still, it is worth striving for: HDR is a technology that could genuinely be revolutionary, and when implemented correctly it can improve graphics quality significantly.
Although this technology is becoming extremely popular not only in mobile gadgets but also in TVs, the vast majority of people either don't quite grasp what it is and what it's for, or don't understand what it means at all.
So, here is what you should know about HDR10+ Gaming.
What does HDR mean
Some people, when they see the letters HDR, assume the technology is somehow associated with screen resolution. You can even find questions on the Internet about whether HDR or 4K is better. In fact, HDR has nothing to do with resolution: HDR video can be either Full HD or 4K.
An even more common myth is that HDR means video with a very bright, saturated picture, like the AMOLED screens of 10 years ago with their oversaturated "acidic" colors. In reality, standard video often looks brighter than HDR, and forums are full of questions about how to turn HDR off so you can at least see something on your smartphone's screen.
HDR (High Dynamic Range) is a technology for displaying video with high bit depth, a wide color gamut, and an extended dynamic range. HDR video lets you see the picture exactly as the manufacturer (the movie studio) intended. In other words, HDR reproduces video as realistically as possible and conveys the look and feel its creator had in mind.
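The "bit depth" part is easy to quantify. A minimal sketch of the standard arithmetic (the 8-bit and 10-bit figures are general facts about these formats, not taken from this article):

```python
# How bit depth affects the number of representable colors.
def color_count(bits_per_channel: int) -> int:
    """Total colors for an RGB signal at the given bit depth."""
    levels = 2 ** bits_per_channel  # shades per channel
    return levels ** 3              # combinations across R, G, B

sdr = color_count(8)     # 8-bit SDR: 256 shades per channel
hdr10 = color_count(10)  # 10-bit HDR10: 1,024 shades per channel

print(f"8-bit:  {sdr:,} colors")    # 16,777,216 (~16.7 million)
print(f"10-bit: {hdr10:,} colors")  # 1,073,741,824 (~1.07 billion)
```

Those extra shades per channel are what let HDR render smooth gradients in skies and shadows without visible banding.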
What is HDR10
HDR10 was introduced in 2015 by the Consumer Technology Association. It is an open standard, so it is widely supported. In the world of computers, it is so common that gaming studios and monitor manufacturers support HDR10 by default.
Other HDR formats are rarely used in video games, with Dolby Vision as the main exception. The graphics processors in consoles and PCs may support Dolby Vision, and some laptop displays support this standard as well. They're few and far between, but such laptops can be an excellent choice for PC gaming.
The most common format with static metadata support is HDR10. Moreover, it is the most common HDR format in principle. If you see an HDR sticker on your TV, know that it supports HDR10. That’s a plus.
What is HDR10+
HDR10 only supports static metadata, which some argue keeps it from being called true HDR. So Samsung, together with 20th Century Fox and Panasonic, decided to correct this shortcoming and added dynamic metadata support to HDR10, naming the new standard HDR10+.
Its advantage over conventional HDR10 is richer metadata and no need to manually calibrate the image when playing games. HDR10+ Gaming can increase the peak brightness of the image by up to four times, boost the saturation of the picture, and deliver more accurate color reproduction. In addition, the standard includes support for variable refresh rates up to 120 Hz and an automatic low-latency mode.
HDR10+ isn't only available in Samsung TVs and smartphones – it is supported by almost all manufacturers of TVs, smartphones, and cameras, and of course movies are shot and mastered in this format. That gives HDR10+ every chance of becoming a truly universal HDR standard, one you can enjoy on screens both large and small.
What is the difference between HDR, HDR10, HDR10+ and Dolby Vision
If you see a device with HDR support (without additional numbers), it means that it is HDR10. That is, any smartphone with HDR support is a smartphone with HDR10 support. The same goes for any video. If it isn’t stated that a video supports HDR10+ or Dolby Vision, then it is created in HDR10 format. In other words, HDR10 is a basic set of guidelines that is always used unless otherwise specified. Therefore, you should compare HDR10, HDR10+, and Dolby Vision.
Dolby Vision is a more advanced standard developed by Dolby. Unlike HDR10 and HDR10+, it requires licensing, which means that companies wishing to add support to their smartphones must pay a license fee to Dolby. Among smartphones, support is rare; Apple's iPhones are the most prominent example.
The main advantage of HDR10+ over HDR10 is support for dynamic metadata, which lets the display adapt its tone mapping scene by scene. As a result, HDR10+ video looks noticeably better on devices whose peak brightness falls short of the content's mastering levels.
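A hypothetical sketch of why scene-by-scene metadata helps. The tone-mapping function here is a deliberately naive linear scaler, and all nit values are illustrative assumptions, not figures from either standard:

```python
# With static metadata (HDR10), the display tone-maps the whole movie
# against one mastering peak. With dynamic metadata (HDR10+), each scene
# reports its own peak, so scenes the display CAN reproduce are left alone.
def tone_map(pixel_nits: float, content_peak: float, display_peak: float) -> float:
    """Naive linear tone mapping: squeeze content into the display's range."""
    if content_peak <= display_peak:
        return pixel_nits  # display can show the scene as mastered
    return pixel_nits * (display_peak / content_peak)

display_peak = 500.0  # a mid-range TV (assumed value)

# A dim 300-nit highlight in a dark scene:
static = tone_map(300.0, 4000.0, display_peak)  # whole-film peak: crushed to 37.5
dynamic = tone_map(300.0, 400.0, display_peak)  # scene's own peak: kept at 300.0

print(static, dynamic)  # 37.5 300.0
```

With one static peak for the entire film, dark scenes get dimmed far more than necessary; per-scene metadata avoids that, which is exactly the benefit on displays that can't hit the mastering brightness.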
How does HDR work in video games
HDR quality depends heavily on the monitor's peak brightness, and in games most of all. Keep this in mind when shopping: look for a TV or monitor with a peak brightness of 1,000 nits or higher. Brightness isn't HDR's only advantage, but it is the one games benefit from most.
Apart from a few genres, like horror or some simulators, the action doesn't usually take place in constant darkness; no one likes being killed by an enemy they can't see. In eSports, players even deliberately turn up screen brightness to spot opponents more easily. Bright, saturated visuals are a modern trend, and they play to the strengths of very bright HDR-enabled monitors.
What is HDR10+ Gaming
HDR10+ Gaming isn't just about HDR10+ benefits, such as extended dynamic range and increased color depth. It also focuses on three key aspects – variable refresh rate (VRR), automatic HDR calibration, and low-latency source-side tone mapping – to keep gaming at its best.
Unlike regular video content such as TV shows or movies, which plays at a fixed frame rate, video games can run at a variable frame rate. To match it, displays offer a feature called variable refresh rate, or VRR. HDR10+ Gaming supports VRR at up to 120 Hz, providing a smooth gaming experience.
Overall, HDR10+ Gaming ensures that compatible games display rich, realistic colors just as the developers intended, without adding latency or sacrificing important features like VRR.