Ray Tracing vs HDR: Fad or Technologies with a Future?

It seems that the Ray Tracing “boom” has given everyone a break, and although it remains relevant because AMD is late to the fight with NVIDIA, it will pick up momentum again once architectures and performance can be compared. The big question is whether this technology will follow a path similar to that of HDR, which never quite takes off on PC. Do they run on parallel tracks, or do they follow different paths?

It goes without saying that this is one of those articles open to debate, and it will be interesting to read each other’s points of view in the comments. It is a reflection rather than an instructional article as such, but it is equally worthwhile to sit and think about the path these two technologies face and how they may end up.

Ray Tracing vs HDR

Ray Tracing and HDR: present and future, or passing fashion?

What we are seeing with HDR may not have much to do with Ray Tracing, given the different markets and types of support involved, but it is no less curious that HDR is getting by with more pain than glory in the PC world.

HDR arrived as a revolution, something that would change not only gaming but the way content is understood and viewed on PC monitors. Years later it is increasingly forgotten, and not for lack of support:

  • Windows caught up, supports it natively, and even enhances it with different options.
  • AMD and NVIDIA have supported it on their GPUs for years.
  • Monitor manufacturers have a wide catalog to choose from.


So what is wrong with HDR that it is not a standard used in every home and office? In our opinion there are two distinct factors: the proliferation of competing HDR standards, and the quality of the panels needed to implement it.

There is too much fragmentation among HDR standards: each one is different and has different requirements (there are 12 versions), and some do not even meet the minimum characteristics for certification, even though they include the word HDR in their names.

In addition, HDR content may not be playable on two monitors with different certifications, which sometimes generates quite a few compatibility problems.
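To give a sense of what these standards actually specify, here is a minimal sketch (in Python, not from the article) of the SMPTE ST 2084 “PQ” electro-optical transfer function that HDR10 is built on. Every certification tier then promises a different slice of the curve’s luminance range, which is part of why two “HDR” monitors can behave so differently:

```python
# SMPTE ST 2084 (PQ) constants, as defined by the standard.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value in [0, 1] to absolute luminance in nits."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)

# A full-scale signal maps to the 10,000-nit ceiling of the PQ curve,
# far beyond what typical certified panels actually reproduce.
print(pq_eotf(1.0))  # 10000.0
print(pq_eotf(0.0))  # 0.0
```

A mid-range code value of 0.5 already lands near 92 nits, which shows how the curve concentrates precision in the dark range where the eye is most sensitive.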

Will they follow the same path?


Not exactly. Ray Tracing is an added feature with far more limited parameters and a narrower way of working than HDR. It is still a passing fad, as anti-aliasing, tessellation, or TAA were in their day, but unlike those it is moving an entire industry and dragging consoles along with it; it will stop being a selling point only when hardware, from the smallest GPU to the largest, is capable of running it freely.

In addition, Ray Tracing does not depend on as many companies: it needs an API that supports it, a driver that manages it, and software that implements it. HDR, on the other hand, needs a certifying standard, a panel that supports it, and content that offers it, and all of it specifically rather than generically, as happens with RT.


If HDR wants to survive it has to be simplified and reach more markets; panel makers have to make a leap in quality and offer VA, TN, and IPS panels with higher contrast and brightness, greater color reproduction, and, to top it off, at a lower cost.

Content must be standardized, come from more sources, and carry a higher bitrate; otherwise HDR as a technology will drag on and be forgotten, like Ray Tracing but for different reasons, since ray tracing will simply become one more feature in games, used and enabled to improve quality as if it were any other filter.

HDR, on the other hand, could be replaced by simpler and more effective standards, with lower costs and, above all, better implementation, where perhaps only OLED technology in gaming monitors can end up saving it.