Real-time ray tracing has been available in consumer graphics hardware for over five years now, but is it worth it? The technology, which simulates the behavior of light to deliver more realistic in-game lighting, reflections, and shadows, remains hardware-intensive and takes a significant toll on game performance, both on consoles and PCs.
Despite successive generations of ray-tracing hardware, from Nvidia’s RTX 20- and 30-series to AMD’s Radeon RX 7000 series, modern games still struggle to maintain high framerates with ray tracing enabled. Consoles such as the Xbox Series X and PS5 typically have to drop resolution or cap the framerate when ray tracing is turned on, and even high-end gaming PCs with the latest graphics cards see noticeable performance drops.
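To get a rough sense of why the feature is so demanding, consider a back-of-the-envelope estimate of how many rays a GPU would need to trace per second at 4K and 60 fps. The rays-per-pixel and bounce counts below are illustrative assumptions for the sake of the example, not figures from any particular game or GPU.

```python
# Rough, illustrative estimate of ray-tracing workload at 4K / 60 fps.
# The rays-per-pixel and bounce counts are assumptions chosen for this
# example, not measurements from any specific game or graphics card.

WIDTH, HEIGHT = 3840, 2160   # 4K output resolution
FPS = 60                     # target framerate
RAYS_PER_PIXEL = 2           # assumed primary rays (e.g. reflection + shadow)
BOUNCES = 2                  # assumed secondary bounces per primary ray

pixels_per_frame = WIDTH * HEIGHT
rays_per_frame = pixels_per_frame * RAYS_PER_PIXEL * (1 + BOUNCES)
rays_per_second = rays_per_frame * FPS

print(f"Pixels per frame: {pixels_per_frame:,}")
print(f"Rays per frame:   {rays_per_frame:,}")
print(f"Rays per second:  {rays_per_second:,}")
# Even with these modest assumptions, the GPU must trace and shade roughly
# three billion rays every second, on top of its normal rasterization work.
```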
In many cases, players have to compromise on image quality or play at lower resolutions to maintain playable framerates. Upscaling technologies such as Nvidia’s DLSS 3 and AMD’s Radeon Super Resolution render the game at a lower internal resolution and then upscale the image to the target resolution, with DLSS using artificial intelligence to reconstruct the missing detail.
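The sketch below gives a rough illustration of how upscaling cuts the number of pixels that actually have to be rendered (and ray traced). The scale factors are assumptions chosen for the example, not the exact presets used by DLSS or Radeon Super Resolution.

```python
# Illustrative sketch: how rendering at a lower internal resolution and
# upscaling reduces the number of pixels that must be fully traced.
# The scale factors below are assumptions for this example, not the
# exact presets of DLSS or Radeon Super Resolution.

def internal_resolution(target_w: int, target_h: int, scale: float) -> tuple[int, int]:
    """Return the lower internal resolution the game actually renders at."""
    return round(target_w * scale), round(target_h * scale)

target_w, target_h = 3840, 2160  # 4K output
for mode, scale in [("quality", 2 / 3), ("balanced", 0.58), ("performance", 0.5)]:
    w, h = internal_resolution(target_w, target_h, scale)
    saved = 1 - (w * h) / (target_w * target_h)
    print(f"{mode:>11}: renders at {w}x{h}, ~{saved:.0%} fewer pixels to trace")
```

In other words, the heaviest rendering work happens at a fraction of the output resolution, and the upscaler fills in the rest, which is why these techniques can claw back much of the framerate that ray tracing costs.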
That compromise may be acceptable to players who value the realism and immersion ray tracing brings, but for many others it isn’t. Ray tracing remains a difficult feature to justify, and hard to recommend to anyone who prioritizes high framerates and smooth gameplay over visual fidelity.
As GPUs and consoles become better equipped to handle ray tracing at higher resolutions, the performance penalty should shrink. For now, though, players must weigh the visual benefits of ray tracing against its cost to performance before deciding whether it’s worth turning on.