Path tracing is the gold standard for computer graphics. It is a physically based rendering model grounded in how light actually behaves. Implementations vary in quality, but nothing else beats it from a visual quality and accuracy standpoint.
Your modern AAA title relies on a massive stack of impressive hacks to push rasterization into the uncanny valley, because rasterization has nothing to do with how photons work. All of that could be thrown out and replaced with “model photons interacting with the scene” if the path tracing hardware were powerful enough. It would be simpler and physically accurate. The end result would not live in the uncanny valley; it would be indistinguishable from reality.
Assuming the hardware was fast enough. But we’ll get there.
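To make “model photons interacting with the scene” concrete, here is a minimal sketch of what a path tracer does. The scene (one matte sphere under a uniform bright sky), the constants, and the sample count are all made up for illustration; a real renderer would add proper materials, importance sampling, and an acceleration structure.

```cpp
#include <cmath>
#include <cstdio>
#include <random>

struct Vec { double x, y, z; };
Vec add(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec norm(Vec a) { return mul(a, 1.0 / std::sqrt(dot(a, a))); }

// Ray vs. sphere: returns hit distance along unit direction `dir`, or -1.
double hitSphere(Vec orig, Vec dir, Vec center, double r) {
    Vec oc = sub(orig, center);
    double b = dot(oc, dir);
    double c = dot(oc, oc) - r * r;
    double disc = b * b - c;
    if (disc < 0) return -1;
    double t = -b - std::sqrt(disc);
    return t > 1e-4 ? t : -1;
}

std::mt19937 rng(42);
std::uniform_real_distribution<double> uni(0.0, 1.0);

// Follow one light path: bounce off the matte sphere until the path escapes
// into the bright sky (the only emitter) or the bounce budget runs out.
double radiance(Vec orig, Vec dir, int depth) {
    if (depth > 4) return 0.0;
    Vec center{0, 0, -3};
    double t = hitSphere(orig, dir, center, 1.0);
    if (t < 0) return 1.0;                        // missed: uniform sky light
    Vec p = add(orig, mul(dir, t));
    Vec n = norm(sub(p, center));
    // Pick a random bounce direction in the hemisphere around the normal
    // (uniform sampling for brevity; real tracers use cosine weighting).
    Vec d;
    do {
        d = {2 * uni(rng) - 1, 2 * uni(rng) - 1, 2 * uni(rng) - 1};
    } while (dot(d, d) > 1);
    d = norm(d);
    if (dot(d, n) < 0) d = mul(d, -1);
    double albedo = 0.7;                          // matte grey surface
    // Lambertian BRDF (albedo/pi) * cos / pdf (1/2pi) = 2 * albedo * cos
    return 2.0 * albedo * dot(d, n) * radiance(p, d, depth + 1);
}

int main() {
    // Average many random paths through one "pixel": more samples, less noise.
    double sum = 0;
    const int samples = 1000;
    for (int i = 0; i < samples; ++i)
        sum += radiance({0, 0, 0}, norm({0, 0, -1}), 0);
    std::printf("estimated pixel value: %f\n", sum / samples);
}
```

The noise is the whole problem: the estimate only converges with more samples per pixel, which is exactly why this is waiting on faster hardware.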
Think of the 'impressive hacks' used in realtime 3D rendering as compression techniques to reduce bandwidth (this goes all the way back to 8-bit home computers, whose tricky video memory encodings were essentially hardware image compression to reduce memory bandwidth).
Why waste computing power on 'physically correct' algorithms when the 'cheap hacks' can do so much more in less time, while producing nearly the same result?
Because a lot of the "cheap hacks" aren't even cheap. Famously, using real-time SSR to simulate ray-traced reflections can end up slower than ray tracing them from the jump; the same goes for high-res shadowmaps. Ambient occlusion is a lighting pass you can skip entirely if you render shadows the right way with RT from the start. If you keep stacking these hacks until you have every feature that an RT scene does, you're probably taking longer to render each frame than a globally illuminated scene would.
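To make the shadow case concrete: a shadow map means rendering the scene again from the light's point of view, doing a filtered lookup, and then usually bolting on an AO pass to patch what it misses, while RT answers the same visibility question with one occlusion ray. A toy sketch under made-up assumptions (the "scene" is a single hypothetical sphere occluder; a real renderer would trace against a BVH of the whole scene):

```cpp
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec norm(Vec a) { double l = std::sqrt(dot(a, a)); return {a.x / l, a.y / l, a.z / l}; }

// True if a ray from `orig` along unit `dir` hits the sphere before `maxT`.
bool occluded(Vec orig, Vec dir, Vec center, double r, double maxT) {
    Vec oc = sub(orig, center);
    double b = dot(oc, dir);
    double c = dot(oc, oc) - r * r;
    double disc = b * b - c;
    if (disc < 0) return false;
    double t = -b - std::sqrt(disc);
    return t > 1e-4 && t < maxT;
}

int main() {
    Vec shadePoint{0, 0, 0};     // point being shaded
    Vec light{0, 5, 0};          // point light above it
    Vec blocker{0, 2.5, 0};      // sphere sitting between the two
    Vec toLight = sub(light, shadePoint);
    double dist = std::sqrt(dot(toLight, toLight));
    // One ray answers "can this point see the light?" -- no shadow-map
    // render, no filtered lookup, no separate AO pass to fake the rest.
    bool shadowed = occluded(shadePoint, norm(toLight), blocker, 1.0, dist);
    std::printf("point is %s\n", shadowed ? "in shadow" : "lit");
}
```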
Accelerating ray tracing in hardware is a no-brainer unless you deliberately want to ostracize game developers and animators on Mac. I understand the reactionary "But I don't care about dynamic shadows!" opinion, but there's practically no opportunity cost here. If you want to use traditional lighting techniques, you can still render them on an RT-enabled GPU. You just also get the option of not feeling like pulling out your fingernails when rendering a preview in Blender.
Yes, for some specific problems raytracing definitely makes a lot of sense, but rasterization and 'geometry compression' via triangles also make a lot of sense in other situations. I think the best approach will always be a hybrid approach instead of a 100% pure raytracing pipeline.
The other question is how many of the raytracing features in 3D APIs need to be implemented in fixed-function hardware units, or whether this is just another area where the pendulum will swing back and forth between hardware and software (running on the GPU, of course).
But then maybe in 10 years we'll have 'signed distance field units' in GPUs and nobody talks about raytracing anymore ;)
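For what it's worth, the SDF idea isn't far-fetched: sphere tracing a distance field is already a complete ray-casting scheme. A toy sketch, with the scene, bounds, and iteration cap all made up for illustration (one analytic sphere instead of a real distance field):

```cpp
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
Vec add(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double len(Vec a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

// Signed distance to a unit sphere centred at (0, 0, -3): negative inside,
// positive outside, zero on the surface.
double sceneSDF(Vec p) { return len(add(p, {0, 0, 3})) - 1.0; }

int main() {
    Vec orig{0, 0, 0}, dir{0, 0, -1};   // camera ray pointing straight ahead
    double t = 0;
    for (int i = 0; i < 128; ++i) {
        double d = sceneSDF(add(orig, mul(dir, t)));
        if (d < 1e-4) { std::printf("hit at t = %f\n", t); return 0; }
        t += d;                          // safe step: nothing is closer than d
        if (t > 100) break;              // ray left the scene
    }
    std::printf("missed\n");
}
```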
From everything I've seen, hardware acceleration seems to be the only option. Software RT existed for years, and its performance was the main reason the technique was disregarded in the first place. Here's the OG M1 Pro, manufactured on TSMC 5nm, getting blown the fuck out by a 3070 Ti on Samsung 8nm: https://youtu.be/NgDTCNPm0vo
The architectures will probably improve on both Nvidia's and Apple's side going forward, but in theory both should be easier to support for longer since they're not focused on accelerating obsolete junk like MSAA or HBAO. It enables better convergence of features, it makes porting to and from Mac easier, and with any luck Apple might even quit being so sheepish about working with Khronos now that they're not actively neglecting the Vulkan featureset anymore.
Not really. Tiny Glade is a recent ray-traced title that has no goal of photorealism whatsoever, and it looks gorgeous with RT lighting and shadows: https://youtu.be/QAUSBxxgIbQ
You can scale these same principles to even less photorealistic games like the Borderlands series or a Grand Theft Auto game. Ray tracing is less about photorealism (although that is a potent side effect) and more about creating dynamic lighting conditions for an interactive scene. By simulating the behavior of light, you get realistic lighting. Photoreal or not, a lot of games rely on SSR and shadowmaps that can be replaced with realtime RT.
Let me get this 100% straight: rendering light the exact same way reality itself does is a gimmick?
In any case, RT isn't just about getting pretty graphics. It massively lowers the artists' workload since there's no need to painstakingly go through each in-game area and add fake lights to make everything look good.
No gaming studio will pass up the opportunity to get better graphics and better productivity. It really is just a matter of waiting a few years for the hardware to get good enough.
> Let me get this 100% straight: rendering light the exact same way reality itself does is a gimmick?
Yes, because "rendering light the exact same way reality itself does" was never the assignment, outside of Nvidia desperately trying to find excuses to sell more GPUs.
Maybe some games would benefit from it in some cases... but you have to weigh that marginal improvement against the increased costs, both economic and ecological.
> It massively lowers the artists' workload since there's no need to painstakingly go through each in-game area and add fake lights to make everything look good.
This is a laughable claim: as long as you're going for anything more than fixed ambient lighting, you're still going to need to massage the lighting and QA that it works for what you're trying to show.
---
In short, yes. Ray tracing as a headline feature is a gimmick, made to appeal to the same clueless HNers who think AI will eliminate our need to understand anything or express ourselves.
Apple wishes ray tracing were AI acceleration, because then they could at least ride the Nvidia valuation pump. But the truth is even less glamorous than that: this is a feature for the future, something for posterity. It's a well-researched field of fixed-function acceleration with multiple competitive implementations, unlike AI architectures. What's more, in 3-5 years you can bet your bottom dollar Apple will announce their CUDA alternative alongside a new Mac Pro. You can't even confidently say I'm joking, since they've already done this (and subsequently deprecated it) once before: https://en.wikipedia.org/wiki/OpenCL#OpenCL_1.0
I guess I can't blame HN for courting pedestrian takes, but man, you're going to be disappointed by Apple's roadmap going forward if this is your opinion. And I say this, defending Apple, as someone who treats Tim Cook and Steve Jobs as the living and buried devil, respectively. Having hardware-accelerated ray tracing is no more of a "gamer" feature than the high-impedance headphone jack is an "audiophile" feature. It is intelligent design motivated by a desire to accommodate a user's potential use case. Removing either would be a net negative at this point.