
Some of the comments on Ray Tracing in games from the latest Hardware Unboxed video.

P.S. Am I the only one who went team AMD this generation because of the better price/VRAM and sees Ray Tracing as a useless gimmick that halves your FPS in games?

Not saying anything against artists here; I know RT helps a lot in that department.

[Screenshots of comments from the video]




RT/PT is not perfect technology (currently), but when it works it can make a huge difference. We just need to fire more rays, and for that we need more powerful hardware.

Good video on the topic by HW Unboxed as well:
 

 

R7 9800X3D, Arctic Liquid Freezer III 360 ARGB, ASRock X870E TAICHI, Kingston FURY 64GB (2x32GB) 6000MT/s CL30 Beast Black RGB, Gigabyte RTX 4090 Gaming OC, Windows 11


29 minutes ago, Brexy9 said:

and sees Ray Tracing as a useless gimmick that halves your FPS in games?

It is a way to do realistic lighting in games without essentially "faking it" like we've been doing since the dawn of 3D graphics. It is legitimately really cool tech, if game devs use it properly and not just as a cop-out to do less work.


1 minute ago, manikyath said:

if game devs use it properly and not just as a cop-out to do less work.

*If publishers are willing to pay for using it in an artistic way instead of doing lighting quick and cheap 😛

 

I've seen one or two games/demos where, with ray tracing, it looked GENUINELY good. Not realistic art styles but clearly stylized stuff. I don't care about the hyperrealism in games, as it just looks so uncanny valley at this point that I usually avoid it like the plague.


34 minutes ago, Brexy9 said:

Ray Tracing as a useless gimmick that halves your FPS in games?

I'd say it ultimately boils down to performance, or lack thereof. If it were possible on lower-end hardware, while still maintaining playable frame rates, it would allow for a lot of additional realism. But we're likely still a fair number of hardware generations away from that. The whole noise problem goes away if you increase the number of rays (or have better versions of things like ray reconstruction), but that just increases the performance cost even more.
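To illustrate why the noise is so stubborn: per-pixel ray traced lighting is a Monte Carlo estimate, and the error of such an estimate only shrinks with the square root of the sample count, so halving the noise costs roughly four times the rays. A toy sketch of that relationship (a made-up estimator, not anything from a real renderer):

```python
import random

def estimate_pixel(samples: int) -> float:
    """Toy Monte Carlo estimate of a pixel's lighting.
    Each 'ray' returns a noisy sample whose true mean is 0.5."""
    return sum(random.random() for _ in range(samples)) / samples

def average_error(samples: int, trials: int = 5000) -> float:
    """Average absolute error of the estimate versus the true value 0.5."""
    return sum(abs(estimate_pixel(samples) - 0.5) for _ in range(trials)) / trials

for spp in (1, 4, 16, 64):
    print(f"{spp:>2} rays/pixel -> average error ~{average_error(spp):.3f}")
```

The error roughly halves every time the ray count quadruples, which is why "just shoot more rays" gets expensive so quickly and why denoisers and ray reconstruction exist in the first place.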


Ray tracing is genuinely a huge thing, but trying to use it for everything is a recipe for poor performance and image quality. At our current level of performance, and probably for another order of magnitude beyond it, I feel it prudent to reserve it for effects that would otherwise be impossible to reasonably approximate with rasterization.
 

Real-Time Global Illumination is a pretty worthwhile focus. Shadows and reflections we can already do reasonably well, but methods of doing GI in real time are few and far between. 

My eyes see the past…

My camera lens sees the present…


4 hours ago, Eigenvektor said:

I'd say it ultimately boils down to performance, or lack thereof. If it were possible on lower-end hardware, while still maintaining playable frame rates, it would allow for a lot of additional realism. But we're likely still a fair number of hardware generations away from that. The whole noise problem goes away if you increase the number of rays (or have better versions of things like ray reconstruction), but that just increases the performance cost even more.

It's not really a lack of performance on the hardware side. UE5 is consolidating every dev onto their engine while its quality is going down simultaneously. Even CDPR is going UE5 for Witcher 4, for the sole reason that it makes onboarding easier, even if it's worse.

UE5 is literally doing less with more for everyone who uses it, and then hiding it all with TAA, which makes ALL games that use it look worse. TAA is only good at hiding the 1/8th-resolution noisy effects that UE5 is using, but very few people are willing to invest in making it better, including Epic.
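For anyone wondering why TAA is so good at hiding that kind of noise: temporal AA blends each new (noisy, undersampled) frame into an accumulated history buffer, so the noise gets averaged out over many frames. A minimal, engine-agnostic sketch of that exponential blend (nothing UE5-specific, just the general idea):

```python
import random

def taa_accumulate(noisy_frames, alpha=0.1):
    """Blend each new noisy frame into a running history buffer.
    alpha is the weight given to the new frame; the rest comes from history."""
    history = list(noisy_frames[0])
    for frame in noisy_frames[1:]:
        history = [(1 - alpha) * h + alpha * f for h, f in zip(history, frame)]
    return history

# Stand-in for an undersampled effect: the "true" value of every pixel is 0.5,
# but each individual frame only carries a noisy estimate of it.
frames = [[random.random() for _ in range(8)] for _ in range(60)]

print("single frame:", [round(v, 2) for v in frames[-1]])
print("after TAA:   ", [round(v, 2) for v in taa_accumulate(frames)])
```

The catch is that the history takes a bunch of frames to converge, which is exactly where the smearing and ghosting in motion come from.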

And then using frame gen, which makes all games both look and play worse, no exceptions, and calling it a feature blows my mind.

  

4 hours ago, jaslion said:

*If publishers are willing to pay for using it in an artistic way instead of doing lighting quick and cheap 😛

 

I've seen one or two games/demos where, with ray tracing, it looked GENUINELY good. Not realistic art styles but clearly stylized stuff. I don't care about the hyperrealism in games, as it just looks so uncanny valley at this point that I usually avoid it like the plague.

Though this has nothing to do with it.


7 hours ago, starsmine said:

It's not really a lack of performance on the hardware side. UE5 is consolidating every dev onto their engine while its quality is going down simultaneously. Even CDPR is going UE5 for Witcher 4, for the sole reason that it makes onboarding easier, even if it's worse.

Neither the video nor my comment was focused on UE5. I was primarily talking about the noise problem addressed in the video.

 

Noise is very much down to a lack of hardware performance. Shooting more rays would solve it, but that is cost prohibitive. Technologies like Ray Reconstruction try to address it without resorting to brute force, but have their own issues. My hope is that it'll be solved by a combination of faster hardware and better algorithms over time.

 

That said, I don't think UE5 is that bad from a technological point of view (though it certainly has issues). The majority of games using UE5 means the probability of a bad game using UE5 is fairly high, so that might also skew perception. Doesn't mean the blame rests squarely on UE5.

 

Few studios seem willing (or able) to put in the effort needed to adjust the engine to their game's needs. From what I've read, CDPR are doing just that though, focused on reducing (traversal) stutter.

 

I'd also question whether a game running badly in UE5 would've fared better using any other engine. I assume most studios select UE5 because they either lack the experience to use anything else (let alone develop their own) or are unwilling to invest the money needed to do so.

 

I think HUB had a good take in one of their recent videos (podcast, I think?). If Nvidia had introduced ray tracing as a technology preview of where gaming was headed, it would've been much better received. Instead they made it appear as if it was ready for use. But even now, three hardware generations later, it's often unusable unless you own the fastest possible card (and even then it is often a compromised experience).

 

Though I think Indiana Jones is a good indicator of where things are headed. It's not perfect by any means, but I think the DF video is a good showing of what PT can offer.


16 hours ago, starsmine said:


And then using frame gen, which makes all games both look and play worse, no exceptions, and calling it a feature blows my mind.

  

 

Frame gen applied correctly is a bit like icing on an already good cake: taking an already good gaming experience (60+ fps) and ramping that up further. Definitely really nice to have on gaming laptops as well, to be able to utilize the high-refresh panels while not sending the fans into a frenzy. And at the sort of framerates where frame gen works well, the latency hit is minimal.
 

But great icing does not salvage a shit cake. What frame gen is not for is fixing an already marginal gaming experience. Going from 30 fps to a generated 60 is a shit experience, with noticeable artifacts (owing to the increased time gap between frames) and terrible latency, even with a controller. I'll take a rock-steady 30 fps with judiciously applied motion blur over a generated 60 fps.
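To put rough numbers on the latency point (a back-of-the-envelope sketch that assumes plain frame interpolation buffering one real frame, not any specific vendor's implementation):

```python
def frame_gen_latency_ms(real_fps: float, buffered_frames: int = 1) -> float:
    """Rough extra input latency from interpolated frame generation: the
    interpolator has to hold back roughly one real frame before it can
    present the in-between frame."""
    return buffered_frames * 1000.0 / real_fps

for real_fps in (30, 60, 120):
    frame_time = 1000.0 / real_fps
    extra = frame_gen_latency_ms(real_fps)
    print(f"{real_fps:>3} fps base: ~{frame_time:.1f} ms frame time, "
          f"~{extra:.1f} ms of added latency from frame gen")
```

Under that assumption, a 30 fps base pays roughly another 33 ms on top of an already sluggish 33 ms frame time, while a 120 fps base only pays around 8 ms, which is why it only feels like icing on an already fast experience.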
 

It's absolutely awesome for really elevating an already great experience on a high-refresh display, but a borderline liability to have available for devs otherwise. I'd probably even go so far as to say Nvidia (and console makers) should have the feature disabled on sub-120 Hz panels (as it's really not useful below that point), though with FSR frame gen available, that horse has already bolted.

My eyes see the past…

My camera lens sees the present…


10 hours ago, Eigenvektor said:

Neither the video nor my comment was focused on UE5. I was primarily talking about the noise problem addressed in the video.

Noise is very much down to a lack of hardware performance. Shooting more rays would solve it, but that is cost prohibitive. Technologies like Ray Reconstruction try to address it without resorting to brute force, but have their own issues. My hope is that it'll be solved by a combination of faster hardware and better algorithms over time.

[…]

A lot of the noise isn't FROM RT in UE5. It's from Lumen and Nanite, because otherwise you could just turn off RT AND TAA and the game would look fine.

[Screenshot: like above]

[Screenshot: the left one is a noisy mess, from Nanite]

[Screenshot: volumetric clouds rendered at 1/8th the resolution they should be]

[Screenshot: this is often done with hair; if you turn off TAA you will find everyone's hair becomes messed up]

[Screenshot: Nanite is why the traffic cones look like that]

[Screenshot]
I do agree that a lot of the problems can be fixed with more rays, but that's not an option.

You CAN (and you used to) make a material at an individual mesh/material/effect level to just do temporal shit on JUST that effect.


10 minutes ago, starsmine said:

I do agree that a lot of the problems can be fixed with more rays, but that's not an option.

At least not until we get a lot more performance


3 minutes ago, Eigenvektor said:

At least not until we get a lot more performance

Or we can be more selective with how we use RT today. From my understanding, UE5 makes that difficult; you can't put hints into Lumen.

If we want to have global illumination in games now, have it be an ultra-only setting, as in: this is only here for GPUs that will exist tomorrow, so please stop trying to run it with GPUs of today, other than enthusiasts doing enthusiast shit and having fun breaking shit.


22 minutes ago, starsmine said:

If we want to have global illumination in games now, have it be an ultra-only setting, as in: this is only here for GPUs that will exist tomorrow, so please stop trying to run it with GPUs of today, other than enthusiasts doing enthusiast shit and having fun breaking shit.

Agreed. The only problem is people like to crank things up to Ultra and expect their cards to handle it just fine. So it should probably be called something different and be made very clear that this is only a preview for now and intended for future cards.

 

~edit: Though I think it would be pretty cool to be able to crank things up to an insane level (through the console, for example) for future GPUs.

