
DLSS 3.0 - Great tech with terrible implementation? (it does not mix well with VRR?)

WereCat

After watching Digital Foundry, Hardware Unboxed and 2kliksphillip on DLSS 3.0, I just can't see how this is useful for most people in most cases, except for a very few niches, if you want to keep using Variable Refresh Rate. I mean, the tech is really impressive, but I think it's almost useless the way it's implemented.

 

I'll try to explain my point of view. I may stumble through my words, so hopefully it comes out legible.

 

Why doubling the FPS may be a bad thing (with VRR):

 

How many of you have considered this question, remembering that most people use Variable Refresh Rate displays these days and want to keep using them? Since DLSS 3.0 doubles the frame rate at the cost of latency, or at best maintains the latency of the base FPS before doubling, this means that if you're using a 144Hz display and want to keep using VRR, you have to render the game at a base 72FPS and deal with the input latency of 72FPS at best. This defeats the whole purpose of wanting to own a high refresh rate display, especially considering that not only is the 4090 capable of being almost entirely CPU bound at 1080p/1440p, it's powerful enough to handle 4K 120Hz with native rendering without DLSS, so the tech just becomes redundant in those circumstances.
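
To put rough numbers on that, here's a quick back-of-the-envelope sketch (assuming a simple 2x frame-generation factor and that input latency tracks the base frame time, which is a simplification rather than a measurement):

```python
# With a frame cap at the refresh rate and 2x frame generation, the game is
# only *rendered* at half that rate, so input latency follows the base frame time.
def base_frame_time_ms(refresh_hz: float, frame_gen_factor: int = 2) -> float:
    base_fps = refresh_hz / frame_gen_factor   # rate the engine actually samples input at
    return 1000.0 / base_fps                   # milliseconds between real frames

for hz in (120, 144, 165, 240):
    print(f"{hz:>3} Hz display -> ~{hz // 2} FPS base, "
          f"~{base_frame_time_ms(hz):.1f} ms per rendered frame")
```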

 

There are for sure cases where you want to crank the visuals up high, and if the game is slow paced the added latency may not matter, like in Flight Simulator, but that's just one of very few games where I imagine this could be useful.

 

The other factor is that you WANT the game to run at as high an FPS as it can, since the more FPS you have, the fewer visual artifacts you will see from the generated frames, as they appear for a much shorter time. As Hardware Unboxed showed at a 30FPS base boosted to 60FPS with DLSS 3.0, you can clearly see visual artifacts in motion during gameplay, since every other frame is a generated one that stays on screen for 16.66ms, which is quite a long time to spot these artifacts during faster motion. It's also visible, though to a much lesser extent, at a 60FPS base doubled to 120FPS.
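
The persistence of those generated frames is just the output frame time, so the scaling with base FPS is easy to see (assuming a simple alternating real/generated pattern, as a rough illustration):

```python
# Each displayed frame, real or generated, stays on screen for one output frame time.
def frame_persistence_ms(base_fps: float, frame_gen_factor: int = 2) -> float:
    return 1000.0 / (base_fps * frame_gen_factor)

for base in (30, 60, 120):
    print(f"{base:>3} FPS base -> {base * 2} FPS output, "
          f"each generated frame visible for ~{frame_persistence_ms(base):.2f} ms")
```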

 

So if you want to use VRR you have to cap your frames, but then you have to deal with the increased input latency of half your refresh rate and possibly visual glitches from the AI-generated frames... at least at 144Hz. So then you should consider this at 240Hz, right?

 

But the 4090 has DP 1.4a, so there's now an issue if you want to use 4K 240Hz... sure, you're fine at 1080p/1440p, but also, who gets a 240Hz monitor to play games cranked up with higher input latency? For sure, most single player games at 240Hz with DLSS 3.0 will feel just fine with 120Hz input latency, but for multiplayer I can't imagine anybody wanting to use DLSS 3.0.

 

Also, I've only been talking about the RTX 4090 as an example so far, but that's the issue. This card blows everything out of the water at 1080p/1440p, so there is absolutely no need for DLSS 3.0 there, and at 4K it only matters in the most demanding games with RT enabled, and even then you're most likely at around 100FPS already.

 

So the lower you go in the GPU stack, the more you just shift the display tiers. 4K becomes less relevant for DLSS 3.0 because you would start rendering the game at an FPS so low that you risk visual glitching, and at lower resolutions you end up in the same situation as the 4090 at 4K.

 

Really, the only reason to push DLSS 3.0 right now is 4K at 240Hz (for some people), BUT the RTX 4090 has DP 1.4a, so that's limited in other ways as well...

 

 

 

You probably don't want it in competitive games because of the higher base latency, which defeats the purpose of having a high refresh rate display for competitive play. You probably don't want it if you run 144Hz/165Hz with VRR or lower, because you may see visual glitching in motion. And you probably don't want it in games that run at a low base FPS because of the very high latency, even in single player (although this one could be argued by some people and is very game dependent).

 

 

So, what I think DLSS 3.0 should have been instead, because the tech is still impressive:

 

I think NVIDIA should have made DLSS 3.0 only compensate for frame drops when using VRR or a frame cap.

 

Let's say you set your frame cap to 140FPS. The game runs at 140FPS base (without frame generation), but once the FPS dips, say to 130FPS for a moment, DLSS kicks in and generates the extra frames to maintain frame pacing. This would eliminate the visible stutter. Yes, it still comes with the same shortcomings mentioned above, but their impact would be massively reduced, it wouldn't completely kill the use case with VRR, and the input latency would only increase by a fraction of what it is now. Basically, it would be a stutter-eliminating tech; a rough sketch of the idea is below.
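
Purely as an illustration of what I mean, here's some hypothetical pseudocode for that idea. This is not how DLSS 3 actually works, all the names are made up, and (as pointed out further down in the thread) real interpolation needs the next frame, so something like this would have to predict frames instead:

```python
import time

# Hypothetical "stutter filler": run the game at the cap and only present
# generated frames when a real frame misses its deadline.
def present_with_stutter_fill(render_frame, generate_frame, present, cap_fps=140):
    frame_budget = 1.0 / cap_fps                   # e.g. ~7.1 ms at a 140 FPS cap
    last_frame = render_frame()
    present(last_frame)
    deadline = time.perf_counter() + frame_budget
    while True:
        frame = render_frame()                     # may arrive late during a stutter
        while time.perf_counter() > deadline:      # missed one or more deadlines:
            present(generate_frame(last_frame))    # fill each gap with a generated frame
            deadline += frame_budget
        present(frame)                             # real frame arrived in time: show it
        last_frame = frame
        deadline += frame_budget
```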

 

Yes, I can already see scenarios where this would not work well; if the game stutters heavily it would probably not feel nice, but at that point you wouldn't enjoy it anyway, even without DLSS. I'm also not sure how well this would look during rare but massive stutters where the AI has to generate multiple frames in a row...

 

Anyhow, I'm not trying to convince you guys, just sharing my point of view, and I wonder about your opinions and ideas. Right now I see this as the most niche feature since SLI... perhaps even more niche than SLI.

 

 


I've also watched DF and HU, and yeah, it seems to only make sense if you already have a reasonably high frame rate (120+), so it really only seems to make sense if you have a high-end GPU, you're playing at 4K and can already do 120 fps. It's not something that will greatly benefit people with lower-end GPUs trying to achieve higher frame rates.

 

Or, well, it will; it's just that input lag isn't improved (rather, it's worsened) to the point where you're better off playing at the lower frame rate 🤔 It is certainly interesting tech from an academic standpoint, but it seems of little practical use outside of limited cases.



It seems super gen 1 and like a marketing gimmick right now. I think Nvidia should have asked the devs of DLSS 3 games to put in an option for FPS limiting, because if you enable DLSS 3 and then hit VSync framerates it will feel like you have 20 fps. That being said, if you have a high refresh rate monitor, I don't think sync is all that important.

Location: Kaunas, Lithuania, Europe, Earth, Solar System, Local Interstellar Cloud, Local Bubble, Gould Belt, Orion Arm, Milky Way, Milky Way subgroup, Local Group, Virgo Supercluster, Laniakea, Pisces–Cetus Supercluster Complex, Observable universe, Universe.


12700, B660M Mortar DDR4, 32GB 3200C16 Viper Steel, 2TB SN570, EVGA Supernova G6 850W, be quiet! 500FX, EVGA 3070Ti FTW3 Ultra.

 


Quote

 most people use Variable Refresh Rate displays these days

Not sure about this part, but otherwise I don't necessarily disagree with your post. I guess it just adds another option. Some people only care about frames, others latency, others resolution, and others overall detail and quality. It can depend on the game, the genre, the person, etc... so I suppose this just gives people more tinkering choices that may give a more enjoyable experience, depending on what you're looking for. Plus, it's v1... it could be an important stepping stone for future iterations and other spinoff tech.


4 minutes ago, Holmes108 said:

Not sure about this part, but otherwise I don't necessarily disagree with your post. I guess it just adds another option. Some people only care about frames, others latency, others resolution, and others overall detail and quality. It can depend on the game, the genre, the person, etc... so I suppose this just gives people more tinkering choices that may give a more enjoyable experience, depending on what you're looking for. Plus, it's v1... it could be an important stepping stone for future iterations and other spinoff tech.

I agree, and just to clarify, I meant it in the context that if you buy a monitor or TV these days it most likely has VRR. Somebody in the market for a graphics card with this kind of tech probably also has a modern display with modern features like VRR and is most likely using it.

 

I know that many people who play competitively don't use VRR but in that case they wouldn't care about DLSS 3.0 either.

16 minutes ago, ZetZet said:

It seems super gen 1 and like a marketing gimmick right now. I think Nvidia should have asked the devs of DLSS 3 games to put in an option for FPS limiting, because if you enable DLSS 3 and then hit VSync framerates it will feel like you have 20 fps. That being said, if you have a high refresh rate monitor, I don't think sync is all that important.

It depends from person to person. My friend can't tell when VSync is on on a 144Hz screen; I can tell immediately just by moving the mouse cursor (pure VSync without VRR enabled).


1 hour ago, WereCat said:

But the 4090 has DP 1.4a, so there's now an issue if you want to use 4K 240Hz...

DP 1.4 with DSC can already handle 4K 240 Hz, I think. Are there (m)any displays out there that do that yet?
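
Rough sanity check, assuming ~25.92 Gbit/s of usable DP 1.4 (HBR3) payload, 8-bit RGB, DSC's nominal ~3:1 compression, and ignoring blanking overhead:

```python
# Ballpark bandwidth figures only; real link requirements are a bit higher due to blanking.
def video_gbps(width, height, hz, bits_per_pixel=24):   # 8-bit RGB = 24 bpp
    return width * height * hz * bits_per_pixel / 1e9

uncompressed = video_gbps(3840, 2160, 240)   # ~47.8 Gbit/s of raw pixel data
dp14_payload = 25.92                         # usable Gbit/s on DP 1.4 HBR3
with_dsc = uncompressed / 3                  # DSC targets up to ~3:1 compression

print(f"4K 240 Hz uncompressed: ~{uncompressed:.1f} Gbit/s vs ~{dp14_payload} Gbit/s -> doesn't fit")
print(f"With ~3:1 DSC:          ~{with_dsc:.1f} Gbit/s -> fits")
```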

 

1 hour ago, WereCat said:

For sure, most single player games at 240Hz with DLSS 3.0 will feel just fine with 120Hz input latency, but for multiplayer I can't imagine anybody wanting to use DLSS 3.0.

Multiplayer is not necessarily a problem, but I presume you specifically mean e-sports, competitive, every-ms-counts games, and I think people into those would not turn on DLSS 3.

 

1 hour ago, WereCat said:

I think NVIDIA should have made DLSS 3.0 only compensate for frame drops when using VRR or a frame cap.

That would have its own problem. To switch DLSS from off to on, you need to introduce that extra delay to display the generated frame. It seems computationally wasteful to render them "just in case" and only show them if needed. On that note, I wonder if there's been a power usage test with and without DLSS 3? What is the power cost of that tech per non-DLSS 3 frame? Edit: thinking about it more, this can't work, since the generated frame is in the past. They would have to come up with a predictive method to fill in missing future frames.

 

I over-trimmed the quote, but I think the point about including a frame rate limiter is essential, regardless of whether it's done in game or in the driver. For VRR users you'd want to limit to just below half of your display maximum. For 144 Hz displays I think that is fine. For 120 Hz displays it might be a bit scarier if you have to effectively limit to below 60 fps.
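
For illustration, here is what limiting to just below half of the display maximum works out to for common refresh rates (the 3 FPS margin below half is just an example value):

```python
# Base frame rate and frame time if you cap a few FPS below half the refresh rate.
for refresh_hz in (240, 165, 144, 120):
    cap_fps = refresh_hz // 2 - 3
    print(f"{refresh_hz:>3} Hz display -> cap around {cap_fps} FPS base "
          f"(~{1000 / cap_fps:.1f} ms per rendered frame)")
```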

 

On the latency question, I just don't play competitive reaction based games. I love the responsiveness of my 120-165 Hz VRR displays, but games are still playable at 60 Hz V-sync on. So on that note 60 Hz class latency is still ok for me.

 

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


12 minutes ago, porina said:

DP 1.4 with DSC can already handle 4K 240 Hz, I think. Are there (m)any displays out there that do that yet?

 

That would have its own problem. To switch DLSS from off to on, you need to introduce that extra delay to display the generated frame. It seems computationally wasteful to render them "just in case" and only show them if needed. On that note, I wonder if there's been a power usage test with and without DLSS 3? What is the power cost of that tech per non-DLSS 3 frame? Edit: thinking about it more, this can't work, since the generated frame is in the past. They would have to come up with a predictive method to fill in missing future frames.

 

I over-trimmed the quote, but I think the point about including a frame rate limiter is essential, regardless of whether it's done in game or in the driver. For VRR users you'd want to limit to just below half of your display maximum. For 144 Hz displays I think that is fine. For 120 Hz displays it might be a bit scarier if you have to effectively limit to below 60 fps.

 

On the latency question, I just don't play competitive reaction based games. I love the responsiveness of my 120-165 Hz VRR displays, but games are still playable at 60 Hz V-sync on. So on that note 60 Hz class latency is still ok for me.

 

I'm personally not aware of any 4K 240Hz displays right now, I just considered it hypothetically.

 

What I didn't consider is that you're right... during a stutter you don't know the next frame, so it would have to predict based on the previous frame only, which would get increasingly worse if it had to insert multiple frames like this in a row.

 

I don't mind playing with VSync. I play on the Steam Deck with a 40Hz lock and VSync... I find Rocket League unplayable with it though, even at 60Hz and even at 165Hz with VSync on my PC. But I play games like Tunic, Planescape: Torment, etc. on the Deck, so VSync doesn't bother me at all there.


It just released yesterday. I've only tried it in one game and didn't see any issues, even though it was doing 160-180+ fps and my refresh is 120.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


6 hours ago, WereCat said:

There are for sure cases where you want to crank the visuals up high, and if the game is slow paced the added latency may not matter, like in Flight Simulator, but that's just one of very few games where I imagine this could be useful.

 

I think there's more than that for people who don't live by "frames win games".

 

I think DLSS (3.0) is best for:

  • Mitigating recognizable FPS drops

Even if I can't tell the difference between 90 and 120 FPS, I can still notice a drop below 90, especially if I'm coming down from closer to my max refresh (165). Though the tradeoff is...

 

  • Games where players don't care a lot about input latency

If I don't play a ton of FPSs, and I know that milliseconds rarely matter because I'm not a young gamer and my reaction time is average (hundreds of milliseconds) anyway, then I'm not really sacrificing much latency: it's only ~6ms taking my 165 Hz monitor down to the 82 Hz equivalent of input latency via DLSS 3.0 with VRR (like G-Sync); see the quick calculation after this list.

That can apply to many game types other than flight sims.

 

  • Games that cannot natively average an FPS higher than half the monitor's max refresh rate

This is easier to come across at 1440p or higher on an XX70-class card or lower with RT on or other intensive graphics settings. It's rarely something I come across with my RTX 3080 on a 1440p 165 Hz panel, but I also don't play AAA games regularly.
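
For what it's worth, the ~6ms figure above works out if you treat the penalty as the difference in frame time between 165 Hz and its half, 82.5 Hz (a simplification that ignores any extra frame-generation queueing):

```python
# Frame-time difference between the full and halved refresh rate.
full_hz = 165
half_hz = full_hz / 2
delta_ms = 1000 / half_hz - 1000 / full_hz
print(f"{1000 / full_hz:.2f} ms at {full_hz} Hz vs {1000 / half_hz:.2f} ms at {half_hz:.1f} Hz "
      f"-> ~{delta_ms:.1f} ms of extra latency per frame")
```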

 

 

Obviously the needs and the frame quality are all in the eye of the gamer, and you've understandably argued that the latter is poor, along with the latency cost being high.


Nvidia should have separated the DLSS and Frame Generation features. Now they've backed themselves into a corner. A better DLSS would help sell the 3000 series while reserving Frame Generation for the 4000 cards.


10 hours ago, ImperialKnightErrant said:

Nvidia should have separated the DLSS and Frame Generation features. Now they've backed themselves into a corner. A better DLSS would help sell the 3000 series while reserving Frame Generation for the 4000 cards.

I thought they had? I just watched Digital Foundry's video on it, and I thought Alex showed that they were separate distinct toggles? Unless I'm thinking of another setting/feature. It's entirely possible.


2 hours ago, Holmes108 said:

I thought they had? I just watched Digital Foundry's video on it, and I thought Alex showed that they were separate distinct toggles? Unless I'm thinking of another setting/feature. It's entirely possible.

I think he meant in name. Just call it something else and not DLSS


3 minutes ago, WereCat said:

I think he meant in name. Just call it something else and not DLSS

Ah yep, that makes sense too.

