WMGroomAK

Nvidia building up RTX release hype: 9 more DLSS games & more 4K perf comparisons (plus a GN video with details)

Recommended Posts

1 hour ago, RejZoR said:

25 titles. Out of how many games in existence? Hundreds of thousands. And only new games. What if I'd want it applied on a 1 year old game? Oh, tough luck right? This is why I hate features that are only applicable for new games. Not everyone plays just the new games.

You mean old games like Ark: Survival Evolved? Yeah, it has DLSS.

It is a new way to do AA, and it does not seem to be hard to implement at all.

 

Given the major performance uplift, every game that is GPU bound will want to include it in order to make people happy with their new flagships. After all, people who spend 800+ on a GPU are more likely to also spend on DLC, microtransactions, etc. So I would not worry about missing support for it.

But like every form of AA, it won't be in every game. It also does not need to be in every game in order to be a good thing.


Can anyone with machine learning experience explain why they would need to manually do the whole training process on a game-by-game basis? Correct me if I'm wrong, but I'd assume that the datacenter side of the training is mainly just pattern recognition of objects and environments in the given game.

 

However, look at a lot of AAA games like Tomb Raider or GTA: you have cities and farms and mountains and jungles, dozens of vehicles of different shapes and sizes, architecture that's all over the spectrum, and all sorts of different human and animal models. At what point is there so much variety in one or two big games that you can generalize the algorithms to just work on everything?

 

I'm probably just fundamentally misunderstanding how the AI process works, but if it's just extrapolating based on patterns then it seems like the level of variety in one AAA game is probably greater than the minor artistic differences you'd get with similar objects in a different game.

11 hours ago, RejZoR said:

25 titles. Out of how many games in existence? Hundreds of thousands. And only new games. What if I'd want it applied on a 1 year old game? Oh, tough luck right? This is why I hate features that are only applicable for new games. Not everyone plays just the new games.

It's been applied to 1-year-old games. PUBG has it. It is being deployed into games faster than any other tech introduced. Are you also upset because PhysX isn't incorporated into all games? What about Vulkan? Hey, what happened to Mantle, is that stupid too?

 

It sounds like you just want to be angry.


QuicK and DirtY. Read the CoC, it's like a guide on how not to be a moron. Also, I don't have an issue with the VS series.

25 minutes ago, Waffles13 said:

I'm probably just fundamentally misunderstanding how the AI process works, but if it's just extrapolating based on patterns then it seems like the level of variety in one AAA game is probably greater than the minor artistic differences you'd get with similar objects in a different game.

 

You are pretty much on point here, with only a minor misunderstanding:

You can NEVER have enough data. And the more you try to generalize, the more errors you get.

 

Basically, there is no reason not to do the process for every game. It takes them only a few hours on their servers and amounts to a few MB of data for the user.

And the result will certainly be better if all the data points come from one game only. The art style can vary a lot more at the pixel level than you would notice when looking at the full picture.

But more data is more better. Always.

 

Maybe that is why GeForce Experience is in the mix, so not everyone needs to download data for a game they don't even have installed. But that is pure speculation on my part.
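To make the per-game vs. generalized tradeoff concrete, here is a toy sketch in plain Python. It has nothing to do with the actual DLSS network; it just illustrates why pooling training data from "games" with different pixel statistics hurts a single shared model:

```python
# Toy illustration (not DLSS itself): fitting one simple model per "game"
# beats a single pooled model when the games' statistics differ.
# Model: y = a * x, fitted by least squares (a = sum(x*y) / sum(x*x)).

def fit_slope(pairs):
    """Least-squares slope for the model y = a * x."""
    num = sum(x * y for x, y in pairs)
    den = sum(x * x for x, y in pairs)
    return num / den

def mse(pairs, a):
    """Mean squared error of the fit y = a * x."""
    return sum((y - a * x) ** 2 for x, y in pairs) / len(pairs)

# Two "games" whose low-res -> high-res mapping differs (true slopes 2.0 vs 3.0).
game_a = [(x, 2.0 * x) for x in range(1, 6)]
game_b = [(x, 3.0 * x) for x in range(1, 6)]

# Per-game models fit their own data perfectly.
err_a = mse(game_a, fit_slope(game_a))   # 0.0
err_b = mse(game_b, fit_slope(game_b))   # 0.0

# One pooled model splits the difference (slope 2.5) and is worse on both.
pooled = fit_slope(game_a + game_b)
err_pooled = mse(game_a + game_b, pooled)

print(err_a, err_b, err_pooled)
```

The real network generalizes far better than a one-parameter line, of course, but the direction of the effect is the same: a dedicated per-game model has nothing to compromise on.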

14 hours ago, Waffles13 said:

 

I blame Linus for this; we are missing a DLSS video.

 

 

Also, am I wrong in expecting something like a GTX 2050 with DLSS to beat an old-school 1080 or 1080 Ti?



1 hour ago, asus killer said:

I blame Linus for this; we are missing a DLSS video.

 

 

Also, am I wrong in expecting something like a GTX 2050 with DLSS to beat an old-school 1080 or 1080 Ti?

Any GTX card will presumably be missing tensor cores, so DLSS will be out of the picture.

 

1 hour ago, Tech Enthusiast said:

Basically, there is no reason not to do the process for every game. It takes them only a few hours on their servers and amounts to a few MB of data for the user.

And the result will certainly be better if all the data points come from one game only. The art style can vary a lot more at the pixel level than you would notice when looking at the full picture.

But more data is more better. Always.

I just hope they add some generalized profiles eventually, because no matter how trivial it would be, you know they're never going to go back and add profiles for older games that aren't super active.

 

Maybe they just keep adding profiles for every new game that comes out, and then use that data to constantly update the "base" profile(s) that work on everything else. Maybe have 2-3 different generalized profiles for "realistic", "cartoony", etc., selectable through GFE.

 

I guess you can make the argument that older games are easier to run and you can therefore just use supersampling, but if the tech is anywhere near as good as they claim, then presumably you could bump up way higher with a DLSS "faked" resolution than with straight rendering.

2 minutes ago, Waffles13 said:

I guess you can make the argument that older games are easier to run and you can therefore just use supersampling, but if the tech is anywhere near as good as they claim, then presumably you could bump up way higher with a DLSS "faked" resolution than with straight rendering.

For older games you could also use it to upscale the output resolution above what the original game engine supported, maybe. That would be especially nice for games that didn't support widescreen well.

Just now, leadeater said:

For older games you could also use it to upscale the output resolution above what the original game engine supported, maybe. That would be especially nice for games that didn't support widescreen well.

I didn't even think of that, good point.

 

It could also be really useful for games without proper UI scaling. Playing old RTSes at 1440p sucks sometimes.


 

 

@WMGroomAK and @leadeater, Tim addressed DLSS a bit. Normal DLSS runs a game at some percentage lower than native resolution, then upscales with specialized per-game techniques. DLSS x2 runs at native resolution and then uses the specialized techniques to replace most of the AA functions. So, somehow, all of the takes were correct.

9 hours ago, leadeater said:

For older games you could also use it to upscale the output resolution above what the original game engine supported, maybe. That would be especially nice for games that didn't support widescreen well.

I'd like to see the original Shadow Warrior and Redneck Rampage upscaled.


10 hours ago, leadeater said:

For older games you could also use it to upscale the output resolution above what the original game engine supported, maybe. That would be especially nice for games that didn't support widescreen well.

Amazing point actually.

DLSS is doing a 64x supersampling anyways... oh, the possibilities!

1 hour ago, Taf the Ghost said:

 

@WMGroomAK and @leadeater, Tim addressed DLSS a bit. Normal DLSS runs a game at some percentage lower than native resolution, then upscales with specialized per-game techniques. DLSS x2 runs at native resolution and then uses the specialized techniques to replace most of the AA functions. So, somehow, all of the takes were correct.

 

Hmm.

So DLSS(x1) is actually a form of downscaling while preserving quality? That would explain why the performance jump is so crazy.

And DLSSx2 is the AA alternative we have all been talking about.

 

Interesting. That means we actually have to take a VERY close look at both versions, as they are basically two completely different things. Kinda confusing if true; at least the naming is confusing. But Nvidia kinda loves to be confusing with names. As if they would not get enough shit for that.

17 minutes ago, Tech Enthusiast said:

Hmm.

So DLSS(x1) is actually a form of downscaling while preserving quality? That would explain why the performance jump is so crazy.

And DLSSx2 is the AA alternative we have all been talking about.

 

Interesting. That means we actually have to take a VERY close look at both versions, as they are basically two completely different things. Kinda confusing if true; at least the naming is confusing. But Nvidia kinda loves to be confusing with names. As if they would not get enough shit for that.

The issue is the x1 vs. x2 naming, as both features use the DLSS engine to post-process the image (and it allows a huge chunk of functions to be moved to the otherwise inactive GPU units), but they're for completely different applications. With x1 it is being used as an upscaler, and with x2 it's being used as a post-process filter (at native resolution). The nomenclature is thus really confusing.

 

However, especially as an upscaler, I think that's a really good use of the Async ability. Run a game at 2880 x 1620 (so 1080p x 1.5), then upscale to 4K, so you can always maintain 60 Hz in practically every scenario. Plus, with high-refresh 4K monitors set to land next year, it's a way of being able to actually get into those upper ranges consistently, as even the 2080 Ti should struggle with 100 Hz in modern titles at 4K. (Obviously, esports titles are a massive exception.) Of course, Nvidia is basically banking on the fact that no one should be rocking a big enough monitor to be able to notice the difference, especially when people already argue for removing AA at 4K.
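As a quick sanity check on that suggestion, here is the back-of-envelope pixel math (the 2880 x 1620 input is the hypothetical figure from above, not a confirmed DLSS mode):

```python
# Shading cost of rendering at 2880x1620 and letting DLSS reconstruct a
# 4K output, compared with shading every pixel at native 4K.

def pixels(width, height):
    """Total pixel count of a framebuffer."""
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 pixels
dlss_in   = pixels(2880, 1620)   # 4,665,600 pixels

ratio = dlss_in / native_4k      # fraction of 4K shading work actually done
print(f"Shaded pixels vs native 4K: {ratio}")  # 0.5625, i.e. 9/16
```

So the shaders only touch about 56% of the pixels, which is roughly where the headroom for "always 60 Hz" would come from.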

 

24-inch/1080p monitors have a dot pitch of 0.2721 mm, while a 32-inch/4K display has a dot pitch of 0.1816 mm. Unless you're sitting really close to a 42-inch 4K display (do they make any gaming monitors that big? Okay, they do, and not as badly priced as I expected. They're clearly for stock traders, though), the difference won't really be noticeable when upscaling like that. 5K and 8K, sure, but those are much further off.
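For anyone wanting to check the dot pitch of their own panel, the geometry is simple. Note that advertised diagonals are rounded (a "24-inch" panel is often 23.8 inches), so results can differ slightly from figures quoted in spec sheets, including the ones above:

```python
# Dot pitch (physical pixel spacing) from diagonal size and resolution.
import math

def dot_pitch_mm(diagonal_inches, width_px, height_px):
    """Physical diagonal in mm divided by the diagonal length in pixels."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_inches * 25.4 / diagonal_px

pitch_1080p = dot_pitch_mm(24, 1920, 1080)   # ~0.277 mm
pitch_4k    = dot_pitch_mm(32, 3840, 2160)   # ~0.184 mm
print(round(pitch_1080p, 4), round(pitch_4k, 4))
```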

 

I was wrong, though, in thinking they'd actually be using DLSS for normal supersampling. I thought they might be rendering above the target resolution and then using DLSS to downscale and clean up the image.

1 hour ago, Tech Enthusiast said:

Amazing point actually.

DLSS is doing a 64x supersampling anyways... oh, the possibilities!

Well, that's not quite correct: DLSS is trained with 64x SS images. How DLSS is actually applied is, well, unknown/Nvidia secret sauce, but it won't be as visually effective as 64x SS, because it's extrapolating what the image would look like if it were 64x SS.

 

In fact you could train DLSS with 8K, 16K, 32K, etc. native renders, or at least I don't see why you couldn't.


It seems that regardless of which you use (x1 or x2), a huge amount of AA processing is being removed from the CUDA side, but with x1 not only is it not processing AA, it's not processing the full resolution either. No wonder they can claim huge jumps in performance from questionable increases in CUDA core count.


8 minutes ago, mr moose said:

It seems that regardless of which you use (x1 or x2), a huge amount of AA processing is being removed from the CUDA side, but with x1 not only is it not processing AA, it's not processing the full resolution either. No wonder they can claim huge jumps in performance from questionable increases in CUDA core count.

Personally I think it's really good that it can do both; one or the other alone would kind of suck.

2 hours ago, leadeater said:

Personally I think it's really good that it can do both; one or the other alone would kind of suck.

I am not gonna lie here. I try to be as neutral as possible... but the inner nerd in me is kinda freaking out about this.

If they can deliver pictures that look like 4K, even though they are not rendered at 4K... that is pure genius.

 

To make an analogy:

In a time when adding more horses to drag the cart is not always possible, they started reducing the weight of the cart while maintaining the load it can carry.

 

Can this, for example, deliver 8K pictures while only rendering them in 4K? I mean, those cards can handle 4K@60, right? This could be a huge bang: people freak out about missing performance, and suddenly they deliver 8K on a single GPU. Kinda fake 8K, but still. Damn.

 

Just pure genius. Chapeau!


Does anyone know of any up-to-date software like Nvidia Inspector? It's obviously a bit early to assume anything, but if you can currently force a game to use the SLI profile of another game with third-party software, then I wonder if you could force DLSS on with a not-ideal-but-technically-functional profile in unsupported games.

 

Hopefully they don't 100% lock the setting into GFE.

6 minutes ago, asus killer said:

Wasn't today the day the embargo lifted?

Yeah, I think so. I expect that in a few hours, or maybe this afternoon, my YT sub feed will explode. Unless it's something like 7 pm Pacific Time, then RIP for Europe :D


CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Audio-GD NFB-11.28 + Focal Elear Headphones


@asus killer I did a little digging, and apparently, going by Reddit, the previous Pascal launch pattern, and a VideoCardz leak, it's going to be 6 am Pacific, so less than 2 hours from now. The hype is real.

 

Sure, not 100% confirmed but it's a solid bet.



