
Death Stranding (PC) supports DLSS 2.0, allowing 4K at 60+ FPS on any RTX GPU

illegalwater
50 minutes ago, D13H4RD said:

It's not exactly making it from nothing, as the model is trained using ultra-high-resolution samples, so it has an approximate idea of how to upscale the scene.

 

And there actually are implementations of artificial-intelligence-based upscalers, like the aforementioned Topaz Gigapixel, but also Photoshop's Preserve Details 2.0 upscaling algorithm, and even waifu2x, which was specifically trained to upscale anime-style images, up to a point.

That is making shit from nothing. There is no way a prediction model can know what should be on the output side out of the 700000000000000000 trillion possible scene positions within an entire game, or how something that isn't there should look so it can upscale it. Comparing that to upscaling what's essentially a sequence of 2D images just doesn't hold up. I've done image and video upscaling using advanced algorithms, and while they deliver pretty impressive results, they can only go so far.

 

A few examples of advanced upscaling I did years ago:

https://rejzor.wordpress.com/2015/05/26/system-shock-2-high-definition-cutscenes/
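
As an aside, classic (non-AI) upscaling like that is just resampling; here's a minimal sketch in Python using Pillow (file names are placeholders) of the kind of interpolation those tools build on. Every output pixel is blended from existing input pixels, which is why it can't invent detail:

```python
# Minimal sketch of classic (non-AI) upscaling with Pillow.
# Lanczos resampling interpolates between existing pixels only,
# so it can never add detail the source never captured.
from PIL import Image

src = Image.open("frame_lowres.png")               # placeholder file name
w, h = src.size
big = src.resize((w * 4, h * 4), Image.LANCZOS)    # 4x upscale
big.save("frame_upscaled.png")
```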

 

And we're talking 2D images. Games are 3D data that's not recorded ahead of time but rendered in realtime for every tiny move the player makes anywhere within that 3D space. Saying you can train something to do that is just bullshit. NVIDIA is doing something far more simplistic behind the scenes and then blowing it out of proportion by calling it "AI whatever". Because if they were really doing some AI godwork on the output, we'd have to run supercomputers to do these kinds of approximations. And even then it doesn't make sense.

Training DLSS for a particular game so it renders details in full and cuts down the rest isn't some magic either; it's a matter of knowing how the depth buffer for that title works, adjusting how the in-game world is perceived by DLSS, and making sure it renders as many things in full detail as possible while leaving unimportant stuff out to save cycles and increase performance. And they use tensor cores to do that kind of filtering. The training is for depth interaction and for deciding where detail can be cut, not for teaching it how to magically recover detail from a low resolution. A racing game will have a different out-of-focus zone than an FPS or a third-person adventure, and they can optimize titles for that.

 

AMD is doing rather similar but more basic trickery with that dynamic resolution feature based on mouse movement. And I know they've improved it a bit to retain detail in static elements like the GUI while scaling the rest, so the crosshair and HUD stay sharp while everything else gets downscaled. I'm willing to bet NVIDIA does something similar with DLSS, just in a more advanced way, and presents it as "AI something" because that's the marketing speak of today. Even if you're selling shovels, you're selling AI shovels today.
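
For what it's worth, the basic idea behind that kind of motion-based dynamic resolution is simple enough to sketch. A toy example in Python, with all thresholds and numbers invented for illustration (this is a guess at the general technique, not AMD's actual implementation):

```python
# Toy sketch of motion-based dynamic resolution scaling: pick an
# internal 3D render scale from how fast the camera is turning,
# while the HUD/crosshair is always drawn at native resolution.
# All thresholds are invented for illustration.

def choose_render_scale(camera_deg_per_sec: float) -> float:
    """Faster camera motion -> lower internal render resolution,
    because motion blur hides the loss of detail anyway."""
    if camera_deg_per_sec > 180.0:       # fast flick
        return 0.6
    if camera_deg_per_sec > 60.0:        # moderate pan
        return 0.8
    return 1.0                           # static view: full res

NATIVE_W, NATIVE_H = 2560, 1440

for speed in (0.0, 90.0, 300.0):
    s = choose_render_scale(speed)
    print(f"{speed:5.0f} deg/s -> 3D at {int(NATIVE_W*s)}x{int(NATIVE_H*s)}, "
          f"HUD at {NATIVE_W}x{NATIVE_H}")
```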


5 minutes ago, RejZoR said:

Because if they were really doing some AI godwork on the output, we'd have to run supercomputers to do these kinds of approximations.

My understanding of "AI" used in this context isn't great, but I can at least offer the following. In essence there are two parts to it: the training, then the application of the training. The training is the hard part; that's where you need a lot of compute, whether or not you call it a supercomputer. Take the data, generate the model. NVIDIA does that part for you. Applying the trained model is much easier, which is what enables it to run locally on consumer hardware. In other words, writing a program is difficult, but once it's done, running it is relatively easy. In this case it isn't a conventionally written program, but one produced through machine learning.
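
To make that concrete, here's a toy Python/NumPy sketch of the same split. A real DLSS-style model is a deep neural network, not a single matrix, but the shape of the process is the same: expensive fitting done once upstream, then a cheap per-frame application of the result:

```python
import numpy as np

# Toy stand-in for the train/apply split. Here a single linear
# upscaling matrix is learned from (low-res, high-res) pairs,
# purely to show the shape of the process.
rng = np.random.default_rng(0)
LOW, HIGH = 16, 64                      # toy "resolutions" (pixels)

# Fake training data: high-res frames and their downsampled versions.
high_frames = rng.random((100, HIGH))
low_frames = high_frames.reshape(100, LOW, HIGH // LOW).mean(axis=2)

# --- Training (done once, on big hardware) ---
# Least squares: find W so that low @ W approximates high.
W, *_ = np.linalg.lstsq(low_frames, high_frames, rcond=None)

# --- Application (done per frame, cheap) ---
def upscale(low_frame: np.ndarray) -> np.ndarray:
    return low_frame @ W                 # one matrix multiply

print(upscale(low_frames[0]).shape)      # (64,)
```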

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


9 hours ago, RejZoR said:

If they can pull off just what the RTGI shader for ReShade can do, that would be insane. Coz even though it's just screen-space ambient occlusion and lighting, it creates such an amazing effect on top of nearly any game. Games from 2009 feel like they have better lighting, shadowing and depth with it than current new games.

IKR, I was playing some CoD MW3 (one of the last CoD games worth playing) and the lighting was absolutely beautiful: you could see the shadows as leaves moved across the ground and bodies fell, the waves reflected light, and fires cast a glow on some of the night maps. All that time I was thinking "holy fuck, the only game RTX really makes sense for is Minecraft".

AMD blackout rig

 

cpu: Ryzen 5 3600 @ 4.4 GHz @ 1.35 V
gpu: RX 5700 XT @ 2200 MHz
ram: Vengeance LPX C15 3200 MHz
mobo: Gigabyte B550 Aorus Pro
psu: Cooler Master MWE 650 W
case: MasterBox MB520
fans: Noctua Industrial 3000 RPM x6

 

 


Nice that the game supports this; the game looks poopy though (not graphically).

----Ryzen 9 5900X----X570 Aorus Elite----Vetroo V5----240GB Kingston HyperX 3K----Samsung 250GB EVO 840----512GB Kingston NVMe----3TB Seagate----4TB Western Digital Green----8TB Seagate----32GB Patriot Viper 4 3200MHz CL16----PowerColor Red Dragon 5700 XT----Fractal Design R4 Black Pearl----Corsair RM850w----


I'm so happy about this: one, DLSS 2.0 is great; two, Death Stranding is an amazing game, so I'm glad it's been ported well. It was great on PS4, but the frame times were inconsistent and the resolution was only okay. Roll on release day.

My Current Build: https://uk.pcpartpicker.com/list/36jXwh

 

CPU: AMD - Ryzen 5 3600X | CPU Cooler: Corsair H150i PRO XT | Motherboard: Asus - STRIX X370-F GAMING | RAM: G.SKILL Trident Z RGB 2x8GB DDR4 @3000MHz | GPU: Gigabyte - GeForce RTX 2080 Ti 11 GB AORUS XTREME Video Card | Storage: Samsung - 860 EVO 250GB M.2-2280 - SanDisk SSD 240GB - SanDisk SSD 1TB - WD Blue 4TB | PSU: Corsair RM (2019) 850 W 80+ Gold Certified Fully Modular ATX Power Supply | Case: Corsair Obsidian 500D RGB SE ATX Mid Tower Case | System Fans: Corsair - ML120 PRO RGB 47.3 CFM 120mm x 4 & Corsair - ML140 PRO RGB 55.4 CFM 140mm x 2 | Display: Samsung KS9000 | Keyboard: Logitech - G613 | Mouse: Logitech - G703 | Operating System: Windows 10 Pro


Well, my GPU supports RTX but doesn't support DLSS, so I am not that interested in it.

It's a cool technology though.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

3 hours ago, Vishera said:

Well, my GPU supports RTX but doesn't support DLSS, so I am not that interested in it.

It's a cool technology though.

And what GPU supports RTX but not DLSS? AMD doesn't have one yet, and NVIDIA's cards should all support both or neither.


6 hours ago, RejZoR said:

And what GPU supports RTX but not DLSS? AMD doesn't have one yet, and NVIDIA's cards should all support both or neither.

I have a GTX 1660 and it supports RTX; I even played Control with RTX ON, but DLSS is grayed out:

On 8/28/2019 at 5:26 PM, Vishera said:

I have results for 1080p :)

 

1080p, Max settings - RTX OFF: [screenshot]

1080p, Max settings - RTX MEDIUM: [screenshot]

1080p, Max settings - RTX HIGH: [screenshot]

 

Sorry for the picture quality (compression)

 

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

10 hours ago, Vishera said:

I have a GTX 1660 and it supports RTX; I even played Control with RTX ON, but DLSS is grayed out:

I had forgotten about that. There was a driver update that enabled "RTX" on some Pascal/Turing GTX cards. So it's kind of a software-only support, given those cards don't have dedicated hardware for it.

 

Nice to see performance doesn't totally suck in that title, even if it is rather "cinematic".

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


That doesn't mean the GTX 1660 supports RTX (or more specifically, DXR in hardware). I can enable it on my GTX 1080 Ti too in those few games that support it, but that doesn't mean it works well or is actually hardware accelerated, just that it can run. You can see that in your screenshots from the massive framerate drop. I get the same, maybe a bit less since my card is more powerful to begin with, but since there is no hardware acceleration for ray tracing, the performance hit is still massive. Also, a dead giveaway is the GTX in your card's name. Just like on mine...


On 7/2/2020 at 4:44 PM, RejZoR said:

(...) Even if you're selling shovels, you're selling AI shovels today.

Have you seen the results waifu2x provides? Some images I could upscale 16x and the viewer would be none the wiser that it was an upscale; of course, on others the results were no better than classic upscaling.

Just the way you upscaled those System Shock videos, NVIDIA can 'train' DLSS to use just the right methods by comparing the upscaled version with the same frame rendered at full resolution. It won't create detail out of thin air, but it may produce the best upscaled version for the given scene, and with NVIDIA's resources its guess should be solid. I suspect there will be many cases where DLSS adds something that isn't there or just makes it look worse, but on average it should be the best algorithm; if not now, then soon.
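
That "compare with the same frame rendered at full resolution" step boils down to an image-similarity score. As a minimal illustration, here's PSNR in Python/NumPy, one common such metric (not necessarily what NVIDIA actually optimizes):

```python
import numpy as np

def psnr(upscaled: np.ndarray, ground_truth: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit frames;
    higher means the upscale is closer to the reference render."""
    mse = np.mean((upscaled.astype(np.float64)
                   - ground_truth.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")              # identical images
    return 20 * np.log10(255.0) - 10 * np.log10(mse)

# E.g. score two candidate upscales of the same frame against the
# full-resolution render and keep whichever scores higher.
```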


  • 2 weeks later...

Here's a video from Digital Foundry covering DLSS and FidelityFX

tl;dw: DLSS looks better and runs significantly faster than native-res TAA, and it totally destroys FidelityFX in visual quality.

Dell S2721DGF - RTX 3070 XC3 - i5 12600K

