
DLDSR in Cyberpunk

DeerDK

Helloooo Everybody!

After getting a tip about using DLDSR with my combination of a 1080p 144 Hz monitor and my 4070 Super, I tried activating it in Cyberpunk 2077 (Phantom Liberty). Disclaimer: I can't remember right now whether I'm using DSR or DLDSR. But wow, just wow. I was very impressed with how smooth the lines and textures became. I've always considered it a very beautiful game, but it did have a bit of a tendency toward jagged lines, even with all the AA etc. turned on, so I'm very pleased with the smoother look provided by 1440p.

My performance does seem to take a bit of a hit, though. I can't remember all the specifics, but I'm playing with all the raster settings at max and RT reflections turned on, since I prefer how they make the water look, and I'm getting around 47 fps. It's good enough for me, but I did look up that the 4070 Super should provide around 86 fps at 1440p pure raster and 43 fps with RT on.

I did try turning off RT entirely, but I don't recall reaching 86 fps. I may be wrong, though, and will have to revisit the settings to confirm or debunk that notion.

 

My question: does DLDSR drain FPS beyond the cost of rendering at a higher resolution? Does it put a heavier load on the CPU or RAM?

Or could I be bottlenecked by RAM or CPU in terms of raw performance in Cyberpunk? I am aware it's a very demanding game.

My CPU is a 5600X, and I have 2x8 GB of 3200 MHz CL16 RAM.

Motherboard is a ROG STRIX B550-I GAMING.

 

One other possible factor is that I play around a bit with undervolting both the CPU and GPU for temperature benefits.

 

Anyway, I just wanted to share my excitement about how good it made my 1080p game look, and maybe discuss the technicalities. Again, I am happy with the current performance, but I've fallen down the tinkering rabbit hole of PC building and I'm not sure how to get out xD

mITX is awesome! I regret nothing (apart from when picking parts or having to do maintenance *cough*cough*)


The 5600X might be a little bit of a bottleneck, but I can't imagine it would be a 40 fps bottleneck; maybe 20 at the most.

Apprentice Software Developer


52 minutes ago, DeerDK said:

My question: does DLDSR drain FPS beyond the cost of rendering at a higher resolution? Does it put a heavier load on the CPU or RAM?

I don't know how much CPU the Nvidia driver needs, but most of the load should be on the GPU. From the rest of the system's perspective, it is mainly the rendering at a higher resolution.
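That point can be put in rough numbers: the extra GPU load scales mostly with the rendered pixel count. A quick back-of-the-envelope sketch (assuming a 1920x1080 native display and Nvidia's standard 1.78x/2.25x DLDSR and 4.00x DSR factors):

```python
# Rough pixel-count comparison: most of the DLDSR/DSR cost is simply
# rendering more pixels before the image is scaled back down to native.
def pixels(width, height):
    return width * height

native = pixels(1920, 1080)       # 1080p display
dldsr_178 = pixels(2560, 1440)    # DLDSR 1.78x factor -> 1440p internal
dldsr_225 = pixels(2880, 1620)    # DLDSR 2.25x factor
dsr_400 = pixels(3840, 2160)      # DSR 4.00x factor -> 4K internal

print(round(dldsr_178 / native, 2))  # 1.78 -- ~78% more pixels than native
print(round(dldsr_225 / native, 2))  # 2.25
print(round(dsr_400 / native, 2))    # 4.0
```

So a 1440p internal resolution means roughly 1.78x the pixels of native 1080p, before any DLDSR-specific overhead is even counted.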

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Alienware AW3225QF (32" 240 Hz OLED)
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, iiyama ProLite XU2793QSU-B6 (27" 1440p 100 Hz)
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Yes, true "upscaling" (actually downsampling) is potentially an actual bottleneck, as the requirements grow quickly with the rendered pixel count.

 

Also, btw, welcome to ca. 2008, when people started using this tech to "get rid of the lines"... which Nvidia only later copied and officially incorporated into their drivers.

 

Originally called OGSSAA (Ordered Grid SuperSampling AntiAliasing), btw.

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


1 minute ago, Mark Kaine said:

Yes, true "upscaling" (actually downsampling) is potentially an actual bottleneck, as the requirements grow quickly with the rendered pixel count.

 

Also, btw, welcome to ca. 2008, when people started using this tech to "get rid of the lines"... which Nvidia only later copied and officially incorporated into their drivers.

 

Originally called OGSSAA (Ordered Grid SuperSampling AntiAliasing), btw.

Who cares if it's old? I joined the enthusiast community way later and hadn't heard about this, as everyone always talks about upscaling. I'm excited, and I do think it provides a good middle ground for people with a good 1080p monitor who have just gotten a new GPU that is technically a bit overpowered for the monitor.

 

I get that I'm now running 1440p with all that entails, but does the downscaling process add additional load to my hardware?



4 minutes ago, DeerDK said:

Who cares if it's old? I joined the enthusiast community way later and hadn't heard about this, as everyone always talks about upscaling. I'm excited, and I do think it provides a good middle ground for people with a good 1080p monitor who have just gotten a new GPU that is technically a bit overpowered for the monitor.

 

I get that I'm now running 1440p with all that entails, but does the downscaling process add additional load to my hardware?

Not really; the rendering at a higher resolution does, tho...

 

imo it's only worth it if you can at least reach a stable 60 fps, but that's just my personal preference.

 

Also, while 1440p downsampling can provide some advantages, I never really used it much; 4K is way better on a 1080p monitor because 4K maps to 1080p at an exact 2:1 ratio per axis (four rendered pixels per display pixel), so there should be no scaling artifacts, in theory.

 

tldr: yeah, it's a neat tech, but only worth it if performance doesn't suffer too much imho.
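The 2:1 point about 4K on a 1080p panel can be sanity-checked with quick arithmetic (a sketch assuming a 1920x1080 display and the usual DSR target resolutions):

```python
# An integer per-axis ratio means each display pixel averages a whole
# grid of rendered pixels (2x2 for 4K -> 1080p); a fractional ratio
# (1440p -> 1080p) forces filtering across pixel boundaries instead.
DISPLAY_W, DISPLAY_H = 1920, 1080

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    ratio_w, ratio_h = w / DISPLAY_W, h / DISPLAY_H
    clean = ratio_w.is_integer() and ratio_h.is_integer()
    print(f"{name}: {ratio_w:.2f}x per axis, integer scale: {clean}")
# 1440p: 1.33x per axis, integer scale: False
# 4K: 2.00x per axis, integer scale: True
```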


 


 

 

 


8 minutes ago, Mark Kaine said:

also btw welcome to ca. 2008 when people started using this tech to "get rid of the lines"... which nvidia only later copied and officially incorporated into their drivers.

The basics are way older than that; it's just a basic technique in signal processing. I looked at it at university in the '90s, and it wasn't new then either.

https://en.wikipedia.org/wiki/Oversampling

 

1 minute ago, DeerDK said:

I get that I'm now running 1440p with all that entails, but does the downscaling process add additional load to my hardware?

As you mentioned, there will be increased load from having to render at the higher resolution in the first place. It is less clear whether there is any other significant load outside of the GPU for the DLDSR processing itself; I doubt it would be significant if there is. It could be tested by running at native resolution and comparing performance against (DL)DSR at the same rendering resolution.
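That comparison boils down to a couple of fps readings. A minimal sketch (the fps numbers below are placeholders, not measurements; the idea is to compare the same internal resolution with and without the DLDSR pass):

```python
# Isolate DLDSR's own overhead: render the same scene at the same
# internal resolution (a) natively and (b) through DLDSR, then compare.
def overhead_pct(fps_native, fps_dldsr):
    """Percent of fps lost beyond the higher-resolution rendering cost."""
    return (fps_native - fps_dldsr) / fps_native * 100

# Placeholder example: native 1440p at 60 fps vs DLDSR-to-1080p at 57 fps
# would mean the DLDSR pass itself costs about 5%.
print(round(overhead_pct(60, 57), 1))  # 5.0
```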



8 minutes ago, porina said:

The basics are way older than that; it's just a basic technique in signal processing. I looked at it at university in the '90s, and it wasn't new then either.

https://en.wikipedia.org/wiki/Oversampling

 

As you mentioned, there will be increased load from having to render at the higher resolution in the first place. It is less clear whether there is any other significant load outside of the GPU for the DLDSR processing itself; I doubt it would be significant if there is. It could be tested by running at native resolution and comparing performance against (DL)DSR at the same rendering resolution.

 

9 minutes ago, Mark Kaine said:

Not really; the rendering at a higher resolution does, tho...

 

imo it's only worth it if you can at least reach a stable 60 fps, but that's just my personal preference.

 

Also, while 1440p downsampling can provide some advantages, I never really used it much; 4K is way better on a 1080p monitor because 4K maps to 1080p at an exact 2:1 ratio per axis (four rendered pixels per display pixel), so there should be no scaling artifacts, in theory.

 

tldr: yeah, it's a neat tech, but only worth it if performance doesn't suffer too much imho.

Yeah, I definitely have to cut out the RT so I can compare the performance with the review benchmarks for the 4070 Super.

4K would be great, but I would need a bigger GPU for that. Or play around with DLSS. But the last time I saw DLSS in effect, the NPC faces looked like they had been smeared with foundation. That was DLSS to 1080p, though, and that does have a bad reputation.

 

But as I said, I should probably test without RT before drawing final conclusions. And maybe run the GPU at stock settings 😅



5 minutes ago, DeerDK said:

But as I said, I should probably test without RT before drawing final conclusions. And maybe run the GPU at stock settings 😅

Testing to see how things work and are impacted by settings and hardware can be fun, as is playing the game 😄 For gaming, I just find a setting that looks good at acceptable performance and then get on with it.



3 minutes ago, porina said:

Testing to see how things work and are impacted by settings and hardware can be fun, as is playing the game 😄 For gaming, I just find a setting that looks good at acceptable performance and then get on with it.

Yeah. And the great visuals make up for it in my book. 47 fps is okay, especially as I don't have the snappiest reactions anyway. I tend to use quickhacks, since I can both slow the situation down to get an overview and reduce the number of enemies at a time.



I use DLDSR on a 1440p monitor up to 4K res, then DLSS on top lol

 

-13600KF

-4000 MHz 32 GB RAM

-4070 Ti Super duper

