Report: AMD's competitor to Nvidia DLSS, FidelityFX Super Resolution, to launch in Spring

Random_Person1234
2 minutes ago, gabrielcarvfer said:

Or maybe you meant Foveated rendering, which maintains high resolution on the focal point of the eyes and gradually reduces as the image gets further away from it (requires eye-tracking).

Nah, it was VRS, but this is also cool!


7 hours ago, SlimyPython said:

Idea: Make resolutions lower in certain areas where there isn't much difference in colour.

 

Have 2 settings for this. One being for colour difference before it makes a new pixel and another for the pixel size.

 

It would work well for 2D games though idk about 3D games

But like I said before, that would reduce image quality to increase performance.

DLSS and FidelityFX are about increasing image quality without taking a hit to performance.

 

It's apples and oranges.
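For the curious, SlimyPython's idea above boils down to threshold-based block merging. A minimal sketch in Python, with the two proposed settings mapped to the hypothetical parameters `color_threshold` and `block_size`:

```python
import numpy as np

def adaptive_downsample(img: np.ndarray, block_size: int = 4,
                        color_threshold: float = 10.0) -> np.ndarray:
    """Merge near-uniform blocks into single 'big pixels'.

    img: H x W x 3 array. Blocks whose internal colour spread (largest
    per-channel range) is below color_threshold are flattened to their
    mean colour; detailed blocks are left untouched.
    """
    out = img.astype(np.float32).copy()
    h, w = img.shape[:2]
    for y in range(0, h - block_size + 1, block_size):
        for x in range(0, w - block_size + 1, block_size):
            block = out[y:y + block_size, x:x + block_size]
            # Colour spread = widest per-channel min/max range in the block.
            spread = (block.max(axis=(0, 1)) - block.min(axis=(0, 1))).max()
            if spread < color_threshold:
                block[...] = block.mean(axis=(0, 1))
    return out
```

Note that applying this after rendering only changes the image; to gain any performance, the renderer would have to skip shading those pixels in the first place, which is the apples-and-oranges point above.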


2 minutes ago, LAwLz said:

But like I said before, that would reduce image quality to increase performance.

DLSS and FidelityFX are about increasing image quality without taking a hit to performance.

 

It's apples and oranges.

No, DLSS is really about improving performance without losing quality... except you do lose quality in a lot of scenarios, especially on smaller details like raindrops or raindrop splashes and stuff like that. Luckily for NVIDIA, you also gain perceivable detail, usually in the form of extra sharpness on some elements.

 

Btw, foveated rendering was used in Shadow Warrior 2. It was named differently and was actually an NVIDIA-branded feature. It works, but it was funky to see for the first time: the center of the screen is at native resolution, and outside the field of focus the resolution gets significantly lower. The problem is that the game was visually interesting, the kind where you want to stop and observe the world, and the blocky peripheral vision kept annoying me. So I turned it off. I could see it being useful in fast-paced shooters where you keep your focus in the middle of the screen anyway.
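As a rough illustration of the falloff described here (not Shadow Warrior 2's actual implementation), a per-pixel resolution scale based on distance from the focal point could be sketched like this; the radii and minimum scale are made-up tuning values:

```python
import math

def foveated_scale(px: float, py: float, fx: float, fy: float,
                   inner_radius: float = 0.2, outer_radius: float = 0.6,
                   min_scale: float = 0.25) -> float:
    """Resolution scale for a pixel at (px, py) given gaze point (fx, fy).

    All coordinates are normalized to [0, 1]. Inside inner_radius the
    image stays at native resolution (scale 1.0); beyond outer_radius it
    drops to min_scale; in between it falls off linearly. With eye
    tracking, (fx, fy) follows the eyes; without it (the Shadow Warrior 2
    case) it is pinned to the screen centre, which is why looking around
    reveals the blocky periphery.
    """
    dist = math.hypot(px - fx, py - fy)
    if dist <= inner_radius:
        return 1.0
    if dist >= outer_radius:
        return min_scale
    t = (dist - inner_radius) / (outer_radius - inner_radius)
    return 1.0 + t * (min_scale - 1.0)
```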


27 minutes ago, RejZoR said:

No, DLSS is really about improving performance without losing quality... except you do lose quality in a lot of scenarios, especially on smaller details like raindrops or raindrop splashes and stuff like that. Luckily for NVIDIA, you also gain perceivable detail, usually in the form of extra sharpness on some elements.

No it is not. DLSS is about increasing image quality, and you do not lose image quality.

DLSS is image upscaling. Why do you upscale images? To increase image quality.

 

You are getting technology mixed up with the use case.

DLSS is "let's take a 720p image and upscale it to 1080p, making it look better without increasing load on the GPU too much".

When you hear DLSS you probably think "an upscaled 1080p frame took less computational power to render than a native 1080p frame, but it doesn't look as good".

 

The first is what DLSS actually is. The second thing is an example of how it can be used. You are letting a single use case define an entire technology.

 

Stop thinking of DLSS as "which is better, DLSS or native high res" and start thinking of DLSS as what it actually is: image upscaling.

 

 

DLSS is an entire toolbox that has a wide range of applications. You can't define an entire toolbox by how one of the tools performs in one particular application.


14 minutes ago, LAwLz said:

No it is not. DLSS is about increasing image quality, and you do not lose image quality.

DLSS is image upscaling. Why do you upscale images? To increase image quality.

You actually kinda do, and it is most apparent when your target resolution is 1080p. I've tried it, and it is noticeably blurrier at 1080p. Play at 1440p and especially 4K, and you'll actually see an improved final output.

 

For 1440p and especially 4K, that's when DLSS well and truly shines.



On 2/8/2021 at 9:53 AM, WikiForce said:

Finally we have it; now they will kick Nvidia's ass. Hopefully they also bring a Navi refresh with G6X memory.

They don't need G6X for these chips. G6X is a mess right now; it's power-hungry.

 

19 hours ago, leadeater said:

Sadly AMD's timeline of execution, measured by quality and completeness, is longer compared to Nvidia's. I'm actually not hoping for much better than DLSS 1.0 for their first go around.

AMD can basically already do DLSS 1.0 with a few tweaks in the drivers. I'd love to see the upscalers used on the consoles come to AMD GPUs; even without anything like DLSS, that would allow you to run games at 75-85% resolution and not notice.

19 hours ago, Vishera said:

Don't forget that NVIDIA is a giant in the AI industry; AMD has no chance to compete with DLSS.

They are a giant for some workloads but not all.

19 hours ago, Vishera said:

The problem is that NVIDIA already laid out the framework for the adoption of RTX; AMD coming late to the game can kill the adoption of its Ray Tracing technology.

Also, NVIDIA is not the only competitor: Unigine has really good Ray Tracing technology, while Cry Engine doesn't.

Assuming games are built on the DX12 ray tracing extension, AMD getting RT support shouldn't be that much work, but for some games this will be a repeat of GameWorks hair.



10 minutes ago, D13H4RD said:

You actually kinda do, and it is most apparent when your target resolution is 1080p. I've tried it, and it is noticeably blurrier at 1080p. Play at 1440p and especially 4K, and you'll actually see an improved final output.

 

For 1440p and especially 4K, that's when DLSS well and truly shines.

There isn't enough data when you scale, say, 600p to 1080p, but with 720p to 1440p or 1080p to 4K there is a lot more to work with.



Just now, GDRRiley said:

There isn't enough data when you scale, say, 600p to 1080p, but with 720p to 1440p or 1080p to 4K there is a lot more to work with.

Yep. DLSS at 1080p isn't really advised, as the base resolution would be too low for the AI to upscale with the desired level of quality; there's too little information to work with.
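To put rough numbers on that, here are the internal render resolutions implied by DLSS 2.0's commonly reported per-axis scale factors (approximate values; exact ratios vary by title and mode):

```python
# Per-axis scale factors as commonly reported for DLSS 2.0 modes
# (approximate; treat these as illustrative assumptions).
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
OUTPUTS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for out_name, (w, h) in OUTPUTS.items():
    for mode, s in MODES.items():
        iw, ih = round(w * s), round(h * s)
        # Source pixel count is what the reconstruction has to work with.
        print(f"{out_name} {mode}: renders {iw}x{ih} "
              f"({iw * ih / 1e6:.1f} MP of source data)")
```

A 4K Quality frame starts from roughly four times as many source pixels as a 1080p Quality frame, which is why the reconstruction holds up so much better at higher output resolutions.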



8 hours ago, SlimyPython said:

Idea: Make resolutions lower in certain areas where there isn't much difference in colour.

 

Have 2 settings for this. One being for colour difference before it makes a new pixel and another for the pixel size.

 

It would work well for 2D games though idk about 3D games

Problem is, 2D games most likely don't need the performance as they are generally easy to run.



12 minutes ago, GDRRiley said:

AMD can basically already do DLSS 1.0 with a few tweaks in the drivers.

That's not really an accomplishment. DLSS 1.0 was generally regarded as something you want to have turned off.

 

Even 2.0 will only be comparable to native resolution when set to the Quality mode. Anything lower will result in a noticeably lower quality image.

 

I have no problem with AMD implementing similar tech to DLSS, but I just think people should have realistic expectations about it. It will most likely not be as good as Nvidia's DLSS in the near future, just like we see with ray tracing performance right now.



42 minutes ago, LAwLz said:

No it is not. DLSS is about increasing image quality, and you do not lose image quality.

DLSS is image upscaling. Why do you upscale images? To increase image quality.

 

You are getting technology mixed up with the use case.

DLSS is "let's take a 720p image and upscale it to 1080p, making it look better without increasing load on the GPU too much".

When you hear DLSS you probably think "an upscaled 1080p frame took less computational power to render than a native 1080p frame, but it doesn't look as good".

 

The first is what DLSS actually is. The second thing is an example of how it can be used. You are letting a single use case define an entire technology.

 

Stop thinking of DLSS as "which is better, DLSS or native high res" and start thinking of DLSS as what it actually is: image upscaling.

 

 

DLSS is an entire toolbox that has a wide range of applications. You can't define an entire toolbox by how one of the tools performs in one particular application.

Pal, you have that all backwards. DSR is there to improve quality, at a performance penalty. DLSS's sole reason for existence is to increase performance with perceptibly no degradation in quality (even though it is there). VSR uses the same approach: it degrades quality in parts where you can't perceive it anyway, even if it's measurably there. If you see it any other way, you either don't understand the tech or you're interpreting it all wrong. You're not rendering that 1440p frame at 720p to gain quality. You're doing it to gain performance. And DLSS is doing exactly that.


30 minutes ago, RejZoR said:

Pal, you have that all backwards. DSR is there to improve quality, at a performance penalty. DLSS's sole reason for existence is to increase performance with perceptibly no degradation in quality (even though it is there). VSR uses the same approach: it degrades quality in parts where you can't perceive it anyway, even if it's measurably there. If you see it any other way, you either don't understand the tech or you're interpreting it all wrong. You're not rendering that 1440p frame at 720p to gain quality. You're doing it to gain performance. And DLSS is doing exactly that.

No, you're the one who got it all wrong. It is even in the name.

And believe me, I understand it. I think you're the one who doesn't understand it. You are too narrow-minded to fully understand it. You have seen it used in one way and think that is all there is to it. DLSS is by itself just image upscaling. DLSS is "let's take an image and make it bigger". That is all there is to it. Just because YOU then compare that output image to a natively rendered one and compare performance between upscaling and running natively does not mean that is an inherent part of the technology.

 

DLSS is "let's take this frame and make it higher resolution". That's it. 

The fact that you are using it to get away with rendering a game at a lower resolution is not something that is inherent to the technology itself. It's just your use case for it. 

 

If you don't understand this then I don't think it is worth discussing this any further with you.


1 hour ago, RejZoR said:

Pal, you have that all backwards. DSR is there to improve quality, at a performance penalty. DLSS's sole reason for existence is to increase performance with perceptibly no degradation in quality (even though it is there). VSR uses the same approach: it degrades quality in parts where you can't perceive it anyway, even if it's measurably there. If you see it any other way, you either don't understand the tech or you're interpreting it all wrong. You're not rendering that 1440p frame at 720p to gain quality. You're doing it to gain performance. And DLSS is doing exactly that.

I have done extensive research on DLSS, VSR and DSR.

DSR has no performance penalty at all. Its goal is to bring higher resolutions to all monitors.

So if you have a 1080p display and want to play at 4K resolution, you just need to enable DSR and change the resolution to 4K.

AMD's VSR does it as well.

AMD's VSR is superior to DSR: VSR uses scaling, while DSR uses weird hacks.

The difference in image quality between DSR and VSR is significant, not to mention that DSR has artifacts and bugs due to its use of hacks.

 

DLSS is a totally different beast.

It uses AI to increase the quality of the original image.

The higher the resolution of the source image, the better the result.

 

@LAwLz

 


42 minutes ago, LAwLz said:

No, you're the one who got it all wrong. It is even in the name.

And believe me, I understand it. I think you're the one who doesn't understand it. You are too narrow-minded to fully understand it. You have seen it used in one way and think that is all there is to it. DLSS is by itself just image upscaling. DLSS is "let's take an image and make it bigger". That is all there is to it. Just because YOU then compare that output image to a natively rendered one and compare performance between upscaling and running natively does not mean that is an inherent part of the technology.

 

DLSS is "let's take this frame and make it higher resolution". That's it. 

 

If you don't understand this then I don't think it is worth discussing this any further with you.

Your logic is just straight up broken. DLSS is a performance-improving feature. Full stop. Its sole reason for existence is to boost framerate to make RTX usable, because ray tracing by itself is so demanding. I have absolutely no idea where you got the idea that rendering things at a lower resolution and measurably losing quality is somehow "a visual quality enhancing feature". How?! It just makes absolutely no sense. It's a performance-improving feature in the same way as VSR or angle-dependent anisotropic filtering. You're REMOVING quality to gain performance, but you're doing it in such a way that the quality drop is almost imperceptible. Which is the whole point of it. You boost performance by what, up to 50%, at hardly noticeable degradation in quality? I'd take that any time. And so would many gamers.

 

@Vishera

Oh dear, you have things mixed up even further...

 

DSR does in fact have a performance hit, and it's the hit of whatever resolution you picked. If you have a 1080p display and you pick DSR at 4K, you're rendering the game at 4K and downscaling to 1080p. The performance hit is the 4K rendering resolution plus a tiny loss for downsampling to your native resolution.

 

VSR is Variable Rate Shading and examines where you can use lower shading quality to gain performance without perceptibly losing quality. There is no point in shading a surface at maximum quality if it's in the peripheral vision region, angled a specific way, or occluded by an AO effect anyway. You're mixing up VSR itself with AMD's Radeon Boost feature, which uses VSR as an assisting technology. VSR helps Radeon Boost scale down resolution by tapping into VSR functionality and degrading resolution in a much cleverer way, so it's less perceptible to the player.
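To make the DSR cost concrete: a 4K-to-1080p DSR pipeline shades four times the pixels of native 1080p, and only the final filter-down step is cheap. A minimal sketch of that downscale step using a plain box filter (NVIDIA's actual DSR filter is a smoother, Gaussian-style kernel; this is an illustration only):

```python
import numpy as np

def dsr_downscale(frame_hi: np.ndarray, factor: int = 2) -> np.ndarray:
    """Box-filter a supersampled frame down to native resolution.

    frame_hi: (H*factor) x (W*factor) x 3 frame rendered at the DSR
    resolution. The GPU already paid to shade every one of those pixels;
    that rendering is the performance hit. Averaging factor x factor
    groups down to native resolution afterwards is nearly free.
    """
    h, w, c = frame_hi.shape
    return frame_hi.reshape(h // factor, factor,
                            w // factor, factor, c).mean(axis=(1, 3))
```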


49 minutes ago, LAwLz said:

DLSS is "let's take this frame and make it higher resolution". That's it. 

That's wrong, as I said:

9 minutes ago, Vishera said:

DLSS is a totally different beast.

It uses AI to increase the quality of the original image.

The higher the resolution of the source image, the better the result.

DLSS doesn't increase the resolution of the image.


10 minutes ago, Vishera said:

That's wrong, as I said:

DLSS doesn't increase the resolution of the image.

In a way it does, which is why LAwLz is confused, I believe. DLSS renders the game at 720p but outputs it natively at 1440p. That, however, doesn't make it 1440p. It's still 720p in a nutshell; they just do it cleverly enough that by the time it's presented at native resolution, the user can barely notice the difference. But the quality is always degraded, no matter how much everyone hypes it as "DLSS makes the image sharper". It makes some things sharper, but as a whole the image quality is degraded in a very quantifiable way. Again, like I said, that's fine, because the quality degradation is easily outweighed by massive performance gains.

 

AMD's FidelityFX CAS, however, is a straight-up image quality improvement feature. It renders the game unchanged and then slaps a contrast adaptive sharpening filter on top. With it, you can essentially use FXAA and not lose sharpness because of it (you can easily try this on ANY graphics card using ReShade with CAS and FXAA enabled at the same time). You get a blurrier image because of FXAA, but you get rid of jagged edges, and with CAS on top you also get rid of FXAA's blur. So you end up with an image without jaggies and with nearly all details retained, at an almost insignificant performance drop, making this combo nearly free. But you don't gain any performance; you just don't lose it while improving quality. CAS and DLSS have two very different approaches, and CAS will ALWAYS produce better quality. But not better performance. Then it's up to you what you prefer to use. If a game is brand new and very demanding, you'll probably prefer DLSS. If a game is older and runs at 150fps anyway, you'll probably prefer CAS every time.
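A simplified sketch of the contrast-adaptive part, loosely following the published FidelityFX CAS idea (sharpen less where local contrast is already high); this is an illustration, not AMD's shipping shader:

```python
import numpy as np

def cas_like_sharpen(img: np.ndarray, strength: float = 0.8) -> np.ndarray:
    """Contrast-adaptive sharpening, heavily simplified.

    img: H x W luminance in [0, 1]. Each pixel is sharpened against its
    4-neighbour average, with the amount scaled down where the local
    min/max spread is already large (high contrast), so existing edges
    are not over-driven while FXAA-blurred regions get detail back.
    """
    p = np.pad(img, 1, mode="edge")
    n, s = p[:-2, 1:-1], p[2:, 1:-1]   # neighbours above / below
    w, e = p[1:-1, :-2], p[1:-1, 2:]   # neighbours left / right
    local_min = np.minimum.reduce([n, s, w, e, img])
    local_max = np.maximum.reduce([n, s, w, e, img])
    # Low local contrast -> weight near strength; high contrast -> near 0.
    weight = strength * (1.0 - (local_max - local_min))
    blur = (n + s + w + e) / 4.0
    return np.clip(img + weight * (img - blur), 0.0, 1.0)
```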


33 minutes ago, RejZoR said:

The performance hit is the 4K rendering resolution

Running games at 4K is a performance hit?

What are you going on about? If that's the case, just run games at the lowest resolution possible, since anything higher will be a "performance hit".

33 minutes ago, RejZoR said:

a tiny loss for downsampling to your native resolution.

Half a frame to a full frame penalty is so insignificant that you will have a hard time detecting it, since it's within the margin of error.

33 minutes ago, RejZoR said:

VSR is Variable Rate Shading

Nope, just nope.

 

VSR = Virtual Super Resolution

 

Do your research.

 

 

EDIT:

From AMD's website: https://www.amd.com/en/technologies/vsr


2 minutes ago, Vishera said:

Running games at 4K is a performance hit?

What are you going on about? If that's the case, just run games at the lowest resolution possible, since anything higher will be a "performance hit".

Half a frame to a full frame penalty is so insignificant that you will have a hard time detecting it, since it's within the margin of error.

😂

Nope, just nope.

 

VSR = Virtual Super Resolution

 

Do your research.

 

Your display is 1080p. You convince the game to render at 4K and then downscale to fit your actual resolution. You're literally computing MORE pixels. What does that make it? A performance boost or a performance penalty? Only by your logic is it somehow a performance improvement. Your understanding of rendering technologies is so lacking I don't even know where to begin.

 

As for the mention of VSR, that was straight up a typo. I meant VRS (see the two letters flipped between VSR and VRS?). I'll admit I managed to repeat the typo in several places in that paragraph, but you'd understand I actually meant VRS if you actually read my explanation of it. And I wrote an explanation of VRS. Hell, I literally wrote VSR and expanded the abbreviation into Variable Rate Shading. Why would I get that wrong on purpose? That would be like saying USA is an abbreviation for United America States. It's obviously wrong. Makes no fucking sense. It was a simple but stupid typo and you went with a "gotcha" response. Come on, dude... Maybe you should actually do your research, because you clearly don't understand what VRS is if that's your response to my typo.


1 hour ago, LAwLz said:

If you don't understand this then I don't think it is worth discussing this any further with you.

Now I understand you, I am out 😄


I love people who are so vehemently confident while being wrong...


That moment when people are complaining and arguing about what DLSS actually is.

 

Meanwhile, I've just been using DLSS in conjunction with DSR to have a better 1080p image. XD



18 minutes ago, gabrielcarvfer said:

Did they track eye movement? If they didn't, that explains why it looked bad. 

How would you track eye movement in a non-VR game without any eye detection?


On 2/8/2021 at 10:28 AM, Middcore said:

However, unless it starts to become commonplace for games to actually support these features, they both fall squarely into the category of "perks". Wikipedia lists a grand total of 43 games that support DLSS, over two years on... but that list includes 10 titles that claim they'll support DLSS at some future time but have a date for actually implementing it of "TBA", and several games which haven't been released yet. Some are also games I have literally never heard of. 

 

I'd like to think that FidelityFX SR will see more widespread support since historically AMD has been more into open standards whereas Nvidia tried to sell based on exclusive features, but AMD themselves seem to be moving more in the latter direction lately. 

Yeah, it's awesome. Can I get in line now for no one to support it? Or will Nvidia do the typical thing and pull the same shit as with Witcher 3, making companies drop support for the inferior versions just before launch?

The only bonus is that both consoles went with AMD, so if they support it, that might also leak over to the PC.



2 hours ago, D13H4RD said:

That moment when people are complaining and arguing about what DLSS actually is.

 

Meanwhile, I've just been using DLSS in conjunction with DSR to have a better 1080p image. XD

I used to joke about that, but then I noticed it can actually be done. For example, Shadow of the Tomb Raider refused to allow me to use DLSS at 1080p. You just couldn't select it. So I thought, if I bump up the resolution with DSR, it might allow me. And behold, that's exactly what happened. After setting DSR to 1440p or 4K, DLSS appeared. So DSR was taking the output target up to 1440p, DLSS was rendering internally at 840p or something like that iirc, and the result got scaled back down to my native 1080p. It's such a funny rollercoaster, but that's how it actually worked.

