
AMD Announces FidelityFX Super Resolution, their competitor to Nvidia DLSS

Juanitology
On 6/1/2021 at 3:30 AM, LAwLz said:

Sounds promising but I am very skeptical.

If I had to guess, it will not be anywhere near as good as DLSS currently is, but comparing it to DLSS is kind of moot. This is hardware agnostic and hopefully gets wide support (if it's good).

We can discuss DLSS vs FidelityFX all we want, but that discussion will only matter when:

1) A game supports both technologies and someone needs to choose which one to use.

2) Some fanboy wants to talk about how their favorite brand is superior to the competitor.

 

The reason I am very skeptical is that the demo image they showed, running on a GTX 1060, looks quite frankly awful.

Left is native resolution, right is FidelityFX in "quality mode":

[attached screenshot: native vs. FidelityFX quality-mode comparison]

 

If it's this bad then you might as well just run at a lower resolution and not use FidelityFX.

I seriously hope it improves before release, because if the comparison they posted is the current example they want to use to highlight how great it is, then it's in a VERY bad shape.

 

 

For comparison, this is an example of what DLSS looks like:

Left is native 4K, right is 1440p upscaled to 4K with DLSS:

[attached screenshot: native 4K vs. DLSS comparison]

The GTX 1060 demo was 1440p native versus something upscaled to 1440p, not 4K. They also don't tell you whether the image itself was upscaled from 1080p or 720p to 1440p.
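For reference, here is a quick back-of-the-envelope sketch of what those source resolutions could work out to. The per-axis scale factors are placeholders picked for illustration; AMD had not published the actual factors for its quality modes at this point, so treat the numbers as assumptions, not specs.

def source_resolution(out_w, out_h, scale):
    # scale = assumed per-axis ratio between output resolution and render resolution
    return int(out_w / scale), int(out_h / scale)

target = (2560, 1440)  # 1440p output
for mode, scale in [("quality (assumed 1.5x)", 1.5),
                    ("balanced (assumed 1.7x)", 1.7),
                    ("performance (assumed 2.0x)", 2.0)]:
    w, h = source_resolution(*target, scale)
    print(f"{mode:24s} -> renders at roughly {w}x{h}")

A 1.5x factor from 1440p works out to roughly 1706x960, while 2.0x is 1280x720, so which factor the slide actually used makes a big difference to how fair the comparison is.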


Interesting comments so far.

 

I'd like to try and take a game dev view from this.

DLSS: requires the dev to do work to integrate it. Only works on Nvidia RTX cards. Already deployed, and performance is understood.

FSR: requires the dev to do work to integrate it. Could work on many cards going back a way, including red, green and eventually blue.

 

We should also consider that game devs already have to account for differences between AMD and Nvidia GPUs. For that reason, I'm not sure FSR would become the default for higher-end games. Quite possibly both DLSS and FSR would be supported on major titles moving forward. Maybe FSR has more of a chance with lower-effort game devs.

 

In a quick manual count, RTX GPUs currently make up 16.51% of the Steam Hardware Survey. That's higher than all AMD GPUs combined at 16.18%. Allowing it to work on older Nvidia cards is about the only way AMD can gain traction, since it could expand the potential market coverage significantly.

 

We're going to have to wait until release and see how FSR really performs on both AMD and Nvidia GPUs.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


31 minutes ago, porina said:

FSR: requires the dev to do work to integrate it. Could work on many cards going back a way, including red, green and eventually blue.

 

Don't forget that it works on the two performance oriented consoles as well. That's a good 7-10 years of use that developers will have a permanent install base for.


2 minutes ago, ravenshrike said:

Don't forget that it works on the two performance oriented consoles as well. That's a good 7-10 years of use that developers will have a permanent install base for.

I haven't double-checked, but reports say it will not be on consoles initially. It seems inevitable that it will come down the line; this might just be a resource-allocation priority. The benefits will be seen more in the PC gaming space. I think it will be more for MS/Sony to decide if/when they want to implement it.



1 hour ago, xAcid9 said:

I guess it's a good thing we can see that when it happens, because this is on GitHub?

It's not yet, but hopefully AMD will put it up like they have with other FidelityFX libraries.

 

 

54 minutes ago, ravenshrike said:

The GTX 1060 demo was 1440p native versus something upscaled to 1440p, not 4K. They also don't tell you whether the image itself was upscaled from 1080p or 720p to 1440p.

I'm not sure what your point is. I never said it was 4K and I don't see how that's relevant.

My point is that the upscaling looks really bad, to the point where I hope AMD's demo is not an accurate representation of what it will look like at launch, because if it is, then it might be ass and not worth using.


Hardware Unboxed just did a piece on that subject. Their stance seems to be that the announcement came earlier than they expected. They do not expect it to be as good as DLSS 2.0 either, but do expect it to be better than DLSS 1.0. I never saw DLSS 1.0 myself.

 

 

 

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, LAwLz said:

I'm not sure what your point is. I never said it was 4K and I don't see how that's relevant.

Because the quality differences are pretty big. DLSS doesn't work nearly as well from 1080p or 960p to 1440p as it does from 1440p to 4K.

https://www.techspot.com/article/1992-nvidia-dlss-2020/#F-31

https://www.techspot.com/article/1992-nvidia-dlss-2020/#F-32


2 hours ago, ravenshrike said:

Because the quality differences are pretty big. DLSS doesn't work nearly as well from 1080p or 960p to 1440p as it does from 1440p to 4K.

https://www.techspot.com/article/1992-nvidia-dlss-2020/#F-31

https://www.techspot.com/article/1992-nvidia-dlss-2020/#F-32

I didn't know that. That would be a salient point then. To judge quality, one would need to know what resolution the upscale was from and to.



2 hours ago, ravenshrike said:

Because the quality differences are pretty big. DLSS doesn't work nearly as well from 1080p or 960p to 1440p as it does from 1440p to 4K.

https://www.techspot.com/article/1992-nvidia-dlss-2020/#F-31

https://www.techspot.com/article/1992-nvidia-dlss-2020/#F-32

Ah, that's good to know. So with a bit of luck, FidelityFX Super Resolution will perform better at higher resolutions.

I still think that looks way better than what AMD offered, so my guess is still that AMD will be behind, and the technology needs to reach a certain quality level before it's actually worth using. Whether it manages to pass that threshold remains to be seen.

Still disappointed that AMD will not optimize it for Nvidia GPUs though. 


12 minutes ago, LAwLz said:

Ah, that's good to know. So with a bit of luck, FidelityFX Super Resolution will perform better at higher resolutions.

I still think that looks way better than what AMD offered, so my guess is still that AMD will be behind, and the technology needs to reach a certain quality level before it's actually worth using. Whether it manages to pass that threshold remains to be seen.

Still disappointed that AMD will not optimize it for Nvidia GPUs though. 

For a lot of people, lower resolution is where they're going to care though. PvP players need FPS more than resolution; they're trying to keep from getting shot, not trying to make it look nice. I suspect a lot of people would run wireframe if they could, just to get FPS up to an advantageous level. A person on a budget isn't even going to have access to a 1440p screen at all. It's going to be 1080p, or whatever their TV produces, if they even have access to a TV and can get it to display monitor output. TV manufacturers haven't always been very helpful there. There were a bunch of flat screens that were 1080p or even 720p, but most new ones are "4K", which means something smaller than, but still closer to, true 4x 1080p than not. There just weren't a lot of 1440p TVs made.



4 minutes ago, Bombastinator said:

For a lot of people, lower resolution is where they're going to care though. PvP players need FPS more than resolution; they're trying to keep from getting shot, not trying to make it look nice. I suspect a lot of people would run wireframe if they could, just to get FPS up to an advantageous level. A person on a budget isn't even going to have access to a 1440p screen at all. It's going to be 1080p, or whatever their TV produces, if they even have access to a TV and can get it to display monitor output. TV manufacturers haven't always been very helpful there. There were a bunch of flat screens that were 1080p or even 720p, but most new ones are "4K", which means something smaller than, but still closer to, true 4x 1080p than not. There just weren't a lot of 1440p TVs made.

True, but that just means it is even more important for AMD to make the low to medium resolution conversion good.

Like I've said before, if the quality isn't good enough then there is no point in using this. The real competitor to FidelityFX Super Resolution is running at lower resolution and letting the standard upscaling algorithms do their job. If AMD's offering doesn't compete with that on both performance (this will be slower than no upscaling) and visual quality then this is DoA.
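To make that baseline concrete, here is a minimal sketch of the "no FSR" alternative: take a frame rendered at a lower resolution and stretch it to the display with an ordinary filter. Pillow is used purely for illustration, and "frame.png" is a stand-in for whatever low-resolution render you have on hand; this is just the dumb comparison point, not anything AMD ships.

from PIL import Image

native = (2560, 1440)                    # display resolution
low_res_frame = Image.open("frame.png")  # e.g. a 1920x1080 render of the same scene

# Plain spatial upscales, i.e. the "standard algorithms" a spatial upscaler has to beat
bilinear = low_res_frame.resize(native, Image.BILINEAR)
bicubic = low_res_frame.resize(native, Image.BICUBIC)

bilinear.save("upscaled_bilinear.png")
bicubic.save("upscaled_bicubic.png")

If FSR's output doesn't look clearly better than those at a comparable frame-time cost, simply rendering at the lower resolution and letting the display or driver handle the scaling is the easier option.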


3 minutes ago, LAwLz said:

True, but that just means it is even more important for AMD to make the low to medium resolution conversion good.

Like I've said before, if the quality isn't good enough then there is no point in using this. The real competitor to FidelityFX Super Resolution is running at lower resolution and letting the standard upscaling algorithms do their job. If AMD's offering doesn't compete with that on both performance (this will be slower than no upscaling) and visual quality then this is DoA.

If they're boned at low res with both DLSS and FSR, it won't matter though. But if this thing works, it could become ubiquitous, which could fundamentally change game output. It could become possible for devs to make games that don't produce good-looking video at all, but rather output that is designed to be upscaled.




10 hours ago, LAwLz said:

2) AMD are still the ones who are 100% in control over how Fidelity FX performs because they are the ones who can approve or block any changes Nvidia wants to make to it, in order to make it work better on Nvidia cards. My guess is that if a pull request from Nvidia increased performance on Nvidia cards by let's say 30%, but reduced performance on AMD cards by 5% then AMD will probably not approve the change. It also makes it so that AMD could potentially cripple Nvidia's performance if this became a big thing and if AMD ever felt like they needed some help. That's a bad situation.

Well, FSR is stated to be MIT-licensed, so Nvidia could simply fork it, apply their patches, and release it as they wish.

They could even close it back up, make it a feature in their drivers, and not tell a single soul about it 🤷‍♂️

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


10 minutes ago, igormp said:

Well, FSR is stated to be MIT-licensed, so Nvidia could simply fork it, apply their patches, and release it as they wish.

They could even close it back up, make it a feature in their drivers, and not tell a single soul about it 🤷‍♂️

Fumbled an acronym there: MIT? It sounds like you are talking about a patent license. I'm familiar with the GNU public license, which would not do what you say, and the Berkeley public license (for BSD), which would. I don't know what MIT is in this instance, though. Was it written at the Massachusetts Institute of Technology? It's possible, I suppose. Lots of smart people there.




2 minutes ago, Bombastinator said:

Fumbled an acronym there: MIT? It sounds like you are talking about a patent license. I'm familiar with the GNU public license, which would not do what you say, and the Berkeley public license (for BSD), which would. I don't know what MIT is in this instance, though. Was it written at the Massachusetts Institute of Technology? It's possible, I suppose. Lots of smart people there.

https://opensource.org/licenses/MIT

 

It's a permissive license similar to BSD licenses. The GPL, on the other hand, is copyleft and viral.



4 minutes ago, igormp said:

https://opensource.org/licenses/MIT

 

It's a permissive license similar to BSD licenses. The GPL, on the other hand, is copyleft and viral.

Thx. For some, any public license is copyleft. I suppose it depends on where you happen to stand.




On 6/1/2021 at 4:30 AM, LAwLz said:

Sounds promising but I am very skeptical.

If I had to guess, it will not be anywhere near as good as DLSS currently is, but comparing it to DLSS is kind of moot. This is hardware agnostic and hopefully gets wide support (if it's good).

We can discuss DLSS vs FidelityFX all we want, but that discussion will only matter when:

1) A game supports both technologies and someone needs to choose which one to use.

2) Some fanboy wants to talk about how their favorite brand is superior to the competitor.

 

The reason I am very skeptical is that the demo image they showed, running on a GTX 1060, looks quite frankly awful.

Left is native resolution, right is FidelityFX in "quality mode":

[attached screenshot: native vs. FidelityFX quality-mode comparison]

 

If it's this bad then you might as well just run at a lower resolution and not use FidelityFX.

I seriously hope it improves before release, because if the comparison they posted is the current example they want to use to highlight how great it is, then it's in a VERY bad shape.

 

 

For comparison, this is an example of what DLSS looks like:

Left is native 4K, right is 1440p upscaled to 4K with DLSS:

[attached screenshot: native 4K vs. DLSS comparison]

While I agree with your premise, this image comparison is not even remotely fair. The AMD showing is a pretty complex texture, which is hard to make crisp under any circumstances. The Nvidia example is basically straight lines and flat colors; I could downsample that in MS Paint and you'd be hard-pressed to tell a difference. Not to mention, even then it does lose quite a bit of detail. Look at the chipped paint lines on the cabinet: they are almost nonexistent with DLSS on.

 

So again, for all the fanboys that will undoubtedly misread that, I agree that AMD has a lot to prove, but this specific image comparison is not useful.

Primary Gaming Rig:

Ryzen 5 5600 CPU, Gigabyte B450 I AORUS PRO WIFI mITX motherboard, PNY XLR8 16GB (2x8GB) DDR4-3200 CL16 RAM, Mushkin PILOT 500GB SSD (boot), Corsair Force 3 480GB SSD (games), XFX RX 5700 8GB GPU, Fractal Design Node 202 HTPC Case, Corsair SF 450 W 80+ Gold SFX PSU, Windows 11 Pro, Dell S2719DGF 27.0" 2560x1440 155 Hz Monitor, Corsair K68 RGB Wired Gaming Keyboard (MX Brown), Logitech G900 CHAOS SPECTRUM Wireless Mouse, Logitech G533 Headset

 

HTPC/Gaming Rig:

Ryzen 7 3700X CPU, ASRock B450M Pro4 mATX Motherboard, ADATA XPG GAMMIX D20 16GB (2x8GB) DDR4-3200 CL16 RAM, Mushkin PILOT 1TB SSD (boot), 2x Seagate BarraCuda 1 TB 3.5" HDD (data), Seagate BarraCuda 4 TB 3.5" HDD (DVR), PowerColor RX VEGA 56 8GB GPU, Fractal Design Node 804 mATX Case, Cooler Master MasterWatt 550 W 80+ Bronze Semi-modular ATX PSU, Silverstone SST-SOB02 Blu-Ray Writer, Windows 11 Pro, Logitech K400 Plus Keyboard, Corsair K63 Lapboard Combo (MX Red w/Blue LED), Logitech G603 Wireless Mouse, Kingston HyperX Cloud Stinger Headset, HAUPPAUGE WinTV-quadHD TV Tuner, Samsung 65RU9000 TV


58 minutes ago, Bombastinator said:

Thx. For some, any public license is copyleft. I suppose it depends on where you happen to stand.

No, that's wrong. Copyleft has mostly to do with derivative works, while permissive licenses don't give a crap about that.

 

39 minutes ago, Kid.Lazer said:

So again, for all the fanboys that will undoubtedly misread that, I agree that AMD has a lot to prove, but this specific image comparison is not useful.

I believe quality will be bad, but we can only be certain once it's actually released.

Another important thing is how much it can be improved over time (if at all). We already know that DLSS can be improved continuously, since it's purely an ML model after all.



21 minutes ago, igormp said:

No, that's wrong. Copyleft has mostly to do with derivative works, while permissive licenses don't give a crap about that.

 

I believe quality will be bad, but we can only be certain once it's actually released.

Another important thing is how much it can be improved over time (if at all). We already know that DLSS can be improved continuously, since it's purely an ML model after all.

DLSS certainly wasn't very good when it came out; it's apparently a lot better than it was. That DLSS can apparently run fine without tensor cores though points to shenanigans by Nvidia. I'm not sure how trusted they are. What matters in the end is what the game devs who will (or won't) be implementing it think users will buy.



Not sure if it's been mentioned here, or if this is even the right thread, but it seems they revealed a 5900X with an upgraded stacked cache to be released as a Zen 3 refresh?

 

Is this going to be seeing retail? Will it still be AM4? Will it work on B550 motherboards?

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


46 minutes ago, Mister Woof said:

Not sure if it's been mentioned here, or if this is even the right thread, but it seems they revealed a 5900X with an upgraded stacked cache to be released as a Zen 3 refresh?

 

Is this going to be seeing retail? Will it still be AM4? Will it work on B550 motherboards?

There was another thread specifically for the 3D V-Cache they used to make their L3 cache enormous; I can't seem to find it anymore though. AMD has confirmed that Zen 3 Ryzen processors with 3D V-Cache will enter production "later this year". Don't know about compatibility with AM4 or B550, since AMD didn't mention anything specific about that.


As expected. I don't know why they didn't do that in the first place, since the 570/580 chip is basically the same family as the 470/480.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


3 hours ago, thechinchinsong said:

There was another thread specifically for the 3D V-Cache they used to make their L3 cache enormous; I can't seem to find it anymore though. AMD has confirmed that Zen 3 Ryzen processors with 3D V-Cache will enter production "later this year". Don't know about compatibility with AM4 or B550, since AMD didn't mention anything specific about that.

I don't see why it wouldn't be. The "3D" thing seems to be an implementation of something TSMC has been able to do for some time. My memory is they can do 15 layers of it but AMD is only using 3 and it's memory so there won't be much increase in heat. They're adding barely half a millimeter of chip height, there's room under the current AMD IHS for it, it doesn't affect horizontal or vertical size at all, and it doesn't require more power pins, so I don't see why it wouldn't fit.



13 hours ago, Bombastinator said:

That DLSS can apparently run fine without tensor cores though points to shenanigans by Nvidia.

The tensor cores do a specific subset of GPU operations, with optimisations, typically on smaller data types, allowing them to be ultra fast. The same code can be run on regular GPU cores, just much more slowly; the question is at what performance. You could even run it on a CPU if you wanted to, and Intel VNNI could help out there. I don't think AMD have anything in silicon other than throwing cores at it.
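As a rough illustration of that gap (a sketch only: the sizes are arbitrary and this is generic matrix math, not DLSS itself), the same matrix multiply can be timed in FP32, which runs on the ordinary shader/CUDA cores, and in FP16, which an RTX card can route through its tensor cores:

import time
import torch

def avg_matmul_ms(dtype, n=4096, iters=20):
    # Time n x n matrix multiplies on the GPU in the given precision
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        _ = a @ b
    torch.cuda.synchronize()
    return (time.perf_counter() - start) / iters * 1000

if torch.cuda.is_available():
    print(f"fp32: {avg_matmul_ms(torch.float32):.2f} ms")  # regular FP32 pipeline
    print(f"fp16: {avg_matmul_ms(torch.float16):.2f} ms")  # tensor cores on RTX hardware

On an RTX card the FP16 run is usually several times faster, which is the point of the dedicated units; on a card without tensor cores the same work still runs, just through the general-purpose ALUs at much lower throughput.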



7 hours ago, Bombastinator said:

My memory is they can do 15 layers of it but AMD is only using 3 and it’s memory so there won’t be much increase in heat.

It's 1 layer, so 2 dies in stack total. The SRAM is also only over the CCD L3 cache and the portion over the CCD cores has been thermally optimized blank silicon.
