
AMD Announces FidelityFX Super Resolution, their competitor to NVIDIA DLSS

Juanitology

Looking at the demo, I can already guess that AMD's solution probably won't hold up to NVIDIA's from a quality point of view.

 

That said, the fact that it works on any card down to NVIDIA Pascal/AMD Vega is a potentially big deal, though we'll have to wait and see how it stacks up in the real world.

 

Also, fun fact: NVIDIA has a version of DLSS that doesn't need Tensor Core hardware.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


AMD did a demo of FSR working on a GTX 1060 as proof of concept. AMD will, of course, work to further optimize their own GPUs for better compatibility, performance, and probably PQ as well.

 

They have said that FSR works with older nVidia cards, like the GTX 1000 series, but it'd be entirely up to nVidia to optimize their drivers for those older cards for better performance and PQ.

 

AMD has thrown down the FSR gauntlet. The big question is, knowing nVidia's business model and walled-garden ecosystem (much like Apple's), do you think they would bother to do so?

Main Rig: AMD AM4 R9 5900X (12C/24T) + Tt Water 3.0 ARGB 360 AIO | Gigabyte X570 Aorus Xtreme | 2x 16GB Corsair Vengeance DDR4 3600C16 | XFX MERC 310 RX 7900 XTX | 256GB Sabrent Rocket NVMe M.2 PCIe Gen 3.0 (OS) | 4TB Lexar NM790 NVMe M.2 PCIe4x4 | 2TB TG Cardea Zero Z440 NVMe M.2 PCIe Gen4x4 | 4TB Samsung 860 EVO SATA SSD | 2TB Samsung 860 QVO SATA SSD | 6TB WD Black HDD | CoolerMaster H500M | Corsair HX1000 Platinum | Topre Type Heaven + Seenda Ergonomic W/L Vertical Mouse + 8BitDo Ultimate 2.4G | iFi Micro iDSD Black Label | Philips Fidelio B97 | C49HG90DME 49" 32:9 144Hz Freesync 2 | Omnidesk Pro 2020 48" | 64bit Win11 Pro 23H2

2nd Rig: AMD AM4 R9 3900X + TR PA 120 SE | Gigabyte X570S Aorus Elite AX | 2x 16GB Patriot Viper Elite II DDR4 4000MHz | Sapphire Nitro+ RX 6900 XT | 500GB Crucial P2 Plus NVMe M.2 PCIe Gen 4.0 (OS)2TB Adata Legend 850 NVMe M.2 PCIe Gen4x4 |  2TB Kingston NV2 NVMe M.2 PCIe Gen4x4 | 4TB Leven JS600 SATA SSD | 2TB Seagate HDD | Keychron K2 + Logitech G703 | SOLDAM XR-1 Black Knight | Enermax MAXREVO 1500 | 64bit Win11 Pro 23H2

 

 

 

 

 

 


12 hours ago, D13H4RD said:

Looking at the demo, I can already guess that AMD's solution probably won't hold up to NVIDIA's from a quality point of view.

 

That said, the fact that it works on any card down to NVIDIA Pascal/AMD Vega is a potentially big deal, though we'll have to wait and see how it stacks up in the real world.

 

Also, fun fact: NVIDIA has a version of DLSS that doesn't need Tensor Core hardware.

Didn't know that bit about Nvidia. Kinda makes sense. Tensor cores have had white elephant problems since they came out. There are a couple of technologies that are pretty good but only run on tensor cores where, strictly speaking, they don't need to.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


10 minutes ago, Papercut Captain said:

If FSR does gain traction, they probably will, just like they had to do with Freesync. 

I suppose so, but they'd probably do their best to pretend they've not heard of FSR in the first place. I think GTX 1000 (possibly 900, we won't know till FSR is released on the 22nd) series owners should write to or give feedback to nVidia that they want drivers tweaked for better FSR performance and possibly better PQ.


3 hours ago, GamerDude said:

They have said that FSR works with older nVidia cards, like the GTX 1000 series, but it'd be entirely up to nVidia to optimize their drivers for those older cards for better performance and PQ.

Ehm... No? Where did you get this idea from? 


3 minutes ago, LAwLz said:

Ehm... No? Where did you get this idea from? 

Scott Herkelman said it on Twitter. Makes sense, right? You can't expect AMD to optimize it for nVidia cards; it's up to nVidia to do that.


4 minutes ago, LAwLz said:

Ehm... No? Where did you get this idea from? 

I heard it had been tested on a 1060. That doesn't automatically mean any 1000 series card, though. No information was given about which 1060 it was or what was done to it. I still suspect this is going to be a situation where we won't know what works, what doesn't, or how well until well after launch. Launch, then a week for the hype to die down and for people to try stuff, and then things will start to come out.


I'm assuming it's being done through some sort of standardized compute method to be this universal and run on NVIDIA cards too. I don't think there will be much to optimize anyway; it may depend more on just how effective a card is at compute. And if a GTX 1060 was able to see gains from it, clearly the upsampling isn't more taxing than the sum of the gains, so it's still beneficial.
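
A rough way to see why this can pay off even on older cards with no special hardware is simple frame-time arithmetic: the upscale pass (which appears to be a generic shader pass) only has to cost less than the time saved by shading fewer pixels. A minimal sketch of that trade-off in Python, where every frame-time number below is an assumption for illustration, not a measurement:

def estimated_fps(native_frame_ms, render_scale, upscale_pass_ms):
    # Assume a GPU-bound frame time that scales roughly with shaded pixel count;
    # render_scale is the linear scale, so pixel count scales with its square.
    scaled_ms = native_frame_ms * render_scale ** 2
    return 1000.0 / (scaled_ms + upscale_pass_ms)

native_ms = 25.0  # assumed 1440p native frame time on an aging card (~40 FPS)
print(round(1000.0 / native_ms), "FPS at native resolution")
print(round(estimated_fps(native_ms, 0.75, 1.5)), "FPS at ~75% render scale plus a ~1.5 ms upscale pass")
print(round(estimated_fps(native_ms, 0.75, 12.0)), "FPS if the upscale pass itself were expensive on that card")

With those assumed numbers the pass is a clear win (roughly 64 FPS vs 40), and it only stops being one if the pass itself eats most of the saved time, which is exactly what the 1060 demo suggests isn't happening.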


13 minutes ago, GamerDude said:

Scott Herkelman said it on Twitter. Makes sense, right? You can't expect AMD to optimize it for nVidia cards; it's up to nVidia to do that.

Well, they won't. They might even do stuff to block it. I suspect it's going to be up to individual people to do that, and those situations are often really specific. It's a hard thing to do and not that many people can do it, and quite often those that can, and who do wind up doing it, only do stuff until they get their own particular system working, then stop.

 

It could very well be in AMD's best interest to specifically get 10xx stuff working, because it's a large enough part of the installed base that it alone could assure the success of FXSR as long as it works even OK. It wouldn't be helping Nvidia, because they don't sell 10xx series stuff anymore except the 1030, which doesn't really count; everything else is 16xx or higher. It doesn't have to be as good as DLSS. Just look at the video cassette war for that one: Sony had a better product, but it lost anyway. What matters is what people use, because game developers look at the installed base when figuring out what they can do. The most installed card on Steam is far and away the 1060. If they can get FXSR working on it, they basically win. IIRC the second most installed card is the 580. Not sure.

Edited by Bombastinator


35 minutes ago, GamerDude said:

Scott Herkelman said it on Twitter. Makes sense, right? You can't expect AMD to optimize it for nVidia cards; it's up to nVidia to do that.

I am not sure if I misread your post or if you have misread Scott's tweet.

Scott isn't saying "our hands are tied, it must be Nvidia that optimizes for their cards". What Scott said and meant is that AMD will not bother to optimize for Nvidia. AMD just made it work on a 1060 to show that it could be done, but they will not put in any effort to make sure it works well.

If Nvidia wants to try and improve Fidelity FX they will have to submit pull requests to AMD, which AMD could just ignore if they want to.

 

This is very bad news.

1) AMD themselves have now admitted that they don't care about and won't even bother making Fidelity FX work well on Nvidia cards. So the idea that was pushed that this is a universal technology that would benefit everyone is very questionable at best.

 

2) AMD are still the ones who are 100% in control over how Fidelity FX performs, because they are the ones who can approve or block any changes Nvidia wants to make to it in order to make it work better on Nvidia cards. My guess is that if a pull request from Nvidia increased performance on Nvidia cards by, let's say, 30%, but reduced performance on AMD cards by 5%, then AMD would probably not approve the change. It also means that AMD could potentially cripple Nvidia's performance if this became a big thing and AMD ever felt like they needed some help. That's a bad situation.

 

 

It's like when Intel's MKL didn't use AVX for AMD processors, except AMD aren't even keeping quiet about it; they are actively trying to push the responsibility for optimizing their own technology onto their competitors. Imagine if Intel had done that with MKL:

Quote

Yeah, if you want the Intel MKL to work better on AMD then ask AMD to develop our software for us. Then AMD will have to beg us to apply the changes, which we might just reject for the lulz.

That's basically what is happening here.

 

Fidelity FX is AMD's software. They are the ones responsible for making sure it works well, and they are in 100% control over how it works. It's not Nvidia's responsibility, nor does Nvidia actually have any control over it. The only thing Nvidia could do is spend time and money fixing issues in AMD's software and then beg AMD to let them incorporate the fixes into AMD's software so that it doesn't cripple Nvidia cards.

This just seems like AMD preemptively trying to get it into peoples' minds that it's Nvidia's fault if AMD's products (Fidelity FX is AMD's product) are bad, and my guess for why they are doing this is because Fidelity FX in its current form looks like it is really bad. They want to push the blame onto someone else if it doesn't meet expectations, which I don't think it will.

 

Here is the tweet for those interested:

[Screenshot of Scott Herkelman's tweet]


It has nothing to do with pushing blame on anyone. It's all about adoption. NVIDIA has to throw money at and shill their DLSS, and it then ends up only working on half of graphics cards. FSR working by default on all cards means it'll be in game devs' best interest to add it themselves. Especially because of what FSR does. It widens your audience, meaning more users will be able to buy your game even if they have inferior hardware, because it'll run better at a slight expense of visual quality. DLSS can only run on RTX cards, which means you're already limited there, and then there's the fact that it's only NVIDIA cards at that. Maybe AMD doesn't have the perfect 50% market share, but there are a lot of people who have RX 400 and RX 500 series cards. There are tons of people who have GTX 1060 cards. Then there are also a lot of users who have a Vega 56 or GTX 1070, and some more with RX 5000 series too. None of these can use DLSS. They can ALL utilize FSR. These are also aging cards at the lower end, and with nobody being able to buy an RTX 3060 these days, this is very important for game developers. It's why we see more adoption of DLSS even for older games: an RTX 2060 being able to run a game at a 40% higher framerate means a user with that old card might buy your game now that it supports DLSS.

 

FSR is just inherently more interesting for the developers because of this. A relative has a Radeon RX 580, which is showing its age at this point. FSR means he'll be able to get some more useful value from it in new games that can still be played on it, just not comfortably, because of the card's age. With FSR, even with a slight drop in quality, he'll be able to. And it's in devs' best interest to do that, especially in current times when it's hard to buy pretty much any new card.


10 minutes ago, RejZoR said:

It has nothing to do with pushing blame on anyone.

Yes, it does. AMD are saying they will not bother optimizing their technology for anyone but themselves, thus not making it an actual cross-vendor library.

 

10 minutes ago, RejZoR said:

NVIDIA has to throw money at and shill their DLSS, and it then ends up only working on half of graphics cards.

According to the Steam hardware survey, it's 75%.

Also, like I said earlier, it is pointless to compare this to DLSS.

 

11 minutes ago, RejZoR said:

FSR working by default on all cards means it'll be in game devs' best interest to add it themselves.

You're thinking of this in too black and white terms. FSR is only worth using if it's actually good. The fact that it technically runs on all cards is irrelevant if it doesn't run well.

 

12 minutes ago, RejZoR said:

It widens your audience, meaning more users will be able to buy your game even if they have inferior hardware, because it'll run better at a slight expense of visual quality.

"Slight expense"? Did you see the comparison screenshots? In AMD's own comparison it looked awful. If it doesn't improve before launch then I'd say it's entirely pointless because you might as well just run the game at lower resolution and let your monitor upscale it. Seems to be about as efficient.

Also, the number of people who goes from "not playable" to "playable thanks to Fidelity FX" is most likely VERY small. 

 

14 minutes ago, RejZoR said:

Maybe AMD doesn't have the perfect 50% market share

They have 16% market share, so yeah... Quite a far stretch from 50%.

 

16 minutes ago, RejZoR said:

They can ALL utilize FSR.

But do they want to? From what I've seen, I wouldn't even want to turn it on.

In order for people to want to use this it has to actually be good, and now AMD have said that they don't have any interest in making sure it works well on Nvidia cards which is 75% of the market.

This won't get widely adopted if it only works alright, and only works alright on 16% of the market's graphics cards. That's why I am upset. Because I want this to be good and to be widely supported, but AMD have said they don't give two shits about making sure it is good on the graphics cards 75% of their potential customers actually use. That's terrible.

 

 

18 minutes ago, RejZoR said:

FSR is just inherently more interesting for the developers because of this.

I think FSR was interesting to developers up until AMD said that they won't bother lifting a finger to make sure it works decently on 75% of graphics cards out there. That they will only focus on making sure it works well on 16% of graphics cards.

 

20 minutes ago, RejZoR said:

With FSR, even with a slight drop in quality

Have you seen the comparison images AMD released? It's not a "slight drop in quality". The quality takes a nose dive to the point where I even question if it will be any different from just turning the resolution down and letting your monitor upscale the image.


Already there are people pushing the blame to AMD even BEFORE the tech is actually released LOL! That it runs on the GTX 1000 series is all AMD is obliged to do; you can't expect AMD to bend over backwards to get it running perfectly on GeForce cards while neglecting their own.

 

nVidia has always had the bigger driver team, so let them sort it out with AMD if they're so inclined. But I can bet you that nVidia will delay adoption of FSR because it'd affect the sale of their DLSS-capable cards (meaning RTX cards). Best of all, with so many pushing the blame to AMD right now (even BEFORE FSR is released), it'd be a sure bet that they would say AMD is stonewalling them.

 

When the time comes, let's see if AMD helps them out with FSR, because it's actually beneficial for them to do so. Right now, FSR hasn't been released yet, so let's not speculate and start the blame game, okay?


35 minutes ago, LAwLz said:

Yes, it does. AMD are saying they will not bother optimizing their technology for anyone but themselves, thus not making it an actual cross-vendor library.

 

According to the Steam hardware survey, it's 75%.

Also, like I said earlier, it is pointless to compare this to DLSS.

 

You're thinking of this in too black and white terms. FSR is only worth using if it's actually good. The fact that it technically runs on all cards is irrelevant if it doesn't run well.

 

"Slight expense"? Did you see the comparison screenshots? In AMD's own comparison it looked awful. If it doesn't improve before launch then I'd say it's entirely pointless because you might as well just run the game at lower resolution and let your monitor upscale it. Seems to be about as efficient.

Also, the number of people who goes from "not playable" to "playable thanks to Fidelity FX" is most likely VERY small. 

 

They have 16% market share, so yeah... Quite a far stretch from 50%.

 

But do they want to? From what I've seen, I wouldn't even want to turn it on.

In order for people to want to use this it has to actually be good, and now AMD have said that they don't have any interest in making sure it works well on Nvidia cards which is 75% of the market.

This won't get widely adopted if it only works alright, and only works alright on 16% of the market's graphics cards. That's why I am upset. Because I want this to be good and to be widely supported, but AMD have said they don't give two shits about making sure it is good on the graphics cards 75% of their potential customers actually use. That's terrible.

 

 

I think FSR was interesting to developers up until AMD said that they won't bother lifting a finger to make sure it works decently on 75% of graphics cards out there. That they will only focus on making sure it works well on 16% of graphics cards.

 

Have you seen the comparison images AMD released? It's not a "slight drop in quality". The quality takes a nose dive to the point where I even question if it will be any different from just turning the resolution down and letting your monitor upscale the image.

Re: “doesn’t run well”, the question becomes just how not-well. If it runs meh, that may be good enough. Say a person has a 1060 and is looking at a 30xx or 6xxx series card as a replacement, so $600+ out of pocket. A “meh” upsampler might not be so bad if it saves that $600+. VHS didn’t beat Betamax because it was better. It wasn’t. It was just slightly cheaper, and Sony was attaching a bunch of BS to Betamax that the manufacturers didn’t want to mess with. It doesn’t have to work well to win. It only has to work OK.

 

Re: “this won’t get adopted if it only works alright and only works on 16% of the market’s graphics cards”: this is a real point. AMD does not have the market share to be able to just force a methodology the way NVIDIA can. If they want a system to win like FreeSync did, they have to get devs to support it, and the way to get devs to support it is to make it runnable on 1060s, not 1070s. Just 1060s. I don’t know enough about how game development works to tell exactly what it would take, support-wise, to get a game company to either include FXSR alongside DLSS or to do FXSR but not DLSS. Whatever it is, if they don’t want it to die and they want it to do what FreeSync did, they’re going to have to do it.


9 minutes ago, GamerDude said:

Already there are people pushing the blame to AMD even BEFORE the tech is actually released LOL!

Yes? What's the problem with that?

AMD showed a demo and the demo doesn't look very good. I don't see how I am being unreasonable for saying "I hope this works well because I want it to succeed, but right now it looks really bad".

And yes, this is AMD's technology so they will get blamed if it doesn't work well. They are the developers. They have 100% control over how it works. They therefore get 100% of the blame for any issues it has.

 

11 minutes ago, GamerDude said:

That it runs on the GTX 1000 series is all AMD is obliged to do; you can't expect AMD to bend over backwards to get it running perfectly on GeForce cards while neglecting their own.

Sure, but if that's the approach AMD wants to take then I think they should be clear about it and not pretend like "look at how great we are for making an open standard that works with every vendor!". I think a lot of people in this thread have been misled into thinking this is a technology that will work equally well on both AMD and Nvidia cards. In reality, it might be "it technically works, but just barely, on Nvidia, and it works well on AMD". That remains to be seen.

I think it's very disappointing of AMD to not want to put any effort into optimizing this for Nvidia cards though. Like I said earlier, AMD only has about 16% market share. They are neglecting about 75% of the market.

 

13 minutes ago, GamerDude said:

nVidia has always had the bigger driver team

This is not a driver thing... Optimizations like these should be done at the library level.

 

14 minutes ago, GamerDude said:

it'd be a sure bet that they would say AMD is stonewalling them.

Well AMD have deliberately put themselves in a position where they have the ability to stonewall Nvidia, so it is a real possibility that AMD might do so.

 

15 minutes ago, GamerDude said:

When the time comes, let's see if AMD helps them out with FSR, because it's actually beneficial for them to do so. Right now, FSR hasn't been released yet, so let's not speculate and start the blame game, okay?

But... You were the one who started the blame game... I was just responding to your claim that Nvidia are the ones who are 100% responsible for how well it works.


23 minutes ago, Bombastinator said:

Re: “doesn’t run well”, the question becomes just how not-well. If it runs meh, that may be good enough. Say a person has a 1060 and is looking at a 30xx or 6xxx series card as a replacement, so $600+ out of pocket. A “meh” upsampler might not be so bad if it saves that $600+. VHS didn’t beat Betamax because it was better. It wasn’t. It was just slightly cheaper, and Sony was attaching a bunch of BS to Betamax that the manufacturers didn’t want to mess with. It doesn’t have to work well to win. It only has to work OK.

Yes I absolutely agree.

The problem right now is that AMD's demo doesn't work "well enough". It seems to work really poorly. That's why I have on several occasions said that this is not something you can look at in black-and-white terms. This is not a checkbox-type technology where you either have it and it works flawlessly, or you don't have it at all.

Judging by the results I would say that it might be better to have Fidelity FX turned off because it performs so poorly.

 

For those who missed it, this is what it looked like in AMD's demo:

On 6/1/2021 at 10:30 AM, LAwLz said:

Left is native resolution, right is Fidelity FX in "quality mode":

[AMD comparison screenshot: native resolution (left) vs FidelityFX quality mode (right)]

 

Edit: FidelityFX doesn't seem to improve image quality any more than your average monitor scaler can, and that's FidelityFX's biggest competitor.

I think it is pretty obvious from the demo that it is lightyears behind DLSS. But this is not a DLSS competitor in the same way a Honda Civic is not a Ferrari competitor. Chances are games that implement both DLSS and FidelityFX will look significantly better with DLSS than FidelityFX. So there isn't really any competition there. If you got the option for DLSS, always use it.

 

The real competition is FidelityFX vs other ways of upscaling, such as the monitor's scaler. If AMD can't be bothered to make any optimizations for 75% of the graphics cards used by gamers, and Nvidia probably won't spend the time and effort either, then FidelityFX is in a very bad spot, and I wouldn't be surprised if monitors can do a similar or better job than this.

 

If FidelityFX isn't better than other upscaling methods that are also hardware agnostic (like the upscaler that exists in monitors) then the technology is DoA. No point in using it.
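
For what it's worth, once FSR actually ships this is a claim that can be tested directly: take a native frame, downscale it to the internal render resolution, bring it back up with a basic bilinear filter (roughly what a monitor or GPU scaler does) and with a better spatial filter, and compare both against native. Here is a rough sketch of that methodology with Pillow and NumPy; the Lanczos filter is only a stand-in for "a smarter spatial upscaler", not FSR itself, and the file name is a placeholder:

import numpy as np
from PIL import Image

def psnr(a, b):
    # Peak signal-to-noise ratio between two same-sized RGB images (higher = closer to the reference).
    x = np.asarray(a, dtype=np.float64)
    y = np.asarray(b, dtype=np.float64)
    mse = np.mean((x - y) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

native = Image.open("native_1440p.png").convert("RGB")  # placeholder reference frame
# Stand-in for rendering at 75% scale: downscale the native frame.
low = native.resize((native.width * 3 // 4, native.height * 3 // 4), Image.BILINEAR)

bilinear_up = low.resize(native.size, Image.BILINEAR)   # "monitor scaler"-style upscale
lanczos_up = low.resize(native.size, Image.LANCZOS)     # smarter spatial filter as a stand-in

print("bilinear upscale:", round(psnr(native, bilinear_up), 2), "dB")
print("lanczos upscale: ", round(psnr(native, lanczos_up), 2), "dB")

If an FSR-processed frame can't beat the bilinear number by a meaningful margin, then the "just lower the resolution and let the monitor scale it" argument holds.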


The question, I think, is that FXSR is an AMD product and their various GPUs are also AMD products. If AMD merely considers FXSR a bit of support for their video cards, it will be dealt with one way; if they consider FXSR to be as big a product as their video cards, they will do it a different way. Personally, I suspect that if they go the first route, FXSR will fail, and that will cause their cards to fail as well.


3 minutes ago, LAwLz said:

But... You were the one who started the blame game... I was just responding to your claim that Nvidia are the ones who are 100% responsible for how well it works.

Even the poster after your reply was talking about the blame game; it was implicit in your reply to my post. You can try to talk your way around it, but I see your post as shifting the blame BEFORE the release of FSR. Let's see what happens; then we can make an informed decision and express an informed opinion.

 

As for your screenshot showing a drop in PQ, you're taking a compressed screenshot of a screenshot to make your point. I mean, really? Wait for it to be released before forming an opinion. As I said, I, and I'm sure those who are more informed, do NOT expect PQ to match DLSS 2.0/2.1, no way. But as long as the PQ is 'good enough' (or at least better than DLSS 1.0), then it'd be a big win for us.

 

Anyway, I don't wish to get involved in an argument so I'll ignore you, kindly return the favor, thanks.


6 minutes ago, LAwLz said:

Yes I absolutely agree.

The problem right now is that AMD's demo doesn't work "well enough". It seems to work really poorly. That's why I have on several occasions said that this is not something you can look at in black-and-white terms. This is not a checkbox-type technology where you either have it and it works flawlessly, or you don't have it at all.

Judging by the results I would say that it might be better to have Fidelity FX turned off because it performs so poorly.

 

For those who missed it, this is what it looked like in AMD's demo:

That looks like nothing but a motion blur feature.  If it’s no better than upscaling that is built into a monitor there isn’t any reason to even do it.   The implication is that FXSR doesn’t actually exist except as marketing fluff. 


8 minutes ago, Bombastinator said:

The question, I think, is that FXSR is an AMD product and their various GPUs are also AMD products. If AMD merely considers FXSR a bit of support for their video cards, it will be dealt with one way; if they consider FXSR to be as big a product as their video cards, they will do it a different way. Personally, I suspect that if they go the first route, FXSR will fail, and that will cause their cards to fail as well.

I am not sure if I understand you but I think I do, and I agree 100%.

Is FidelityFX a product in and of itself that AMD wants to make as good as possible, or is FidelityFX something they want to use to sell more of their own graphics cards?

 

If it's the latter, which AMD Scott basically flat out said, then FidelityFX probably won't get the wide adoption people including myself are hoping for, and it will probably fail.

 

I want AMD's attitude towards FidelityFX to be "this is a product we are supplying to developers because we want an industry standard that everyone wants to use because it's so good".

Their attitude right now seems to be "we want to sell graphics cards so we made this as a checkbox item and we don't really care how well it works. Also, let's pretend like it's open and good so that we get some good will points from our fans and make Nvidia look bad".

 

You can't pretend that something is hardware agnostic if you aren't putting even a tiny bit of effort into making sure it works well on more hardware than your own. That's like saying macOS is super open and hardware agnostic because you technically can make it work on whatever hardware you want; it's just that Apple aren't doing anything to support hardware other than what they sell.


1 minute ago, LAwLz said:

I am not sure if I understand you but I think I do, and I agree 100%.

Is FidelityFX a product in and of itself that AMD wants to make as good as possible, or is FidelityFX something they want to use to sell more of their own graphics cards?

 

If it's the latter, which AMD Scott basically flat out said, then FidelityFX probably won't get the wide adoption people including myself are hoping for, and it will probably fail.

 

I want AMD's attitude towards FidelityFX to be "this is a product we are supplying to developers because we want an industry standard that everyone wants to use because it's so good".

Their attitude right now seems to be "we want to sell graphics cards so we made this as a checkbox item and we don't really care how well it works. Also, let's pretend like it's open and good so that we get some good will points from our fans and make Nvidia look bad".

If it fails, AMD cards may fail too, because upscaling could be a big, big deal. If you only have to buy a better monitor, rather than a better monitor and a new card, to get your frame rates up, people are going to do that. A card that can do 1080p@60 or 720p@120 would be enough if it can turn the 720 into 1080. The games people buy systems to play these days are PvP shooters, not AAA RPGs. Shooters like that benefit a LOT from higher frame rates. They often don't go for big monitors; they go for above-100fps frame rates. If AMD can't upscale, they're at a 4x disadvantage, and that is enough to kill them.


7 minutes ago, Bombastinator said:

That looks like nothing but a motion blur feature.

I don't think that's motion blur. That is a screenshot AMD decided to release as press material for FidelityFX. I don't think AMD are incompetent enough to pick a screenshot where FidelityFX has a bunch of motion blur in it and the native screenshot doesn't.

 

If it was motion blur on that exact frame then that motion blur should be present in the native resolution screenshot as well, but it isn't. I think that's just how bad FidelityFX looks in its current form.

 

12 minutes ago, Bombastinator said:

 If it’s no better than upscaling that is built into a monitor there isn’t any reason to even do it.

Well, maybe for marketing purposes. Being able to say "we have upscaling like DLSS as well!" on the box or have your fans use it as ammunition might be a good enough reason for AMD.

Just look at all the people in this thread that are praising it before even looking at the screenshots showing the video quality. 

 

14 minutes ago, Bombastinator said:

The implication is that FXSR doesn’t actually exist except as marketing fluff. 

That's what I am worried about.

The poor visual quality in AMD's own screenshots, and the fact that they apparently won't even bother optimizing it for 75% of gamers are arguments for why it seems to be marketing fluff in my eyes. Everything I have heard so far points towards it being marketing fluff. So I am not very hopeful but I'd love to be proven wrong.


8 minutes ago, LAwLz said:

I don't think that's motion blur. That is a screenshot AMD decided to release as press material for FidelityFX. I don't think AMD are incompetent enough to pick a screenshot where FidelityFX has a bunch of motion blur in it and the native screenshot doesn't.

 

If it was motion blur on that exact frame then that motion blur should be present in the native resolution screenshot as well, but it isn't. I think that's just how bad FidelityFX looks in its current form.

 

Well, maybe for marketing purposes. Being able to say "we have upscaling like DLSS as well!" on the box or have your fans use it as ammunition might be a good enough reason for AMD.

Just look at all the people in this thread that are praising it before even looking at the screenshots showing the video quality. 

 

That's what I am worried about.

The poor visual quality in AMD's own screenshots, and the fact that they apparently won't even bother optimizing it for 75% of gamers are arguments for why it seems to be marketing fluff in my eyes. Everything I have heard so far points towards it being marketing fluff. So I am not very hopeful but I'd love to be proven wrong.

Cut 1:

Didn't say it was, just that that's what it looks like. As bad as in-monitor upscaling, which is to say it adds nothing at all.

 

Cut 2:

So it doesn't actually exist and is just marketing fluff.

 

Cut 3:

If that's what they're doing, they'll be caught. That's a mug's game. Something they would only do if they actually had absolutely nothing. Which is possible, I guess. I vaguely recall AMD having some demos that looked pretty good a long while back, though. Might have been faked, I suppose. That's the problem with demos. I'm reminded of that Nikola company not actually having a working truck, so they rolled a non-working one down a hill. If it's BS we will eventually be told. Probably a week or so after the launch. Maybe less. Too many people with review capacity.


3 hours ago, LAwLz said:

If Nvidia wants to try and improve Fidelity FX they will have to submit pull requests to AMD, which AMD could just ignore if they want to.

I guess it's a good thing we can see when that happens, because this is on GitHub?

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

