
AMD Announces FidelityFX Super Resolution, its competitor to Nvidia DLSS

Juanitology
4 hours ago, xAcid9 said:

I reckon this will become like G-Sync vs FreeSync. AMD's implementation is inferior but gets widely adopted.

We will have to wait for the quality comparisons, which are as much of a battleground as the performance numbers. Still, that's what I suspect will be the case. Apparently it will only use spatial scaling, not temporal, so there is more risk of flickering, for example.
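A purely spatial upscaler reconstructs each output frame from that frame alone, with no history buffer, which is why shimmer and flicker are the usual worry: sub-pixel detail can resolve differently from one frame to the next. As a rough sketch of the idea (plain bilinear interpolation here, not AMD's actual filter, which wasn't public at this point):

```python
# Minimal per-frame (spatial-only) upscaler using bilinear interpolation.
# Illustrative only -- the point is that each frame is processed with no
# knowledge of previous frames, so nothing stabilizes flickering detail.

def upscale_bilinear(frame, new_w, new_h):
    """frame: 2D list of grayscale values; returns a new_h x new_w frame."""
    old_h, old_w = len(frame), len(frame[0])
    out = []
    for y in range(new_h):
        # Map the output pixel back into source coordinates.
        src_y = y * (old_h - 1) / max(new_h - 1, 1)
        y0 = int(src_y)
        y1 = min(y0 + 1, old_h - 1)
        fy = src_y - y0
        row = []
        for x in range(new_w):
            src_x = x * (old_w - 1) / max(new_w - 1, 1)
            x0 = int(src_x)
            x1 = min(x0 + 1, old_w - 1)
            fx = src_x - x0
            # Blend the four surrounding source pixels.
            top = frame[y0][x0] * (1 - fx) + frame[y0][x1] * fx
            bot = frame[y1][x0] * (1 - fx) + frame[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

low = [[0, 100],
       [100, 200]]
high = upscale_bilinear(low, 4, 4)  # 2x2 -> 4x4, from this frame alone
```

A temporal technique like DLSS 2.0 additionally feeds in motion vectors and the previous output frame, which is what suppresses that frame-to-frame shimmer.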

 

2 hours ago, xAcid9 said:

Yeah, but I just found out AMD published a patent about FSR recently, and one image in particular shows deep learning involved, so probably not a rebrand.

I haven't seen the AMD presentation yet, but AnandTech's writeup on it states they are not using machine-learning-based techniques. Also, the existence of a patent is not an indicator that it will ever end up in a product. Companies file anything that is at all interesting, as it may be of value even if they never use it themselves.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Sounds promising but I am very skeptical.

If I had to guess, it will not be anywhere near as good as DLSS currently is, but comparing it to DLSS is kind of moot. This is hardware agnostic and will hopefully get wide support (if it's good).

We can discuss DLSS vs Fidelity FX all we want, but that discussion will only matter when:

1) A game supports both technologies and someone needs to choose which one to use.

2) Some fanboy wants to talk about how their favorite brand is superior to the competitor's.

 

The reason I am very skeptical is that the demo image they showed, running on a GTX 1060, looks quite frankly awful.

Left is native resolution, right is Fidelity FX in "quality mode":

[image: screenshot comparing native resolution (left) and FidelityFX Super Resolution quality mode (right)]

 

If it's this bad then you might as well just run at a lower resolution and not use FidelityFX.

I seriously hope it improves before release, because if the comparison they posted is the current example they want to use to highlight how great it is, then it's in VERY bad shape.

 

 

For comparison, this is an example of what DLSS looks like:

Left is native 4K, right is 1440P upscaled to 4K with DLSS:

[image: screenshot comparing native 4K (left) and 1440p upscaled to 4K with DLSS (right)]


I feel this is a crawl before you walk moment. AMD is in the crawling stage and over time, they'll be able to walk just as Nvidia did with DLSS 2.0. Consumers just have to do something they don't want to do, and that's give it... time. Hope it works well because that will be better for us all.

Leonidas Specs: Ryzen 7 5800X3D | AMD 6800 XT Midnight Black | MSI B550 Gaming Plus | Corsair Dominator CL16 3200 MHz  4x8 32GB | be quiet! Silent Base 802

Maximus Specs: Ryzen 7 3700x | AMD 6700 XT Power Color Fighter | Asrock B550M-Itx/AC | Corsair Vengeance CL 16 3200 MHz 2x8 16 GB | Fractal Ridge Case (HTPC)


 


This will be fun to see; the hardware-agnostic approach is also good.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


7 hours ago, HelpfulTechWizard said:

Time for RX 400 owners to do an RX 500 flash.

Guess as of tomorrow I'll have an RX 580 4GB.

Got an RX 480, so I guess I gotta do that too... first-timer on VBIOS flashing.


3 hours ago, porina said:

I haven't seen the AMD presentation yet, but AnandTech's writeup on it states they are not using machine-learning-based techniques. Also, the existence of a patent is not an indicator that it will ever end up in a product. Companies file anything that is at all interesting, as it may be of value even if they never use it themselves.

Yeah, TweakTown also mentioned there will be no machine learning. For FSR 2.0, maybe?

 

3 hours ago, porina said:

Apparently it will only use spatial scaling, not temporal, so there is more risk of flickering, for example.

But since it's not using temporal data, will it be less prone to ghosting, unlike DLSS?

 

3 hours ago, LAwLz said:

The reason I am very skeptical is that the demo image they showed, running on a GTX 1060, looks quite frankly awful.

Yeah, the foliage is blurred to the point it looks like 720p to my eyes. Might as well just drop the resolution to 1080p.

 

 

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


9 hours ago, BuckGup said:

I didn't watch the video but does it still rely on AMD training models for each game like Nvidia does?

DLSS 2.0 doesn't rely on training for specific games anymore. That's why it has been adopted by Unreal Engine as a developer tool. Or do you think Nvidia offers free AI training for every indie dev that decides to put it in their game? The current DLSS is trained on a big library of images, not on a specific game like DLSS 1.0 was.

 

AMD still has a long way to go to deliver an actual competitor to DLSS in terms of quality. It's impressive that they want to support older GPUs as well, and it looks like a great move in the current shortage, which will likely last well into next year. But this is also likely a bottleneck. Remember, DLSS uses dedicated hardware "tensor cores". As has been shown with many things in the past, dedicated hardware is pretty much always superior to relying on more "general" hardware like stream processors / CUDA cores. But as AMD doesn't have tensor cores or any equivalent hardware, they have to work with what they've got, I guess.

 

Maybe this will also make Nvidia release some form of "open" DLSS for non-RTX GPUs. Remember, G-Sync was also Nvidia-only, and now it's usable with AMD cards as well because AMD pushed through with their open FreeSync.

 

I'm interested to see how this will work out.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


6 hours ago, RejZoR said:

Haha, running FSR on a GTX 1060 was a power move indeed 😄 Also, having FSR work on everything from Polaris cards and up is a great way to extend their lifetime. I bet people who have an RX 570/580 are already rejoicing, and I've even heard of many RX 480 users deciding to flash them to RX 580 to get support. I can also imagine there are quite a few users who grabbed RX 5700-series cards back then because they were pretty capable. Now they'll be even more capable.

 

Given that it's hardware agnostic, we can already expect wider adoption, because implementing this means you've done work for both AMD and Nvidia, while implementing DLSS means you've only done it for Nvidia. Now it'll just be a matter of how much money Nvidia is willing to throw at this situation to shill their tech. Seeing DLSS become hardware agnostic is something I'm not expecting at all.

If my memory serves, far and away the most popular cards on Steam are 1060s and 580s. If it runs on those, this could be another FreeSync situation. Game developers will go for it because it can be sold better: the market is bigger.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


2 minutes ago, Bombastinator said:

Game developers will go for it because it can be sold better: the market is bigger.

Not to mention it will be available on PS5 and Xbox Series X/S as well.

 

6 minutes ago, Stahlmann said:

Remember, DLSS uses dedicated hardware "tensor cores"

Can 1660 series use DLSS as well? 



4 hours ago, porina said:

I haven't seen the AMD presentation yet, but AnandTech's writeup on it states they are not using machine-learning-based techniques

They didn't; they mentioned that it doesn't rely on deep-learning hardware:

Quote

but they are confirming that it doesn’t require any kind of tensor or other deep learning hardware. 

We still don't know for sure what tech is behind it; we'll need to wait until the source is available.
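For what it's worth, spatial pipelines in this space typically pair the upscale with a sharpening pass to restore edge contrast; AMD already ships Contrast Adaptive Sharpening as a separate FidelityFX effect. A toy illustration of the sharpening idea, hypothetical and not FSR's actual (unpublished) kernel:

```python
# Toy sharpening pass: push each pixel away from its 3x3 neighborhood
# mean to boost local contrast. Hypothetical sketch of "upscale then
# sharpen" -- whatever FSR really does is certainly more sophisticated.

def sharpen(img, amount=0.5):
    """img: 2D list of grayscale values; returns a sharpened copy."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]          # borders are left untouched
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Mean of the 3x3 neighborhood around (x, y).
            mean = sum(img[y + dy][x + dx]
                       for dy in (-1, 0, 1)
                       for dx in (-1, 0, 1)) / 9.0
            # Amplify the pixel's deviation from the local mean.
            out[y][x] = img[y][x] + amount * (img[y][x] - mean)
    return out

flat = [[50.0] * 4 for _ in range(4)]               # uniform area
edge = [[0.0, 0.0, 100.0, 100.0] for _ in range(4)]  # vertical edge
sharp = sharpen(edge)
```

Flat regions pass through unchanged, while pixels at the edge overshoot past the original 0..100 range; that overshoot is the classic halo artifact a production filter has to clamp adaptively.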

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


9 minutes ago, xAcid9 said:

Can 1660 series use DLSS as well? 

No. DLSS can only be used on RTX cards because it needs their "Tensor Cores" to function.

Only RTX 2000/3000 cards have these.



4 minutes ago, Stahlmann said:

No. DLSS can only be used on RTX cards because it needs their "Tensor Cores" to function.

My bad, I thought the 1660 Ti had tensor cores.



1 hour ago, Murasaki said:

Got an RX 480, so I guess I gotta do that too... first-timer on VBIOS flashing.

I can PM you if you need help.

Might take a couple of tries though, depending on how much of the BIOS has to be RX 580.

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database

My beautiful, but not that powerful, main PC:

prior build:



3 minutes ago, HelpfulTechWizard said:

I can PM you if you need help.

Might take a couple of tries though, depending on how much of the BIOS has to be RX 580.

Thanks! Won't do it right away though since I don't want to screw anything up and end up with no GPU... in these times it would be the worst thing.


2 minutes ago, Murasaki said:

Thanks! Won't do it right away though since I don't want to screw anything up and end up with no GPU... in these times it would be the worst thing.

I completely understand.

Though it's nice that with ATIFlash you can flash a BIOS to a bricked card, if you have an iGPU or a second GPU.



39 minutes ago, Stahlmann said:

DLSS 2.0 doesn't rely on training for specific games anymore. That's why it has been adopted by Unreal Engine as a developer tool. Or do you think Nvidia offers free AI training for every indie dev that decides to put it in their game? The current DLSS is trained on a big library of images, not on a specific game like DLSS 1.0 was.

AI training is just a fancy way of saying "optimized". It's just that it's not real-time and not performed on the client machine.

 

That last part is the problem IMHO. Both nVidia and AMD should provide APIs to game developers that let the client machine run the game in a demo loop to train optimization settings. This would reduce real-time load on the GPU while still allowing flexibility with future driver updates and not risk breaking compatibility and performance. It could also allow a wider adoption for game development. And who knows, maybe retraining again with a newer driver would yield more performance and improved image quality without having to change the game code.


2 minutes ago, HelpfulTechWizard said:

I completely understand.

Though it's nice that with ATIFlash you can flash a BIOS to a bricked card, if you have an iGPU or a second GPU.

Used my mobo's onboard graphics when my first 480 died, so that's kinda reassuring!


9 hours ago, xAcid9 said:

I reckon this will become like G-Sync vs Freesync.  AMD implementation is inferior but get widely adopted. 

Likely will be the case, because DLSS requires an RTX card, IIRC.


1 minute ago, StDragon said:

AI training is just a fancy way of saying "optimized". It's just that it's not real-time and not performed on the client machine.

 

That last part is the problem IMHO. Both nVidia and AMD should provide APIs to game developers that let the client machine run the game in a demo loop to train optimization settings. This would reduce real-time load on the GPU while still allowing flexibility with future driver updates and not risk breaking compatibility and performance. It could also allow a wider adoption for game development. And who knows, maybe retraining again with a newer driver would yield more performance and improved image quality without having to change the game code.

Running a demo loop for a couple of hours on a specific game with subpar hardware wouldn't improve things; it might even make them worse.

You just don't update a model's weights in real time, especially in a scenario where latency is critical.



8 hours ago, GamerDude said:

DLSS is locked to Nvidia cards, and if you'd watched the video, even GTX 1000-series cards can use FSR. Do you see Nvidia doing anything resembling this? For sure, first-gen FSR might be less than spectacular, but so was DLSS 1.0, if you recall. I'm just hoping FSR will be supported in the games I play, especially Metro Exodus PC Enhanced Edition (I need a boost with RT enabled).

What's your point?

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit


CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit


CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone


 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


41 minutes ago, xAcid9 said:

My bad, I thought the 1660 Ti had tensor cores.

My understanding is Any consumer Nvidia gpu that starts with a 1 has no tensor cores



30 minutes ago, dizmo said:

What's your point?

My point is, regarding all this dissing of AMD's first attempt at DLSS-like super resolution: it'll take time for AMD to get better at this, much like how DLSS 1.0 was less than spectacular at launch while DLSS 2.0 is awesome now. Nvidia had a head start, plus locking DLSS to its own hardware ensures that it works better (hardware + software > software only); I'm not arguing that.

 

This works with both older AMD GPUs and GTX 1000 cards; no reasonable person would expect FSR to match DLSS, especially in its first iteration while AMD is working out the kinks. Here, and on other forums, Nvidia boys harp on the picture quality, not even bothering to note that it works on more recent Nvidia GPUs as well.

 

Whatever the case, I'll ignore this thread, as I don't wanna get involved in any arguments...

Main Rig: AMD AM4 R9 5900X (12C/24T) + Tt Water 3.0 ARGB 360 AIO | Gigabyte X570 Aorus Xtreme | 2x 16GB Corsair Vengeance DDR4 3600C16 | XFX MERC 310 RX 7900 XTX | 256GB Sabrent Rocket NVMe M.2 PCIe Gen 3.0 (OS) | 4TB Lexar NM790 NVMe M.2 PCIe4x4 | 2TB TG Cardea Zero Z440 NVMe M.2 PCIe Gen4x4 | 4TB Samsung 860 EVO SATA SSD | 2TB Samsung 860 QVO SATA SSD | 6TB WD Black HDD | CoolerMaster H500M | Corsair HX1000 Platinum | Topre Type Heaven + Seenda Ergonomic W/L Vertical Mouse + 8BitDo Ultimate 2.4G | iFi Micro iDSD Black Label | Philips Fidelio B97 | C49HG90DME 49" 32:9 144Hz Freesync 2 | Omnidesk Pro 2020 48" | 64bit Win11 Pro 23H2

2nd Rig: AMD AM4 R9 3900X + TR PA 120 SE | Gigabyte X570S Aorus Elite AX | 2x 16GB Patriot Viper Elite II DDR4 4000MHz | Sapphire Nitro+ RX 6900 XT | 500GB Crucial P2 Plus NVMe M.2 PCIe Gen 4.0 (OS)2TB Adata Legend 850 NVMe M.2 PCIe Gen4x4 |  2TB Kingston NV2 NVMe M.2 PCIe Gen4x4 | 4TB Leven JS600 SATA SSD | 2TB Seagate HDD | Keychron K2 + Logitech G703 | SOLDAM XR-1 Black Knight | Enermax MAXREVO 1500 | 64bit Win11 Pro 23H2

 

 

 

 

 

 


I also wonder what the resolution limitations will be. A lot of games refuse to enable DLSS if you're using "only" 1080p. For example, Shadow of the Tomb Raider refused to give any control over it at 1080p. When I had a 1080p monitor I had to use DSR to run the game at 1440p and then "downscale" it with DLSS. Now that I have a 1440p monitor it's available by default. I wonder what limitations FSR has in this regard...


1 hour ago, StDragon said:

AI training is just a fancy way of saying "optimized". It's just that it's not real-time and not performed on the client machine.

 

That last part is the problem IMHO. Both nVidia and AMD should provide APIs to game developers that let the client machine run the game in a demo loop to train optimization settings. This would reduce real-time load on the GPU while still allowing flexibility with future driver updates and not risk breaking compatibility and performance. It could also allow a wider adoption for game development. And who knows, maybe retraining again with a newer driver would yield more performance and improved image quality without having to change the game code.

These algorithms to optimize DLSS need to run in a big datacenter. Running them on some client machines for a couple of hours or weeks won't help.

Seeing how well it works, sometimes even improving image quality compared to native resolution, I think the current approach is a good one.



9 minutes ago, RejZoR said:

I also wonder what the resolution limitations will be. A lot of games refuse to enable DLSS if you're using "only" 1080p. For example, Shadow of the Tomb Raider refused to give any control over it at 1080p. When I had a 1080p monitor I had to use DSR to run the game at 1440p and then "downscale" it with DLSS. Now that I have a 1440p monitor it's available by default. I wonder what limitations FSR has in this regard...

That's because it was DLSS 1.0.

 

The 2.0 version does not need any specific target resolution.

It works with all standard resolutions, including ultrawide.


