
AMD Announces FidelityFX Super Resolution, their competitor to Nvidia DLSS

Juanitology

This is interesting, hopefully AMD can improve Super Resolution so we can have something that competes with DLSS. I think people are being too quick to dismiss AMD's first attempt with FSR, the first implementation of DLSS wasn't great either.

Also nice to see that it works without AI training and works on both AMD and Nvidia GPUs.


2 hours ago, Stahlmann said:

Maybe this will also make Nvidia release some form of "open" DLSS for non-RTX GPUs. Remember, G-Sync was also Nvidia-only and is now usable on AMD as well because AMD pushed through with their open FreeSync.

I think you're trying to say we kinda have a universal VRR tech now because of FreeSync. G-Sync remains Nvidia-only tech.

 

1 hour ago, igormp said:

They didn't, they mentioned that it doesn't rely on deep learning hardware:

We still don't know for sure what tech is behind it, we will need to wait until the source is available.

I still haven't had a chance to watch the original presentation, and assuming that is the case, I'm not sure it'll make that much difference. If we assume AMD doesn't use DL, then the lack of hardware doesn't matter at all. If they were to use DL, how much could they usefully do without impacting the traditional computation?

 

The way I see it, they have a given compute budget. Traditionally that would be, say, 100% allocated to rendering. Now we might divide it up between rendering at a lower resolution and spending the remaining power on upscaling. It will be interesting to see those trade-offs.
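To put rough numbers on that split, here's a minimal Python sketch of the trade-off. Everything in it (the fixed 1 ms upscale cost, render time scaling linearly with pixel count) is an illustrative assumption, not a measured FSR figure.

```python
# Rough frame-time sketch of the render-vs-upscale budget trade-off.
# All numbers are illustrative assumptions, not measured FSR figures.

def upscaled_fps(native_fps: float, render_scale: float, upscale_ms: float) -> float:
    """Estimate FPS when rendering a fraction of the native pixels and
    paying a fixed per-frame cost for the upscaling pass."""
    native_ms = 1000.0 / native_fps
    render_ms = native_ms * render_scale  # assume render time scales with pixel count
    return 1000.0 / (render_ms + upscale_ms)

for scale in (0.75, 0.5, 0.25):
    print(f"render scale {scale:.2f}: ~{upscaled_fps(60.0, scale, 1.0):.0f} fps")
# With these made-up numbers: ~74, ~107 and ~194 fps respectively.
```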

 

 

Anyway, we will still have to await results. IMO DLSS 1.0 was fine; if FSR is comparable to that, I'll take it, although I think the only system I use regularly that doesn't have RTX has a 1080 Ti. The goal of gaming is to have a good experience, and that is what FSR should be judged on. I'll pixel peep like anyone else, but looking at still frames isn't the whole story.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


32 minutes ago, Blademaster91 said:

This is interesting, hopefully AMD can improve Super Resolution so we can have something that competes with DLSS. I think people are being too quick to dismiss AMD's first attempt with FSR, the first implementation of DLSS wasn't great either.

Nvidia had the benefit of being first and having no competitors. AMD took way longer to come up with something, and people will naturally compare it to DLSS 2.0.

 

32 minutes ago, Blademaster91 said:

Also nice to see that it works without AI training

There's no confirmation on that yet.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


So many threads about this, but I guess this one is worth it for the updates.

I would assume they also have a fast mode, or rather one that's not exactly worth it and is there just to say they can get a lot of FPS.

So it would be nice to see where it falls, whether all of them are usable or if it's like 144p upscaled to 4K 😛

I guess they did compare their best setting vs native; it would be fun to compare each offering from the best to the worst quality setting.

 

 

Nvidia: let's keep things secret until AMD opens up an open-source one.

AMD, a few years later: here you go!

Nvidia, when more games are using AMD's version: well, here you go, a free version of our tech.


If it's based on the upscaling they've used in the PS4/Xbox One, it will be good enough.
I'm happy to see them focusing on the lower resolutions. I don't plan on playing at 4K anytime soon, so something that gives me 25-40% more FPS at 4K doesn't matter; I'd much rather have 1440p run a lot better.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x12disk raid Z3 stripped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


31 minutes ago, Quackers101 said:

So many threads about this, but I guess this one is worth it for the updates.

I would assume they also have a fast mode, or rather one that's not exactly worth it and is there just to say they can get a lot of FPS.

So it would be nice to see where it falls, whether all of them are usable or if it's like 144p upscaled to 4K 😛

I guess they did compare their best setting vs native; it would be fun to compare each offering from the best to the worst quality setting.

 

 

Nvidia: let's keep things secret until AMD opens up an open-source one.

AMD, a few years later: here you go!

Nvidia, when more games are using AMD's version: well, here you go, a free version of our tech.

If you watch the keynote, there are 4 levels of the tech:

Ultra quality, quality, balanced, and performance

Performance is quoted at up to 2x the FPS, something that would probably be turned on for all competitive games, like Rocket League, CS:GO, Fortnite, Warzone and R6 Siege, if they get it.

Quality, I'd guess, would be used more in eye-candy games, to be able to turn up the settings without all of the FPS loss.

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database

My beautiful, but not that powerful, main PC:

prior build:



1 minute ago, GDRRiley said:

If it's based on the upscaling they've used in the PS4/Xbox One, it will be good enough.
I'm happy to see them focusing on the lower resolutions. I don't plan on playing at 4K anytime soon, so something that gives me 25-40% more FPS at 4K doesn't matter; I'd much rather have 1440p run a lot better.

Agreed. 
In most cases, 4K is absurd for gaming, because fitting a big enough 4K monitor (compared to 1440p) to see the difference, while sitting far enough away to have a comfortable FOV, is hard.



Any such feature will be greatly appreciated by users of whatever existing cards they have, seeing that FSR support goes all the way back to the Radeon RX 500 series. Those are quite in need of extra performance by now. If you can squeeze even 50% extra performance out of it, that's huge. And if it means you sacrifice less quality than by dropping resolution scaling to 50%, or dropping the resolution itself down to 720p instead of 1080p, that's pretty cool.
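For a rough sense of how much rendering work those options actually save, here's a small arithmetic sketch. It assumes the "50% resolution scale" is applied per axis (how many games implement it), which varies, so treat the numbers as illustrative.

```python
# Pixel-count comparison for "drop the render scale vs drop the output resolution".
# Assumes the 50% scale applies per axis (game-dependent), so this is a sketch.

def pixels(width: int, height: int, scale: float = 1.0) -> int:
    """Pixels actually rendered at a given per-axis resolution scale."""
    return int(width * scale) * int(height * scale)

print(f"1080p native      : {pixels(1920, 1080):>9,}")            # 2,073,600
print(f"720p native       : {pixels(1280, 720):>9,}")             #   921,600
print(f"1080p @ 50% scale : {pixels(1920, 1080, scale=0.5):>9,}") #   518,400
```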

 

I really can't wait for games with FSR to appear so I can give them a try and see how quality is affected. DLSS can still have CAS applied via ReShade, which means one could slap CAS on FSR to enhance sharpness too, assuming it's not already being done internally.


1 hour ago, porina said:

I think you're trying to say we kinda have a universal VRR tech now because of FreeSync. G-Sync remains Nvidia-only tech.

But you can now use G-Sync monitors on AMD GPUs.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


7 minutes ago, Stahlmann said:

But you can now use G-Sync monitors on AMD GPUs.

Yeah, only it took Nvidia some time to do so. If AMD and their push had never been a thing, Nvidia could have kept it to themselves. I'm not sure if there were other reasons, but this is where competition is a benefit.


8 minutes ago, Stahlmann said:

But you can now use G-Sync monitors on AMD GPUs.

 

Yes, because Nvidia eventually conceded defeat. They still tried to claim credit for an open standard they could have supported all along with the "G-Sync Compatible" marketing, though, and allegedly also tried to bully display manufacturers into not mentioning FreeSync.

 

I think FSR following in the footsteps of FreeSync in being slightly technically inferior in edge cases but eventually becoming much more widely adopted is the best outcome we can hope for here. 

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600mt/s CL16


3 minutes ago, Middcore said:

 

Yes, because Nvidia eventually conceded defeat. They still tried to claim credit for an open standard they could have supported all along with the "G-Sync Compatible" marketing, though, and allegedly also tried to bully display manufacturers into not mentioning FreeSync.

 

I think FSR following in the footsteps of FreeSync in being slightly technically inferior in edge cases but eventually becoming much more widely adopted is the best outcome we can hope for here. 

And imagine if all a game needs to do is support FSR, and how it's actually done/applied can then be updated via drivers. I sure hope that's the case, because then you don't need to wait for game devs to update it; you just need AMD to update it. That way you don't get cases like with DLSS, where some games are stuck with crappy DLSS 1.0 and will never get a newer version because the devs aren't interested in updating the game or no longer exist. I hope it works this way.


32 minutes ago, Stahlmann said:

But you can now use G-Sync monitors on AMD GPUs.

Did I miss something? Either we still have a misunderstanding or I totally missed that news.

 

When I refer to G-Sync, I'm specifically talking about the original Nvidia implementations and close variations thereof. These require the hardware module in the monitor, and one feature not required by other standards is that the operating frequency range is essentially 0 Hz up to the monitor's refresh rate.

 

G-Sync Compatible certified displays are a lesser standard, essentially the more generic version with Nvidia branding. I guess AMD GPUs could work with those, but I never looked into it since I don't have modern AMD GPUs to test with; the last I had was Vega. If Nvidia haven't changed the requirements since I last looked, this had a lower bar of a 2.4x ratio between the highest and lowest supported VRR rate.
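As a quick illustration of that ratio requirement, here's a minimal check. The 2.4x threshold is just the figure quoted above, not an official Nvidia specification, so treat it as an assumption.

```python
# Tiny sketch of the VRR range-ratio check described above.
# The 2.4x threshold is the figure quoted in the post, not an official spec.

def meets_ratio(vrr_min_hz: float, vrr_max_hz: float, ratio: float = 2.4) -> bool:
    """True if the display's VRR range spans at least `ratio` from min to max."""
    return vrr_max_hz / vrr_min_hz >= ratio

print(meets_ratio(48, 144))  # True:  144 / 48 = 3.0
print(meets_ratio(48, 75))   # False: 75 / 48 ≈ 1.56
```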

 

My current gaming laptop display is listed as supporting both FreeSync and G-Sync Compatible. One of the fun things about having an AMD APU with an Nvidia GPU in the laptop.



"G-Sync Compatible" is, for practical purposes, Nvidia's branding for FreeSync now. They "certify' some displays as G-Sync Compatible and VRR will work with any relatively modern Nvidia or AMD card, although many more displays that Nvidia has not bothered to "certify' also work just fine.

 

The branding for displays with the traditional hardware module inside is now "G-Sync Ultimate" I believe, but I don't know how many of those are being produced these days. 



9 minutes ago, Middcore said:

"G-Sync Compatible" is, for practical purposes, Nvidia's branding for FreeSync now. They "certify' some displays as G-Sync Compatible and VRR will work with any relatively modern Nvidia or AMD card, although many more displays that Nvidia has not bothered to "certify' also work just fine.

Nvidia certification is not just "it works"; they have an image quality requirement as well. To the best of my knowledge, basic FreeSync does not, or its bar is so low there might as well not be any.



1 minute ago, porina said:

Nvidia certification is not just "it works"; they have an image quality requirement as well.

 

Sure. 



1 minute ago, Papercut Captain said:

You know that doesn't imply anything, right?

DLSS 1.0 didn't use motion vectors either; it's just a matter of what you feed your model. DLSS 2.0 also doesn't need per-game training.

There is a lot of research on using ML models for "spatial upscaling".



1 hour ago, Stahlmann said:

But you can now use G-Sync monitors on AMD GPUs.

With a huge "BUT".

You need a new G-Sync monitor that actually works with an AMD GPU. You can't take any old G-Sync monitor and expect it to work.

So if you're like me, you'd still have to change your monitor if you bought an AMD card.


16 minutes ago, Papercut Captain said:

Yes. Just pointing it out. At least for me, their wording suggests it does rely on AI ("not requiring any per-game training"). If they said it didn't require any training, it would be safe to bet it was not AI-based.

Fair enough, I misunderstood your post as trying to make a point about it not being ML-based or something, my bad.

I don't think it's a purely ML-based solution; otherwise the end model would be too heavy to run on the stream processors in tandem with the game rendering itself.

I wonder what their entire solution will look like in the end, which API they're using for their models since AMD lacks any proper work in AI (in case they're indeed using ML), and how it'll fare when it comes to artifacts, since they seem to be doing per-frame upsampling.



12 hours ago, HelpfulTechWizard said:

Agreed. 
In most cases, 4K is absurd for gaming, because fitting a big enough 4K monitor (compared to 1440p) to see the difference, while sitting far enough away to have a comfortable FOV, is hard.

I disagree; I have a 4K 32" at a normal monitor distance.

Native 1440p at 32" would be ugly...
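For reference, the pixel-density gap behind that point, from simple diagonal-based PPI arithmetic (the panel sizes are just the ones mentioned above):

```python
# Pixel density (PPI) of a 32" panel at 4K vs 1440p.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'32" 4K    : {ppi(3840, 2160, 32):.0f} PPI')  # ~138 PPI
print(f'32" 1440p : {ppi(2560, 1440, 32):.0f} PPI')  # ~92 PPI
```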

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


12 hours ago, WereCat said:

With a huge "BUT".

You need a new G-Sync monitor that actually works with an AMD GPU. You can't take any old G-Sync monitor and expect it to work.

So if you're like me, you'd still have to change your monitor if you bought an AMD card.

Yes, it's only the newer G-Sync monitors, but it's better than no change 😉 



13 hours ago, porina said:

Did I miss something? Either we still have a misunderstanding or I totally missed that news.

 

When I refer to G-Sync, I'm specifically talking about the original Nvidia implementations and close variations thereof. These require the hardware module in the monitor, and one feature not required by other standards is that the operating frequency range is essentially 0 Hz up to the monitor's refresh rate.

 

G-Sync Compatible certified displays are a lesser standard, essentially the more generic version with Nvidia branding. I guess AMD GPUs could work with those, but I never looked into it since I don't have modern AMD GPUs to test with; the last I had was Vega. If Nvidia haven't changed the requirements since I last looked, this had a lower bar of a 2.4x ratio between the highest and lowest supported VRR rate.

 

My current gaming laptop display is listed as supporting both FreeSync and G-Sync Compatible. One of the fun things about having an AMD APU with an Nvidia GPU in the laptop.

"Native" G-Sync monitors still use a G-Sync module as opposed to a regular scaler. But newer monitors with G-Sync hardware can also use their VRR capabilities on AMD GPUs.



14 hours ago, Quackers101 said:

Yeah, only it took Nvidia some time to do so. If AMD and their push had never been a thing, Nvidia could have kept it to themselves. I'm not sure if there were other reasons, but this is where competition is a benefit.

That's exactly what I said, wasn't it?



-deleted-  too stupid.  No one cares about ancient joke bands.


Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


58 minutes ago, Stahlmann said:

"Native" G-Sync monitors still use a G-Sync module as opposed to a regular scaler. But newer monitors with G-Sync hardware can also use their VRR capabilities on AMD GPUs.

Thanks, I was not aware of that. I suppose, thinking about it, connection standards (DP, HDMI) have not stayed static over the years, and some minimal level of support is pretty much an expectation these days.


