
AMD’s gift to ALL gamers - FidelityFX Super Resolution

0:22 - Work on older hardware? Maybe my sister's old RX 560 can finally get better performance.. while we wait for a new GPU to come in stock, of course 🙂

Useful threads: PSU Tier List | Motherboard Tier List | Graphics Card Cooling Tier List ❤️

Baby: MPG X570 GAMING PLUS | AMD Ryzen 9 5900X w/ PBO | Corsair H150i Pro RGB | ASRock RX 7900 XTX Phantom Gaming OC (3020 MHz core & 2650 MHz memory) | Corsair Vengeance RGB PRO 32GB DDR4 (4x8GB) 3600 MHz | Corsair RM1000x | WD_BLACK SN850 | WD_BLACK SN750 | Samsung EVO 850 | Kingston A400 | PNY CS900 | Lian Li O11 Dynamic White | Display(s): Samsung Odyssey G7, ASUS TUF GAMING VG27AQZ 27" & MSI G274F

 

I also drive a Volvo, as one does being Norwegian haha, a Volvo V70 D3 from 2016.

Reliability was a key thing and it's my second car; it's working pretty well for its 6 years of age xD


6 hours ago, kitnoman said:

I've watched 4 reviews now. From what I understood, FidelityFX is actually designed for mid- and high-end GPUs. Between 4K, 1440p and 1080p, 1080p is the worst of the bunch, and it only gets blurrier as you lower the resolution, while 1440p is the best option for a balance of high FPS and quality. Since FidelityFX is mainly focused on 4K and 1440p, we can say it mainly benefits GPUs designed for those resolutions: high-end cards, old-gen high-end cards (now mid-range) and mid-range cards. Still, it is really helpful and should buy at least another year or two for cards running at 1080p, until we get out of this GPU stock issue.

 

I've always believed that AMD's best business move was partnering with Sony and Microsoft on the PS5 and Xbox. I would not be surprised if, within a year or even six months, more games support FidelityFX than DLSS. I mean, think about it: it is much easier to incorporate into an older or current game. Plus, new games coming out on consoles will most likely ship with FidelityFX support in their PC versions too.

 

But what I'm looking forward to is whether it's really true that there will be a Samsung-AMD phone, and whether it will support FidelityFX too.

How exactly? 1440p FSR Quality upscales from a source image of roughly 960p (the Quality preset renders at 1/1.5 of each axis), so it's cheaper than native 1080p. I have an RX 580, which is low-end by today's standards; it can run games at 1080p no problem but struggles pretty hard at 1440p. With FSR on the Quality preset, I can run games at 1440p, which I previously could not on my low-end GPU.
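For reference, FSR 1.0's presets scale each axis by a fixed factor (per AMD's published numbers: Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x). A minimal Python sketch, just arithmetic over those factors rather than anything from AMD's actual code, shows what each preset renders internally at 1440p output:

```python
# Sketch: FSR 1.0's published per-axis scale factors (assumed values).
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(out_w, out_h, preset):
    """Internal (source) resolution FSR upscales from for a given output."""
    s = FSR_SCALE[preset]
    return round(out_w / s), round(out_h / s)

for preset in FSR_SCALE:
    w, h = render_resolution(2560, 1440, preset)
    print(f"1440p {preset}: {w}x{h}")
# 1440p Ultra Quality -> 1969x1108, Quality -> 1707x960,
# Balanced -> 1506x847, Performance -> 1280x720.
```

So 1440p Quality renders around 1707x960, noticeably fewer pixels than native 1080p, which is exactly why a card that is comfortable at 1080p can take it.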

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |


1 hour ago, Fatih19 said:

How exactly? 1440p FSR Quality upscales from a source image of roughly 960p (the Quality preset renders at 1/1.5 of each axis), so it's cheaper than native 1080p. I have an RX 580, which is low-end by today's standards; it can run games at 1080p no problem but struggles pretty hard at 1440p. With FSR on the Quality preset, I can run games at 1440p, which I previously could not on my low-end GPU.

Correct me if my understanding is wrong, but if your monitor's native resolution is 1080p and you have a 1060/RX 580 class card, why would you run your game at 1440p? I guess if you want a little more quality. But if your goal is more stable, playable frame rates, wouldn't it be more productive to use 1080p Ultra Quality or Quality instead of 1440p at Quality? If you check, that gives higher frame rates. Personally, I don't know if there's something wrong with my eyes, but I barely see any difference between 1440p and 1080p when the game is running on a 1080p monitor. Lastly, if you check the reviews, while 4K and 1440p show acceptable performance gains from native down to Performance, you see diminishing returns from Quality to Performance at 1080p and lower resolutions. Again, this is just what I understood, so I'm not really sure either. I don't play or own any of the games that currently support it, so I can't test it myself right now.

Note: the screenshots below are not mine, but they show that 1080p at Ultra Quality and Quality beats 1440p at Quality in terms of frame rates, and not just in this review; other reviews show the same.

[Attached screenshots, test 1.PNG through test 6.PNG: FSR benchmark charts comparing frame rates across resolutions and quality presets]
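The frame-rate ordering in those charts also follows from the scale factors alone. Assuming the same FSR 1.0 per-axis factors as above (1.3x Ultra Quality, 1.5x Quality), here is a rough sketch using internal pixel count as a crude proxy for GPU cost (it ignores the upscale/sharpen pass itself and the larger output buffer):

```python
# Crude cost proxy: internal pixels rendered before FSR upscales.
def source_pixels(out_w, out_h, scale):
    return round(out_w / scale) * round(out_h / scale)

p_1080_uq = source_pixels(1920, 1080, 1.3)  # 1080p Ultra Quality
p_1440_q = source_pixels(2560, 1440, 1.5)   # 1440p Quality

print(f"1080p Ultra Quality: {p_1080_uq:,} px")             # 1,227,387
print(f"1440p Quality:       {p_1440_q:,} px")              # 1,638,720
print(f"1440p Quality ratio: {p_1440_q / p_1080_uq:.2f}x")  # ~1.34x
```

1440p Quality renders roughly a third more internal pixels than 1080p Ultra Quality, so the higher 1080p frame rates in the screenshots are expected; the two settings simply aren't targeting the same output resolution.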


Gamers: I can use my 1060 or 1030 now, wow!

Also gamers with an Nvidia GT 710 or G210: when is our time coming?


1 hour ago, Ihaveagraphicscard said:

Gamers: I can use my 1060 or 1030 now, wow!

Also gamers with an Nvidia GT 710 or G210: when is our time coming?

Over at GN, Steve said FSR should work on the GTX 900 series as well, so owners of those cards have cause to celebrate too. I'm almost done downloading Godfall on my HTPC rig, which has a GTX 1080, so I'm going to try it upscaled to 4K... but the EGS launcher is giving me all sorts of problems on this rig. Godfall stalled at 97% and has been stuck at 'Verifying' for more than an hour now. I don't get it; it works perfectly on my main rig...

Main Rig: AMD AM4 R9 5900X (12C/24T) + Tt Water 3.0 ARGB 360 AIO | Gigabyte X570 Aorus Xtreme | 2x 16GB Corsair Vengeance DDR4 3600C16 | XFX MERC 310 RX 7900 XTX | 256GB Sabrent Rocket NVMe M.2 PCIe Gen 3.0 (OS) | 4TB Lexar NM790 NVMe M.2 PCIe4x4 | 2TB TG Cardea Zero Z440 NVMe M.2 PCIe Gen4x4 | 4TB Samsung 860 EVO SATA SSD | 2TB Samsung 860 QVO SATA SSD | 6TB WD Black HDD | CoolerMaster H500M | Corsair HX1000 Platinum | Topre Type Heaven + Seenda Ergonomic W/L Vertical Mouse + 8BitDo Ultimate 2.4G | iFi Micro iDSD Black Label | Philips Fidelio B97 | C49HG90DME 49" 32:9 144Hz Freesync 2 | Omnidesk Pro 2020 48" | 64bit Win11 Pro 23H2

2nd Rig: AMD AM4 R9 3900X + TR PA 120 SE | Gigabyte X570S Aorus Elite AX | 2x 16GB Patriot Viper Elite II DDR4 4000MHz | Sapphire Nitro+ RX 6900 XT | 500GB Crucial P2 Plus NVMe M.2 PCIe Gen 4.0 (OS) | 2TB Adata Legend 850 NVMe M.2 PCIe Gen4x4 | 2TB Kingston NV2 NVMe M.2 PCIe Gen4x4 | 4TB Leven JS600 SATA SSD | 2TB Seagate HDD | Keychron K2 + Logitech G703 | SOLDAM XR-1 Black Knight | Enermax MAXREVO 1500 | 64bit Win11 Pro 23H2



7 hours ago, kitnoman said:

Correct me if my understanding is wrong, but if your monitor's native resolution is 1080p and you have a 1060/RX 580 class card, why would you run your game at 1440p? I guess if you want a little more quality. But if your goal is more stable, playable frame rates, wouldn't it be more productive to use 1080p Ultra Quality or Quality instead of 1440p at Quality? If you check, that gives higher frame rates. Personally, I don't know if there's something wrong with my eyes, but I barely see any difference between 1440p and 1080p when the game is running on a 1080p monitor. Lastly, if you check the reviews, while 4K and 1440p show acceptable performance gains from native down to Performance, you see diminishing returns from Quality to Performance at 1080p and lower resolutions. Again, this is just what I understood, so I'm not really sure either. I don't play or own any of the games that currently support it, so I can't test it myself right now.

Note: the screenshots below are not mine, but they show that 1080p at Ultra Quality and Quality beats 1440p at Quality in terms of frame rates, and not just in this review; other reviews show the same.

 

The assumption that I have a 1080p monitor is just wrong. Why would I play at 1440p if I had a 1080p monitor? It should be obvious that I have a 1440p monitor.

Secondly, I don't care much about frame rates; as long as it's 60 and up, I'm happy. Why? Because my monitor is capped at 60 Hz, and the games I play are single-player games where visual fidelity matters more than frame rate. You're basing this argument on the assumptions that I play competitive multiplayer games, where frame rates matter much more, and that my monitor can take advantage of high frame rates; neither is true.

Main Rig :

Ryzen 7 2700X | Powercolor Red Devil RX 580 8 GB | Gigabyte AB350M Gaming 3 | 16 GB TeamGroup Elite 2400MHz | Samsung 750 EVO 240 GB | HGST 7200 RPM 1 TB | Seasonic M12II EVO | CoolerMaster Q300L | Dell U2518D | Dell P2217H | 

 

Laptop :

Thinkpad X230 | i5 3320M | 8 GB DDR3 | V-Gen 128 GB SSD |

