AMD adds FSR driver support, discontinues support for older GPUs/OSes

porina
13 minutes ago, TetraSky said:

the demand for GPUs will be slightly lower.... right? .... RIGHT!?!

GPU prices are already on a downward trend and supply is going up right now.


1 hour ago, Morgan MLGman said:

This also makes me question the utility of Nvidia's Tensor Cores in RTX GPUs.

Wall of text added to OP including my thoughts on the review samples. Specific to this comment, what we currently have is a native vs FSR comparison. What we could use next is an FSR vs DLSS comparison. Does DLSS offer better visuals and/or more performance vs FSR for those with RTX GPUs?


Just now, porina said:

Wall of text added to OP including my thoughts on the review samples. Specific to this comment, what we currently have is a native vs FSR comparison. What we could use next is an FSR vs DLSS comparison. Does DLSS offer better visuals and/or more performance vs FSR for those with RTX GPUs?

I agree with your conclusion, and I think we should look at this technology as something to be used in specific cases: 4K gaming where you need more performance, offsetting some of the ray tracing performance hit, or running a lower-end part like a Ryzen APU or GTX 10-series card. Though I do not recommend using FSR at lower resolutions.

Apparently FSR actually works on even older hardware; it's just only validated so far back. Its easy implementation in games is a big plus too. The problem is that launch game support is pretty laughable right now, so it's more of a "tech demo" to most people at this point.


9 minutes ago, porina said:

Does DLSS offer better visuals and/or more performance vs FSR for those with RTX GPUs?

Better visuals? Probably yes, except for some temporal artifacts DLSS introduces. More performance? Probably not.
Does that even matter if FSR becomes the de facto upscaling solution, as both large and small studios can benefit from it?
Just imagine it on the Nintendo Switch... 


7 minutes ago, Morgan MLGman said:

Though I do not recommend using FSR at lower resolutions.

By that do you mean using low end hardware to output 1080p, or using the performance settings?

 

If I get around to listing it, I need to sell my 1050 2GB laptop now that it has been replaced by a 3070 laptop. The 1050 was struggling to output 1080p60 at low settings in more modern games. FSR might give it a bit more useful life.

 

Quote

Apparently FSR actually works on even older hardware; it's just only validated so far back. Its easy implementation in games is a big plus too. The problem is that launch game support is pretty laughable right now, so it's more of a "tech demo" to most people at this point.

I forgot to say, FSR reminds me a lot of the render scale options that already exist in some games; the difference is that the upscaling routine should be better. Any new tech will take time to gain adoption.

 

1 minute ago, Forbidden Wafer said:

Better visuals? Probably yes, except for some temporal artifacts DLSS introduces. More performance? Probably not.
Does that even matter if FSR becomes the de facto upscaling solution, as both large and small studios can benefit from it?
Just imagine it on the Nintendo Switch... 

I saw a comparison of DLSS 1 vs 2, and 2 seems much better on temporal stability. It is an open question where FSR fits into that. Given FSR is currently single-frame based like DLSS 1 was, there is a risk of flickering in fine details between frames.

 

I'm not a game dev. How much work is it to implement FSR? How much work is it to implement DLSS? I see that AMD had no choice in this matter. If they tried to make it AMD exclusive, they would probably not get any great traction in the PC space unless they can indirectly move it from the console side first.


15 minutes ago, porina said:

By that do you mean using low end hardware to output 1080p, or using the performance settings?

I meant that if you run at 1080p native and use the FSR Performance setting, what AMD calls the "input resolution" is so low that it completely breaks the visuals at some point, so for best results I would probably recommend not going below the "Quality" setting at 1080p.

For instance, you can see in Hardware Unboxed's review video (here) how using FSR Performance mode at 1080p in Godfall reduces quality significantly more than it does at 4K.

This may be exactly why DLSS is restricted from running at all with certain GPU + resolution configurations.
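
To put rough numbers on that, here is a minimal sketch (my own illustration, not AMD's code) using the per-axis scale factors AMD published for FSR 1.0: Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x and Performance 2.0x. At 1080p output, Performance mode only gives the upscaler roughly a 960x540 source frame, while at 4K the same mode still starts from a full 1080p frame, which is why it holds up so much better there:

# Hypothetical helper, not part of any AMD SDK: maps an output resolution and
# FSR 1.0 quality mode to the internal render ("input") resolution.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_input_resolution(out_w, out_h, mode):
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

for out_w, out_h in [(1920, 1080), (3840, 2160)]:
    for mode in FSR_SCALE:
        w, h = fsr_input_resolution(out_w, out_h, mode)
        print(f"{out_w}x{out_h} {mode:>13}: renders at {w}x{h}")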


18 minutes ago, porina said:

Given FSR is currently single-frame based like DLSS 1 was, there is a risk of flickering in fine details between frames.

Yup, it does happen, especially in balanced/performance mode.

18 minutes ago, porina said:

How much work is it to implement FSR? How much work is it to implement DLSS?

If you're using an engine that supports it, it should be basically the same now that DLSS doesn't require per-game training. If you're not, then you would also need to deal with the DLSS neural network + motion vectors used as feedback after each generated frame (which seem to be big and waste precious VRAM).


28 minutes ago, Morgan MLGman said:

I agree with your conclusion, and I think we should look at this technology as something to be used in specific cases: 4K gaming where you need more performance, offsetting some of the ray tracing performance hit, or running a lower-end part like a Ryzen APU or GTX 10-series card. Though I do not recommend using FSR at lower resolutions.

Apparently FSR actually works on even older hardware; it's just only validated so far back. Its easy implementation in games is a big plus too. The problem is that launch game support is pretty laughable right now, so it's more of a "tech demo" to most people at this point.

Well, so was DLSS. All we got was that Final Fantasy tech demo and Battlefield V, and that was about it. It looked pretty bad. Metro Exodus came later and was still pretty bad, although not as bad. And that was pretty much it for the next 6 months, after which we got 4 more games that support DLSS. In some of them it was a requirement to even be able to use RTX, not because of performance but because it was explicitly tied to DLSS.


1 minute ago, Forbidden Wafer said:

If you're using an engine that supports it, it should be basically the same now that DLSS doesn't require per-game training. If you're not, then you also have to deal with the neural network + motion vectors used as feedback after each generated frame (which seem to be big and waste precious VRAM).

In terms of ease of integration, I found this quote on TweakTown:

Quote

AMD has made FSR easy to integrate for game developers, through a single compute shader solution -- all without the need for any external dependencies. AMD has minimal integration requirements for developers versus its competitor with DLSS, FSR does not require temporal data, or per-game training.

 

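For anyone wondering what "a single compute shader solution" with no temporal data actually means for a renderer, here is a rough sketch of the integration point as AMD describes FSR 1.0 publicly: two passes, EASU (upscale) then RCAS (sharpen), applied to the anti-aliased, tonemapped frame, with UI and film grain drawn afterwards at display resolution. Everything below is a placeholder illustration in Python, not AMD's actual API:

import numpy as np

def render_scene(w, h):
    # Stand-in for the game's normal frame: geometry, lighting, TAA, tonemapping.
    return np.zeros((h, w, 3), dtype=np.float32)

def fsr_easu(img, out_w, out_h):
    # Placeholder for the EASU upscaling pass; nearest-neighbour stand-in,
    # not the real edge-adaptive kernel.
    ys = np.arange(out_h) * img.shape[0] // out_h
    xs = np.arange(out_w) * img.shape[1] // out_w
    return img[ys][:, xs]

def fsr_rcas(img):
    # Placeholder for the RCAS sharpening pass.
    return img

def draw_ui_and_grain(img):
    # UI, film grain, etc. go on top at display resolution, after FSR.
    return img

display_w, display_h = 3840, 2160
render_w, render_h = 2560, 1440                # e.g. FSR "Quality" at a 4K output

frame = render_scene(render_w, render_h)       # the expensive part shades fewer pixels
frame = fsr_easu(frame, display_w, display_h)  # spatial upscale
frame = fsr_rcas(frame)                        # sharpen
frame = draw_ui_and_grain(frame)               # native-res UI on top
print(frame.shape)                             # (2160, 3840, 3)

The contrast with DLSS 2, as I understand it, is that the above needs only the colour buffer, whereas DLSS also has to be fed per-pixel motion vectors, depth and jitter information every frame.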

I hope FSR can be quickly integrated into popular games that already have DLSS, so we can get comparisons soon.


Someone finally noticed that traditional upscaling is not that bad when using high resolution sources, as long as you use a decent algorithm, aka not bilinear.

I really don't understand why it took so long for something like this to become available. But good on AMD for the apparently nice implementation; hopefully it will get more algorithm options and more game support.
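
If anyone wants to eyeball that difference themselves, here is a quick hedged example using Pillow's built-in resampling filters (assumes a recent Pillow; "frame.png" is just a placeholder for any low-resolution screenshot). Neither filter is what FSR actually uses, since EASU is edge-adaptive, but it shows how much the choice of filter alone matters:

from PIL import Image

src = Image.open("frame.png")               # e.g. a 1280x720 capture
target = (2560, 1440)

# Bilinear is the cheap default and tends to look soft; Lanczos (windowed sinc)
# keeps edges noticeably crisper from the same source.
src.resize(target, Image.Resampling.BILINEAR).save("upscaled_bilinear.png")
src.resize(target, Image.Resampling.LANCZOS).save("upscaled_lanczos.png")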


- They tested AMD's 6800 XT, 5700 XT, Vega 64 and RX 570 4GB, plus Nvidia's 3080, 2070, 1660 and 1070 Ti.
- Direct and non-direct comparison between DLSS 1.0, traditional upscaling and DLSS 2.0+

 

51 minutes ago, Morgan MLGman said:

AMD has made FSR easy to integrate for game developers, through a single compute shader solution -- all without the need for any external dependencies. AMD has minimal integration requirements for developers versus its competitor with DLSS, FSR does not require temporal data, or per-game training.

Possible porting to Reshade? Yes pls. 

 

I wonder if this is enabled in the Riftbreaker demo.


Is it just me that's confused by the GPUs they are dropping support for vs the ones they keep supporting?

Correct me if I am wrong, but aren't some of the GPUs from the 400 series basically the same as some older generations?

 

 

Also, that's really bad driver support for some GPUs. For example the Fury and 300 series only got like 5 years of support. 9 years of support for the 7970 is alright, but 

Meanwhile, Nvidia just recently said they would cut support for their 9 year old architecture (Kepler) but will keep giving it security updates.

 

I am also confused as to why they need to add FSR support in the driver. Wasn't the whole selling-point that it wouldn't need that?


14 minutes ago, LAwLz said:

Is it just me that's confused by the GPUs they are dropping support for vs the ones they keep supporting?

Correct me if I am wrong, but aren't some of the GPUs from the 400 series basically the same as some older generations?

 

 

Also, that's really bad driver support for some GPUs. For example the Fury and 300 series only got like 5 years of support. 9 years of support for the 7970 is alright, but 

Meanwhile, Nvidia just recently said they would cut support for their 9 year old architecture (Kepler) but will keep giving it security updates.

 

I am also confused as to why they need to add FSR support in the driver. Wasn't the whole selling-point that it wouldn't need that?

To my knowledge it actually doesn't require driver support. In the Gamers Nexus review, Steve mentioned how FSR can run on older hardware; it just isn't officially validated there.

 


6 hours ago, Levent said:

GCN1.0 received almost 10 years of driver support (9 years and 7 months), a lot longer than I expected. Which is also probably the longest for any graphics card (HD7970/HD7950).

It was great hardware. I think part of the reason they kept it so long is that the architecture was also used in the Xbox One and PS4, so they had to maintain it already.

 

Anyway, it's a shame; those GCN 1.0 products still do a good job today. I have an R9 390X in a bedroom PC, but I guess I'll just update it to the latest driver and keep it going. It rivals an Xbox One X and has 8GB of GDDR5, hardly worth retiring even if AMD abandoned driver support. (And maybe a bit OP since I have it plugged into a 1360x768 TV...)


31 minutes ago, LAwLz said:

Correct me if I am wrong, but aren't some of the GPUs from the 400 series basically the same as some older generations?

Yeah, anything below RX460 is GCN 3.0 and older. 

31 minutes ago, LAwLz said:

I am also confused as to why they need to add FSR support in the driver. Wasn't the whole selling-point that it wouldn't need that?

I'm confused by that as well. I read that it just works on Nvidia GPUs, even with older drivers?


41 minutes ago, xAcid9 said:

- They tested AMD's 6800 XT, 5700 XT, Vega 64 and RX 570 4GB, plus Nvidia's 3080, 2070, 1660 and 1070 Ti.
- Direct and non-direct comparison between DLSS 1.0, traditional upscaling and DLSS 2.0+

 

Possible porting to Reshade? Yes pls. 

 

I wonder if this is enabled in the Riftbreaker demo.

While ReShade could do it, it doesn't have control over framebuffer sizing. ReShade can only operate at the resolution the game is running at, whereas FSR changes (lowers) the render resolution and outputs it at the monitor's native resolution, which is something ReShade can't do. It would be cool if it could do such things; people already do all sorts of funky things with ReShade's shaders (the best example is Marty's RTGI shader, which adds screen-space real-time ray tracing to almost any game). I've used it for a while and it's pretty sick. Having RTGI and FSR and then CAS on top would be quite something 😄
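
The practical difference is easy to put in numbers: an injected shader only ever receives the finished native-resolution frame, so the game still shades every native pixel, while in-engine FSR shades a smaller internal frame and only then upscales. A tiny sketch, assuming FSR 1.0's published scale factors:

native_w, native_h = 3840, 2160
modes = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

native_px = native_w * native_h
print(f"Injector path (ReShade/CAS): game still shades {native_px:,} px per frame")

for mode, s in modes.items():
    shaded = round(native_w / s) * round(native_h / s)
    print(f"In-engine FSR {mode:>13}: game shades {shaded:,} px "
          f"({shaded / native_px:.0%} of native) before upscaling")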


46 minutes ago, LAwLz said:

I am also confused as to why they need to add FSR support in the driver. Wasn't the whole selling-point that it wouldn't need that?

It might be a performance optimisation thing? It might even be game-specific. Someone motivated to do so could try running it on the previous driver and see what happens. The lines below are from AMD's notes for this driver version.

 

Quote

AMD FidelityFX Super Resolution (FSR) support for select titles.

FSR requires game developer integration.

 


8 hours ago, porina said:

It is also a good time to drop support for older GPUs

When people literally can't buy a new supported one? It's literally the worst time for this support drop.


2 hours ago, Ydfhlx said:

When people literally can't buy a new supported one? It's literally the worst time for this support drop.

People had 5 years to get a currently supported GPU. 

 

It doesn't mean they suddenly stop working. They work today as well as they did yesterday, and will continue working with the last supported driver.


1 minute ago, porina said:

People had 5 years to get a currently supported GPU. 

 

It doesn't mean they suddenly stop working. They work today as well as they did yesterday, and will continue working with the last supported driver.

5 years is still a very short time though. It's not the end of the world, but it's significantly worse than what we should be given by AMD. Will we even get security updates after this? Will Windows 11 be supported on these older GPUs that launched in 2015?


Well, this is perfect timing. After 3 years of ownership I sold my GTX 1080 for quite a bit more than I paid for it and downgraded back to my R9 290X. I guess I always expected the 7000/8000 series to be axed before the R9/R7/R5 series; I didn't expect them to kill off the entire architecture in one day. Can't say I'm too upset, seeing as the 290X is quite old, but I wish there was more of a heads-up... or GPUs to buy...


40 minutes ago, LAwLz said:

5 years is still a very short time though. It's not the end of the world, but it's significantly worse than what we should be given by AMD. Will we even get security updates after this? Will Windows 11 be supported on these older GPUs that launched in 2015?

To expand on the 5 years: that's roughly how long Polaris has been on sale. Unless you bought the older GPU after Polaris was released, you could have had much longer support than 5 years. If you buy a product late in its life cycle, you know you will end up with less supported time than someone who buys at launch. It isn't as bad as, say, an Android phone, but it still happens.

 

Security updates, I'd put in the "don't count on it" pile. Those interested in a better answer could look at the last time AMD dropped support from the ongoing driver package and see what happened then.

 

Win 11 support? I'd think it depends on how different the OS really is from Win 10. If the driver model doesn't change in a significant way, then the Win 10 driver may continue to be used. Is it reasonable to expect a product to be supported by an operating system that didn't exist during its official sale life? If we assume Microsoft were to offer a "free" upgrade for Win 10 users, like they offered Win 10 upgrades to Win 7 users, there is no expectation that all hardware will be supported. Not all systems running Win 7 could be upgraded to Win 10, and the same could apply here if no driver is available for the GPU. Within the life of Win 10 we have had updates to the requirements to run it; 32-bit is practically dead, for example.


40 minutes ago, LAwLz said:

5 years is still a very short time though. It's not the end of the world, but it's significantly worse than what we should be given by AMD. Will we even get security updates after this? Will Windows 11 be supported on these older GPUs that launched in 2015?

My son is running my old 380, and it's perfectly fine for all the games he plays (none of us are super high-res gaming tech heads). Does this mean he will have to buy a new GPU soon simply because AMD doesn't want to keep up basic support?

 

 


Just now, mr moose said:

My son is running my old 380, and it's perfectly fine for all the games he plays (none of us are super high-res gaming tech heads). Does this mean he will have to buy a new GPU soon simply because AMD doesn't want to keep up basic support?

Unsupported doesn't mean it won't work. Implicitly, if new problems are found, they will no longer get fixed. How the product performed with the last supported driver is pretty much how it will perform from now on. If all you're doing is playing older games, it'll make no difference. If you try to play the latest games, you might run into some support issue at some point, and that is when you might consider something newer.

 

I do have a handful of older Nvidia GPUs. They are not supported by the ongoing driver package, and I have to specifically seek out older drivers which do support them when I use them in a build. I haven't had any problems doing that, although they're not gaming-grade GPUs so I may not be stressing them as much.

