AMD silently nerfing the performance of older GCN cards

Repost
1 minute ago, Dabombinable said:

I don't want to use Crimson because installing it and its drivers disables OpenCL, which is very important for an APU; OpenCL is the only thing that keeps it from being a complete POS (2x modules @ 1.6GHz = slower than a Tegra 3 @ 1.3GHz).

Sempron 2650? What CCC do you use? Also, which one would you recommend for an HD 5670?

RyzenAir : AMD R5 3600 | AsRock AB350M Pro4 | 32gb Aegis DDR4 3000 | GTX 1070 FE | Fractal Design Node 804
RyzenITX : Ryzen 7 1700 | GA-AB350N-Gaming WIFI | 16gb DDR4 2666 | GTX 1060 | Cougar QBX 

 

PSU Tier list

 


I'm glad I upgraded my 7850 to an RX470 then :D

CPU - Ryzen 7 3700X | RAM - 64 GB DDR4 3200MHz | GPU - Nvidia GTX 1660 ti | MOBO -  MSI B550 Gaming Plus


5 minutes ago, Space Reptile said:

Sempron 2650? What CCC do you use? Also, which one would you recommend for an HD 5670?

I'm using the A8-4555M (Radeon HD 7600G). And I recommend Catalyst 15.7.1 for the HD 5670, as I've had no issues with it on either of my laptops, and it does increase DirectX performance; however, OpenGL performance is drastically reduced.

 

BTW, my old (now dead) laptop had the same GPU at a lower clock speed, the Mobility Radeon HD 5650. When I overclocked it to 700MHz I got similar performance to the 5670 in a laptop, and a massive FPS boost in games, especially after upgrading from a Phenom II X4 @ 1.6GHz to one @ 2.2GHz (i.e. from the worst mobile Phenom II X4 to the best).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Gotta love this forum sometimes.

 

Before today:

"lol Nvidia can't do async compute!"

"Buy AMD, they don't gimp their cards with future drivers"

"The only DX12 benchmark that matters is from Oxide games. They are the only ones who did DX12 the right way!"

 

After today:

"Who needed async compute anyway? It barely made a difference."

"Well it's an old card so it's to be expected."

"It's all lies! AMD would never do this!"

"Don't blame AMD for this! It's the game developers' faults! We should be pissed at Oxide games not AMD you guys!"

"Just run on the old drivers and it won't be a problem!"

 

 

I am surprised that this thread hasn't been deleted for violating the "no porn" rule, because there is so much dick sucking going on here it's obscene.

 

 

Anyway, not going to raise my pitchfork until AMD have had their chance to explain. It seems really weird that they would do this. Hanlon's razor.


29 minutes ago, Space Reptile said:

tbh the last driver my 270s automatically updated to was 16.7.3; my 290 instantly went "HEY, 16.9.2 is here, get it" after I swapped the 270 for the 290.

 

Didn't see anyone complain then. If you want async, just run a slightly older driver, since the cards won't run the newest one to begin with.

 

Some games received optimizations in the latest drivers. If you play new games alongside a game that requires old drivers, it's not easy to switch drivers every time you want to play.

51 minutes ago, RagnarokDel said:

except in this case it would be 16 fps vs 16.2 fps maybe 16.8 in best case scenario.

In this case it's more like 6 to 11%: 56.1 -> 59.8; 47.9 -> 50.8; 39.1 -> 43.3.
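A quick sanity check of those deltas (an illustrative Python sketch, not from the thread; the FPS pairs are the ones quoted above):

```python
def pct_gain(before: float, after: float) -> float:
    """Relative FPS gain, as a percentage of the 'before' figure."""
    return (after - before) / before * 100.0

# FPS pairs quoted above (async disabled -> async enabled)
pairs = [(56.1, 59.8), (47.9, 50.8), (39.1, 43.3)]
for before, after in pairs:
    # roughly +6.6%, +6.1%, +10.7%
    print(f"{before} -> {after} FPS: +{pct_gain(before, after):.1f}%")
```

The third pair works out to nearly 11%, noticeably above the other two, so the gains aren't uniform across the three benchmarks.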


5 hours ago, M.Yurizaki said:

For instance, if a GCN 1.0 card without Async Compute gets 16 FPS and with Async Compute gets 25 FPS, it's still a more-than-50% increase in performance, but would you still care?

Umm yes? Why is that even a question?

https://linustechtips.com/main/topic/631048-psu-tier-list-updated/ Tier Breakdown (My understanding)--1 Godly, 2 Great, 3 Good, 4 Average, 5 Meh, 6 Bad, 7 Awful

 


6 minutes ago, LAwLz said:

Gotta love this forum sometimes.

 

Before today:

"lol Nvidia can't do async compute!"

"Buy AMD, they don't gimp their cards with future drivers"

"The only DX12 benchmark that matters is from Oxide games. They are the only ones who did DX12 the right way!"

 

After today:

"Who needed async compute anyway? It barely made a difference."

"Well it's an old card so it's to be expected."

"It's all lies! AMD would never do this!"

"Don't blame AMD for this! It's the game developers' faults! We should be pissed at Oxide games not AMD you guys!"

"Just run on the old drivers and it won't be a problem!"

 

 

I am surprised that this thread hasn't been deleted for violating the "no porn" rule, because there is so much dick sucking going on here it's obscene.

 

 

Anyway, not going to raise my pitchfork until AMD have had their chance to explain. It seems really weird that they would do this. Hanlon's razor.

Because we are talking about cards that came out 4 to 5 years ago. 

The nvidia cards we complain about are barely a year old. 

 

Though I'm sure you know this, so I'm confused why you're trying to make it a talking point.

CPU: Amd 7800X3D | GPU: AMD 7900XTX


14 minutes ago, Prysin said:

that would be Oxide Games.

No, it's a driver issue, not a game issue.


Just now, M.Yurizaki said:

Because as I said earlier, PC gamers tend to say anything under 30 FPS is "unplayable."

 

I mean I could've said the FPS went from 10 to 15. That's still a 50% improvement.

I wouldn't be happy that the FPS was under 30, but I would be annoyed that they removed a feature that gave me a little more performance (plus it could mean lowering the settings less for a playable FPS). Taking away a feature that helps performance is never a good idea, especially if you're trying to look like the good guy to drive sales.

 

Think of it this way: how would people react if Nvidia blocked something like tessellation on Maxwell cards, lowering performance by 20%? (This is different from adding support for something and thereby making old cards less useful; it's taking away a feature rather than adding a new one that doesn't work with old tech.)


 


11 minutes ago, LAwLz said:

"Buy AMD, they don't gimp their cards with future drivers"

Well, even if it is intentional, I think the issue is that Nvidia cards' performance dropped relative to newer cards and AMD's within 1-2 years of release, whereas the cards AMD is nerfing are 4-5 years old, so...

 

Sad times. AMD was finally gaining a little ground against the 1060, but all this has done is give Nvidia lovers more gas to start spewing over the community :/

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :)


19 minutes ago, LAwLz said:

I am surprised that this thread hasn't been deleted for violating the "no porn" rule, because there is so much dick sucking going on here it's obscene.

Fucking savage, 10/10.

-------

Current Rig

-------


8 minutes ago, goodtofufriday said:

Because we are talking about cards that came out 4 to 5 years ago. 

The nvidia cards we complain about are barely a year old. 

 

Though I'm sure you know this, so I'm confused why you're trying to make it a talking point.

Even if we assume that was true, it still does not address all the other hypocrisies I brought up.

 

 

2 minutes ago, Mr.Meerkat said:

I think the issue is that Nvidia cards' performance dropped relative to newer cards and AMD's within 1-2 years of release, whereas the cards AMD is nerfing are 4-5 years old, so...

Has that actually happened? We're not talking about "The 600 series gained 5% performance while the 700 series gained 10%, they are gimping the older cards!" here, we're talking about a driver possibly lowering the performance. So if you are updating the driver you are getting a lower FPS number. Has Nvidia done that recently?

 

Again, I'm not saying this news is legit or that AMD is doing it on purpose, but some people on this forum are clearly making a very big distinction in how they treat the two brands, and it's stupid.

I am 100% sure that this thread would have looked very differently if it said "Nvidia disables <insert feature here> on GTX 600 cards".


14 minutes ago, AresKrieger said:

I wouldn't be happy that the fps was under 30 but I would be annoyed that they removed a feature that gave me a little more performance (plus lowering the settings less for playable fps could be possible) , taking away a feature that helps performance is never a good idea especially if your trying to look like the good guy to drive sales

I dunno; I find it silly to think of companies as "good guys" or "bad guys". Either they give me what I care about or they don't. Also, considering that most DX12 implementations still suck, I really couldn't care less how well a 4-year-old card performs in DX12.

Quote

Think of it this way: how would people react if Nvidia blocked something like tessellation on Maxwell cards, lowering performance by 20%? (This is different from adding support for something and thereby making old cards less useful; it's taking away a feature rather than adding a new one that doesn't work with old tech.)

If they did that, it wouldn't be considered a DX11 card. Tessellation support is a requirement.


14 minutes ago, AresKrieger said:

Think of it this way: how would people react if Nvidia blocked something like tessellation on Maxwell cards

Nah, too new. A better comparison would be Fermi or Kepler :P

 

4 minutes ago, LAwLz said:

Has that actually happened? We're not talking about "The 600 series gained 5% performance while the 700 series gained 10%, they are gimping the older cards!" here, we're talking about a driver possibly lowering the performance. So if you are updating the driver you are getting a lower FPS number. Has Nvidia done that recently?

That's true, but I think part of the issue is how the 79xx series somehow went from being noticeably slower than the 6xx series to basically beating it in most scenarios (i.e. 7970 vs 670/680, in 2012 versus now), or the 290X at 1080p and 1440p being slower than the 780 Ti at launch and then nearly catching up to a 980 a few months before it was axed (i.e. Polaris being released and it no longer being produced).

 

Although Nvidia isn't actually gimping their cards, I think the biggest issue is that AMD's drivers are still maturing even on their older cards, increasing performance across basically the whole lineup, whereas Nvidia's drivers are already fully matured for the last-gen arch, meaning their old cards are just being left behind.



1 hour ago, jimakos234 said:

RIP 280X users. Also, it looks like these cards won't be fully supported by the upcoming drivers.

Yet the latest drivers will apparently allow Wattman to be used with the 200/300/Fury series?

USEFUL LINKS:

PSU Tier List F@H stats


4 minutes ago, Kwee said:

The R7 370 was released in June 2015, but it's still GCN 1.0. https://www.techpowerup.com/gpudb/2645/radeon-r7-370

That's AMD rebadging for you. The entire 300 series used no new GPU.



2 minutes ago, Dabombinable said:

That's AMD rebadging for you. The entire 300 series used no new GPU.

The only "new" GPU is the R9 380X (full Tonga).


2 minutes ago, Kwee said:

The only "new" GPU is the R9 380X (full Tonga).

Not really; look at the FirePro cards.



 

29 minutes ago, LAwLz said:

Even if we assume that was true, it still does not address all the other hypocrisies I brought up.

 

 

Has that actually happened? We're not talking about "The 600 series gained 5% performance while the 700 series gained 10%, they are gimping the older cards!" here, we're talking about a driver possibly lowering the performance. So if you are updating the driver you are getting a lower FPS number. Has Nvidia done that recently?

 

Again, I'm not saying this news is legit or that AMD is doing it on purpose, but some people on this forum are clearly making a very big distinction in how they treat the two brands, and it's stupid.

I am 100% sure that this thread would have looked very differently if it said "Nvidia disables <insert feature here> on GTX 600 cards".

From what I can gather from articles around the web, Oxide Games updated their game code, and as a result the async code they currently run will not work properly on GCN 1.0 cards. Meaning something the game dev, Oxide Games, did caused a major issue with older-gen GCN cards, and AMD disabled the feature entirely to avoid dealing with all the issues that would come from these changes.


3 minutes ago, Prysin said:

 

From what I can gather from articles around the web, Oxide Games updated their game code, and as a result the async code they currently run will not work properly on GCN 1.0 cards. Meaning something the game dev, Oxide Games, did caused a major issue with older-gen GCN cards, and AMD disabled the feature entirely to avoid dealing with all the issues that would come from these changes.

You misunderstand, or the website made you believe that, but that's not the truth.

 


5 hours ago, ivan134 said:

No, he's right. The 390 is just a 290 with better power delivery and slightly better power consumption. There was no architectural change. Tessellation improvements came with GCN 3, and even more so with GCN 4.

It had 8GB of VRAM, though; that's too important not to mention.


Guest
This topic is now closed to further replies.

