
AMD silently nerfing the performance of older GCN cards

12 minutes ago, TOMPPIX said:

So they nerfed cards that are almost 5 years old; the 7000 series was released in Q1 2012, I believe. I don't really see a problem here.

Imagine for just a second: you buy a table, and the carpenter comes to your home a few years later, cuts off one table leg, and tells you he sells plenty of tables with 4 legs. Would you be happy?


4 minutes ago, Kwee said:

Imagine for just a second: you buy a table, and the carpenter comes to your home a few years later, cuts off one table leg, and tells you he sells plenty of tables with 4 legs. Would you be happy?

Are you comparing a graphics processor to a table? 


I don't believe this, because there's no proof of anything, just clickbait.

When Nvidia pulled something like the 500 MB of slow VRAM on the GTX 970, people came out with benchmarks and tools and everyone could see it with their own eyes. This article has zero real proof, like benchmarks.

 


Just now, TOMPPIX said:

Are you comparing a graphics processor to a table? 

Sure :)


7 minutes ago, Kwee said:

Sure :)

You should have compared it to a desk with a lockable drawer: someone locks the drawer and throws away the key. You can still use the desk normally, but you can't access the drawer.


4 minutes ago, deviant88 said:

I don't believe this, because there's no proof of anything, just clickbait.

When Nvidia pulled something like the 500 MB of slow VRAM on the GTX 970, people came out with benchmarks and tools and everyone could see it with their own eyes. This article has zero real proof, like benchmarks.

 

I put my benchmarks and screenshots on Beyond3D; you can see them there. 

 

https://forum.beyond3d.com/threads/dx12-performance-discussion-and-analysis-thread.57188/page-81#post-1955842

 

and here:

https://forum.beyond3d.com/posts/1956002/

 


Aww, why would you do this, AMD?

Oh well, it's not like I was expecting my 7950 to do any good in DX12. Still gonna keep it until next gen is here.

Before you buy amp and dac.  My thoughts on the M50x  Ultimate Ears Reference monitor review I might have a thing for audio...

My main Headphones and IEMs:  K612 pro, HD 25 and Ultimate Ears Reference Monitor, HD 580 with HD 600 grills

DAC and AMP: RME ADI 2 DAC

Speakers: Genelec 8040, System Audio SA205

Receiver: Denon AVR-1612

Desktop: R7 1700, GTX 1080  RX 580 8GB and other stuff

Laptop: ThinkPad P50: i7 6820HQ, M2000M. ThinkPad T420s: i7 2640M, NVS 4200M

Feel free to pm me if you have a question for me or quote me. If you want to hear what I have to say about something just tag me.


4 minutes ago, Kwee said:

Well, at least we got something to look at.

Now all that is left is some answers from AMD. This could be a bug as well; if they don't fix it with a hotfix or the next driver update, it's clearly intentional, and it will severely hurt AMD in both the short and long run.

Usually it's innocent until proven guilty; if the proof is real, then AMD must come clean by fixing it.


23 minutes ago, deviant88 said:

Well, at least we got something to look at.

Now all that is left is some answers from AMD. This could be a bug as well; if they don't fix it with a hotfix or the next driver update, it's clearly intentional, and it will severely hurt AMD in both the short and long run.

Usually it's innocent until proven guilty; if the proof is real, then AMD must come clean by fixing it.

That's exactly what I want. I don't want to spit on AMD; I'm just pointing out that something is wrong, and I want it fixed. 


This is the Tahiti block diagram, i.e. the 7900 series.

[Spoiler image: tahiti-block.jpg]

Pitcairn block diagram, i.e. the 7800 series.

[Spoiler image: pitcairn.png]

Cape Verde block diagram, i.e. the 7700 series.

[Spoiler image: arch-diagram.jpg]

As you can see, all gaming-oriented 7000 series graphics cards have two dedicated ACEs (asynchronous compute engines) working in conjunction with the command processor. Asynchronous compute is not just a software capability; it's an inherent part of the hardware design, which allows the GPU to perform better when it is allowed to process tasks asynchronously.

 

For reference here is the Hawaii block diagram (290,290x, 390 and 390x)

[Spoiler image: Hawaii-Block-Diagram.jpg]

And the Polaris 10 block diagram

[Spoiler image: rx-480-gpu-diagram.jpg]

You can see the two newer graphics cards have more asynchronous compute engines: the 290/390 cards have 8, while Polaris 10 has 4. In addition, Polaris has 2 hardware schedulers, which also help with asynchronous compute. Still, there is no excuse for the 7700, 7800, and 7900 series cards not to support asynchronous compute on a hardware level, especially in titles that previously supported those features on them.

 

These are not newer titles declining to support older graphics cards. These are older titles ceasing to support graphics cards they previously supported.
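The scheduling benefit those ACEs provide can be illustrated with a toy model (a rough sketch with made-up millisecond figures, not real GPU timings): with a single queue, compute work serializes behind graphics work, while with a dedicated compute queue some or all of it overlaps.

```python
# Toy model of why dedicated ACEs help: with async compute, independent
# compute work runs on its own queue and overlaps the graphics queue,
# so frame time approaches the longer of the two instead of their sum.
# All numbers below are hypothetical, purely for illustration.

def frame_time_serial(graphics_ms, compute_ms):
    """One queue: compute waits for graphics to finish."""
    return graphics_ms + compute_ms

def frame_time_async(graphics_ms, compute_ms, overlap=1.0):
    """Separate queues: a fraction `overlap` of the compute work
    hides behind the graphics work (overlap=1.0 is the ideal case)."""
    hidden = min(compute_ms * overlap, graphics_ms)
    return graphics_ms + compute_ms - hidden

graphics, compute = 12.0, 4.0            # milliseconds, made up
print(frame_time_serial(graphics, compute))  # 16.0 ms without async
print(frame_time_async(graphics, compute))   # 12.0 ms with perfect overlap
```

Even with partial overlap (say `overlap=0.5`, giving 14.0 ms) the card comes out ahead, which is why disabling the path entirely, rather than tuning it, is what people object to here.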

Motherboard: Asus X570-E
CPU: 3900x 4.3GHZ

Memory: G.skill Trident GTZR 3200mhz cl14

GPU: AMD RX 570

SSD1: Corsair MP510 1TB

SSD2: Samsung MX500 500GB

PSU: Corsair AX860i Platinum


Hmm that is very unfortunate...

 

It is a bit different, IMHO, to disable a function that at best doesn't help and at worst significantly hurts (async on Maxwell) than to disable a designed feature of the GPU architecture...

 

I hope AMD addresses this change at some point, especially since the PS4 and XBone are both based on old GCN revisions as well.

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


17 minutes ago, Curufinwe_wins said:

Hmm that is very unfortunate...

 

It is a bit different, IMHO, to disable a function that at best doesn't help and at worst significantly hurts (async on Maxwell) than to disable a designed feature of the GPU architecture...

 

I hope AMD addresses this change at some point, especially since the PS4 and XBone are both based on old GCN revisions as well.

Yea, after thinking about the consoles, this started to make less sense.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


AMD changed the names from GCN 1.x to just the number somewhere around the Polaris announcements.

So instead of Polaris having GCN 1.4, it has GCN 4.

 

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


Do we know if they're currently developing or implementing changes to the way their async compute works, or adding functionality that would limit the effectiveness of the 2XX and 7XXX cards? 

 

If so, probably a good thing to stop market fragmentation from a development perspective. 

 

If not, they pulled a Nvidia. 


2 hours ago, Kwee said:

Not two games; someone wrote a program for testing async compute. It's disabled in the driver.

 

I've seen a lot of misunderstanding since I posted my findings.

In the first place, I started looking into why GCN 1.0 was no longer supported in the latest Rise of the Tomb Raider patch.

When Maxwell was accused of not supporting async compute, someone created a program to test async compute. Back then, it worked on GCN 1.0, 1.1, and 1.2.

So I just wanted to verify whether that was still the case, and that's how I found that async compute was disabled in the new drivers. Then I posted on Reddit.

After that, many people asked me to test a game that supports async compute on GCN 1.0, so I tried Ashes of the Singularity. You know the end. It just confirmed that async compute is disabled in the new drivers. The old drivers perform way better because of async compute.

DirectX 12, driver 16.3.1, async compute off: http://i.imgur.com/aiV1pSg.png

DirectX 12, driver 16.3.1, async compute on: http://i.imgur.com/CGrb4yM.png

DirectX 12, drivers 16.9.2 and later, async compute off: http://i.imgur.com/yiSSRCE.png

DirectX 12, drivers 16.9.2 and later, async compute on: http://i.imgur.com/Fch5V8w.png

OK. So before the driver that removed support, async compute was providing a noticeable boost in performance.

So....


Fuck AMD.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


27 minutes ago, samcool55 said:

AMD changed the names from GCN 1.x to just the number somewhere around the Polaris announcements.

So instead of Polaris having GCN 1.4, it has GCN 4.

 

No. GCN 1.x is not something AMD ever used. Look at Zmuel's link to see where that came from; AnandTech started it.



28 minutes ago, samcool55 said:

AMD changed the names from GCN 1.x to just the number somewhere around the Polaris announcements.

So instead of Polaris having GCN 1.4, it has GCN 4.

 

No. AMD never used the GCN 1.X naming scheme. They simply referred to it all as GCN.

 

It wasn't until Polaris that AMD started separating the different revisions of the GCN architecture with actual names. And then they just called them revisions 1, 2, 3, and 4.
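For reference, here is a quick mapping of the informal press labels to AMD's generation numbers, using only the chips already named in this thread (a summary sketch, not an official AMD list):

```python
# Informal "GCN 1.x" labels (popularized by the press, not AMD) vs. the
# generation numbers AMD adopted around Polaris, with example chips
# mentioned earlier in this thread.
GCN_GENERATIONS = {
    "GCN 1 (press: GCN 1.0)": ["Tahiti", "Pitcairn", "Cape Verde"],  # HD 7900/7800/7700
    "GCN 2 (press: GCN 1.1)": ["Hawaii"],                            # 290/290X, 390/390X
    "GCN 4 (press: GCN 1.4)": ["Polaris 10"],                        # RX 480
}

def generation_of(chip):
    """Look up which GCN generation a given chip belongs to."""
    for gen, chips in GCN_GENERATIONS.items():
        if chip in chips:
            return gen
    return None

print(generation_of("Hawaii"))  # → GCN 2 (press: GCN 1.1)
```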


RIP 280X users. Also, it looks like these cards won't be fully supported by the upcoming drivers.

The site has changed....


4 hours ago, M.Yurizaki said:

For instance, if a GCN 1.0 card gets 16 FPS without async compute and 25 FPS with it, that's still a roughly 56% increase in performance, but would you still care?

Except in this case it would be 16 FPS vs. 16.2 FPS, maybe 16.8 in the best-case scenario.
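The disagreement here is only about the size of the gain; the relative-improvement arithmetic itself is straightforward (FPS figures taken from the two posts above):

```python
def pct_gain(base_fps, async_fps):
    """Percentage improvement from enabling async compute."""
    return (async_fps / base_fps - 1) * 100

print(pct_gain(16, 25))              # → 56.25 (the quoted hypothetical)
print(round(pct_gain(16, 16.8), 2))  # → 5.0 (the pessimistic estimate)
```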


13 minutes ago, Sakkura said:

No. AMD never used the GCN 1.X naming scheme. They simply referred to it all as GCN.

 

It wasn't until Polaris that AMD started separating the different revisions of the GCN architecture with actual names. And then they just called them revisions 1, 2, 3, and 4.

Wikipedia said "screw it" and called it GCN # Gen.


Ayy, my 290 is second gen, but my 270Xs are first gen :/

But honestly, async performance on gen 1 was not there at all.

We're talking 3-year-old cards here, and if async is made for more modern GCN architectures, then it makes sense to axe the feature from the old versions, since it might bottleneck them more than benefit them.

 

RyzenAir : AMD R5 3600 | AsRock AB350M Pro4 | 32gb Aegis DDR4 3000 | GTX 1070 FE | Fractal Design Node 804
RyzenITX : Ryzen 7 1700 | GA-AB350N-Gaming WIFI | 16gb DDR4 2666 | GTX 1060 | Cougar QBX 

 

PSU Tier list

 


4 minutes ago, Space Reptile said:

Ayy, my 290 is second gen, but my 270Xs are first gen :/

But honestly, async performance on gen 1 was not there at all.

We're talking 3-year-old cards here, and if async is made for more modern GCN architectures, then it makes sense to axe the feature from the old versions, since it might bottleneck them more than benefit them.

 

Either way, it's the kind of thing that should at least be optional, since async compute performance differs from game to game: where it's a negative on older cards in some games, it's a positive in others. There really should be a toggle for it in Catalyst.



18 minutes ago, jimakos234 said:

RIP 280X users. Also, it looks like these cards won't be fully supported by the upcoming drivers.

TBH, the last driver my 270s automatically updated to was 16.7.3; my 290 instantly went "HEY, 16.9.2 is here, get it" right after I swapped the 270 for the 290.

I didn't see anyone complain then. If you want async, just run a slightly older driver, since the cards won't run the newest one to begin with.

 



4 minutes ago, Dabombinable said:

Either way, it's the kind of thing that should at least be optional, since async compute performance differs from game to game: where it's a negative on older cards in some games, it's a positive in others. There really should be a toggle for it in Catalyst.

Eh, maybe, but if you're still running CCC, I advise you to update :P



1 minute ago, Space Reptile said:

Eh, maybe, but if you're still running CCC, I advise you to update :P

I don't want to use Crimson because installing it and its drivers disables OpenCL, which is very important on an APU; OpenCL is the only thing that makes it not a complete POS (2 modules @ 1.6 GHz = slower than a Tegra 3 @ 1.3 GHz).



This topic is now closed to further replies.

