The GTX 970 vs. R9 390 thread to (theoretically) end them all

1 minute ago, Dan Castellaneta said:

Just go AMD in this case. Of the games you listed, only one (The Witcher 3) is bad on tessellation, and that can be adjusted if necessary.

PlanetSide 2 is very CPU-bound, by the way.


Just now, gilang01 said:

PlanetSide 2 is very CPU-bound, by the way.

PlanetSide 2 is only CPU-bound on two cores. You're better off just forcing the game to run on the last two cores.

The Frostbite engine benefits rather greatly from AMD GPUs.
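For what "force the game onto the last two cores" means in practice: you set a CPU-affinity bitmask, either in Task Manager or via `start /affinity <hex mask>`. A minimal sketch of how that mask is built (the core counts below are just examples, not anything specific to PlanetSide 2):

```python
# Build a CPU-affinity bitmask covering the last `n` of `total` logical
# cores. Sketch only -- on Windows you could pass the resulting hex value
# to `start /affinity <mask> game.exe` or set it in Task Manager.
def affinity_mask(total_cores: int, n: int) -> int:
    mask = 0
    for core in range(total_cores - n, total_cores):
        mask |= 1 << core  # one bit per logical core
    return mask

# On a 4-core CPU the last two cores are bits 2 and 3 -> 0b1100 == 0xC.
print(hex(affinity_mask(4, 2)))  # -> 0xc
```

On an 8-core chip the same call with `affinity_mask(8, 2)` yields `0xC0` (bits 6 and 7).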

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


1 minute ago, Dan Castellaneta said:

PlanetSide 2 is only CPU-bound on two cores. You're better off just forcing the game to run on the last two cores.

The Frostbite engine benefits rather greatly from AMD GPUs.

ah ok. Thanks heaps 


5 hours ago, Monarch said:

Apparently others have been instructed to avoid benchmarking drawcall heavy scenes, and to compare AMD cards only to the reference versions of the Nvidia counterparts to make AMD cards look better. 

I've seen those pclab benchmarks, and they clearly show a CPU bottleneck.

Even the 390 is a bit slower than the 960, which indicates the CPU is bottlenecking it. That's how bad AMD's driver is. And with an i5, the bottlenecking would be even worse.

You know that in the same article you linked, there are two other "slides" where the 380X stomps the GTX cards. Wonder why you didn't post those.


2 hours ago, Prysin said:

You know that in the same article you linked, there are two other "slides" where the 380X stomps the GTX cards. Wonder why you didn't post those.

Don't know what slides you mean, but the point I'm trying to make isn't that the GTX 960 is a faster GPU than the 380X. The R9 380X has been proven to be faster in most GPU-bound scenarios except tessellation. But the problem is that it's heavily bottlenecked by AMD's driver in draw-call-heavy scenarios, sometimes so severely that even an OCed 6700K can't quite help it catch up to the 960. And people who have i5s and i3s will be bottlenecked even more, in which case it's useless having a faster GPU.

i7 9700K @ 5 GHz, ASUS DUAL RTX 3070 (OC), Gigabyte Z390 Gaming SLI, 2x8 HyperX Predator 3200 MHz


1 hour ago, Monarch said:

Don't know what slides you mean, but the point I'm trying to make isn't that the GTX 960 is a faster GPU than the 380X. The R9 380X has been proven to be faster in most GPU-bound scenarios except tessellation. But the problem is that it's heavily bottlenecked by AMD's driver in draw-call-heavy scenarios, sometimes so severely that even an OCed 6700K can't quite help it catch up to the 960. And people who have i5s and i3s will be bottlenecked even more, in which case it's useless having a faster GPU.

Yet your post proves nothing, because it does not prove that they were using a low-end CPU. That means it could boil down to workload. Was there high tessellation in the picture? Were there other effects that AMD emulates rather than does in hardware? Were there a lot of alpha-channel effects? Transparent objects? These are some of the things AMD loses FPS over. And I know that you do not have a clue WHY AMD scores lower. And neither do I.

But in the end, posting a benchmark and saying "this is a driver issue" doesn't work. Because to prove a driver is at fault you MUST eliminate everything else. EVERYTHING else. From workload, to GPU architecture, to CPU and CPU workload and architecture, memory bandwidth, memory capacity, AND THE DAMN GAME.

Because you posted from a GTA V test, which is a title known to favor Nvidia, though not necessarily built for them. That means the workload it exerts on the GPUs is probably more in line with the things Nvidia is strong at doing. Which is, you know, an important factor here.

So in the end, what you posted proves little to nothing, because there are too many variables. And you have dissected NONE of these issues. ZERO of them.

You cannot conclude anything without eliminating everything else.

On the flip side...

ATM I am working to dissect the PClab.pl tests for The Witcher 3: first using an FX-8320, then my i7-4790K. And I've got an 8GB 2400 MHz DDR3 kit which I can freely turn up and down in frequency, meaning I can test memory bottlenecking too. I can test Crossfire issues with my R9 295X2, and soon I'll get a GTX 950. Whilst FAR from as strong as an R9 295X2, it will let me compare CPU driver load with ease and compare frametime issues.
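That "eliminate everything else" plan implies a full test matrix. A rough sketch of its size (hardware names are from the post above; the intermediate RAM steps are my assumption for the adjustable 2400 MHz kit; nothing here runs a benchmark, it only enumerates the runs):

```python
# Enumerate the controlled benchmark combinations a proper isolation test
# needs. The RAM frequency steps are assumed, not stated in the post.
from itertools import product

cpus    = ["FX-8320", "i7-4790K"]
gpus    = ["R9 295X2", "GTX 950"]
ram_mhz = [1600, 1866, 2133, 2400]

runs = list(product(cpus, gpus, ram_mhz))
print(len(runs))  # 2 CPUs x 2 GPUs x 4 RAM speeds = 16 runs per game/scene
```

Sixteen runs per scene, before even varying the game or settings, which is why a single context-free slide can't isolate the driver.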

 

 


This is still going? Many here still have no clue how tessellation screws people over, so here ya go... (do not forget PhysX)

No, I do not hate Nvidia; I have a 960 and a 970, not to mention I have no AMD GPUs except some old ATI ones I found...

Lake-V-X6-10600 (Gaming PC)

R23 score MC: 9190pts | R23 score SC: 1302pts

R20 score MC: 3529cb | R20 score SC: 506cb


Case: Cooler Master HAF XB Evo Black / Case Fan(s) Front: Noctua NF-A14 ULN 140mm Premium Fans / Case Fan(s) Rear: Corsair Air Series AF120 Quiet Edition (red) / Case Fan(s) Side: Noctua NF-A6x25 FLX 60mm Premium Fan / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: Cooler Master Hyper 212 Evo / CPU: Intel Core i5-10600, 6-cores, 12-threads, 4.4/4.8GHz, 13,5MB cache (Intel 14nm++ FinFET) / Display: ASUS 24" LED VN247H (67Hz OC) 1920x1080p / GPU: Gigabyte Radeon RX Vega 56 Gaming OC @1501MHz (Samsung 14nm FinFET) / Keyboard: Logitech Desktop K120 (Nordic) / Motherboard: ASUS PRIME B460 PLUS, Socket-LGA1200 / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 850W / RAM A1, A2, B1 & B2: DDR4-2666MHz CL13-15-15-15-35-1T "Samsung 8Gbit C-Die" (4x8GB) / Operating System: Windows 10 Home / Sound: Zombee Z300 / Storage 1 & 2: Samsung 850 EVO 500GB SSD / Storage 3: Seagate® Barracuda 2TB HDD / Storage 4: Seagate® Desktop 2TB SSHD / Storage 5: Crucial P1 1000GB M.2 SSD/ Storage 6: Western Digital WD7500BPKX 2.5" HDD / Wi-fi: TP-Link TL-WN851N 11n Wireless Adapter (Qualcomm Atheros)

Zen-II-X6-3600+ (Gaming PC)

R23 score MC: 9893pts | R23 score SC: 1248pts @4.2GHz

R23 score MC: 10151pts | R23 score SC: 1287pts @4.3GHz

R20 score MC: 3688cb | R20 score SC: 489cb


Case: Medion Micro-ATX Case / Case Fan Front: SUNON MagLev PF70251VX-Q000-S99 70mm / Case Fan Rear: Fanner Tech(Shen Zhen)Co.,LTD. 80mm (Purple) / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: AMD Near-silent 125w Thermal Solution / CPU: AMD Ryzen 5 3600, 6-cores, 12-threads, 4.2/4.2GHz, 35MB cache (T.S.M.C. 7nm FinFET) / Display: HP 24" L2445w (64Hz OC) 1920x1200 / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / GPU: ASUS Radeon RX 6600 XT DUAL OC RDNA2 32CUs @2607MHz (T.S.M.C. 7nm FinFET) / Keyboard: HP KB-0316 PS/2 (Nordic) / Motherboard: ASRock B450M Pro4, Socket-AM4 / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 550W / RAM A2 & B2: DDR4-3600MHz CL16-18-8-19-37-1T "SK Hynix 8Gbit CJR" (2x16GB) / Operating System: Windows 10 Home / Sound 1: Zombee Z500 / Sound 2: Logitech Stereo Speakers S-150 / Storage 1 & 2: Samsung 850 EVO 500GB SSD / Storage 3: Western Digital My Passport 2.5" 2TB HDD / Storage 4: Western Digital Elements Desktop 2TB HDD / Storage 5: Kingston A2000 1TB M.2 NVME SSD / Wi-fi & Bluetooth: ASUS PCE-AC55BT Wireless Adapter (Intel)

Vishera-X8-9370 | R20 score MC: 1476cb


Case: Cooler Master HAF XB Evo Black / Case Fan(s) Front: Noctua NF-A14 ULN 140mm Premium Fans / Case Fan(s) Rear: Corsair Air Series AF120 Quiet Edition (red) / Case Fan(s) Side: Noctua NF-A6x25 FLX 60mm Premium Fan / Case Fan VRM: SUNON MagLev KDE1209PTV3 92mm / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: Cooler Master Hyper 212 Evo / CPU: AMD FX-8370 (Base: @4.4GHz | Turbo: @4.7GHz) Black Edition Eight-Core (Global Foundries 32nm) / Display: ASUS 24" LED VN247H (67Hz OC) 1920x1080p / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / GPU: Gigabyte Radeon RX Vega 56 Gaming OC @1501MHz (Samsung 14nm FinFET) / Keyboard: Logitech Desktop K120 (Nordic) / Motherboard: MSI 970 GAMING, Socket-AM3+ / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 850W PSU / RAM 1, 2, 3 & 4: Corsair Vengeance DDR3-1866MHz CL8-10-10-28-37-2T (4x4GB) 16.38GB / Operating System 1: Windows 10 Home / Sound: Zombee Z300 / Storage 1: Samsung 850 EVO 500GB SSD (x2) / Storage 2: Seagate® Barracuda 2TB HDD / Storage 3: Seagate® Desktop 2TB SSHD / Wi-fi: TP-Link TL-WN951N 11n Wireless Adapter

Godavari-X4-880K | R20 score MC: 810cb


Case: Medion Micro-ATX Case / Case Fan Front: SUNON MagLev PF70251VX-Q000-S99 70mm / Case Fan Rear: Fanner Tech(Shen Zhen)Co.,LTD. 80mm (Purple) / Controller: Sony Dualshock 4 Wireless (DS4Windows) / Cooler: AMD Near-silent 95w Thermal Solution / Cooler: AMD Near-silent 125w Thermal Solution / CPU: AMD Athlon X4 860K Black Edition Elite Quad-Core (T.S.M.C. 28nm) / CPU: AMD Athlon X4 880K Black Edition Elite Quad-Core (T.S.M.C. 28nm) / Display: HP 19" Flat Panel L1940 (75Hz) 1280x1024 / GPU: EVGA GeForce GTX 960 SuperSC 2GB (T.S.M.C. 28nm) / GPU: MSI GeForce GTX 970 4GD5 OC "Afterburner" @1450MHz (T.S.M.C. 28nm) / Keyboard: HP KB-0316 PS/2 (Nordic) / Motherboard: MSI A78M-E45 V2, Socket-FM2+ / Mouse: Razer Abyssus 2014 / PCI-E: ASRock USB 3.1/A+C (PCI Express x4) / PSU: EVGA SuperNOVA G2, 550W PSU / RAM 1, 2, 3 & 4: SK hynix DDR3-1866MHz CL9-10-11-27-40 (4x4GB) 16.38GB / Operating System 1: Ubuntu Gnome 16.04 LTS (Xenial Xerus) / Operating System 2: Windows 10 Home / Sound 1: Zombee Z500 / Sound 2: Logitech Stereo Speakers S-150 / Storage 1: Samsung 850 EVO 500GB SSD (x2) / Storage 2: Western Digital My Passport 2.5" 2TB HDD / Storage 3: Western Digital Elements Desktop 2TB HDD / Wi-fi: TP-Link TL-WN851N 11n Wireless Adapter

Acer Aspire 7738G custom (changed CPU, GPU & Storage)

CPU: Intel Core 2 Duo P8600, 2-cores, 2-threads, 2.4GHz, 3MB cache (Intel 45nm) / GPU: ATi Radeon HD 4570 515MB DDR2 (T.S.M.C. 55nm) / RAM: DDR2-1066MHz CL7-7-7-20-1T (2x2GB) / Operating System: Windows 10 Home / Storage: Crucial BX500 480GB 3D NAND SATA 2.5" SSD

Complete portable device SoC history:

Apple A4 - Apple iPod touch (4th generation)
Apple A5 - Apple iPod touch (5th generation)
Apple A9 - Apple iPhone 6s Plus
HiSilicon Kirin 810 (T.S.M.C. 7nm) - Huawei P40 Lite / Huawei nova 7i
Mediatek MT2601 (T.S.M.C 28nm) - TicWatch E
Mediatek MT6580 (T.S.M.C 28nm) - TECNO Spark 2 (1GB RAM)
Mediatek MT6592M (T.S.M.C 28nm) - my|phone my32 (orange)
Mediatek MT6592M (T.S.M.C 28nm) - my|phone my32 (yellow)
Mediatek MT6735 (T.S.M.C 28nm) - HMD Nokia 3 Dual SIM
Mediatek MT6737 (T.S.M.C 28nm) - Cherry Mobile Flare S6
Mediatek MT6739 (T.S.M.C 28nm) - my|phone myX8 (blue)
Mediatek MT6739 (T.S.M.C 28nm) - my|phone myX8 (gold)
Mediatek MT6750 (T.S.M.C 28nm) - honor 6C Pro / honor V9 Play
Mediatek MT6765 (T.S.M.C 12nm) - TECNO Pouvoir 3 Plus
Mediatek MT6797D (T.S.M.C 20nm) - my|phone Brown Tab 1
Qualcomm MSM8926 (T.S.M.C. 28nm) - Microsoft Lumia 640 LTE
Qualcomm MSM8974AA (T.S.M.C. 28nm) - Blackberry Passport
Qualcomm SDM710 (Samsung 10nm) - Oppo Realme 3 Pro

 


2 hours ago, Prysin said:

Yet your post proves nothing, because it does not prove that they were using a low-end CPU. That means it could boil down to workload. Was there high tessellation in the picture? Were there other effects that AMD emulates rather than does in hardware? Were there a lot of alpha-channel effects? Transparent objects? These are some of the things AMD loses FPS over. And I know that you do not have a clue WHY AMD scores lower. And neither do I.

But in the end, posting a benchmark and saying "this is a driver issue" doesn't work. Because to prove a driver is at fault you MUST eliminate everything else. EVERYTHING else. From workload, to GPU architecture, to CPU and CPU workload and architecture, memory bandwidth, memory capacity, AND THE DAMN GAME.

Because you posted from a GTA V test, which is a title known to favor Nvidia, though not necessarily built for them. That means the workload it exerts on the GPUs is probably more in line with the things Nvidia is strong at doing. Which is, you know, an important factor here.

So in the end, what you posted proves little to nothing, because there are too many variables. And you have dissected NONE of these issues. ZERO of them.

You cannot conclude anything without eliminating everything else.

On the flip side...

ATM I am working to dissect the PClab.pl tests for The Witcher 3: first using an FX-8320, then my i7-4790K. And I've got an 8GB 2400 MHz DDR3 kit which I can freely turn up and down in frequency, meaning I can test memory bottlenecking too. I can test Crossfire issues with my R9 295X2, and soon I'll get a GTX 950. Whilst FAR from as strong as an R9 295X2, it will let me compare CPU driver load with ease and compare frametime issues.

I don't even know why I'm still discussing this with you. Everybody except you, don svetilio, ivan and Notional knows AMD has significantly higher driver overhead. It's been proven time and time again by pclab, Digital Foundry, GameGPU and even 3DMark's overhead test, and yet you people still refuse to believe it. I've experienced it firsthand, which made me regret my 290X purchase. I wasn't aware of the issue, so now I'm making people aware so that they don't make the same mistake. Why you're defending AMD and trying to cover up the issue is beyond me. What's in it for you?



12 minutes ago, Monarch said:

I don't even know why I'm still discussing this with you. Everybody except you, don svetilio, ivan and Notional knows AMD has significantly higher driver overhead. It's been proven time and time again by pclab, Digital Foundry, GameGPU and even 3DMark's overhead test, and yet you people still refuse to believe it. I've experienced it firsthand, which made me regret my 290X purchase. I wasn't aware of the issue, so now I'm making people aware so that they don't make the same mistake. Why you're defending AMD and trying to cover up the issue is beyond me. What's in it for you?

The driver overhead is an issue, but it is blown way out of fucking proportion. If you're expecting a 290X to work 100% fine on an i5, then don't get it. You get AMD cards if you know what you're in for. That's pretty much the exact reason why my brother's gonna upgrade from a 7850 to a 390: I already know that driver overhead isn't gonna be an issue on an i7 2600. And either way, if you're hitting CPU limits on an i5-4670K, it's probably not even AMD's fault at that.



17 minutes ago, Monarch said:

I don't even know why I'm still discussing this with you. Everybody except you, don svetilio, ivan and Notional knows AMD has significantly higher driver overhead. It's been proven time and time again by pclab, Digital Foundry, GameGPU and even 3DMark's overhead test, and yet you people still refuse to believe it. I've experienced it firsthand, which made me regret my 290X purchase. I wasn't aware of the issue, so now I'm making people aware so that they don't make the same mistake. Why you're defending AMD and trying to cover up the issue is beyond me. What's in it for you?

OK, I do not know why I am bothering to reply to your idiocy, but here goes.

At no point do I dispute that AMD driver overhead IS an issue. If you bother to check my post history, you would see that I openly admit it BEING an issue. However, since your reading comprehension is less than sufficient to read non-native English with imperfect grammar, I can understand that you are incapable of reading what I wrote.

So I shall make it easier for you to understand.

Your benchmark, what you posted, PROVES NOTHING. Because it does not specify what it is testing. It is simply a graph with numbers on it, showing Nvidia ahead most of the time. A slide with no context is but a slide. It has no value or merit.


31 minutes ago, Dan Castellaneta said:

The driver overhead is an issue, but it is blown way out of fucking proportion. If you're expecting a 290X to work 100% fine on an i5, then don't get it. You get AMD cards if you know what you're in for. That's pretty much the exact reason why my brother's gonna upgrade from a 7850 to a 390: I already know that driver overhead isn't gonna be an issue on an i7 2600. And either way, if you're hitting CPU limits on an i5-4670K, it's probably not even AMD's fault at that.

Yeah, it's blown way out of proportion. That's why the 390 drops below 40 fps while the 970 maintains 60 fps in this video:

Here's another example:

https://youtu.be/frNjT5R5XI4?t=7m54s

It's obvious that the performance impact in draw-call-heavy scenarios is huge.

 



43 minutes ago, Prysin said:

OK, I do not know why I am bothering to reply to your idiocy, but here goes.

At no point do I dispute that AMD driver overhead IS an issue. If you bother to check my post history, you would see that I openly admit it BEING an issue. However, since your reading comprehension is less than sufficient to read non-native English with imperfect grammar, I can understand that you are incapable of reading what I wrote.

So I shall make it easier for you to understand.

Your benchmark, what you posted, PROVES NOTHING. Because it does not specify what it is testing. It is simply a graph with numbers on it, showing Nvidia ahead most of the time. A slide with no context is but a slide. It has no value or merit.

 

It doesn't matter what's being tested. If the R9 390 can beat the GTX 970 only with the 6700K, and not with the i5, then it's obvious that the GPU is bottlenecked with the i5 because of the high overhead, and that it needs a fast, expensive CPU in order not to get bottlenecked. But the GTX 970 isn't bottlenecked with the i5 as much as the 390 is. That means driver overhead has a big impact, and AMD GPUs should not be recommended with slower CPUs for CPU-bound games.
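The shape of that argument can be put in a toy frame-time model. This is entirely my own illustration with invented millisecond numbers, not data from any benchmark: the frame takes as long as the slower of the GPU work and the CPU-side work (game plus driver).

```python
# Toy model of a CPU vs. GPU bottleneck. All numbers are invented for
# illustration; they are not measurements of any real card or driver.
def fps(gpu_ms: float, cpu_ms: float, driver_ms: float) -> float:
    return 1000.0 / max(gpu_ms, cpu_ms + driver_ms)

# "390-like" card: faster GPU (12 ms) but heavier driver (6 ms).
# "970-like" card: slower GPU (14 ms) but lighter driver (2 ms).
fast_cpu, slow_cpu = 6.0, 14.0  # e.g. OC'd i7 vs. i5 in a draw-call-heavy scene

print(fps(12, fast_cpu, 6), fps(14, fast_cpu, 2))  # fast CPU: the faster GPU wins
print(fps(12, slow_cpu, 6), fps(14, slow_cpu, 2))  # slow CPU: the lighter driver wins
```

In the model, once the CPU side dominates, the card with less driver work comes out ahead even though its GPU is nominally slower, which is the pattern being claimed for the i5 results.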



46 minutes ago, Dan Castellaneta said:

The driver overhead is an issue, but it is blown way out of fucking proportion. If you're expecting a 290X to work 100% fine on an i5, then don't get it. You get AMD cards if you know what you're in for. That's pretty much the exact reason why my brother's gonna upgrade from a 7850 to a 390: I already know that driver overhead isn't gonna be an issue on an i7 2600. And either way, if you're hitting CPU limits on an i5-4670K, it's probably not even AMD's fault at that.

Don't hate me.

If anyone asks you never saw me.


25 minutes ago, Monarch said:

-snip-

 

Fallout 4 isn't a good example; people with 980 Tis have been getting frame drops into the teens with that game. It's Bethesda's shitty engine from 2002 that causes all of those issues.

Pixelbook Go i5 Pixel 4 XL


1 minute ago, Citadelen said:

Fallout 4 isn't a good example; people with 980 Tis have been getting frame drops into the teens with that game. It's Bethesda's shitty engine from 2002 that causes all of those issues.

https://youtu.be/frNjT5R5XI4?t=7m54s

There you go: RoTR, a draw-call-heavy area.



1 minute ago, Citadelen said:

Fallout 4 isn't a good example; people with 980 Tis have been getting frame drops into the teens with that game. It's Bethesda's shitty engine from 2002 that causes all of those issues.

WHAT!?! xD That's a lie, or you're talking about 4K. I run the game at 1440p, and mine is HEAVILY modded for textures, and I never dip below 40 fps. The only way I could see a 980 Ti dipping into the teens is with an i3 or a 6300, and then maybe.



5 minutes ago, App4that said:

WHAT!?! xD That's a lie, or you're talking about 4K. I run the game at 1440p, and mine is HEAVILY modded for textures, and I never dip below 40 fps. The only way I could see a 980 Ti dipping into the teens is with an i3 or a 6300, and then maybe.

 

 



2 minutes ago, Citadelen said:

 

 

Yes, I've seen the guy with more GPU than skill. Set the shadow distance to 10000, but that takes skill.



Just now, App4that said:

Yes, I've seen the guy with more GPU than skill. Set the shadow distance to 10000, but that takes skill.

It still affects Nvidia GPUs, though, and seemingly at random.



13 minutes ago, Monarch said:

 

It doesn't matter what's being tested. If the R9 390 can beat the GTX 970 only with the 6700K, and not with the i5, then it's obvious that the GPU is bottlenecked with the i5 because of the high overhead, and that it needs a fast, expensive CPU in order not to get bottlenecked. But the GTX 970 isn't bottlenecked with the i5 as much as the 390 is. That means driver overhead has a big impact, and AMD GPUs should not be recommended with slower CPUs for CPU-bound games.

Nice to see that day one releases are still accurate.

obvious sarcasm

Look, I'm not defending the R9 390, I'm just saying that most of this BS that goes against it isn't fair towards the card.

33 minutes ago, App4that said:

Don't hate me.

Honestly, it looks like a partial mix of architectural improvements and RAM frequencies. 3200 MHz DDR4 easily dwarfs 2133 MHz DDR3, so that's a net benefit. However, if we did 2133 DDR3/i5 2500K vs. 2133 DDR4/i5 6500, I think we'd see a more comparable match.



2 minutes ago, Citadelen said:

It still affects Nvidia GPUs, though, and seemingly at random.

Again, skill.

I've run the game using a 290, a 390, Crossfire, and a 980 Ti. The Crossfire setup and the 980 Ti are equal in power, but the 980 Ti has the advantage in draw calls. I run my shadow distance at 18000, but that takes a bit of work to avoid dips. Which goes back to skill. Fallout isn't for console owners; it's for people who spend more time modding than playing.

You wouldn't understand.
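For reference, the shadow-distance tweak being described lives in the game's ini files rather than the in-game menu. A minimal sketch (the 18000 figure is the poster's; the key names should be double-checked against your own Fallout4Prefs.ini, and back the file up before editing):

```ini
; Fallout4Prefs.ini -- [Display] section (sketch; verify key names in
; your own install before editing)
[Display]
fShadowDistance=18000.0
fDirShadowDistance=18000.0
```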



1 minute ago, Citadelen said:

It still affects Nvidia GPUs, though, and seemingly at random.

Because DX11 is an old, outdated, high-overhead API. Both AMD and Nvidia cards suffer in situations where neither has high enough draw-call throughput: situations like Arkham Knight and Assassin's Creed Unity.

 

1 minute ago, Dan Castellaneta said:

Nice to see that day one releases are still accurate.

obvious sarcasm

Look, I'm not defending the R9 390, I'm just saying that most of this BS that goes against it isn't fair towards the card.

Honestly, it looks like it's a partial mix of improvements in the architecture and RAM frequencies. 3200MHz DDR4 dwarfs 2133 DDR3 easily, so that's a net benefit. However, if we did 2133 DDR3/i5 2500K vs. 2133 DDR4/i5 6500, I think we'd see a more comparable match.

Are you kidding me? What does RAM have to do with this? They used the exact same system, except they switched the graphics cards. And about the day-one-release argument, I consider it invalid unless you provide some evidence, because from what Mahigan said, AMD can't fix the driver overhead issue.



Just now, Monarch said:

Are you kidding me? What does RAM have to do with this? They used the exact same system, except they switched the graphics cards. And about the day-one-release argument, I consider it invalid unless you provide some evidence, because from what Mahigan said, AMD can't fix the driver overhead issue.

I was replying to @App4that about RAM.

I know driver overhead isn't something that can easily be fixed, but day-one releases aren't really fair on either side. Especially with a game like Grand Theft Auto V: I used to get ~55 FPS at 720p high with an FX-4100/7850; now it takes an i7 2600 to do the same.



Just now, Dan Castellaneta said:

I was replying to @App4that about RAM.

I know driver overhead isn't something that can easily be fixed, but day-one releases aren't really fair on either side. Especially with a game like Grand Theft Auto V: I used to get ~55 FPS at 720p high with an FX-4100/7850; now it takes an i7 2600 to do the same.

Word, this pisses me off, and it's why I hate unfinished games making it to market.



Just now, App4that said:

-snip-

Oh, I don't contest that the Nvidia drivers have a draw-call advantage, but the person in that video was getting drops to 17 FPS on a 980 Ti. Also, it shouldn't be up to the players to dive into the .ini files to edit shadow distance; the game should have been built for DX11 at least, if not DX12. I'm intrigued as to how you ran Fallout 4 using Crossfire, seeing as I don't think there was a Crossfire profile released for the game before you switched to the 980 Ti.


