GTX 970 vs R9 390

King Oriax

Let's see how much VRAM Shadow of Mordor needs at 5K.

 

In this game at this resolution you aren't even hitting 5GB of VRAM, so why on Earth you'd choose a card based on it having 8GB is an absolute mystery.

But anyway, the 390 is a better card than the 970 and the extra VRAM will help in the future; strap in another 390 and you have a 4K-capable rig that will crush 970 SLI.


But anyway, the 390 is a better card than the 970 and the extra VRAM will help in the future; strap in another 390 and you have a 4K-capable rig that will crush 970 SLI.

Sadly, the 390 is only a 290 overclocked to its limit with 4GB more VRAM. It's not at all "a better card", and unfortunately, power-consumption-wise you could run 4-way SLI 970s in place of 2-way CrossFire 390s.

 

If you are planning to play at 4K you wouldn't buy SLI/CrossFire 970s/390s; you would go for something like a Fury X/980 Ti/Titan X.

CPU: Ryzen 7 5800x3D | MoBo: MSI MAG B550 Tomahawk | RAM: G.Skill F4-3600C15D-16GTZ @3800CL16 | GPU: RTX 2080Ti | PSU: Corsair HX1200 | 

Case: Lian Li 011D XL | Storage: Samsung 970 EVO M.2 NVMe 500GB, Crucial MX500 500GB | Soundcard: Soundblaster ZXR | Mouse: Razer Viper Mini | Keyboard: Razer Huntsman TE | Monitor: DELL AW2521H @360Hz |

 


But anyway, the 390 is a better card than the 970 and the extra VRAM will help in the future; strap in another 390 and you have a 4K-capable rig that will crush 970 SLI.

 

It's not, it's just overclocked. You can overclock the 970s to match 980 SLI performance. Good luck doing that on a card that's already overclocked as much as the 390 is out of the box.

 

I fail to see you needing that VRAM in the future either. If Shadow of Mordor, of all games, can only just fill 4GB at 5K, then 8GB on any card in this tier is about as useful as 4GB on a 740. The resolution you would have to be running to make use of it just isn't happening on any of these GPUs.


The 390 by far. I hate to talk about futureproofing, but those 3.5 gigs of VRAM will be very outdated soon; I'm already feeling it with my 780 Tis. The other thing is that while nVidia has better support, they tend to drop it fast when they release a new card... like how my 780 Tis got obsoleted in drivers less than a year after they came out by the lower-end 970, because nVidia wanted people to buy that instead of getting a 780 Ti off eBay.

The way AMD is adding that framerate cap feature everyone is going nuts over to the 200 series...nVidia would NEVER do that.

You mean how Fermi has had frame limiting for years? Right ;)


It's not, it's just overclocked. You can overclock the 970s to match 980 SLI performance. Good luck doing that on a card that's already overclocked as much as the 390 is out of the box.

 

I fail to see you needing that VRAM in the future either. If Shadow of Mordor, of all games, can only just fill 4GB at 5K, then it's about as useful as 4GB on a 740.

A single 970 can overclock to single-980 performance, yes.

SLI 970s can overclock to SLI 980 performance, yes.

AND a 980 beats a 390 at 4K, YES.

 

 

You mean how Fermi has had frame limiting for years? Right  ;)

 

rumors.


A single 970 can overclock to single-980 performance, yes.

SLI 970s can overclock to SLI 980 performance, yes.

 

lol that's what I meant, yes. Not that a single 970 can overclock to match 980 SLI, that would be worth shouting from the rooftops about lmao


lol that's what I meant, yes. Not that a single 970 can overclock to match 980 SLI, that would be worth shouting from the rooftops about lmao

Yeah, I just wrote it out to make it clearer for everyone.


...
 

rumors.

 

Rumors? lol. I've been limiting framerates for years on both my GT 555M as well as on many friends' laptops and desktops with Fermi cards. So no, those are not rumors, those are facts.


Rumors? lol. I've been limiting framerates for years on both my GT 555M as well as on many friends' laptops and desktops with Fermi cards. So no, those are not rumors, those are facts.

rumors ^^


rumors ^^

 

Ah, I didn't read the inflexion of your sentence so I was under the impression you were going all gung ho on that :)


Ah, I didn't read the inflexion of your sentence so I was under the impression you were going all gung ho on that :)

internet side effect 


Rumors? lol. I've been limiting framerates for years on both my GT 555M as well as on many friends' laptops and desktops with Fermi cards. So no, those are not rumors, those are facts.

You get the point I was making. nVidia never goes back to add features to a soon-to-be-discontinued card. AMD didn't have to add it to the 200 series, but they did.

Galax/Sapphire fanboy for life!

Hall Of Fame ♕ Owner's Club

Always supporting Lyoto "The Dragon" Machida!


You get the point I was making. nVidia never goes back to add features to a soon-to-be-discontinued card. AMD didn't have to add it to the 200 series, but they did.

It's because it wouldn't work without adding it to the old cards, since the new cards are the old cards. Got it?


You get the point I was making. nVidia never goes back to add features to a soon-to-be-discontinued card. AMD didn't have to add it to the 200 series, but they did.

 

While I can agree with your point, nVidia tends to add new features to older cards well before they're discontinued, and has done so over several generations. So it doesn't come to the point of, say, Kepler being discontinued within a couple of months and missing a new feature because of that, mainly because said feature would have been added earlier.

 

And if we're being entirely fair, AMD artificially locked VSR out of the HD 7000 series due to a supposed hardware limitation; something like an HD 7970 would only have gotten half the VSR multiplier factors, but it could very well have allowed a more limited VSR, which is still VSR all the same, and that bothered quite a few people with HD 7000 cards. Also, I found it baffling that AMD made a big deal of the Rx 300 series having VSR when the 200 series already had its full implementation.


It's not like the power bill will be $100 bigger. I'm not sure why everyone's making a big deal out of it. If you're futureproofing, the 390 is the obvious choice.

And with the state of PC optimization, those 8GB of VRAM will be very useful very soon.


It's because it wouldn't work without adding it to the old cards, since the new cards are the old cards. Got it?

Yes, everyone gets that it's a rebrand. They still didn't have to add it to the 200 series, though. They could have held it over consumers as another reason to get a 390X instead of going to eBay for a 290X.

While I can agree with your point, nVidia tends to add new features to older cards well before they're discontinued, and has done so over several generations. So it doesn't come to the point of, say, Kepler being discontinued within a couple of months and missing a new feature because of that, mainly because said feature would have been added earlier.

 

And if we're being entirely fair, AMD artificially locked VSR out of the HD 7000 series due to a supposed hardware limitation; something like an HD 7970 would only have gotten half the VSR multiplier factors, but it could very well have allowed a more limited VSR, which is still VSR all the same, and that bothered quite a few people with HD 7000 cards. Also, I found it baffling that AMD made a big deal of the Rx 300 series having VSR when the 200 series already had its full implementation.

Like I said, nVidia offers better support on current cards but drops it extremely fast. My 780 Ti is a better card than Bill's 970. His will beat mine in new games, though, because nVidia stopped supporting mine way too soon after it came out.


It's not like the power bill will be $100 bigger. I'm not sure why everyone's making a big deal out of it. If you're futureproofing, the 390 is the obvious choice.

And with the state of PC optimization, those 8GB of VRAM will be very useful very soon.

Calculate it for me: I'm playing 12 hours a day. What do 100 watts more while playing cost me per day and per year? Guess what, I'LL DO IT FOR YOU.

At 0,28 € per kWh, 12 hours a day works out to:

40 W extra: 0,134 € per day
60 W extra: 0,202 € per day
120 W extra: 0,40 € per day
200 W extra: 0,67 € per day

So in one day 100 watts of extra usage costs me about 0,34 €. That is roughly 123 € in one year, just in additional power costs compared to a GTX 970.

Even if you only play 2 hours a day, it's above 20 euros more in one year.
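
For anyone who wants to double-check that arithmetic, here is a minimal sketch in Python. It assumes the flat 0,28 €/kWh rate and the constant 100 W difference quoted above; the function name and the rounding are just for illustration.

# Sketch of the extra-power-cost arithmetic above (assumes a flat tariff and a constant extra draw).
def extra_cost(extra_watts, hours_per_day, price_per_kwh=0.28, days=365):
    """Return (daily, yearly) extra cost in euros."""
    daily = extra_watts / 1000 * hours_per_day * price_per_kwh
    return daily, daily * days

daily, yearly = extra_cost(extra_watts=100, hours_per_day=12)
print(f"{daily:.3f} EUR/day, {yearly:.2f} EUR/year")   # ~0.336 EUR/day, ~122.64 EUR/year

_, yearly_2h = extra_cost(extra_watts=100, hours_per_day=2)
print(f"{yearly_2h:.2f} EUR/year at 2 h/day")          # ~20.44 EUR/year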


Like I said, nVidia offers better support on current cards but drops it extremely fast. My 780 Ti is a better card than Bill's 970. His will beat mine in new games, though, because nVidia stopped supporting mine way too soon after it came out.

 

I'll agree with that, especially since Fermi stayed supported alongside Kepler for longer than Kepler has alongside Maxwell, and without any actual valid reason, in all honesty. Your 780 Ti should at the very least be as good as a 970, and it really ought to outperform it in most usage. Which driver version are you using on your 780 Ti?


 

4K doesn't benefit from the difference, so what makes you think 1440p would? o.0

 

Also, does no one use a 780 Ti anymore? Or did we just start pretending that card never existed the moment we heard about VRAM-gate?

 

 
 
Overclocked 970 > overclocked 290

 

You can still overclock a 390. So... your argument is moot.


Calculate it for me: I'm playing 12 hours a day. What do 100 watts more while playing cost me per day and per year? Guess what, I'LL DO IT FOR YOU.

40 W extra: 0,134 € per day
60 W extra: 0,202 € per day
120 W extra: 0,40 € per day
200 W extra: 0,67 € per day

So in one day 100 watts of extra usage costs me about 0,34 €. That is roughly 123 € in one year, just in additional power costs compared to a GTX 970.

Even if you only play 2 hours a day, it's above 20 euros more in one year.

And how much do you make per hour?

Before you start whining about cost, START WHINING ABOUT IT RELATIVE TO WHAT YOU MAKE.

 

If you make 1 million euro a year and whine about a GPU raising your power bill by 100 €, then you deserve a smack on the bum for being such an arrogant and greedy bastard that you complain about it in the first place. If, however, you make 15,000 € a year, then sure, 100 € will be noticed, a lot. But at that point, why are you sitting inside for 12 hours a day instead of getting an education, a job and a bit more money?

 

Ya know. Context makes the difference here.


You can still overclock a 390. So... your argument is moot.

It's not; you can't overclock a 390 that well, the chip is already at its limit.

For example: http://www.gamersnexus.net/guides/1986-amd-r9-390-r9-380-overclocking-benchmark


And how much do you make per hour?

Before you start whining about cost, START WHINING ABOUT IT RELATIVE TO WHAT YOU MAKE.

 

If you make 1 million euro a year and whine about a GPU raising your power bill by 100 €, then you deserve a smack on the bum for being such an arrogant and greedy bastard that you complain about it in the first place. If, however, you make 15,000 € a year, then sure, 100 € will be noticed, a lot. But at that point, why are you sitting inside for 12 hours a day instead of getting an education, a job and a bit more money?

 

Ya know. Context makes the difference here.

If I made 1 million euro a year I wouldn't be considering a 390 either. Ah, and by the way, I'm retired; I haven't had to work since 2008, and I turned 30 this January... so, ehm, yeah, I have plenty of time to play ;P

 

EDIT: The point is clear: the card keeps costing money while you use it and ends up more expensive than a GTX 970 in the long run.


I'll agree with that, especially since Fermi stayed supported alongside Kepler for longer than Kepler has alongside Maxwell, and without any actual valid reason, in all honesty. Your 780 Ti should at the very least be as good as a 970, and it really ought to outperform it in most usage. Which driver version are you using on your 780 Ti?

347.52, which is fairly recent. They still add drivers for big titles every once in a while. It does beat the 970 in most games, but it should beat it in pretty much every game, tbh. It destroyed my 290X when it came out; compare them now in every recent title and the 290X looks like it's in a whole different class.


347.52, which is fairly recent. They still add drivers for big titles every once in a while. It does beat the 970 in most games, but it should beat it in pretty much every game, tbh. It destroyed my 290X when it came out; compare them now in every recent title and the 290X looks like it's in a whole different class.

 

If I'm not mistaken, there seems to be some Kepler performance regression on 350.xx releases so it might be worth looking out for that in the event you do end up updating.

 

Overall, I'd say the 780 Ti/290X performance shift came from both a surprisingly significant driver improvement on the 290X's side and a potentially artificial, driver-imposed cap on the 780 Ti. Just think of how the 780 Ti and the 980 initially had almost the same performance, yet the gap grew larger over time, which could also be down to driver optimizations specifically for Maxwell. Either way, it still bothers me that there have been such large performance shifts between architectures lately.


It's not; you can't overclock a 390 that well, the chip is already at its limit.

For example: http://www.gamersnexus.net/guides/1986-amd-r9-390-r9-380-overclocking-benchmark

Silicon lottery.

 

They got a bad or average sample, presented it as the basis for "all the cards", then went back to business as usual. Sure, I don't intend to say it will reach a 50% overclock, but don't ever take ONE test as the final verdict on a card. I've heard of 970s not overclocking anywhere NEAR where other people get their 970s, even though we all know the 970 overclocks pretty well. So look at the bigger picture.

 

The 390 is around 7-10% better than the GTX 970 when both are at stock settings.

Now, we know that well-overclocked 970s CAN reach slightly above stock 980 performance. And YOUR OWN TEST showed that, on several occasions, the 390 also performed SLIGHTLY above stock 980 performance when overclocked...

 

Meaning, yes, when overclocked they level out and end up pretty similar. But then there is the elephant in the room:

 

NOT EVERYONE IS COMFORTABLE WITH OVERCLOCKING, AND YOU SHOULD RESPECT THAT before rubbing it in people's faces that "if you wring the last bit of juice out of this, you will get better performance." It may still not be the best solution for certain people.

 

And thus, you should present the best option as the one that is faster OUT OF THE BOX, while recommending the other options as "the ones you can make better yourself by tinkering".

