A proper test of the 4GB vs 2GB GTX 960 from gamersnexus

SteveGrabowski

Because the 290 is at the end of its life and so won't be available for long. The 960, however, has only just been launched so has a good 17 months left.

That's really poor reasoning.

 

The R9 290 is a lot stronger than a 960. New or old, give me the more powerful card. It's also not like you're buying a used 290 at that price; it's a brand-new Gigabyte Windforce R9 290.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


I don't care about SLI/CF setups. I have a case with airflow. I can change the lightbulbs in my house first if electricity is costing me too much. I can buy aftermarket 290s with great cooling. I care about performance first and foremost. I don't understand your electricity cost argument for yourself when you're running a 4.7GHz extreme edition CPU. If cost of electricity really matters why aren't you running an 84W Haswell i5 locked?

 

Right, so because you don't care about how this affects SLI/CF, AMD and Nvidia should innovate only according to your whims? I couldn't play Tomb Raider this summer because my old GPUs were throttling each other due to ambient temperature that they were in turn exacerbating. It's a serious problem, and a big reason that 290Xs in CF were off the table for me when I upgraded.

 

In this post you have conflated power consumption, TDP and temperature as being all the same thing. You also don't seem to get that you're talking about using a performance-oriented GPU from years ago as a low-to-mid-range GPU now. Go on, stick a GTX 580 in an HP Pavilion to give it an extra lease of life. Never mind that it's got a cheap-arse 300W PSU in it. These are meant to be budget products, not performance products. Power consumption, especially in budget markets, is incredibly important. I shouldn't have to point out to you that my Extreme Edition CPU is not a budget product. It's a completely different section of the market with drastically different requirements. Which is precisely why using this CPU in five years' time as a budget mid-range component would be completely wrong.

 

Besides, it isn't running at 4.7GHz and 1.46V all the time. Most of the time it downclocks itself to 1.2GHz and 0.8V, because I really don't need that level of performance all of the time.

 

 

That's really poor reasoning.

 

The R9 290 is a lot stronger than a 960. New or old, give me the more powerful card. It's also not like you're buying a used 290 at that price; it's a brand-new Gigabyte Windforce R9 290.

 
OK, in three months' time, when you literally cannot get a 290 any more and the 960 is still £150, do tell me how exactly I'm supposed to go out and get a brand new 290.
 
It's not poor reasoning; it's exactly why the 960 is priced the way it is. It isn't competing with the 290, for this reason.

Back on topic: I'm impressed the 128-bit bus isn't killing performance of the 4GB card like all the doomsayers were telling me in the other thread when I inquired how the 4GB version performed. Just because 4GB may have done nothing in the 600 and 700 series wasn't a good reason to jump to the conclusion that it wouldn't matter in the 900 series, because this test shows it does matter in some games that can use more than 2GB. I still wish they had tested Shadow of Mordor, though, since that's one of the games the R9 280 seems to outperform the 2GB 960 in. I would have loved to see how that handled the 4GB.


Alright guys, enough.  Othertom clearly has no idea what he is talking about, just ignore him.

"I genuinely dislike the promulgation of false information, especially to people who are asking for help selecting new parts."


Right, so because you don't care about how this affects SLI/CF, AMD and Nvidia should innovate only according to your whims? I couldn't play Tomb Raider this summer because my old GPUs were throttling each other due to ambient temperature that they were in turn exacerbating. It's a serious problem, and a big reason that 290Xs in CF were off the table for me when I upgraded.

 

In this post you have conflated power consumption, TDP and temperature as being all the same thing. You also don't seem to get that you're talking about using a performance-oriented GPU from years ago as a low-to-mid-range GPU now. Go on, stick a GTX 580 in an HP Pavilion to give it an extra lease of life. Never mind that it's got a cheap-arse 300W PSU in it. These are meant to be budget products, not performance products. Power consumption, especially in budget markets, is incredibly important. I shouldn't have to point out to you that my Extreme Edition CPU is not a budget product. It's a completely different section of the market with drastically different requirements. Which is precisely why using this CPU in five years' time as a budget mid-range component would be completely wrong.

 

Besides, it isn't running at 4.7GHz and 1.46V all the time. Most of the time it downclocks itself to 1.2GHz and 0.8V, because I really don't need that level of performance all of the time.

 

Stop putting words in my mouth, son. You really are tiresome to argue with when you say crap like "AMD and Nvidia should design according to my every whim."


This is nice to see. I plan to use EVGA Step-Up to go to a 4GB card.


So does anyone have thoughts on this card? Would you buy a 4GB 960 after seeing that it can make a big difference in ACU and BF Hardline at 1080p? I personally would pay the extra $30 if I was looking at a 960.


Stop putting words in my mouth, son. You really are tiresome to argue with when you say crap like "AMD and Nvidia should design according to my every whim."

 

I'm not putting words in your mouth; that was a point that you raised: I don't care about SLI/CF performance, therefore neither should AMD/Nvidia. Well, good for you. I like my gaming habits to be untied from the weather, personally. The performance that comes with multi-GPU setups shouldn't be crippled by TDP; it's important.

 

 

Alright guys, enough.  Othertom clearly has no idea what he is talking about, just ignore him.

 
You planning to get a brand new 290 for Christmas or something? Because I hate to break it to you, but tech does become obsolete and stop being sold. You're having trouble with one of the very core issues at the heart of technology: stuff gets old and gets replaced. If your reaction to such a basic, obvious statement is that I have no idea what I'm talking about, you're in for a seriously nasty surprise with this industry.
 
Nice ad hominem though. Real mature.

So does anyone have thoughts on this card? Would you buy a 4GB 960 after seeing that it can make a big difference in ACU and BF Hardline at 1080p? I personally would pay the extra $30 if I was looking at a 960.

 

More than likely, if I was specifically in the market for a 960. I used a 4GB 760 for a while, and even back then I was using more than 2GB, and not just in Skyrim. That being said, one of my 290s can easily outperform it by 50-75% with only a bit more power draw (because an overclocked 760 drinks power), and even though a 960 is more efficient than a 760, here in Canada a 4GB 960 is at least 300 dollars, which is a serious gouge for little to no performance gain over a 760. I just don't see the reason for buying a 960, beyond small form factor and less fan noise under heavy load.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


 

I'm not putting words in your mouth; that was a point that you raised: I don't care about SLI/CF performance, therefore neither should AMD/Nvidia. Well, good for you. I like my gaming habits to be untied from the weather, personally. The performance that comes with multi-GPU setups shouldn't be crippled by TDP; it's important.

 

I never said they shouldn't care; stop putting words in my mouth. I said I don't care. And Nvidia doesn't care about power consumption because of SLI, since that's a very small fraction of their customers; Nvidia cares about power consumption because they're building architectures they'd like to work well in mobile devices, just like Intel is doing. But I don't care, because I'm not running a mobile device. I know it's the trend and everyone is headed that way because of mobile, but it doesn't matter to me for a reasonable mid-tower single-GPU setup.

 

I bought my 970 not because of power savings or heat output, but because Nvidia offered Far Cry 4 and roughly 10% more performance than AMD did with the R9 290, which I found to be worth more than the $60 difference in price at the time (I put the performance at about $30 and the game at $60, since I was buying it either way, while the 290 only came with Civ: BE, which I didn't want).


I never said they shouldn't care; stop putting words in my mouth. I said I don't care. And Nvidia doesn't care about power consumption because of SLI, since that's a very small fraction of their customers; Nvidia cares about power consumption because they're building architectures they'd like to work well in mobile devices, just like Intel is doing. But I don't care, because I'm not running a mobile device. I know it's the trend and everyone is headed that way because of mobile, but it doesn't matter to me for a reasonable mid-tower single-GPU setup.

 

I bought my 970 not because of power savings or heat output, but because Nvidia offered Far Cry 4 and roughly 10% more performance than AMD did with the R9 290, which I found to be worth more than the $60 difference in price at the time (I put the performance at about $30 and the game at $60, since I was buying it either way, while the 290 only came with Civ: BE, which I didn't want).

 

I don't care what you do and don't care about. It's perfectly valid for you to not give a shit about heat or power. However, the world revolves around more than just what you care about, and my point about creating new, low-powered low-end devices, instead of just reusing what was the absolute best three years ago, takes more than just your feelings into account. The fact that you seem to be throwing a hissy fit that other people have different priorities to you, especially in the low-end market, is frankly bizarre.

 

 

More than likely, if I was specifically in the market for a 960. I used a 4GB 760 for a while, and even back then I was using more than 2GB, and not just in Skyrim. That being said, one of my 290s can easily outperform it by 50-75% with only a bit more power draw (because an overclocked 760 drinks power), and even though a 960 is more efficient than a 760, here in Canada a 4GB 960 is at least 300 dollars, which is a serious gouge for little to no performance gain over a 760. I just don't see the reason for buying a 960, beyond small form factor and less fan noise under heavy load.

 
That's true every generation, though. Why would someone with a 660 Ti upgrade to a 760? It doesn't make any sense. Who buys a mid-range GPU every generation? If they had that much money to burn, why would they settle for a mid-range product? Someone with a 660 or a 560, though, could see the 960 as a decent upgrade.

 

I don't care what you do and don't care about. It's perfectly valid for you to not give a shit about heat or power. However, the world revolves around more than just what you care about, and my point about creating new, low-powered low-end devices, instead of just reusing what was the absolute best three years ago, takes more than just your feelings into account. The fact that you seem to be throwing a hissy fit that other people have different priorities to you, especially in the low-end market, is frankly bizarre.

 

You're just going to fight your strawman to death, aren't you? Have fun with that.


You're just going to fight your strawman to death, aren't you? Have fun with that.

 

 

I don't care about SLI/CF setups.

 
It's not a strawman: you literally said this shit.
 
You need to learn what words like "strawman" mean before you throw them around just because you don't have an argument.

 

 
 
It's not a strawman: you literally said this shit.
 
You need to learn what words like "strawman" mean before you throw them around just because you don't have an argument.

 

 

You're saying I'm telling AMD and Nvidia what to make. Are you retarded?


You're saying I'm telling AMD and Nvidia what to make. Are you retarded?

 

I'm saying that Nvidia (AMD, not so much) is making low-to-mid-range cards with energy efficiency in mind, for the reasons I've said. Your response is "but I don't care about energy efficiency." Well... good for you? Do you want a biscuit or something?


For people who think they need it to run League of Legends

 

 

People care about this card?

 

 

For $230, you can get an R9 290. Still don't see the point in a 960, 2GB or 4GB.

 

 

I agree, buying anything but a 290 in that price range is nuts, but some people just flat out want Nvidia.

 

 

But I still don't get why the 960 is actually a thing.

 

Versus the R9 285, it's an option for an HTPC or mITX build that can't house a volcano or a large card. A power user with no concern about heat or size is better off sticking with the 280X or better.

 

I swear, the mentality here. "It's not for me, so it obviously shouldn't exist."

if you have to insist you think for yourself, i'm not going to believe you.


 

 

 

Versus the R9 285, it's an option for an HTPC or mITX build that can't house a volcano or a large card. A power user with no concern about heat or size is better off sticking with the 280X or better.

 

I swear, the mentality here. "It's not for me, so it obviously shouldn't exist."

 

 

I should have said "for someone looking to build a gaming system in a standard non-HTPC case, buying a GTX 960 is nuts when the R9 290 is so cheap and hugely outperforms it."


People care about this card?

Not here

Workstation:
Intel Core i7 5820K @ 4.4GHz, Asus Rampage V Extreme, 32GB G.Skill Ripjaws 4 2400 DDR4, 2 x Nvidia GTX 980 reference cards in SLI,
1TB - 4 x 250GB Samsung 840 Evo RAID 0, Corsair AX1200i, Lian Li PC-D600 Silver.


I should have said "for someone looking to build a gaming system to go in a standard non-HTPC build buying a GTX 960 is nuts when the R9 290 is so cheap"

Well, this whole "this card doesn't deserve to exist, it makes no sense" bullshit is just that: bullshit. The card has uses; even if it doesn't fit your needs, it'll fit the needs of another. There's actually a market for this GPU. Shocking, I know. NVIDIA had to figure that out before they put it on the market.

 

Hell, Jay seems to like it; he just built his dad a rig using it (primarily for CUDA rendering, BUT HEY, THERE'S ANOTHER USE).

if you have to insist you think for yourself, i'm not going to believe you.


I'm saying that Nvidia (AMD, not so much) is making low-to-mid-range cards with energy efficiency in mind, for the reasons I've said. Your response is "but I don't care about energy efficiency." Well... good for you? Do you want a biscuit or something?

 

No, your response was that Nvidia is making cards with energy efficiency in mind because your dual cards can't run as well in the summer. That's ridiculous. Nvidia is making cards with energy efficiency in mind because they can use Maxwell in mobile devices, which are the lion's share of the market. I can't believe how butthurt you got over my comment that the 7970 and 7950 were great buys.


 

 

 

Versus the R9 285, it's an option for an HTPC or mITX build that can't house a volcano or a large card. A power user with no concern about heat or size is better off sticking with the 280X or better.

 

I swear, the mentality here. "It's not for me, so it obviously shouldn't exist."

 

It's really not for anyone...

4690K // 212 EVO // Z97-PRO // Vengeance 16GB // GTX 770 GTX 970 // MX100 128GB // Toshiba 1TB // Air 540 // HX650

Logitech G502 RGB // Corsair K65 RGB (MX Red)


No, your response was that Nvidia is making cards with energy efficiency in mind because your dual cards can't run as well in the summer. That's ridiculous. Nvidia is making cards with energy efficiency in mind because they can use Maxwell in mobile devices, which are the lion's share of the market. I can't believe how butthurt you got over my comment that the 7970 and 7950 were great buys.

 

No, that is one of the many reasons I gave for why people even care about energy efficiency.

 

If you had read my one comment actually relating to the 7970 and 7950, you'd see that my point was that these were great buys at the time, but that this was completely unrelated to the fact that they have, through rebrands, remained current products for the last three years. Instead, something with the performance of the 280 and 280X but with much lower power consumption would have been drastically more appropriate for the budget market than just reusing what was a good performance product in what is, in tech terms, ancient history. You were trying to spin this lack of innovation as a positive for owners of a 280X, when it was a pretty shitty way to introduce a line of products that were being (falsely) advertised as new.

 

No one is butthurt over anything, now actually fucking read the comments you are trying to argue with.


No, that is one of the many reasons I gave for why people even care about energy efficiency.

 

If you had read my one comment actually relating to the 7970 and 7950, you'd see that my point was that these were great buys at the time, but that this was completely unrelated to the fact that they have, through rebrands, remained current products for the last three years. Instead, something with the performance of the 280 and 280X but with much lower power consumption would have been drastically more appropriate for the budget market than just reusing what was a good performance product in what is, in tech terms, ancient history. You were trying to spin this lack of innovation as a positive for owners of a 280X, when it was a pretty shitty way to introduce a line of products that were being (falsely) advertised as new.

 

No one is butthurt over anything, now actually fucking read the comments you are trying to argue with.

 

There you go throwing the hissy fit. I said it wasn't necessarily a bad thing, because you get driver support for a GPU they're currently selling (even if it is under another name) even when it's old. E.g., if you bought a 7970 three years ago it was a great deal, since they're still selling the 280X and thus still optimizing for it in their drivers. So based on past history, if you buy a 290 or 290X now, you're not likely to be abandoned by the driver team for a while, since they're still going to want to sell Hawaii chips and still need them to do well in benchmarks. For someone concerned with performance per dollar, that's more compelling than "the other card uses 100W less power", unless, I guess, you're gaming 8 hours a day.
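For what it's worth, the "100W less power" savings can be put in rough numbers. A minimal sketch, assuming an illustrative electricity rate of $0.12/kWh (no actual rate was stated in this thread):

```python
# Rough annual electricity cost of drawing an extra 100 W under load.
# The rate and gaming hours below are illustrative assumptions.
def annual_cost_usd(extra_watts, hours_per_day, usd_per_kwh=0.12):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# At 2 hours of gaming a day, 100 W extra is under $10/year;
# at 8 hours a day it approaches $35/year.
print(round(annual_cost_usd(100, 2), 2))  # 8.76
print(round(annual_cost_usd(100, 8), 2))  # 35.04
```

So unless you game a lot or pay far more per kWh, the power-draw difference amounts to a few dollars a year.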
