Nvidia GTX 960 Performs Slightly Faster Than Radeon R9 280 (3DMark Scores)

Personally I always have a custom fan curve on mine anyway through MSI Afterburner. I like having the fan on 100% when I'm gaming, because I'm more concerned about cool temps and max performance than the minimal fan noise - which is still pretty quiet even at 100%. You hear the air more than the fans. Just a whooshing noise that isn't high pitched or annoying. So I have the fan max out when it gets to about 70 C, and a much more gentle curve below 70 C, so when I'm not gaming, the fans run pretty low. At idle, the fans are at the lowest setting MSI Afterburner will let me set.
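For anyone curious what that curve looks like as logic, here's a minimal sketch. The 30% idle floor and the 40 C ramp start are assumptions; Afterburner itself does this through its graphical curve editor:

```python
# Minimal model of the curve described above: lowest allowed speed at idle,
# a gentle ramp up to 70 C, then pinned at 100%.
def fan_speed_pct(temp_c, idle_pct=30, ramp_start_c=40, knee_c=70):
    if temp_c >= knee_c:
        return 100.0                 # max cooling while gaming
    if temp_c <= ramp_start_c:
        return float(idle_pct)       # lowest setting Afterburner allows (assumed 30%)
    # linear ramp between the ramp start and the 70 C knee
    return idle_pct + (100 - idle_pct) * (temp_c - ramp_start_c) / (knee_c - ramp_start_c)

for t in (35, 50, 65, 72):
    print(f"{t} C -> {fan_speed_pct(t):.0f}%")
```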

 

Gigabyte nailed the WF 7950. I will definitely consider their WF coolers first for any new GPU I look at buying, AMD or NVIDIA.

 

The highest temps I've seen after some testing today are a max of 62 degrees (OC'd) in open air (case open) and about 69 (OC'd) with the case closed (warm air gets trapped in the Node 304 with higher-end GPUs on air) - which is still cooler than it used to get. Before, it would hit the low-to-mid 70s, and the fans don't run as aggressively as they used to either. Cooler and quieter makes me happy. :)

 

They nailed it indeed. ;)


Looks like a complete joke of a card. Same speed and price as a 2-year-old rebranded card from AMD.

No one gives a shit that it uses 50W less, Nvidia. That's only useful at the really low end, or the high end.

I honestly can't see anyone even considering buying this unless they're either such a massive fanboy that they can't actually think about their decisions and just buy anything Nvidia throws at them, or are just plain dumb.

Wouldn't surprise me if these are the badly binned chips and they release a 960 Ti later down the line, which is what the 960 should have actually been. Just like every other generation with an x60 Ti: the original x60 was terrible in every aspect, and the x60 Ti was the card people actually wanted.

It has the same memory controller/bus as the 750 Ti, and that really will cripple it, even at 1080p... I'm guessing a 960 Ti with a 192-bit bus, 3GB of VRAM and all the GM206 dies that didn't have terrible yields, in September this year.
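Rough math on why the bus matters - peak GDDR5 bandwidth is just bus width in bytes times the effective data rate. The 7 Gbps figure is the commonly quoted effective memory clock, and the 192-bit part is pure speculation from above:

```python
# Peak GDDR5 bandwidth = bus width in bytes x effective data rate (Gbps).
def mem_bandwidth_gb_s(bus_bits, effective_gbps):
    return bus_bits / 8 * effective_gbps

print(mem_bandwidth_gb_s(128, 7.0))  # GTX 960's 128-bit bus      -> 112 GB/s
print(mem_bandwidth_gb_s(192, 7.0))  # speculated 192-bit 960 Ti  -> 168 GB/s
print(mem_bandwidth_gb_s(256, 7.0))  # GTX 970-class 256-bit bus  -> 224 GB/s
```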

Sadly they will still sell millions on release day to deluded fanboys.


Doesn't account for power usage. The 290 is a power hog.

Which doesn't matter two bull ballsacks unless your power costs a lot.

Which doesn't matter two bull ballsacks unless your power costs a lot.

 

Yep. And even then, even if you live in a country with some of the highest electricity prices in the world, the difference will be minimal, especially if overclocked. Nvidia cards draw a lot more power when overclocked, proportionally more than AMD cards really; people seem to forget that.
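To put a rough number on "minimal" - a sketch with an assumed usage pattern and a deliberately high electricity price:

```python
# Yearly cost of a 50 W difference at a high household rate.
# Gaming hours and price are assumptions; scale them to taste.
watts_delta = 50
hours_per_day = 4
price_per_kwh = 0.30  # EUR, roughly Danish/German territory

kwh_per_year = watts_delta / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/yr -> EUR {kwh_per_year * price_per_kwh:.2f}/yr")
# 73 kWh/yr -> EUR 21.90/yr, i.e. a couple of euros a month
```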


Yep. And even then, even if you live in a country with some of the highest electricity prices in the world, the difference will be minimal, especially if overclocked. Nvidia cards draw a lot more power when overclocked, proportionally more than AMD cards really; people seem to forget that.

Plus, people list power consumption as a reason not to buy AMD cards, when it's really a non-reason unless your PSU can't handle it.


Plus, people list power consumption as a reason not to buy AMD cards, when it's really a non-reason unless your PSU can't handle it.

More power consumption also usually means more heat. More heat coming off the GPU makes the whole case hotter, which pushes CPU temps up and leaves less overclocking headroom on the GPU.
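You can even estimate the case-heating effect, since essentially all board power ends up as heat in the case air. The airflow figure below is hypothetical:

```python
# Steady-state rise in case air temperature from extra GPU power, assuming all
# board power becomes heat and the case exchanges air at a fixed rate.
def case_air_delta_t(extra_watts, airflow_cfm):
    m3_per_s = airflow_cfm * 0.000471947          # CFM -> cubic metres per second
    mass_flow_kg_s = 1.2 * m3_per_s               # air density ~1.2 kg/m^3
    return extra_watts / (mass_flow_kg_s * 1005)  # cp of air ~1005 J/(kg*K)

# ~50 W extra at a hypothetical 50 CFM of case airflow:
print(f"{case_air_delta_t(50, 50):.1f} C warmer case air")  # ~1.8 C
```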


More power consumption also usually means more heat. More heat coming off the GPU makes the whole case hotter, which pushes CPU temps up and leaves less overclocking headroom on the GPU.

I guess, but most people don't even notice that.


From the factory this card will boost/overclock to 1500MHz, to make up for its weak spec.


From the factory this card will boost/overclock to 1500MHz, to make up for its weak spec.

 

Maxwell scales quite badly, considering you can rip out an extra 300 MHz over stock clocks and still end up with OC scaling similar to Kepler. In some cases, worse.


Maxwell scales quite badly, considering you can rip out an extra 300 MHz over stock clocks and still end up with OC scaling similar to Kepler. In some cases, worse.

Where does Kepler scale badly? With my OC I got an extra 10fps, and it's not even a high overclock for my GPU.

And seeing how fast some GTX 970s get, I'd say Maxwell scales just as well, if not better.


Where does Kepler scale badly? With my OC I got an extra 10fps, and it's not even a high overclock for my GPU.

And seeing how fast some GTX 970s get, I'd say Maxwell scales just as well, if not better.

 

Still in the 15% range when looking at FPS gained. And I said Maxwell scales badly compared to Kepler: FPS gains for the 780 Ti are usually in the 18-22% range.


Considering the kneecapped VRAM, it'll be slower when it matters. This entire card is a complete waste of time unless it's less than $200.

2GB of VRAM is enough for 1080p before anti-aliasing and uncompressed textures. Until 2GB cards are completely phased out, you'll see devs target their memory budgets at them.


Maxwell scales quite badly, considering you can rip out an extra 300 MHz over stock clocks and still end up with OC scaling similar to Kepler. In some cases, worse.

In my experience overclocking my 980, there's a point where increasing the clocks actually decreases performance. It may appear stable, but when you compare the performance you can watch it fall off a cliff. This is especially noticeable in 3DMark, since its scores are very sensitive to overclocking. I'm fairly convinced people are aiming for the highest stable clock rate on their Maxwell cards rather than the highest-performing one. This could be a voltage limitation on my Strix card, of course. (I know some other cards have a higher voltage ceiling.)
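In other words, something like this sweep, where set_core_offset and run_benchmark are hypothetical stand-ins for whatever OC tool you use and a scripted 3DMark run:

```python
# Sketch of the idea: benchmark a range of core-clock offsets and keep the
# best-SCORING one, not just the highest stable one.
def find_best_offset(offsets_mhz, set_core_offset, run_benchmark):
    """Return (best_offset, {offset: score}) by measured score, not stability."""
    scores = {}
    for offset in offsets_mhz:
        set_core_offset(offset)
        scores[offset] = run_benchmark()  # higher is better
    best = max(scores, key=scores.get)
    return best, scores

# Usage (hypothetical): find_best_offset(range(0, 301, 25), set_offset, run_3dmark)
# On some Maxwell cards the score peaks well below the highest stable offset.
```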


Maxwell scales quite badly, considering you can rip out an extra 300 MHz over stock clocks and still end up with OC scaling similar to Kepler. In some cases, worse.

Yeah. Loads of people were actually quite happy breaking 1500MHz, because they couldn't do that with Kepler.

The reality is it isn't necessarily better. I think the lower the spec, the higher the core clock goes, in order to make up for it.


I get that it'll beat out a 680 and use less power, but it's still a low-spec card. Even looking at the 760 I couldn't see the point. If we have three resolutions (1080p, 1440p and 4K), and the 980 can manage 4K (obviously better in SLI) with the 970 destroying 1080p and faring well at 1440p, it makes sense for the 960 to follow as the 1080p card, meaning it should max 1080p and push forward.

 

On my 660, my modded Skyrim (which is more CPU-bound anyway) constantly uses 1500MB+ of VRAM, and that's WITHOUT any of those nice 4K or 8K texture mods. The poor card struggles when there are detailed textures in view, and I've seen evidence of the much more powerful 660 Ti barely handling Star Citizen. I think that shows 3GB of VRAM is a minimum. Sure, some games won't use more than 2, but the extra gig stops any game going over by 100MB and just allows for better textures.
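If you want to watch the frame buffer fill up yourself, polling nvidia-smi works on Nvidia cards (assuming it's on your PATH; Afterburner's on-screen display shows the same thing):

```python
# Poll VRAM usage every few seconds while a game runs; stops after a minute.
import subprocess
import time

for _ in range(12):
    used = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader",
    ]).decode().strip()
    print(used)  # e.g. "1532 MiB, 2048 MiB"
    time.sleep(5)
```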

 

As for the bus, Nvidia seem to have a thing for narrow memory buses. My 660 has a 192-bit bus (which people bemoaned because the 560 Ti had a 256-bit bus), and for all the optimisations and algorithms in the world, I can't see why it makes sense to cut down to a bus NARROWER than a previous x60 card's. As for the cores, there are more of them (yay), but not that many more, though I can't remember how many the 980 has, so that may be a bad point.

 

I guess a 770 on sale, or saving up for the 970, makes the most sense if you need/want Nvidia. I used to use ShadowPlay, but I guess I could live with an R9 series card. The 3xx series will either be good enough or will push prices of the 2xx down so much that an R9 290/290X will be sub-£200.


2GB of VRAM is enough for 1080p before anti-aliasing and uncompressed textures. Until 2GB cards are completely phased out, you'll see devs target their memory budgets at them.

 

Interestingly enough, until devs stop targeting 2GB cards and start making games that demand more VRAM, manufacturers will keep making 2GB cards.


2GB of VRAM is enough for 1080p before anti-aliasing and uncompressed textures. Until 2GB cards are completely phased out, you'll see devs target their memory budgets at them.

 

The Witcher 3, Star Citizen and Dying Light are all examples of games that will already be bottlenecked by a mere 2GB frame buffer, filterless at 1080p. There are 8192x8192 composite skin and clothing textures on characters in Dying Light on ultra. The Witcher 3 devs have already stated 3GB as a requirement for ultra at 1080p. As for Star Citizen, it's already evident from tweaking configs how much VRAM it occupies with all filters manually turned off on ultra, and 2GB is not enough - and that's before they've added all the rendering techniques they promised, as well as full PBR. In addition, this is in Arena Commander, which is a light load within Star Citizen. The planetside/FPS module will be torture on a 2GB buffer.

 

So no, game developers are forging ahead with or without Nvidia yielding to demands. The 960 is simply going to be obsolete before it arrives.


The Witcher 3, Star Citizen and Dying Light are all examples of games that will already be bottlenecked by a mere 2GB frame buffer, filterless at 1080p. There are 8192x8192 composite skin and clothing textures on characters in Dying Light on ultra. The Witcher 3 devs have already stated 3GB as a requirement for ultra at 1080p. As for Star Citizen, it's already evident from tweaking configs how much VRAM it occupies with all filters manually turned off on ultra, and 2GB is not enough - and that's before they've added all the rendering techniques they promised, as well as full PBR. In addition, this is in Arena Commander, which is a light load within Star Citizen. The planetside/FPS module will be torture on a 2GB buffer.

 

So no, game developers are forging ahead with or without Nvidia yielding to demands. The 960 is simply going to be obsolete before it arrives.

We'll see when those games come out.

8192x8192 textures will look identical to 4096x4096 or even 2048x2048 ones. Quite simply, the pixels are dense enough on the UV map that you're going to have one hell of a time finding a difference for 4x or even 16x the memory usage. Did Techland hire FakeFactory or something?
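For a sense of scale, the uncompressed RGBA footprints (mipmaps add roughly a third on top; real games use compressed formats, so treat these as upper bounds):

```python
# Uncompressed RGBA8 texture footprint by resolution; a full mip chain adds ~1/3.
def texture_mib(size_px, bytes_per_px=4, mipmaps=True):
    base = size_px * size_px * bytes_per_px
    total = base * 4 / 3 if mipmaps else base
    return total / 2**20

for size in (2048, 4096, 8192):
    print(f"{size}x{size}: {texture_mib(size):.0f} MiB")
# 2048x2048: 21 MiB, 4096x4096: 85 MiB, 8192x8192: 341 MiB
```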


Interestingly enough, until devs stop targeting 2GB cards and start making games that demand more VRAM, manufacturers will keep making 2GB cards.

The low end/mid range will stay on 2GB for a long time.

The high end already moved on years ago.


The low end/mid range will stay on 2GB for a long time.

The high end already moved on years ago.

 

Because of that vicious circle I just pointed out. It's the same reason Adobe won't even think about supporting Linux even though they have the money to do so - the fact is not enough people use it for them to make a profit, but people WON'T use it because they want Photoshop and other things.

 

The 'high end' had 2GB cards until very recently. The 6xx series was 2GB, the 770 was 2GB by default (and the 4GB version wasn't worth it), and it's only really been the 780 that had a higher frame buffer (disregarding the whole mess of Titans). Pre-R9 AMD cards were 2GB, and aside from the 280 upwards they're just the old HD 7xxx cards, including the amount of VRAM. The 280 and 280X are still the old cards but got 3GB instead, and the 290/290X actually got 4GB, so really it's only the 290/290X that moved on 'years ago'.

 

Also, I'd hardly call the x60 'low end'. In fact, if we look at the actual low end (the GT cards), we find they ship with more than 2GB of VRAM in some cases (though it's mostly pointless unless you're driving a 4K screen, at which point why do you have that sort of screen if you're not doing content creation or gaming on it?). The x60 cards are supposed to represent the minimum for maxing games at 1080p (sans filters), and as games become more demanding, the card needs to become significantly more powerful.

 

We're not running consoles, stuck with one fixed set of unchangeable hardware for 5 years until the poor machines have so much dust inside they can't manage 30fps at 792p anymore, so I really doubt we're going to be stuck and not progress.

 

What's the point of buying a new card that has, say, £20 worth more performance than the old one, which was already bottlenecking hugely? Especially when you're not likely to sell the old card for anywhere near its RRP, since it's been price-cut and is now second-hand. You're now worse off by close to twice the money you spent on the old card, with only a tiny performance increase to show for it.


I'm not particularly impressed, to be honest... the 280 is 1.5 years old at this point; beating it is no huge feat. I'm curious to see how this will stack up against a hypothetical R9 380.

I agree, especially since the 280 is a rebrand of a card that's over 3 years old...

 

The 960 is definitely looking underwhelming.


Mid range is by no means a "max everything out at 1080p" bracket. It's a "games will be playable on high settings" bracket. We've been a bit spoiled in the last few years, and the only game to really be our "Crysis" has been Assassin's Creed Unity, wherein only top-end cards can even play it decently.

No one updates their mid-range GPU year after year. The average is 3 years, which ends up being around a 100% gain in the same bracket in that timeframe.

Games on ultra require massively more horsepower to run than on high, without appreciable benefit. Ultra will always be on the bleeding edge, but most people will be content to play games with reduced settings. 2GB cards have a couple more years, I would wager. It gives their higher-end cards an extra bullet point on the box, after all.
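That 100% figure works out to roughly 26% per year compounded - a quick check (the 3-year cadence is the assumption above):

```python
# The "100% gain in ~3 years" rule of thumb expressed as compound growth.
years = 3
annual_gain = 2 ** (1 / years) - 1
print(f"{annual_gain:.0%} per year compounds to 2x over {years} years")  # ~26%
```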


Yeah. Loads of people were actually quite happy breaking 1500MHz, because they couldn't do that with Kepler.

The reality is it isn't necessarily better. I think the lower the spec, the higher the core clock goes, in order to make up for it.

 

That's the thing. People buy them because of the higher core clock numbers. That's pretty much it.


The 'high end' had 2GB cards until very recently. The 6xx series was 2GB, the 770 was 2GB by default (and the 4GB version wasn't worth it), and it's only really been the 780 that had a higher frame buffer (disregarding the whole mess of Titans). Pre-R9 AMD cards were 2GB, and aside from the 280 upwards they're just the old HD 7xxx cards, including the amount of VRAM. The 280 and 280X are still the old cards but got 3GB instead, and the 290/290X actually got 4GB, so really it's only the 290/290X that moved on 'years ago'.

 

The high-end HD 7xxx cards (7950/7970) had 3GB of VRAM, same as the 280/280X. ;)

