Official Nvidia GTX 970 Discussion Thread

C'mon, for 4K the card is just as worthy (or worthless) today as it was when you bought it on the strength of all the benchmarks that were out there. I can't believe people actually think the card performs worse now than it did before they discovered all this.

 

The placebo effect hit strong on this one. Being upset that Nvidia lied is one thing. But acting like the cards are now worthless? Eh. The benchmarks 2 months ago still apply now. The card itself didn't get a ninja downgrade since this news came out. This card in single card form still sucks for 4K.

Want good 4K performance? Buy a couple of 980s and laugh your way to excellent benchmarks. Or 3 970s. Or 4 960s (if you could pull off such a thing)


Yup, this "revelation" suddenly made your GPU worthless, eh? Your GPU was already second-tier trash to Nvidia, never to be a flagship card. Now we know precisely how Nvidia made it a second-tier card, and everyone is getting their jimmies rustled.

amazing. 

The 970 is still a great card, but that's not the point. The point is that Nvidia is being shady.


You're delusional if you think I'd indulge you in your fantasy or that I even care what you think.

So you admit you're a liar? That's fine.


The 970 is still a great card, but that's not the point. The point is that Nvidia is being shady.

 

I get that, but that doesn't change how the card performed last week or how it performs today. I doubt many people on this forum are as noble in their causes of giving a shit what a company says as they claim to be. I'm judging the card on its merits and nothing else. Its merits tell me it's a good-performing card, and that's where I stop caring.

 

Maybe it's because I'm not as invested in the integrity of my purchases, or in having any principles about buying a GPU from a bunch of liars.


C'mon, for 4K the card is just as worthy (or worthless) today as it was when you bought it on the strength of all the benchmarks that were out there. I can't believe people actually think the card performs worse now than it did before they discovered all this.

 

Yes, Nvidia misrepresented their product. Don't buy from them again if you wish, take your card back if you wish, but don't try to insinuate that the card can't do what it already can, and don't go on as if AMD has never misrepresented any of its products.

What do you mean, insinuate? It's just the reality, mr moose. The memory issue isn't reflected in benchmark FPS but in frame-consistency stutters, something detectable through Nvidia's own FCAT, and something that only came to light when this memory issue was brought into question.

I can't believe I'm being attacked for asking for a 4GB card as advertised.

Nvidia misrepresented their product. I may not buy from them again, I may take my card back, but don't try to insinuate that this is a non-issue or that AMD does this to the same extent as Nvidia.


The placebo effect hit strong on this one. Being upset that Nvidia lied is one thing. But acting like the cards are now worthless? Eh. The benchmarks 2 months ago still apply now. The card itself didn't get a ninja downgrade since this news came out. This card in single card form still sucks for 4K.

Want good 4K performance? Buy a couple of 980s and laugh your way to excellent benchmarks. Or 3 970s. Or 4 960s (if you could pull off such a thing)

Or just buy two R9 290s: cheaper than 970s, faster at 4K, and they can actually use all of their VRAM. At least you'd be buying the same card AMD advertised.


Help me out here. I get that NVIDIA lied to the whole tech community about the specs of the card, and quite frankly that is call for a reprimand of some sort. What I don't get is how this actually affects the card's overall performance at, say, below-4K resolutions, which is where the bulk of its users are.

Intel i5 4460 || Deepcool Gammaxx S40 || Gigabyte H97 Gaming 3 || Corsair Vengeance 2x4gb DDR3 1600 CL9 || Gigabyte GTX 970 G1 Gaming || WD Caviar Blue 1TB || Asus 24x DVDRW OEM || Bitfenix Comrade Window White || Asus VX239h || Razer Blackwidow 2013 || Razer Deathadder Chroma || Windows 8.1 x64 SL || Crucial M550 256gb || Deepcool RGB Led Controller


Source via PCPer

It appears this is a classic case of false advertising. Nvidia's marketing team has been known to stretch the truth in the past, for instance claiming a 165W TDP for the GTX 980 (which is false) in order to claim a 2x perf/watt boost over Kepler.

 

<snip>

Another example is advertising the Tegra X1 chip as a 1 TFLOP "supercomputer" when in fact they're using peak half-precision FP16 performance figures, a metric that was abandoned from Shader Model 3.0 onwards and is utterly unsuitable for a "supercomputer", which requires not only FP32 but, even more so, double-precision FP64 compute. In FP64, the Tegra X1 can deliver less than 3% of the advertised compute performance.

 

It should be noted that there isn't really a standard for measuring TDP, and it doesn't always map to exact power consumption. (I mean, do you seriously think that a midrange Intel i5 should have the exact same TDP as a top-tier quad-core i7? The i7 obviously consumes more power, and this is confirmed by the fact that the i7 will run hotter and louder on the stock heatsink than a midrange i5. Yet Intel just slapped one blanket "TDP" rating over all of them. Who the hell cares?)

 

Phones/tablets still heavily rely on 16-bit floating-point math, because 16-bit ALUs can be smaller and consume less power than a full-size 32-bit ALU.

On another note, they can only achieve 1 TFLOP of FP16 performance by running two FP16 operations on one 32-bit ALU. Pretty much all mobile SoCs do this.

AFAIK, FP64 is completely unheard of on phones (hell, it's pretty unheard of even in consumer desktop applications). The precision simply isn't worth the power usage for phones and tablets.
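The marketing math above is easy to reproduce. Here is a minimal sketch using the Tegra X1 figures as reported at the time (256 CUDA cores at roughly 1 GHz; these numbers are assumptions for illustration, not official specs), showing how packed FP16 doubles the headline number:

```python
# Peak-FLOPS back-of-envelope: cores * clock * ops-per-cycle.
# A fused multiply-add (FMA) counts as two floating-point ops, and packing
# two FP16 ops into one 32-bit ALU lane doubles the figure again.

def peak_gflops(cores, clock_ghz, fma=True, fp16_packed=False):
    ops_per_cycle = 2 if fma else 1   # FMA = multiply + add
    if fp16_packed:
        ops_per_cycle *= 2            # two half-precision ops per 32-bit lane
    return cores * clock_ghz * ops_per_cycle

# Assumed Tegra X1 figures: 256 cores at 1.0 GHz.
fp32 = peak_gflops(256, 1.0)                     # 512 GFLOPS FP32
fp16 = peak_gflops(256, 1.0, fp16_packed=True)   # 1024 GFLOPS -> the "1 TFLOP" claim
print(fp32, fp16)
```

So the "over 1 teraflop" number is only reachable in half precision; the FP32 figure is half that, and FP64 on a chip like this is a small fraction again.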

 

 

Help me out here. I get that NVIDIA lied to the whole tech community about the specs of the card, and quite frankly that is call for a reprimand of some sort. What I don't get is how this actually affects the card's overall performance at, say, below-4K resolutions, which is where the bulk of its users are.

 

1. It doesn't.

2. It doesn't even change the 4K performance results that we've already seen.

i7 not perfectly stable at 4.4.. #firstworldproblems


What do you mean, insinuate? It's just the reality, mr moose. The memory issue isn't reflected in benchmark FPS but in frame-consistency stutters, something detectable through Nvidia's own FCAT, and something that only came to light when this memory issue was brought into question.

I can't believe I'm being attacked for asking for a 4GB card as advertised.

Nvidia misrepresented their product. I may not buy from them again, I may take my card back, but don't try to insinuate that this is a non-issue or that AMD does this to the same extent as Nvidia.

Lack of VRAM will cause your FPS to drop by a shitload, because the GPU ends up waiting on data pulled in from outside VRAM once it runs out, which means the GPU needs more time to render each frame, i.e. higher frame times. So no, you're only affected by huge frame times if you're using more than 3.5GB of VRAM.
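The frame-time arithmetic behind that claim can be sketched roughly: treat frame time as data touched per frame divided by effective bandwidth, where anything spilled past the fast pool moves at a much lower rate. The pool size and bandwidth numbers below are illustrative assumptions, not measurements:

```python
# Toy frame-time model: the portion of the working set that fits in the fast
# pool is serviced at full bandwidth; anything spilled beyond it is serviced
# at the slow pool's bandwidth. The spill stretches frame time, which shows
# up as stutter rather than a lower average FPS.

def frame_time_ms(workload_gb, fast_pool_gb, fast_gbps, slow_gbps):
    fast_part = min(workload_gb, fast_pool_gb)
    slow_part = max(0.0, workload_gb - fast_pool_gb)
    return (fast_part / fast_gbps + slow_part / slow_gbps) * 1000.0

# Illustrative numbers only: a 3.5GB pool at 196 GB/s vs a slow pool at 28 GB/s,
# assuming the whole working set is touched every frame (a worst case).
within  = frame_time_ms(3.4, 3.5, 196.0, 28.0)  # stays inside the fast pool
spilled = frame_time_ms(3.8, 3.5, 196.0, 28.0)  # 0.3GB spills to the slow pool
print(round(within, 2), round(spilled, 2))      # prints 17.35 28.57
```

Even a small spill dominates the frame time, which is why the problem shows up as stutter only once usage crosses the fast segment's boundary.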


Except it's still a 4GB card. Did you even READ what Nvidia released on this issue?

It isn't, and that's my point.


 

The worst case scenario on the other hand would be to have the NVIDIA heuristics fail, or alternatively ending up with a workload where no great solution exists, and over 3.5GB of resources must be repeatedly and heavily accessed. In this case there is certainly the potential for performance to crumple, especially if accessing resources in the slow segment is a blocking action

 

In any case, the one bit of good news here is that for gaming running out of VRAM is generally rather obvious. Running out of VRAM, be it under normal circumstances or going over the GTX 970’s 3.5GB segment, results in some very obvious stuttering and very poor minimum framerates.

 

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/4
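The behaviour AnandTech describes can be pictured as a simple allocator model: fill the fast 3.5GB segment first, and only spill into the slow 0.5GB segment when forced to. This is a toy model for intuition, not Nvidia's actual driver heuristic:

```python
# Toy model of the GTX 970's segmented VRAM: a 3.5GB fast segment backed by
# a 0.5GB slow segment. Allocations land in the fast pool until it is full,
# then fall back to the slow pool; beyond 4GB total, allocation fails.

FAST_GB, SLOW_GB = 3.5, 0.5

def place_allocations(sizes_gb):
    """Return (segment, size) pairs for each allocation, in order."""
    used_fast = used_slow = 0.0
    placed = []
    for size in sizes_gb:
        if used_fast + size <= FAST_GB:
            used_fast += size
            placed.append(("fast", size))
        elif used_slow + size <= SLOW_GB:
            used_slow += size
            placed.append(("slow", size))
        else:
            raise MemoryError("out of VRAM")
    return placed

# The first three buffers fit in the fast segment; the last spills to the slow one.
print(place_allocations([1.5, 1.5, 0.4, 0.4]))
```

Under this model a game staying below 3.5GB never touches the slow segment at all, which matches AnandTech's observation that the problem only appears once that boundary is crossed and the spilled resources are heavily accessed.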


Or just buy two R9 290s: cheaper than 970s, faster at 4K, and they can actually use all of their VRAM. At least you'd be buying the same card AMD advertised.

The 290s are only faster in some scenarios, not all. Let's be candid here.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1. It doesn't.

2. It doesn't even change the 4K performance results that we've already seen.

 

Exactly, so what's all the fuss about? I mean, people did see the benchmarks before actually buying the cards, right?


Uhhhh, are people actually looking at specs or benchmarks when buying GPUs?

Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


The placebo effect hit strong on this one. Being upset that Nvidia lied is one thing. But acting like the cards are now worthless? Eh. The benchmarks 2 months ago still apply now. The card itself didn't get a ninja downgrade since this news came out. This card in single card form still sucks for 4K.

Want good 4K performance? Buy a couple of 980s and laugh your way to excellent benchmarks. Or 3 970s. Or 4 960s (if you could pull off such a thing)

 

Exactly. And anyone who buys a product based on what they think it should do in the future, using theoretical maximums, is going to get stung, irrespective of false advertising.

 

 

What do you mean, insinuate? It's just the reality, mr moose. The memory issue isn't reflected in benchmark FPS but in frame-consistency stutters, something detectable through Nvidia's own FCAT, and something that only came to light when this memory issue was brought into question.

I can't believe I'm being attacked for asking for a 4GB card as advertised.

Nvidia misrepresented their product. I may not buy from them again, I may take my card back, but don't try to insinuate that this is a non-issue or that AMD does this to the same extent as Nvidia.

 

 

Except that in order to get to that stage of stuttering you have to push the card beyond what most reviewers said it could do, and any card will stutter if you push it beyond its limits.

 

If AnandTech says it can only do 42 FPS at 2160p in Crysis 3 with low settings, then that's what you should expect. AnandTech said it's not really a 4K card, so why buy it and then get pissed because it can't do 4K?

 

There is a big difference between being upset with a company for misrepresenting a product and trying to pretend the product can no longer do something. It could never do what people are trying to get it to do.

The logic in this place has gone out the window.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Exactly, so what's all the fuss about? I mean, people did see the benchmarks before actually buying the cards, right?

 

They bought a cheaper card that wasn't a flagship card and were okay with it. Now everyone knows WHY it isn't a flagship card, and they're getting real mad that it doesn't do 4K well (no benchmark said it would) and upset that it stutters when you'd expect it to, because it's NOT A HIGH-POWERED FLAGSHIP WITH A FULL-SIZED CHIP.


It's not even about that. Let me put it this way. You give me the choice of buying two oranges, one slightly larger than the other. I opt to buy the smaller one because it fits my needs, only to realize that it's not even an orange; it's a bitter grapefruit.

 

No it's not.


Or just buy two R9 290s: cheaper than 970s, faster at 4K, and they can actually use all of their VRAM. At least you'd be buying the same card AMD advertised.

And consuming twice the power while easily being above 70 dBA in CF. Here's a noise test, measured from 4 inches, for the Sapphire Vapor-X 290 that people praised as a quiet card: http://be.hardware.info/reviews/5540/8/sapphire-radeon-r9-290x-vapor-x-oc-4-gb-review-geluidsproductie

56 dBA at full load and 33 dBA at idle, which is almost as loud as a 970 Strix at full load (38 dBA): http://be.hardware.info/reviews/5621/23/nvidia-geforce-gtx-980--970-review-incl-ultra-hd-test-asus-vs-msi-geforce-gtx-970

I can set my 970's fan speed fixed at 800 rpm while maintaining temps around 75°C.
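For context on what those dBA gaps mean, a common rule of thumb is that +10 dB is perceived as roughly twice as loud. A quick sketch using that rule (a perceptual approximation, not a rigorous acoustics model):

```python
# Rule of thumb: perceived loudness roughly doubles for every +10 dB.

def loudness_ratio(db_a, db_b):
    """Approximately how many times louder db_a sounds compared to db_b."""
    return 2 ** ((db_a - db_b) / 10.0)

# 56 dBA (Vapor-X 290 at load) vs 38 dBA (970 Strix at load), per the linked tests:
print(round(loudness_ratio(56, 38), 2))  # prints 3.48
```

So by this rule of thumb the 18 dBA gap works out to roughly 3.5x the perceived loudness, which is why the difference matters more than the raw numbers suggest.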


That is pretty lame. I wonder if AMD will take advantage of this, unless they screw up too, which is possible, though they can't afford it like NVIDIA can.

Luckily I didn't buy the card, but if I had, I would probably just return it and get my money back. Yes, it is still a fast and good card, but that doesn't change the fact that it is not the card I was paying for, because it is a different product.

It's as if you went to a gas station and refueled your car from a pump that says 98 octane, and then you find out it was 95 octane, and the cashier tells you it was just a misunderstanding, but the car works, so what's the problem? (Stupid example, but I just woke up, so don't expect better of me :D)

 

 

Uhhhh, are people actually looking at specs or benchmarks when buying GPUs?

 

No, they are looking at how many fins there are on the heatsink. The more fins, the more powerful the GPU is.

Oh, you only got 157 fins? ...Lame! I got 160!


It's not even about that. Let me put it this way. You give me the choice of buying two oranges, one slightly larger than the other. I opt to buy the smaller one because it fits my needs, only to realize that it's not even an orange; it's a bitter grapefruit.

Of course my GTX 970 isn't worthless; I never implied it is. Not for 1080p or 1440p, at least. But for 4K it is utterly worthless now. I had bought the card in anticipation of a 4K G-Sync monitor upgrade this year, along with another 970 to power it. But now Nvidia slaps me in the face and tells me my card is worthless for the purpose I originally bought it for. I'm sorry, but even if that weren't infuriating enough, the mere deception is enough for me to call it quits with this rotten arrangement.

UPDATE: And despite what Faa may lead you to believe, I do in fact HAVE an MSI Gaming GTX 970, not that I care what an inconsiderate flame-baiter thinks.

 

This is a case of not doing your research, thorough research, before you make a purchase. If you were anticipating and planning to go 4K in the near future, then you should have looked more closely at all the 4K benchmarks so far and then decided on a GPU. If you bought your 970 right at release, then I have no sympathy for you; you should have waited to see how the cards perform several months after release. If you'd done your homework, you'd have realized the R9 290/290X are still better (overall) for 4K. You could have grabbed a 290X on sale, had a card that gives you decent 4K performance (better than a 970), and then added another later on, or grabbed a 380X once they're released.

My Systems:

Main - Work + Gaming:

Spoiler

Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:

Spoiler

FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:

Spoiler

SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:

Spoiler

MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:

Spoiler

Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


It really isn't. 

Especially when benchmarks for 1080p/1440p/2160p gameplay were readily available before people decided to buy. The whole world knew this card struggled at 4K. Now people are getting upset about it? Jesus.

 

This exactly.


What's this proving? That SLI runs out of VRAM and a single card doesn't? Sorry, but that doesn't make sense. That's a pic from Guru3D, so let's see what they had to say:

SLI card performance wise there are no stutters recorded, BUT there is good and bad news, the bad news is that you can see a lot of framedrops. This issue continued persistently through all FCAT results. We are still investigating what is going on there. These are not stutters, frame-drops at 6 to 8 ms like these can not be seen with your eyes (that's the good news).

http://www.guru3d.com/articles-pages/geforce-gtx-970-sli-review,8.html

 


Another example is advertising the Tegra X1 chip as a 1 TFLOP "supercomputer" when in fact they're using peak half-precision FP16 performance figures, a metric that was abandoned from Shader Model 3.0 onwards and is utterly unsuitable for a "supercomputer", which requires not only FP32 but, even more so, double-precision FP64 compute. In FP64, the Tegra X1 can deliver less than 3% of the advertised compute performance.

 

I wasn't aware of that one... I knew something was fishy when they announced at CES that the X1 was capable of over 1 teraflop of performance. I'm glad my gut was right about the numbers being a pile of horse crap.

 

I normally switch back and forth between AMD and Nvidia with each upgrade, but I think I'll be very careful with Nvidia and think long and hard about my next upgrade. I can't believe just how deceptive Nvidia's spin has become in the past 3-4 years, and not just with the most recent crap, either. They need to eat some crow.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


maybe they will give some free games for this.

and everyone will be happy

