
Do you think that Nvidia should recall 970's? (and Nvidia's response)

Doughnutnator

The thing is that Nvidia fully knew about this limitation. No refunds here... From Nvidia's perspective nothing is wrong; the 970 runs how it was designed.

[Image: GM204_arch_0.jpg. This is the GM204 core in the 970.]

Then why were the specifications at launch different from the "real" ones?

That's a dirty move on their part ;(


Quoting the source material here: "the GTX 970 actually has fewer ROPs and less L2 cache than the GTX 980. NVIDIA says this was an error in the reviewer's guide and a misunderstanding between the engineering team and the technical PR team on how the architecture itself functioned."

Doesn't matter. They advertised it as something more than what it was, and therefore are in the wrong.


They never refunded or compensated 660/660 Ti 2GB owners, so I doubt any action will be taken on Nvidia's part for the 970 either, as angering as that may be.

What was the issue with the 660 and 660 Ti cards? Were they overheating like crazy, or was it misadvertised performance?



Then why were the specifications at launch different from the "real" ones?

That's a dirty move on their part ;(

Have you ever dealt with marketing departments? Seriously, this could very easily have been a misunderstanding of the design by the marketing higher-ups.

 

I have personally been in meetings that went exactly like this:


To all of you who say they advertised it wrong:

 

Did they? 

 

 

 

The box, their website, etc. say exactly what you got. If you researched the architecture, you would have found the same thing. GPU testing tools also say the same thing (that the card has XX ROPs, etc.). The only place it was wrong was between their PR team and reviewers, not between you and Nvidia.


What was the issue with the 660 and 660 Ti cards? Were they overheating like crazy, or was it misadvertised performance?

 

The last 512MB of memory was separate, and was much slower. It's not exactly the same scenario, because the 660/660 Ti 2GB cards had a 192-bit bus, which meant the cards should have been 1.5GB, but they tacked on the extra 512MB to keep up with AMD's 7850/7870 and make it appear they had equal memory performance and capacity.

 

This was also done with the 550 Ti 1GB card, where 256MB of VRAM was gimped, but I honestly don't know anyone who bought that card.
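
For anyone curious about the arithmetic behind that, here's a rough back-of-the-envelope sketch in C++. The controller width and chip capacity are illustrative assumptions, not an official NVIDIA breakdown; the point is just how a 192-bit bus naturally lines up with 1.5GB, leaving the extra 512MB hanging off a narrower path:

#include <cstdio>

int main() {
    // Assumed layout: a 192-bit bus split into 32-bit memory controllers.
    const int bus_width_bits  = 192;
    const int controller_bits = 32;
    const int controllers     = bus_width_bits / controller_bits;  // 6

    // With one 256MB chip per controller, the fully interleaved pool is 1.5GB.
    const int chip_mb         = 256;
    const int fast_pool_mb    = controllers * chip_mb;             // 1536 MB

    // Advertising 2GB means bolting extra capacity onto only some controllers,
    // so the last 512MB cannot be striped across the full 192-bit bus.
    const int advertised_mb   = 2048;
    const int slow_pool_mb    = advertised_mb - fast_pool_mb;      // 512 MB

    printf("Full-speed interleaved pool: %d MB\n", fast_pool_mb);
    printf("Slower leftover segment:     %d MB\n", slow_pool_mb);
    return 0;
}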


Nope, it still performs better than a 780. Just don't go full 4K with that GPU.

And honestly, every reviewer said that a single 970 or 980 was not optimal for 4K anyway.


The last 512MB of memory was separate, and was much slower. It's not exactly the same scenario, because the 660/660 Ti 2GB cards had a 192-bit bus, which meant the cards should have been 1.5GB, but they tacked on the extra 512MB to keep up with AMD's 7850/7870 and make it appear they had equal memory performance and capacity.

Those are some shady business practices on Nvidia's part.

I am really pissed off, since I paid over 400 for my G1 edition and now we find out that it was kneecapped before it left the factory.



And honestly, every reviewer said that a single 970 or 980 was not optimal for 4K anyway.

Nothing is ready for 4K yet. It's not prime time for 4K. We are honestly only just becoming able to max out high-end games at a minimum of 60 FPS at 1080p.


Those are some shady business practices on Nvidia's part.

I am really pissed off, since I paid over 400 for my G1 edition and now we find out that it was kneecapped before it left the factory.

Wait, you bought a GM204 chip that was not a 980 and you didn't realize that it was purposely held back?

 

What did you think the differences were between the 970 and 980?


Exactly, but 4GB compared to 3.5GB at 4K is quite noticeable. It depends on the game, of course.

It definitely depends on the game. If I launch FurMark, for example, and set it to 4K, I only hit 500MB of VRAM usage.


I own an MSI 970 and you would have to pry it from my cold, dead hands. I ignore marketing spin and focus on real-world performance.

I play old games such as GTA SAMP and iRacing at 60 FPS at 5K (SoftTH video wall) with high to max settings, running 2x or 4x AA and AF.

Even games like Skyrim are playable at 5K on medium settings.

If Nvidia wants to give me a free game, cool; otherwise, meh. This card blew my expectations away and I know it will serve me for many years.

Right or wrong, the card is amazing, and even with this I will continue to recommend the GTX 970.


It definitely depends on the game. If I launch FurMark, for example, and set it to 4K, I only hit 500MB of VRAM usage.

FurMark is useless. It's not a game, you know. LOL.


I own an MSI 970 and you would have to pry it from my cold, dead hands. I ignore marketing spin and focus on real-world performance.

I play old games such as GTA SAMP and iRacing at 60 FPS at 5K (SoftTH video wall) with high to max settings, running 2x or 4x AA and AF.

Even games like Skyrim are playable at 5K on medium settings.

If Nvidia wants to give me a free game, cool; otherwise, meh. This card blew my expectations away and I know it will serve me for many years.

Right or wrong, the card is amazing, and even with this I will continue to recommend the GTX 970.

San Andreas and iRacing are not VRAM-intensive games.


FurMark is useless. It's not a game, you know. LOL.

What? I don't see how that matters. I am merely pointing out that throwing more pixels on the screen doesn't necessarily amount to higher VRAM usage.
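
To put a rough number on that: the render targets themselves are a small fraction of VRAM at any resolution; it's textures, shadow maps, and other assets that fill a card. Here's a quick sketch (the buffer count and pixel format are assumptions for illustration):

#include <cstdio>

int main() {
    // 4K output, 32-bit color, and a handful of full-resolution buffers
    // (e.g. a double-buffered swapchain plus a depth buffer).
    const long long width = 3840, height = 2160;
    const long long bytes_per_pixel = 4;
    const long long full_res_buffers = 3;

    const long long bytes = width * height * bytes_per_pixel * full_res_buffers;
    printf("Approx. cost of the 4K buffers: %.1f MB\n", bytes / (1024.0 * 1024.0));
    // Roughly 95 MB: resolution alone comes nowhere near filling 3.5-4 GB.
    return 0;
}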


To all of you who say they advertised it wrong:

Did they?

The box, their website, etc. say exactly what you got. If you researched the architecture, you would have found the same thing. GPU testing tools also say the same thing (that the card has XX ROPs, etc.). The only place it was wrong was between their PR team and reviewers, not between you and Nvidia.

Look at this:

http://i59.tinypic.com/llbbr.png

I clearly see 64 ROPs.

BTW: Linus shared this thread on his FB page; this is awesome. Sadly nobody can access the site :P


Wait, you bought a GM204 chip that was not a 980 and you didn't realize that it was purposely held back?

 

What did you think the differences were between the 970 and 980?

Nvidia advertised the 970 as having fewer CUDA cores, but until now never listed anything about different amounts of cache and ROPs.



You're right, but it's a little unreasonable to expect your average Joe to know exactly which GPU architecture and variant he is running.

That is probably fair, but I also don't expect the average Joe to buy a 970, or, if they do, to pay enough attention to tech news to even know about this development.


What? I don't see how that matters. I am merely pointing out that throwing more pixels on the screen doesn't necessarily amount to higher VRAM usage.

Isn't FurMark from like 2007?


I don't want a recall.

 

I don't want my GTX 970 fixed.

 

I don't want a free game.

 

I don't want a Steam gift card.

 

I don't want a GTX 980 in exchange (really, this is the dumbest suggestion I have ever heard).

 

I want a refund so I can give AMD my business and get a real 4GB card.


Nvidia advertised the 970 as having fewer CUDA cores, but until now never listed anything about different amounts of cache and ROPs.

Yes, but if it's not a full chip, it is purposely gimped. The exact details of how it is gimped are usually made clear, as they were in this particular case, albeit incorrectly.

 

But to buy a 970 and claim that you had no idea it was gimped is a bit of a stretch. 


Kepler and Maxwell are totally different things. You can only compare things like CUDA cores, ROPs, and cache between chips that share the same architecture (as is the case with the 970 and 980).

I never said anything about the 3.5GB issue.

I bought the card in October, and yes, I took a look at the specifications, because Linus went over them in his video, as did JayZ (I think). Back then nobody knew about this; Nvidia was selling a 4GB, 64-ROP GPU.

Again, performance doesn't matter in this case. They sold me product "x", and it ended up being product "y".

If performance doesn't matter to you, then why did you buy a 970...

This is a consumer graphics card. You're meant to GAME with it, not read the specs every day to be satisfied.

 

The reason people bought the 970 is that it performs great, not that it was supposed to have 64 ROPs.

 

Since you're a special case who cares more about a number on the box than about FPS in a game, you should have checked the numbers when you first got your 970.

 

My point is that *normal* people shouldn't return their 970 just because of a stupid number. You return it if you have a performance problem, not a "this number is off by 8 and I don't like it" problem.


They probably use preset information reported by the driver; a full GM204 has 64, so it reports 64...

Anyway, Nvidia advertised the 970 as a "GTX 980 with fewer CUDA cores". At launch the only stated difference between the 970 and the 980 was CUDA cores, not ROPs or L2 cache. They sold it like that to thousands of people like me, who thought they were getting a GM204 with fewer CUDA cores, not a GM204 with fewer CUDA cores, fewer ROPs, less cache, and dumbed-down VRAM.
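
As a rough illustration of why a utility can only show what it's told, here's a minimal sketch using the CUDA runtime API that prints what the driver reports about the installed card (assuming the CUDA toolkit is available). Notably, there is no ROP-count field among these properties, which is why tools have to fall back on per-chip lookup tables:

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        fprintf(stderr, "No CUDA-capable device found\n");
        return 1;
    }
    // Print a few of the properties the driver itself exposes.
    printf("Name:      %s\n",     prop.name);
    printf("VRAM:      %zu MB\n", prop.totalGlobalMem >> 20);
    printf("SMs:       %d\n",     prop.multiProcessorCount);
    printf("L2 cache:  %d KB\n",  prop.l2CacheSize >> 10);
    printf("Bus width: %d-bit\n", prop.memoryBusWidth);
    // No ROP count is exposed here; utilities that display ROPs infer it
    // from a lookup table for the chip, which is how a 970 could show 64
    // even though only 56 are active.
    return 0;
}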


Yes, but if it's not a full chip, it is purposely gimped. The exact details of how it is gimped are usually made clear, as they were in this particular case, albeit incorrectly.

 

But to buy a 970 and claim that you had no idea it was gimped is a bit of a stretch.

We are not saying we did not know there was a difference; however, a clear and honest explanation from Nvidia in their advertisements is something we deserve.



Dammit guys, my notifications are exploding like New Year's Eve.

 

stahp

[Image: dead-horse.gif]

