
Official Nvidia GTX 970 Discussion Thread

Actually, out of curiosity, is that still true of AMD GPUs in CrossFire using Mantle? I thought part of the reason that CrossFire/SLI setups in the past could only use the memory each GPU had on its own was that the technique they used was alternate frame rendering, where the GPUs would render frames one after the other. With Mantle, and presumably DX12, asynchronous rendering is possible, where one GPU could perform rendering tasks using its memory while another GPU, even one of differing power and RAM amount, works on some other area or visual effect.

 

https://www.youtube.com/watch?v=N_6CAneoW-0#t=23m28s

I feel like this is something that can only be done by Mantle with GCN; Nvidia/AMD would have to tinker a bit to get something like this to even work, or build completely new hardware with this in mind.

 

Says who? Many people, including myself, don't upgrade each year... what happens a year from now when BF5 comes out and I can't max it out because it goes over 3.5GB of VRAM? Hmmm?

What you're saying is completely overblown.

[gamegpu.ru benchmark screenshot]

Even if that were the case, you'd only be pulling over 3.5GB at 1080p/1440p with overdone amounts of MSAA and scaling, which is unrealistic. I doubt that even with better textures it'll use upwards of 3.5GB.

Computing enthusiast. 
I used to be able to input a cheat code; now I've got to input a credit card - Total Biscuit
 


Says who? Many people, including myself, don't upgrade each year... what happens a year from now when BF5 comes out and I can't max it out because it goes over 3.5GB of VRAM? Hmmm?

 

1080p goes over 3.5GB.

 


Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


....

There's 4GB on the PCB. Since when does Nvidia or AMD advertise like "you've got 4GB of framebuffer to tease your GPU with"? You assumed that when they put 4GB on the PCB. It's not false advertising, just a design flaw, and it wouldn't even have become a scandal if they hadn't screwed up by failing to inform reviewers properly. They just didn't have the balls to say "OK, we fucked something up, it's a 3.5GB card because blabla" after the release.

 


 

As part of our discussion with NVIDIA, they laid out the fact that the original published specifications for the GTX 970 were wrong, and as a result the “unusual” behavior that users had been seeing from the GTX 970 was in fact expected behavior for a card configured as the GTX 970 was. To get straight to the point then, NVIDIA’s original publication of the ROP/memory controller subsystem was wrong; GTX 970 has a 256-bit memory bus, but 1 of the 4 ROP/memory controller partitions was partially disabled, not fully enabled like we were originally told. As a result GTX 970 only has 56 of 64 ROPs and 1.75MB of 2MB of L2 cache enabled. The memory controllers themselves remain unchanged, with all four controllers active and driving 4GB of VRAM over a combined 256-bit memory bus.

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation
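
For what it's worth, the same article explains where the split comes from: the fast 3.5GB segment is striped across 7 of the 8 memory controllers (about 196GB/s of the 224GB/s total), while the last 0.5GB hangs off a single controller at roughly 28GB/s. You can see the cliff yourself with a little CUDA program in the spirit of the "Nai's benchmark" tool that's been floating around; this is my own rough, untested sketch, not the original:

// Allocate VRAM in 128 MiB chunks, then time a kernel reading each chunk.
// On a GTX 970 the chunks that land in the last 0.5 GiB segment should show
// a sharp bandwidth drop. (The loop stops once cudaMalloc fails, so it won't
// quite reach 4 GiB; the OS and driver reserve some memory.)
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void readChunk(const float* data, size_t n, float* sink) {
    float acc = 0.0f;
    for (size_t i = blockIdx.x * blockDim.x + threadIdx.x; i < n;
         i += (size_t)gridDim.x * blockDim.x)
        acc += data[i];
    if (acc == 12345.0f) *sink = acc;   // keep the reads from being optimized away
}

int main() {
    const size_t chunkBytes = 128u << 20;   // 128 MiB per chunk
    const size_t n = chunkBytes / sizeof(float);
    float* sink = nullptr;
    cudaMalloc(&sink, sizeof(float));
    std::vector<float*> chunks;
    for (float* p = nullptr; cudaMalloc(&p, chunkBytes) == cudaSuccess; )
        chunks.push_back(p);

    for (size_t c = 0; c < chunks.size(); ++c) {
        cudaEvent_t t0, t1;
        cudaEventCreate(&t0); cudaEventCreate(&t1);
        cudaEventRecord(t0);
        readChunk<<<1024, 256>>>(chunks[c], n, sink);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, t0, t1);
        printf("chunk %2zu (%4zu-%4zu MiB): %6.1f GB/s\n",
               c, c * 128, (c + 1) * 128, chunkBytes / ms / 1e6);
        cudaEventDestroy(t0); cudaEventDestroy(t1);
    }
    return 0;
}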

 

You said AMD lied about an up-to-50% advantage in Mantle.

 

 

 


 

Q: How much additional performance will Mantle provide?

A: Performance benefits will vary depending on the characteristics of each application, and of the CPU & GPU it is run on. In general, the largest benefits will be seen in cases where CPU overhead has traditionally limited GPU throughput.

 

http://www.guru3d.com/articles-pages/amd-mantle-preview,3.html
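
To make that answer concrete, here's a toy model with completely made-up numbers (mine, not AMD's): treat frame time as roughly max(CPU submission cost, GPU render cost). Cutting the per-draw-call CPU cost, which is the kind of thing Mantle targets, does nothing while the GPU is the bottleneck and pays off big once it isn't, which is why the advertised gains are "up to" figures:

// Host-only toy model: frame time ~= max(CPU submission time, GPU render time).
// All numbers are invented for illustration.
#include <algorithm>
#include <cstdio>
#include <initializer_list>

int main() {
    const double gpuMs       = 10.0;  // assumed GPU render time per frame
    const double usPerDrawDX = 2.0;   // assumed CPU cost per draw call, DX11-style
    const double usPerDrawM  = 0.5;   // assumed CPU cost per draw call, Mantle-style
    for (int draws : {2000, 10000, 40000}) {
        double dx = std::max(gpuMs, draws * usPerDrawDX / 1000.0);
        double m  = std::max(gpuMs, draws * usPerDrawM  / 1000.0);
        printf("%5d draws: %5.1f ms vs %5.1f ms  (%.0f%% faster)\n",
               draws, dx, m, (dx / m - 1.0) * 100.0);
    }
    return 0;
}

With these numbers, 2,000 draws shows no gain at all (GPU-bound), 10,000 shows 100%, and 40,000 shows 300%: same code, very different "advantage" depending on where the bottleneck sits.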

 

Let's keep this on topic, please! No need to start the Red vs. Green war again.


To be honest, who cares if they lied? It's not going to change the performance of the card. Every reviewer has already reviewed it, and the performance will not change.

 

You're right about the performance being the same, but this is a SERIOUS issue for a number of reasons. It's not about the card being "good enough".

 

- It sets a poor precedent for future business (what specs will they be able to claim for the next series of cards?)

- It stifles competition (*cough* AMD)

- It presumes to know why a particular consumer purchased the card (specs vs. FPS @ 1080/1440/4K)

- It's against the law in many countries

- Apply this same scenario to an automobile/aircraft/television/etc., and there would be backlash as well.

 

I understand why some people don't see this as a big deal, even among GTX 970 owners, but it has to be taken seriously. I do think many are hostile in their tone, but that doesn't make their indictment of Nvidia on this one incorrect.


Reading so many phrases like "The performance will not change", "It's still a great card", "It doesn't change the benchmarks" and sometimes even "It's not false advertising, just a design flaw" makes me really want to share my opinion with you about this whole situation. (And I'd really love to hear your answers to my points.)

 

Because I think every one of us should draw a line in our heads about what Nvidia or any other company is allowed to do and advertise, and what is too much.

 

- Would it still be okay if it wasn't 3.5GB fast and 0.5GB slow, but 3GB and 1GB? (Nobody told you before, but the performance would still be the same.)

 

- What if they release a new version with 8GB of VRAM, but in truth only 3.5GB is accessible to a single game at the fast 224GB/s (or whatever it is), and the rest is the slow crap like it is now? (Keep in mind, it's still a great card.)

 

- What if Nvidia told us "We made a cool new architecture with 37% more awesome!!1! And this card has this exact new architecture", but in truth it has the old architecture and they didn't change much? (Nobody told you before, but it doesn't change the benchmarks. It would still be a great card.)

 

- Now an even worse assumption: what if there were a real difference in performance? What if reviewers and tech sites got explicitly better cards than normal customers? How much difference in performance is acceptable? 10%? 20%? (Nobody told you before.)

 

- Or finally, the whole micro-stutter problem. It feels like some have it, some don't, and nobody can really explain why or where the difference lies. Would it be okay if the card were unusable at resolutions of 1440p and above? (Not because the card just doesn't have enough horsepower, but because they made a big mistake which should not have occurred. Nobody told you before, but 1080p would still be a great experience.)

 

I know some of those examples might not be as good as I wish they were, and in my opinion there is no single right answer to this question. It's more a personal decision which everybody should make on their own. But that still means we should make this decision and keep it in mind.

 

I (as a happy GTX 970 owner) am in a very bad situation right now. My fear is less about the card and more about the future of Nvidia and other companies, because right now it feels like they are telling us "LOLOLZ, Y U EVEN MAD BRO?!?", and if they get away with that attitude, they might try something worse in the future, and then it might hurt even more.

 

But right now, I feel like there is still time and a chance to vote with our wallets while we can still hurt them enough, so maybe we should punch them as hard as we can in the face (with our wallets, of course ;) )

 

So what do you guys think I or we should do? Keep the card? (I have to admit, it IS still a good card.) Or try to get a refund and change to something else? (Maybe even team red.)

Hopefully this isn't too long now. I apologize.

 

Edit: 

 

What I completely forgot for a second is the price point.

They might have had a problem selling the card at its current price with honest specs of 3.5GB. So even if it's still a good card, they might have cheated us out of 20, 30 or even more bucks :(

BacardiRoqs


1080p goes over 3.5GB.

 

 

Have you played Watch Dogs at max settings?

Case: NZXT Phantom PSU: EVGA G2 650w Motherboard: Asus Z97-Pro (Wifi-AC) CPU: 4690K @4.2ghz/1.2V Cooler: Noctua NH-D15 Ram: Kingston HyperX FURY 16GB 1866mhz GPU: Gigabyte G1 GTX970 Storage: (2x) WD Caviar Blue 1TB, Crucial MX100 256GB SSD, Samsung 840 SSD Wifi: TP Link WDN4800

 

Donkeys are love, Donkeys are life.                    "No answer means no problem!" - Luke 2015

 


It is not just about this stupid 4K gaming bullshit. If you bought this card for any sort of 3D work or video editing (some kinds do use quite a bit of VRAM with CUDA acceleration), then this is really annoying. I have looked at benchmarks for 4K gaming, but I don't just look at those when I decide to buy a card; I look at the specs of said card. When I looked, I saw it had 4GB of video RAM and the specified L2 cache with that number of cores. Now it turns out I don't get that. It is completely unimportant whether I would actually use all that power (the L2 cache will be used completely, though). I wanted to buy a card with these specs, regardless of what those benchmarks said, and for a reason that is unimportant here. Now it turns out I got a card that just doesn't have these specs, and Nvidia lied about it. That is my issue. I don't care that the benchmarks are still relevant; the card I bought isn't the card I wanted to buy, simple as that.

 

I am a little bit of an Nvidia fanboy and will never buy AMD, but I am genuinely considering returning this GTX 970 and buying a GTX 980 instead.

 

I have said before that I fully understand why people are pissed about this: Nvidia said it had X features and it doesn't.

 

The same argument applies to 3D rendering as it does to gaming; there are plenty of sites that do benchmarks for rendering as well. The specs give us an idea of how a card will perform, and they are important to us enthusiasts because we want to know such things. But the reality is that no one buys (or should buy) a card on specs, because specs != performance. This is not the first card/product that doesn't perform as its specs suggest it should.

 

I am sure if we put our heads together we could come up with a monstrous list of products from every company that are misleading in the same way.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


No, because WD at max settings is not a game, it's a slideshow.

No, not really...

Case: NZXT Phantom PSU: EVGA G2 650w Motherboard: Asus Z97-Pro (Wifi-AC) CPU: 4690K @4.2ghz/1.2V Cooler: Noctua NH-D15 Ram: Kingston HyperX FURY 16GB 1866mhz GPU: Gigabyte G1 GTX970 Storage: (2x) WD Caviar Blue 1TB, Crucial MX100 256GB SSD, Samsung 840 SSD Wifi: TP Link WDN4800

 

Donkeys are love, Donkeys are life.                    "No answer means no problem!" - Luke 2015

 


No, not really...

Whoops, sorry about that, I forgot the /s.

 

I seem to recall, though, that the last time I checked, WD's VRAM usage was something like 2880MB. And I was getting around 50 FPS at 1440p.

AD2000x Review  Fitear To Go! 334 Review

Speakers - KEF LSX

Headphones - Sennheiser HD650, Kumitate Labs KL-Lakh


I'm glad we continue to talk about how Nvidia lied about the 970, yet everyone was blown away by its benchmarks at its price point when it released.

Only if you are an Nvidia fanboy. Maxwell has very little performance increase. Its power-usage advantage is hugely overblown, and it's only a bit faster than the R9 290, which is far cheaper.


 

You said AMD lied about an up-to-50% advantage in Mantle.

 

http://www.guru3d.com/articles-pages/amd-mantle-preview,3.html

 

Let's keep this on topic, please! No need to start the Red vs. Green war again.

It still says 4GB. And that claim was measured against their own DirectX driver; Nvidia's DX driver cuts CPU time as much as Mantle does. http://www.overclock.net/t/1528559/directx-driver-overhead-and-why-mantle-is-a-selling-point-bunch-of-benchmarks/0_100


Whoops, sorry about that, I forgot the /s.

 

I seem to recall, though, that the last time I checked, WD's VRAM usage was something like 2880MB. And I was getting around 50 FPS at 1440p.

Max settings, 1080p, no mods. I played multiplayer yesterday (surprisingly, I found a lobby quickly) and the game was stuttering a lot while I was racing, since it was going over 3.5GB of VRAM.

 

 

http://i.imgur.com/b8RSS8S.png
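
If you want numbers instead of eyeballing the stutter, a tiny CUDA program like this (my own sketch; GPU-Z or MSI Afterburner will show the same counter without any code) polls device-wide VRAM usage once a second while the game runs:

// cudaMemGetInfo reports free/total memory for the whole device, so this
// sees the game's allocations from a separate process too.
#include <cstdio>
#include <cuda_runtime.h>
#ifdef _WIN32
#include <windows.h>
#define SLEEP_1S() Sleep(1000)
#else
#include <unistd.h>
#define SLEEP_1S() sleep(1)
#endif

int main() {
    for (;;) {
        size_t freeB = 0, totalB = 0;
        cudaMemGetInfo(&freeB, &totalB);
        printf("VRAM used: %4zu / %4zu MiB\n",
               (totalB - freeB) >> 20, totalB >> 20);
        SLEEP_1S();
    }
}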

Case: NZXT Phantom PSU: EVGA G2 650w Motherboard: Asus Z97-Pro (Wifi-AC) CPU: 4690K @4.2ghz/1.2V Cooler: Noctua NH-D15 Ram: Kingston HyperX FURY 16GB 1866mhz GPU: Gigabyte G1 GTX970 Storage: (2x) WD Caviar Blue 1TB, Crucial MX100 256GB SSD, Samsung 840 SSD Wifi: TP Link WDN4800

 

Donkeys are love, Donkeys are life.                    "No answer means no problem!" - Luke 2015

 


120W, anything you've got to say, sir?

Furmark has been known to increase voltages, though. It can also brick GPUs. My 560 Ti succumbed to Furmark.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


Max settings, 1080p, no mods. I played multiplayer yesterday (surprisingly, I found a lobby quickly) and the game was stuttering a lot while I was racing, since it was going over 3.5GB of VRAM.

 

 

http://i.imgur.com/b8RSS8S.png

I don't understand why I haven't experienced much stuttering with a 3GB 780 Ti, then. Sure, the bus width is wider, but that's not the problem the 970 is facing.

AD2000x Review  Fitear To Go! 334 Review

Speakers - KEF LSX

Headphones - Sennheiser HD650, Kumitate Labs KL-Lakh


Have you played Watch Dogs at max settings?

 

Did you just use Watch Dogs as an example? That game stutters even with 2- or 3-way SLI/CrossFire.

 


 

 

I don't understand why I haven't experienced much stuttering with a 3GB 780 Ti, then. Sure, the bus width is wider, but that's not the problem the 970 is facing.

 

 

The game stutters a lot when driving. It's never been fixed.

Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


Did you just use Watch Dogs as an example? That game stutters even with 2- or 3-way SLI/CrossFire.

 

 

I know, I know, but it's the only game I currently have that gets to 3.5GB, since Ubi took my FC4 away XDDD

Case: NZXT Phantom PSU: EVGA G2 650w Motherboard: Asus Z97-Pro (Wifi-AC) CPU: 4690K @4.2ghz/1.2V Cooler: Noctua NH-D15 Ram: Kingston HyperX FURY 16GB 1866mhz GPU: Gigabyte G1 GTX970 Storage: (2x) WD Caviar Blue 1TB, Crucial MX100 256GB SSD, Samsung 840 SSD Wifi: TP Link WDN4800

 

Donkeys are love, Donkeys are life.                    "No answer means no problem!" - Luke 2015

 


I know, I know, but it's the only game I currently have that gets to 3.5GB, since Ubi took my FC4 away XDDD

 

 

Absurd reason to use it as an example. 

Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


Absurd reason to use it as an example. 

So what do you propose I use?

Case: NZXT Phantom PSU: EVGA G2 650w Motherboard: Asus Z97-Pro (Wifi-AC) CPU: 4690K @4.2ghz/1.2V Cooler: Noctua NH-D15 Ram: Kingston HyperX FURY 16GB 1866mhz GPU: Gigabyte G1 GTX970 Storage: (2x) WD Caviar Blue 1TB, Crucial MX100 256GB SSD, Samsung 840 SSD Wifi: TP Link WDN4800

 

Donkeys are love, Donkeys are life.                    "No answer means no problem!" - Luke 2015

 


Did you just use Watch Dogs as an example? That game stutters even with 2- or 3-way SLI/CrossFire.

 

 

The game stutters a lot when driving. It's never been fixed.

Now that I think about it, driving was pretty bad; I think I was too disgusted by the terrible pop-in to notice the frame drops.

 

Has anyone tried Crysis 3 maybe? 

AD2000x Review  Fitear To Go! 334 Review

Speakers - KEF LSX

Headphones - Sennheiser HD650, Kumitate Labs KL-Lakh


So what do you propose I use?

 

DSR? 

Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 


It still says 4GB. And that claim was measured against their own DirectX driver; Nvidia's DX driver cuts CPU time as much as Mantle does. http://www.overclock.net/t/1528559/directx-driver-overhead-and-why-mantle-is-a-selling-point-bunch-of-benchmarks/0_100

Did I ever say it's not a 4GB card? No. I just pointed out that it's false advertising, and Nvidia has already admitted the error (in their technical marketing team).


Even if they did it, I don't care. It performs amazingly at an amazing price point.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 


Only if you are an Nvidia fanboy. Maxwell has very little performance increase. Its power-usage advantage is hugely overblown, and it's only a bit faster than the R9 290, which is far cheaper.

Quit using the word fanboy.


Only if you are an Nvidia fanboy. Maxwell has very little performance increase. Its power-usage advantage is hugely overblown, and it's only a bit faster than the R9 290, which is far cheaper.

 

Wasn't efficiency the main point of Maxwell this year? Little performance increase over the last generation, but a 40-50% decrease in TDP. And don't compare the 970 to the 290; in almost 90% of the benchmarks made, it completely obliterated the 290. The 290X would be a better choice for comparison.

Corsair 760T White | Asus X99 Deluxe | Intel i7-5930k @ 4.4ghz | Corsair H110 | G.Skill Ripjawz 2400mhz | Gigabyte GTX 970 Windforce G1 Gaming (1584mhz/8000mhz) | Corsair AX 760w | Samsung 850 pro | WD Black 1TB | IceModz Sleeved Cables | IceModz RGB LED pack

 

 

