
Official Nvidia GTX 970 Discussion Thread

No, SLI 970s aren't an option, because they run 4K horribly. Even 980s don't maintain 4K @ 60 fps all that well. We're not yet at a point where 4K can be properly driven -- hopefully AMD's R9 380X will change that.

Ignore zappian, his mouth breathing is getting tiresome. He calls me a professional idiot yet has the intellect of a dodo bird.

He thinks SLI 970s should outdo SLI 980s at 4K -- 4K that even two-way 290s struggle with. 4K is still not a thing. It's demanding.

To be safe you need 6GB minimum across three or more cards just to have enough raw power and VRAM to simply smash through every game.

Two-way configurations still have such atrocious FPS minimums and variances. I'll take rock-solid 60/120 performance at 1080/1440 over 30-60 (hell, even lower than 30) at 4K.


Even 200 W of extra draw, over average use in the course of a year, only costs like 40 bucks. People run light bulbs that use more wattage than some Nvidia cards; do you honestly think electricity is THAT expensive?

It makes enough of a difference that it equalizes the price difference between a 290X and a 970.
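Both sides here are arguing over the same arithmetic, which is easy to write down. A quick sanity check of the "40 bucks" figure; the hours per day and price per kWh below are assumptions, not numbers from either post:

```python
# Annual cost of an extra 200 W of draw while gaming.
extra_watts = 200
hours_per_day = 4      # assumed daily gaming time
price_per_kwh = 0.13   # assumed electricity price in USD

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost:.2f}/year")  # ~292 kWh, ~$38
```

Whether roughly $38 a year "equalizes" anything then depends entirely on the price gap between the two cards and how long you keep them.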


I think 6GB is a bit excessive. 4GB should be fine; we just don't have the GPU horsepower to properly drive it.

PSU Tier List | CoC

Gaming Build | FreeNAS Server


i5-4690k || Seidon 240m || GTX 780 ACX || MSI Z97S SLI Plus || 8GB 2400MHz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core


FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 SATA controller || Corsair CX500M || NZXT Source 210.


Source via PCPer

 

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation/4

It appears this is a classic case of false advertising. Nvidia's marketing team has been known to stretch the truth in the past -- for instance, claiming a 165W TDP for the GTX 980 (which is false) in order to claim a 2x perf/watt boost over Kepler.

[Image: power consumption comparison chart]

 

Another example is advertising the Tegra X1 chip as a 1 TFLOP "supercomputer" when in fact they're quoting peak half-precision (FP16) figures -- a metric abandoned from Shader Model 3.0 onwards and utterly unsuitable for a "supercomputer", which requires not just FP32 but, even more so, double-precision FP64 compute. There the Tegra X1 delivers less than 2% of the advertised figure.
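That "less than 2%" is easy to sanity-check from Maxwell's published throughput ratios. A minimal sketch, assuming the X1's 256 CUDA cores at ~1 GHz, FP16 at twice the FP32 rate, and FP64 at 1/32 of it:

```python
# Rough check of the Tegra X1 "1 TFLOP supercomputer" claim.
cores = 256          # Maxwell CUDA cores in the X1
clock_ghz = 1.0      # assumed core clock
flops_per_fma = 2    # one fused multiply-add counts as 2 FLOPs

fp32 = cores * flops_per_fma * clock_ghz  # ~512 GFLOPS
fp16 = fp32 * 2                           # ~1024 GFLOPS -> the "1 TFLOP" claim
fp64 = fp32 / 32                          # ~16 GFLOPS
print(f"FP64 is {fp64 / fp16:.1%} of the advertised FP16 figure")  # ~1.6%
```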

 

Are you trying to say that TDP = power consumption?

TDP has nothing to do with power consumption.

The stars died for you to be here today.

A locked bathroom in the right place can make all the difference in the world.


Ignore zappian, his mouth breathing is getting tiresome. He calls me a professional idiot yet has the intellect of a dodo bird.

He thinks SLI 970s should outdo SLI 980s at 4K -- 4K that even two-way 290s struggle with. 4K is still not a thing. It's demanding.

To be safe you need 6GB minimum across three or more cards just to have enough raw power and VRAM to simply smash through every game.

Two-way configurations still have such atrocious FPS minimums and variances. I'll take rock-solid 60/120 performance at 1080/1440 over 30-60 (hell, even lower than 30) at 4K.

 

I never said SLI 970s outdid SLI 980s; I just said 970 SLI would be faster if it didn't have a nerfed memory architecture.

 

As it stands, CFX 290X destroys that combo at 4K.


I never said SLI 970s outdid SLI 980s; I just said 970 SLI would be faster if it didn't have a nerfed memory architecture.

As it stands, CFX 290X destroys that combo at 4K.

 

I wouldn't go that far; out of 17 games tested, 970 SLI is on average a whopping 6% slower than a 295X2 @ 4K:

 

[Image: TechPowerUp relative performance summary at 3840x2160]

 

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/20.html


I wouldn't go that far; out of 17 games tested, 970 SLI is on average a whopping 6% slower than a 295X2 @ 4K:

 

[Image: TechPowerUp relative performance summary at 3840x2160]

 

Yes, but frame stutters past 3.5GB of VRAM aren't accounted for there.

 

970 SLI craps out at 4K in highly demanding games, for the reasons we already know.
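The segmenting itself is measurable rather than a matter of opinion. The AnandTech article linked earlier puts the fast 3.5GB partition at roughly 196 GB/s and the slow 0.5GB partition at roughly 28 GB/s, so a Nai-style test that fills the card block by block and times writes to each block should show the last few blocks falling off a cliff. A minimal sketch, assuming PyTorch with a CUDA build and an otherwise idle card:

```python
# Nai-style VRAM probe: fill the card in 128 MiB blocks, then time a
# device-side write to each block. On a GTX 970 the blocks that land in
# the final 0.5 GiB segment should report far lower bandwidth.
import time
import torch

BLOCK_FLOATS = 128 * 1024 * 1024 // 4  # 128 MiB of float32 per block

blocks = []
while True:
    try:
        blocks.append(torch.empty(BLOCK_FLOATS, device="cuda"))
    except RuntimeError:  # out of memory -> the card is full
        break

for i, block in enumerate(blocks):
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(10):
        block.fill_(1.0)  # pure device-side write, no PCIe traffic
    torch.cuda.synchronize()
    gbps = 10 * block.numel() * 4 / (time.perf_counter() - start) / 1e9
    print(f"block {i:2d} (up to {(i + 1) * 0.125:.3f} GiB): {gbps:6.1f} GB/s")
```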


970 SLI craps out at 4K

As does 290X XFIRE.



Yes, but frame stutters past 3.5GB of VRAM aren't accounted for there.

970 SLI craps out at 4K in highly demanding games, for the reasons we already know.

 

You won't use over 3.5GB of VRAM at 4K. You have to disable anti-aliasing and other settings anyway to achieve 60 fps, so you will never see over 3.5GB of VRAM usage (because enabling the settings that would push VRAM utilization further would hurt performance too much anyway):

 

 

[Image: 4K VRAM usage chart]

 

[Image: Battlefield 4 VRAM usage at 4K]

 

[Image: Crysis 3 VRAM usage at 4K]
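If you'd rather check this on your own system than trust screenshots, nvidia-smi can log it while you play. A minimal sketch, assuming an NVIDIA driver with nvidia-smi on the PATH and a single GPU:

```python
# Poll VRAM usage once per second and flag when it crosses 3.5 GiB.
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used_mb, total_mb = map(int, out.strip().splitlines()[0].split(", "))
    flag = "  <-- past 3.5 GiB" if used_mb > 3584 else ""
    print(f"{used_mb} / {total_mb} MiB{flag}")
    time.sleep(1)
```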


As does 290X XFIRE.

 

I have a 4K monitor and my 290Xs have yet to crap out or stutter in a game...

CPU: Intel Core i7 4790K @ 4.7GHz, 1.3v with Corsair H100i - Motherboard: MSI MPOWER Z97 MAX AC - RAM: 2x4GB G.Skill Ares @ 2133 - GPU1: Sapphire Radeon R9-290X BF4 Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - GPU2: PowerColor Radeon R9-290X OC Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - SSD: 256GB OCZ Agility 4 - HDD: 1TB Samsung HD103SJ - PSU: SuperFlower Leadex GOLD 1300w - Case: NZXT Switch 810 (White) - Case fans: NZXT Blue LED Fans - Keyboard: Steelseries Apex Gaming Keyboard - Mouse: Logitech G600 - Headphones: Logitech G930 - Monitors: ASUS PB287Q and Acer G246HYLbd - Phone: Sony Xperia Z1


 

You won't use over 3.5GB of VRAM at 4K. You have to disable anti-aliasing and other settings anyway to achieve 60 fps, so you will never see over 3.5GB of VRAM usage (because enabling the settings that would push VRAM utilization further would hurt performance too much anyway):

[Image: 4K VRAM usage chart]

[Image: Battlefield 4 VRAM usage at 4K]

[Image: Crysis 3 VRAM usage at 4K]

 

 

Do you have 4K Shadow of Mordor ultra-textures benchmarks?


 

 

 

The reason they don't show what might happen when you break past the 3.5GB barrier is that the card is not powerful enough to drive the settings required to push past it. That's why, when running 4K even in SLI, you see reviewers disabling AA: the card is not capable of doing such things. All of a sudden people are getting distraught that their card has difficulty running 4K DSR, 8x MSAA, ultra textures, and every other setting maxed out, and isn't performing perfectly. That scenario would play out on any card, not just the 970. Increasing settings far enough to push over 3.5GB of VRAM usage will hurt performance anyway, so proving whether a slowdown is directly related to the 0.5GB memory section is going to be difficult. They might have misrepresented the ROP count and segmented the 4GB of memory, but why is everyone "up in arms" over it now? It makes no sense. We knew what the card was capable of before this information surfaced; now we just know why, and that suddenly makes it a problem? It's like demanding a refund because the 3GB GTX 780 struggled in certain games that demanded more VRAM while the 6GB GTX 780 had no problem with them, and then demanding NVIDIA and the other manufacturers replace your 3GB 780 with a 6GB one -- all because you tried to run settings we already knew the card couldn't handle.

 

It might be a dick move, but there have been many dick moves in this industry over the past couple of years. Remember the mining craze, when retailers were charging $800 for an R9 290X? The card retailed for $550. Now that card is going for $300. Where was everyone getting ready to sue over price gouging, demanding back the difference in the money they spent? What about when AMD shipped a reference cooler that couldn't effectively cool the 290s and 290Xs? To keep the card from throttling you had to run the fan at 100%, which made it sound like a jet engine. Why didn't anyone complain about a functionality issue then? I mean, they did complain, but not to the point of demanding refunds in droves. That was plainly a hardware design issue -- the reference cooler just wasn't built properly for the card. Tons of people complained, but I don't recall there being an uproar of people demanding their money back to the extent of this issue.

 

Have you tested the card in rendering when you go over the 3.5GB limit? I mean, it's not exactly an issue until you test it and discover it is an issue. For all you know it might not be an issue at all, and even mentioning it as an argument against them is pointless.

 

Except I've seen a handful of people test a decent number of games up to 4GB of usage and report no problems at all, and another handful report problems. So who is right, when the stances on the issue are split 50/50?

 

People think they are entitled to a lot of stuff. However, I think people aren't giving NVIDIA the benefit of the doubt, as if they aren't going to make this right. They will. They have no choice. They currently hold market share over AMD, and they can't afford to lose it over one mishap. If they do, they will be in trouble. So before anyone says "I demand this" or "I demand that", they should wait and see what NVIDIA or the card manufacturers are going to do, because you have to believe this is all being talked about right now behind closed doors. There is going to be a solution; there just isn't one yet. These things take time, and people aren't having any patience.

 

Well, this is where we differ: I don't see why people should be pissed. Benchmarks for the card are available in troves from many different sources. If you researched your card beforehand and knew what it was capable of, you wouldn't be pissed, since this information has been available for months. If you expected anything different, I really don't understand. The only difference between now and then is that we actually know why it performs the way it does. That doesn't change the facts of how it performs; it just adds to the understanding of why.

 

 

Firstly, I'd love it if you would drop that AMD / nVidia mentality. This has absolutely nothing to do with it. This is consumer vs. manufacturer, not "this is justified because vendor X used to do this". With the R9 290X bitcoin mining, AFAIK it was vendors jacking up the price anyway, and they simply couldn't produce enough cards to meet the demand. Similar stuff happened with the 5800 series.

 

If you pair two of the cards together, suddenly that 4GB becomes a lot more relevant. For a single card, maybe not, but it's great that it is there. And for rendering use, even that 512MB matters, since it means you can add ~12% more assets without sacrificing speed.

 

My point is pretty simple. The product was falsely advertised, the consumer was misinformed at the time of purchase, and that misinformation should make you eligible for a refund, since you bought the product in good faith while the manufacturer was giving the public false information.

 

Everyone was informed that the 290X was a jet engine. Every reviewer noted it. Most reviewers recommended waiting for a custom-cooled version or putting the thing under water. Everyone noted that the cards performed, but were both hot and loud. The consumer knew he was buying a card that was hot and loud; the reviewers were not sent specially binned versions with tweaked coolers.

 

The 780 3GB was advertised as 3GB, so that has nothing to do with it once again. The 780 6GB was advertised as 6GB; the consumer was making an informed decision when buying those cards. With the GTX 970, not necessarily. I mean, the spec sheet flat-out lied about certain properties and conveniently left out some details, and the error wasn't corrected in a timely manner.

 

The difference here is that the customer was never lied to when he was purchasing a 290X with a markup, or a hot and loud card. You could say the same thing about the GTX 480 that you said about the R9 290X, but consumers knew very well that the 480 could hardly handle itself at stock settings. There was no problem back then either, since reviewers got their samples and had the chance to inform consumers, so consumers could make informed decisions.

 

Is it huge, does it affect benchmarks? Probably not impactful for most users, and it won't affect the benchmarks. But there are uses for the cards that are not directly benchmarked, especially that VRAM, since VRAM is pretty binary: you either have enough or you don't (and stuttering doesn't always show up in benchmarks). The issue would never have come up if people hadn't noticed it, so obviously it does have an impact on something. Like the PC Perspective guy said, nVidia could have called it "n-Cache" memory and called it a day. It has happened to them before, too, but those cases weren't as relevant as this one.

 

Consumer laws, in the EU at least, are pretty tight on stuff like this. The U.S. might see a class-action lawsuit over this if it gets enough traction, like that Intel thing. For most people it's more about principle. What if the VRAM treated like this were half of it? 1/8 of the available memory pool is quite something already, but where would you draw the line at what's unacceptable?


I'm tired of this bullsht. Since the fcking initial reviews it was indicated that the fcking GTX 970 has 56 ROPs. I don't even know where people got the 64-ROP figure for the GTX 970...

 

Source: http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-970-g1-gaming-review,5.html Dated fcking September 2014

 

Also, there are videos around (don't have links atm) of people playing AC:U and Shadow of Mordor over 3.5GB of VRAM without any stutter (which, to me, is weird).

 

No, I'm not saying that Nvidia isn't to blame, because it is!


https://www.change.org/p/nvidia-refund-for-gtx-970

Sign it if you want; I just came across it and signed it myself.

|King Of The Lost|
Project Dark: i7 7820x 5.1GHz | X299 Dark | Trident Z 32GB 3200MHz | GTX 1080Ti Hybrid | Corsair 760t | 1TB Samsung 860 Pro | EVGA Supernova G2 850w | H110i GTX
Lava: i9 12900k 5.1GHz (undervolted to 1.26v) | MSI Z690 Pro DDR4 | Dominator Platinum 32GB 3800MHz | PowerColor Red Devil RX 6950 XT | Seasonic Focus Platinum 850w | NZXT Kraken Z53
Unholy Rampage: i7 5930k 4.7GHz 4.4 Ring | X99 Rampage | Ripjaws IV 16GB 2800 CL13 | GTX 1080 Strix (custom XOC signed BIOS) | Seasonic Focus Platinum 850w | H100i v2
Revenge of 775: Pentium 641 | Biostar TPower i45 | Crucial Tracer 1066 DDR2 | GTX 580 Classified Ultra | EVGA 650 BQ | Noctua NH D14


It appears this is a classic case of false advertising. Nvidia's marketing team has been known to stretch the truth in the past -- for instance, claiming a 165W TDP for the GTX 980 (which is false) in order to claim a 2x perf/watt boost over Kepler.

[Image: power consumption comparison chart]

 

 

While I agree with the 970 hate and the overall message of this post, I have to correct this part.

1.) TDP watts and the watts listed here are two completely different things. TDP watts are thermal watts -- essentially how much heat is produced due to resistance, and how much cooling is needed -- whereas the watts listed above are electrical watts, i.e. power drawn. Since GPUs aren't toasters with heating elements, TDP watts are much lower than the actual wattage drawn from the wall.

2.) That's also power consumption in a 100% stress environment, where the cards' power throttling does not come into play; the gaming numbers are much lower than that for the two OCed cards. Those OC cards also have much higher power allowances than the reference card, which throttles back to reduce heat.
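To put rough numbers on the wall-vs-component distinction, here is a toy calculation; the 250 W card draw and 90% PSU efficiency are made-up illustration values, not measurements of any card discussed here:

```python
# The PSU sits between the wall socket and the components, so the
# wall reading is always higher than the sum of the component draws.
card_watts = 250        # assumed component-side draw
psu_efficiency = 0.90   # assumed PSU efficiency at this load

wall_watts = card_watts / psu_efficiency
print(f"{card_watts} W at the card -> {wall_watts:.0f} W at the wall")  # ~278 W
```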


I'm tired of this bullsht. Since the fcking initial reviews it was indicated that the fcking GTX 970 has 56 ROPs. I don't even know where people got the 64-ROP figure for the GTX 970...

Source: http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-970-g1-gaming-review,5.html Dated fcking September 2014

Guru3d, like all the other review websites, changed it when this whole controversy started boiling over in the last week; they originally listed it in the specifications as having 64 ROPs.

https://webcache.googleusercontent.com/search?q=cache:QNgsL0qDpdoJ:www.guru3d.com/articles-pages/gigabyte-geforce-gtx-970-g1-gaming-review,5.html+&cd=1&hl=en&ct=clnk&gl=us
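For anyone wondering why 56 vs. 64 ROPs even matters on paper: peak pixel fill rate is just ROP count times clock. A toy calculation, assuming the reference 970 boost clock (the real card can be limited elsewhere, e.g. by the SMMs):

```python
# Peak pixel fill rate = ROPs x clock, for the advertised vs. corrected
# GTX 970 ROP counts at an assumed ~1178 MHz reference boost clock.
BOOST_MHZ = 1178

for rops in (64, 56):
    print(f"{rops} ROPs -> {rops * BOOST_MHZ / 1000:.1f} GPixel/s peak")
# 64 -> ~75.4 GPixel/s (advertised); 56 -> ~66.0 GPixel/s (actual ceiling)
```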

Mayonnaise is an instrument!  

Current Build - Ryzen 7 3800x (eco mode enabled), MSI B550M MAG Mortar, G.Skill Ripjaws V 32 GB (2x16) 3200 14-14-14-34, EVGA 2060 KO Ultra, EVGA G2 550W, Phanteks Enthoo Pro M, Creative Sound Blaster Audigy Rx


It's not a massive issue BECAUSE IT'S NOT A 4K CARD.

It's just as much a 4K card as the R9 290X and the GTX 980 are.

Those cards can play 4K at 30-40 fps, but the GTX 970 won't because of stutter, just as it won't be able to use AA or high-res textures in VRAM-hungry games.

 

 

 

It shouldn't play 4K, it was never going to, and all the reviews said it couldn't.  Why would anyone think it should?

It should play 4K at 30 fps, but it won't because of stutter.

The VRAM bottleneck on my GTX 670 pissed me off enough; I don't need another card that has the horsepower to drive games but makes you turn textures, AA, and resolution down because it doesn't have enough VRAM.

 

RTX2070OC 


 

Do you have 4K Shadow of Mordor ultra-textures benchmarks?

 

The only test I found for the 4K Ultra preset in Shadow of Mordor said the following:

 

"At 4K with the Ultra HD pack  the 4GB cards don’t seem to be enough. We see full utilization of our video memory and swapping occurs with system resources. We see a system memory usage of above 6GB as well as a pagefile of over 11GB bringing all our setups to their knees."

 

http://www.hardwarepal.com/shadow-mordor-benchmark/

 

So while it did use more than 3.5GB of VRAM, playing at those settings was a problem for all of their cards, including Crossfire R9 290s. Which means, as I said before, even with access to a full 4GB of memory at full bandwidth, you are still limited by the card's power and VRAM at that level. 4GB just doesn't cut it, so you have to lower settings to achieve playable framerates because you are hitting the video memory buffer regardless, whether you are using 970 SLI or R9 290 Crossfire.

 


 

There is no AMD or NVIDIA mentality; if you perceived being given simple examples as that, then I don't know what to tell you. Vendors can't blindly jack up the prices of products without hearing from their manufacturer that it is okay. Nobody knows if they had enough cards produced to meet demand; it could have been a ploy. Jacked-up prices suggest supply couldn't meet demand, but that doesn't make it a fact. The whole thing could have been staged to let AMD make more money. Nobody knows. The 5800 series supply situation was different: they produced the cards, and two months later demand went up. The mining craze started quite a bit after launch, and you would think a company of that level would learn from its previous mistake if that was actually the problem. I don't know about you, but during the mining craze I don't remember cards being out of stock; I just remember prices being extremely inflated. Completely different scenario.

 

In what situation does the full 4GB become more useful, even in SLI? Any resolution demanding enough to make 4GB of memory viable is going to require you to lower settings anyway to achieve at least 60 fps. Take 4K, for instance: in order to break over 3.5GB of VRAM usage (see above) you need some crazy AA setting like 8x MSAA, and there is no setup on the market that will run 8x AA @ 4K at 60 fps. So it's a moot point. And any game that genuinely requires more VRAM, like Shadow of Mordor, needs more than 4GB to play at 4K Ultra settings anyway.

 

You say "even for rendering use", but you don't even know if rendering is affected; you just used it as an example earlier without any idea whether it is definite fact.

 

Okay, so if you are going to make NVIDIA an example for all companies who falsely advertise, then you have to hold all companies accountable for false advertising, for every single thing -- like telling people that dual-GPU cards have twice the amount of VRAM, even though we know VRAM doesn't stack. Where are all the refunds for that? There are none, because it is ridiculous. The consumer wasn't misinformed where it counts: the performance numbers remain the same. There is nothing different about how a 970 performs in all the benchmarks the many reviewers ran. What's different? Now you know why it performs the way it does in certain scenarios; that's the only difference. The information on how it performs remains the same.

 

And reviewers were sent the same 970 that everyone else bought, and they reviewed the same 970s everyone else bought. The reviews' performance numbers stayed the same. There is nothing different about the benchmarks they did. No one was persuaded any differently after the fact. Everyone was more than capable of seeing what a 970 was able to do. If they were expecting it to do more than what the reviewers said, that is the consumer's fault for not paying attention. You say "lied to" as if the 970 benchmarks were faked, when they weren't. Stop saying lied to. The majority of people posting on the internet have no freaking idea what a ROP even is, what it does, or why having 8 fewer makes any difference.

 

You say it has nothing to do with it again because you want it to have nothing to do with it. If you cannot see the difference, again, that is your problem for taking the unreasonable side of the argument. The spec sheet is beside the point: the reviews that were done, and the performance numbers they reported, are the same. If you cannot understand this, then I don't know what else to tell you, and there is no point in continuing this discussion any further. The reviews stayed the same, the performance is the same. Nothing has changed. Nothing.

 

Reviewers got the 970 too, and let me repeat myself again: they told the entire community and the consumers exactly how the 970 performed. They ran the benchmarks. They posted the numbers. This didn't change anything. Oh, it has a little less cache, a few fewer ROPs, and 0.5GB of memory segmented differently. Big freaking deal. The card still performs exactly as all the benchmarks show.

 

If you are saying it doesn't have an impact, then why are you arguing in the first place? If you are saying it doesn't affect benchmarks, then why are you arguing in the first place? Just for the sake of arguing? All the reviews of the 970 are available for all eyes to see, so what it can and can't do is widely accepted at this point. Someone "discovering" what it can't do shouldn't be taken as a surprise.

 

Yeah, you know what happened with the Intel lawsuit: 13 years later they finally settled and everyone got $15. Do you know how ridiculous that sounds? A class-action lawsuit? There is no principle here, just annoying little children overreacting about things they don't even understand. There are plenty of examples on the internet of people testing over 3.5GB of VRAM usage without any issues (like stuttering), and about as many people saying they are experiencing issues. So it is a 50/50 stance on whether it is actually a problem. For that, I say you and many other people are taking this too far.

 


Guru3d, like all the other review websites, changed it when this whole controversy started boiling over in the last week; they originally listed it in the specifications as having 64 ROPs.

https://webcache.googleusercontent.com/search?q=cache:QNgsL0qDpdoJ:www.guru3d.com/articles-pages/gigabyte-geforce-gtx-970-g1-gaming-review,5.html+&cd=1&hl=en&ct=clnk&gl=us

Don't get me wrong, but I kind of remember reading before that it didn't have the 64 ROPs of the fully enabled chip in the 980. I'm almost sure the ROP count I have in my head came from that website.


 

 

 


 

 

 

Far Cry 4 and other games also use more than 3GB of VRAM at 4K.

 

http://www.reddit.com/r/buildapc/comments/2tu86z/discussion_i_benchmarked_gtx_970s_in_sli_at_1440p/

 

Look buddy, it's not a matter of the performance; it's a matter of Nvidia faking the specs, and that's shady as fuck.

It's a moral issue at the end of the day.

 

I think most people are mad because they were scammed and their card dies after 3.5GB of VRAM, as illustrated in the link I shared.

 

There is no principle for a class-action lawsuit?

What about false advertising and faked specs?

 

Proof:

 

 

Exhibit A:

 

The video I linked:

https://www.youtube.com/watch?v=b74MYv8ldXc

 

Exhibit B:

Fake specs at launch:

 

REVISED SPECS after this debacle

 

[Image: revised spec table]

Original specs:

 

http://webcache.googleusercontent.com/search?q=cache:http://www.guru3d.com/articles-pages/asus-geforce-gtx-970-strix-review,5.html

 

 

 

False advertising or deceptive advertising is the use of false or misleading statements in advertising, and misrepresentation of the product at hand, which may negatively affect many stakeholders, especially consumers. As advertising has the potential to persuade people into commercial transactions that they might otherwise avoid, many governments around the world use regulations to control false, deceptive or misleading advertising. "Truth" refers to essentially the same concept, that customers have the right to know what they are buying, and that all necessary information should be on the label.

False advertising, in the most blatant of contexts, is illegal in most countries. However, advertisers still find ways to deceive consumers in ways that are legal, or technically illegal but unenforceable.


Just want to reiterate that you don't need 4GB of VRAM for 4K. 3-3.5GB is plenty. The only games that use a full 4GB are those whose engines utilize all available VRAM. (So, anything that came out this November.)

 

https://drive.google.com/file/d/0B4HpnFBkAjlIUGdpNEpBcm9MOFE/view?usp=sharing

 

I didn't run anything with anti-aliasing, so you could expect an extra 500-1000 MB tops to do that.  

 

Stop panicking over nothing. The worst-case scenario is that you slightly reduce texture quality from SUPERULTRADELUXE to High.

 

(That said, a pair of 290Xs will perform better at 4K, but that was already the case.)

4K // R5 3600 // RTX2080Ti


Far Cry 4 and other games also use more than 3GB of VRAM at 4K.

 

http://www.reddit.com/r/buildapc/comments/2tu86z/discussion_i_benchmarked_gtx_970s_in_sli_at_1440p/

 

Look buddy, it's not a matter of the performance; it's a matter of Nvidia faking the specs, and that's shady as fuck.

It's a moral issue at the end of the day.

 

I think most people are mad because they were scammed.

 

People willingly bought cheaper 970s instead of 980s after looking at the DOZENS of benchmarks that were out before they bought the card. Who is getting upset again? Just a bunch of justice warriors who can't seem to understand these two simple facts:

1. We don't like that Nvidia wasn't honest about ROP and RAM allocations

2. We don't agree that the 970 is a worse card for it. The benchmarks at launch are the same ones you can perform now. The card is the exact same physical thing as before. Nothing changed in its performance. 

People like you are conflating the issues and then coming in and making shitposts about "NVIDIA IS HITLER GO BUY 290s CAUSE 4K AND SHIET" 


It's just as much a 4K card as the R9 290X and the GTX 980 are.

Those cards can play 4K at 30-40 fps, but the GTX 970 won't because of stutter, just as it won't be able to use AA or high-res textures in VRAM-hungry games.

 

It should play 4K at 30 fps, but it won't because of stutter.

The VRAM bottleneck on my GTX 670 pissed me off enough; I don't need another card that has the horsepower to drive games but makes you turn textures, AA, and resolution down because it doesn't have enough VRAM.

 

 

??, not too sure your point makes any sense. The stutter is caused by the way the RAM scales down, which is due to the cut-down process. If it weren't cut down, it would be a 980, not a 970. Ergo, unless it is a 980 or a different GPU, it can't and shouldn't be expected to do 4K.

 

What you are essentially saying is that this card would perform better if it were better.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean lack of understanding.


People willingly bought cheaper 970s instead of 980s after looking at the DOZENS of benchmarks that were out before they bought the card. Who is getting upset again? Just a bunch of justice warriors who can't seem to understand these two simple facts:

1. We don't like that Nvidia wasn't honest about ROP and RAM allocations

2. We don't agree that the 970 is a worse card for it. The benchmarks at launch are the same ones you can perform now. The card is the exact same physical thing as before. Nothing changed in its performance. 

People like you are conflating the issues and then coming in and making shitposts about "NVIDIA IS HITLER GO BUY 290s CAUSE 4K AND SHIET" 

 

The fact is, the card NVIDIA promised does NOT exist.

 

They engaged in false advertising, plain and simple.

 

And they have done nothing to appease the situation.

 

I would suggest you read my sig.

 

The card is the same, but only recently have we learned its true handicaps, which kick in after 3.5GB.

 

After that threshold, performance drops like a bag of nails.


If everyone is so hellbent on "false advertising" because NVIDIA didn't advertise correctly, why aren't we getting out the pitchforks and torches for AMD?

 

 

 

 

 

The 295X2 doesn't have 8GB of usable VRAM; it only has 4GB, since each GPU mirrors the same data in its own 4GB. So why isn't anyone getting ready to sue over that blatant advertising fraud? Everyone knows there isn't 8GB usable. If people fill their VRAM to the maximum 4GB and performance issues ensue, why isn't anyone going crazy over the fact that they said it came with 8GB when there's really only 4GB usable?

 

Makes no sense; it seems like we are making exceptions here. "It's okay for one company to falsely advertise but not the other", or "their false advertising is less severe than the other's". That just makes you a hypocrite.

This is common for dual GPUs, so people already knew it; the Titan Z, 690, and 7990 are the same. Here, though, there was no way for people to tell.
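For anyone unclear on why dual-GPU VRAM doesn't stack: under alternate-frame rendering each GPU holds its own full copy of the assets, so the usable pool is the per-GPU amount regardless of how many GPUs there are. A trivial sketch of that idea:

```python
# Usable VRAM under AFR: assets are mirrored on every GPU, so the pool
# does not grow with GPU count.
def effective_vram_gb(per_gpu_gb: float, num_gpus: int) -> float:
    """Each GPU mirrors the same data, so only one copy is usable."""
    return per_gpu_gb  # independent of num_gpus

print(effective_vram_gb(4, 2))   # R9 295X2: advertised "8GB", usable 4GB
print(effective_vram_gb(6, 2))   # Titan Z: advertised "12GB", usable 6GB
```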

