Official Nvidia GTX 970 Discussion Thread

No, I'm genuinely asking, because I don't know how you can say Victorious is being a fanboy when he, like the few people in this thread looking at this from a rational standpoint, has attempted multiple times to show you people what is and what isn't. Obviously that isn't working, so that's why I asked: do you have any reading comprehension skills?

 

My definition of "rational" is obviously much different from yours. Clearly, being rational means you flame anyone who disagrees with you or takes issue with what Nvidia has done.

Case: Phanteks Enthoo Pro | PSU: Enermax Revolution87+ 850W | Motherboard: MSI Z97 MPOWER MAX AC | GPU 1: MSI R9 290X Lightning | CPU: Intel Core i7 4790k | SSD: Samsung SM951 128GB M.2 | HDDs: 2x 3TB WD Black (RAID1) | CPU Cooler: Silverstone Heligon HE01 | RAM: 4 x 4GB Team Group 1600Mhz


My definition of "rational" is obviously much different from yours. Clearly, being rational means you flame anyone who disagrees with you or takes issue with what Nvidia has done.

I'd do the same thing if you people were being dumb about what any company is doing - I have no company bias.


I'd do the same thing if you people were being dumb about what any company is doing - I have no company bias.

 

So in that case he's not being rational. I have nothing more to say on the matter.



I'd do the same thing if you people were being dumb about what any company is doing - I have no company bias.

 

It's amusing: people have issues with being told to calm down and look at things rationally. I don't really care if you're Red or Green team; be rational about it and avoid descending into doomsday levels of nonsense about everything.

But it's easier to have a knee-jerk reaction to everything than it is to think things through. Then again, this site doesn't have a stellar population of grown-ups who take things slowly, so I'm not surprised. It's funny what "issues" people choose to get invested in while ignoring the things that are far more detrimental to their PC use and gaming experience.


We never were. But I don't hold grudges, so if you realise you're the idiot in this thread, I will treat you like everyone else again.

Hell, if @patrickjp93 managed to get me to treat him like everyone else again, I'm sure you can :P (no offense, Patrick)

How dare you treat me like one of the other plebeians on this horrid excuse for a tech and computing forum. Who do you think you are?!

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


My, my, this thread has blown up into a giant e-peen contest.

 

Are we talking about Nvidia misleading consumers or who has the biggest ego? Holy cow. 


I like how you're trying to evade the topic at hand since you have no valid response.

But.... but I thought you were the Tech Geek!

Here is a hint: Arch____ture impro______ts

 

But I guess architectures, implementations, improvements, specifications and all that lengthy non-techie BS is irrelevant, right?

Quote

The problem is that this is an nVidia product and scoring any nVidia product a "zero" is also highly predictive of the number of nVidia products the reviewer will receive for review in the future.

On 2015-01-28 at 5:24 PM, Victorious Secret said:

Only yours, you don't shitpost on the same level that we can, mainly because this thread is finally dead and should be locked.

On 2016-06-07 at 11:25 PM, patrickjp93 said:

I wasn't wrong. It's extremely rare that I am. I provided sources as well. Different devs can disagree. Further, we now have a confirmed discrepancy, via Twitter, about the use of the pre-release 1080 driver in AMD's demo despite the release 1080 driver having been out a week prior.

On 2016-09-10 at 4:32 PM, Hikaru12 said:

You apparently haven't seen his responses to questions on YouTube. He is very condescending and aggressive in his comments with which there is little justification. He acts totally different in his videos. I don't necessarily care for this content style and there is nothing really unique about him or his channel. His endless dick jokes and toilet humor are annoying as well.

 

 


But.... but I thought you were the Tech Geek!

Here is a hint: Arch____ture impro______ts

 

But I guess architectures, implementations, improvements, specifications and all that lengthy non-techie BS is irrelevant, right?

Which is exactly why specs are meaningless. Specs don't tell you anything about how different architectures compare. So again, how does reading off a spec list tell you anything of any importance? The only reason you can compare different architectures is that benchmarks exist -- unless you happen to be some sort of savant who can map out circuits and draw conclusions about the power and efficiency of an architecture with nothing but an architectural map.

 

But hey, if you can, then more power to you. 

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


Which is exactly why specs are meaningless. Specs don't tell you anything about how different architectures compare. So again, how does reading off a spec list tell you anything of any importance? The only reason you can compare different architectures is that benchmarks exist -- unless you happen to be some sort of savant who can map out circuits and draw conclusions about the power and efficiency of an architecture with nothing but an architectural map.

 

But hey, if you can, then more power to you. 

 

Agreed, I've been saying this for a while now. Specs =/= performance; they're simply an indicator of performance.

 

That being said, if the spec sheet says I'm paying for X amount of something, I'd better have that in my product, irregardless of whether the product still performs admirably.


 

 

 

The 970 and 980 still perform exactly as they did in all the benchmarks before people knew about any of this, so I don't see your point. 970 SLI is fine (most people knew it was a superior alternative to a single 980):

 

 

There are plenty of uses for these cards outside of the benchmarks, and these particular benchmarks don't show what might happen should you break the 3.5GB limit. The specs were false; nVidia chose not to come forward until people started noticing.

 

I'm not saying this is the end of the world, but a dick move? Very much so.

 

I can understand your side of the argument: had I bought the cards for 1080p gaming, I would not care. However, I do rendering myself, and it'd be interesting to know what happens with the card once you break that 3.5GB limit; I'd expect consistent performance. Sure, the counterargument of "professional cards lol" is there, but frankly the GTX series cards are faster for that use, and far more cost-efficient as a result. There are only very limited uses where the drivers of a Quadro-series card might pull ahead, and they aren't my concern.

 

Nobody runs the benchmarks long enough to fill the full 4GB, because that takes a disturbingly long time, and nobody expects the memory performance to magically drop partway through.

 

However, I do think that a customer who bought the card without this knowledge should be entitled to return a functional card and be refunded if he chooses to do so. The card might technically have 4GB of RAM, but the advertised amount of cache and ROPs is very much a lie. Still, the difference in memory is what concerns people the most, and rightfully so. For a high-end card released when it was, it has a pretty low amount of VRAM, tbh, and having any less effective memory (and it does show up in some uses, otherwise people wouldn't have noticed it in the first place) is a concern some people might have.

 

Again, I wouldn't be too pissed had I only bought one card for 1080p use like most folks. If I'd SLI'd a couple of them for higher resolutions (where, under the old specs, it was an unquestionable choice unless money was no object), I'd be slightly pissed. If I'd bought a couple for a mixture of gaming and rendering (pure rendering people would most probably get a pair of 780 6GBs), I'd be pissed.


Agreed, I've been saying this for a while now. Specs =/= performance; they're simply an indicator of performance.

 

That being said, if the spec sheet says I'm paying for X amount of something, I'd better have that in my product, irregardless of whether the product still performs admirably.

True, and I think Nvidia deserves all the flak they get for posting inaccurate specifications. But that doesn't change the fact that the 970's performance is exactly the same as it has always been, so anyone complaining about the 970 being a crippled card is just being naive.

 

Also, ew you're one of those people...'irregardless' :unsure:



Which is exactly why specs are meaningless. Specs don't tell you anything about how different architectures compare. So again, how does reading off a spec list tell you anything of any importance?

IF they WERE meaningless, then how could you distinguish the 970 from a 980?

 

The only reason you can compare different architectures is that benchmarks exist -- unless you happen to be some sort of savant who can map out circuits and draw conclusions about the power and efficiency of an architecture with nothing but an architectural map.

So that means I cannot compare the 970 with the 980. They have the same architecture. :(



True, and I think Nvidia deserves all the flak they get for posting inaccurate specifications. But that doesn't change the fact that the 970's performance is exactly the same as it has always been, so anyone complaining about the 970 being a crippled card is just being naive.

 

Also, ew you're one of those people...'irregardless' :unsure:

 

Mmmm... who knew "irregardless" was improper usage of the English language :o? Being that this is an internet forum, I think English etiquette is out the window :lol:.

 

I understand that the 970 is still the same card before and after the information leak, but it's definitely not the same card you paid for. If that makes any sense.


True, and I think Nvidia deserves all the flak they get for posting inaccurate specifications. But that doesn't change the fact that the 970's performance is exactly the same as it has always been, so anyone complaining about the 970 being a crippled card is just being naive.

 

Also, ew you're one of those people...'irregardless' :unsure:

The problem is that they should

 

The reason why I am upset about this is simple.

 

Technically there are 4 GB of RAM chips on the card and, yes, the card can use all of them.

But it was made in such a way that when the system accesses the last chip, the bandwidth drops.

 

So the problem for me is that they are walking the very edge of false advertising, and any company that starts down this route must be stopped, unless you want false advertising, or borderline false advertising, to become accepted.

 

 

The GM204 can handle 3.5 GB of RAM at full speed by its architecture; they engineered it so the last 0.5 GB is accessible, but with a performance drop. That is not truthful advertising in my eyes.
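To put a rough number on how much that last chip matters, here is a minimal back-of-envelope sketch in Python. The partition sizes and bandwidth figures used below (around 196 GB/s for the fast 3.5 GB over seven memory channels, around 28 GB/s for the slow 0.5 GB over one) are the widely reported approximations, not official Nvidia numbers, and the model ignores the driver heuristics that try to keep hot data out of the slow segment:

```python
# Back-of-envelope model of the GTX 970's segmented VRAM.
# Assumed (widely reported, unofficial) figures: the fast 3.5 GB
# partition peaks around 196 GB/s (seven memory channels) and the
# slow 0.5 GB partition around 28 GB/s (one channel); the two
# segments are treated as unable to be read in parallel.

FAST_GB, FAST_BW = 3.5, 196.0   # partition size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0

def effective_bandwidth(working_set_gb: float) -> float:
    """Average bandwidth (GB/s) over one full sweep of the working set."""
    fast = min(working_set_gb, FAST_GB)
    slow = max(0.0, min(working_set_gb - FAST_GB, SLOW_GB))
    seconds = fast / FAST_BW + slow / SLOW_BW
    return working_set_gb / seconds

print(round(effective_bandwidth(3.5), 1))  # 196.0 -- full speed inside the fast partition
print(round(effective_bandwidth(4.0), 1))  # 112.0 -- averages way down once the slow chip is touched
```

Under these assumptions a working set that stays inside 3.5 GB runs at full speed, while sweeping the whole 4 GB averages out to roughly half the headline bandwidth, which matches the kind of drop people in this thread are describing.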


Agreed, I've been saying this for a while now. Specs =/= performance; they're simply an indicator of performance.

 

That being said, if the spec sheet says I'm paying for X amount of something, I'd better have that in my product, irregardless of whether the product still performs admirably.

 

This I agree with, with one caveat: specs can only indicate performance if you understand the product history and what each spec means in that context. Ergo, you'd have a hard time working out whether a card is better or worse based on its clock speed if you don't know how many cores it has, etc. Much like clock speed for processors, a spec is only comparable across a few CPUs of the same model, let alone across brands. This makes looking at specs a waste of time unless you really know what you are looking at (i.e., you're an enthusiast).

 

 

IF they WERE meaningless, then how could you distinguish the 970 from a 980?

 

So that means I cannot compare the 970 with the 980. They have the same architecture. :(

 

Personally, I ascertain what product I am buying by looking at the serial on the chip, the name on the heatsink/shroud and the printing on the box. I figure reviewers and product testers do the same.

Grammar and spelling are not indicative of intelligence or knowledge. Not having the same opinion does not always mean a lack of understanding.


The GM204 can handle 3.5 GB of RAM at full speed by its architecture; they engineered it so the last 0.5 GB is accessible, but with a performance drop. That is not truthful advertising in my eyes.

But one can argue: isn't it the reviewers' fault for not noticing? It is their duty to give us all the information. Some sites write long articles on new architectures, yet they didn't notice anything when they did the reviews?

 

Playing devil's advocate here.


IF they WERE meaningless, then how could you distinguish the 970 from a 980?

 

So that means I cannot compare the 970 with the 980. They have the same architecture. :(

 

One word: benchmarks. Only benchmarks can tell you which card is more powerful, because they use real-life scenarios. On their own, GPU specs are next to useless, because they don't tell us how the hardware will be utilized by programs and games. I have said this before and I'll say it again: the benchmarks from before and after the incident did not change, because Nvidia did not change the physical unit (unlike Kingston). All that changed was some numbers on a piece of paper.

Read the community standards; it's like a guide on how to not be a moron.

 

Gerdauf's Law: Each and every human being, without exception, is the direct carbon copy of the types of people that he/she bitterly opposes.

Remember, calling facts opinions does not ever make the facts opinions, no matter what nonsense you pull.


IF they WERE meaningless, then how could you distinguish the 970 from a 980?

 

So that means I cannot compare the 970 with the 980. They have the same architecture. :(

Within the same architecture, specs aren't entirely useless, but performance doesn't always scale linearly in all applications. So just because one card has x more cores and a y higher clock speed doesn't mean it will be x/y faster.
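As a concrete illustration, here is a quick sketch of what a pure spec-sheet comparison predicts for these two cards. The reference figures assumed here (1664 CUDA cores at a 1050 MHz base clock for the 970, 2048 cores at 1126 MHz for the 980) are the commonly quoted launch specs; boost clocks, memory bandwidth, and ROP/cache differences are deliberately ignored, which is exactly why the number overshoots:

```python
# Naive spec-sheet comparison of the GTX 970 and GTX 980.
# Assumed reference figures: 970 = 1664 CUDA cores @ 1050 MHz base,
# 980 = 2048 CUDA cores @ 1126 MHz base. Boost behavior, memory
# bandwidth, and ROP/cache differences are all ignored on purpose.

def theoretical_ratio(cores_a: int, mhz_a: int, cores_b: int, mhz_b: int) -> float:
    """Raw shader throughput (cores x clock) of card B relative to card A."""
    return (cores_b * mhz_b) / (cores_a * mhz_a)

gtx970 = (1664, 1050)
gtx980 = (2048, 1126)

gap = (theoretical_ratio(*gtx970, *gtx980) - 1) * 100
print(f"980 is {gap:.0f}% faster than the 970 on paper")  # ~32%
```

The roughly 32% paper gap is noticeably larger than the deltas most reviews measured in actual games, which is the non-linear scaling being described above.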

 

Mmmm... who knew "irregardless" was improper usage of the English language :o? Being that this is an internet forum, I think English etiquette is out the window :lol:.

 

I understand that the 970 is still the same card before and after the information leak, but it's definitely not the same card you paid for. If that makes any sense.

It's not improper, it's just redundant :P 

 

I agree with you there too; the internet/forums are not a place where grammar is really important, especially when the forums are international.

 

And yes, it makes sense, but at the same time, you should base a decision on (realistic) benchmarks/real-world performance more than numbers. 

 

The problem is that they should

 

The reason why I am upset about this is simple.

 

Technically there are 4 GB of RAM chips on the card and, yes, the card can use all of them.

But it was made in such a way that when the system accesses the last chip, the bandwidth drops.

 

So the problem for me is that they are walking the very edge of false advertising, and any company that starts down this route must be stopped, unless you want false advertising, or borderline false advertising, to become accepted.

 

The GM204 can handle 3.5 GB of RAM at full speed by its architecture; they engineered it so the last 0.5 GB is accessible, but with a performance drop. That is not truthful advertising in my eyes.

Like I said, Nvidia deserves to take flak for this, but that's not really what's going on here.



Who is ever 4" away from their GPU??

 

Whoever or wherever told you that must clearly be mistaken...

That's measured from a meter away, so a lot of accuracy is lost. A reference 290X at 40 dBA in quiet mode, only a tiny bit louder than a 970, is also BS; feel free to find people who claimed their reference card was quiet. And the reference 780 is quieter than the Vapor-X according to your graph; I've owned them and they're freaking loud, hence why I upgraded to a 970 Strix.
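On the "who is ever 4 inches from their GPU" point: noise figures quoted at one meter can be roughly projected to other distances with the free-field inverse-square approximation, where every halving of distance adds about 6 dB. A small sketch, with the caveat that a cooler inside a reflective case is nothing like an ideal point source in free air:

```python
import math

# Free-field inverse-square approximation for a point source:
# every halving of the distance adds about 6 dB. A GPU cooler in a
# reflective case is not an ideal point source, so treat the result
# as a rough bound, not a measurement.

def spl_at(distance_m: float, ref_spl_db: float, ref_distance_m: float = 1.0) -> float:
    """Sound pressure level at distance_m, given a level measured at ref_distance_m."""
    return ref_spl_db - 20.0 * math.log10(distance_m / ref_distance_m)

# A card measured at 40 dBA from 1 m, heard from ~4 inches (0.1016 m):
print(round(spl_at(0.1016, 40.0), 1))  # 59.9
```

Under this approximation, a card rated 40 dBA at a meter would be closer to 60 dBA from four inches away, which is part of why at-a-meter numbers understate how loud a card seems at a desk.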

 


The problem is that they should

 

The reason why I am upset about this is simple.

 

Technically there are 4 GB of RAM chips on the card and, yes, the card can use all of them.

But it was made in such a way that when the system accesses the last chip, the bandwidth drops.

 

So the problem for me is that they are walking the very edge of false advertising, and any company that starts down this route must be stopped, unless you want false advertising, or borderline false advertising, to become accepted.

 

 

The GM204 can handle 3.5 GB of RAM at full speed by its architecture; they engineered it so the last 0.5 GB is accessible, but with a performance drop. That is not truthful advertising in my eyes.

 

Wholeheartedly agree.

 

Consumers should not be complacent about tactics such as false advertising, whether borderline or blatant. Business is more than just the transfer of goods between two parties; it's a relationship between consumer and seller. Allowing things like this to happen ruins the dynamic of that relationship.


 

It is the same... it couldn't do 4K before and it still can't now. That pile of turd you linked even showed that in Battlefield 4 at 4K the average was 30 fps, which is not playable regardless of memory issues. Things really went to shit when he used Watch Dogs and Far Cry 4 as test scenarios, both of which are known to have all sorts of optimisation issues. The facts remain the same: the 970 was never a 4K card; 1440p is its max, and TBH it's OK at that, nothing mind-blowing. Seriously, it's getting embarrassing how badly this is being blown out of proportion. Yes, information has been misrepresented, but you lot are acting like they advertised a 980 and sold you a 750 Ti.

"if nothing is impossible, try slamming a revolving door....." - unknown

my new rig bob https://uk.pcpartpicker.com/b/sGRG3C#cx710255

Kumaresh - "Judging whether something is alive by it's capability to live is one of the most idiotic arguments I've ever seen." - jan 2017


But one can argue: isn't it the reviewers' fault for not noticing? It is their duty to give us all the information. Some sites write long articles on new architectures, yet they didn't notice anything when they did the reviews?

 

Playing devil's advocate here.

No, it is not: some did notice the frame drops in SLI setups, but games today rarely use more than 3.5GB of VRAM.

It is purely nVidia's fault for dancing on the edge of false advertising.

 

@LukaP It does not matter whether a game is well written or not from a consumer's standpoint; you would have a card with 4GB of RAM, so it should handle it.


 

There are plenty of uses for these cards outside of the benchmarks, and these particular benchmarks don't show what might happen should you break the 3.5GB limit. The specs were false; nVidia chose not to come forward until people started noticing.

 

I'm not saying this is the end of the world, but a dick move? Very much so.

 

I can understand your side of the argument: had I bought the cards for 1080p gaming, I would not care. However, I do rendering myself, and it'd be interesting to know what happens with the card once you break that 3.5GB limit; I'd expect consistent performance. Sure, the counterargument of "professional cards lol" is there, but frankly the GTX series cards are faster for that use, and far more cost-efficient as a result. There are only very limited uses where the drivers of a Quadro-series card might pull ahead, and they aren't my concern.

 

Nobody runs the benchmarks long enough to fill the full 4GB, because that takes a disturbingly long time, and nobody expects the memory performance to magically drop partway through.

 

However, I do think that a customer who bought the card without this knowledge should be entitled to return a functional card and be refunded if he chooses to do so. The card might technically have 4GB of RAM, but the advertised amount of cache and ROPs is very much a lie. Still, the difference in memory is what concerns people the most, and rightfully so. For a high-end card released when it was, it has a pretty low amount of VRAM, tbh, and having any less effective memory (and it does show up in some uses, otherwise people wouldn't have noticed it in the first place) is a concern some people might have.

 

Again, I wouldn't be too pissed had I only bought one card for 1080p use like most folks. If I'd SLI'd a couple of them for higher resolutions (where, under the old specs, it was an unquestionable choice unless money was no object), I'd be slightly pissed. If I'd bought a couple for a mixture of gaming and rendering (pure rendering people would most probably get a pair of 780 6GBs), I'd be pissed.

 

The reason the benchmarks don't show what might happen when you break past the 3.5GB barrier is that the card is not powerful enough to drive the settings required to push past it. That's why, when running 4K even in SLI, you see reviewers disabling AA: the card simply isn't capable of it. All of a sudden, people are getting distraught that their card has difficulty running 4K DSR, 8x MSAA, Ultra textures, and every other setting maxed out, and isn't performing perfectly. That scenario would hurt any card, not just the 970. Increasing settings far enough to exceed 3.5GB of VRAM usage will hurt performance anyway, so proving whether a slowdown is directly caused by the 0.5GB memory section is going to be difficult. They might have misrepresented the ROP count and segmented the 4GB of memory, but why is everyone up in arms over it now? It makes no sense. We knew what the card was capable of before this information surfaced; now we just know why, and that suddenly makes it a problem? It's like demanding a refund because the 3GB GTX 780 struggled in certain games that demanded more VRAM while the 6GB GTX 780 had no problem with them, and then demanding that NVIDIA and other manufacturers replace your 3GB GTX 780 with a 6GB one, because you tried to run settings we already knew the card couldn't handle beforehand.

 

It might be a dick move, but there have been many dick moves in this industry over the past couple of years. Remember the mining craze, when retailers were charging $800 for an R9 290X? The card retailed for $550, and now it goes for $300. Where was everyone getting ready to sue over price gouging, demanding back the difference? What about when AMD shipped a reference cooler that couldn't effectively cool the 290s and 290Xs? To keep the card from throttling you had to run the fan at 100%, which made it sound like a jet engine. Why didn't anyone complain about a functionality issue then? I mean, they did complain, but not to the point of demanding refunds in hordes. That was obviously a hardware design issue, where the reference cooler just wasn't built properly for the card. Tons of people complained, but I don't recall there being an uproar of people demanding their money back to the extent of this issue.

 

Have you tested the card in rendering when you go over the 3.5GB limit? It's not exactly an issue until you test it and discover it is one. For all you know it might not be an issue at all, and even mentioning it as an argument against them is pointless.

 

Except I've seen a handful of people test a decent number of games up to 4GB of usage and report no problems at all, and another handful report problems. So who is right, when there's a 50/50 split on the issue?

 

People think they are entitled to a lot of things. However, I think people aren't giving NVIDIA the benefit of the doubt, as if they aren't going to make this right. They will; they have no choice. They currently hold the market-share lead over AMD and can't afford to lose it over one mishap. If they do, they will be in trouble. So before anyone demands this or that, they should wait and see what NVIDIA, or the card manufacturers themselves, are going to do, because you have to believe this is all being discussed right now behind closed doors. There will be a solution; there just isn't one yet. These things take time, and people aren't having any patience.

 

Well, this is where we differ: I don't see why people should be pissed. The benchmarks for the card are available in troves from many different sources. If you researched your card beforehand and knew what it was capable of, you wouldn't be pissed, since this information has been available for many months. If you expected anything different, I really don't understand. The only difference between now and then is that we actually know why it performs the way it does, which doesn't change the facts of how it performs, just adds to the understanding of why.


snip

 

I think many of us here are trying to say that this is an Nvidia problem, not a 970 problem. You may link as many benchmarks and performance figures as you'd like, but that still doesn't rectify the problem of misleading information.


Yes, you and people like you need to grow up. If those "real" specs had never been researched and released, none of you would be bitching about the 970. Why? Because it's a good card. Nothing has changed since its release. Every contracted reviewer gave it a thumbs up due to its performance per watt, its overall performance in general, and its price point.

How are you going to tell me I'm defending a company that doesn't care about me? I'm looking at this rationally, so don't call me a fanboy or someone defending a company.

For fuck's sake, I was supposedly "defending Ubisoft" for doing something good despite the p.o.s. game they released (Unity). You can be objective and rational and not be a "fanboy" or a company's white knight.

Edit: I also doubt that they're throwing their marketing team under the bus. When you read something like that on the internet with little context, it's easy to oversensationalize its actual meaning. It seems like a simple miscommunication, possibly. It happens.

 

 

I'd accept that argument if the performance didn't fall to shit after hitting 3.5GB+ on the card. From most of the content I've seen, the GPU starts to stutter and glitch when pushing over 3.5GB of VRAM usage, even though when it's not stuttering and glitching the game would be completely playable at 30+ fps.

 

My argument is that they lied. It doesn't actually work as intended: you push the card to its limits to play your favorite games and it breaks. You seem to believe it's OK just because most games are playable at 1080p? I call some mad bullshit, yo.

CPU: Intel i5 4690k W/Noctua nh-d15 GPU: Gigabyte G1 980 TI MOBO: MSI Z97 Gaming 5 RAM: 16Gig Corsair Vengance Boot-Drive: 500gb Samsung Evo Storage: 2x 500g WD Blue, 1x 2tb WD Black 1x4tb WD Red

 

 

 

 

"Whatever AMD is losing in suddenly becomes the most important thing ever." - Glenwing, 1/13/2015

 

