I will explain you why nvidia lied to you.

zappian

Ok, but let's agree to use proper grammar as well, ok?

Hey look everybody we got a grammar Nazi here.


English is not my native language asshole.

 

I think my grammar was just fine for you to understand what i said.

You do seem quick to reply though? It doesn't have to be your native language, but you seem to know it well enough to capitalize an 'I.'

 

Not hate, but consructive criticism!


Hey look everybody we got a grammar Nazi here.

we've*


You do seem quick to reply though? It doesn't have to be your native language, but you seem to know it well enough to capitalize an 'I.'

 

Not hate, but consructive criticism!

 

That was your problem , a capital letter?


That was your problem , a capital letter?

Having a lowercase 'i' greatly degrades your perceived intelligence when all we know about you is how well you understand basic grammar.


The 970 is prone to VRAM stutter at high resolutions because it has 500 MB of donkey-speed memory.
Nvidia proved nothing when they launched benchmarks without the minimum-fps figures that would expose the stutter.
Just damage control that people who don't understand anything about GPUs will eat up.
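To make the minimum-fps point concrete, here is a rough Python sketch. The frame times are invented illustrative numbers, not measurements; the point is that an average-fps column dilutes hitches that a 1%-low (minimum-fps) column would surface.

```python
# Illustrative only: a run of mostly smooth 60 fps frames with a few long
# hitches, the pattern reported when a frame spills into slow memory.
frame_times_ms = [16.7] * 95 + [80.0] * 5

# Average fps looks healthy because the hitches are diluted.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# 1% low fps: the average of the slowest 1% of frames, which is
# exactly the stat a minimum-fps column would expose.
n = max(1, len(frame_times_ms) // 100)
slowest = sorted(frame_times_ms)[-n:]
low_1pct_fps = 1000 / (sum(slowest) / n)

print(f"average fps: {avg_fps:.1f}")       # ~50 fps, looks acceptable
print(f"1% low fps:  {low_1pct_fps:.1f}")  # 12.5 fps, the stutter
```

With made-up numbers like these, the average says "50 fps, fine" while the 1% low says "unplayable hitching", which is why benchmarks without minimums hide the problem.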

 

The real irony is that the problem was discovered by gamers playing high-memory-usage games and getting stuttering, frame drops, and hitching.

 

Nvidia tries to turn it around like the problem was discovered through other means and it's no big deal for gaming, when it was discovered because of gaming issues, lol.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


we've*

Nobody appreciates an ass clown troll being a Grammar Nazi !


The real irony is that the problem was discovered by gamers playing high-memory-usage games and getting stuttering, frame drops, and hitching.

 

Nvidia tries to turn it around like the problem was discovered through other means and it's no big deal for gaming, when it was discovered because of gaming issues, lol.

 

I like you, you have common sense.

 

I totally agree.


Okay mom  :( .

Sure thing, sweetie.

 

Nobody appreciates an ass clown troll being a Grammar Nazi !

We should all appreciate learning how to improve grammar, becuz this inst a god way 2 spel ,even on teh internet.

 

Not being a troll!


Sure thing, sweetie.

 

We should all appreciate learning how to improve grammar, becuz this inst a god way 2 spel ,even on teh internet.

 

Not being a troll!

This is not Grammar School and your not the teacher.


This is not Grammar School and your not the teacher.

Uh... you're*

 

The entire internet needs to understand two basic things: the difference between your and you're, and between there, their, and they're.


Uh... you're*

 

The entire internet needs to understand two basic things: the difference between your and you're, and between there, their, and they're.

 

You are the biggest troll on this forum and i love it.


Because you should never trust a company's own benchmarks. Look up Corsair's RAM benchmarks: 2400MHz RAM supposedly yields something like a 20fps improvement in BF4 over standard 1600MHz.

Or how about MPG? Listed MPG is ALWAYS higher than what the car actually gets (and not within the margin of error). All companies use incredibly idealized benchmarks that NEVER occur in the real world, or even in normal benchmarks for that matter. This is why LTT/AnandTech/TechPowerUp/etc. exist; otherwise companies could just advertise their own products with their own benchmarks.

Ignore them and let them do their own thing. It's funny to me that the people bitching the most are those who don't own the card. Their opinions are irrelevant to those of us who do have the card. I buy from Micro Center, so I could return this card if I wanted, no questions asked, but there's no point; that would be the most short-sighted thing I've ever done. Ignore fanboys, go your own way. :)

You are the biggest troll on this forum and i love it.

:angry:


Ignore them and let them do their own thing. It's funny to me that the people bitching the most are those who don't own the card. Their opinions are irrelevant to those of us who do have the card. I buy from Micro Center, so I could return this card if I wanted, no questions asked, but there's no point; that would be the most short-sighted thing I've ever done. Ignore fanboys, go your own way. :)

 

The card is excellent; I just call Nvidia's business ethics into question.

 

If you like your card, by all means keep it, as long as you are aware of what Nvidia did.

 

Don't get me wrong, I'm a big proponent of people using whatever does the job for them.


I'd just like to point out the review Guru3D published back in September... http://www.guru3d.com/articles-pages/asus-geforce-gtx-970-strix-review,5.html

 

 

NJjhPSB.png

 

So there's nothing new here, and only some of the reviewers' notes list it as 64 ROPs.


You're wrong there, man. This was also likely updated today. Look at the cached version of the page from two days ago, where it lists the 970 as having 64 ROPs:

 

http://webcache.googleusercontent.com/search?q=cache:http://www.guru3d.com/articles-pages/asus-geforce-gtx-970-strix-review,5.html


g0OHl7t.png


This was also likely updated today. Look at the cached website here where it lists the 970 as having 64 ROPs

 

http://webcache.googleusercontent.com/search?q=cache:http://www.guru3d.com/articles-pages/asus-geforce-gtx-970-strix-review,5.html

 

Wow, nice find.

 

So I was right.


I'm sorry did your parents spank you if you didn't have perfect grammar?

Legit question.

m8 donut lizten to him

You're grammar are good, your a dank dud


I'm sorry did your parents spank you if you didn't have perfect grammar?

 

Legit question.

No, I just found it to be a natural part of being a writer.

 

Taking an AP English class next year, and an Honors course this year.


Tomorrow everyone will forget and will buy their cards like nothing happened

 

However

 

58502674.jpg


Hello, I am new around here. Just wanted to point out a few things:

 

1) The disabled ROPs do matter. Nvidia is right that pixel fillrate is normally bottlenecked by the SMs, but for work that hits the ROPs alone, such as anti-aliasing, it's a big deal. If anyone (like myself) bought this card expecting it to take a minimal performance hit in future titles with AA on, thanks to having more ROPs, that is no longer the case.

 

2) The 3.5 GB and 0.5 GB segments do not differ only in bandwidth. When the GPU works with the 0.5 GB segment, it cannot access the 3.5 GB segment during the same cycle. It's either/or. Or XOR, as the folks at AnandTech pointed out in their article. So not only is accessing the 0.5 GB segment a whole lot slower, but while it is in use, your effective bandwidth to the main memory partition takes a hit too.

 

3) The lack of additional L2 cache can affect GPGPU performance.

 

4) The current benchmarks tell only half the story. The GTX 970's performance is great, and it's as good as it was a month ago. BUT the card's longevity is now in question. I bought this card to get great performance now and to be able to play with AA on for at least two years. But what happens when games need all 4 GB, and need all of it at high bandwidth? The GTX 970 is only 15% slower than the GTX 980 today. In a year that gap could be 30%; in two years, 45%. The more memory-intensive games become, the bigger the performance gap will grow, and the higher the scene complexity, the more load AA will put on the ROPs. That is the sad truth about unconventional asymmetric GPU designs: they are good at the time they are designed, but their performance falls off a cliff after a certain point, while conventional symmetrical designs degrade smoothly.

 

5) It is now clear that for the card to work well, a high degree of driver/API/OS optimisation is needed. What happens when the next generation of cards comes along and Nvidia no longer wants to spend money optimising new games for the GTX 970? What happens when a game comes out that does not work correctly with the extra RAM and needs driver fixes to behave properly on the GTX 970? Will Nvidia spend resources optimising for that game, and put in as much effort on GTX 970 drivers at the expense of their other products, when that kind of optimisation is only needed for the GTX 970?
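A back-of-the-envelope sketch of point 1, in Python. The SMM pixel rate and the clock below are assumptions for illustration (13 SMMs at 4 pixels/clock, a 1.1 GHz clock), not official figures:

```python
SMM_COUNT = 13        # GTX 970 SMM count (assumed for illustration)
PIXELS_PER_SMM = 4    # assumed pixels/clock each SMM can emit
CLOCK_GHZ = 1.1       # illustrative clock, not an official spec

def pixel_fillrate_gpixs(rops: int) -> float:
    # Ordinary rendering is limited by the smaller of ROP throughput
    # and what the SMMs can feed them (Nvidia's argument).
    return min(rops, SMM_COUNT * PIXELS_PER_SMM) * CLOCK_GHZ

# With the SMMs feeding only 52 pixels/clock, 56 vs 64 ROPs is a wash:
assert pixel_fillrate_gpixs(56) == pixel_fillrate_gpixs(64)

# But ROP-only work such as an MSAA resolve scales with ROP count alone,
# so there the 8 missing ROPs cost a real 12.5%:
resolve_deficit = 1 - 56 / 64
print(f"ROP-bound deficit: {resolve_deficit:.1%}")
```

Under these assumptions both claims can be true at once: fillrate benchmarks show no difference, yet ROP-bound AA work still loses the full 12.5%.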

 

The main problem is not the design of the GTX 970; it is what it is. The main problem is that, despite the card's performance, I would not have bought it, because I am unwilling to be a lab rat for Nvidia's experiments with unconventional GPU design. If I had known about this, I would not have bought a GTX 970, not because of its performance (it's the same GPU either way), but because of the concerns stated above.
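The either/or access described in point 2 can be roughly modelled too. Assuming illustrative bandwidths of ~196 GB/s for the 3.5 GB segment and ~28 GB/s for the 0.5 GB one (these are example numbers, not measured specs), and that only one segment can be serviced at a time, effective bandwidth is a harmonic-style blend of the two:

```python
FAST_BW = 196.0  # GB/s for the 3.5 GB segment (illustrative figure)
SLOW_BW = 28.0   # GB/s for the 0.5 GB segment (illustrative figure)

def effective_bw(frac_slow: float) -> float:
    """Total bytes over total time when a fraction `frac_slow` of the
    traffic must go through the slow segment and the two segments
    cannot be accessed in the same cycle (the XOR behaviour)."""
    time_per_byte = (1 - frac_slow) / FAST_BW + frac_slow / SLOW_BW
    return 1.0 / time_per_byte

print(effective_bw(0.0))    # 196 GB/s while everything fits in 3.5 GB
print(effective_bw(0.125))  # ~112 GB/s once a full 4 GB is in play
```

So under these assumed numbers, routing just the last eighth of a 4 GB working set through the slow segment drags effective bandwidth from 196 down to about 112 GB/s, which is the "hit to the main partition" the AnandTech article described.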


Tomorrow everyone will forget and will buy their cards like nothing happened

 

However

 

58502674.jpg

 

 


 

People are already ignoring this.

When people say AMD GPUs are shit because their drivers are crap, I will throw this in their face, if you don't mind.

