
Official Nvidia GTX 970 Discussion Thread

Hypocrite.

 

[attachment: analogy.png]

 

Busted.

 

I liked it because it was funny, and believe it or not, I don't want to see humor go out of the forums; I just don't want to see it coupled with an overly derisive comment that reflects on other forum members.

 

I don't mind the 3.5/4 jokes or the 1=1=3 jokes or the space heater jokes, so long as they aren't every second post and aren't degrading to the forum in general.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


I liked it because it was funny, and believe it or not, I don't want to see humor go out of the forums; I just don't want to see it coupled with an overly derisive comment that reflects on other forum members.

 

I don't mind the 3.5/4 jokes or the 1=1=3 jokes or the space heater jokes, so long as they aren't every second post and aren't degrading to the forum in general.

 

It was pretty funny.

The Internet is the first thing that humanity has built that humanity doesn't understand, the largest experiment in anarchy that we have ever had.


I just want my money back, and they won't give it to me. I'm very disappointed in Nvidia.

Why? After the announcement that the card couldn't access 0.5GB of VRAM at full speed, did your GPU all of a sudden get slower? Are you getting less performance than Nvidia said it would? Or do you simply feel that your friends would be less impressed after mentioning it has fewer ROPs and less quickly accessible VRAM? Did you buy the card because it has 4GB of VRAM, or did you buy it because it was a great performer in most games?

Finally my Santa hat doesn't look out of place


I'm talking about Microsoft asking to use Mantle, which was what Richard Huddy claimed, several times.

I don't have to prove shit - you are the one claiming what no official party claims: that Richard Huddy, and others, are lying.

Link me where I said Richard Huddy was lying. I was asking you to prove your goddamn claims just for freaking once, instead of just fantasizing about what you think Richard Huddy is saying.

Given the way Microsoft has presented Direct3D 12, it's hard not to draw parallels with AMD's Mantle API. Mantle was introduced last September, and much like D3D12, it provides a lower level of abstraction that lets developers write code closer to the metal. The result, at least in theory, is lower CPU overhead and better overall performance—the same perks Microsoft promises for D3D12.

 

The question, then, almost asks itself. Did AMD's work on Mantle motivate Microsoft to introduce a lower-level graphics API?

 

When I spoke to AMD people a few hours after the D3D12 reveal, I got a strong sense that that wasn't the case—and that it was developers, not AMD, who had spearheaded the push for a lower-level graphics API on Windows. Indeed, at the keynote, Microsoft's Development Manager for Graphics, Anuj Gosalia, made no mention of Mantle. He stated that "engineers at Microsoft and GPU manufacturers have been working at this for some time," and he added that D3D12 was "designed closely with game developers."

 

I then talked with Ritche Corpus, AMD's Software Alliances and Developer Relations Director. Corpus told me that AMD shared its work on Mantle with Microsoft "from day one" and that parts of Direct3D 12 are "very similar" to AMD's API. I asked if D3D12's development had begun before Mantle's. Corpus' answer: "Not that we know." Corpus explained that, when AMD was developing Mantle, it received no feedback from game developers that would suggest AMD was wasting its time because a similar project was underway at Microsoft. I recalled that, at AMD's APU13 event in November 2013, EA DICE's Johan Andersson expressed a desire to use Mantle "everywhere and on everything." Those are perhaps not the words I would have used if I had known D3D12 was right around the corner.

 

The day after the D3D12 keynote, I got on the phone with Tony Tamasi, Nvidia's Senior VP of Content and Technology. Tamasi painted a rather different picture than Corpus. He told me D3D12 had been in the works for "more than three years" (longer than Mantle) and that "everyone" had been involved in its development. As he pointed out, people from AMD, Nvidia, Intel, and even Qualcomm stood on stage at the D3D12 reveal keynote. Those four companies' logos are also featured prominently on the current landing page for the official DirectX blog:

http://techreport.com/review/26239/a-closer-look-at-directx-12

Ritche Corpus claims Mantle has been part of D3D12 since the beginning, i.e. years ago, whereas Richard Huddy claims:

"Development on DirectX 12's new features may have begun before Mantle"

http://techreport.com/news/26922/amd-hopes-to-put-a-little-mantle-in-opengl-next

Microsoft says game developers have been begging for an API that would go lower-level for more performance, and we have some nice proof of this:

'It's funny,' says AMD's worldwide developer relations manager of its GPU division, Richard Huddy. 'We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way.' Huddy says that one of the most common requests he gets from game developers is: 'Make the API go away.'

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1
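To put some rough numbers on Huddy's point about the API "getting in the way", here is a back-of-the-envelope sketch of how per-draw-call CPU overhead caps what a renderer can submit each frame. The microsecond costs are made-up ballpark assumptions for illustration, not figures from the article:

```python
# Rough model: how many draw calls fit in one frame if the CPU spent
# its entire frame budget just submitting them. The per-call costs
# below are hypothetical ballpark values, not measurements.

def draw_calls_per_frame(frame_budget_ms, cost_per_call_us):
    """Draw calls that fit if submission alone fills the frame budget."""
    return int(frame_budget_ms * 1000 / cost_per_call_us)

frame_ms = 1000 / 60  # ~16.7 ms per frame at 60fps

# Assumed 30 microseconds per call for a thick, high-overhead API
# versus 3 microseconds for a thin, close-to-the-metal one.
print(draw_calls_per_frame(frame_ms, 30))
print(draw_calls_per_frame(frame_ms, 3))
```

Under these assumed costs the thin API submits roughly ten times as many draw calls per frame, which is the kind of gap the "make the API go away" complaint is about.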

 

 

I don't know where the fuck you got that information, LOL - DX 8.0? LOOOOL. If you want a really low-level API, check the Glide API.

So you can quote it for the second time:

https://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/what_is_directx.mspx?mfr=true

 

Microsoft.com, dude. The only difference is a lower abstraction level, meaning going closer to the metal. For freaking once, stop thinking Microsoft needs AMD's help, or anyone's help, to do this; the APIs they designed for all of their consoles are much lower-level than Mantle/DirectX 12.

 

 

And again you, like many others, live in the wonderland where companies copy shit from each other without any legal issues, and still manage to develop a solution two years before the ones who were originally developing it, and still get almost 100 devs supporting it - RIGHT! LOOOOL

 

Haha you're literally bogus. 

Take that shit to private messages; it's not relevant to this thread.


Whilst I am enjoying it, I have to agree it is really off topic.


Why? After the announcement that the card couldn't access 0.5GB of VRAM at full speed, did your GPU all of a sudden get slower? Are you getting less performance than Nvidia said it would? Or do you simply feel that your friends would be less impressed after mentioning it has fewer ROPs and less quickly accessible VRAM? Did you buy the card because it has 4GB of VRAM, or did you buy it because it was a great performer in most games?

 

Some people bought 970 SLI for 4k gaming and now are royally screwed.

 

Hell even at 1440p some games are pushing 3-5 GB+ VRAM.

 

Even at 1080p we have games like Shadow of Mordor and Dying Light doing the same.

 

Have some compassion perhaps.


I think that I have distanced myself from this discussion fairly well, but now I really have to say that, no matter what, saying that what Nvidia did with the spec sheets is okay is just absolutely idiotic, and it just proves how deep-rooted brand loyalty makes people worship a certain brand no matter what kind of shit they pull. Go ahead and call me a raging fanboy too; I'll gladly relinquish my Red Team member status the very moment AMD pulls shit like this, just like I dropped Intel after the Prescott line-up.

FX 6300 @4.8 Ghz - Club 3d R9 280x RoyalQueen @1200 core / 1700 memory - Asus M5A99X Evo R 2.0 - 8 Gb Kingston Hyper X Blu - Seasonic M12II Evo Bronze 620w - 1 Tb WD Blue, 1 Tb Seagate Barracuda - Custom water cooling


I do own a GTX 970 Asus Strix and it plays Battlefield 4 at mostly 60fps without microstutter at 4K, with mixed settings but ultra textures, lighting and more. Yes, I can prove it to you.

Still, I'm disappointed in Nvidia because I like their cards a lot.

I think the card is still worth buying for 4K and I will buy a second one soon.

[Edit]

Meant to say GTX 970!


Nvidia should compensate GTX 970 owners with some free games or a discount coupon on their next purchase. There is a petition for a refund on the GTX 970, if anyone didn't know yet.

https://www.change.org/p/nvidia-refund-for-gtx-970

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


Nvidia should compensate GTX 970 owners with some free games or a discount coupon on their next purchase. There is a petition for a refund on the GTX 970, if anyone didn't know yet.

https://www.change.org/p/nvidia-refund-for-gtx-970

 

I just don't think that's gonna happen; to do so would be an admission of guilt.

"if nothing is impossible, try slamming a revolving door....." - unknown

my new rig bob https://uk.pcpartpicker.com/b/sGRG3C#cx710255

Kumaresh - "Judging whether something is alive by it's capability to live is one of the most idiotic arguments I've ever seen." - jan 2017


I just don't think that's gonna happen; to do so would be an admission of guilt.

 

And admitting guilt in that manner would open them up to proper lawsuits, which they will never do.

 

I know the internet wants blood, but you're not going to get it. Nvidia might have idiots in their marketing department, but their legal team is less stupid.


And admitting guilt in that manner would open them up to proper lawsuits, which they will never do.

 

I know the internet wants blood, but you're not going to get it. Nvidia might have idiots in their marketing department, but their legal team is less stupid.

 

I honestly think a lawsuit against Nvidia would be hard for people to win. Nvidia were very dumb, but there was no malice in their actions and it didn't really misrepresent the product; it still performs well and was doing fine before all this came out. It's a shame, really; the 970 is a great card.


I honestly think a lawsuit against Nvidia would be hard for people to win. Nvidia were very dumb, but there was no malice in their actions and it didn't really misrepresent the product; it still performs well and was doing fine before all this came out. It's a shame, really; the 970 is a great card.

 

Yup, that's what makes it so hard. Legally, this isn't a clear-cut case of anything. It's too murky for a very successful lawsuit (either individually or by a class). You can sue for willful negligence; it's hard to sue for idiocy between two departments. Nvidia would have a strong case for straight-up dismissal, to be honest.


Yup, that's what makes it so hard. Legally, this isn't a clear-cut case of anything. It's too murky for a very successful lawsuit (either individually or by a class). You can sue for willful negligence; it's hard to sue for idiocy between two departments. Nvidia would have a strong case for straight-up dismissal, to be honest.

If there was a huge hit from the RAM in real-world games then I think this would be much easier, but 1-3% just ain't enough... I mean, it's within the margin of error. I have run my 970 on Far Cry 4 with DSR and no AA and got the RAM up to 3.8GB-ish; never once did I see a stutter at 60fps. I just don't think there is an issue.


I do own a GTX 960 Asus Strix and it plays Battlefield 4 at mostly 60fps without microstutter at 4K, with mixed settings but ultra textures, lighting and more. Yes, I can prove it to you.

Still, I'm disappointed in Nvidia because I like their cards a lot.

I think the card is still worth buying for 4K and I will buy a second one soon.

 

Seriously, recommending a 960 for 4K? I may rarely go above 3.4GB of VRAM usage, but the only games that use comfortably less than 2GB are decidedly old - games like Mass Effect.

 

That's not something I could agree with, however well it may perform in Battlefield 4. The 280X would be a better choice, but not by a lot.


I do own a GTX 960 Asus Strix and it plays Battlefield 4 at mostly 60fps without microstutter at 4K, with mixed settings but ultra textures, lighting and more. Yes, I can prove it to you.

Still, I'm disappointed in Nvidia because I like their cards a lot.

I think the card is still worth buying for 4K and I will buy a second one soon.

 

Did you mean to say 970? Pretty sure a GTX 960 @ 4K is a pipe dream.




The usual traditional video that comes out when there is controversy in the tech world. I can't get enough of these videos. LOL. (BTW, for the people who are going to report me: I already had this cleared by Slick.)

Too many ****ing games!  Back log 4 life! :S


I think that I have distanced myself from this discussion fairly well, but now I really have to say that, no matter what, saying that what Nvidia did with the spec sheets is okay is just absolutely idiotic, and it just proves how deep-rooted brand loyalty makes people worship a certain brand no matter what kind of shit they pull. Go ahead and call me a raging fanboy too; I'll gladly relinquish my Red Team member status the very moment AMD pulls shit like this, just like I dropped Intel after the Prescott line-up.

I don't think anyone in this thread (or on this forum, for that matter) is actually defending them, for the 100,000th time.

I haven't actually seen someone say "there's nothing wrong with them giving out false information."

Are you and everyone else still taking "shut up about your hate for the 970" as defending Nvidia?


Seriously, recommending a 960 for 4K? I may rarely go above 3.4GB of VRAM usage, but the only games that use comfortably less than 2GB are decidedly old - games like Mass Effect.

That's not something I could agree with, however well it may perform in Battlefield 4. The 280X would be a better choice, but not by a lot.

Oops, I meant GTX 970!

Did you mean to say 970? Pretty sure a GTX 960 @ 4K is a pipe dream.

Yes, I did mean to say GTX 970, haha.


I don't think anyone in this thread (or on this forum, for that matter) is actually defending them, for the 100,000th time.

I haven't actually seen someone say "there's nothing wrong with them giving out false information."

Are you and everyone else still taking "shut up about your hate for the 970" as defending Nvidia?

 

I was going to say something, but I didn't want to perpetuate this thread with a new person and then go over old ground. So here's a tl;dr for those who seem to get the wrong idea:

 

 

1.  Nvidia handed out the wrong specs concerning ROPs and cache to the reviewers. There is no way to prove this was intentional; the debate has always been "believe what you want," but there is ample reasoning to dispute any assumption.

 

2.  No one in this thread is defending Nvidia falsely representing their products; we all hate it and no one has said otherwise.

 

3.  How major the issue with the RAM is is solely in the eye of the consumer. For many who did not look at the specs and purchased off the performance reviews, it is a non-issue. For a few it is a major issue, because they were hoping that, based on the specs, the GPU would perform better than it does now. (Again, it's up to the individual to decide whether this is Nvidia's fault or the fault of the consumer for making assumptions about future software and hardware requirements.)

 

4.  People are getting tired of the constant hate on Nvidia and on anyone who doesn't have an issue with their purchase. It's OK to hate what Nvidia did (I have yet to read a post where someone says it was OK), but it's not OK to extend that opinion onto others as if they are somehow wrong because their perspective is different.


https://www.youtube.com/watch?v=xMA9xKn0DaE

Btw, here is an interesting video I came across today where someone shows the GTX 970 breaking down badly when hitting the 4GB mark.

 

If this is commonplace and happens on all cards at 3.5GB, why didn't the reviewers mention it, and why are there people showing the card going over 3.5GB with much less stuttering?

 

Is it possible this is a rare case or a set-up video? There is no onscreen display to verify the settings and FPS, etc.

 

Not trying to fuel the anti-debate, but I would like to see people be a little more critical of videos and arguments that appear designed to inflame the situation.


If this is commonplace and happens on all cards at 3.5GB, why didn't the reviewers mention it, and why are there people showing the card going over 3.5GB with much less stuttering?

 

Is it possible this is a rare case or a set-up video? There is no onscreen display to verify the settings and FPS, etc.

 

Not trying to fuel the anti-debate, but I would like to see people be a little more critical of videos and arguments that appear designed to inflame the situation.

It's still proof that things run like shit when the GTX 970 goes over 3.5GB of VRAM. I actually had Windows 7 unusable until I brought the VRAM usage under 3.5GB, and I'm using the classic theme.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

