Official Nvidia GTX 970 Discussion Thread

I know that, I wouldn't even argue otherwise, but that doesn't mean that it's somehow the fault of 970 owners that their cards won't run certain games at this resolution very well. Those people have every right to be mad right now; they were misled, that's all I'm saying.

 

Because to drive a 4K display you need two of them. Whatever you go for, from Nvidia or AMD, you need two of them. And according to the benchmarks the FPS difference between any of these GPUs was surprisingly minimal. Most of the remarks about the 970 were along the lines of "hoshit, this thing is... basically no different to a 980, yet costs nearly £200 less". Then bear in mind that you're buying two of them, so this price difference is compounded. For context, each 980 cost about the same as a 4K monitor, for what is still very little return in terms of performance. And those benchmarks don't suddenly stop being true just because you find out something new about the card. It's still an extremely well-performing GPU, and the 290X's price had to be slashed dramatically in a very short amount of time to remain relevant.

 

So that's why not the 980. Why not AMD then? The answer's simple: price. Given that between all of these solutions there's only a few fps difference, price becomes important. And while two 970s cost me £500, an R9 295X2 was still £1200 at the time. Two R9 290Xs were £800.

 

There's also the issue that two open-air coolers sandwiched together in SLI/CF aren't going to be ideally placed thermally. With a 970 this isn't an issue; it runs pretty cool. The 290X, though, is another story. Do you use a reference blower design so that their temperatures don't affect each other, but they throttle? Or do you use two open-air coolers, allow the bottom card to massively heat the upper card, have louder and more obnoxious fans, AND the possibility of it throttling? At any sign of throttling, any potential fps advantage over the 970 is pretty much gone.

 

So tl;dr it was the only solution that made sense, and it still makes sense. It's a solution that consistently gets between 50 and 80 fps in most games that I play. I still recommend that people just buy whichever is cheaper where they are and try to wade through the fanboyism and misinformation going around.

 

There are no SLI benchmarks for the 970 that show it acceptably doing 4K; that is the point I am trying to make. If you want 4K gaming, why buy a card (or pair of cards) that won't do it? The 970 was never in the race for 4K, and the Nvidia marketing balls-up didn't cause that. It doesn't matter what the cards cost; if you've got the money for 4K then great, but if you can't stretch the budget to cards that can support 4K, why buy the monitor and then complain the cheaper cards are useless?

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


There are no SLI benchmarks for the 970 that show it acceptably doing 4K; that is the point I am trying to make. If you want 4K gaming, why buy a card (or pair of cards) that won't do it? The 970 was never in the race for 4K, and the Nvidia marketing balls-up didn't cause that. It doesn't matter what the cards cost; if you've got the money for 4K then great, but if you can't stretch the budget to cards that can support 4K, why buy the monitor and then complain the cheaper cards are useless?

 

Yes there are? It handles it just fine. Most benchmarks look worse than they really are because they pile on the MSAA. If you turn that off it's quite uncommon to not get 60fps at ultra preset.

 

The thing is most of the complaining is being done by people who don't have 970s, and don't game at 4K.


Yes there are? It handles it just fine. Most benchmarks look worse than they really are because they pile on the MSAA. If you turn that off it's quite uncommon to not get 60fps at ultra preset.

 

The thing is most of the complaining is being done by people who don't have 970s, and don't game at 4K.

 

I haven't seen any; most games on good settings are all sub-40 fps at 4K.

 

http://www.guru3d.com/articles_pages/geforce_gtx_970_sli_review,16.html

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/17.html



Yes there are? It handles it just fine. Most benchmarks look worse than they really are because they pile on the MSAA. If you turn that off it's quite uncommon to not get 60fps at ultra preset.

 

The thing is most of the complaining is being done by people who don't have 970s, and don't game at 4K.

I don't get why any form of AA is used at 4K, because it really just isn't needed.

Edit: BTW, for those who have an x16 slot available and the power to spare, if you're going to run BF4 at 4K, use a dedicated PhysX card. At 1080p, with a GTX 650 Ti OC 2GB handling PhysX for my GTX 970, the 970's previous maximum FPS became its average once the 650 was dedicated to PhysX. If more games utilized PhysX, people would be able to make good use of their older GPUs.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


(the whole "you could afford a 980" argument is still pretty disgusting though)

If you can't afford to game on 4K, don't game on 4K

if you have to insist you think for yourself, i'm not going to believe you.


If you can't afford to game on 4K, don't game on 4K

Game at 2K instead, and if you can't, 1080p is very affordable, even with AA in use. Word of wisdom from one who goes with what looks good within their budget.

(finished your post, I think)



Game at 2K instead, and if you can't, 1080p is very affordable, even with AA in use. Word of wisdom from one who goes with what looks good within their budget.

(finished your post, I think)

Isn't 1080p 2K?

 

I was led to believe the naming was the horizontal measure of the pixels rounded up to the nearest thousand. 1920x1080 is 2K, 3840x2160 is 4K, 2560x1440 is 2.5K, etc. Somebody correct me if I'm wrong.



I think people are making a way bigger deal out of this than they should, but ultimately it is good for the consumer because it prompted AMD to step up and drop prices.  

 

I think it just goes to show how important it is to keep the trust of consumers.  /r/Nvidia is still freaking out and manufacturers are having to react.  Gigabyte and EVGA both have step up programs for example.  

 

It's an absolute PR nightmare because the most popular card has turned out to be slightly less magnificent post-release.  If it had released with everything in the open, it would have still been everyone's 2014 darling.

 

 

4K // R5 3600 // RTX2080Ti


Isn't 1080p 2K?

 

I was led to believe the naming was the horizontal measure of the pixels rounded up to the nearest thousand. 1920x1080 is 2K, 3840x2160 is 4K, 2560x1440 is 2.5K, etc. Somebody correct me if I'm wrong.

 

Sort of. 2K is actually an established standard for film (DCI 2K is 2048x1080) and is different from 1080p.
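The informal naming scheme above can be sanity-checked with quick arithmetic. A minimal sketch (the resolution list is just the common consumer examples; DCI cinema 2K/4K are the separate 2048x1080 and 4096x2160 standards):

```python
# Check the informal "K" shorthand: the horizontal pixel count
# divided by 1000 (consumer usage, not the DCI cinema standards).
resolutions = {
    "1080p (FHD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "2160p (4K UHD)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels, width/1000 = {w / 1000:.2f}")
```

The widths come out to 1.92, 2.56, and 3.84 thousand pixels, which is roughly where the informal "2K", "2.5K", and "4K" labels come from.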



Game at 2K instead, and if you can't, 1080p is very affordable, even with AA in use. Word of wisdom from one who goes with what looks good within their budget.

(finished your post, I think)

 

 

Personally I am really happy at 1080; my next upgrade will be 1440, and I believe in a year's time just about everything on the market should do that (excepting the sub-$100 cards, of course).

 

4K is my goal for the upgrade after that.  It seems kinda over the top to spend the cash to get it now when for half the price a more than acceptable 1440 experience is within budget.



If you can't afford to game on 4K, don't game on 4K

 

Or how about you don't tell people what to do? Or how about you don't assume that everyone who owns a 4K monitor can print money? Or how about the fact that even if you have the money, you could still be price-conscious? That's what disgusts me. Benchmarks showed that the 970 was an excellent card; it struggled a bit at 4K but it was still doing a good job. It was also advertised (and basically everybody took it for granted) that it has 4GB of memory that can be fully utilized. A pair of 970s seemed like a great deal for many people, and they had no idea that they were buying gimped cards that might not be capable of future 4K gaming due to VRAM limitations. IIRC Linus even recommended a pair of 970s for his "high-end" build guide at some point, talking about that build being 4K ready and SLI 970s being insane value. People were practically being misled, and they have every right to be fucking pissed. "Miscommunication" or not, in the end Nvidia fucked up. You can't just shift the blame onto the people who bought 970s; that's not how this works (and it's disgusting behaviour that shows a severe lack of empathy for fellow human beings, and for what? Just to help a multi-billion dollar corporation. I mean, really?).

~7:30

A 4K-capable build, "futureproof" for the most part. As it turns out, that's not the case. If people like Linus, who do this stuff for a living, couldn't see it coming, then you can't expect even educated consumers to realize it and act accordingly. The 970 (and SLI configs) were recommended, even for 4K, by many people.


I don't get why any form of AA is used at 4K, because it really just isn't needed.

Edit: BTW, for those who have an x16 slot available and the power to spare, if you're going to run BF4 at 4K, use a dedicated PhysX card. At 1080p, with a GTX 650 Ti OC 2GB handling PhysX for my GTX 970, the 970's previous maximum FPS became its average once the 650 was dedicated to PhysX. If more games utilized PhysX, people would be able to make good use of their older GPUs.

 

Most of the reviews and benchmarks I've seen don't have AA, or it's set to 2xAA for 4K. My guess is setting it any higher did little for the graphics and simply bogged down the cards.



1440p > 4k for gaming IMO, especially on a lovely ROG swift.

Linus is my fetish.


Or how about you don't tell people what to do? Or how about you don't assume that everyone who owns a 4K monitor can print money? Or how about the fact that even if you have the money, you could still be price-conscious? That's what disgusts me. Benchmarks showed that the 970 was an excellent card; it struggled a bit at 4K but it was still doing a good job. It was also advertised (and basically everybody took it for granted) that it has 4GB of memory that can be fully utilized. A pair of 970s seemed like a great deal for many people, and they had no idea that they were buying gimped cards that might not be capable of future 4K gaming due to VRAM limitations. IIRC Linus even recommended a pair of 970s for his "high-end" build guide at some point, talking about that build being 4K ready and SLI 970s being insane value. People were practically being misled, and they have every right to be fucking pissed. "Miscommunication" or not, in the end Nvidia fucked up. You can't just shift the blame onto the people who bought 970s; that's not how this works (and it's disgusting behaviour that shows a severe lack of empathy for fellow human beings, and for what? Just to help a multi-billion dollar corporation. I mean, really?).

~7:30

 

*snip*

 

A 4K-capable build, "futureproof" for the most part. As it turns out, that's not the case. If people like Linus, who do this stuff for a living, couldn't see it coming, then you can't expect even educated consumers to realize it and act accordingly. The 970 (and SLI configs) were recommended, even for 4K, by many people.

 

Great post, other than the part about Nvidia fucking up. It worked out pretty well for them considering how many 970s they sold and how far they made AMD drop prices on their flagships. It took four long months before they got called on their bullshit and they made boatloads of money out of it.


1440p > 4k for gaming IMO, especially on a lovely ROG swift.

 

I really can't believe that statement.

 

1440p has 3,686,400 pixels versus 8,294,400 pixels at 4K. I don't believe there is no appreciable difference, or that 1440p would be better in any way, beyond not being able to drive 4K.

 

Correct my math if I'm wrong though. I'm stupid as hell.


Great post, other than the part about Nvidia fucking up. It worked out pretty well for them considering how many 970s they sold and how far they made AMD drop prices on their flagships. It took four long months before they got called on their bullshit and they made boatloads of money out of it.

 

That was more about "they made the mistake (factually and on a moral level)" as opposed to "they shot themselves in the foot". I know that they made bank on the 900 series and I know that this'll probably be forgotten sooner or later, but I fucking hate that attitude towards other people just because someone's "favourite" company is involved. But now I'm just repeating myself.

 

People will downplay the whole thing and try to blame the ones who bought a 970, they will discredit the people who are complaining and they will do their best to sweep everything under the rug, there's not a whole lot that I can do about that. I'll just watch it happen, do my best to state my arguments and quietly lose my faith in some people.

 

 

I really can't believe that statement.

 

1440p has 3,686,400 pixels versus 8,294,400 pixels at 4K. I don't believe there is no appreciable difference, or that 1440p would be better in any way, beyond not being able to drive 4K.

 

Correct my math if I'm wrong though. I'm stupid as hell.

 

I can't be bothered to check your math, but it seems about right; 4K has more than double the pixels of 1440p, after all. And yes, if the monitors are of similar size there is a huge difference; I owned a 28'' 4K monitor before switching to 1440p. People who claim that 1440p is somehow "better" than 4K have no clue what they're talking about.
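The pixel counts quoted above do check out; a two-line sketch confirms the ratio:

```python
# Pixel counts for 1440p (QHD) vs 4K UHD, and the ratio between them.
qhd = 2560 * 1440   # 3,686,400 pixels
uhd = 3840 * 2160   # 8,294,400 pixels
print(uhd / qhd)    # 2.25: 4K pushes 2.25x the pixels of 1440p
```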


Guys, I usually play Dota 2 all the time and have Afterburner running to monitor my Zotac GTX 970 dual-fan GPU as well. Now, when I checked my memory usage (MB), it maxes out at 3506 and won't go any further. I really don't think that Dota 2 requires that much VRAM. Is this normal?

CPU: Intel i5-4690K @ Stock | GPU: Zotac GTX 970 @ Stock | COOLER:Corsair H110i GT Stock fans | MOBO: Gigabyte Z97-X Gaming 5 | RAM: 2x4GB GSKILL RIPJAWS X 1866mhz PSU: Silverstone 500W Strider Essential Series | CASE: NZXT S340 | CASE FANS: x1 120mm TT Riing red LED, x1 140mm TT Riing red LED MOUSE: Razer DeathAdder Chroma & Logitech G502 Proteus Spectrum KEYBOARD: Varmilo VA87MR Gateron Red Switches | HEADSET: Logitech G230 | SSD: ADATA 256GB SP900 HDD: 500GB WD Blue & 1TB WD Caviar Black OS: Windows 10 | DISPLAYS: ASUS VX239H and some shitty Samsung 720p


More pixels doesn't always equal better quality. 1440p with 1.5x DSR on a better-quality monitor (144Hz + G-Sync, better colours) can beat a low-specced 4K monitor.

 

Plus, there are a lot of games where the UI and text don't scale well on 4K displays, making the game menus unreadable.
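For reference, DSR factors scale the total pixel count, so each axis scales by the square root of the factor. A rough sketch of what 1.5x DSR on a 1440p panel renders at (the driver rounds to its own supported resolutions, so exact figures may differ):

```python
import math

native_w, native_h = 2560, 1440   # 1440p panel, e.g. a ROG Swift
dsr_factor = 1.5                  # DSR factor multiplies total pixels

axis_scale = math.sqrt(dsr_factor)       # per-axis scale, ~1.22
render_w = round(native_w * axis_scale)  # ~3135
render_h = round(native_h * axis_scale)  # ~1764
print(render_w, render_h, render_w * render_h)
```

So 1.5x DSR on 1440p renders roughly 5.5 million pixels, still well short of 4K's 8.3 million, which is the trade-off being argued above.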



Guys, I usually play Dota 2 all the time and have Afterburner running to monitor my Zotac GTX 970 dual-fan GPU as well. Now, when I checked my memory usage (MB), it maxes out at 3506 and won't go any further. I really don't think that Dota 2 requires that much VRAM. Is this normal?

 

Games will find a way to use all available VRAM for caching if they can; it's more efficient to load the info from VRAM than from system RAM. This is one of the main obstacles to benchmarking VRAM, as it's hard to tell whether a game is actually using and reading from all of the VRAM, or just using it as a cache.



 

I don't get why any form of AA is used at 4K, because it really just isn't needed.
Edit: BTW, for those who have an x16 slot available and the power to spare, if you're going to run BF4 at 4K, use a dedicated PhysX card. At 1080p, with a GTX 650 Ti OC 2GB handling PhysX for my GTX 970, the 970's previous maximum FPS became its average once the 650 was dedicated to PhysX. If more games utilized PhysX, people would be able to make good use of their older GPUs.

 

It's needed because they're benchmarks. Their purpose is to differentiate between five examples of extremely powerful GPUs. They aren't meant to be examples of how a game will run at a certain resolution.

 

 

That's because after everything I've just said, you're still linking to benchmarks that use MSAA in their tests. You just don't need this at 4K. I have never played any game at less than ultra, and never got less than 40 fps.

 

Even the game that did drop to 40 fps was because of a hefty CPU bottleneck.


I don't really understand what this means for me if I keep my 970. Do I just lose 0.5GB of memory? If it's just that... well, I was fine with 3GB on my 7970.

Can someone explain it to me? I'm really confused by all the hate on it. (bla bla false advertising)

Case : Corsair Air 540 CPU : Intel i5 8600k GPU : Asus Nvidia 970 DirectCUII RAM : 16gb DDR4 MB : ASUS Prime z370-p PSU : OCZ Modxtreme 700w SSD : Samsung 840 EVO 250gb 


I don't really understand what this means for me if I keep my 970. Do I just lose 0.5GB of memory? If it's just that... well, I was fine with 3GB on my 7970.

Can someone explain it to me? I'm really confused by all the hate on it. (bla bla false advertising)

Unless you plan on going SLI at higher resolutions, do not worry about it. 

RIP in pepperonis m8s


Unless you plan on going SLI at higher resolutions, do not worry about it.

1080p suits my needs, and I will go SLI when games become too demanding to maintain 60fps with OK graphical settings.



I'm happy with 1080p for now, but I did intend to get the ROG Swift (or an IPS equivalent) at 1440p and pick up a second 970. I won't be doing this any more. I still hope to upgrade to a 1440p monitor, but it won't be until I'm satisfied with my GPU setup.

 

Gigabyte is supposed to be offering a step-up program for 970 users. I'm tempted, but I also hate that I am. If I were a patient and understanding human being, I would wait for the 390X and sell my G1 Gaming 970 to pay for it. Of course, there is the worry that 970 prices will plummet due to this fiasco and I would have been better off upgrading to the 980 and not bothering with a 390X at all.


According to news in the UK, Gigabyte is the only company so far that's refusing refunds on the cards. If they offer a step-up I might just do that.


