Official Nvidia GTX 970 Discussion Thread

-snip-

 

Linus himself said that dual 980s are a more practical solution, as you do not need to deal with the weaker scaling ratio of the cards. Also, you would reduce the number of SLI-related problems, such as micro-stuttering.

Read the community standards; it's like a guide on how to not be a moron.

 

Gerdauf's Law: Each and every human being, without exception, is the direct carbon copy of the types of people that he/she bitterly opposes.

Remember, calling facts opinions does not ever make the facts opinions, no matter what nonsense you pull.


Linus himself said that dual 980s are a more practical solution, as you do not need to deal with the weaker scaling ratio of the cards. Also, you would reduce the number of SLI-related problems, such as micro-stuttering.

 

Still, some people get two 970s and then add another later for good measure.

What's so weird about that?


How do you explain the poor performance after 3.5 GB?

 

It clearly has fewer SMs and less cache than the 980.

 

It is also lower priority.

That 0.5GB doesn't matter; here's my VRAM usage at idle at 3440x1440:

[screenshot: 26B738Ol.jpg — idle VRAM usage]

I've got 3.5GB left to use. Windows should schedule my idle VRAM to that 0.5GB, since that doesn't require high bandwidth, like @Majestic explained. I still have 3.5GB left to play with. At 4K you'll have a higher idle VRAM usage than I do, and having a full 4GB card won't change a damn thing. You still get 3.5GB to play with on a 4GB card, so tell me what the difference is?

If this were the other way around, so that the 3.5GB were on a single 32-bit controller and the 0.5GB on the full 256-bit controller, then we'd have an issue.
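For intuition, here is a rough back-of-the-envelope sketch of why spilling into the slow segment hurts. The 3.5GB/0.5GB split is from Nvidia's own disclosure, but the 196 GB/s and 28 GB/s bandwidth figures are assumed round numbers for illustration, not measurements:

```python
# Hedged sketch: blended bandwidth of a GTX 970 allocation that spills
# from the fast 3.5 GB segment into the slow 0.5 GB one.
# Bandwidth numbers below are assumptions, not measured values.
FAST_GB, SLOW_GB = 3.5, 0.5
FAST_BW, SLOW_BW = 196.0, 28.0  # GB/s (assumed)

def effective_bandwidth(alloc_gb: float) -> float:
    """Average bandwidth if accesses are spread evenly across the allocation."""
    fast = min(alloc_gb, FAST_GB)
    slow = max(alloc_gb - FAST_GB, 0.0)
    # Time to stream the whole allocation once, then size divided by time.
    seconds = fast / FAST_BW + slow / SLOW_BW
    return alloc_gb / seconds

print(effective_bandwidth(3.5))  # stays in the fast segment
print(effective_bandwidth(4.0))  # spills into the slow segment
```

Under these assumed numbers, touching the last 0.5GB roughly halves the average bandwidth of a full 4GB working set, which is consistent with the stutter people report once usage crosses 3.5GB.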


That 0.5GB doesn't matter; here's my VRAM usage at idle at 3440x1440:

[screenshot: 26B738Ol.jpg — idle VRAM usage]

I've got 3.5GB left to use. Windows should schedule my idle VRAM to that 0.5GB, since that doesn't require high bandwidth, like @Majestic explained. I still have 3.5GB left to play with. At 4K you'll have a higher idle VRAM usage than I do, and having a full 4GB card won't change a damn thing. You still get 3.5GB to play with on a 4GB card, so tell me what the difference is?

If this were the other way around, so that the 3.5GB were on a single 32-bit controller and the 0.5GB on the full 256-bit controller, then we'd have an issue.

 

It matters somewhat in high-resolution gaming.

Like 4K.

 

290X Crossfire is more advisable for 4K now that we know what we know.


Linus himself said that dual 980s are a more practical solution, as you do not need to deal with the weaker scaling ratio of the cards. Also, you would reduce the number of SLI-related problems, such as micro-stuttering.

Still, some people get two 970s and then add another later for good measure.

What's so weird about that?

You are full of red herrings and straw-man arguments.

It matters somewhat in high-resolution gaming.

Like 4K.

 

290X Crossfire is more advisable for 4K now that we know what we know.

Answer my fucking question. With an idle usage of 500MB, you do NOT have 4GB to play with. Can you do the math? 4-0.5GB=?


Answer my fucking question. With an idle usage of 500MB, you do NOT have 4GB to play with. Can you do the math? 4-0.5GB=?

Well, you could also run secondary monitors off of integrated graphics. And would idling on the desktop at 4k use .5gb of VRAM (or even close to that)? 

PSU Tier List | CoC

Gaming Build | FreeNAS Server

Spoiler

i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core

Spoiler

FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.


-snip-

You keep rambling on about how it matters in 4K, three-way SLI, or some other nonsense. It's not a damn 4K card, and I don't recall it ever being marketed as such. Three-way SLI of any card is generally a bad idea.


You keep rambling on about how it matters in 4K, three-way SLI, or some other nonsense. It's not a damn 4K card, and I don't recall it ever being marketed as such. Three-way SLI of any card is generally a bad idea.

 

2-way SLI gives great performance.

 

Who is the fucking straw man here?

 

http://www.hardocp.com/article/2014/11/19/nvidia_geforce_gtx_970_sli_4k_nv_surround_review/2#.VMlCOCusVSk

 

Fuck this, I'm out.


Well, you could also run secondary monitors off of integrated graphics. And would idling on the desktop at 4k use .5gb of VRAM (or even close to that)? 

Oh yes, at least. I tested it with 3x4K DSR when the 970 was still alive. It was about 1200-1500MB of VRAM on the desktop.


Well, you could also run secondary monitors off of integrated graphics. And would idling on the desktop at 4k use .5gb of VRAM (or even close to that)? 

I'm using over a gig, but I have a 4K + 1440x900 display with a shit ton of stuff open, so take it with a grain of salt, I guess.

[screenshot: 2fEJ8OW.png — VRAM usage]

RIP in pepperonis m8s


@zappian 

You meet an asshole in the morning, you just met an asshole. You meet assholes all day, you're the asshole. That's some critical advice for your hopefully long and prosperous future. If you are going to break down and leave us, go ahead. No one will stop you, no one will yell at you to come back. We aren't perfect, we bicker too, but we don't go scorched-earth like you have decided to.

We could have proper debates about this if you had bothered to understand our main points:

  1. We don't like that Nvidia lied to us about this. That was wrong and has no excuse on any level. Marketing and engineering screwed this one up very nicely.
  2. The card is the same card as two months ago. The same benchmarks still apply. Nothing has changed. It's the same card that put up the same results before. People bought off those benchmarks, and while they are going to be upset about point #1, they can still live with the card.

If you can't even understand that, then genuinely leave. We've got enough egos and insanity here to go for a lifetime; we don't need someone else who takes it so seriously and can't even admit how biased and unprofessional he is being.


nVidia did all the design and marketing that caused this in the first place. Board partners usually throw in some better voltage control (or worse, if you go cheap), attach their own cooler and a fancy sticker, and that's about it. Ultimately it doesn't matter to the consumer, though, and nVidia has already assumed some responsibility by offering to help consumers get refunds. I could understand if one or two board partners had changed the design to a cheaper one and caused this issue (like some dick moves done before, where new batches of cards get their power delivery and other features gimped after the first batch and reviews are out); then I would tell people to be mad at those specific board partners.

 

 

Firstly, the difference between a 980 and 970 prior to this information was that they were equal in everything but texture fillrate and core count. Nothing that's really a bottleneck for SLI configurations. The 970 was supposed to be "90% of a 980" and was marketed as such.

 

Secondly, you don't have to have a 4K monitor to push a game to 4K. You've got supersampling which nVidia now natively supports with DSR. Some games do actually run fine with 4K since they aren't very demanding to begin with.

 

Thirdly, the game might not be computationally taxing compared to the VRAM usage. Example: Skyrim with mods.

 

Lastly, not everyone demands 60 or 120 fps from every one of their games. Not everyone is wiggling their e-peen in front of console users, either. 30 fps is playable for some games. The thing is that certain developers and other parties have stated that 30 fps might actually be superior to 60 fps because it is more cinematic, or some bullcrap like that, which isn't true. Games always feel better to play at higher framerates; however, a racing game might need that 60 fps a lot more than your turn-based strategy game does. Meanwhile, that turn-based strategy game with pre-rendered animations looks a hella lot prettier at 4K, and you might get a better combat view at higher resolutions and be willing to play at lower framerates to achieve that.

 

I have a 120Hz display (actually 144Hz), but some games I'd rather play in crisp 1440p @ 60Hz than at 1080p with higher refresh rates, and vice versa.

 

But ultimately it is false advertising. They've chosen to take the direction where the fewest people possible will demand to swap their cards. The 970 is still the best single card for 1080p in its price range; I don't doubt that at all. But suddenly losing practically 1/8th of the VRAM, since using it ends up causing godlike stutter, does kind of suck.

 

And just in general, no, you cannot expect every developer to specifically code around this. If that were actually a great way of doing things, people would use it as a cost-cutting measure far more regularly. A game performing worse because the developer didn't bother to implement a new memory profile for one very specific card doesn't really shift the blame onto the developer.

 

In general, the badly-coded-games argument doesn't hold much water. Even if a card is great but the games don't run well on it, blaming the game's code won't make it run any better for the user. Games with bad performance might still be awesome otherwise, and rather than moving the mountain (yelling at the developer to fix a game they might not even be able to fix because of publisher contracts), users want to go around it (buying the best card for the situation that fits their budget).

 

 

To be honest, I switched off after "90% of a 980" and something about a monitor; I realised you were talking crap and switched off... sorry. I will say this: 30fps is shit. The only games I can really think of where 60fps isn't a big thing are Hearthstone and games like Diablo, and even for those I'm guessing I'd notice.

"if nothing is impossible, try slamming a revolving door....." - unknown

my new rig bob https://uk.pcpartpicker.com/b/sGRG3C#cx710255

Kumaresh - "Judging whether something is alive by it's capability to live is one of the most idiotic arguments I've ever seen." - jan 2017


You probably haven't watched the PCPer video, then.

 

Or read the article to the end.

 

It speaks for itself.

Let me repeat myself: you have absolutely zero footing in this argument.

 

I reread the article and it doesn't support you at all. I'm convinced either you haven't read it, or you lack the ability to comprehend higher levels of reading. Pick your poison.

 

The question “should the GTX 970 be called a 3.5GB card?” is more of a philosophical debate. There is 4GB of physical memory on the card and you can definitely access all 4GB of it when the game and operating system determine it is necessary. But 1/8th of that memory can only be accessed in a slower manner than the other 7/8th, even if that 1/8th is 4x faster than system memory over PCI Express. NVIDIA claims that the architecture is working exactly as intended and that with competent OS heuristics the performance difference should be negligible in real-world gaming scenarios.

 NVIDIA has come clean; all that remains is the response from consumers to take hold. For those of you that read this and remain affronted by NVIDIA calling the GeForce GTX 970 a 4GB card without equivocation: I get it. But I also respectfully disagree. Should NVIDIA have been more upfront about the changes this GPU brought compared to the GTX 980? Absolutely and emphatically. But does this change the stance or position of the GTX 970 in the world of discrete PC graphics? I don’t think it does.

 

The article supports my argument miles more than it supports yours.

if you have to insist you think for yourself, i'm not going to believe you.


To be honest, I switched off after "90% of a 980" and something about a monitor; I realised you were talking crap and switched off... sorry. I will say this: 30fps is shit. The only games I can really think of where 60fps isn't a big thing are Hearthstone and games like Diablo, and even for those I'm guessing I'd notice.

 

The only differences advertised were 81.25% of the CUDA cores and texture units, neither of which is a great hindrance in SLI use. Okay, let's call it 80%. The point remains the same.
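That 81.25% figure comes straight from the published core counts: the GTX 980's spec sheet lists 2048 CUDA cores, the 970's lists 1664.

```python
# Advertised CUDA core counts from the public GTX 980 / GTX 970 spec sheets.
gtx_980_cores = 2048
gtx_970_cores = 1664

ratio = gtx_970_cores / gtx_980_cores
print(f"{ratio:.2%}")  # 81.25%
```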

 

Let's put it this way: Civ 5 @ 30 FPS @ 4K on a big monitor vs 120 FPS @ 1080p, or whatever. Would you think that the tradeoff is unimaginable? 60 FPS is objectively better than 30 FPS, but some people are ready to trade framerate for a higher resolution.

 

But yeah, it's not the end of the world. Regardless, customers have every right to complain, and possibly return the product. It was falsely advertised; that's what matters. Benchmarks didn't show the 3.5GB VRAM tank, and no reviewer noticed it until they started looking at the card much more closely. No reviewer made the assumption because they had no reason to, yet users found out about it later down the road, once they had much more time with the cards.


Answer my fucking question. With an idle usage of 500MB, you do NOT have 4GB to play with. Can you do the math? 4-0.5GB=?

No one has ever had a true 4GB to work with. Some of that memory has always been reserved for the driver and system info. Jesus Christ...

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No one has ever had a true 4GB to work with. Some of that memory has always been reserved for the driver and system info. Jesus Christ...

Which points to the fact that 3.5GB vs. 4GB of fast VRAM doesn't make a huge difference, since part of that slower .5GB can be used for all the random bullshit that needs to be done where speed doesn't matter.


Holy shit, someone call the guys at CERN; I think we have just discovered the densest matter on the planet! Last time I looked, something that dense usually only exists for a few hundredths of a second, but this one has lasted for almost 40 pages.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


No one has ever had a true 4GB to work with. Some of that memory has always been reserved for the driver and system info. Jesus Christ...

Think I might start a sensationalized tech-news post about how Seagate didn't actually give me 1TB of storage on my mechanical drive.


Oh man, I only have 223GB available on my 240GB drive, should I start a post too?

You should. We should stick it to the man for not delivering what they promised!


Oh man, I only have 223GB available on my 240GB drive, should I start a post too?

My 512GB drive only has 499 free.

#SpaceGate

Technically, GPUs have 4096MB of memory. They advertise 4GB.

HERESY!
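For what it's worth, the drive comparison isn't quite apples to apples: drive makers advertise decimal gigabytes (10^9 bytes), while the OS reports binary gibibytes (2^30 bytes) and confusingly labels them "GB". That's where the "missing" space on a 240GB or 512GB drive goes, whereas GPU memory really is binary (4096MB = 4GB). A quick sketch:

```python
# Drives are sold in decimal GB (10**9 bytes); the OS reports capacity
# in binary GiB (2**30 bytes) but labels it "GB", hence the apparent loss.
def advertised_gb_as_reported(gb: float) -> float:
    return gb * 10**9 / 2**30

print(round(advertised_gb_as_reported(240), 1))  # 223.5 -> shows as "223GB"
print(round(advertised_gb_as_reported(512), 1))  # 476.8
```

So the "223GB available on my 240GB drive" post above is exactly what the unit mismatch predicts; no storage was withheld.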

