How well would 2 of these perform in SLI? I'm "about to" (probably months away) build my first desktop, and will probably buy one of these and then a second once the price comes down substantially. I game at 1080p (for now) in World of Tanks, War Thunder (occasionally), Farming Simulator 15 (I know, plz don't judge me), Euro Truck Simulator 2, American Truck Simulator (as soon as it's released), occasional Battlefield 3, and other assorted ones not worth mentioning. I also plan to use my new PC for YouTube, i.e. recording/rendering/editing/encoding.

  • 2 weeks later...

(a fellow WoT player, yay! :) )

For those games at 1080p, one 970 would be more than sufficient, and for eventual editing stuff/more demanding games/a possible resolution upgrade, a second 970 seems like a perfectly reasonable upgrade :)

 

Or just get one 980 ;)

-·- BitFenix Prodigy M (Arctic White) -·- Asus Maximus VII Gene -·- Intel Core i7-4790K -·- Corsair H100i -·- G.Skill Trident X 2133MHz CL9 32GB (4x8GB) -·- Sparkle Calibre GTX580 -·- Samsung 500GB 850 Evo SSD -·- WD Caviar Green 4TB -·- Cooler Master V700 -·- LG 25UM55 21:9 2560x1080 25" -·- Logitech G600 -·-

  • 3 weeks later...


(What server do you usually play on? I'd love to play at some point! PM me whenever...)

 

I would get a 980, but I have 2 issues with that: 1) it's another $200, and 2) it's kinda overkill for what I'm doing (at least that's what I think).

 

Thanks for the feedback anyway! I'll probably start with 1 and get a 2nd later when they're "cheap".

  • 1 month later...

I hate to necro a thread like this, but something stinks/doesn't make any sense.

 

With the settings given in the doc for this video, I would like to know just how in the FUCK LTT got a "core clock actual" of 1527MHz with a "core clock" offset of +110. I have my Strix currently OC'ed with 120 on the power limit, 7600 on the memory, and 1400 on the clock (before GPU Boost 2.0), and the max it gets is 1475. (Max temp target of 92°C; not sure how they got 95.)

 

Why doesn't this make any sense? Because my offset is +172 above stock. Unless they're claiming +110 over the "Gaming" built-in setting in GPU Tweak (and even then, +110 only takes it to 1363 before GPU Boost 2.0).
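
To spell out the math (a quick Python sketch; the 1253MHz reference clock is just inferred from the 1363 = 1253 + 110 figure above, so treat it as an assumption):

```python
# Back-of-the-envelope offset math for the numbers quoted in this post.
# Reference (pre-offset) clock inferred from 1363 - 110 = 1253 MHz.
REFERENCE_CLOCK = 1253  # MHz (assumption, inferred from the figures above)

def pre_boost_clock(offset_mhz):
    """Clock before GPU Boost 2.0 takes over: reference clock plus offset."""
    return REFERENCE_CLOCK + offset_mhz

# LTT's doc: +110 offset, yet a "core clock actual" of 1527 MHz.
print(pre_boost_clock(110))         # 1363 MHz pre-boost
print(1527 - pre_boost_clock(110))  # 164 MHz of headroom added by GPU Boost

# My card: 1400 MHz pre-boost, maxing out at 1475 MHz.
print(1475 - 1400)                  # only 75 MHz of boost headroom
```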

 

@nicklmg @LinusTech @Slick, care to elaborate? I'm not claiming any falsification, just curious as to how this could happen. My rig probably has better thermal dissipation than your test bench: an Air 540 with 8 fans running at max (not counting the GPU or PSU), a 280mm Corsair AIO cooler, and no OC on my 4690K. I've run all day at 1475 with temps never getting above 60°C.

 

Edit: Upon watching the video again, I also noticed a discrepancy between the stock clock NVIDIA gives for the Strix 970 and the actual stock clock the Strix ships with, although I'm sure this is normal.

 

Edit #2: Also, GPU Tweak does not allow a GPU voltage offset of +50. Even with "Overclocking range enhancement" enabled, the max it currently goes to is a +38 offset.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


Silicon lottery and a modded BIOS, I would wager.

This.

The silicon lottery could explain different clocks after the OC is applied.

Two identical cards compared will not have the same ASIC score or boost frequency.

So GPU Boost 2.0 only giving me 75MHz over my base frequency is part of the silicon lottery? Wouldn't I have worse temps if that were the case?


Yes, how much boost your card gets over the specs is a result of the silicon lottery.

It's not limited by temps. It's limited by how high the card can clock at a certain power draw (wattage).

Temps will just cause it to throttle or not boost at all.
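
Roughly, you can think of it like this (a toy model only; the clock/power bins below are made-up numbers, and NVIDIA's actual GPU Boost 2.0 logic is proprietary):

```python
# Toy model of GPU Boost 2.0 bin selection (illustrative only).
# Each bin is (clock_mhz, estimated_power_w); the card steps up through the
# bins until the power limit or the temperature target stops it.
BINS = [(1253, 145), (1316, 155), (1380, 165), (1443, 178), (1506, 192)]

def boost_clock(power_limit_w, temp_c, temp_target_c=80):
    if temp_c >= temp_target_c:
        return BINS[0][0]               # at/above the temp target: no boost at all
    clock = BINS[0][0]
    for bin_clock, est_power_w in BINS:
        if est_power_w > power_limit_w:
            break                       # the next bin would exceed the power limit
        clock = bin_clock
    return clock

print(boost_clock(power_limit_w=170, temp_c=60))  # 1380 -- power-limited, cool card
print(boost_clock(power_limit_w=200, temp_c=85))  # 1253 -- temp-limited despite headroom
```

Two cards landing in different bins at the same power limit is exactly the silicon lottery: a better chip needs less voltage (and so less power) to hold a given bin.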

So if I take off the 1400MHz limit in GPU Tweak and push it further, what could I expect to see? I pushed it to 1425 already, but didn't notice anything.
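
My plan would be to step it up one bin at a time and stress-test at each step, something like this (set_clock and run_stress_test are just placeholders for the manual GPU Tweak steps, not a real API):

```python
# Sketch of an incremental OC search. set_clock() and run_stress_test() are
# hypothetical placeholders for manual steps -- GPU Tweak has no scripting API.
STEP_MHZ = 13  # roughly one GPU Boost bin

def find_max_stable(start_mhz, ceiling_mhz, set_clock, run_stress_test):
    """Raise the core clock one bin at a time until the stress test fails."""
    stable = start_mhz
    clock = start_mhz
    while clock + STEP_MHZ <= ceiling_mhz:
        clock += STEP_MHZ
        set_clock(clock)                     # apply the new clock (manual in practice)
        if not run_stress_test(minutes=15):  # artifacts or a driver reset = unstable
            break
        stable = clock
    return stable
```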

