D2ultima

The SLI information guide

Recommended Posts

Posted · Original Poster (OP)

Should I SLI two 950s? I already have one, but I'd have to buy a new motherboard. Or should I just buy an R9 380?

I'd suggest the R9 380 4GB wholeheartedly. SLI nothing under a 980Ti if you can help it.


Clevo P870DM3 (Eurocom) | i7-7700K | 32GB DDR4 2400MHz | GTX 1080N SLI | 850 Pro 256GB | 850 EVO 500GB M.2 | Samsung PM961 256GB NVMe | Crucial M4 512GB | Intel 8265ac | 120Hz Matte screen | 780W PSU

 

THE INFORMATION GUIDES: SLI INFORMATION || vRAM INFORMATION || MOBILE i7 CPU INFORMATION || Maybe more someday


You should have mentioned Nvidia Inspector SLI bit tweaking. Really useful for better compatibility.

Dark Souls: Prepare to Die Edition doesn't support SLI, but you can get SLI working with the right bit. Wish AMD had this ability, but it's in rough shape for tweaks right now ever since RadeonPro took a dump.


Mid-range Emulation Gaming and Video Rendering PC

[CPU] i7 4790k 4.7GHz & 1.233v Delidded w/ CLU & vice method [Cooling] Corsair H100i [Mobo] Asus Z97-A [GPU] MSI GTX 1070 SeaHawk X[RAM] G.Skill TridentX 2400 9-11-11-30 CR1 [PSU] Corsair 750M 

Posted · Original Poster (OP)

You should have mentioned Nvidia Inspector SLI bit tweaking. Really useful for better compatibility.

Dark Souls: Prepare to Die Edition doesn't support SLI, but you can get SLI working with the right bit. Wish AMD had this ability, but it's in rough shape for tweaks right now ever since RadeonPro took a dump.

 

 

Now that that's done, let's get into the benefits of SLI. There are some benefits I'll list that most people don't actually know about.

 
 
5 - SLI can actually be forced in some games that don't officially support it... such as Skyrim's RCRN or ENB modded states. If you pick the correct option (which in the case of Skyrim is "Force Alternate Frame Rendering 2"), then you can still enjoy the benefits of SLI even while using a large number of graphical mods and keeping your lovely framerates. Another possible way, or a way to fix some incompatibility issues even with a "supported" driver, may be to use SLI compatibility bits via nVidia Inspector. Here is a (rather short; I know there are much bigger ones elsewhere) list of games and the compatibility bits that improve them; it's a good start, so good luck! http://www.forum-3dcenter.org/vbulletin/showthread.php?t=509912
Please note: some engines (like Unreal Engine 4, id Tech 5 and Unity) do not support SLI at their base level, and it is NOT a "game" issue. For games on such engines, there is no "forcing" SLI.
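For the tinkerers: what Inspector does with those bits can also be scripted against NVIDIA's public driver-settings interface (NVAPI DRS). The sketch below is only a rough illustration, not something from the guide; the Skyrim profile name and the exact SLI setting ID/enum names are assumptions you should verify against NvApiDriverSettings.h in your NVAPI SDK version.

```cpp
// Rough sketch only: set the SLI rendering mode for a game's driver profile
// via NVAPI DRS, the same mechanism nVidia Inspector uses under the hood.
// The profile name and the AFR2 setting/enum names are assumptions; check
// NvApiDriverSettings.h in your SDK for the exact identifiers.
#include <nvapi.h>
#include <NvApiDriverSettings.h>

int main()
{
    if (NvAPI_Initialize() != NVAPI_OK) return 1;

    NvDRSSessionHandle session = 0;
    NvAPI_DRS_CreateSession(&session);
    NvAPI_DRS_LoadSettings(session);          // read the current driver profiles

    // Look up the game's driver profile by name (assumed profile name).
    NvAPI_UnicodeString profileName = {0};
    const wchar_t *name = L"Elder Scrolls 5: Skyrim";
    for (size_t i = 0; name[i] && i < NVAPI_UNICODE_STRING_MAX - 1; ++i)
        profileName[i] = (NvU16)name[i];

    NvDRSProfileHandle profile = 0;
    if (NvAPI_DRS_FindProfileByName(session, profileName, &profile) == NVAPI_OK)
    {
        // Force AFR2, i.e. what "Force Alternate Frame Rendering 2" does.
        NVDRS_SETTING setting   = {0};
        setting.version         = NVDRS_SETTING_VER;
        setting.settingId       = SLI_RENDERING_MODE_ID;          // from NvApiDriverSettings.h
        setting.settingType     = NVDRS_DWORD_TYPE;
        setting.u32CurrentValue = SLI_RENDERING_MODE_FORCE_AFR2;  // enum value, verify in header

        if (NvAPI_DRS_SetSetting(session, profile, &setting) == NVAPI_OK)
            NvAPI_DRS_SaveSettings(session);  // persist the change to the driver
    }

    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    return 0;
}
```

Inspector is still the easier route for one-off tweaks; scripting it only really pays off if you're applying the same bits across a lot of machines.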


 

 

Oh hey, you mentioned it!


Posted · Original Poster (OP)

Oh hey, you mentioned it!

It's been there for a considerable amount of time.



It's been there for a considerable amount of time.

 

Maybe, maybe it wasn't and you just added it. We'll never know.


Posted · Original Poster (OP)

Maybe, maybe it wasn't and you just added it. We'll never know.

There is a "last edited" in the main page. It's prior to your comment today, so it must have been there before.



There is a "last edited" in the main page. It's prior to your comment today, so it must have been there before.

I know that #16 mentioned it in passing, but CPU driver overhead is really becoming a huge issue for SLI, both as graphics card improvements outstrip CPU improvements and as games themselves become more CPU-intensive.

If you'd like I can link quite a few comparison videos, but the gist of it is that even for 970 SLI, going from an i5 to an i7 gives serious performance gains (and quite large reductions in stutter).

For flagship-class GPUs, I cannot recommend anything less than Haswell-E (for new builds anyway; X79 should still be better than Z97, and possibly Z170, for SLI).

(GM200 SLI at 4K high/ultra settings shows huge improvements when comparing same-clocked Haswell-E hex-cores and Haswell quad-core i7s.)


LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Galaxy S9+ - XPS 13 (9343 UHD+) - Samsung Note Tab 7.0 - Lenovo Y580

 

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 

Posted · Original Poster (OP)

I know that #16 mentioned it in passing, but CPU driver overhead is really becoming a huge issue for SLI, both as graphics card improvements outstrip CPU improvements and as games themselves become more CPU-intensive.

If you'd like I can link quite a few comparison videos, but the gist of it is that even for 970 SLI, going from an i5 to an i7 gives serious performance gains (and quite large reductions in stutter).

For flagship-class GPUs, I cannot recommend anything less than Haswell-E (for new builds anyway; X79 should still be better than Z97, and possibly Z170, for SLI).

(GM200 SLI at 4K high/ultra settings shows huge improvements when comparing same-clocked Haswell-E hex-cores and Haswell quad-core i7s.)

I know games are becoming much more CPU intensive. Too many people still feel an i5-4460 can run every game in the universe.

 

I would need to know which games these are. If it's a game that already devours CPU, like GTA V or Dying Light, then it would make a bit more sense that doubling GPU power would bite into the CPU. What you would need to check, however, is midrange SLI vs a heavily OC'd top-end single GPU. If you're checking two 970s in SLI, then check them at stock vs a 980Ti at 1500MHz or so.

 

Current flagship-class GPUs like 980Tis or Fury X cards use quite a lot of data. In this instance, X79 and X99 might have an advantage in sheer PCIe lane count. To make sure that the PCIe lane count is not the reason, one should try disabling two cores (and their threads) on the Haswell-E chip, running it quad-core + HT vs quad-core + HT, and see if the enthusiast platform in itself still gives better performance. I know that 4K resolutions benefit from more available bandwidth, up to a point. While it isn't a vRAM issue so to speak, the cards DO render a lot of pixels. I've also heard that people who've hacked SLI into Unreal Engine 4 games have noticed that TAA is a MASSIVE performance drop in multi-GPU as opposed to single GPU, and that users on PCIe 3.0 slots had a huge advantage... though I didn't ask for verification of this from the user who told me, so I can't present it as an official statement here or anything.
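If you want to check the PCIe side of that on your own rig, a rough NVML sketch like the one below will show what link generation/width each card actually negotiated and roughly how much traffic crosses it while a game runs. NVML ships with the GeForce driver; the assumption that the SLI pair sits at device indices 0 and 1 is mine, so treat this as illustrative only.

```cpp
// Illustrative only: query the negotiated PCIe link and the current PCIe
// traffic for the first two GPUs using NVML (link against nvml.lib / libnvidia-ml).
#include <nvml.h>
#include <cstdio>

int main()
{
    if (nvmlInit() != NVML_SUCCESS) return 1;

    for (unsigned int i = 0; i < 2; ++i)   // assumed: two cards in SLI at indices 0 and 1
    {
        nvmlDevice_t dev;
        if (nvmlDeviceGetHandleByIndex(i, &dev) != NVML_SUCCESS) continue;

        unsigned int gen = 0, width = 0, txKBs = 0, rxKBs = 0;
        nvmlDeviceGetCurrPcieLinkGeneration(dev, &gen);   // e.g. 2 or 3
        nvmlDeviceGetCurrPcieLinkWidth(dev, &width);      // e.g. 8 or 16 lanes
        nvmlDeviceGetPcieThroughput(dev, NVML_PCIE_UTIL_TX_BYTES, &txKBs); // KB/s
        nvmlDeviceGetPcieThroughput(dev, NVML_PCIE_UTIL_RX_BYTES, &rxKBs); // KB/s

        printf("GPU %u: PCIe gen %u x%u, TX %u KB/s, RX %u KB/s\n",
               i, gen, width, txKBs, rxKBs);
    }

    nvmlShutdown();
    return 0;
}
```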



I know games are becoming much more CPU intensive. Too many people still feel an i5-4460 can run every game.

Snips...

I have no problem recommending a 4460 for a budget single-GPU build, but SLI is just too much CPU overhead.

Also, the PCIe lane difference has been tested by lots of people, and x8 vs x16 is within the margin of error, so that isn't a factor.

Here are some sample games at 4K ultra. I don't have a 4790k to test with myself anymore, so I can't help here other than linking some sources.

https://www.youtube.com/watch?v=ZABt8bHgDHo

Copied from an earlier post I made:

SLI 970 comparison; the benchmarks are inconveniently hidden in here, but every one shows significant improvements from the i7. The link starts with GTA V, and the other timestamps are listed below.

https://youtu.be/ew5MIgXfHL0?t=299

At 6:48 you can see W3, 8:25 is BF4, and 10:02 is Crysis 3.

Here is a great example of classic CPU bottlenecking of SLI systems (note the low GPU load and the large differences in load compared to the original video I linked by the same person): near-matching-frequency overclocked i5/i7 with overclocked 980 SLI.

https://www.youtube.com/watch?v=pDmuv8Q45iY

So really, I guess the statement is: even if you claim it's really a CPU limitation and not "indicative of SLI", the fact that I can show significant performance reductions even with 970 SLI when using a 4690k instead of a 4790k should explain why, like I said from the beginning, "I 100% would never ever recommend SLI on an i5."


Posted · Original Poster (OP)

I have no problem recommending a 4460 for a budget single-GPU build, but SLI is just too much CPU overhead.

Also, the PCIe lane difference has been tested by lots of people, and x8 vs x16 is within the margin of error, so that isn't a factor.

Here are some sample games at 4K ultra. I don't have a 4790k to test with myself anymore, so I can't help here other than linking some sources.

-snip-

So really, I guess the statement is: even if you claim it's really a CPU limitation and not "indicative of SLI", the fact that I can show significant performance reductions even with 970 SLI when using a 4690k instead of a 4790k should explain why, like I said from the beginning, "I 100% would never ever recommend SLI on an i5."

Okay. I've got a lot to explain now xD. Firstly, an i5-4460 for a budget build? Yes. For a 980Ti? No. For SLI? No. For anything beyond maybe an R9 380 or GTX 960? No. Squeeze out some more cash, especially if new AAA titles are what you're after. 2015's AAA titles' CPU usage was off the charts, and as you say, we don't have strong enough CPUs to compensate.

 

The PCIe lane testing I was considering was purely for 4K.

 

See, this post is about games that are known to use more than 4 cores/8 threads, and, in the case of Crysis, a specific scenario where CPU utilization is high (the place with the cannon and all the grass in the field). What I'm trying to get at is that even single-GPU setups that aren't at a heavy GPU bottleneck are likely to benefit from the better CPU, and it's not just SLI. Hence why I was suggesting two stock 970s versus an overclocked 980Ti. If possible, test them with 3DMark and see that their GPU scores are rather close together, and then run them through the same scenarios. The i5 should still produce fewer frames on average for the single 980Ti than the i7 in those games tested.

 

On the flip side, look at Tomb Raider, which had basically the same FPS.

 

When you're at a GPU bottleneck, you notice the CPU affecting your frames a lot less. And what people fail to realize is that just because a GPU is at "99%" doesn't mean it's at maximum load. That's an estimation, and the reason why some games at 100% heat up a video card less than others at 100%. It's how the GPU is being used. Also, 4K Ultra is a rather GPU-bottlenecked situation. If it were at 1080p you might notice more of a gap.
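One way to see that for yourself is to log the utilization number next to power draw and the actual core clock; two games both reading "99%" will often show very different power and clock behaviour. Here's a rough sketch using NVML (device index 0 and the one-second sampling interval are just assumptions for the illustration):

```cpp
// Illustrative only: log GPU/memory utilization, power draw, and core clock
// once a second for a minute, using NVML (ships with the driver).
#include <nvml.h>
#include <cstdio>
#include <thread>
#include <chrono>

int main()
{
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) { nvmlShutdown(); return 1; }

    for (int sample = 0; sample < 60; ++sample)      // one minute of once-a-second samples
    {
        nvmlUtilization_t util = {};
        unsigned int powerMw = 0, coreMHz = 0;

        nvmlDeviceGetUtilizationRates(dev, &util);                  // the "99%" number
        nvmlDeviceGetPowerUsage(dev, &powerMw);                     // milliwatts
        nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &coreMHz); // current core clock

        printf("util %3u%%  mem %3u%%  power %.1f W  core %u MHz\n",
               util.gpu, util.memory, powerMw / 1000.0, coreMHz);

        std::this_thread::sleep_for(std::chrono::seconds(1));
    }

    nvmlShutdown();
    return 0;
}
```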

 

But I always tell people to get an i7 for gaming if they want to play new games especially at high FPSes. People tend to be on the entire "omg, i5 is king, nothing more than i5-4460 is ever needed, omg, stfu" etc etc and not listen. Even if your i7 isn't being utilized by the game, the rest of your PC's programs have the hyperthreads to use, and you essentially can get a bit more performance for the game anyway, as it isn't really sharing performance with the game... kind of. Hyperthreading is a bit difficult to explain, but I think you understand what I'm trying to convey. The reason I didn't leave it as a recommendation or "my thoughts" in the guide is because this is more focused on what SLI itself does. It's rare that SLI will say... double your CPU utilization (though Killing Floor 2 is single-thread bound, and SLI has it use an entire second thread for the second GPU, so it does in effect nearly double utilization there).



-snip-

I am well aware of all that you said. I don't have 970 SLI or a non-X99 CPU rig to test with currently.

I listed the first video because, if you take a look at it, even in basically the most GPU-intensive situation on the market today, an X99 CPU shows huge FPS increases over a 4790k at the same very high clocks. In other words: don't use flagship-class GPUs in SLI without the -E 6+ core platform, and you'd better OC it, lol.


Posted · Original Poster (OP)

I am well aware of all that you said. I don't have 970 SLI or a non-X99 CPU rig to test with currently.

I listed the first video because, if you take a look at it, even in basically the most GPU-intensive situation on the market today, an X99 CPU shows huge FPS increases over a 4790k at the same very high clocks. In other words: don't use flagship-class GPUs in SLI without the -E 6+ core platform, and you'd better OC it, lol.

Yeah. The love for mainstream is pretty high though. I don't know why people pick up mainstream equipment and then shove two 980Tis or something at it.



Is it possible to run 4-way SLI with four dual-GPU graphics cards?

No. The maximum under any circumstances is 4 GPUs. You can use more for F@H or other compute work if you so desire, but SLI will NOT work.


Posted · Original Poster (OP)

Is it possible to run 4-way SLI with four dual-GPU graphics cards?

What @Curufinwe_wins said



I'm currently on a GTX 970, and getting a 980Ti has proven difficult since I can barely sell the 970 for a decent amount. Those with SLI experience: do the problems outweigh the benefits for you? Is it worth the hassle? I'm currently on a 1080p, 144Hz monitor, and I play a bunch of games and stream as well.

Posted · Original Poster (OP)
3 hours ago, Aulowry said:

I'm currently on a GTX 970, and getting a 980Ti has proven difficult since I can barely sell the 970 for a decent amount. Those with SLI experience: do the problems outweigh the benefits for you? Is it worth the hassle? I'm currently on a 1080p, 144Hz monitor, and I play a bunch of games and stream as well.

I would not suggest SLI for you if you're a streamer and have a 970. If you play a lot of new games, you're also going to find you're not using your second card. You might even want to save up and jump to Pascal or Polaris instead; that might be a better idea than jumping at a 980Ti.


Posted · Original Poster (OP)

So, apparently the SLI guide is completely broken, and using the edit button doesn't work (I just see a blank page). I'm waiting for a while to see if it gets fixed; if not, I might have to simply copy/paste the entire page and re-do it.

 

If a staff member could help (maybe restore it from a cache so it's not broken) that'd be amazing. @Godlygamer23 maybe?


Posted · Original Poster (OP)

Okay, the guide has been re-done and is now fixed. Nested spoilers do not work, and they broke the guide multiple times while I was trying to fix it.


On 26/11/2015 at 9:51 PM, D2ultima said:

I'd suggest the R9 380 4GB wholeheartedly. SLI nothing under a 980Ti if you can help it.

As of right now, that is good advice. In general, though, it hasn't been. Historically, the second-from-top tier, which was based on the same GPU as the absolute top, offered the best cost/performance ratio, and the most sensible high-performance purchase was two of those. So 290 Crossfire made much more sense than the much more expensive but only marginally better 290X Crossfire. Same with 780s in SLI vs 780 Tis in SLI. And before the 980 Ti/Titan X were released, 970 SLI was barely worse than 980s in SLI.

 

The difference right now is that the 390X and 980 never saw enough of a price drop to ever make any kind of sense. It's kind of unfortunate that the 390X is still £70 more expensive than the price the 290X (the exact same GPU) reached in response to the GTX 970's aggressive price point. Of course, you could still make the case that using two 980 Tis or two Furies makes more sense than two Titan Xs or Fury Xs (or even one of either), so I guess the landscape hasn't changed all that much.

 

On 19/01/2016 at 9:11 PM, D2ultima said:

-snip-

 

I agree with getting an i7 for gaming. There's a difference between what is needed and what is optimal. Do people ever suggest putting 4GB of RAM with that i5? It's very rare to find a game that uses more than 4GB, let alone requires it. Screw the fact that you need your OS and other background applications running concurrently. And god help you if your AV decides to do a system scan while you are gaming if you figured an i3 would be enough.

 

For me it's not so much about what you need as what your budget can allow for. Do I need a 5960X and two 980 Tis? Probably not. But if my budget allows for it, then why the hell not?


Advantage: Multi GPU looks baller. 

 

Disadvantage: Cabling can be a PITA.

 

xD


AMD Ryzen 9 3950X | BeQuiet! Dark Rock Pro 4 | Crosshair VIII Impact | Trident Z 3200MHz 2x4GB | GTX 1080 HOF

Samsung Galaxy S7 Edge Black 32GB | Exynos 8890 Octa | SanDisk Ultra 200GB SDXC

Posted · Original Poster (OP)
23 minutes ago, othertomperson said:

-snip-

I keep updating the guide as things go along. I initially had a listing where I said that if your next upgrade is only one tier higher, you should get a second GPU instead of an upgrade, and only jump two or more cards (i.e. 770 to 780Ti, 7970 to R9 290X, etc.) if you wanted to upgrade without multi-GPU... but this is no longer the case. Put simply, the devs, both nVidia and game devs, do not care about SLI at the present time.

Maxwell was DESIGNED to be anti-SLI, no matter what anybody can tell me. They removed AA features like CSAA from the cards (including the high CSAA modes that SLI could use) and added MFAA... but MFAA doesn't work in SLI. Then they made the voltage bullshit that is unstable in its own right, but even worse in SLI, and then had the cheek to add notes to their drivers' release notes stating that trying to match the voltages in SLI does not benefit the consumer at all... which is a lie, since custom vBIOSes for Maxwell cards that hold voltage constant are RIDICULOUSLY beneficial, both in stability and in overclockability. Then there's forcing GPUs to maximum clocks, which doesn't always work in a multi-GPU setup with Maxwell and Maxwell alone? Recipe for disaster. On top of that, DSR came to Maxwell first, but DSR + SLI + G-Sync doesn't work, which means not only can you not force higher AA in the drivers and you lack MFAA for performance, you can't even use DSR if you want to take advantage of G-Sync.

 

And then you have their drivers. Their drivers since GTA V's Game Ready driver have been awful. The TDRs have *mostly* stopped, but crashes and visual corruption have plagued the drivers for quite some time, and the last three drivers from them have been BSOD-happy depending on what you're doing. It's garbage, and they honestly don't care anymore. If they did, they'd take their time and fix things instead of pumping out broken drivers left and right.

 

And on top of that, the popularity of the Unity engine and Unreal Engine 4 means many games are coming out that won't touch a second card except with quite a bit of hacking. Ark: Survival Evolved and Survival of the Fittest can't use it without hacking in support (and even then it doesn't work perfectly); Street Fighter 5 is unoptimized for its visuals (and for the fact that it's on UE4), but you can't use multi-GPU if you've got an older setup like mine; Unreal Tournament 4 is the same way, etc. When your card maker isn't trying, the functionality is mostly an afterthought ("yeah, people still use that"), and many games refuse to use it, or use it awfully if they do (Assassin's Creed Syndicate at launch sometimes even had negative scaling in SLI)? Then why would I ever suggest getting two midrange cards over one powerful one? Even if the two midrange cards have more raw grunt in a good scenario, most of the time the single stronger card will be the better experience. And I haven't even touched on games like Killing Floor 2, with its ridiculous stutter in windowed modes as long as SLI is turned on, or the stutter it got when I turned on PhysX while SLI was running. It's made me more than once wish I could trade my two 780Ms for a single 980M and be done with it (which I could do if I had the money); I'm not afraid to admit that.

 

The 390X in USD is just over $400, which honestly probably isn't worth it over the 390 at about $100 less, give or take, but it's definitely somewhat closer to the 980 in performance for about $130 less (the 980 in USD is still $550 without being on sale). The R9 Fury is the SAME PRICE as the 980, so there's no reason to ever buy a 980 at its proper price... in the US. I have yet to figure out why AMD is so much more expensive in non-US countries (it's not that both teams aren't more expensive; it's that nVidia's cards are at least somewhat comparable... the price in pounds for a 980 would translate to about $600 USD, but the price in pounds for an R9 Fury is closer to a 980Ti, which is ridiculous). In the UK I can completely understand your statements about the price, unfortunately. I really wish it weren't so.

 

You understand where I come from with the i7. Most games won't max it, but y'know... you eventually will find one that does, and you'll probably be glad for it. And for the rest, well, you just have extra power. That's always a good thing. I do somewhat agree with the "what your budget allows" approach, but I still think a system should be maximized to the person's needs within a budget. If their budget is $10,000, for example, but they want single-GPU gaming, they're not going to overclock at all, and they want the machine fast, there's little reason to go over an i7-4790K and a single 980Ti with some good RAM and some SSDs. On the other hand, if somebody tells me they want a minimum of 144fps in all games and they want to livestream and their budget is $10,000... well, you're getting a 5960X with something like a Kraken X61 or a custom loop, with a 980Ti or two or three, all with custom vBIOSes, and 32GB of DDR4 3000MHz 14-14-14-31 RAM in quad channel, and OC the CPU to a minimum of 4.5GHz. Leh we see ah game dare to be under 144fps then nah. Leh we see.

 

I always get the best possible choice out of a budget, but I never overspend for anyone. No need to get them something they're not going to use.


7 hours ago, D2ultima said:

[lots of text]

One of the things I'm looking forward to with DX12 is the possibility of API-based multi-GPU support. Of course it relies on support from the devs, but not only could you link up your old 680 (for instance) with your shiny new Pascal (or Polaris!) GPU; if you did decide to go with dual GTX 1070s or R9 490s (which will probably make the most sense at launch, before the Big Boys are released), there is a very real possibility (though not a given) that removing the need for specific driver support, and fully utilising your GPUs as independent compute devices, could have this massively out-performing SLI or Crossfire.

 

I'm also looking forward to people benchmarking R9 480 vs GTX 480 for shits and giggles, but that's neither here nor there :P

Posted · Original Poster (OP)
2 hours ago, othertomperson said:

-snip-

Mmm. I've been watching DirectX 12 titles for quite some time. What I've gathered is that while it indeed has great potential, nobody cares to put in the time to make use of it. I've seen a lot of benchmarks where DX12 comes out ahead, but every single DX12-only title that has come through the UWP so far has had absolutely abysmal performance, with a GTX 970 on average basically performing like an Xbox One in Quantum Break specifically. Rise of the Tomb Raider got benefits on PCs with lower-end CPUs, but nVidia/i7 gamers in particular actually saw anywhere from no improvement to a downgrade, and it released without an SLI profile too. I am not sure if it ever did get one. DX12 also has the ability to use multiple GPUs in two ways. Implicit multi-adapter is basically SLI and CrossfireX, letting the driver handle the load distribution. Explicit multi-adapter is where you could use whatever GPUs you want, and the game itself is coded to use more than one card. I may have them switched; if I do, let me know. Either way, the one I've listed as explicit seems to perform better than implicit on the whole. But since it needs the devs to actually care and optimize, it's a crapshoot.
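For reference, the explicit path boils down to the game enumerating the adapters itself and creating an independent D3D12 device on each one, then deciding on its own how to split the frame. A bare-bones sketch of that starting point (purely illustrative, not any real game's code) looks like this:

```cpp
// Illustrative only: enumerate every hardware adapter and create a separate
// D3D12 device on each one. This is the foundation of "explicit multi-adapter";
// how work is divided between the devices is entirely up to the application.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;   // skip WARP/software adapters

        // Each adapter gets its own D3D12 device: the app, not the driver,
        // owns the decision of what each GPU renders (AFR, split-frame, etc.).
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"Created a device on: %s\n", desc.Description);
        }
    }
    return 0;
}
```

Everything after that point (copying resources between devices, deciding which GPU renders what) is on the developer, which is exactly why it's a crapshoot.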

 

Honestly, right now, as far as I can see, Vulkan and DX12 are straight downgrades for all but maybe 5% of the titles they're announced for, and if titles don't have a DX11 counterpart, they simply perform like garbage in general. It all boils down to the same problem we've had the entire time: if people would take the time to optimize for PC (or even build for PC and port to consoles after), we would have games running maybe four or five times better than they are right now. DX12 might make things better, with higher draw call throughput and less CPU usage, but honestly... DX12 with the poor PC support we have now is just a disaster in performance. Not to mention the artificial locking to Windows 10.

 

I would love to see a R9 480 compared to a GTX 480 for shits and giggles too =D.


