The SLI information guide

D2ultima
13 hours ago, Elmarrr said:

Hi D2ultima,

 

thank you for the reply. Just to be specific, I know I can use another card to run secondary monitors, but can I game on a monitor if it's connected to a different card than the one I want to render on? I'm considering upgrading to a triple-monitor setup, and I would like to use my current XL2411T as one of the screens and only have to buy two new ones. (The XL2411P appears to be basically the same monitor, except it has a DisplayPort connector instead of DVI-D.)

I figured that, if the 970 was connected via SLI, even if it's just as a PhysX card, it might be able to display whatever the main card outputs? I don't actually care about PhysX; the games I play (mostly just iRacing) don't even support it. Do you know if the DVI port would work in this specific use case?

 

Thank you for your help!

You cannot connect them via SLI

 

I don't really understand what you're asking; you didn't word it clearly. Just try it and see. However, it would probably be better to just use a new card on its own.

1 hour ago, HumdrumPenguin said:

Time to update the whole guide. Just type "It's dead, now move on".

It's Maxwell all over again.

I have finally moved to a desktop. Also my guides are outdated as hell.

 

THE INFORMATION GUIDES: SLI INFORMATION || vRAM INFORMATION || MOBILE i7 CPU INFORMATION || Maybe more someday


9 hours ago, D2ultima said:

You cannot connect them via SLI

 

I don't really understand what you're asking; you didn't word it clearly. Just try it and see. However, it would probably be better to just use a new card on its own.

Sorry for the confusion; I incorrectly assumed that installing a dedicated PhysX card involved SLI. My question boils down to this:

 

If I do have a dedicated PhysX card for whatever reason, can it display the same image as the primary card?

 

I'll leave the thread be now. Thanks again and have a good day.


Hello all,

 

It's me again. With the upcoming RTX 3000 series and the 3080 looking like it could be around 50% better than the 2080 Ti, my new question is this:

 

2080 Tis are selling for about the same price as a 3080. Since I was dumb and bought a 2080 Ti earlier this year, I am now stuck with a card that just dropped in value. Do you think it would be worth buying a second 2080 Ti and SLI-linking the two versus just one 3080? Or selling my current card, paying the piper, and getting a 3080?

 

I know that the 3080 is not out yet, so it's not clear where its true value will lie. However, there are not that many games that run with ray tracing even now, so I am curious whether SLI-linking would be better than a newer RTX card. (Also, if I need to sell my 2080 Ti, I think I need to get it on the market before its value hits zero.)

 

As always thanks and I look forward to being educated!

T_D


On 9/7/2020 at 2:21 PM, The-Director said:

Do you think it would be worth buying a second 2080 Ti and SLI-linking the two versus just one 3080? Or selling my current card, paying the piper, and getting a 3080?

The answer is "it depends". If your typical gaming involves lots of games that do support SLI then it might be worthwhile (I'm still contemplating doing something similar with my existing 2080). If you're currently happy with your GPU performance, then I'd probably leave it entirely until the initial surge dies down. Otherwise I'd pony up.

[ P R O J E C T _ M E L L I F E R A ]

[ 5900X @4.7GHz PBO2 | X570S Aorus Pro | 32GB GSkill Trident Z 3600MHz CL16 | EK-Quantum Reflection ]
[ ASUS RTX4080 TUF OC @3000MHz | O11D-XL | HardwareLabs GTS and GTX 360mm | XSPC D5 SATA ]

[ TechN / Phanteks G40 Blocks | Corsair AX750 | ROG Swift PG279Q | Q-Acoustics 2010i | Sabaj A4 ]

 

P R O J E C T | S A N D W A S P

6900K | RTX2080 | 32GB DDR4-3000 | Custom Loop 


On 9/7/2020 at 9:21 AM, The-Director said:

Hello all,

 

It's me again. With the upcoming RTX 3000 series and the 3080 looking like it could be around 50% better than the 2080 Ti, my new question is this:

 

2080 Tis are selling for about the same price as a 3080. Since I was dumb and bought a 2080 Ti earlier this year, I am now stuck with a card that just dropped in value. Do you think it would be worth buying a second 2080 Ti and SLI-linking the two versus just one 3080? Or selling my current card, paying the piper, and getting a 3080?

 

I know that the 3080 is not out yet, so it's not clear where its true value will lie. However, there are not that many games that run with ray tracing even now, so I am curious whether SLI-linking would be better than a newer RTX card. (Also, if I need to sell my 2080 Ti, I think I need to get it on the market before its value hits zero.)

 

As always thanks and I look forward to being educated!

T_D

If you are a tinkerer, and are willing to manually fiddle with your stuff, two 2080Tis will definitely provide a notable benefit in a lot of games, more than a single 3080 can provide. However, that means buying a new 2080Ti, which I just do not recommend unless it's around $400 or less (USD, anyway). If the 2080Ti truly is not enough for you, sell it yourself and grab a 3080 when the Ampere refresh comes out in a year (3080Ti or 3080 Super or whatever it will be).

On 9/9/2020 at 8:45 AM, HM-2 said:

The answer is "it depends". If your typical gaming involves lots of games that do support SLI then it might be worthwhile (I'm still contemplating doing something similar with my existing 2080). If you're currently happy with your GPU performance, then I'd probably leave it entirely until the initial surge dies down. Otherwise I'd pony up.

In YOUR case, you most certainly should just grab a 3080. The 3080 is being compared to the 2080 because of price, but Ampere's rasterization performance doesn't appear to be any special generational jump (take that with a grain of salt until GamersNexus gets some cards). Silicon-wise, the 3080 is the 2080Ti's upgrade, and seems to be ~30% faster, which is what the 2080Ti was over the 1080Ti, the 1080Ti over the 980Ti, etc. Turing had an extra benefit (which Ampere keeps): numerous backend improvements meant some games scaled exceptionally well with it, allowing a 2070 to match a 1080Ti instead of a 2080 trading blows with it, and making the 1080Ti-to-2080Ti jump look very massive... but that should not repeat going from Turing to Ampere. Better yet, the high end is back down to $700, which means the price-to-performance gap is quite large compared to last gen.

 

I definitely would not buy a second 2080. If a second-hand 2080 is $300, then selling yours and putting that money together with the $300 you'd have spent on another 2080 gets you almost to a 3080, and the gains from a previous-gen midrange card to a new-gen high-end card are large. Think 980 to 1080Ti, or 1080 to 2080Ti.



  • 2 weeks later...
4 hours ago, dilpickle said:

https://nvidia.custhelp.com/app/answers/detail/a_id/5082

 

So SLI is now OFFICIALLY officially dead. The 3090 will only support the Multi-GPU function of DX12 and Vulkan. One question though: Does this mean the NVLink bridge is no longer required?

No, it's required.

 

And yes, SLI is dead. On pre-Ampere you can at least make your own profiles with custom bits and force it on, but now, if the devs don't want to spend extra time adding SLI support to their games, it simply doesn't work, which means... lmao



Multi-GPU is now exclusive to $3000 video cards, so I can't imagine any developer is going to bother with it. There are those few existing games, and we might see it in sequels or games that use the same engine, but that's about it. And benchmarks like 3DMark.


On 9/19/2020 at 1:06 PM, dilpickle said:

https://nvidia.custhelp.com/app/answers/detail/a_id/5082

 

So SLI is now OFFICIALLY officially dead. The 3090 will only support the Multi-GPU function of DX12 and Vulkan. One question though: Does this mean the NVLink bridge is no longer required?

Thanks for sharing! The article is very clear that SLI is NOT dead, just NVIDIA's stranglehold on maintaining the driver profiles ... they are very clear to state that multi-GPU continues to be valuable to game studios and non-game uses ... and is integral to the DX12 and Vulkan foundations; the developers just have to leverage it, and can do so without NVIDIA ... which means there's enough interest from studios and content software developers that it was implemented agnostically at a deep level ... how that relates to the NVLink connector, I remain very interested.


VR Snob ... looking for ultimate full-power full-portable no-compromise VR Box ... Streacom's DA2 starting to look good ...


7 hours ago, JinnGonQui said:

Thanks for sharing! The article is very clear that SLI is NOT dead, just NVIDIA's stranglehold on maintaining the driver profiles ... they are very clear to state that multi-GPU continues to be valuable to game studios and non-game uses ... and is integral to the DX12 and Vulkan foundations; the developers just have to leverage it, and can do so without NVIDIA ... which means there's enough interest from studios and content software developers that it was implemented agnostically at a deep level ... how that relates to the NVLink connector, I remain very interested.

The advantage of SLI was that you could hack most games to work with it. But now you can't do anything unless the developer explicitly adds support.

 

Most PC games are ports from consoles. And even when they aren't, there is little chance any developer is going to spend time and money on a feature that only benefits those with $3000 video cards.


On 9/20/2020 at 12:19 AM, dilpickle said:

Multi-GPU is now exclusive to $3000 video cards, so I can't imagine any developer is going to bother with it. There are those few existing games, and we might see it in sequels or games that use the same engine, but that's about it. And benchmarks like 3DMark.

Even if it wasn't, forcing devs to implement it natively (especially if that means DX12 or Vulkan-based only; it's unclear whether it does or does not) is simply... extra work, for no reason whatsoever. It's different with Nvidia TWIMTBP titles, where Nvidia helps the devs work with it and try to get things working, but as seen with AC Unity and most Ubishit titles in general as well as the basic quality of AAA games on PC, even TWIMTBP titles don't even like SLI very much or are simply far too unoptimized (well NVLink solved that, but then games would benefit from hacked profiles and whatnot still, so shakes head)

On 9/20/2020 at 7:04 PM, JinnGonQui said:

Thanks for sharing! The article is very clear that SLI is NOT dead, just NVIDIA's stranglehold on maintaining the driver profiles ... they are very clear to state that multi-GPU continues to be valuable to game studios and non-game uses ... and is integral to the DX12 and Vulkan foundations; the developers just have to leverage it, and can do so without NVIDIA ... which means there's enough interest from studios and content software developers that it was implemented agnostically at a deep level ... how that relates to the NVLink connector, I remain very interested.

SLI is 1000% dead. I guarantee that. Nvidia killed it because it doesn't want to do any more work on it since they are the ones who made all the game profiles, but banning driver profiles is a complete nail in the coffin.

 

Previously if you had PCI/e 3.0 x16/x16 and a HB bridge on Pascal/Maxwell you could get improved or positive and/or bug-free scaling in games that otherwise didn't support it, even UE4 titles, like Dragonquest XI (scaling but buggy and flashing on 3.0 x8/x8 with HB bridge), Unreal Tournament 4 (almost no scaling on x8/x8), and Witcher 3 with TAA on (low scaling on x8/x8), usually with custom SLI profiles (UE4 titles need these). Even games that had fairly solid SLI performance could improve from modified bits or other profiles, like how Black Ops 3 benefitted from the Battleforge SLI profile over its own Nvidia-provided one.

 

Since bandwidth became such an issue (which NVLink did fix) I had been telling people single strongest card and then SLI if you want it, unless they wanted to spend time on guru3D forums with the master list of modded SLI profiles and/or tinkering with bits to get optimum performance, and there were lots of gains to be had with NVLink...

 

But now they're saying they aren't allowing us to modify driver bits with Nvidia Profile Inspector for SLI, or force it on games that don't otherwise support it, even if they could. That means Dragonquest XI, UT4, etc., which support and scale well with SLI if you have enough inter-card bandwidth (read: NVLink), and likely a whole host of other Unreal Engine 4 games and others I can't remember off the top of my head, are now forever locked to ONE card.

 

SLI is dead, there is no saving it, Nvidia killed it and admitted to killing it, and I highly doubt there is a way they're going to bring it back. Even if they insist they will "work with developers" to "get a lot of games supported" it doesn't mean developers will want to put in that extra work for free, even if you could SLI all the way down to whatever a 3050Ti will be, which is why pre-existing SLI support for a vast majority of games just doesn't really exist these days. If you think that there's some hope for it, feel free to buy a couple 3090 cards and run MSI AB + RTSS in every game and tell me how often your games actually properly load both cards and get back to us. 
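If you do want to sanity-check that yourself without MSI AB + RTSS, polling per-GPU load from the command line while the game runs works too. A minimal sketch (my own addition, assuming the stock nvidia-smi tool that ships with the driver is on your PATH):

```python
# Poll per-GPU utilization once a second so you can see whether the second
# card is actually doing anything while a game is running.
import subprocess
import time

def gpu_loads():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    # Each output line looks like "0, 97" -> (GPU index, % utilization)
    return [tuple(int(v) for v in line.split(", ")) for line in out.strip().splitlines()]

if __name__ == "__main__":
    for _ in range(30):          # sample for roughly 30 seconds
        print(gpu_loads())
        time.sleep(1)
```

If the second GPU sits near zero the whole time, the game is effectively single-card no matter what the control panel says.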



  • 1 month later...

Sad. I needed two 3090s anyway, and will get to play with them in SLI here or there, but it's sad to hear it may not be useful going forward. Who knows, though; Nvidia has done this a couple of times before and then reinvented SLI under a new name. If AMD keeps up CrossFire, then Nvidia will likely have to reinvent SLI again in another form to look competitive. I do remember the last time this happened, and NVLink was introduced something like 6 to 12 months after SLI was supposedly dead.

 

I've been doing SLI since before SLI via the Voodoo cards back in the day. Of course, that was pre-GPU. It's a passion...

 

How can we organize and leverage this to demand they allow us to manipulate driver bits? If enough people care, they will at least give us that for our troubles (or to shut us up... lol).

 

I actually get my second 3090 today from EVGA (FTW3 Ultra) and have my second board (MSI Godlike X570) set up to go when a 5950X is available. The card will spend most of its time between a 10900K setup and the 5950X setup, depending on how many systems we need each week. I wanted to just go with Intel again since I have a 10700K lying around that can do 5.1GHz stable, but LGA1200 doesn't seem to have any boards for 3090 SLI.


Yeah, I used to have 1080s with SLI capability, but the system would not boot with the SLI bridge installed.


  • 1 month later...

Hi, I have an Nvidia Quadro K2200 that I use for modeling and rendering. Would it be a good idea to buy a second identical card to increase performance?

I have an ASUS Z170-A and an i7-6700.

Thanks. PS: I'm new to computer building; the computer I use wasn't built by me. But I built a PC for my son and started to like it, which is why I'm thinking of upgrading mine.


  • 1 month later...

So what about, say, HEDT motherboards? Would SLI still be truly dead with that kind of setup?


Rest in peace, SLI. You will be missed: 90% for your aesthetics and 10% for your bragging rights.

PC 0: Pinky 2.0

Ryzen 9 5950x — 64GB G.Skill Ripjaws 5 @3600Mhz CL14-13-13-28-288 — ROG Crosshair 8 Dark Hero — RX 6900 XT — Hardline Loop — Sabrent Rocket 4.0 2TB — Samsung PM961 1TB  WD Blue 4TB HDD — Corsair AX1500i — Thermaltake Core P5 

 

PC 1: Pinky (Yes that is her name) Here's the build

Xeon E5-1680V3 — 64GB G.Skill Ripjaws 4 @2400Mhz — MSi X99A Godlike Gaming — GTX 980Ti SLI (2-WAY) — Hardline Loop — Samsung 950Pro 512GB — Seagate 2TB HDD — Corsair RM1000 — Thermaltake Core P5

 

PC 1.1: Pinky (Mom Edition) Here's the build

i7-5960X — 64GB G.Skill Ripjaws 4 @2400Mhz — MSi X99A Godlike Gaming — GTX 980Ti SLI (2-WAY) — Hardline Loop — Sabrent Rocket 3 1TB — Samsung Q 870 Evo 4TB — Corsair HX850i — InWin S-Frame #190

 

PC 2: Red Box/Scarlet Overkill (Dual Xeon)

Xeon E5-2687W x2 — 96GB Kingston DDR3 ECC REG @1333Mhz — EVGA Classified SR-X Dual CPU — GTX 1070 SLI (2-WAY) —Hardline Loop — Samsung 750 EVO 256GB — Seagate 2TB 2.5" HDD x3 — Self-Built Case


  • 3 months later...

Pretty sure there won't be much help on this, but I just bought a new PC and put two EKWB 3070s in it (I would have preferred a single EKWB 3080, but it was completely impossible to get one). I know that SLI is off the table, but what can I do to get the most benefit out of them? Thank you in advance.


On 4/21/2021 at 2:44 PM, gnrendeiro said:

Pretty sure there won't be much help on this, but I just bought a new PC and put two EKWB 3070s in it (I would have preferred a single EKWB 3080, but it was completely impossible to get one). I know that SLI is off the table, but what can I do to get the most benefit out of them? Thank you in advance.

What do you want to use them for? For gaming, the second card will do absolutely nothing for you.


  • 1 month later...

Seeing as my 2080S has SLI support... is it even worth it at this point?

 

I need to get a -E board anyway... but I'm tempted.

Useful threads: PSU Tier List | Motherboard Tier List | Graphics Card Cooling Tier List ❤️

Baby: MPG X570 GAMING PLUS | AMD Ryzen 9 5900x /w PBO | Corsair H150i Pro RGB | ASRock RX 7900 XTX Phantom Gaming OC (3020Mhz & 2650Memory) | Corsair Vengeance RGB PRO 32GB DDR4 (4x8GB) 3600 MHz | Corsair RM1000x |  WD_BLACK SN850 | WD_BLACK SN750 | Samsung EVO 850 | Kingston A400 |  PNY CS900 | Lian Li O11 Dynamic White | Display(s): Samsung Oddesy G7, ASUS TUF GAMING VG27AQZ 27" & MSI G274F

 

I also drive a volvo as one does being norwegian haha, a volvo v70 d3 from 2016.

Reliability was a key thing and its my second car, working pretty well for its 6 years age xD


On 6/17/2021 at 2:48 PM, MultiGamerClub said:

Seeing as my 2080S has SLI support... is it even worth it at this point?

 

I need to get a -E board anyway... but I'm tempted.

There are still plenty of games that support SLI, but I wouldn't spend too much money specifically for it. If you already have the main hardware, then go for it.


  • 1 month later...
On 8/14/2014 at 12:37 AM, D2ultima said:

Hi everyone. I originally wrote this guide over at the kbmod forums, but as it turns out, that forum is as dead as Aeris in FF7. This forum is more lively, and thus I figured it'd be good to copy over my guide for all to read. This is a real-world, layman's-terms assessment of what SLI does and how it works. I have never used CrossfireX and therefore cannot say that all of this will hold true for it. The original guide (no longer updated there) is over at http://kbmod.com/forums/viewtopic.php?f=22&t=6212

 

------------------------------------------------------------------------------------------------------------------------------------------------

 

Basically, this is a guide meant to explain the upsides and downsides of SLI. It's mainly geared toward people who have midrange or last generation cards who wonder if they should get multiple cards or simply upgrade to a stronger one. This will list pretty much everything you will ever want to know about SLI in as great a detail as I can possibly create. It WILL BE A LONG READ. Also note that I have never attempted to Crossfire any cards, so this guide is MAINLY directed toward SLI, and while most of the ups/downs will be common to both SLI and CrossfireX, THIS IS NOT A CROSSFIRE GUIDE. There are large differences and I am not in a position to explain in-depth about CrossfireX.

 

 

First, I will clear up some fairly common misconceptions about SLI. 

 


1 - Your video memory is not added or shared. It is copied. If you have two 1GB video cards, the data in GPU 1 is copied to GPU 2, and thus you only ever benefit from 1GB of video memory. If a game wants to use 2GB, it won't put 1GB in GPU 1 and 1GB in GPU 2. Because of this, pairing a 4GB card with say a 2GB card for example does not work. Also, please be wary of the fact that most multi-gpu cards are sold with the TOTAL vRAM listed as the selling point. For example, the Titan Z. It does NOT have 12GB of available vRAM, it has 6GB on each card. You will only have 6GB usable vRAM in games/renders/etc. Same goes for the R9 295X2 and its uhh... "8GB" of vRAM.

 

2 - Dual GPU cards (like the GTX 590 and 690) ARE using SLI. They use weaker versions (usually through downclocks) of their generation's flagship for each of the GPUs on the card. They also often cost the same as OR MORE THAN two of the cards that comprise them; for example, 2 x 680 = ~$1000 at release, and 1 x 690 = ~$1000 at release. So the two single cards which comprise the multi-GPU card will always be the better purchase; however, the dual-GPU card works on motherboards with only one PCI/e x16 slot. Since these cards are no longer made, though, you're far better off just buying a newer-generation single card in any situation.

 

3 - You usually do NOT need 800W+ PSUs for most dual GPU SLI solutions. Most cards will work easily on a much smaller wattage PSU once it's good quality. Lots of dual GPU users I've seen (not saying all of them do) tend to go for 850W or 1000W PSUs. You would not need these PSUs for just two cards unless you're a very serious overclocker with a top-end CPU (i7-4790K/6700K/7700K are NOT top end) and two top-end GPUs... but then you wouldn't be using this recommendation anyway would you?
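To put rough numbers on that (a back-of-envelope sketch of my own, not part of the original guide; the wattages are assumptions based on typical TDPs):

```python
# Very rough worst-case system draw for a two-card setup. Defaults assume a
# ~91W mainstream i7, 180W per reference GTX 1080-class card, and ~75W for
# the board, drives and fans. Real gaming loads are usually lower.
def estimated_draw(cpu_w=91, gpu_w=180, gpu_count=2, rest_w=75):
    return cpu_w + gpu_w * gpu_count + rest_w

print(estimated_draw())            # 526
print(estimated_draw(gpu_w=250))   # 666, e.g. two higher-TDP flagship cards
```

Either way, a good-quality 650-750W unit covers a typical two-card setup with headroom to spare, which is the point above.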

 

4 - On Pascal GPUs, if one card is clocked differently than the other, SLI will not function asynchronously. If one card is overheating (Pascal responds heavily to temperature and will throttle boost well before approaching its thermal limits) or hitting a TDP limit while the other card is fine, both cards will be downclocked to match the slower one. For example, a reference 1080's maximum allowed TDP limit is 180W, whereas an ASUS Strix 1080's is 400W; run them together in mGPU and the reference card will throttle, fail to hold boost, and overheat far more than the Strix ever will (the reference 1080 may throttle to 1750MHz during 1440p high-FPS gaming while the Strix may be capable of 2000MHz at the same point). This means serious performance is lost, and great care should be taken when picking cards for multi-GPU. I do not know how Maxwell responds to this; I never had it to test, and I've never run into someone willing to test things like I might.

On Kepler cards (which you shouldn't be using anymore if you play games from late 2014 onward, for long reasons I will not get into right now; just take my word for it and upgrade when you get your next chance), if one card is clocked differently than the other, SLI will still function asynchronously, meaning it won't downclock one card to match the slower one. This was actually fairly new in the long life of SLI. It's also confirmed that mismatched memory clocks will work in SLI just like mismatched core clocks will. Please note, however, that extreme differences in clock speeds will cause stuttery, jittery gameplay similar to microstutter, even at very high FPS counts (60+), due to bad frame pacing. Also, regardless of utilization, I have confirmed that differing clockspeeds will not benefit the game that much; it will simply "function". If you want to get a boost from an overclock, keep both cards as close to each other as possible.

EVGA Precision X's K-Boost functionality is useful for this... but don't run Precision X on a dGPU-only laptop unless you want to find a bricked screen shortly after, as it opens up a vulnerability where the nVidia driver is able to overwrite the LCD's EDID data (note: the driver ALWAYS tries to do this; Precision X simply allows it to, as it is otherwise unable to actually do it). If you find yourself with an SLI laptop, stick to other tuning software such as MSI Afterburner or nVidia Inspector and find yourself a modified video BIOS (Kepler/Maxwell only) so your cards actually do what they are supposed to. If you have mobile Pascal, well, make sure your cooling keeps the cards at similar speeds and temperatures.

 

5 - The percentage utilization on the cards does not always equal the exact bonus in raw performance (hereafter referred to as "scaling"). For example, 99% utilization on both cards (a bit rare) may only be 90% boost in FPS. SLI should get between 85-95% boosts in FPS as per what the technology allows (game dependent), though many games since 2014 have much less scaling... 70% is considered "good" by many and below is common. Also note that the higher the resolution the game is rendered at, the lower the scaling for quite a few titles. If you want to play at 4K for some reason, expect as low as 26% scaling in Witcher 3 (with hairworks on anyway). The high bandwidth/LED/doubled-up-flex bridge solutions can help in SOME titles here, but PCI/e 3.0 x16/x16 is far more important. This is explained fully in the "The bandwidth issue" section of my guide below.
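If it helps to see the arithmetic, here is a tiny sketch (my addition) of what a given scaling percentage actually means in FPS terms:

```python
# "Scaling" is the fractional FPS gain from adding the second card.
def sli_fps(single_card_fps, scaling):
    return single_card_fps * (1 + scaling)

print(sli_fps(60, 0.90))   # 114.0 fps - a well-scaling title
print(sli_fps(60, 0.26))   # 75.6 fps - e.g. the Witcher 3 4K case above
```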

 

6 - If a game has slow support for SLI to be added, it is *NOT* nVidia's fault. It's the developer's duty to follow up and work with nVidia to fix multi-GPU support issues for their games before launch and post launch. If it takes 5 months for SLI to appear (like in Titanfall, which STILL has SLI issues) then you need to blame Respawn for it. The same also applies to AMD and Crossfire. Also do note that some game engines simply do not support SLI at all (officially) like Unreal Engine 4 and Unity 5. While you may have varying success forcing SLI on (depending on resolution, bridge type and PCI/e lane width as well as whether or not it produces visual bugs), understand that games coded in these engines will almost never get SLI (or crossfire) support.

 

7 - Your CPU and motherboard have an effect on what you can SLI and how many cards you can SLI. You need to count PCI/e lanes, essentially. Mainstream intel CPUs (like the i7-4790K) have only 16 CPU lanes (different and separate from chipset lanes); so 2-way SLI (with each card running at 8x bus speed) will be the max the CPU can support, unless the motherboard has a PLX chip to support more cards in SLI (which won't really be happening anymore due to the death of 3-way and 4-way SLI). SLI requires 8x bus speeds for a card to work (ironically, the PCI/e revision is irrelevant... PCI/e 3.0 x4/x4 is double the bandwidth of PCI/e 1.1 x8/x8, but the former doesn't work and the latter will); AMD cards I believe can run on 4x bus speeds. SSDs in the M.2 format may or may not use up PCI/e lanes from the CPU; you will need to check your manual or contact the motherboard vendor to tell. Intel's Skylake chipsets have more chipset lanes to use on PCI/e SSDs than previous chipsets have, even Enthusiast ones like X99.
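As a rough illustration of the lane-counting rule above, here is a small sketch (my addition; it ignores PLX chips and chipset lanes entirely):

```python
# How many cards a CPU's lane budget supports at x8 each, the SLI minimum.
def max_sli_cards(cpu_lanes, lanes_per_card=8, reserved_lanes=0):
    """reserved_lanes: CPU lanes eaten by e.g. a CPU-attached M.2 SSD."""
    return (cpu_lanes - reserved_lanes) // lanes_per_card

print(max_sli_cards(16))                     # 2 -> mainstream i7, x8/x8
print(max_sli_cards(40))                     # 5 lane-wise, though SLI itself tops out at 4-way
print(max_sli_cards(16, reserved_lanes=4))   # 1 -> an x4 CPU-attached SSD breaks x8/x8
```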

 

8 - Microstutter is mostly a thing of the past. If your game isn't running at ~35fps or below, you should experience no microstutter without some external influence (like watching a high resolution flash video on a second monitor in Google Chrome). If you get a lot of microstutter in various games at high FPS counts, then the fault lies elsewhere on your system, or the developers have screwed up in the games you are testing (most likely this). I can confirm Mass Effect 3, CoD: Black Ops 2, Battlefield 4, Dark Souls 2 (SOTFS as well), CoD: Black Ops 1, Dark Souls 3, Dying Light, Skyrim, Killing Floor 1, Super/Ultra Street Fighter 4, and Elite Dangerous have no stutter in Windowed, Borderless Windowed or Fullscreen modes using SLI. Use those to test your system, and if there is a stutter issue in another game in say... borderless windowed mode (like Killing Floor 2), then blame the devs.

 

What can I SLI? (970 SLI issue information and potential fix)

 


Before we get into "what" you can SLI, make sure your motherboard supports SLI. SLI requires PCI/e x8/x8 configurations at the minimum. It does not matter what revision of PCI/e is present, even though PCI/e 3.0 x4 is more bandwidth than PCI/e 1.1 x8. Your motherboard must be capable of running at minimum x8/x8 configurations of the same PCI/e revision (3.0 x16/2.0 x16 will fail) to use SLI.

 

You:

CAN use the same cards in non-performance-boosting mode (driving extra monitors) to do triple monitor gaming.

CAN use cards with different core clock speeds in SLI without one card being slowed to match the other.

CAN use cards with different memory clock speeds in SLI without one card being slowed to match the other.

CAN use cards from different manufacturers as long as their specs are the same (I.E. "MSI GTX 780 lightning" + "Gigabyte GTX 780 Windforce" will work).

CAN use cards with different forms of cooling in SLI (reference cooler + non-reference cooler) excepting possibly the GTX 970.

CAN use two dual-GPU cards (like the 590 or 690) in QUAD SLI in a board that only supports 2-way SLI.

CAN use four cards in 4-way SLI (NOT Quad SLI) in a board that specifically supports 4-way SLI (and with a 40-lane or higher CPU).

 

MAYBE can use mismatched cards to force higher levels of AA on DX9 and older OpenGL games. I cannot test it.

 

CANNOT use mismatched cards (GTX 770 + GTX 980 will fail).

CANNOT use cards with different cores even if the same name (GK104 GTX 660 (OEM) + GK106 GTX 660 will fail, or GM206 GTX 960 + GM204 GTX 960 (OEM) will fail).

CANNOT use cards with different vRAM sizes even if the name and core is the same (GTX 770 4GB + GTX 770 2GB will fail).

CANNOT use cards with different memory bus width even if the same name and core is the same (GTX 760 192-bit (OEM) + GTX 760 256-bit will fail).

CANNOT use cards that do not support SLI technology (most GT-series cards, and the GTX 750Ti do not support SLI).

CANNOT use 4-way SLI in a motherboard that does not specifically support 4-way SLI. Even if the CPU has 40+ PCI/e lanes, it does not mean the board can run 4-way SLI. PLEASE NOTE: a board CAN be capable of 4-way Crossfire and NOT 4-way SLI, but a board capable of 4-way SLI *must* be capable of 4-way Crossfire.

 

While not being "SLI" itself, these apply to multiple card usage:

 

CAN use mismatched cards if you are driving a second or third monitor from the extra card. This needs no SLI bridge.

CAN use mismatched cards if you are driving PhysX on the second card. This needs no SLI bridge.

 

MAYBE can use mismatched cards driving extra monitors for triple monitor gaming (I do not know, so I am leaving it as a maybe. ANYBODY who can test and let me know, please do.)

 

More 970 issues (why do people still buy this card?):

Apparently some users have been having issues turning on SLI with two GTX 970s. While I am unaware of the cause, it was pointed out that a program called DifferentSLIAuto can fix this. The program also has a possibility of allowing two GPUs with different vRAM to enable SLI (but only the vRAM of the lower card will work). I AM NOT ADVOCATING THE PURCHASE OF TWO GPUS WITH DIFFERING VIDEO RAM SIZES AND EXPECTING THIS PROGRAM TO WORK. I am simply stating its existence, and that your mileage may vary.

 

^ Further to the above, apparently manufacturers have been using different bus IDs (more info needed) for various kinds of 970 cards, and those often will not work together in SLI. Here is EVGA's compatibility table for reference; I don't have any other manufacturers' tables to show, though other manufacturers suffer from the same issues. Anyone who wishes to PM me with these tables, or add them as replies, feel free to do so and I will update the guide as necessary.

 

Now that that's done, let's get into the benefits of SLI. There's some benefits I'll list that most people don't actually know.

 


1 - Obviously, better game performance. Most games have SLI profiles and will perform better with more than one card, and your stuff will run better. This is almost a given.

 

2 - Your memory access bandwidth almost doubles, even though memory size and memory write speed is not added. Two cards with a 256-bit memory bus will act almost like a 512-bit memory bus for the game. This means that two cards with a 256-bit mem bus (like 670s or 770s) will be more beneficial in memory-bound games than even a single 780Ti (especially if using the 4GB versions, as you won't be overshadowed by the extra 1GB the 780Ti has over the 2GB versions of the cards).
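For reference, per-card memory bandwidth is just bus width times effective memory clock; a quick sketch (my addition) of the 770-versus-780Ti comparison above:

```python
# Bus width in bits, effective memory clock in Gbps per pin.
def mem_bandwidth_gbs(bus_width_bits, effective_clock_gbps):
    return bus_width_bits / 8 * effective_clock_gbps

gtx770 = mem_bandwidth_gbs(256, 7.0)     # 224.0 GB/s per card
gtx780ti = mem_bandwidth_gbs(384, 7.0)   # 336.0 GB/s
print(gtx770 * 2, "GB/s of access across 770 SLI vs", gtx780ti, "on one 780Ti")
```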

 

3 - Less per-card heat. It is extremely unlikely to push 100% on each card in multi-GPU, even if both cards are running near 100% utilization as per monitoring software. Also, your memory controller will have less load. This means each card uses less power and thus runs a bit cooler than forcing it to one card. Do note that this is assuming cooling is well set up, and one card is not hindering the cooling of another card, etc. If your cards are cramped in your case, expect the top card to run quite a bit hotter. Also, know that temps are *NOT* primarily related to card usage, but more to HOW the games use the card and how high your FPS is. For example: Skyrim (RCRN, max volumetric lighting option, 2k res boulders/etc texture mods, pure water mods, etc) lets my GPUs run cooler than fallout new vegas (no mods) despite the higher GPU load on Skyrim, believe it or not.

 

4 - 3D utilization. If you happen to have one of those games that are good in 3D and you have nVidia's Stereoscopic 3D glasses, there's a(n albeit rare) benefit for you with SLI! Basically, 3D essentially cuts your frames in half. You WILL have to render the game twice to see in 3D, which means half the FPS. But let's say your game uses... 70% of each card and refuses to use any more and you get good framerates. If you turn on 3D, you would likely ramp your usage up to over 90% and you'll end up not halving your FPS, and instead only losing maybe 20% or so. Do note that this DOES NOT translate into benefits for virtual reality.

 

4b - If you have a game that is good in 3D and does not support SLI, does not benefit from SLI (not "does not support") OR is simply best run with a single video card (which can be forced in nVidia control panel or nVidia Profile Inspector) you can run it off of one video card and turn on 3D, and your second card will simply render the second set of frames perfectly and give absolutely NO drop in FPS. And I mean NONE. When playing paranautical activity (single GPU only) I tried it in 3D to find that my 250+ fps remained consistent in 3D instead of getting it halved. THIS ALSO WORKS WITH GAMES WHICH HAVE ISSUES WITH MULTIPLE CARDS. The issues will not present themselves. Again, does not apply to VR.

 

5 - SLI can actually be forced on some games that don't actually support it... such as Skyrim's RCRN or ENB modded states. You can try forcing Alternate Frame Rendering (AFR) 1 or 2 via nVidia Control Panel (which in the case of Skyrim enables SLI for RCRN/ENB for example using AFR2) or you can use nVidia Profile Inspector to force specific profiles or play with SLI bits to get things working as optimally as possible (for example, the 0x080040F5 "Daylight, Evolve, Monster Hunter Online benchmark, etc" profile grants me positive scaling and no bugs in ARK: Survival Evolved which is an Unreal Engine 4 title as long as I don't turn on in-game AA). Then you can still enjoy the benefits of SLI even in many "unsupported" titles while keeping your lovely framerates. Here is a (rather short; I know there are much more elsewhere) list of games and compatibility bits that improve them; it's a good start, so good luck! http://www.forum-3dcenter.org/vbulletin/showthread.php?t=509912

 

6 - Using SLI and vSync forces games by default to use "quad buffering", which almost always skips the issues present with non-triple-buffered vSync where if your fps drops below your monitor's refresh rate it instantly drops a large amount (I.E. hitting 55FPS when vsynced to 60 would drop you to 30fps until you could render 60 again, and going below 30 would drop you to 20fps, then to 15fps, etc). This almost entirely negates a need for adaptive vSync in games which don't support triple buffered vSync in their options menus, as long as SLI is supported for the game (some rare titles, like Black Ops 2 Zombies, will still drop you to specific points of FPS counts; it's better to use Rivatuner's framerate limiter option to limit your game to your desired framerate instead of vSync... which also, unlike vSync, works 100% of the time in windowed modes.You could also use NVPI's "frame rate limiter" option which does not require anything hooking to the game, but it increases input lag much more than RTSS does, though not as much as Vsync will).
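The FPS "drop tiers" described above fall straight out of how double-buffered vSync waits for the next refresh; a small sketch (my addition) of the arithmetic at 60Hz:

```python
import math

# Plain double-buffered vSync: a late frame waits for the next vblank,
# so the displayed rate snaps to refresh/2, refresh/3, and so on.
def double_buffered_fps(render_fps, refresh=60):
    if render_fps >= refresh:
        return refresh
    return refresh / math.ceil(refresh / render_fps)

for fps in (75, 55, 29, 19):
    print(fps, "->", double_buffered_fps(fps))
# 75 -> 60, 55 -> 30.0, 29 -> 20.0, 19 -> 15.0
```

Triple or quad buffering (and frame limiters like RTSS) avoid that snapping, which is exactly the benefit described here.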

 

7 - PhysX delegation (this doesn't actually require SLI as mismatched cards can be used, but can be done with SLI setups). Some games use PhysX really badly. But either way, if you force PhysX to a dedicated card, your other card(s) can run your game easy. This is ESPECIALLY useful with triple cards, as you can force one card to run PhysX and the other two to run the game. So don't throw away that old 780 before you grab a 980Ti or something! It can still be useful! N.B. Don't use extremely old cards with extremely new cards though. If you pair a 560Ti with a 780, it might actually run worse, because the 560Ti isn't able to keep up with the 780 (though it can with a 680). But if your old card is something like a 770, then say a new 980 and a 770 for PhysX would probably work quite well. There's no real formula for this, you kind of have to average what'd be good or not. And do note that while the number of games using PhysX is very small, it can indeed help. Note that sometimes this can give weird behaviour; Killing Floor 2 found it best to leave both cards enabled and just turn on PhysX, otherwise weird stutter and inexplicable dropped framerates was noticed, with the second card being pegged at 99% for long periods of time with nothing PhysX-related on-screen (Pascal; works perfectly on Kepler).

 

And now here come the downsides!

 


1 - Not all games use SLI. This means that sometimes your game will run on one video card, and if you force SLI on it may decrease performance or introduce graphical glitches. For example: Titanfall. SLI is a detriment to that game because it causes graphical glitches. Dark Souls 3 is another title; SLI scaling is negative in that game. This means using it on one card will almost certainly give you higher FPS than using it on two cards. Deus Ex: Human Revolution Director's Cut also did not have an SLI profile for a few months before recently getting one. This is extremely common these days too... developers are coding graphics using AFR-unfriendly rendering techniques for no real reason whatsoever, and many simply are not bothering to get multi-GPU working.

 

2 - Some games actually do not benefit from SLI at all. This is different from not supporting SLI. For example: Arma 2 & the DayZ mod, CS:GO. SLI or not, my FPS is usually the same. Because of this, I simply force it to a single card partially to save energy and to avoid any detriments that SLI being on may bring (such as stutter in Arma 2 when you're at 25fps in Cherno because the engine is about as optimized as a car with square wheels). This can actually become a bit of a benefit if you wish to use 3D or something, (as I've played Arma 2 in 3D before and it's worked pretty okay honestly) as 3D'll use your second card for 0 FPS hit, but it's mainly a detriment if you don't have 3D, don't wish to run a game in 3D, the game doesn't work well in 3D or don't wish to use your second game-oriented card for something else. It also reduces the usefulness of your dual GPUs if your mainly played games don't use SLI. Like if you're a Dark Souls 1, Dark Souls 3, Binding of Isaac & BOI Rebirth/Afterbirth, Wolfenstein: The New Order, Dishonored, DayZ, FFX/X-2 HD, Terraria, Street Fighter V and Mortal Kombat XL player? SLI will not benefit you in ANY format.

 

3 - Games sometimes will not use past a certain point of SLI usage. For example: I couldn't get Metal Gear Solid V: Ground Zeroes to run past 70% or so of my cards. Now, while 70% x 2 = 140% of one card vs 99% usage on a single card gives the benefit to SLI, if you buy a single stronger card it'll outstrip your two cards easily if it also sits at 95-99%. For example, at 70% util on both cards for two 980s, a single superclocked 980Ti or any 1080 will give you better performance. This util varies per game AND PER IN-GAME FPS. Some games simply don't use up the video cards well past a certain fps; so you may easily get well over 60fps before your cards start to drop utilization (THIS IS RARE). If you're aiming for a consistent 120fps or something? A single stronger card will do better.
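Here is that utilization comparison as plain arithmetic (my own sketch; the relative speed of the bigger card is an assumption):

```python
# Effective throughput = per-card speed x utilization x card count,
# normalised so a single GTX 980 at 100% equals 1.0.
def effective_throughput(card_speed, utilization, count=1):
    return card_speed * utilization * count

two_980s  = effective_throughput(1.00, 0.70, count=2)   # 1.40
one_980ti = effective_throughput(1.45, 0.99)            # ~1.44, assuming a superclocked 980Ti is ~45% faster than a 980
print(two_980s, one_980ti)
```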

 

4 - SLI can break. It's rare, but it happens, and has happened across multiple OS installs and OSes (win 7, win 8, win 8.1 and multiple driver versions). It'll ramp your utilization straight up for one of your cards to 95%+ (usually secondary card, but it can happen on the primary), and your other will sit at about 40-60% or even less, and your FPS will tank (though smoothness will NOT be affected very much). The only way to fix this is with a computer restart or (I assume, but never tried, disabling/re-enabling your cards), and it doesn't actually SEVERELY drop your performance. Like, it might break a little. Drop you from 90fps to 70, but still playable. Other games though it could be a far worse issue depending on how much you need your second card to get playable FPS. I don't know what causes it and I've had it happen to me a few times, but it is annoying to have to fix it.

 

5 - SLI drains more power. This should be obvious.

 

6 - SLI only benefits you by raw power increases. If a game requires more vRAM? Multiple GPUs aren't gonna do as much for it as you would think. For example: 2 x 770 2GB cards playing Watch Dogs; ultra spec uses 3GB vRAM which it cannot provide.

 

7 - SLI causes problems for streamers. Though it's obsolete now, OBS Classic only grabs 1/2 the framerate with game capture if a game is using two cards. It only takes frames from 1 card, so if you set 60fps and use game cap with SLI, you'll be visually outputting 30fps even though OBS reports full 60fps being displayed. To combat this you need windowed mode for the game and window capture (which then causes the game to get bouts of slowdown for a couple seconds on OBS Classic), or a capture card, or to use something like Dxtory or Playclaw 4 and their virtual webcam recording feature, but people have their problems with those programs in their own right and may not wish to use it. Window capture also means more of a performance hit in OBS Classic, and most importantly, some games dislike windowed mode *cough* skyrim *cough*. SO if you are a PC streamer and like to use OBS Classic and don't have a streaming PC, take this into EXTREME consideration. OBS Studio lacks this issue, but may take some elbow grease to get working. I expand on this under the "My thoughts" section lower in the guide.

 

8 - SLI with three monitors FORCES them to be a huge widescreen. If you fullscreen a game at say... 1080p with 3 monitors and SLI enabled, your other two screens will go black. This makes the extra screens for multitasking difficult to use. You actually need FOUR screens to get a "second monitor" effect once SLI is enabled. There MAY be software out there which removes the widescreen effect, but I believe it has issues otherwise.

 

9 - Turning on/off SLI requires a restart of the PC on laptops (I mistakenly thought desktops had this issue. Sorry about that). It's annoying, especially as your SLI turns off after each driver update. It's more of an annoyance because nVidia's drivers, for quite some time now have NOT required a restart after installation. There IS a windows registry hack you can use to remove it (which I used to use), but like all windows registry hacks, one should be careful. You can also use desktop drivers by modifying the driver .inf prior to installation (what I now do, screw mobile drivers), but it is a pain to do in windows 8 and 10, as modifying the .inf required involves rebooting with driver signing enforcement disabled to allow installations (you can't even do this on Windows 10 with Anniversary update 1607 onward if Secure Boot is enabled), and editing the registry after driver installation (if not using a modded driver .inf) will still require at least 1 restart... so either way you're going to have to restart once. So if you don't usually ever turn on/off SLI, you may not want to bother with trying to find this fix. Do note that nVidia could change this to match desktop functionality whenever they want, and simply don't care enough about laptop users to do it. I have had confirmation that they know this issue exists from one of their reps literally years ago.

 

10 - SLI performs a worse if your game is in Windowed mode or Borderless Windowed/Fullscreen Windowed. It used to hold high utilization % and such, but with newer drivers/cards (and newer games) it has dropped, and usually there's wildly varying utilization %; Overwatch for example grants about 80% primary GPU util and 92% slave GPU util. The ACTUAL performance hit used to be somewhere around 5% from what I once saw, but now it's between 10-15%. Let's say if you were getting 150fps in windowed, you might get 210-220fps fullscreen. It's now enough to write home about, and if you like to play in windowed/borderless windowed modes yet you find yourself performance-squeezing in a particular game? You may want to just fullscreen and block out the alt-tabbing distractions for a while. 

 

11 - SMAA and TAA are amazing forms of Anti Aliasing. They're the most efficient forms of AA currently out; barely any performance hit while still getting rid of jaggies in full-scene images without the excessive blur of FXAA and without the performance hit of MSAA. SMAA 1x is essentially a completely free form of AA, and SMAA T2x has a hit of about 3fps (from 60fps in Crysis 3), while TAA is something like a better SMAA T2x. The problem with SMAA is that SMAA's "T" AA formats are unusable in multi-gpu configurations. THIS IS IMPORTANT. If you see SMAA T1x or T2x in a game while using SLI and you attempt to enable it, it will force SMAA M2GPU on you if you're unlucky, and simply disable temporal filtering if you are lucky. SMAA M2GPU is a terrible form of AA, which combines SMAA 1x and MSAA 2x for a greater performance hit than MSAA 2x on its own would take. If a developer uses SMAA 2x (without the temporal filter that the T stands for) then that should be fine, but never use M2GPU or enable T2x with SLI on. The visuals are not worth the performance hit and you're far better off using 1x or MSAA 2x instead (please note, the only game I've ever seen SMAA 2x as a separate AA choice to T2x has been Evolve, and Evolve Stage 2 no longer has that benefit). TAA, like TXAA, manages to use temporal filters (that remove shimmering and crawling lines, usually in the distance) on textures in multi-GPU setups, even though temporal filter should need to have all the frame data, and with alternate frame rendering formats they shouldn't be able to access the frames the other GPU is processing. Unlike TXAA, however, TAA has a massive performance difference between single and multi-GPU, and can go so far as to make people believe it not worth the performance hit in SLI. It only exists in a few titles that use SLI now as well, so this may change in the future, but it was developed by the Epic devs for their SLI-less Unreal Engine 4, so don't expect miracles. Also, beware that turning on TAA in games that do not support SLI but have SLI forced on may cause RIDICULOUS CPU usage and slow framerates to a crawl on Kepler cards.

 

12 - MFAA does not currently work in SLI.

 

13 - As with the above downside, Maxwell's voltage fluctuations also have stability and overclocking problems in SLI configurations. If you're unlucky, they can happen at stock clocks too. I'll link this thread a friend of mine n=1 wrote over on notebookreview, as he has put it into much more detail than I can (since I've never owned maxwell cards). http://forum.notebookreview.com/threads/on-the-subject-of-maxwell-gpus-efficiency.773102/  Now, what seems to happen is that the cards' voltages don't always match each other, and their clockspeeds as a result may not always match due to the nature of "boost". nVidia started adding a disclaimer to their driver patch notes that differing voltages in SLI is an "intended feature" and that changing them stands to gain nothing to the user, but fixing them far improves stability (and increases power consumption, making their "low power" cards no longer "low power"... so they will never fix the broken design). If you're not a user who is keen on doing this kind of "fix" as outlined in the above thread, but you experience problems in SLI (driver crashes, or random stutters, etc) then you may very well have no choice. If this sounds like a scary/stupid bug, it's because it is. It shouldn't exist. But it does, and while the number of people who experience it may be low (and the others don't actually care), it IS a bug that more than one person has noticed, so it remains on this list. Note that Pascal cards do not appear to exhibit this behaviour as their voltage curves are much tighter.

 

14 - SLI uses extra CPU power due to driver overhead. It is something that's more apparent in recent games, but I've seen it to be true. It might not matter a whole lot to many people, but it is a downside of SLI, and thus in the guide it goes.

 

15 - It seems some games have a great disdain for borderless and windowed modes with SLI lately. The most blatant offender is Killing Floor 2, but there are others lately. Running in borderless or windowed modes causes stutters and makes the game feel choppy, even with high FPS. PLEASE NOTE: THIS IS A DEVELOPER ISSUE AS IT IS NOT PRESENT IN ALL GAMES. But because I've lately seen a couple titles having the issue, I feel a need to add it to this guide. If you like borderless windowed mode gaming, titles post-early-2014 may not be your friend with SLI.

 

16 - Bonus round (correct me if I'm wrong): CrossFireX only works on fullscreen games. So games which dislike alt-tabbing, where you run in windowed mode, will NOT benefit from Crossfire, which renders the mode entirely useless if you need to run a lot of your games windowed for alt-tabbing/streaming/etc. purposes. Actually, multi-GPU has gotten so bad that I have to disregard CrossfireX entirely. There is no point to buying multiple of any AMD card for the purpose of gaming.

 

Resolved and/or no-longer applicable downsides to SLI (If you have SLI, read this section to see if any of these fixes apply to you).

 


THE PROBLEM: Google chrome dislikes SLI. It will flash your pages white or black and you will need to move the tab into a different window, close/re-open chrome or maximize/restore down the window whenever it happens. This can be mitigated by going into chrome's flags and disabling "hardware accelerated video encoding". This puts more strain on the CPU for videos and livestreams however, but it's the only way to use chrome 100% bug free while the bug is in effect.

THE FIX: This no longer happens as of WHQL 340.52 drivers. If for some reason you need to use older drivers for benchmarking or specific-game-performance purposes however, this will still be a problem for you, and thus I have left the old fix and the problem description intact.

 

THE PROBLEM: Asynchronous SLI scaling can cause a lack of smoothness if the difference in clock speeds is very high between cards, even at higher framerates (60+). I've been unable to record it in any way, mainly because recording will clock up the card. Now, you might be thinking "well my cards are going to be close together in speeds, this is not going to affect me!". Well that's where you're wrong! Kepler cards (and possibly Maxwell, which shares similar boosting traits to Kepler) have a "feature" where if your card has triggered 3D clocks but is not actually using them to its fullest potential (like say... <40% scaling on both cards for 60fps?) then it may begin to downclock to save power, as the game isn't drawing enough to keep it at its base 3D clock. It may not hit the 2D clocks, but it may indeed throttle, and your game will suddenly become very annoying to play. The problem here, is that the slave card won't downclock. Only the primary. Older Fermi cards should not have this problem.

THE FIX: Open nVidia control panel's 3D application settings and find the game and select the "Power Management" option and set this to "Prefer maximum performance". I do NOT know if this is a feature purely available to laptops, and if it isn't, if ALL desktop Kepler/Maxwell GPUs have the capability. Here is a screenshot of what the option looks like, using a culprit of mine, Dark Souls 2. I suggest not setting this function in global settings as your card will never actually downclock itself. If this fix is unavailable to you, the second, less-reliable way to fix this is to simply bump resolution and graphics more. The higher you can go, the better. If you have a way to force supersampling or higher AA values or anything? Go for it; the more power the game takes is the better for you. If you have a Kepler DESKTOP card, DSR should be available to you as of 344.48 drivers and later, and thus supersampling ought to not be a problem. Kepler and Maxwell notebook cards now have DSR, but since nVidia doesn't care enough about notebook users to update their information online, I can't find which driver contains it (their official spec page claims we don't have it yet). I know that 347.88 and later drivers do, but not which driver started it. N.B. Maxwell and Pascal cards will clock up, but only to base clocks, when doing this.

 

THE PROBLEM: Gsync + SLI + DSR does not work.

THE FIX: Some of the newer drivers from nVidia have fixed this. I'd assume r364 branch, though those aren't drivers I'd suggest to anyone. If 362.00 doesn't allow it (I can't test; I don't own gsync, sorry) then wait for a later branch that's more stable. It'll be fine.

 

 

My thoughts and suggestions section.

 


1 - SLI support is getting progressively worse. Most new, demanding games that have been released absolutely detest SLI. I keep seeing more and more of this, and I'm at the point where I believe SLI is something that should not even be ATTEMPTED without owning the single strongest (or second strongest, if there is only a small performance difference, like 980Ti vs Titan X) GPU on the market. Many new and popular engines are designed NOT to support AFR SLI or Crossfire, like Unreal Engine 4 (list of its games) and Unity (list of its games; no idea why anyone would use Unity for 3D/graphics-heavy titles, though). Right now, SLI is more of an "added benefit" than an "expected power boost" in new, demanding games.

Maybe if developers actually bother coding optimizations directly into DX12 and Vulkan games and we switch from AFR to SFR (split-frame rendering), and/or the system is able to use multiple GPUs as one big GPU, we won't need to worry anymore. But that is a long way away, the performance bonuses will likely be smaller than AFR's, and since we'd need devs to actually put time, money and effort into coding for DX12 or Vulkan, I highly doubt it. All DX12/Vulkan titles currently use driver-side optimizations and simply utilize the API for less CPU load anyway. And even if they did do it, almost every existing title TODAY will not get DX12/Vulkan API upgrades (let's be honest), DX12 would require us all to be on Windows 10 anyway (which is honestly an extremely anti-consumer OS I recommend nobody be on), and most importantly: NO MAXWELL OR PRIOR GENERATION nVidia card is built to support SFR for most games. We have a new type of bridge with decent bandwidth improvements to deal with the vRAM buffer data issue, but only Pascal cards will officially use it properly.

So... you want to get SLI? MAKE SURE THE CARD YOU ALREADY OWN IS AT LEAST WITHIN 20% OF THE STRONGEST GPU ON THE MARKET. That way, if a game screws you over for SLI support, you can still play it fairly well. I swear, if you SLI two GTX 960s, 970s, 980s or 1070s after reading this guide, I will {REDACTED}.

 

2 - If you're a heavy streamer who doesn't have a streaming PC but still wants your stream to look good, then SLI may not be for you. OBS Multiplatform now works well enough with SLI that I can say it's much less of a headache, but some games still have issues. For example: though it's a single-GPU game, The Binding Of Isaac Rebirth/Afterbirth required me to use game capture with the multi-adapter configuration, or the game would literally run at 10fps. But with multi-adapter capture it ran at about 80fps for some reason, actually forcing the game about 33% beyond its intended speed. So I had to stream at 60fps and use the "limit capture framerate" option to get it working properly. For a game that doesn't even use the second GPU. Please don't misunderstand: this does not mean that ALL games have these issues, or that SLI never works correctly. It is an example intended to show that simply having SLI turned on for streaming purposes requires some problem-solving attitude to get things working right. I did not ASK anybody how to fix Binding of Isaac; I figured it out in exactly 3 minutes of trial and error. But this is the difference between my mindset and the mindset of the person who tosses $3000+ at iBuyPower or OriginPC to just "get a beast PC to stream on". If problems occur, you need to be someone who is comfortable figuring out what's wrong. And if you want or need to use old OBS for some reason, the limitations from before apply: some games simply work better fullscreen, alt-tabbing can be problematic for them, and of course you cannot window-capture a fullscreen window. So either invest in a capture card/streaming PC/etc., make it work with Dxtory, or simply be prepared to force a lot of games to single-card usage, which further reinforces the need for strong single cards before SLI even happens.

 

3 - Building off point #2 but going a bit further: if you're NOT a consumer who is willing to learn and try things, and you just want a machine that runs well and plays your games, but your current midrange card isn't giving you that constant 60fps in BF4 on ultra like you want, the best thing for you is to upgrade your card. It won't benefit you to acquire potential issues that you won't really know a thing about. With a single card, you just update your drivers, blow some canned air at it every month, and keep it cool. Dual cards are a different story: you've got to make sure they're enabled and working well, force them on or off depending on whether games behave with them, and so on... basically, it's a headache you don't want. To someone like me it's no headache at all, but I like this stuff. You likely don't like it as much as I do.

 

4 - If 3D is your thing, extra cards help. There's no contest. SLI usually does not scale 100% across both cards, so you usually have headroom and won't instantly lose half your FPS. Just remember that not everything works well with 3D. Even though BF3 was 3D Vision Ready, BF4 ended up rated "not recommended". And BELIEVE ME, IT IS NOT RECOMMENDED. DON'T EVER PUT BF4 IN 3D. TRUST D2. Unfortunately, 3D is basically dead now, and I wouldn't tell anybody who doesn't already have it, or isn't willing to play mostly older titles with it, to get it. And note that VR-ready titles are not 3D-ready (or even 3D-happy) titles... Elite: Dangerous is one such example.

 

 

The bandwidth issue

 


 


So, I keep stating that there is a bandwidth issue with regards to SLI and current-generation cards. The issue is that a lot of tech requires far more bandwidth between cards than is currently available in order to work properly. Cards transfer data over the PCI/e bus and the bridge (if one is present; bridgeless SLI solutions exist, though they're rare). The standard SLI bridge is clocked at 400MHz and grants approximately 1GB/s of bandwidth between GPUs, on top of the bandwidth from the PCI/e connection. This is not enough to transfer the amount of data in each frame (frame buffer data), or the relevant memory between cards, from frame to frame for new tech, or for technologies such as temporal anti-aliasing in general (though we've seen TXAA and, in some cases at a large performance hit, TAA function in SLI). It has gotten to the point where forcing SLI in certain recent unsupported games/game engines, or at certain resolutions (usually, but not always, above 1080p), literally requires PCI/e 3.0 x16/x16 for the cards in SLI to even grant positive scaling. Yes, that means negative scaling (less FPS than a single GPU) happens otherwise. And where it doesn't, functional scaling is often low (30% or less, making it a useless money sink).

This means your CPU must be the intel mid-tier enthusiast chip of its generation, at a minimum of around $500 USD (4930K, 5930K, etc). AMD CPUs which only provide PCI/e 2.0 lanes (Bulldozer, Piledriver) cannot even be considered, and as for Ryzen, let me get something out of the way right now: it is AMD's version of an INTEL MAINSTREAM-CLASS PROCESSOR LINE. There are only 20 CPU PCI/e lanes (down from the 38 lanes of Piledriver, though that platform was created when PCI/e 2.0 was the best available), no quad-channel memory support, etc. One should not consider it for multi-GPU usage either, in light of this. As for some proof of the bad scaling? Here you go:

Proof #1

Proof #2

Proof #3 (x8/x8), and PLX Mobo (x16/x16).

Proof #4

Proof #5

Proof #6 (x16/x16 vs x8/x8 and even x16/x16 vs x16/x8, from Kepler through Pascal)
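
To make the figures in the paragraph above concrete, here's some back-of-the-envelope arithmetic. This is a sketch only; real AFR traffic depends on the engine and on what the driver actually has to copy per frame, so treat the "frame copies per second" numbers as an upper bound for illustration, not a benchmark:

# How much data is in a single 32-bit colour frame, and how many such frames per
# second each link could move if frame copies were the only traffic crossing it.
# Illustrative only - real inter-GPU traffic varies per game/engine.
BYTES_PER_PIXEL = 4  # 8-bit RGBA

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
links_gb_s = {"standard SLI bridge": 1.0, "PCIe 3.0 x8": 7.88, "PCIe 3.0 x16": 15.76}

for name, (w, h) in resolutions.items():
    frame_mb = w * h * BYTES_PER_PIXEL / 1e6
    print(f"{name}: {frame_mb:.1f} MB per frame")
    for link, gb_s in links_gb_s.items():
        print(f"  {link}: at most ~{gb_s * 1000 / frame_mb:.0f} frame copies/s")

Even in this best case, the old 1GB/s bridge tops out at roughly 30 copies of a 4K frame per second before any temporal or shared-resource data is counted, which is why those techniques and higher resolutions lean so hard on the PCI/e lanes instead.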


AMD offered a solution, one that I consider to be the best possible option: they use the PCI/e bus bandwidth for card-to-card communication by configuring the memory in a certain way, which they call XDMA (Crossfire Direct Memory Access). This introduces a latency issue, but it's one that even AMD has found ways to minimize or eliminate, and I am certain nVidia could do the same if they cared (they apparently do not). In a PCI/e 3.0 x8/x8 situation (what a user is most likely to be running with a mainstream intel CPU, the most popular PC configuration for gamers), a user would have 7.88GB/s of bandwidth this way, far more than the 900MB/s or so that the Crossfire bridge granted previously, and far more than the 1GB/s that the SLI bridge grants. It requires the memory on the card to be designed a certain way, however, so it cannot be retroactively added as a "technology" to any GPU. As an added benefit, XDMA also helps fix frame pacing in multi-GPU, making it smoother to use.

nVidia also made a solution to improve interconnectivity between cards: NVLink. However, because NVLink replaces PCI/e, it is only sold on enterprise boards intended for use with Tesla cards - supercomputers, to simplify. You are simply not getting a pair of gaming cards onto one. Since it is a replacement for PCI/e, apparently requires proprietary card connections, and inevitably costs more money to implement, it would never work for mass-produced consumer boards. I believe this requires XDMA-style memory configurations too. But either way, it can't be sold to people like me.

So, recognizing the need for more bandwidth, but not being able to sell us simple XDMA-style cards for more $$, they decided to simply improve the bridge. There was already an overclocked LED bridge that runs at 650MHz, up from the standard 400MHz, granting 62.5% more bandwidth for 1.625GB/s. nVidia simply doubled up the connectors to use both SLI fingers and presto: 3.25GB/s of bandwidth between cards! This is a huge step up from 1GB/s, but nowhere near the 7.88GB/s to 15.76GB/s present in XDMA-style x8/x8 or x16/x16 configurations respectively. It also effectively killed three-way and four-way SLI. I thoroughly dislike this approach, and it further reinforces my belief that SLI is basically being discarded by devs and nVidia alike (especially considering the new information about cards throttling each other that I introduced at the top). This is nothing but a band-aid on the bandwidth issue, and developers have little incentive to avoid AFR-unfriendly rendering techniques either. I hope things take a turn for the better in the future, but as of right now, things are exceedingly bleak. It also does not help that nVidia has no competition and only needs to do the bare minimum necessary to keep the population happy, and that most larger voices in tech and the majority of PC gamers don't care about SLI.
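
All of the bridge figures above fall straight out of the clock speeds, since bandwidth scales linearly with the bridge clock and the HB bridge simply uses both SLI fingers at once. A quick sketch of that arithmetic, using the ~1GB/s baseline for the standard 400MHz bridge as stated above:

# Bridge bandwidth scales with clock; the HB bridge doubles the number of links used.
BASE_CLOCK_MHZ = 400       # standard flexible SLI bridge
BASE_BANDWIDTH_GB_S = 1.0  # ~1 GB/s, per the figures above

def bridge_bandwidth(clock_mhz, links=1):
    """Scale the baseline bandwidth by clock ratio and by the number of SLI fingers used."""
    return BASE_BANDWIDTH_GB_S * (clock_mhz / BASE_CLOCK_MHZ) * links

print(f"Standard bridge (400MHz):   {bridge_bandwidth(400):.3f} GB/s")
print(f"LED bridge (650MHz):        {bridge_bandwidth(650):.3f} GB/s")          # 1.625 GB/s, +62.5%
print(f"HB bridge (650MHz, x2):     {bridge_bandwidth(650, links=2):.3f} GB/s")  # 3.25 GB/s
print("PCIe 3.0 x8 (XDMA-style):   7.88 GB/s")
print("PCIe 3.0 x16 (XDMA-style):  15.76 GB/s")

The output makes the gap obvious: even the doubled-up HB bridge delivers less than half of what an x8/x8 XDMA-style arrangement gets for free out of the PCI/e slots.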

 

 

 

I wish to add that, as far as performance is concerned, two GPUs will far outstrip what one GPU can do, unless you're comparing two entry-level midrange cards against a flagship enthusiast card (for example, two 960s versus a superclocked 980Ti). I DO like SLI, but SLI isn't for everyone, and with the recent terrible state of SLI support, which I see in constant decline, as well as Maxwell's and Pascal's anti-multi-GPU design, I can no longer recommend it to... well... anyone, really. If the current high-end GPU isn't enough performance for you, SLI is the way to go, sure. But I would take a single stronger GPU over SLI-ing two weaker GPUs as long as that single GPU is 25% or more better than one of the weaker GPUs that would be SLI'd (i.e. I'd take Card A over Card B SLI if Card A is 25%+ faster than a single Card B). With recent titles (the ones that actually need GPU grunt, unlike many older titles on newer cards), the number of cases where the single card simply does a lot better than the SLI setup is going to be very high, and there is no guarantee that SLI will even work properly with nVidia Profile Inspector bits forcing (Dead by Daylight, for example, is a popular new title that will not get positive SLI scaling without flickering characters no matter what I do). This is, I believe, more the developers' fault than nVidia's, but nVidia's readiness to discard SLI is also apparent. They know it doesn't work well and are showing no intent to fix it, as seen with GTX 1060s being incapable of SLI despite being stronger than GTX 970s in raw performance.
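
To put that 25% rule of thumb into something you can plug your own numbers into, here's a tiny sketch. The relative performance values, the average SLI scaling, and the fraction of your library that scales at all are made-up placeholders, not benchmarks; the point is purely the comparison logic:

# Compare one stronger card against two weaker cards in SLI, averaged across a library
# where only some games scale. All numbers are placeholders - substitute your own.
def sli_vs_single(single_card_perf, weaker_card_perf, avg_sli_scaling=0.5,
                  fraction_of_games_with_sli=0.5):
    """Return (single-card performance, library-averaged SLI performance)."""
    sli_perf_when_working = weaker_card_perf * (1 + avg_sli_scaling)
    sli_average = (fraction_of_games_with_sli * sli_perf_when_working
                   + (1 - fraction_of_games_with_sli) * weaker_card_perf)
    return single_card_perf, sli_average

single, sli = sli_vs_single(single_card_perf=125, weaker_card_perf=100)
print(f"Single stronger card: {single}, two weaker cards in SLI (library average): {sli}")

With those (assumed) numbers, a single card that's 25% faster already matches the SLI pair once you average in the games where SLI does nothing, which is exactly the reasoning behind the rule of thumb above.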

 

Further to the above bashing of the state of multi-GPU, here is a nice article's summary page on multi-GPU performance in titles from 2015 onward, to back up the statements I make in here, since I often get people telling me I'm deluded or spouting some other kind of nonsense when I make such claims.

 

NB: I add to the benefits and detriments lists whenever I remember something, discover something, or the technology changes, so as to keep the guide up to date. I wish I could speak more about Maxwell, but unless someone sends me a pair of Maxwell GPUs and heatsinks for my Clevo, I'm not going to be able to test much, unfortunately.

 

If you want the vRAM information or mobile i7 CPU information guides, they're in my sig!

 

Moderator note: If you believe any information found in this guide is incorrect, please message me or D2ultima and we will investigate it, thank you. - Godlygamer23

Not gonna lie though, SLI is kinda dead now.


Honestly, there are very few cards with an SLI connector that aren't either obsolete or costing you a fortune (2080 Ti, 3090).



I'm impressed that the RTX 3090 works well enough in SLI mode. I found this video:

 

 

PC #1 : Gigabyte Z170XP-SLI | i7-7700 | Cryorig C7 Cu | 32GB DDR4-2400 | LSI SAS 9211-8i | 240GB NVMe M.2 PCIe PNY CS2030 | SSD&HDDs 59.5TB total | Quantum LTO5 HH SAS drive | GC-Alpine Ridge | Corsair HX750i | Cooler Master Stacker STC-T01 | ASUS TUF Gaming VG27AQ 2560x1440 @ 60 Hz (plugged HDMI port, shared with PC #2) | Win10
PC #2 : Gigabyte MW70-3S0 | 2x E5-2689 v4 | 2x Intel BXSTS200C | 32GB DDR4-2400 ECC Reg | MSI RTX 3080 Ti Suprim X | 2x 1TB SSD SATA Samsung 870 EVO | Corsair AX1600i | Lian Li PC-A77 | ASUS TUF Gaming VG27AQ 2560x1440 @ 144 Hz (plugged DP port, shared with PC #1) | Win10
PC #3 : Mini PC Zotac 4K | Celeron N3150 | 8GB DDR3L 1600 | 250GB M.2 SATA WD Blue | Sound Blaster X-Fi Surround 5.1 Pro USB | Samsung Blu-ray writer USB | Genius SP-HF1800A | TV Panasonic TX-40DX600E UltraHD | Win10
PC #4 : ASUS P2B-F | PIII 500MHz | 512MB SDR 100 | Leadtek WinFast GeForce 256 SDR 32MB | 2x Guillemot Maxi Gamer 3D² 8MB in SLI | Creative Sound Blaster AWE64 ISA | 80GB HDD UATA | Fortron/Source FSP235-60GI | Zalman R1 | DELL E151FP 15" TFT 1024x768 | Win98SE

Laptop : Lenovo ThinkPad T460p | i7-6700HQ | 16GB DDR4 2133 | GeForce 940MX | 240GB SSD PNY CS900 | 14" IPS 1920x1080 | Win11

PC tablet : Fujitsu Point 1600 | PMMX 166MHz | 160MB EDO | 20GB HDD UATA | external floppy drive | 10.4" DSTN 800x600 touchscreen | AGFA SnapScan 1212u blue | Win98SE

Laptop collection #1 : IBM ThinkPad 340CSE | 486SLC2 66MHz | 12MB RAM | 360MB IDE | internal floppy drive | 10.4" DSTN 640x480 256 color | Win3.1 with MS-DOS 6.22

Laptop collection #2 : IBM ThinkPad 380E | PMMX 150MHz | 80MB EDO | NeoMagic MagicGraph128XD | 2.1GB IDE | internal floppy drive | internal CD-ROM drive | Intel PRO/100 Mobile PCMCIA | 12.1" FRSTN 800x600 16-bit color | Win98

Laptop collection #3 : Toshiba T2130CS | 486DX4 75MHz | 32MB EDO | 520MB IDE | internal floppy drive | 10.4" STN 640x480 256 color | Win3.1 with MS-DOS 6.22

And 6 others computers (Intel Compute Stick x5-Z8330, Giada Slim N10 WinXP, 2 Apple classic and 2 PC pocket WinCE)



The first post says you don't necessarily need a super powerful PSU to run SLI...


...I've got an EVGA SuperNova 850 T2 and I'm wondering if it'd be enough to run two Titan RTX cards?

I have one already, and I'm thinking I might pick up a second once supply chain difficulties are resolved and they become less expensive.

 

Edit: Actually, reading back over it, I guess the Titan RTX would be considered 'top end', so... would I be looking at needing a PSU upgrade?

"I try to put good out into the world...that way I can believe it's out there." --CKN                  “How people treat you is their karma; how you react is yours.” --Wayne Dyer            

[Needs Updating] My PC: i5-10600K @TBD / 32GB DDR4 @4000MHz / Z490 AORUS Elite AC / Titan RTX / Samsung 1TB 960 Evo / EVGA SuperNova 850 T2

