
The SLI information guide

D2ultima

Hi everyone. I originally wrote this guide over at the kbmod forums, but as it turns out, that forum is as dead as Aeris in FF7. This forum is more lively, so I figured it'd be good to copy my guide over for all to read. This is a real-world, layman's-terms assessment of what SLI does and how it works. I have never used CrossfireX and therefore cannot say that all of this will hold true for it. The original guide (no longer updated there) is over at http://kbmod.com/forums/viewtopic.php?f=22&t=6212

 

------------------------------------------------------------------------------------------------------------------------------------------------

 

Basically, this is a guide meant to explain the upsides and downsides of SLI. It's mainly geared toward people with midrange or last-generation cards who wonder whether they should get multiple cards or simply upgrade to a stronger one. It lists pretty much everything you will ever want to know about SLI in as much detail as I can manage. It WILL BE A LONG READ. Also note that I have never attempted to Crossfire any cards, so this guide is MAINLY directed toward SLI, and while most of the ups/downs will be common to both SLI and CrossfireX, THIS IS NOT A CROSSFIRE GUIDE. There are large differences and I am not in a position to explain CrossfireX in depth.

 

 

First, I will clear up some fairly common misconceptions about SLI. 

 


1 - Your video memory is not added or shared; it is copied. If you have two 1GB video cards, the data in GPU 1 is copied to GPU 2, so you only ever benefit from 1GB of video memory. If a game wants to use 2GB, it won't put 1GB in GPU 1 and 1GB in GPU 2. Because of this, pairing a 4GB card with a 2GB card does not work. Also, be wary of the fact that most multi-GPU cards are sold with the TOTAL vRAM listed as the selling point. Take the Titan Z: it does NOT have 12GB of available vRAM, it has 6GB on each GPU. You will only have 6GB of usable vRAM in games/renders/etc. The same goes for the R9 295X2 and its, uhh... "8GB" of vRAM.
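To make that concrete, here's a minimal sketch (illustrative only): under AFR multi-GPU, usable vRAM is the per-GPU amount, i.e. the minimum across the cards, never the sum.

```python
# Sketch: usable vRAM under AFR multi-GPU is per-card, not summed,
# because each GPU keeps its own full copy of the frame data.

def usable_vram_gb(per_card_vram_gb):
    """Effective vRAM for a multi-GPU setup: the smallest card's amount."""
    return min(per_card_vram_gb)

print(usable_vram_gb([6, 6]))  # Titan Z, marketed as "12GB" -> 6 usable
print(usable_vram_gb([4, 4]))  # R9 295X2, marketed as "8GB" -> 4 usable
```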

 

2 - Dual-GPU cards (like the GTX 590 and 690) ARE using SLI. Each GPU on the card is a weaker version (usually through downclocks) of that generation's flagship. They also often cost the same as OR MORE THAN two of the cards that comprise them; for example, 2 x 680 = ~$1000 at release, and 1 x 690 = ~$1000 at release. So the two single cards that comprise the multi-GPU card will always be the better purchase; the dual-GPU card's advantage is that it works on motherboards with only one PCI/e x16 slot. Since these cards are no longer made, however, you're far better off just buying a newer-generation single card in any situation.

 

3 - You usually do NOT need 800W+ PSUs for most dual-GPU SLI solutions. Most pairs of cards will run easily on a much smaller-wattage PSU, as long as it's good quality. Many dual-GPU users I've seen (not saying all of them do) tend to go for 850W or 1000W PSUs. You would not need these PSUs for just two cards unless you're a very serious overclocker with a top-end CPU (i7-4790K/6700K/7700K are NOT top-end) and two top-end GPUs... but then you wouldn't be using this recommendation anyway, would you?
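If you want to sanity-check PSU sizing for two cards yourself, a back-of-envelope budget like the one below works. Every wattage here is an illustrative assumption, not a vendor spec:

```python
# Rough PSU sizing for 2-way SLI (all numbers are illustrative assumptions).
cpu_w    = 90    # assumed: non-extreme quad-core i7 under load
gpu_w    = 180   # assumed: one reference GTX 1080-class card's TDP
gpus     = 2
rest_w   = 75    # assumed: board, RAM, drives, fans
headroom = 1.25  # ~25% margin for spikes and the efficiency sweet spot

load_w = cpu_w + gpus * gpu_w + rest_w
print(f"Estimated load: {load_w} W -> suggested PSU: ~{load_w * headroom:.0f} W")
# Estimated load: 525 W -> suggested PSU: ~656 W; a quality 650W unit covers it.
```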

 

4 - On Pascal GPUs, if one card is clocked differently than the other, SLI will NOT function asynchronously. If one card is overheating (Pascal responds heavily to temperature and throttles boost well before approaching its thermal limits) or hitting a TDP limit while the other card is fine, both cards are downclocked to match the slower one. For example, a reference 1080's maximum allowed TDP limit is 180W, whereas an ASUS Strix 1080's is 400W; run them together in mGPU and the reference card will throttle, fail to hold boost, and overheat far more than the Strix ever will (the reference 1080 may throttle to 1750MHz in high-FPS 1440p gaming while the Strix is capable of 2000MHz at the same point). This means serious performance is lost, and great care should be taken when picking cards for multi-GPU. I do not know how Maxwell responds to this; I never had it to test, and I've never run into someone willing to test things the way I might.

On Kepler cards (which you shouldn't be using anymore if you play games from late 2014 onward, for long reasons I will not get into right now; just take my word for it and upgrade when you next get the chance), if one card is clocked differently than the other, SLI WILL still function asynchronously, meaning it won't downclock one card to match the slower one. This was actually fairly new in the long life of SLI. I have also confirmed that mismatched memory clocks work in SLI just like mismatched core clocks do. Please note, however, that extreme differences in clock speeds will cause stuttery, jittery gameplay similar to microstutter, even at very high FPS counts (60+), due to bad frame pacing. Also, regardless of utilization, I have confirmed that differing clock speeds do not benefit the game much; it will simply "function". If you want to get a boost from an overclock, keep both cards as close to each other as possible.

EVGA Precision X's K-Boost functionality is useful for this... but don't run Precision X on a dGPU-only laptop unless you want to find a bricked screen shortly after, as it opens up a vulnerability where the nVidia driver is able to overwrite the LCD's EDID data (note: the driver ALWAYS tries to do this; Precision X simply allows it to succeed, as it is otherwise unable to). If you find yourself with an SLI laptop, stick to other tuning software such as MSI Afterburner or nVidia Inspector, and find yourself a modified video BIOS (Kepler/Maxwell only) so your cards actually do what they are supposed to. If you have Pascal mobile, well, make sure your cooling keeps the cards at similar speeds/temps.

 

5 - The percentage utilization on the cards does not always equal the exact bonus in raw performance (hereafter referred to as "scaling"). For example, 99% utilization on both cards (a bit rare) may only be a 90% boost in FPS. SLI should get between 85-95% boosts in FPS as per what the technology allows (game-dependent), though many games since 2014 scale much worse: 70% is considered "good" by many, and lower is common. Also note that the higher the resolution the game is rendered at, the lower the scaling in quite a few titles. If you want to play at 4K for some reason, expect as low as 26% scaling in Witcher 3 (with HairWorks on, anyway). The high-bandwidth/LED/doubled-up-flex bridge solutions can help in SOME titles here, but PCI/e 3.0 x16/x16 is far more important. This is explained fully in the "The bandwidth issue" section of my guide below.
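Since utilization numbers can mislead, measure scaling from FPS directly. A minimal sketch, using the 90% and 26% figures from this section:

```python
# Scaling is measured from FPS, not from GPU utilization percentages.
def scaling_pct(fps_single, fps_sli):
    """FPS boost from adding the second card, as a percentage."""
    return round((fps_sli / fps_single - 1) * 100, 1)

print(scaling_pct(60, 114))   # 90.0 -> 90% scaling, even at 99%/99% utilization
print(scaling_pct(60, 75.6))  # 26.0 -> the Witcher 3 @ 4K + HairWorks ballpark
```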

 

6 - If SLI support is slow to be added to a game, it is *NOT* nVidia's fault. It's the developer's duty to follow up and work with nVidia to fix multi-GPU support issues for their games before and after launch. If it takes 5 months for SLI to appear (like in Titanfall, which STILL has SLI issues), then you need to blame Respawn for it. The same applies to AMD and Crossfire. Also note that some game engines simply do not support SLI at all (officially), like Unreal Engine 4 and Unity 5. While you may have varying success forcing SLI on (depending on resolution, bridge type and PCI/e lane width, as well as whether or not it produces visual bugs), understand that games coded in these engines will almost never get SLI (or Crossfire) support.

 

7 - Your CPU and motherboard affect what you can SLI and how many cards you can SLI. Essentially, you need to count PCI/e lanes. Mainstream Intel CPUs (like the i7-4790K) have only 16 CPU lanes (different and separate from chipset lanes), so 2-way SLI (with each card running at x8 bus speed) is the most the CPU can support, unless the motherboard has a PLX chip to support more cards in SLI (which won't really be happening anymore due to the death of 3-way and 4-way SLI). SLI requires x8 bus speeds for a card to work (ironically, the PCI/e revision is irrelevant: PCI/e 3.0 x4/x4 is double the bandwidth of PCI/e 1.1 x8/x8, but the former doesn't work and the latter will); AMD cards, I believe, can run on x4 bus speeds. M.2 SSDs may or may not use up PCI/e lanes from the CPU; you will need to check your manual or contact the motherboard vendor to tell. Intel's Skylake chipsets have more chipset lanes to spend on PCI/e SSDs than previous chipsets, even enthusiast ones like X99.
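To sanity-check your own platform, the lane math reduces to simple division. A sketch (it deliberately ignores PLX chips and chipset lanes):

```python
# Lane-budget sketch for SLI on CPU lanes alone (chipset/PLX lanes excluded).
SLI_MIN_LANES_PER_CARD = 8  # SLI refuses below x8, regardless of PCI/e revision

def max_sli_cards(cpu_lanes):
    return cpu_lanes // SLI_MIN_LANES_PER_CARD

print(max_sli_cards(16))  # 2 -> x8/x8 on a mainstream i7-4790K-class CPU
print(max_sli_cards(40))  # 5 by lanes alone; SLI itself topped out at 4 cards,
                          # but x16/x16 two-way is possible on 40-lane chips
```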

 

8 - Microstutter is mostly a thing of the past. If your game isn't running at ~35fps or below, you should experience no microstutter without some external influence (like watching a high resolution flash video on a second monitor in Google Chrome). If you get a lot of microstutter in various games at high FPS counts, then the fault lies elsewhere on your system, or the developers have screwed up in the games you are testing (most likely this). I can confirm Mass Effect 3, CoD: Black Ops 2, Battlefield 4, Dark Souls 2 (SOTFS as well), CoD: Black Ops 1, Dark Souls 3, Dying Light, Skyrim, Killing Floor 1, Super/Ultra Street Fighter 4, and Elite Dangerous have no stutter in Windowed, Borderless Windowed or Fullscreen modes using SLI. Use those to test your system, and if there is a stutter issue in another game in say... borderless windowed mode (like Killing Floor 2), then blame the devs.

 

What can I SLI? (970 SLI issue information and potential fix)

 


Before we get into "what" you can SLI, make sure your motherboard supports SLI. SLI requires PCI/e x8/x8 configurations at minimum. It does not matter which revision of PCI/e is present, even though PCI/e 3.0 x4 is more bandwidth than PCI/e 1.1 x8. Your motherboard must be capable of running at minimum x8/x8 configurations of the same PCI/e revision (3.0 x16 + 2.0 x16 will fail) to use SLI. (A small sketch of the pairing rules appears after the lists below.)

 

You:

CAN use the same cards in non-performance-boosting mode (driving extra monitors) to do triple monitor gaming.

CAN use cards with different core clock speeds in SLI without one card being slowed to match the other.

CAN use cards with different memory clock speeds in SLI without one card being slowed to match the other.

CAN use cards from different manufacturers as long as their specs are the same (I.E. "MSI GTX 780 lightning" + "Gigabyte GTX 780 Windforce" will work).

CAN use cards with different forms of cooling in SLI (reference cooler + non-reference cooler) excepting possibly the GTX 970.

CAN use two dual-GPU cards (like the 590 or 690) in QUAD SLI in a board that only supports 2-way SLI.

CAN use four cards in 4-way SLI (NOT Quad SLI) in a board that specifically supports 4-way SLI (and with a 40-lane or higher CPU).

 

MAYBE can use mismatched cards to force higher levels of AA on DX9 and older OpenGL games. I cannot test it.

 

CANNOT use mismatched cards (GTX 770 + GTX 980 will fail).

CANNOT use cards with different cores even if the same name (GK104 GTX 660 (OEM) + GK106 GTX 660 will fail, or GM206 GTX 960 + GM204 GTX 960 (OEM) will fail).

CANNOT use cards with different vRAM sizes even if the name and core are the same (GTX 770 4GB + GTX 770 2GB will fail).

CANNOT use cards with different memory bus widths even if the name and core are the same (GTX 760 192-bit (OEM) + GTX 760 256-bit will fail).

CANNOT use cards that do not support SLI technology (most GT-series cards, and the GTX 750Ti do not support SLI).

CANNOT use 4-way SLI in a motherboard that does not specifically support 4-way SLI. Even if the CPU has 40+ PCI/e lanes, it does not mean the board can run 4-way SLI. PLEASE NOTE: a board CAN be capable of 4-way Crossfire and NOT 4-way SLI, but a board capable of 4-way SLI *must* be capable of 4-way Crossfire.

 

While not being "SLI" itself, these apply to multiple card usage:

 

CAN use mismatched cards if you are driving a second or third monitor from the extra card. This needs no SLI bridge.

CAN use mismatched cards if you are driving PhysX on the second card. This needs no SLI bridge.

 

MAYBE can use mismatched cards driving extra monitors for triple monitor gaming (I do not know, so I am leaving it as a maybe. ANYBODY who can test and let me know, please do.)
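To pull the lists above together, here is the promised compatibility-check sketch. The field names and example specs are mine, purely for illustration; it mirrors the guide's rules (same core, same vRAM, same bus width, both cards SLI-capable, x8 lanes minimum), not any actual driver logic. Clock speeds are deliberately not compared, since mismatched clocks are allowed:

```python
# Hypothetical pairing check mirroring the CAN/CANNOT lists above.
def can_sli(a, b, lanes_per_card=8):
    same_silicon = (a["core"] == b["core"]
                    and a["vram_gb"] == b["vram_gb"]
                    and a["bus_bits"] == b["bus_bits"])
    return (same_silicon and a["sli_capable"] and b["sli_capable"]
            and lanes_per_card >= 8)

msi_780   = {"core": "GK110", "vram_gb": 3, "bus_bits": 384, "sli_capable": True}
giga_780  = {"core": "GK110", "vram_gb": 3, "bus_bits": 384, "sli_capable": True}
gtx770_2g = {"core": "GK104", "vram_gb": 2, "bus_bits": 256, "sli_capable": True}
gtx770_4g = {"core": "GK104", "vram_gb": 4, "bus_bits": 256, "sli_capable": True}

print(can_sli(msi_780, giga_780))     # True  - different vendors, same specs
print(can_sli(gtx770_2g, gtx770_4g))  # False - same card, different vRAM
```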

 

More 970 issues (why do people still buy this card?):

Apparently some users have been having issues turning on SLI with two GTX 970s. While I am unaware of the cause, it was pointed out that a program called DifferentSLIAuto can fix this. The program also has a possibility of allowing two GPUs with different vRAM to enable SLI (but only the vRAM of the lower card will work). I AM NOT ADVOCATING THE PURCHASE OF TWO GPUS WITH DIFFERING VIDEO RAM SIZES AND EXPECTING THIS PROGRAM TO WORK. I am simply stating its existence, and that your mileage may vary.

 

^ Further to the above, apparently manufacturers have been using different bus IDs (more info needed) for various kinds of 970 cards, and those often will not work together in SLI. Here is EVGA's compatibility table for reference; I don't have any other manufacturers' tables to show, though other manufacturers suffer from the same issues. Anyone who wishes to PM me these tables, or add them as replies, feel free to do so and I will update the guide as necessary.

 

Now that that's done, let's get into the benefits of SLI. There are some benefits I'll list that most people don't actually know about.

 


1 - Obviously, better game performance. Most games have SLI profiles and will perform better with more than one card. This is almost a given.

 

2 - Your memory access bandwidth almost doubles, even though memory size and per-card write speed are not added. Two cards with 256-bit memory buses will act almost like a single 512-bit bus for the game. This means that two cards with a 256-bit memory bus (like 670s or 770s) will be more beneficial in memory-bound games than even a single 780Ti (especially the 4GB versions, so you aren't overshadowed by the extra 1GB the 780Ti has over the 2GB versions of those cards).
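A quick sketch of the aggregate-bandwidth point; the 7 Gbps effective memory speed is an assumed GTX 770-class figure, used only for illustration:

```python
# Per-card memory bandwidth, and the near-doubled aggregate that SLI gives.
def bandwidth_gbs(bus_bits, effective_gbps):
    return bus_bits / 8 * effective_gbps  # bits -> bytes, result in GB/s

single = bandwidth_gbs(256, 7.0)  # 224.0 GB/s for one 256-bit card
print(single, single * 2)         # aggregate ~448 GB/s across two cards
# Capacity does NOT double (see misconception #1); only aggregate access does.
```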

 

3 - Less per-card heat. It is extremely unlikely to push 100% on each card in multi-GPU, even if both cards read near 100% utilization in monitoring software. Your memory controller will also have less load. This means each card uses less power and thus runs a bit cooler than forcing the load onto one card. Do note that this assumes cooling is well set up and one card is not hindering the cooling of another; if your cards are cramped in your case, expect the top card to run quite a bit hotter. Also, know that temps are *NOT* primarily related to card usage, but more to HOW the games use the card and how high your FPS is. For example: Skyrim (RCRN, max volumetric lighting, 2K-res boulder/etc. texture mods, pure water mods, etc.) lets my GPUs run cooler than Fallout: New Vegas (no mods), despite the higher GPU load in Skyrim, believe it or not.

 

4 - 3D utilization. If you happen to have one of those games that are good in 3D and you have nVidia's stereoscopic 3D glasses, there's a(n albeit rare) benefit for you with SLI! Basically, 3D cuts your framerate in half: you WILL have to render the game twice to see in 3D, which means half the FPS. But let's say your game uses 70% of each card, refuses to use any more, and you get good framerates. If you turn on 3D, you would likely ramp usage up to over 90%, and you'll end up not halving your FPS, instead only losing maybe 20% or so. Do note that this DOES NOT translate into benefits for virtual reality.
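A rough worked example of that headroom math, assuming 3D doubles the render work and the cards can ramp from 70% toward full utilization (illustrative numbers only; real losses land somewhere around the 20-30% mark):

```python
# Why spare SLI headroom softens the FPS cost of stereo 3D.
util = 0.70                           # cards only 70% busy at current FPS
headroom_factor = 1.0 / util          # ~1.43x more frames possible if maxed out
fps_factor_3d = min(1.0, headroom_factor / 2)  # 3D needs ~2x work per frame

print(f"3D keeps ~{fps_factor_3d:.0%} of your FPS")  # ~71%, not the naive 50%
```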

 

4b - If you have a game that is good in 3D and does not support SLI, does not benefit from SLI (not the same as "does not support"), OR is simply best run with a single video card (which can be forced in nVidia Control Panel or nVidia Profile Inspector), you can run it off one video card and turn on 3D, and your second card will simply render the second set of frames perfectly, giving absolutely NO drop in FPS. And I mean NONE. When playing Paranautical Activity (single-GPU only) I tried it in 3D and found that my 250+ fps remained consistent instead of being halved. THIS ALSO WORKS WITH GAMES WHICH HAVE ISSUES WITH MULTIPLE CARDS. The issues will not present themselves. Again, this does not apply to VR.

 

5 - SLI can actually be forced in some games that don't officially support it... such as Skyrim's RCRN or ENB modded states. You can try forcing Alternate Frame Rendering (AFR) 1 or 2 via nVidia Control Panel (which, in the case of Skyrim, enables SLI for RCRN/ENB using AFR2), or you can use nVidia Profile Inspector to force specific profiles or play with SLI bits to get things working as optimally as possible (for example, the 0x080040F5 "Daylight, Evolve, Monster Hunter Online benchmark, etc" profile grants me positive scaling and no bugs in ARK: Survival Evolved, an Unreal Engine 4 title, as long as I don't turn on in-game AA). Then you can still enjoy the benefits of SLI even in many "unsupported" titles while keeping your lovely framerates. Here is a (rather short; I know there are many more elsewhere) list of games and compatibility bits that improve them; it's a good start, so good luck! http://www.forum-3dcenter.org/vbulletin/showthread.php?t=509912

 

6 - Using SLI and vSync forces games by default to use "quad buffering", which almost always skips the issue with non-triple-buffered vSync where, if your FPS drops below your monitor's refresh rate, it instantly drops a large amount (i.e. hitting 55FPS when vSynced to 60 would drop you to 30fps until you could render 60 again, and going below 30 would drop you to 20fps, then 15fps, etc.). This almost entirely negates the need for adaptive vSync in games that don't offer triple-buffered vSync in their options menus, as long as SLI is supported for the game. (Some rare titles, like Black Ops 2 Zombies, will still drop you to specific FPS steps; there it's better to use RivaTuner's framerate limiter to cap the game at your desired framerate instead of vSync... which, unlike vSync, also works 100% of the time in windowed modes. You could also use NVPI's "frame rate limiter" option, which does not require anything hooking into the game, but it adds more input lag than RTSS does, though not as much as vSync will.)
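The FPS staircase described above falls out of simple division: double-buffered vSync can only present at whole-number divisors of the refresh rate. A tiny sketch:

```python
# Double-buffered vSync locks FPS to refresh_rate / n for whole-number n,
# which is where the 60 -> 30 -> 20 -> 15 staircase comes from.
refresh = 60
print([refresh / n for n in range(1, 5)])  # [60.0, 30.0, 20.0, 15.0]
# Quad buffering (the default with SLI + vSync) mostly avoids these hard
# steps, behaving much like triple-buffered vSync.
```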

 

7 - PhysX delegation (this doesn't actually require SLI, as mismatched cards can be used, but it can be done with SLI setups). Some games use PhysX really badly. Either way, if you force PhysX onto a dedicated card, your other card(s) can run the game easily. This is ESPECIALLY useful with triple cards, as you can force one card to run PhysX and the other two to run the game. So don't throw away that old 780 before you grab a 980Ti or something! It can still be useful! N.B. Don't use extremely old cards with extremely new cards, though. If you pair a 560Ti with a 780, it might actually run worse, because the 560Ti isn't able to keep up with the 780 (though it can with a 680). But if your old card is something like a 770, then a new 980 with the 770 for PhysX would probably work quite well. There's no real formula for this; you kind of have to estimate what'd be good or not. And do note that while the number of games using PhysX is very small, it can indeed help. Sometimes this can give weird behaviour: Killing Floor 2 found it best to leave both cards enabled and just turn on PhysX, otherwise weird stutter and inexplicably dropped framerates were noticed, with the second card pegged at 99% for long periods with nothing PhysX-related on-screen (Pascal; works perfectly on Kepler).

 

And now here come the downsides!

 


1 - Not all games use SLI. This means that sometimes your game will run on one video card, and forcing SLI on may decrease performance or introduce graphical glitches. For example: Titanfall. SLI is a detriment to that game because it causes graphical glitches. Dark Souls 3 is another title; SLI scaling is negative in that game, meaning one card will almost certainly give you higher FPS than two. Deus Ex: Human Revolution Director's Cut also went a few months without an SLI profile before recently getting one. This is extremely common these days too... developers are coding graphics using AFR-unfriendly rendering techniques for no real reason whatsoever, and many simply aren't bothering to get multi-GPU working.

 

2 - Some games do not benefit from SLI at all. This is different from not supporting SLI. For example: Arma 2 & the DayZ mod, and CS:GO. SLI or not, my FPS is usually the same. Because of this, I simply force such games to a single card, partially to save energy and partially to avoid any detriments SLI being on may bring (such as stutter in Arma 2 when you're at 25fps in Cherno, because the engine is about as optimized as a car with square wheels). This can actually become a bit of a benefit if you wish to use 3D (I've played Arma 2 in 3D before and it worked pretty okay, honestly), as 3D will use your second card for zero FPS hit; but it's mainly a detriment if you don't have 3D, don't wish to run the game in 3D, the game doesn't work well in 3D, or you don't wish to use your second game-oriented card for something else. It also reduces the usefulness of your dual GPUs if the games you mainly play don't use SLI. If you're a Dark Souls 1, Dark Souls 3, Binding of Isaac & BOI Rebirth/Afterbirth, Wolfenstein: The New Order, Dishonored, DayZ, FFX/X-2 HD, Terraria, Street Fighter V and Mortal Kombat XL player? SLI will not benefit you in ANY form.

 

3 - Games sometimes will not push SLI utilization past a certain point. For example: I couldn't get Metal Gear Solid V: Ground Zeroes to use more than 70% or so of my cards. Now, while 70% x 2 = 140% of one card vs 99% usage on a single card still gives the benefit to SLI, a single stronger card will outstrip your two cards easily if it also sits at 95-99%. For example, at 70% utilization on both cards of two 980s, a single superclocked 980Ti or any 1080 will give you better performance. This utilization varies per game AND PER IN-GAME FPS. Some games simply stop loading the video cards well past a certain FPS, so you may easily get well over 60fps before your cards start to drop utilization (THIS IS RARE). If you're aiming for a consistent 120fps or something? A single stronger card will do better.
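A back-of-envelope for that scenario; the 1.45x relative-performance figure for a superclocked 980Ti over a 980 is purely illustrative, not a benchmark result:

```python
# Two utilization-capped cards vs one stronger card running nearly flat out.
base       = 1.0                 # one GTX 980 at full tilt
sli_capped = 2 * base * 0.70     # two 980s a game refuses to push past 70%
single_big = 1.45 * base * 0.99  # assumed: superclocked 980Ti, 99% util

print(f"{sli_capped:.2f} vs {single_big:.2f}")  # 1.40 vs 1.44 -> single card wins
```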

 

4 - SLI can break. It's rare, but it happens, and it has happened to me across multiple OS installs and OSes (Win 7, Win 8, Win 8.1, and multiple driver versions). It'll ramp utilization on one of your cards straight up to 95%+ (usually the secondary card, but it can happen on the primary), the other will sit at about 40-60% or even less, and your FPS will drop (though smoothness will NOT be affected very much). The only way to fix it is a computer restart or (I assume, but have never tried) disabling/re-enabling your cards. It doesn't usually SEVERELY drop performance; it might drop you from 90fps to 70, which is still playable. In other games, though, it could be a far worse issue, depending on how much you need your second card to get playable FPS. I don't know what causes it, and I've had it happen a few times; it is annoying to have to fix.

 

5 - SLI drains more power. This should be obvious.

 

6 - SLI only benefits you through raw power increases. If a game requires more vRAM? Multiple GPUs aren't gonna do as much for it as you would think. For example: 2 x 770 2GB cards playing Watch Dogs, whose ultra spec uses 3GB of vRAM that they cannot provide.

 

7 - SLI causes problems for streamers. Though it's obsolete now, OBS Classic only grabs half the framerate with game capture if a game is using two cards. It only takes frames from one card, so if you set 60fps and use game capture with SLI, you'll be visually outputting 30fps even though OBS reports a full 60fps being displayed. To combat this you need windowed mode for the game and window capture (which then causes the game to get bouts of slowdown for a couple of seconds in OBS Classic), or a capture card, or something like Dxtory or Playclaw 4 and their virtual webcam recording feature, though people have their problems with those programs in their own right and may not wish to use them. Window capture also means more of a performance hit in OBS Classic, and most importantly, some games dislike windowed mode *cough* Skyrim *cough*. So if you are a PC streamer who likes to use OBS Classic and doesn't have a streaming PC, take this into EXTREME consideration. OBS Studio lacks this issue, but may take some elbow grease to get working. I expand on this under the "My thoughts" section lower in the guide.
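To illustrate why game capture halved the output, here's a toy sketch of AFR: frames alternate between the GPUs, and a capture hook that only sees one GPU only ever grabs every other frame (a simplification of the OBS Classic behaviour described above):

```python
# AFR alternates frames between GPUs; a hook on one GPU sees half of them.
frames = list(range(60))                         # one second of 60fps gameplay
gpu0_frames = [f for f in frames if f % 2 == 0]  # frames rendered by GPU 0
print(len(frames), len(gpu0_frames))             # 60 rendered, 30 captured -> "30fps"
```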

 

8 - SLI with three monitors FORCES them into one huge widescreen. If you fullscreen a game at, say, 1080p with 3 monitors and SLI enabled, your other two screens will go black. This makes the extra screens difficult to use for multitasking. You actually need FOUR screens to get a "second monitor" effect once SLI is enabled. There MAY be software out there that removes the widescreen effect, but I believe it has issues otherwise.

 

9 - Turning SLI on/off requires a restart of the PC on laptops (I mistakenly thought desktops had this issue; sorry about that). It's annoying, especially as SLI turns off after each driver update. It's more of an annoyance because nVidia's drivers have, for quite some time now, NOT required a restart after installation. There IS a Windows registry hack you can use to remove it (which I used to use), but like all registry hacks, one should be careful. You can also use desktop drivers by modifying the driver .inf prior to installation (what I now do; screw mobile drivers), but this is a pain in Windows 8 and 10, as installing a driver with a modified .inf involves rebooting with driver signature enforcement disabled (you can't even do this on Windows 10 from Anniversary Update 1607 onward if Secure Boot is enabled), and editing the registry after driver installation (if not using a modded .inf) still requires at least one restart... so either way you're going to have to restart once. If you don't usually turn SLI on/off, you may not want to bother hunting down this fix. Do note that nVidia could change this to match desktop functionality whenever they want, and simply don't care enough about laptop users to do it; I had confirmation from one of their reps literally years ago that they know this issue exists.

 

10 - SLI performs worse if your game is in Windowed or Borderless Windowed/Fullscreen Windowed mode. It used to hold high utilization and such, but with newer drivers/cards (and newer games) that has dropped, and utilization usually varies wildly; Overwatch, for example, grants about 80% primary GPU utilization and 92% slave GPU utilization. The ACTUAL performance hit used to be somewhere around 5% from what I once saw, but now it's 10-15% or even worse: if you were getting 150fps windowed, you might get 210-220fps fullscreen. It's now enough to write home about, and if you like to play in windowed/borderless windowed modes yet find yourself performance-squeezing in a particular game? You may want to just fullscreen it and block out the alt-tabbing distractions for a while.

 

11 - SMAA and TAA are amazing forms of anti-aliasing. They're the most efficient forms of AA currently out: barely any performance hit while still getting rid of jaggies in full-scene images, without the excessive blur of FXAA and without the performance hit of MSAA. SMAA 1x is essentially a completely free form of AA, and SMAA T2x costs about 3fps (from 60fps in Crysis 3), while TAA is something like a better SMAA T2x. The problem is that SMAA's "T" (temporal) formats are unusable in multi-GPU configurations. THIS IS IMPORTANT. If you see SMAA T1x or T2x in a game while using SLI and you attempt to enable it, it will force SMAA M2GPU on you if you're unlucky, and simply disable the temporal filtering if you're lucky. SMAA M2GPU is a terrible form of AA which combines SMAA 1x and MSAA 2x, for a greater performance hit than MSAA 2x alone. If a developer offers SMAA 2x (without the temporal filter the T stands for), that should be fine, but never use M2GPU or enable T2x with SLI on; the visuals are not worth the performance hit, and you're far better off using 1x or MSAA 2x instead (note: the only game I've ever seen offer SMAA 2x as a separate choice from T2x is Evolve, and Evolve Stage 2 no longer has that option).

TAA, like TXAA, manages to apply temporal filters (which remove shimmering and crawling lines, usually in the distance) in multi-GPU setups, even though a temporal filter should need all the frame data, and with alternate frame rendering it shouldn't be able to access the frames the other GPU is processing. Unlike TXAA, however, TAA has a massive performance difference between single and multi-GPU, enough to make people consider it not worth the hit in SLI. It also only exists in a few SLI-capable titles right now, so this may change in the future, but it was developed by the Epic devs for their SLI-less Unreal Engine 4, so don't expect miracles. Also, beware that turning on TAA in games that do not support SLI, but have SLI forced on, may cause RIDICULOUS CPU usage and slow framerates to a crawl on Kepler cards.

 

12 - MFAA does not currently work in SLI.

 

13 - As with the above downside, Maxwell's voltage fluctuations also cause stability and overclocking problems in SLI configurations. If you're unlucky, they can happen at stock clocks too. I'll link this thread my friend n=1 wrote over on NotebookReview, as he has put it in much more detail than I can (since I've never owned Maxwell cards): http://forum.notebookreview.com/threads/on-the-subject-of-maxwell-gpus-efficiency.773102/ What seems to happen is that the cards' voltages don't always match each other, and as a result their clock speeds may not always match, due to the nature of "boost". nVidia started adding a disclaimer to their driver patch notes that differing voltages in SLI is an "intended feature" and that changing them gains the user nothing; yet fixing them greatly improves stability (and increases power consumption, making their "low power" cards no longer "low power"... so they will never fix the broken design). If you're not a user who is keen on doing the kind of "fix" outlined in the above thread, but you experience problems in SLI (driver crashes, random stutters, etc.), then you may very well have no choice. If this sounds like a scary/stupid bug, that's because it is. It shouldn't exist. But it does, and while the number of people who experience it may be low (and the others don't actually care), it IS a bug that more than one person has noticed, so it remains on this list. Note that Pascal cards do not appear to exhibit this behaviour, as their voltage curves are much tighter.

 

14 - SLI uses extra CPU power due to driver overhead. It's more apparent in recent games, but I've seen it to be true. It might not matter a whole lot to many people, but it is a downside of SLI, and thus into the guide it goes.

 

15 - Some games lately seem to have a great disdain for borderless and windowed modes with SLI. The most blatant offender is Killing Floor 2, but there are others. Running in borderless or windowed mode causes stutters and makes the game feel choppy, even at high FPS. PLEASE NOTE: THIS IS A DEVELOPER ISSUE, AS IT IS NOT PRESENT IN ALL GAMES. But because I've lately seen a couple of titles with the issue, I feel the need to add it to this guide. If you like borderless windowed gaming, titles from early 2014 onward may not be your friend with SLI.

 

16 - Bonus round (correct me if I'm wrong): CrossFireX only works on fullscreen games. So games that dislike alt-tabbing, forcing you to run in windowed mode, will NOT benefit from Crossfire, which renders the mode entirely useless if you need to run a lot of your games windowed for alt-tabbing/streaming/etc. purposes. Actually, multi-GPU has gotten so bad that I have to disregard CrossfireX entirely. There is no point in buying multiples of any AMD card for the purpose of gaming.

 

Resolved and/or no-longer-applicable downsides to SLI (if you have SLI, read this section to see whether any of these fixes apply to you).

 


THE PROBLEM: Google Chrome dislikes SLI. It will flash your pages white or black, and you will need to move the tab into a different window, close/re-open Chrome, or maximize/restore the window whenever it happens. This can be mitigated by going into Chrome's flags and disabling "hardware accelerated video encoding". That puts more strain on the CPU for videos and livestreams, however, but it's the only way to use Chrome 100% bug-free while the bug is in effect.

THE FIX: This no longer happens as of the WHQL 340.52 drivers. If for some reason you need to use older drivers for benchmarking or specific-game performance purposes, however, this will still be a problem for you, and thus I have left the old fix and the problem description intact.

 

THE PROBLEM: Asynchronous SLI scaling can cause a lack of smoothness if the difference in clock speeds between cards is very high, even at higher framerates (60+). I've been unable to record it in any way, mainly because recording clocks the cards up. Now, you might be thinking, "Well, my cards are going to be close together in speed, this won't affect me!" That's where you're wrong! Kepler cards (and possibly Maxwell, which shares similar boosting traits with Kepler) have a "feature" where, if your card has triggered 3D clocks but is not actually being used to its fullest potential (say, <40% utilization on both cards for 60fps), it may begin to downclock to save power, as the game isn't drawing enough to keep it at its base 3D clock. It may not hit 2D clocks, but it may indeed throttle, and your game will suddenly become very annoying to play. The problem here is that the slave card won't downclock; only the primary will. Older Fermi cards should not have this problem.

THE FIX: Open nVidia Control Panel's 3D application settings, find the game, select the "Power management" option, and set it to "Prefer maximum performance" (a culprit of mine here was Dark Souls 2). I do NOT know if this feature is purely available to laptops, and if it isn't, whether ALL desktop Kepler/Maxwell GPUs have it. I suggest not setting this in global settings, as your card will then never downclock itself. If this fix is unavailable to you, the second, less-reliable fix is to simply bump resolution and graphics settings. The higher you can go, the better. If you have a way to force supersampling or higher AA values or anything? Go for it; the more power the game draws, the better for you. If you have a Kepler DESKTOP card, DSR should be available to you as of the 344.48 drivers and later, so supersampling ought not to be a problem. Kepler and Maxwell notebook cards now have DSR too, but since nVidia doesn't care enough about notebook users to update their information online, I can't find which driver introduced it (their official spec page claims we don't have it yet). I know that 347.88 and later drivers have it, but not which driver started it. N.B. Maxwell and Pascal cards will clock up when doing this, but only to base clocks.

 

THE PROBLEM: Gsync + SLI + DSR does not work.

THE FIX: Some of the newer drivers from nVidia have fixed this. I'd assume the r364 branch, though those aren't drivers I'd suggest to anyone. If 362.00 doesn't allow it (I can't test; I don't own G-Sync, sorry), then wait for a later, more stable branch. It'll be fine.

 

 

My thoughts and suggestions section.

 


1 - SLI support is getting progressively worse. Most new, demanding games that have released absolutely detest SLI. I keep seeing more and more of this, and I'm at the point where I believe SLI should not even be ATTEMPTED without owning the single strongest (or second strongest, if there's a small performance difference, like 980Ti vs Titan X) GPU on the market. Many new and popular engines are designed NOT to support AFR SLI or Crossfire, like Unreal Engine 4 and Unity (no idea why anyone would use Unity for 3D/graphics-heavy titles, though). Right now, SLI is more of an "added benefit" than an "expected power boost" in new, demanding games.

Maybe if developers actually bother coding optimizations directly into DX12 and Vulkan games, and we switch from AFR to SFR (split-frame rendering) and/or the system becomes able to use multiple GPUs as one big GPU, we won't need to worry anymore. But that is a long way away, the performance bonuses will likely be smaller than AFR's, and since we'd need devs to actually put time, money and effort into coding for DX12 or Vulkan, I highly doubt it. All DX12/Vulkan titles currently use driver-side optimizations and simply utilize the API for less CPU load anyway. Even if devs did do it, almost every title that exists TODAY will not get DX12/Vulkan API upgrades (let's be honest), DX12 would require us all to be on Windows 10 anyway (an extremely anti-consumer OS I recommend nobody be on), and most importantly: NO MAXWELL OR PRIOR GENERATION nVidia card is built to support SFR for most games. We have a new type of bridge with decent bandwidth improvements to deal with the vRAM buffer data issue, but only Pascal cards will officially use it properly.

So... you want SLI? MAKE SURE THE CARD YOU ALREADY OWN IS AT LEAST WITHIN 20% OF THE STRONGEST GPU ON THE MARKET. That way, if a game screws you over for SLI support, you can still play it fairly well. I swear, if you SLI two GTX 960s, 970s, 980s or 1070s after reading this guide, I will {REDACTED}.

 

2 - If you're a heavy streamer without a streaming PC but you still want your stream to look good, SLI may not be for you. OBS Multiplatform now works well enough with SLI that I can say it's much less of a headache, but some games still have issues. For example: though a single-GPU game, The Binding of Isaac Rebirth/Afterbirth required me to use game capture with the multi-adapter option, or the game would literally run at 10fps. But with multi-adapter capture, it would run at about 80fps for some reason, actually forcing the game beyond its intended speed, about 33% faster. So I had to stream at 60fps and use the "limit capture framerate" option to get it to work properly. For a game that doesn't even use the second GPU. Please don't misunderstand: this does not mean ALL games have these issues, or that SLI never works correctly. It is an example intended to show that simply having SLI turned on for streaming requires some problem-solving attitude. I did not ASK anybody how to fix Binding of Isaac... I figured it out in exactly 3 minutes of trial and error. But this is the difference between my mindset and the mindset of the person who tosses $3000+ at iBuyPower or OriginPC to just "get a beast PC to stream on". If problems occur, you need to be someone who is comfortable figuring out what's wrong. And if you want or need to use old OBS for some reason, the limitations from before apply: some games simply work better fullscreen, alt-tabbing can be problematic for them, and of course you cannot window-capture a fullscreen window. So either invest in a capture card/streaming PC/etc., or make it work with Dxtory, or simply be prepared to force a lot of games to single-card usage, which further reinforces the need for a strong single card before SLI even happens.

 

3 - Building off #2 but going a bit further: if you're NOT a consumer who is willing to learn and try things, and you just want a machine that runs well and plays your games, but your current midrange card isn't giving you that constant 60fps in BF4 on ultra like you want, the best thing for you is to upgrade your card. It won't benefit you to acquire potential issues you won't really know a thing about. With a single card, you just update ya drivers, blow some canned air at it every month, and keep it cool. Dual cards are a different story: you've gotta make sure they're enabled and working well, force them on/off depending on whether games behave with them, etc. Basically, it's a headache you don't want. To someone like me it's no headache at all, really, but I like this stuff. You likely don't like it as much as I do.

 

4 - If 3D is your thing, extra cards help. There's no contest. SLI usually does not scale 100% across both cards, so you usually have headroom and won't instantly lose half your FPS. Just remember that not everything works well in 3D. Even with BF3 being 3D Vision Ready, BF4 ended up "not recommended". And BELIEVE ME, IT IS NOT RECOMMENDED. DON'T EVER PUT BF4 IN 3D. TRUST D2. Unfortunately, 3D is basically dead now, and I wouldn't tell anybody who doesn't already have it, or who isn't willing to play mostly older titles with it, to get it. And note that VR-ready titles are not 3D-ready (or even 3D-happy) titles... Elite: Dangerous is one such example.

 

 

The bandwidth issue

 

 

 


So, I keep stating that there is a bandwidth issue with regard to SLI and current-generation cards. The issue is that a lot of tech requires far more bandwidth between cards than is currently available to work properly. Cards transfer data over the PCI/e bus and the bridge (if one is present; bridgeless SLI solutions exist, though they're rare). The standard SLI bridge is clocked at 400MHz and grants approximately 1GB/s of bandwidth between GPUs, on top of the bandwidth from the PCI/e connection. This is not enough for the amount of data in each frame (frame buffer data), or the relevant memory between cards, to be transferred between frames for new tech, or for technologies such as temporal anti-aliasing in general (though we've seen TXAA and, in some cases, albeit at a large performance hit, TAA function in SLI).

It has gotten to the point where forcing SLI in certain recent unsupported games/engines, or at certain resolutions (usually, but not always, above 1080p), literally requires PCI/e 3.0 x16/x16 for the cards to even grant positive scaling. Yes, that means negative scaling (less FPS than a single GPU) happens otherwise. And when it doesn't, functional scaling is often low (30% or less, making it a useless money sink). This means your CPU must be a $500+ USD mid-tier Intel enthusiast CPU of any generation (4930K, 5930K, etc.); AMD CPUs which only have PCI/e 2.0 lanes (Bulldozer, Piledriver) cannot even be considered; and as for Ryzen, let me get something out of the way right now: it is AMD's version of an INTEL MAINSTREAM-CLASS PROCESSOR LINE. There are only 20 CPU PCI/e lanes (down from Piledriver's 38, though those arrived when PCI/e 2.0 was the best available), there is no quad-channel memory support, etc. In light of this, one should not consider it for multi-GPU usage either. As for some proof of the bad scaling? Here you go:

Proof #1

Proof #2

Proof #3 (x8/x8), and PLX Mobo (x16/x16).

Proof #4

Proof #5

Proof #6 (x16/x16 vs x8/x8 and even x16/x16 vs x16/x8, from Kepler through Pascal)


AMD offered a solution, one that I consider the best possible option: they make use of the PCI/e bus bandwidth for card-to-card communication by configuring the memory a certain way, which they call XDMA (Crossfire Direct Memory Access). This introduces a latency issue, but one that even AMD has found ways to minimize/eliminate, and I am certain nVidia could do the same if they cared (they apparently do not). In a PCI/e 3.0 x8/x8 situation (what a user is most likely to have on a mainstream Intel CPU, the most popular PC configuration for gamers), this gives 7.88GB/s of bandwidth, far more than the ~900MB/s the Crossfire bridge granted previously, and far more than the 1GB/s the SLI bridge grants. It requires the memory on the card to be designed a certain way, however, so it cannot be retroactively added as a "technology" to any GPU. As an added benefit, XDMA also helps fix frame pacing in multi-GPU, making it smoother to use.

nVidia also made a solution to improve interconnectivity between cards: NVLink. However, because NVLink replaces PCI/e, it is only sold on enterprise boards intended for Tesla cards. Supercomputers, to simplify. You are simply not getting a pair of gaming cards onto one. Since it is a replacement for PCI/e, apparently requires proprietary card connections, and inevitably costs more money, it would never work for mass-produced consumer boards. I believe it requires XDMA-style memory configurations too. Either way, it can't be sold to people like me.

So, recognizing the need for more bandwidth, but not being able to sell us XDMA-style cards for more $$, they decided to simply improve the bridge. There was already an overclocked LED bridge running at 650MHz, up from the standard 400MHz, granting 62.5% more bandwidth for 1.625GB/s. nVidia simply doubled up the connectors to use both SLI fingers, and presto: 3.25GB/s of bandwidth between cards! This is a huge step up from 1GB/s, but nowhere near the 7.88GB/s to 15.76GB/s present in XDMA-style x8/x8 or x16/x16 configurations respectively. It also effectively killed three-way and four-way SLI. I thoroughly dislike this approach, and it further reinforces my belief that SLI is basically being discarded by devs and nVidia alike (especially considering the new information about cards throttling each other that I introduced at the top). It is nothing but a band-aid on the bandwidth issue, and developers have little to no reason to keep coding AFR-unfriendly rendering techniques either. I hope things take a turn for the better in the future, but as of right now, things are exceedingly bleak. It also does not help that nVidia has no competition and need only do the bare minimum necessary to keep the population loving them, and that most larger voices in tech, and the majority of PC gamers, don't care about SLI.
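For quick reference, here's a minimal sketch collecting the inter-GPU link bandwidths quoted in this section, side by side:

```python
# The inter-GPU bandwidth figures from this section (GB/s), smallest first.
links = {
    "standard SLI bridge (400MHz)":        1.0,
    "LED/overclocked SLI bridge (650MHz)": 1.625,  # 62.5% over standard
    "HB bridge (650MHz, both fingers)":    3.25,
    "XDMA over PCI/e 3.0 x8/x8":           7.88,
    "XDMA over PCI/e 3.0 x16/x16":         15.76,
}
for name, gbs in sorted(links.items(), key=lambda kv: kv[1]):
    print(f"{gbs:6.2f} GB/s  {name}")
```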

 

 

 

I wish to add that, as far as raw performance is concerned, two GPUs will far outstrip what one GPU can do, unless you're comparing two entry-level midrange cards against a flagship enthusiast card (for example, two 960s versus a superclocked 980Ti). I DO like SLI, but SLI isn't for everyone, and with the recent terrible state of SLI support, which I see in constant decline, as well as Maxwell's and Pascal's anti-multi-GPU design, I can no longer recommend it to... well... anyone, really. If the current high-end GPU isn't enough performance for you, SLI is the way to go, sure. But I would take a single stronger GPU over SLI-ing two weaker GPUs as long as that single GPU is 25% or more faster than one of the weaker GPUs (i.e. I'd take Card A over Card B SLI if Card A is 25%+ faster than a single Card B). With recent titles (the ones that actually need GPU grunt, unlike many older titles on newer cards), the number of cases where the single card simply does a lot better than the SLI setup is going to be very high, and there is no guarantee SLI will even work properly with nVidia Profile Inspector bit-forcing (Dead by Daylight, for example, is a popular new title that will not get positive SLI scaling without flickering characters, no matter what I do). This is, I believe, more the developers' fault than nVidia's, but nVidia's readiness to discard SLI is also apparent. They know it doesn't work well and are showing no intent to fix it, as seen with the GTX 1060 being incapable of SLI despite being stronger than the GTX 970 in raw performance.
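As a sketch, that rule of thumb reduces to a one-line check; the performance inputs are whatever relative benchmark figures you trust, and the 1.25 factor is the 25% threshold above:

```python
# Prefer single Card A over two of Card B when A is >= 25% faster than one B.
def prefer_single(card_a_perf, card_b_perf):
    return card_a_perf >= 1.25 * card_b_perf

print(prefer_single(1.30, 1.00))  # True  -> buy the single stronger card
print(prefer_single(1.10, 1.00))  # False -> SLI of B may be worth a look
```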

 

Further to the above bashing of the state of multi-GPU, here is a nice article's summary page on multi-GPU performance in 2015-and-later titles, to back up the statements I make here, since I often get people telling me I'm deluded or some other kind of nonsense when I make such claims.

 

NB: I add to the benefits and detriments lists whenever I remember something, discover something, or technology changes, to keep the guide up to date. I wish I could speak more about Maxwell, but unless someone sends me a pair of Maxwell GPUs and heatsinks for my Clevo, I'm not going to be able to test much, unfortunately.

 

If you want the vRAM information or mobile i7 CPU information guides, they're in my sig!

 

Moderator note: If you believe any information found in this guide is incorrect, please message me or D2ultima and we will investigate it, thank you. - Godlygamer23

I have finally moved to a desktop. Also my guides are outdated as hell.

 

THE INFORMATION GUIDES: SLI INFORMATION || vRAM INFORMATION || MOBILE i7 CPU INFORMATION || Maybe more someday


A slight correction regarding SLI with GPUs of different vRAM amounts:

 

the newest drivers have stopped users from pairing the same GPU with different amounts of vRAM,

 

so you can't mix a GTX 760 2GB with a GTX 760 4GB now.

 

 

This has been confirmed by nVidia staff and the forums.

Budget? Uses? Currency? Location? Operating System? Peripherals? Monitor? Use PCPartPicker wherever possible. 

Quote whom you're replying to, and set option to follow your topics. Or Else we can't see your reply.

 


Good! Time to update! How do I use spoiler tags O_O

"{spoiler}"   "{/spoiler}"

 

 

change the {} to []


 


Now one for xfire :P

|| MAELSTROM ||

|| i5 3570K W/ CM 212x @ 4GHz || r9 290 Windforce || Corsair Obsidian 450D || 2tb HDD + 120gb 840 evo SSD || 2x4gb Hyperx Fury Blue

 

 


I can give a few points for CrossFire

 

one major advantage of CrossFire is the ability to mix same-series GPUs with different amounts of VRAM together,

 

so you can mix a 280 with a 280X

 

or you can mix the HD7970 with the 280X

 

or 280X 3GB with a 6GB 280X

 

 

and also the 290 with a 290X


 


How do you get this pinned? @Windspeed36 seems well done enough to be :)


 

 



I knew this much, but to be fair, you're essentially listing the same GPUs under different (rebranded) names as being compatible. What's more interesting is that apparently the HD 7950 and the HD 7970 are Crossfire-capable with each other, due to being so similar under the hood. It would be nice to know if a 680 and a 770 could SLI, since the hardware is basically the same thing. I wouldn't want to try writing a full Crossfire guide until I have time to seriously dig into it and do all the crazy stuff. About 80% of what I've listed is firsthand experience (anything that includes 3 monitors or 3-4 GPUs I have been unable to try), and I would want to write a Crossfire guide the same way, if I could.




yea

 

Crossfire is very loose with the list of compatible cards

 

R9 and R7 Series
 
AMD Radeon R9 290X CrossFire compatible with AMD Radeon R9 290X.
 
AMD Radeon R9 290 CrossFire compatible with AMD Radeon R9 290.
 
AMD Radeon R9 280X CrossFire compatible with AMD Radeon R9 280X and AMD Radeon HD 7970.
 
AMD Radeon R9 270X CrossFire compatible with AMD Radeon R9 270X, AMD Radeon HD 7870 and AMD Radeon 7850.
 
AMD Radeon R9 270 CrossFire compatible with AMD Radeon R9 270.
 
AMD Radeon R7 260X CrossFire compatible with AMD Radeon R7 260X and AMD Radeon HD 7790.
 
AMD Radeon R7 260 CrossFire compatible with AMD Radeon R7 260.
 
AMD Radeon R7 250 CrossFire compatible with AMD Radeon R7 250. 
 
AMD Radeon R7 240 CrossFire compatible with AMD Radeon R7 240.
 
 
Radeon HD 7xxx Series
 
AMD Radeon HD 7990 CrossFire compatible with AMD Radeon HD 7990.
 
AMD Radeon HD 7970 GHz Edition CrossFire compatible with AMD Radeon R9 280X, AMD Radeon HD 7970 and AMD Radeon HD 7950.
 
AMD Radeon HD 7950 CrossFire compatible with AMD Radeon R9 280X, AMD Radeon HD 7970 and AMD Radeon HD 7950.
 
AMD Radeon HD 7870 XT CrossFire compatible with AMD Radeon R9 280X, AMD Radeon R9 280, AMD Radeon HD 7970, AMD HD Radeon 7950 and AMD Radeon HD 7870 XT.
 
AMD Radeon HD 7870 GHz Edition CrossFire compatible with AMD Radeon R9 270X, AMD Radeon HD 7870 and AMD Radeon HD 7850.
 
AMD Radeon HD 7850 CrossFire compatible with AMD Radeon R9 270X, AMD Radeon HD 7870 and AMD Radeon HD 7850.
 
AMD Radeon HD 7790 CrossFire compatible with AMD Radeon R7 260X, AMD Radeon R7 260 and AMD Radeon HD 7790.
 
AMD Radeon HD 7770 GHz Edition CrossFire compatible with AMD Radeon HD 7770 and AMD Radeon HD 7750.
 
AMD Radeon HD 7750 CrossFire compatible with AMD Radeon HD 7770 and AMD Radeon HD 7750 (External bridge not required to enable AMD CrossFire Technology with the AMD Radeon HD 7750).
 
 
 
Radeon HD 6xxx Series
 
AMD Radeon HD 6990 CrossFire compatible with AMD Radeon HD 6990, AMD Radeon HD 6970 and AMD Radeon HD 6950.
 
AMD Radeon HD 6970 CrossFire compatible with AMD Radeon HD 6990, AMD Radeon HD 6970 and AMD Radeon HD 6950.
 
AMD Radeon HD 6950 CrossFire compatible with AMD Radeon HD 6990, AMD Radeon HD 6970 and AMD Radeon HD 6950.
 
AMD Radeon HD 6870 CrossFire compatible with AMD Radeon HD 6870 and AMD Radeon HD 6850.
 
AMD Radeon HD 6850 CrossFire compatible with AMD Radeon HD 6870 and AMD Radeon HD 6850.
 
AMD Radeon HD 6790 CrossFire compatible with AMD Radeon HD 6790.
 
AMD Radeon HD 6770 CrossFire compatible with AMD Radeon HD 6770, AMD Radeon HD 6750, AMD Radeon HD 5770 and AMD Radeon HD 5750.
 
AMD Radeon HD 6750 CrossFire compatible with AMD Radeon HD 6770, AMD Radeon HD 6750, AMD Radeon HD 5770 and AMD Radeon HD 5750 (External bridge not required to enable AMD CrossFire Technology with the AMD Radeon HD 5750).
 
 
 
Radeon HD 5xxx Series
 
AMD Radeon HD 5970 CrossFire compatible with AMD Radeon HD 5970, AMD Radeon HD 5870, AMD Radeon HD 5850 and AMD Radeon HD 5770.
 
AMD Radeon HD 5870 CrossFire compatible with AMD Radeon HD 5970, AMD Radeon HD 5870, AMD Radeon HD 5850 and AMD Radeon HD 5770.
 
AMD Radeon HD 5850 CrossFire compatible with AMD Radeon HD 5970, AMD Radeon HD 5870, AMD Radeon HD 5850 and AMD Radeon HD 5770.
 
AMD Radeon HD 5830 CrossFire compatible with AMD Radeon HD 5970, AMD Radeon HD 5870, AMD Radeon HD 5850 and AMD Radeon HD 5770.
 
AMD Radeon HD 5770 CrossFire compatible with AMD Radeon HD 5770 and AMD Radeon HD 5750.
 
AMD Radeon HD 5750 CrossFire compatible with AMD Radeon HD 5770 and AMD Radeon HD 5750 (External bridge not required to enable AMD CrossFire Technology with the AMD Radeon HD 5750).
 
 
There are some interesting exceptions, like the recent R9 295X2:
 
Apparently you can do a hybrid CrossFire of the 295X2 with a 290X for Tri-Fire (3-way CrossFire).
 
Same with the HD 7990, which can be matched with the 280X or HD 7970.
 
 
Dual GPU cards like the HD 7990 and 295X2 run CrossFire internally by default, so you do not need to enable CrossFire mode.
 
Only when you have dual HD 7990s or dual 295X2s do you see the option to enable CrossFire, for 4-way CrossFire.
 
Only the 290/X and 295X2 do not need the CrossFire finger/bridge; the 280/X, 270/X, 265 and below do need the bridge.
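
If it helps to see the pattern: here's a toy sketch (Python, entirely my own notation, not an AMD tool) showing that CrossFire pairing works out to a symmetric "same silicon family" check rather than exact-model matching. The family sets are transcribed from a few rows of the list above.

# Toy sketch only -- not an AMD API. CrossFire pairing is effectively a
# symmetric "same silicon family" relation, not exact-model matching.
# Family sets transcribed from a few rows of the compatibility list above.
CROSSFIRE_FAMILIES = [
    {"R9 280X", "HD 7970 GHz Edition", "HD 7970", "HD 7950"},  # Tahiti
    {"R9 270X", "HD 7870 GHz Edition", "HD 7850"},             # Pitcairn
    {"R7 260X", "R7 260", "HD 7790"},                          # Bonaire
]

def can_crossfire(card_a, card_b):
    """True if both cards appear together in some compatibility family."""
    return any(card_a in family and card_b in family
               for family in CROSSFIRE_FAMILIES)

print(can_crossfire("HD 7950", "R9 280X"))  # True  -- same Tahiti silicon
print(can_crossfire("HD 7950", "R9 270X"))  # False -- different family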


 


Hmm... first of all, nice guide. I didn't know about the added benefit of SLI for 3D gaming, or that you can actually use it to force ridiculous AA on games. I've been thinking about getting two flagship GPUs once Nvidia and AMD release their next lineups (probably Nvidia, since they're going for power efficiency and I can't be arsed to upgrade my PSU), but I still can't figure out if it's worth it for me. The issues with multiple monitors and borderless window mode in particular seem like a huge dealbreaker, since I can't live without multitasking while gaming on my 3 monitors anymore. I've never had a multi-GPU setup before and I really want to try it out, but it seems to be a pain. Not the setup or running into small issues, I don't have a problem with that, but if things like multiple monitors simply don't work and I can't do anything to fix it, it would drive me nuts... Eh, I'm as indecisive as ever.


Now I just need a pair of 880's to give it a try...



I've found the AA and 3D parts very interesting and helpful. Thank you, well-written article. :)

 

I've found the AA and 3D parts very interesting and helpful. Thank you, well-written article. :)

As long as it helps someone with decisions or simply clears up misconceptions, my job is done ^_^. I wrote this to share information, because there are ALWAYS a zillion questions about "should I do this?" or "should I do that?" etc. I would like to do one for CrossfireX, but unless somebody provides me with a high-end full-AMD PC, I'm not going to be doing one. I only have a laptop anyway (which is currently not even available to me, stupid motherboard replacement), so it's not as simple as throwing in two AMD cards and seeing how it works xD.



  • 1 month later...

Great overview!



  • 3 weeks later...

Now I just need a pair of 880's to give it a try...

Wat?

edit: just now realized you wrote that before the 900 series launch ^^

To OP: very well done! :]



Wat?

edit: just now realized you wrote that before the 900 series launch ^^

To OP: very well done! :]

Thanks! Glad you got something out of it.



  • 3 weeks later...

Heyyo,

Good guide. The only recommendation I'd add? A link to this table of user-found SLI bits. It's quite the comprehensive list of SLI bits for either better SLI scaling or support for SLI in games that don't have it.

http://www.forum-3dcenter.org/vbulletin/showthread.php?t=509912

And of course, to use those bits you'll need the program NVIDIA Inspector.
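
For anyone wondering what those "bits" actually are: each entry in that table is a single 32-bit hex value that gets written into the game's driver profile via Inspector. The meaning of the individual bits is undocumented, so this little Python snippet (purely illustrative, nothing official from nVidia) just shows the format of the value you'd be copying into the SLI compatibility bits field.

# Purely illustrative -- not an nVidia API. SLI compatibility bits are a
# single 32-bit DWORD per game profile; the 3DCenter table lists them as
# hex values that you paste into NVIDIA Inspector.
def describe_sli_bits(value):
    """Render a 32-bit SLI-bits value the way the tables list it."""
    if not 0 <= value <= 0xFFFFFFFF:
        raise ValueError("SLI compatibility bits are one 32-bit value")
    return "0x{:08X} (binary {:032b})".format(value, value)

# Example-format value only -- always copy the exact value the table
# gives for your specific game:
print(describe_sli_bits(0x080000F5))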


