Will multi-GPU setups be a thing with NVLink?

Lajamerr_Mittesdine

I'm not too sure, but I've been hearing speculation that multi-GPU scaling will be a thing with NVLink.

 

The reason SLI failed is that it required game developers to specifically add support for multi-GPU setups, and since multi-GPU owners make up such a small part of the market, barely any developers spent time supporting it.

 

I've been hearing from various unsupported rumors / speculation that with NVLink it might be possible to bypass game developers and instead handle GPU scaling on NVIDIA's end, in the driver.

 

As far as I know, NVLink on the professional cards (the ones that supported it) still required application developers to add support for their own applications. But I'm hearing this might not be the case with games.

 

I know there aren't any facts to support these claims, but does anyone with more knowledge know whether this is feasible / possible? Could we see a return of multi-GPU setups with these new cards?


Well, with SLI, games actually had to support it or there was no performance scaling at all.

 

Are you saying that NVLink is actually designed around not requiring applications to support it?

 

So I could expect all GPU-bound applications to see performance increases if I just add more cards over NVLink?

 

I hope to see Linus and the team test whether automatic GPU scaling works in games once they get their hands on these cards.


Anybody who told you that NVLink, AS OF NOW, does some sort of magical trick for multi-GPU configs is wrong. Right now, you can use NVLink either for SLI or for combining cards to get lots of screen outputs and such. That was a thing with the Quadro GV100; I don't know how many of those workstation-specific abilities carry over from the Volta Quadro to the Turing gaming GPUs, but YOU are only looking at SLI over NVLink currently. Nothing more, nothing less. It's a faster interface for multi-GPU bridging. Maybe in the future you'll get other NVLink multi-GPU features on gaming cards, but certainly not now.
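
To put some rough numbers on "a faster interface for multi-GPU bridging": the sketch below estimates how long it takes to copy one finished 4K frame to the primary card (as in AFR-style SLI) over different bridge speeds. This is only an illustration; the ~1 GB/s and ~2 GB/s figures for the legacy and HB SLI bridges, and the 25-50 GB/s range for Turing NVLink, are approximate, commonly cited numbers and should be treated as assumptions.

```python
# Rough sketch: time to move one finished 4K frame to the primary GPU over
# different bridge speeds. Bandwidth figures are approximate assumptions.

FRAME_BYTES = 3840 * 2160 * 4              # one 4K frame, 8-bit RGBA (~33 MB)

bridge_gbps = {                            # GB/s per direction (assumptions)
    "legacy SLI bridge":     1.0,
    "SLI HB bridge":         2.0,
    "NVLink (RTX 2080)":    25.0,
    "NVLink (RTX 2080 Ti)": 50.0,
}

for name, gbps in bridge_gbps.items():
    ms = FRAME_BYTES / (gbps * 1e9) * 1e3  # transfer time in milliseconds
    print(f"{name:22s}: ~{ms:5.2f} ms per 4K frame")

# At 144 Hz there are only ~6.9 ms per frame, so the old bridges were already
# marginal for high-refresh 4K SLI, while NVLink makes the frame copy a
# rounding error. None of this changes who has to implement multi-GPU support.
```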


Thank you for the response. It's unfortunate that it will still require application developers to add multi-GPU scaling support to their applications/games.

 

I was hoping to throw in as many cards as I could (4) and just get automatic scaling, as if they were all one single device, even if it wasn't 100% scaling and more like 50%.

 

If NVIDIA or AMD could somehow bypass the need for application developers to add multi-GPU support and make applications think they are talking to a single device, that would be a game changer.

 

Though I guess my hype was too optimistic. Thanks for the bucket of ice-cold water poured over me. :)


The bandwidth is just not there to fully virtualize the cards together.  Yes, Nvidia can encapsulate some of the management of resource sharing in their drivers, but with multiple APIs and game engines it's up to the devs to figure it out.  Even if NVLink is 50GB/s, how can they properly manage memory distribution generically? A game may be able to load the majority of its assets into the shared memory pool, but that's not enough bandwidth to make it fully available to both GPUs.
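
A back-of-envelope comparison helps show the gap this post is describing. This is just an illustrative sketch: the 50 GB/s NVLink figure comes from the post, while the ~600 GB/s local-VRAM number and the 60 fps frame budget are assumptions picked for the sake of the comparison.

```python
# Back-of-envelope: why a ~50 GB/s link is a poor stand-in for local VRAM.
# All figures are illustrative assumptions, not measurements.

NVLINK_BW     = 50e9     # bytes/s, the figure quoted in the post above
LOCAL_VRAM_BW = 600e9    # bytes/s, ballpark for high-end GDDR6 (assumption)
FRAME_TIME    = 1 / 60   # seconds per frame at 60 fps

def data_per_frame(bandwidth_bps):
    """How much data a GPU could pull over this path within one frame."""
    return bandwidth_bps * FRAME_TIME

for name, bw in [("NVLink", NVLINK_BW), ("local GDDR6", LOCAL_VRAM_BW)]:
    print(f"{name:12s}: ~{data_per_frame(bw) / 1e9:.2f} GB reachable per frame")

# Output: NVLink ~0.83 GB per frame vs. local GDDR6 ~10 GB per frame.
# A GPU that has to fetch assets from the other card's memory every frame is
# working with roughly a tenth of the bandwidth it expects, which is why
# transparently pooling memory across two gaming GPUs isn't free.
```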

Gaming - AMD TR 3970X | ASUS ROG Zenith Extreme II | G.SKILL Neo 3600 64GB | Zotac Nvidia 2080 Ti AMP | 2x Sabrent 1TB NVMe | Samsung 860 EVO 1TB SSD | Phanteks Enthoo 719 | Seasonic Prime Ultra Platinum 1000w | Corsair K70 RGB Lux | Corsair M65 | 2x ASUS Rog PG279Q | BenQ EW3270U | Windows 10 Pro | EKWB Custom loop

ITX - Intel i7-10700k | Asus ROG Z490-I Gaming | G.SKILL TridentZ RGB 3200 32GB | EVGA 2080 Super| Samsung 970 Evo 1TB | Samsung 860 Evo 1TB SSD | NZXT H1| Windows 10 Pro

HTPC - Intel i9-9900k | Asus ROG Maximus XI Code | G.SKILL TridentZ RGB 3200 32GB | EVGA 1070 | Samsung 970 1TB | WD Blue 1TB SSD | NZXT H700  | EVGA G3 1000W | Corsair H150i | Windows 10 Pro

Servers - SuperMicro 846 | 2x 2695L V2 | 128GB | Chelsio 10Gbe | Chelsio 40Gbe | 24 x 6TB | FreeNas - SuperMicro 826 | 2 x 2695L | 128GB | Chelsio 10Gbe | Chelsio 40Gbe | 8 x 10TB | 847 24 x 1TB SSD | Windows Server 2019

Work - Dell XPS 15 9560 | i7-7700HQ | 32 GB RAM | 1TB NVMe | 4k display


I love how you started this thread.  "The reason SLI failed".  It hasn't failed.  Stop propagating falsehoods.

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz display | BenQ EW3280U display

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


24 minutes ago, jasonvp said:

I love how you started this thread.  "The reason SLI failed".  It hasn't failed.  Stop propagating falsehoods.

Of course it has; just look at the poor support from game developers and NVIDIA themselves.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


16 minutes ago, i_build_nanosuits said:

Of course it has; just look at the poor support from game developers and NVIDIA themselves.

Funny: the AAA titles I play all support it, and do so super well.



17 minutes ago, jasonvp said:

Funny: the AAA titles I play all support it, and do so super well.


Let me take you back to the drawing board, since this is some poor laughing-stock material. SLI and CrossFire suck arse, because new titles often don't support them right away. Take RE7, for instance: I had 1080 SLI at the time, and it didn't work with the game when it was new. I played it at 4K with a single 1080, with bad tearing all over the place. Do you honestly think I was going to wait all that time for SLI support to come to it? (It took a long while.) The answer is NO. Nobody buys hot new SP games only to play them weeks or even a couple of months later while waiting for SLI support.

Then you have the game engines that DO NOT work with SLI no matter what. Certain game renderers do not support SLI, full stop. Finally, you have games that have been finicky with SLI for years, such as Siege.

I wouldn't mind if Nvidia killed SLI off COMPLETELY on RTX cards since it seems like you'll be able to full-on haul ass with a 2080 Ti, even at 4k. So yeah, there you have it.


1 hour ago, jasonvp said:

I love how you started this thread.  "The reason SLI failed".  It hasn't failed.  Stop propagating falsehoods.

I'm sorry, that statement of mine was more reactionary than I intended. I meant to say that the hype around multi-GPU setups, and the number of people actually building them, seems to have died down this generation.

 

Which I presumptuously attribute mostly to the applications people want to use either not scaling well or not scaling at all.

 

Again, sorry for that; I'll need to learn to discuss things in a more neutral tone.


1 minute ago, Motifator said:


Let me take you back to the drawing board since this is some poor laughing stock material

Take me back... where?

 

Ubi's AnvilNext supports it.

Ubi's Dunia supports it (better than any engine I've seen!)

Dice's Frostbite supports it.

 

Those are the engines I'm playing (with) lately.  The AAA games I play support it.  And NVidia hasn't "dropped support" for it at all.

 

Try again.  When you've got some real data, then let me know.  Otherwise you're just pulling these claims right out of your backside.



4 minutes ago, jasonvp said:

Take me back... where?

 

Ubi's AnvilNext supports it.

Ubi's Dunia supports it (better than any engine I've seen!)

Dice's Frostbite supports it.

 

Those are the engines I'm playing (with) lately.  The AAA games I play support it.  And NVidia hasn't "dropped support" for it at all.

 

Try again.  When you've got some real data, then let me know.  Otherwise you're just pulling these claims right out of your backside.


Take you back to where you're completely clueless. The Siege engine HAS BEEN crap with SLI for years. You can do Google searches of Siege and SLI issues and find problems from literally 2015 all the way through 2018. You also talked about Frostbite supporting it... DX12 and SLI in Frostbite? What about that? Are you even aware of ALL the problems regarding DX12 titles / renderers and SLI?

I didn't say they "dropped support", I said "I wouldn't mind if they drop support". Learn to read.

Real data? Vulkan doesn't support SLI either. Either stop replying if you're this clueless or go on and make more of a clown of yourself.


7 hours ago, Motifator said:


Take you back to where you're completely clueless. The Siege engine HAS BEEN crap with SLI for years.

 

Ubi fixed that.  It works now.  And it works well.

 

Quote

You can do Google searches of Siege and SLI issues and find problems from literally 2015 all the way through 2018.

 

I don't need to do searches.  I did a lot of testing when Siege was first released and even produced a video for Ubi showing them it doesn't work, and how to disable it specifically for Siege.  I know about the challenges with Siege and SLI.  But they're not true any longer.  It works.

 

Quote

You also talked about Frostbite supporting it... DX12 and SLI in Frostbite? What about that? Are you even aware of ALL the problems regarding DX12 titles / renderers and SLI?

 

Who the FUCK said anything about DX12?  Moving the goal posts, are we?

 

Quote

Real data? Vulkan doesn't support SLI either. Either stop replying if you're this clueless or go on and make more of a clown of yourself.

Moved the goal posts again.  Who said anything about Vulkan?

 

You've got no real data to back up your claim, and have no (or very little) first hand, real-world experience on the matter.  There's a "clown" in this thread and it's the guy I'm responding to here.  Try again.  Get some real world experience with the subject and then come back armed with knowledge.  Until then you're just parroting an oft-repeated line of horseshit found on this forum, put forth by people that either don't want to or can't afford to buy two GPUs and they use this as justification.

 

I'll say it once again for the cheap seats (which is where you're sitting right now) - SLI works very well depending on the title.  It's possible you play games in which the support is lacking.  The games I play: it works wonderfully.



9 hours ago, Motifator said:


Take you back to where you're completely clueless. The Siege engine HAS BEEN crap with SLI for years. You can do Google searches of Siege and SLI issues and find problems from literally 2015 all the way through 2018. You also talked about Frostbite supporting it... DX12 and SLI in Frostbite? What about that? Are you even aware of ALL the problems regarding DX12 titles / renderers and SLI?

I didn't say they "dropped support", I said "I wouldn't mind if they drop support". Learn to read.

Real data? Vulkan doesn't support SLI either. Either stop replying if you're this clueless or go on and make more of a clown of yourself.

Look.. we get it.  You had SLI, had a poor experience and don't plan on going back.   It's totally understandable.  

 

SLI is and will always be an enthusiast level technology and as such takes some tweaks and mods to try and push performance past currently available hardware. Once you achieve those results, the investment becomes worth it. It isn't meant for folks who are scared to flirt with the inevitability of diminishing returns and is certainly too much work for the "plug and play" crowd. 

 


4 hours ago, jasonvp said:

 

Ubi fixed that.  It works now.  And it works well.

 

 

I don't need to do searches.  I did a lot of testing when Siege was first released and even produced a video for Ubi showing them it doesn't work, and how to disable it specifically for Siege.  I know about the challenges with Siege and SLI.  But they're not true any longer.  It works.

 

 

Who the FUCK said anything about DX12?  Moving the goal posts, are we?

 

Moved the goal posts again.  Who said anything about Vulkan?

 

You've got no real data to back up your claim, and have no (or very little) first hand, real-world experience on the matter.  There's a "clown" in this thread and it's the guy I'm responding to here.  Try again.  Get some real world experience with the subject and then come back armed with knowledge.  Until then you're just parroting an oft-repeated line of horseshit found on this forum, put forth by people that either don't want to or can't afford to buy two GPUs and they use this as justification.

 

I'll say it once again for the cheap seats (which is where you're sitting right now) - SLI works very well depending on the title.  It's possible you play games in which the support is lacking.  The games I play: it works wonderfully.


I don't care whether they fixed it now or not. It was broken for YEARS, get it? I don't play the game as much as I used to and I gave up on SLI WHEN IT WAS BROKEN.

Why are we NOT talking about DX12? Oh yeah, because YOU want to evade the topic and don't want to talk about it. BF V will use it, EA again won't give a fuck about it, and then it won't work using that renderer.

No real-world claims? Are you really ignoring every fact I told you? You are seriously a clown. I have done 1080 SLI TWICE, and sold the cards each time. We have Doom Eternal on the way; it WILL use OpenGL / Vulkan and it WILL most likely suck with multi-GPU setups.

What about ray tracing, will it work properly with multi-GPU? I don't care whether it works in the 3-year-old games YOU are now playing. People want the damn thing to work straight away with new games, and it DOES NOT.

What the hell does my budget have to do with this? I own more musical toys than I need, which I can always sell to buy two RTX 2080 Tis. Stop being a clown and put forward an actual argument other than basically saying "I'm ignoring every fact you put on the table".
 

2 hours ago, tcari394 said:

Look.. we get it.  You had SLI, had a poor experience and don't plan on going back.   It's totally understandable.  

 

SLI is and will always be an enthusiast level technology and as such takes some tweaks and mods to try and push performance past currently available hardware. Once you achieve those results, the investment becomes worth it. It isn't meant for folks who are scared to flirt with the inevitability of diminishing returns and is certainly too much work for the "plug and play" crowd. 

 


So you're saying I'm NOT an enthusiast? What makes you think that? I did SLI 10 years ago, with multi-GPU cards too. Things you might not have even done. The investment is not worth it anymore to A LOT OF PEOPLE, and I'm not the only one against SLI. You have GPUs that can do every resolution for the mass market these days, even at high refresh rates, so the value in it isn't as much as you think it is. Tweaks and mods? Do you realize that changing the SLI rendering mode in NVCP most of the time only causes MORE issues, and that Nvidia Inspector is just a glorified piece of software that causes even more problems? You need SLI profiles in the majority of these cases for it to work, period.


What I personally don't understand is why people are saying that the new cards are not SLI because they use NVLink.

 

I simply don't get it - nothing in the description of SLI technology ties it to the type of bridge used. It's like saying an M.2 SSD is not an SSD because it has a different form of connection than SATA.

 

NVIDIA still calls the cards SLI-ready, right on their website under NVLink next to the RTX cards. It's still SLI, just a different type of bridge. A better, faster bridge.

 

And I'm willing to bet that if you get the new RTX cards and open the NVIDIA Control Panel, you will still see a "Configure SLI" tab there.

 

Since I got my SLI rig last year, I have played ME: Andromeda, Final Fantasy XV, Witcher 3, Tomb Raider, and Rise of the Tomb Raider. Next in line are Nier: Automata and Shadow of the Tomb Raider.

 

Which means that since I've had my rig, I have been utilizing SLI 100% of the time. FF XV is the only game without official support that I have played, and that was rectified by a custom profile. Nier: Automata also doesn't support it officially, but I have it on good authority that a custom setting with the FAR mod gives it almost 100% scaling without any artifacts = better than most games that officially support SLI.

 

The only unknown is Shadow of the Tomb Raider, but considering that both Tomb Raider and Rise of the Tomb Raider supported it, I'm feeling good about my chances.

 

Sure, SLI isn't perfect but as long as someone going for it knows the advantages and disadvantages, it's fine.

 

1 hour ago, Motifator said:

So you're saying I'm NOT an enthusiast

Not a multi GPU enthusiast for sure with the way you speak about them ;-)

 

Multi-GPU enthusiasts know everything you wrote, expect it, and still buy a second card. As with everything, the performance is inadequate relative to the cost, but such is the way of squeezing out every bit of performance in many aspects of life. The last few percent cost exponentially more to achieve than the previous 90%, and might even come with problems of their own.

 

1 hour ago, Motifator said:

You have GPUs that can do every resolution for the mass market these days, even at high refresh rates

Well, a single 1080ti isn't enough to play the games that I play the way I play them. As long as I'm aware that it might not get me 100% scaling or work right from day 1 of the premiere, it's all good.

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


7 hours ago, jasonvp said:

The games I play: it works wonderfully

This is what we call an anecdote. In case you didn't already know, anecdotes are pretty useless.

 

If you look at the broad picture, SLI support is pretty poor. Only a small % of games really support it, and when they do, it's rare that it works well. And even when it works perfectly, it only makes sense for the highest tier of cards, as it's almost always better and cheaper to simply go with a single more powerful card rather than two weaker ones.


1 hour ago, Motifator said:

So you're saying I'm NOT an enthusiast? What makes you think that? I did SLI 10 years ago, with multi-GPU cards too. Things you might not have even done. The investment is not worth it anymore to A LOT OF PEOPLE, and I'm not the only one against SLI. You have GPUs that can do every resolution for the mass market these days, even at high refresh rates, so the value in it isn't as much as you think it is. Tweaks and mods? Do you realize that changing the SLI rendering mode in NVCP most of the time only causes MORE issues, and that Nvidia Inspector is just a glorified piece of software that causes even more problems? You need SLI profiles in the majority of these cases for it to work, period.

Absolutely not, friend.  I am saying that the majority of the folks who complain about SLI simply don't have the patience to make it work.  Most bleeding-edge technologies are abandoned for this very reason.  As an engineer who builds software for ML and AI, I do have the patience and enjoy the quirky challenges that SLI presents.

 

The tweaks that many of us deploy go far beyond "changing the rendering mode" in NVCP.  (Side note: anything that has a fun little UI for changing game-critical settings is bound to cause issues.)  We make changes to the SLI profile via the endless bits presented within the profile inspector, help compatibility via config files, and even go so far as to edit older builds of a game to see if we can learn something new about its rendering techniques.

 

Some of us aren't really concerned about FPS.  If I can get a game that doesn't officially support SLI to utilize both cards beyond 0%, I consider it a win.  Oftentimes this can be achieved by turning off some lighting effects such as global illumination.  (At least in the case of KCD.)

 

Is it perfect?  Nope.   But we're here to learn, adapt and have a little fun.  


27 minutes ago, Lathlaer said:

Not a multi GPU enthusiast for sure with the way you speak about them ;-)
 

Multi-GPU enthusiasts know everything you wrote, expect it, and still buy a second card. As with everything, the performance is inadequate relative to the cost, but such is the way of squeezing out every bit of performance in many aspects of life. The last few percent cost exponentially more to achieve than the previous 90%, and might even come with problems of their own.

 

Well, a single 1080ti isn't enough to play the games that I play the way I play them. As long as I'm aware that it might not get me 100% scaling or work right from day 1 of the premiere, it's all good.


So I'm not a multi GPU enthusiast because I no longer am a fan of it? I have SLI'ed GTX 285s, 470s, 570s... Crossfire'd all the way from HD 2900's, 3870s... GTX 680s... more cards than I can remember.

And no, a single 1080 Ti is perfectly capable of 3440x1440. It's a non-issue resolution for that card; it will easily stay within the G-Sync range of your monitor, and so on. What you "need" is different from what you "want" here.
 

9 minutes ago, tcari394 said:

I do have the patience and enjoy the quirky challenges that SLI presents.
 

Is it perfect?  Nope.   But we're here to learn, adapt and have a little fun.  


I don't anymore; I'm no longer a fan of wasting money, PCI-E lanes and excessive heat on something that sits and does nothing / causes issues. Sure, there are times it works, but like I said, it has significant problems with new titles. PUBG, for example, doesn't work with it because of the engine, so why should I? I'll simply buy a 2080 Ti and it will deliver what I want, 60 FPS with my 4k monitor. Then I'll be having fun indeed.


Just now, Motifator said:

So I'm not a multi GPU enthusiast because I no longer am a fan of it? I have SLI'ed GTX 285s, 470s, 570s... Crossfire'd all the way from HD 2900's, 3870s... GTX 680s... more cards than I can remember.

Yes, do you know the definition of being enthusiastic about something?

 

You may have the knowledge and experience, but you're not exactly oozing enthusiasm towards the technology.

 

1 minute ago, Motifator said:

And no, a single 1080 Ti is perfectly capable of 3440x1440. It's a non-issue resolution for that card; it will easily stay within the G-Sync range of your monitor, and so on. What you "need" is different from what you "want" here.

Well, duh. Now we have a firm grasp of the obvious. Since gaming is my hobby, I will play the games however I want them, not however I need them. If I want to play at 4213x1764 via DSR or apply some crazy AA settings, then I will do so. A single 1080ti isn't enough for me to do that.



8 minutes ago, Motifator said:

And no, a single 1080 Ti is perfectly capable of 3440x1440.

I think we're understanding more of our disagreement.  See, to me: 3440x1440 isn't enough.  We'll get back to that in a moment.

 

Quote

 

I don't anymore, I'm no longer a fan of wasting money, PCI-E lanes and excessive heat

 

And that's fine.  I have the PCI-E lanes to spare, and the cooling hardware to handle a second GPU.

 

Quote

 

I'll simply buy a 2080 Ti and it will deliver what I want, 60 FPS with my 4k monitor. Then I'll be having fun indeed.

 

Back to our disagreement.  You're aiming at a lower resolution and/or frame rate than I am.  4K/140 with as many details cranked as can be (except AA and AO).  That's my goal.  With the Frostbite games: easy peazy with SLI.  With AnvilNext games: a bit harder but close; Siege is a beast when it comes to GPUs and 4K.

 



Just now, jasonvp said:

I think we're understanding more of our disagreement.  See, to me: 3440x1440 isn't enough.  We'll get back to that in a moment.

 

And that's fine.  I have the PCI-E lanes to spare, and the cooling hardware to handle a second GPU.

 

Back to our disagreement.  You're aiming at a lower resolution and/or frame rate than I am.  4K/140 with as many details cranked as can be (except AA and AO).  That's my goal.  With the Frostbite games: easy peazy with SLI.  With AnvilNext games: a bit harder but close; Siege is a beast when it comes to GPUs and 4K.

 


I'm running 4k as well.

I have those too.

Do you realize that the monitors you talk about do not do proper RGB at 4K 144 Hz? You need to drop them to 78 Hz or so to get 4:4:4 sampling.
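
For anyone wondering where cutoffs like that come from, here's a rough bandwidth sketch. The ~25.9 Gbit/s DisplayPort 1.4 payload figure and the ~10% blanking overhead are approximations (the exact refresh-rate limits depend on bit depth and timing), so treat the numbers as assumptions rather than spec values.

```python
# Rough sketch: why 4K 144 Hz at full RGB 4:4:4 doesn't fit through
# DisplayPort 1.4 without chroma subsampling or DSC. Figures are approximate.

DP14_PAYLOAD_GBPS = 25.92   # usable DP 1.4 bandwidth after encoding overhead (approx.)
H, V = 3840, 2160           # 4K UHD
BITS_PER_PIXEL = 3 * 8      # 8-bit RGB, full 4:4:4
BLANKING = 1.10             # ~10% extra for blanking intervals (assumption)

def required_gbps(refresh_hz):
    """Approximate link bandwidth needed at a given refresh rate."""
    return H * V * BITS_PER_PIXEL * refresh_hz * BLANKING / 1e9

for hz in (60, 98, 120, 144):
    need = required_gbps(hz)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "does NOT fit"
    print(f"4K {hz:3d} Hz, RGB 8-bit: ~{need:4.1f} Gbit/s -> {verdict} in DP 1.4")

# 4K 144 Hz RGB 8-bit needs roughly 31-32 Gbit/s against DP 1.4's ~26, so the
# monitor has to drop to chroma subsampling or a lower refresh rate for full
# 4:4:4; 10-bit HDR needs even more, pushing the full-RGB cutoff lower still.
```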
