
RTX 3090 SLI - We Tried so Hard to Love It

AlexTheGreatish
On 10/15/2020 at 1:10 PM, yaboistar said:

this comment was brought to you by the LTT folding team. if you, too, have two 3090's sitting around, consider donating them to science.

Almost enough to get into the top 10 on the LTT folding team, too.


Sooo.... Have they stopped linking the forum in the video description and posting video discussions here?


On 10/15/2020 at 10:26 AM, riba2233 said:

3:28

 

"I love Ryzen as much as anyone does at this point, but if you are running two RTX 3090s, you are going to need the fastest possible gaming CPU, which, for now anyway, is the 10900K."

 

OK, maybe, but apparently you are fine with 1/4 of the max bandwidth for each card?

 

BTW I died when I saw Alex laughing, so honest, his face got really red 😄
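As an aside on the "1/4 of max bandwidth" remark: a back-of-the-envelope PCIe 3.0 calculation (lane counts here are assumed for illustration, not taken from the video) shows what dropping from x16 to x4 costs:

```python
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding
GT_PER_SEC = 8e9
ENCODING_EFFICIENCY = 128 / 130

def pcie3_gb_per_s(lanes):
    """Usable one-direction bandwidth of a PCIe 3.0 link, in GB/s."""
    return lanes * GT_PER_SEC * ENCODING_EFFICIENCY / 8 / 1e9

print(round(pcie3_gb_per_s(16), 2))  # a full x16 slot -> 15.75
print(round(pcie3_gb_per_s(4), 2))   # the "quarter bandwidth" x4 case -> 3.94
```

So a card knocked down to x4 gets roughly 3.9 GB/s each way instead of ~15.75 GB/s.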

 

I don't know if Linus or @AlexTheGreatish realized it, but they aren't actually using a 10900K in the video. If you look closely at the motherboard, it's a Godlike Z390, not a Godlike Z490. The Z490 has a totally different layout, with only three PCIe x16 slots instead of four. I also believe the newest board uses three-slot spacing because of this layout change, so the NVLink bridge wouldn't fit and the graphics cards would be touching or almost touching.


3 hours ago, Brett4721 said:

I don't know if Linus or @AlexTheGreatish realized it, but they aren't actually using a 10900K in the video. If you look closely at the motherboard, it's a Godlike Z390, not a Godlike Z490. The Z490 has a totally different layout, with only three PCIe x16 slots instead of four. I also believe the newest board uses three-slot spacing because of this layout change, so the NVLink bridge wouldn't fit and the graphics cards would be touching or almost touching.

 

Nice catch!


  • 2 weeks later...
On 10/21/2020 at 8:46 AM, Brett4721 said:

I don't know if Linus or @AlexTheGreatish realized it, but they aren't actually using a 10900K in the video. If you look closely at the motherboard, it's a Godlike Z390, not a Godlike Z490. The Z490 has a totally different layout, with only three PCIe x16 slots instead of four. I also believe the newest board uses three-slot spacing because of this layout change, so the NVLink bridge wouldn't fit and the graphics cards would be touching or almost touching.

Yeah, I was wondering the same thing, seeing as I was trying to find a 4-way Godlike for the 10700K or 10900K and ended up having to get the X570 Godlike instead. The only Intel version I could find that worked was the Z390, but it's LGA 1151 (9900K, etc.).

 

The only current mobo I could find (non-server or Threadripper) that runs 3090 SLI is the MSI Godlike X570. I picked one up and plan on picking up a 5950X for it on the 5th. I need two 3090 setups here anyway, so I'm hoping to play with SLI every so often on one of them when they're not both in use.


  • 1 month later...

Man, I hate when LTT does things like this and half-asses them... Even if they don't do it because Nvidia reps thought it was good promotion (so no conspiracy theories about big bad tech companies shaping opinions out of greed), it doesn't change the fact that, even unknowingly, LTT helps Nvidia's consumer-hostile attitude.

 

Why do you think SLI is a thing of the past in the first place? Yes, it had its issues, but in general it was a really nice way to get higher-tier performance by utilizing lower-tier cards and save some money. And that is something Nvidia doesn't like us doing (saving money on gaming).

 

 

And most of the big influencers not only don't give a crap about criticizing Nvidia for that, they pat its shoulder. Let's not kid ourselves: the market isn't shaped by tech people, it's shaped by the big majority of "don't-know-betters" who watch videos and commercials and take them as gospel...

 

 

Instead of trying to show them what they are missing out on because of today's schemes (which would lead them to ask, for example, whether card X can SLI, and get "no" as an answer from a retailer; if thousands of people did that and thousands of retailers had to disappoint them, it could change things), they just present it as useless...

 

 

A) The titles shown on the screen of Linus's rig are not the only titles supporting SLI out of the box, not even close, and there are even more titles that can run SLI with some configuration/patching! But the average viewer will see literally a handful of titles (before watching this video, for example, I had just finished a session of Resident Evil 7, which has top-tier graphics, is a demanding game, and supports SLI/Crossfire out of the box) and will think "wow, that's lame".

 

B) There must be an issue with RDR2, because I, for example, can play it with Crossfire, and Wolfenstein: Youngblood performs better in Crossfire than on a single Vega 64.

 

C) They picked the worst-case GPU scenario: a single 3090 probably needs a beefy PSU to begin with, and they talked up the heat to show how bad SLI is, which is not an objective picture...

 

 

SLI is dead, but not because it was a bad thing for us gamers; Nvidia killed it to deny people access to "cheaper performance" and force them to pony up more money for a higher-tier card.

 

One of the biggest advantages of DX12 was that it allows multiple graphics cards (not even the same brand!) to run a game, yet almost nobody implements those features. Why? Because developers are partners with the GPU manufacturers, and this wouldn't be good news for the ridiculous prices of the higher-tier models, which cost $1000+ apiece!

 

Imagine having two RX 580s, and suddenly, because of DX12, you could add a 1060 and a 5500 XT you found on eBay at a good price. All four cards (with the help of DX12) would let you play at at least RTX 3070 frame rates, you wouldn't need to spend as much, and you'd make use of your existing older hardware. Bad news for Nvidia, which would also be forced to lower current-gen prices to stay appealing. Because of that, the DX12 multi-GPU features don't get implemented!

On 10/15/2020 at 8:10 PM, yaboistar said:

this comment was brought to you by the LTT folding team. if you, too, have two 3090's sitting around, consider donating them to science.

The cards in the video are like 10% of Nvidia's total stock, so I doubt that :P :P :P

 

 

 


12 minutes ago, papajo said:

A) The titles shown on the screen of Linus's rig are not the only titles supporting SLI out of the box, not even close, and there are even more titles that can run SLI with some configuration/patching! But the average viewer will see literally a handful of titles (before watching this video, for example, I had just finished a session of Resident Evil 7, which has top-tier graphics, is a demanding game, and supports SLI/Crossfire out of the box) and will think "wow, that's lame".

The games shown are the games that have native game integration of SLI, because after January 1st, 2021, Nvidia's drivers will no longer support SLI profiles, meaning it is entirely up to the developer to put in the work to build their own support for a feature few people use. It also means that games that support SLI now but aren't on that list will no longer be supported.

12 minutes ago, papajo said:

B) There must be an issue with RDR2, because I, for example, can play it with Crossfire, and Wolfenstein: Youngblood performs better in Crossfire than on a single Vega 64.

Yes, that is why SLI is not so great: sometimes it reduces performance. Also, would you not be better off getting a better graphics card than buying two?

12 minutes ago, papajo said:

C) They picked the worst-case GPU scenario: a single 3090 probably needs a beefy PSU to begin with, and they talked up the heat to show how bad SLI is, which is not an objective picture...

The 3090 is the only 3000-series card to support SLI, so it is an objective look (this is a video about SLI on the 3000-series cards, not an overview of SLI in general).

Please excuse my spelling


7 minutes ago, DarthEoin said:

Yes, that is why SLI is not so great: sometimes it reduces performance. Also, would you not be better off getting a better graphics card than buying two?


Yeah, the point being that there is no cosmological wall one hits when trying to implement better support; it's just that they deliberately don't.

 

9 minutes ago, DarthEoin said:

The games shown are the games that have native game integration of SLI, because after January 1st, 2021, Nvidia's drivers will no longer support SLI profiles, meaning it is entirely up to the developer to put in the work to build their own support for a feature few people use. It also means that games that support SLI now but aren't on that list will no longer be supported.

RE7, Far Cry, Crysis 3, For Honor, CoD: IW, Fallout 4, The Witcher 3: just a few games that come to mind that support SLI out of the box and weren't even mentioned...


16 minutes ago, DarthEoin said:

Also, would you not be better off getting a better graphics card than buying two?

You would, and that's exactly why they don't support it! There is no technical obstacle that prohibits them from supporting it!!

 

Imagine already having one card: why not buy the same card (at a cheaper price on top of that, since it will be an older model by then) and get a performance increase that way?

 

And they scale well in most cases. When I had RX 580s (spent $180 before the Bitcoin craze), for example, I got performance similar to a GTX 1080 Ti ($1000 back then).

 

My two Vega 64 Nitros (which I manually overclocked) have about the performance of an RTX 3080; actually, they are a little slower, but only by a marginal amount, and I spent about $500 for both because I got them used.


19 hours ago, papajo said:

My two Vega 64 Nitros (which I manually overclocked) have about the performance of an RTX 3080; actually, they are a little slower, but only by a marginal amount, and I spent about $500 for both because I got them used.

We learned from SLI and Crossfire that consistent frame times and lower latency are more important than raw FPS. Even if you get the same performance on paper with two graphics cards, it is just a worse experience after all.

Not allowing the uninformed customer to waste their money on obsolete technology is a good thing.


1 hour ago, HenrySalayne said:

We learned from SLI and Crossfire that consistent frame times and lower latency are more important than raw FPS. Even if you get the same performance on paper with two graphics cards, it is just a worse experience after all.

Not allowing the uninformed customer to waste their money on obsolete technology is a good thing.

The multi-GPU rendering technology is not obsolete; it is cutting edge (it is one of the freshly baked features of DX12, the latest DirectX).

 

As for Nvidia's specific implementation, "SLI" (which, again, is not obsolete; it is still present on their cutting-edge workstation models, because there it needs to be present, and the profit margins and the way those cards are deployed don't suffer from multi-GPU use; rather, it is essential):

It had its issues, but those issues are merely a matter of optimization; there was no global/intrinsic problem with the use of two or more GPUs (not all games have/had issues; some do, many of those can be fixed/mitigated with tuning/configuration/patching, and others run just fine).

 

From the time I had my first multi-GPU card (GeForce 9800 GX2) until today, the only real "issue" I faced was that some of my games wouldn't utilize both GPUs; that was the only real issue...

 

 

It is dead now solely for profit margins. Many of the current 12,389,219,038 different model tiers of Nvidia wouldn't have a meaningful reason to exist (or to exist at their current price tags) had SLI support not been discontinued.

 

 

EDIT: Last but not least, the list of games that officially support SLI and scale well (so the games that need patching, 3rd-party software, etc. to make both GPUs work are excluded from that list; I mention that because if those titles were included, the list would be even bigger, maybe double the size or close to it) is bigger than the list of games that support RTX: https://www.build-gaming-computers.com/sli-supported-games.html

 

The difference here is that RTX keeps getting support and being pushed (as we recently saw with the "Hardware Unboxed" fiasco from Nvidia), because RTX is a gimmick... ehm, "feature"... that allows Nvidia to increase the price tag of products that support it and to force people onto the Nvidia wagon instead of AMD's through aggressive/hostile marketing, making them think they need RTX...

 

On the other hand, SLI is a feature where consumers could save money while still increasing their in-game performance, which ends up with Nvidia making less money from us... so it needed to be discontinued!

 

But I wouldn't expect anything better from greedy Nvidia (Jensen's leather jackets are very expensive, after all :P). The tragedy for me is that YT channels like LTT go "hmm, yeah Nvidia, you're right to do that", and on top of that, either on purpose (so with malicious intent) or because they are ignorant (pick and choose; I have no interest in supporting any conspiracy theory, but evidently, if it is not a conspiracy of any sort, then they just don't know better), they are painting the worst possible picture of a very valuable consumer feature (multi-GPU/SLI/Crossfire), which shapes public opinion negatively toward it. Quite the opposite should have happened: consumers should be displeased with that decision, because they lost value, and at the same time its absence allows for higher price tags, since the competition from an SLI option when planning an upgrade is gone.

 

 


8 minutes ago, papajo said:

The multi-GPU rendering technology is not obsolete; it is cutting edge (it is one of the freshly baked features of DX12, the latest DirectX).

SLI & Crossfire are obsolete.

8 minutes ago, papajo said:

It had its issues, but those issues are merely a matter of optimization; there was no global/intrinsic problem with the use of two or more GPUs

No, they are not. SLI and Crossfire add at least one frame of additional latency; that is intrinsic. Uneven frame times are also intrinsic. Pumping out more frames is no longer the metric of a "good" gaming system. Even if SLI or Crossfire were optimized to the physical limit (2x faster), it would still feel like playing with a single GPU: unresponsive and laggy, just with smoother visuals (frame times might get worse -> microstuttering, or you have to buffer frames even more -> latency). It basically works like frame interpolation, but you need a second GPU (and all its disadvantages) for it.

 

For certain workloads you can get a huge performance uplift with two or even more GPUs, but the current "alternate frame rendering" approach simply doesn't benefit gaming. And that's the reason SLI and Crossfire are obsolete.
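The pacing argument above can be shown with a toy model (all timings below are invented for illustration, not measurements): even when alternate frame rendering nearly doubles the average frame rate, frames can arrive in alternating short/long intervals, which is exactly the microstuttering being described.

```python
RENDER_MS = 20.0     # assumed time one GPU needs to render a frame
CPU_SUBMIT_MS = 2.0  # assumed time the CPU needs to prepare/submit a frame

def frame_intervals(gpus, n_frames=8):
    """Simulate alternate frame rendering: frames go to GPUs round-robin,
    and the CPU submits the next frame as soon as the current one starts."""
    gpu_free = [0.0] * gpus   # when each GPU becomes idle
    submit = 0.0              # when the CPU can submit the next frame
    presents = []
    for i in range(n_frames):
        g = i % gpus
        start = max(submit, gpu_free[g])
        done = start + RENDER_MS
        gpu_free[g] = done
        submit = start + CPU_SUBMIT_MS
        presents.append(done)
    return [b - a for a, b in zip(presents, presents[1:])]

print(frame_intervals(1))  # [20.0, 20.0, ...]: even pacing
print(frame_intervals(2))  # [2.0, 18.0, 2.0, ...]: higher FPS, uneven gaps
```

The single-GPU run paces frames evenly at 20 ms; the two-GPU run delivers almost twice the frames, but in 2 ms / 18 ms bursts.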

 

 


1 minute ago, HenrySalayne said:

SLI & Crossfire are obsolete.

SLI was just discontinued on the gaming GPUs; the technology is still supported and developed for the workstation/creator market (so any GPU except "GeForce/GTX/RTX" supports it).

 

Crossfire, on the other hand, is not discontinued at all; only the brand name changed. DX11 games still get Crossfire support, while DX12 games just rely on the mGPU feature.

 

Also, please read my last post again, because I have edited it.


3 hours ago, papajo said:

I have no interest in supporting any conspiracy theory, but evidently, if it is not a conspiracy of any sort, then they just don't know better... painting the worst possible picture of a very valuable consumer feature (multi-GPU/SLI/Crossfire), which shapes public opinion negatively toward it, while quite the opposite should have happened: consumers should be displeased with that decision, because they lost value, and at the same time its absence allows for higher price tags, since the competition from an SLI option when planning an upgrade is gone.

You are not supporting conspiracy theories, you are creating new ones.

 

To be honest, if one of my friends recommended upgrading my PC with another identical GPU, I would either call him an idiot and/or revoke our friendship. Friends don't recommend Crossfire or SLI to friends. "Upgrading" to a dual-GPU setup is a waste of money.

 

 

3 hours ago, papajo said:

the tragedy for me is that YT channels like LTT go "hmm, yeah Nvidia, you're right to do that", and on top of that, either on purpose (so with malicious intent) or because they are ignorant

If you don't like the removal of SLI on newer Nvidia cards, don't buy one. But it's certainly not a tragedy that LTT points out what almost everyone already knew: SLI has been dead in the water for years; it introduces lag and stuttering, it's not widely supported, it's a waste of money, and it's simply unnecessary nowadays. And this opinion is certainly not pushed by Nvidia's marketing team, but by years of experience with SLI.


On 12/20/2020 at 5:24 PM, HenrySalayne said:

You are not supporting conspiracy theories, you are creating new ones.

How am I doing this? Because I leave it as an option? Fine: if it's not a conspiracy, as in they are benefiting directly or indirectly by doing this, then the other option is that they just don't know and failed miserably at what they did, which has the exact same result. E.g., as I just showed, the list of supported games is about 57 titles (again, that's without the games where one can manage to use two Nvidia GPUs via 3rd-party software or other patches/hacks), and they showed that there are like 6 or something... (I am not saying they should necessarily test all 57 titles, but they shouldn't just declare that the 6 icons on their screen are the entire supported list.)

 

So you either do such a thing on purpose or because you are a noob. Pick and choose; I don't want to create a conspiracy, I just tell it as it is.

 

 

On 12/20/2020 at 5:24 PM, HenrySalayne said:

If you don't like the removal of SLI on newer Nvidia cards, don't buy one.

I don't, but that's a silly thing to say. Sorry, m8, no offence, but it is... It's like saying "if your hand hurts a lot, then cut your hand off".

 

On 12/20/2020 at 5:24 PM, HenrySalayne said:

LTT points out what almost everyone already knew: SLI has been dead in the water for years

Which is exactly false as well, and your only basis for that is this video.

 

People have been trying to hack SLI ever since Nvidia started killing it (for their greedy reasons), starting with the Pascal generation, namely the GTX 1060 not having SLI fingers, which indicates that people were still interested in doing it (and of course in enabling SLI on non-SLI mobos, or with different GPUs, etc.).

 

https://github.com/EmberVulpix/DifferentSLIAuto

 

https://www.overclockers.co.uk/forums/threads/sli-with-no-bridge-different-cards-non-sli-mobo.18809948/

 

https://www.techpowerup.com/forums/threads/hypersli-enabling-sli-on-non-sli-motherboards.153046/page-127

 

So people trying to hack it means they know they want it, but have to find a way to get it since they are "officially" denied it!

 

And here you can see the reason: they simply outperformed a GTX 1080, a card that cost three times as much as a 1060... so that's why they stopped it.

 

 

 

 

But of course GN, despite having the results, still "recommends" not using SLI. I wonder why people who run big channels and have partnerships with Nvidia say that, while people like me (or the guys in the following videos), who have no such interest, find it a very cool feature and actually use SLI and Crossfire... Interesting.

 

Here you can see that, with limited support (only from DX12, no in-game optimization for it), two 1060s beat an RTX 2060 despite the absence of in-game optimization for dual graphics!

 

 

 

Here you can see a guy rocking modern titles with GTX 660 Tis!! (Yes, 660, not 1060.) Imagine if you had bought one 660 Ti all those years back (about $180 new; back then the prices, partially because of SLI as well, were GOOD: you could actually get a mid-tier card for that money, whereas now you need $300+ for the same tier!) and then you could buy another for like $20 now and still play some modern titles at solid FPS. That's why they dropped support for it!

 

 

 

 

Here you can check the scaling of higher-tier cards like a GTX 1080:

 

Why spend, e.g., 1000 euros back when the 2080 Super launched (or even now for a 3080) if you already had a 1080 back then and can find an additional one for like $200-300 second-hand, with scaling of 50% to 100%?
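For a rough feel of what those scaling numbers mean (the FPS and prices below are hypothetical, not benchmarks from the video or elsewhere):

```python
def sli_fps(single_fps, scaling):
    """FPS with a second identical card; scaling is the fraction of a full
    extra card's performance the game actually extracts (0.0 to 1.0)."""
    return single_fps * (1 + scaling)

base = 60.0                    # assumed single-GTX-1080 FPS in some title
print(sli_fps(base, 0.5))      # 50% scaling -> 90.0
print(sli_fps(base, 1.0))      # perfect scaling -> 120.0

# cost per extra frame for an assumed $250 second-hand card at 70% scaling
extra_fps = sli_fps(base, 0.7) - base
print(round(250 / extra_fps, 2))  # dollars per extra FPS
```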

 

[chart: GeForce GTX 1080 2-way SLI scaling at 3840x2160]

 

But no, let's drop support and leave only those 57+ titles, because we lose money on that while consumers are able to get more bang for their buck, and let's focus on RTX, which is partially present in a handful of games, because that way we can charge a premium and also undercut our competitor's (AMD's) GPU sales :P

 

Yeah, for us consumers, dropping support for SLI is the best thing that ever happened! Good riddance! :P Let's pay $1000 to $1500 to be able to play Cyberpunk at mediocre frame rates with RTX, for this one game!! :P But hey, Minecraft with RTX on a $1000 GPU plays flawlessly... almost...


@papajo

I already tried to explain to you why SLI FPS are not the same as single-GPU FPS. If you still think putting a second GPU into your system will magically turn an unplayable game into a joyful experience, you are wrong.

 

And to be fair, saying LTT is "benefiting directly or indirectly" or "is a noob" because they do not share your beliefs disqualifies you and your opinion from any further discussion. You are just trying to plant a seed of doubt with a big pile of nothing to back it up. And that is by definition a conspiracy theory.

 


On 12/20/2020 at 9:46 AM, papajo said:

is still supported and developed for the workstation/creator market (so any GPU except "GeForce/GTX/RTX" supports it)

What you're talking about has NOTHING to do with SLI. Software that supports multi-GPU doesn't require any bridge/link between the cards, solely the two cards connected through PCIe (or NVLink for faster communication).

 

The NVLink bridge adds some nice extra bandwidth for resource-intensive tasks that need to keep moving data between the VRAMs a lot, but in most cases it's really unnecessary when you have fewer than four cards, since the data is usually self-contained on each GPU and you just need to fetch the final results once.

 

I left that last part in bold because it's the main difference from the requirements of SLI, where you need lots of synchronization between the cards every 16 ms (or even less) to render stuff; that's just stupid and really hard to achieve, and programming for something like that is a pain.

 

tl;dr: SLI is not used WHATSOEVER outside of gaming scenarios and has almost nothing to do with the tech used for workstation/creator workloads.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


  • 3 weeks later...
On 12/21/2020 at 10:21 AM, papajo said:

How am I doing this? Because I leave it as an option? Fine: if it's not a conspiracy, as in they are benefiting directly or indirectly by doing this, then the other option is that they just don't know and failed miserably at what they did, which has the exact same result. [...]

You're wrong, though; the 3090 really only supports the games listed in this article: NVIDIA SLI Support Transitioning to Native Game Integrations | NVIDIA (custhelp.com)

 

On top of that, since January 1, even Turing isn't getting new SLI profiles.

So true it hurts

 

 

 

 


On 12/24/2020 at 2:41 AM, igormp said:

What you're talking about has NOTHING to do with SLI.

Potato, potahto... It's like saying AMD's SAM has nothing to do with Resizable BAR (from the PCIe spec).

 

It is just a branding difference (along with driver tweaks to lock/unlock the combined usage of the cards depending on how you flag it in the driver).

 

If it allows faster rendering by using both GPUs via a module that lets them communicate better/faster, then it is essentially SLI.

 

e.g 

 

 


6 hours ago, SirFlamenco said:

You're wrong, though; the 3090 really only supports the games listed in this article: NVIDIA SLI Support Transitioning to Native Game Integrations | NVIDIA (custhelp.com)

 

On top of that, since January 1, even Turing isn't getting new SLI profiles.

Your link doesn't contradict anything:

 

a) I was talking about current support.

 

b) It mentions that they will maintain support for the RTX 20-series cards that have SLI.

 

c) They say they will support SLI on the 3090 for games that support it natively (so all the games on the list I gave before are still included).

 

d) Even if they said "no SLI for the 3090", which is not something mentioned in your link, but just for argument's sake: then again, nothing changes, since I was referring to the support of current or past GPUs, not the future support of next-gen GPUs, which is exactly what grinds my gears, btw (the intentional killing, on future products, of a feature that saved consumers money and gave them additional options in terms of future-proofing/upgrade paths, for the sole reason of making Nvidia extra money by denying it).

 

It is a similar move to, e.g., the subway/train companies saying: "Hey, we know we have month/week-long passes that saved you money if you commute frequently, but you know what, we want more money, so... yeah... we will stop them, and you can only buy one-way tickets from now on, and on top of that, each ticket will be valid for a particular destination; even if the distances of two destinations are more or less the same, we will charge them differently :P" (the latter is somewhat of an analogy for Nvidia having 100 different models for sale, e.g., 1060 3GB, 1060 6GB, 1660, 1660 Ti, 1660 Super, and so on and so forth :P)


5 hours ago, papajo said:

Potato, potahto... It's like saying AMD's SAM has nothing to do with Resizable BAR (from the PCIe spec).

 

It is just a branding difference (along with driver tweaks to lock/unlock the combined usage of the cards depending on how you flag it in the driver).

 

If it allows faster rendering by using both GPUs via a module that lets them communicate better/faster, then it is essentially SLI.

 

e.g 

 

 

You're comparing apples to oranges. While AMD's SAM is just a brand name for Resizable BAR, SLI was an entire protocol that allowed point-to-point master-slave comms, along with the proprietary physical connector that linked those cards to increase throughput.

 

NVLink, on the other hand, is a proper network bus akin to PCIe, meaning you can use it in a more generic way, and even have it built into the CPU for GPU-CPU comms (as seen in POWER9 CPUs). For games, the SLI protocol is currently simulated over NVLink: the whole master-slave scheme runs as software over the NVLink bus, while any other application just sends and coordinates data over NVLink as if it were a regular PCIe bus, only way faster.

 

So I once again repeat what I said, what you're talking about has nothing to do with sli. When developing multi-GPU aware programs, you just have to split your data around how many devices you have and throw it at them, without needing to worry about synchronization or what card is on top. When developing an SLI application (as in games), you need to coordinate the pacing of your frames and how they're split between each GPU, and have a GPU set as a master that outputs the final frame, it's a whole mess to get things properly working.

Another clear difference of SLI and proper mGPU is that on SLI GPUs need to share their framebuffer, meaning that with 2x 4gb cards you only have 4gb of total available vram since those need to be in sync, while on any other mGPU application you can split your 20gb data over 5x 4gb GPUs and be done with it.

 

If you still insist on being ignorant and saying that 2 completely different technologies (both in physical implementation, protocol implementation and software development) are the same thing, then I guess there's nothing for me to add to this discussion.
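To make the mGPU point concrete: here's a minimal sketch of the "split your data and throw it at the devices" model described above. The device handling is a stand-in (no real GPU API is used); in an actual DX12/Vulkan/CUDA program each shard would be dispatched to a separate physical GPU.

```python
# Hypothetical sketch of explicit multi-GPU work splitting: each device
# gets its own shard and works independently, with no master/slave frame
# coordination. "Devices" here are simulated, not a real GPU API.

def split_across_devices(data, num_devices):
    """Divide the workload into one contiguous shard per device."""
    shard_size = -(-len(data) // num_devices)  # ceiling division
    return [data[i * shard_size:(i + 1) * shard_size]
            for i in range(num_devices)]

def process_on_device(device_id, shard):
    # In a real program this would be a kernel launch on device_id;
    # here we just square each element as placeholder work.
    return [x * x for x in shard]

data = list(range(10))
shards = split_across_devices(data, num_devices=2)
results = [process_on_device(i, s) for i, s in enumerate(shards)]
merged = [x for shard in results for x in shard]
print(merged)
```

Note there is no synchronization between shards and no designated master: that independence is exactly what distinguishes this model from SLI's alternate-frame rendering.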

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


@papajo

Watch this:

And in case you're too lazy to watch it:
To have no stuttering, you need both high FPS and consistently low frametimes.

SLI can push out high average FPS, but with uneven frame pacing, leading to microstuttering.
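A quick numerical illustration of that point (my own numbers, not from the video): two frame-time traces can have the same average FPS yet feel completely different, because the alternating short/long frames typical of alternate-frame rendering produce much worse worst-case frames.

```python
# Two 60-frame traces with identical average FPS but different pacing.
steady = [16.7] * 60        # ms per frame, perfectly even
stutter = [8.3, 25.1] * 30  # same total time, alternating short/long

def avg_fps(frametimes_ms):
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def worst_frame(frametimes_ms):
    return max(frametimes_ms)

print(round(avg_fps(steady)), round(avg_fps(stutter)))  # same average FPS
print(worst_frame(steady), worst_frame(stutter))        # very different pacing
```

An FPS counter would report both traces as ~60 FPS, but the second one spikes to 25 ms frames every other frame, which is the microstutter the video describes.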

On 1/13/2021 at 1:07 AM, papajo said:

the latter is an analogy for Nvidia having a hundred different models on sale, e.g. the 1060 3GB, 1060 6GB, 1660, 1660 Ti, 1660 Super, and so on

The more options you have, the more sales you make; consumers like options.

On 12/21/2020 at 9:21 AM, papajo said:

Pick and choose. I don't want to create a conspiracy; I just tell it as it is.

Uh, no.

You're creating this weird interpretation that has a few crumbs of truth, but the rest is made up by you to tell the story the way you want it told.

Nvidia is killing SLI in the consumer market because it's not worth it.



On 1/13/2021 at 2:07 AM, papajo said:

Your link doesn't contradict anything.

a) I was talking about the current support.

b) It mentions that they will maintain support for the RTX 20 series cards that have SLI.

c) They say they will support SLI on the 3090 in games that implement it natively (so all the games on the list I gave before are still included).

d) Even if they had said "No SLI for the 3090" (which is not something mentioned in your link, but for argument's sake), nothing would change, since I was referring to support for current or past GPUs, not future support on next-gen GPUs. That, by the way, is exactly what grinds my gears: the intentional killing, on future products, of a feature that saved the consumer money and gave them additional options in terms of future-proofing and upgrade paths, for the sole reason of denying it to make Nvidia extra money.

It is a similar move to, say, a subway/train company announcing: "Hey, we know our month- and week-long passes saved you money if you commute frequently, but we want more money, so... yeah... we're discontinuing them. You can only buy one-way tickets from now on, and on top of that each ticket will be valid for one particular destination; even if two destinations are roughly the same distance away, we will charge them differently. :P" (The latter is an analogy for Nvidia having a hundred different models on sale, e.g. the 1060 3GB, 1060 6GB, 1660, 1660 Ti, 1660 Super, and so on. :P)

You're very much wrong. The RTX 3090 is the only Ampere GPU to support SLI, and it isn't really SLI anyway; it's DX12/Vulkan mGPU over a bridge. Even then, only 13 games are supported, with just one released in the last year. Nvidia stopped creating new SLI profiles for Turing on the first of January, which means no one will code for it anymore. SLI is dead.

So true it hurts

 

 

 

 


15 hours ago, SirFlamenco said:

You're very much wrong. The RTX 3090 is the only Ampere GPU to support SLI, and it isn't really SLI anyway; it's DX12/Vulkan mGPU over a bridge. Even then, only 13 games are supported, with just one released in the last year. Nvidia stopped creating new SLI profiles for Turing on the first of January, which means no one will code for it anymore. SLI is dead.

Which, even if it holds true, doesn't make anything I mentioned above false.

But I doubt it holds true. I am pretty sure any of the games on the SLI list will work with two 3090s; I just can't confirm it because I have never had two 3090s. There is absolutely no technical reason for them not to work, unless Nvidia simply flags in the driver "if the GPUs are tagged as 3090s, don't work", which wouldn't surprise me given Nvidia's profiteering strategies, but I also think it is unlikely, because it would be a lame move even by Nvidia's standards.


9 hours ago, papajo said:

Which, even if it holds true, doesn't make anything I mentioned above false.

But I doubt it holds true. I am pretty sure any of the games on the SLI list will work with two 3090s; I just can't confirm it because I have never had two 3090s. There is absolutely no technical reason for them not to work, unless Nvidia simply flags in the driver "if the GPUs are tagged as 3090s, don't work", which wouldn't surprise me given Nvidia's profiteering strategies, but I also think it is unlikely, because it would be a lame move even by Nvidia's standards.

No, what you said is false. It simply doesn't support SLI, only mGPU over a bridge. SLI development for Turing stopped on the first of January.

