RTX 3090 SLI - We Tried so Hard to Love It

AlexTheGreatish
On 1/22/2021 at 10:22 PM, SirFlamenco said:

No, what you said is false. It simply doesn't support it, only mGPU over a bridge. SLI has not been developed for Turing since the first of January.

Quote a specific thing I said that is false.

On 12/19/2020 at 9:17 AM, papajo said:

SLI is dead, but not because it was a bad thing for us gamers; Nvidia killed it to deny people access to "cheaper performance" and force them to pony up more money for a higher-tier card.

 

  

This was most evident with the RTX 2070 Super, which permitted SLI. Depending on pricing at the time, two RTX 2070 Supers could be had for less than one RTX 2080 Ti and offered higher performance in games that scaled well with SLI. There need to be about half a dozen asterisks around that statement.

 

Still, Nvidia is cutting off one of the easier means of gaining raw performance without spending as much money.

 

I do think it is strategically a bad move to drop SLI support right now, as we're on the edge of a new generation of graphical effects hitting games alongside new display technologies hitting consumers (8K60, 4K144, HDR, etc.). No single GPU exists that can run a game like Cyberpunk 2077 at 8K60 with ray tracing and no DLSS. Would it be possible with, say, four RTX 3090s? Maybe, and for those with deep pockets, why stop end users from trying?

 

On 12/19/2020 at 9:17 AM, papajo said:

One of the biggest advantages of DX12 was that it allows multiple graphics cards (not even the same brand!) to run a game, yet almost nobody implements those features. Why? Because they are partners with the GPU manufacturers, and this wouldn't be good news for the ridiculous prices of the higher-tier models, which cost $1000+ apiece!

The mGPU dream is something that has puzzled me about developers. While the gaming market has shifted from Intel to AMD over the past few years, there are still plenty of Intel integrated GPUs that could be put to use for mundane tasks alongside a real gaming GPU. That would be the most popular mGPU setup, and one that already has a fairly sizeable installed base. Since developers have not pursued this despite that installed base, there are likely some huge engineering hurdles to getting it to work reliably in a way that provides a real benefit. Solutions here only sound easy.
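As a concrete illustration of what DX12 already exposes, here is a minimal sketch (assuming the Windows SDK, with dxgi.lib and d3d12.lib linked) of enumerating every adapter in the system, integrated GPU included, before deciding which one gets which work. The "zero dedicated video memory means integrated" heuristic and the idea of handing that adapter background work are assumptions for illustration; the actual offload policy would be up to the engine.

```cpp
// Minimal adapter-enumeration sketch for a DX12 explicit multi-adapter setup.
// Assumes Windows SDK headers; link against dxgi.lib and d3d12.lib.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // Passing nullptr for the device just tests whether D3D12 creation would succeed.
        bool supportsD3D12 = SUCCEEDED(D3D12CreateDevice(
            adapter.Get(), D3D_FEATURE_LEVEL_11_0, __uuidof(ID3D12Device), nullptr));

        // Heuristic (assumption): integrated GPUs typically report no dedicated
        // video memory. A real engine might use this, or vendor IDs, to pick a
        // secondary adapter for background work such as asset decompression.
        bool likelyIntegrated = desc.DedicatedVideoMemory == 0;

        wprintf(L"Adapter %u: %s  D3D12=%d  integrated(heuristic)=%d\n",
                i, desc.Description, supportsD3D12 ? 1 : 0, likelyIntegrated ? 1 : 0);
    }
    return 0;
}
```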

 

A hypothetical example that has come to mind is asset decompression. Moving that from the primary GPU to an Intel GPU should be straightforward, but the nuances of system architecture start to impede the raw gains. The theory is that the main GPU frees up some resources by shedding a small part of its own workload and thus gains a small overall boost proportional to the work freed up. But now more data needs to be transferred to the main GPU, as the assets arrive in their uncompressed form.

Granted, PCIe 3.0 x16 bandwidth still has plenty of headroom for more traffic under current workloads, but other data outside of these assets now has to wait its turn to reach the GPU. That added latency decreases performance. For a CPU-limited game, we're already at the point where performance would drop, since we've effectively increased the amount of idle time on the main GPU. This also ignores that the integrated GPU and the CPU share main memory bandwidth and various on-die caches, which can further slow the other data being sent to the main GPU. So while asset decompression in this hypothetical is faster, it decreases performance elsewhere and just makes the main GPU idle more.
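To put rough numbers on the bus-traffic half of that trade-off, here is a back-of-envelope sketch. The 2:1 compression ratio and ~16 GB/s of usable PCIe 3.0 x16 bandwidth are illustrative assumptions, not measurements, and it models only transfer time, not the latency, scheduling, or shared-memory-bandwidth effects described above.

```cpp
// Back-of-envelope comparison of PCIe transfer time when assets cross the bus
// compressed (decompressed on the main GPU) versus uncompressed (decompressed
// on the integrated GPU first). All numbers are illustrative assumptions.
#include <cstdio>

int main()
{
    const double pcie_gbps      = 16.0; // assumed usable PCIe 3.0 x16 bandwidth, GB/s
    const double asset_gb       = 1.0;  // uncompressed size of a streamed asset batch
    const double compress_ratio = 2.0;  // assumed compression ratio

    // Decompress on the main GPU: only the compressed bytes cross the bus.
    double t_main_gpu_ms = (asset_gb / compress_ratio) / pcie_gbps * 1000.0;

    // Decompress on the integrated GPU: uncompressed bytes cross the bus,
    // competing with everything else headed to the main GPU.
    double t_igpu_ms = asset_gb / pcie_gbps * 1000.0;

    printf("PCIe time, decompress on main GPU: %.1f ms\n", t_main_gpu_ms);
    printf("PCIe time, decompress on iGPU:     %.1f ms\n", t_igpu_ms);
    printf("Extra bus time per batch:          %.1f ms\n", t_igpu_ms - t_main_gpu_ms);
    return 0;
}
```

Even in this simplified view, offloading decompression roughly doubles the bus time spent on those assets, which is the extra traffic the rest of the frame's data has to queue behind.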

 

SLI has its dedicated bridge for data sharing, and CrossFire can have cards write directly to each other's memory without involving the CPU. Granted, these are vendor-specific solutions, but they worked well enough that developers were not burdened with the low-level issues they'd otherwise have to deal with. The less developers have to do to adopt a technology, the more likely it is to be adopted.
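For a feel of what those vendor solutions were hiding from developers, the same "write straight into the other card's memory" idea is exposed in compute APIs. Below is a minimal sketch in C++ against the CUDA runtime's peer-access calls; it is not the mechanism SLI's bridge or CrossFire's XDMA use internally, and it assumes two CUDA-capable GPUs in a system that supports peer access between them.

```cpp
// Minimal peer-to-peer GPU copy sketch using the CUDA runtime API (compile with nvcc).
#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    int can01 = 0, can10 = 0;
    cudaDeviceCanAccessPeer(&can01, 0, 1); // can device 0 access device 1's memory?
    cudaDeviceCanAccessPeer(&can10, 1, 0); // and the other direction?
    if (!can01 || !can10) {
        printf("These two GPUs cannot access each other's memory directly.\n");
        return 0;
    }

    const size_t bytes = 64 * 1024 * 1024;
    void *buf0 = nullptr, *buf1 = nullptr;

    cudaSetDevice(0);
    cudaMalloc(&buf0, bytes);
    cudaDeviceEnablePeerAccess(1, 0); // device 0 may map device 1's allocations

    cudaSetDevice(1);
    cudaMalloc(&buf1, bytes);
    cudaDeviceEnablePeerAccess(0, 0); // device 1 may map device 0's allocations

    // Copy straight from GPU 0's memory into GPU 1's memory; with peer access
    // enabled the driver can route this GPU-to-GPU rather than through host RAM.
    cudaMemcpyPeer(buf1, 1, buf0, 0, bytes);
    cudaDeviceSynchronize();

    printf("Peer-to-peer copy of %zu MiB issued.\n", bytes >> 20);

    cudaFree(buf1);
    cudaSetDevice(0);
    cudaFree(buf0);
    return 0;
}
```

When peer access isn't available, the runtime stages such copies through host memory instead, which is exactly the CPU round trip the vendor solutions let developers avoid thinking about.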
