
Demystifying the max 2 GPU myth

Hi ! :)

 

Seeing as the amount of ignorance on this forum is bleach-worthy, I will demystify something for you today *Mindblown*: the 3- and 4-way SLI Pascal myth. Because YES, you can use it in games. Wonder how? Well, with this: 

It basically tricks the driver into thinking you are running a benchmark (Catzilla, I think, in this video) through Nvidia Inspector. It's really easy to do and takes only a minute. Scaling is really great in games, as shown here: 

And also: 

So that was all for me, hope you learned something new today ;) 

Edited by TheRandomness
Advertising is not allowed.

So true it hurts

 

 

 

 


2 minutes ago, Sauron said:

I'd love to see these people's faces if nvidia suddenly blocks the exploit in a new driver.

Well, it's not really an exploit and Nvidia isn't involved in it, so it should not happen. 


1 minute ago, rrubberr said:

Why, because they spent good money that would then be wasted?

Because NVOODIA WE TAKE AL UR DOLARS AND MUAHAHAHAHAHAHA

Main rig on profile

VAULT - File Server

Spoiler

Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C

Spoiler

Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)

Spoiler

Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


1 minute ago, Sauron said:

I'd love to see these people's faces if nvidia suddenly blocks the exploit in a new driver.

Why would they? If this somehow became "the new thing", Nvidia would just sell more GPUs and still wouldn't have to officially support 3- and 4-way SLI; win-win for them, imo.


Just now, SirFlamenco said:

Well, it's not really an exploit and Nvidia isn't involved in it, so it should not happen. 

It happened all the time with "hybrid PhysX": a cat-and-mouse game until the developer gave up. But of course, no one forces you to update the driver (especially if the update consists of breaking functionality). Same here; I would back up the current drivers just in case.


Wish 1060s would support SLI....

Ion (Main Build)                                                                                        Overall Setup

i5 6500 3.2 GHz                                                                     -Blue snowball (White) thanks goodwill

MSI Mortar Arctic                                                                   -Logitech K120

Asus 1060 6GB Dual                                                             -Logitech Daedalus Prime G302

PNY CS1311 120 GB                                                            -Mousepad I made in 1st grade with my name on it                                                 

WD Caviar Blue 1 TB                                                              

Crucial Ballistix Sport LT White 16GB (8x2GB) 2400

NZXT S340 White

Corsair CXM 450W 

 

Lenovo H320 (Old Pre-built PC)                                      Possible upgrade for H320          

i5 650 3.2 GHz (heh)                                                                                    Xeon X3470

Motherboard unknown                                                       Same Motherboard

iGPU                                                                                   GT 1030 (MSI Low Profile Half Height)

Crucial 240GB SSD                                                           Crucial 240GB SSD

6GB DDR3 (4+2GB)                                                           8-10GB DDR3 (4+2+2GB/4+4+2GB)

Lenovo H320 case                                                             Lenovo H320 case

Unknown PSU (210W?)                                                     Same PSU (210W?)    


Just now, SirFlamenco said:

Well, it's not really an exploit and Nvidia isn't involved in it, so it should not happen. 

nVidia is infamous for this sort of thing.

18 minutes ago, Jonathan Lemmens said:

Why would they? If this somehow became "the new thing", Nvidia would just sell more GPUs and still wouldn't have to officially support 3- and 4-way SLI; win-win for them, imo.

For the same reason they don't allow it officially in the first place.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


Just now, Sauron said:

For the same reason they don't allow it officially in the first place.

Because it's dumb and causes more issues than it solves anyways?

 

Because maintaining support for a feature less than 1% of users actually use is a waste of resources?

 

Because the new high bandwidth setup is cheaper to maintain?

 

Am I missing one? I don't feel any of these would encourage them to deal with this. If you do this and stuff breaks they can just go "Well we told you not to do it and that it's not supported."

 

 


4 minutes ago, Sauron said:

For the same reason they don't allow it officially in the first place.

They don't "don't allow it". They just don't want to support it, which is a major difference. 

 

It costs quite a lot to employ people to code drivers for this and validate it with their newer hardware etc. If they can cut this cost but still have people buying more cards for 3 and 4 way SLI, why wouldn't they?


46 minutes ago, SirFlamenco said:

Hi ! :)

 

Seeing as the amount of ignorance on this forum is bleach-worthy, I will demystify something for you today *Mindblown*: the 3- and 4-way SLI Pascal myth. Because YES, you can use it in games. Wonder how? Well, with this: 

This was discussed after the Pascal launch, I think. Anything more than two still works if you put in a little effort. 

 

Although trust some people to go "hur ba-dur novideo/ngreedia is bad"

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III


10 minutes ago, Jonathan Lemmens said:

They don't "don't allow it". They just don't want to support it, which is a major difference. 

 

It costs quite a lot to employ people to code drivers for this and validate it with their newer hardware etc. If they can cut this cost but still have people buying more cards for 3 and 4 way SLI, why wouldn't they?

It still puts a constraint on how much you can charge for higher-end cards. Single-card solutions have many advantages over multiple-GPU solutions, but not an infinite advantage: two lesser cards can be better than the highest-end card if you charge enough for the latter. Completely removing SLI from the map, on the other hand, breaks all arbitrage possibilities and allows for much less linear (more "convex") pricing, meaning charging arbitrarily higher prices for a given increase in performance.

I don't think this is very relevant for 3- and 4-GPU setups, though; it's more about eventually getting rid of all SLI, or all except on the highest-end model (for extremely high resolutions, and because there is no higher GPU to price against anyway).
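That pricing argument can be made concrete with a toy calculation. Every number below is made up purely for illustration, not real product data:

```python
# Hypothetical figures to illustrate the SLI "arbitrage" constraint on
# flagship pricing. None of these numbers are real.
mid_price, mid_perf = 450, 100     # a mid-range card: $450, 100 perf units
flagship_perf = 170                # flagship: less than 2x a mid-range card
sli_scaling = 0.9                  # assume 90% scaling for two cards in SLI

pair_perf = 2 * mid_perf * sli_scaling          # 180 perf units for $900

# The flagship can't cost more per perf unit than the SLI pair,
# or buyers would just grab two mid-range cards instead:
max_flagship_price = 2 * mid_price * (flagship_perf / pair_perf)
print(round(max_flagship_price))   # 850
```

With SLI gone, that $850 ceiling disappears and the flagship can be priced on its single-card merits alone.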


10 minutes ago, Sniperfox47 said:

Because it's dumb and causes more issues than it solves anyways?

 

Because maintaining support for a feature less than 1% of users actually use is a waste of resources?

 

Because the new high bandwidth setup is cheaper to maintain?

nVidia never seemed to care. They didn't care when they stopped official support for more than 2-way sli, they didn't care when they artificially blocked physx cards on amd systems, they didn't care when they blocked pcie passthrough, they didn't care when they decided you couldn't run their cards on less than an 8x pcie slot regardless of generation (meaning that you need 3.0 8x despite it being twice as fast as 2.0 8x), they didn't care when they blocked sli across different cards with the same die... you get the picture. If this catches on you'd better believe they'll block it.

10 minutes ago, Jonathan Lemmens said:

It costs quite a lot to employ people to code drivers for this and validate it with their newer hardware etc. If they can cut this cost but still have people buying more cards for 3 and 4 way SLI, why wouldn't they?

See above, they don't care what it costs. nVidia has always behaved like this, I see no reason why this particular instance would be different. The best you can hope for is that they won't notice.

13 minutes ago, Jonathan Lemmens said:

They don't "don't allow it". They just don't want to support it, which is a major difference. 

It just means they don't want you to do it, and you'll have no recourse if it suddenly stops working after a driver update.


7 minutes ago, Sauron said:

nVidia never seemed to care. They didn't care when they stopped official support for more than 2-way sli, they didn't care when they artificially blocked physx cards on amd systems, they didn't care when they blocked pcie passthrough, they didn't care when they decided you couldn't run their cards on less than an 8x pcie slot regardless of generation (meaning that you need 3.0 8x despite it being twice as fast as 2.0 8x), they didn't care when they blocked sli across different cards with the same die... you get the picture. If this catches on you'd better believe they'll block it.

 

See above, they don't care what it costs. nVidia has always behaved like this, I see no reason why this particular instance would be different. The best you can hope for is that they won't notice.

The common thread in this story is Nvidia wanting to make more money; I do not see how actively blocking 3- and 4-way SLI helps them achieve that. Blocking it completely would just be a dick move with no financial gain at the other end.

Also AMD still supports up to 4 way crossfire even on Vega. So Nvidia will be hesitant to ban it completely.

 

8 minutes ago, Sauron said:

It just means they don't want you to do it

When and where did they say this? 


12 minutes ago, Jonathan Lemmens said:

The common thread in this story is Nvidia wanting to make more money; I do not see how actively blocking 3- and 4-way SLI helps them achieve that. Blocking it completely would just be a dick move with no financial gain at the other end.

Also AMD still supports up to 4 way crossfire even on Vega. So Nvidia will be hesitant to ban it completely.

 

When and where did they say this? 

The fact they don't want you to do it is implicit in the lack of support... and as for wanting to make money, I don't see how they make more money from arbitrary pcie requirements, blocking vfio, etc either. They just look like dick moves to me. Of course the end goal is money, but what they think will make them more money doesn't always make sense... at least, not to you or me. They must be doing SOMETHING right given they lead the market by a large margin though.

 

AMD has been supporting all of these things for years, including lower bandwidth pcie, cross-gpu crossfire, and 4 way crossfire; they even suggested they might be looking into 5 way crossfire at some point. And yet, AMD is by far the underdog.

 

Either way, I don't claim to know the motives behind their behaviour - I just know that they have done things like this multiple times in the past, so I fully expect them to do it now. Maybe they will, maybe they won't, just don't be surprised if they do.


If nvidia were REALLY through with SLi then they wouldn't put profiles in nearly every Game Ready Driver update. 

 

Changelogs are full of SLi fixes for goofy setups and games. 


4 minutes ago, Sauron said:

I don't see how they make more money from arbitrary pcie requirements

I highly doubt it's arbitrary. It's likely due to electrical or signalling requirements between the cards. SLI still has a lot of legacy crud left over from Scan Line Interleave, and works substantially differently from crossfire.

 

5 minutes ago, Sauron said:

blocking vfio

Because they implemented this driver check at a time when they were pushing more and more for their GRID GPUs for remote solutions. The vast majority of users who used their GPUs with VMs were people that they could push their GRID GPU solutions to for waaaaaaaaaaay more money.

 

37 minutes ago, Sauron said:

artificially blocked physx cards on amd systems

This should be obvious... It discourages users from buying AMD systems, thereby giving their competition less money and resources to innovate with, hobbling them, and allowing Nvidia to stay on top and ship more units >.>


4 minutes ago, Sniperfox47 said:

I highly doubt it's arbitrary. It's likely due to electrical or signalling requirements between the cards. SLI still has a lot of legacy crud left over from Scan Line Interleave, and works substantially differently from crossfire.

 

Because they implemented this driver check at a time when they were pushing more and more for their GRID GPUs for remote solutions. The vast majority of users who used their GPUs with VMs were people that they could push their GRID GPU solutions to for waaaaaaaaaaay more money.

 

This should be obvious... It discourages users from buying AMD systems, thereby giving their competition less money and resources to innovate with, hobbling them, and allowing Nvidia to stay on top and ship more units >.>

1) Since pcie 3.0 4x and pcie 2.0 8x are identical in bandwidth, if the latter is supported there is no good reason the former shouldn't be. And I'm not talking about sli, I'm talking about single cards not working in a 4x slot even though the bandwidth is sufficient.

 

2) One could easily counter that point by arguing that a] if those customers just go to AMD (as people looking into vfio probably would) they not only won't be able to sell them grid, they won't be able to sell them a card either and b] to use grid you need a monster internet connection that most people simply don't have, so they're shutting out a significant portion of potential customers.

 

3) As for discouraging AMD buyers, facts have shown physx doesn't pull in customers. People didn't consider it at the moment of purchase and if they eventually thought it would be nice to have they wouldn't go out and sidegrade to another high end card just to get it.

 

Of course you can find a potential benefit to all of these, but then you can do it with this too - if you can't add another card or two to your setup when they're cheap and used you're forced to upgrade sooner. I would consider it a bad idea, but as I said I think the same of all other similar situations.
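For what it's worth, the bandwidth equivalence in point 1 is easy to check from the published per-lane rates (the figures below are approximate and account for the 8b/10b vs 128b/130b encoding overhead):

```python
# Approximate usable PCIe bandwidth per lane, in GB/s, one direction.
# PCIe 1.x/2.0 use 8b/10b encoding; PCIe 3.0 uses 128b/130b.
PER_LANE_GBPS = {
    "1.1": 0.25,   # 2.5 GT/s * (8/10) / 8 bits
    "2.0": 0.5,    # 5.0 GT/s * (8/10) / 8 bits
    "3.0": 0.985,  # 8.0 GT/s * (128/130) / 8 bits
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth("2.0", 8))   # 4.0
print(link_bandwidth("3.0", 4))   # ~3.94 -- effectively the same link
```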


Just now, Sauron said:

1) Since pcie 3.0 4x and pcie 2.0 8x are identical in bandwidth, if the latter is supported there is no good reason the former shouldn't be. And I'm not talking about sli, I'm talking about single cards not working in a 4x slot even though the bandwidth is sufficient.

Umm... I'm not sure where you're coming from there. Single cards run fine on 4x slots. Here's even a comparison with benchmarks of various games on a 1080 running at 4x, 8x, and 16x using PCIe 1.1, 2.0, and 3.0...:

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_PCI_Express_Scaling/

 

5 minutes ago, Sauron said:

2) One could easily counter that point by arguing that a] if those customers just go to AMD (as people looking into vfio probably would) they not only won't be able to sell them grid, they won't be able to sell them a card either and b] to use grid you need a monster internet connection that most people simply don't have, so they're shutting out a significant portion of potential customers.

Except there's not really any concern about remote workstation and cloud gaming companies going to AMD, or at least there wasn't in 2014.

 

Average household consumers were not their market and aren't the kinds of users who typically use gaming VMs.

 

8 minutes ago, Sauron said:

3) As for discouraging AMD buyers, facts have shown physx doesn't pull in customers. People didn't consider it at the moment of purchase and if they eventually thought it would be nice to have they wouldn't go out and sidegrade to another high end card just to get it.

You may not be wrong, but that's also not how marketing works. Even if I'm not buying a product for a specific feature, it having that feature will guide my decision. Even if I would never actually even use that feature. Human psychology is pretty easy to take advantage of.


I dug for like five minutes on the interwebs and came across this: https://www.pcper.com/news/Graphics-Cards/GeForce-GTX-1080-and-1070-3-Way-and-4-Way-SLI-will-not-be-enabled-games

 

The reason why NVIDIA doesn't really want to support more than 2-way SLI is:

Quote

As DX12 and Vulkan next-generation APIs become more prolific, the game developers will still have the ability to directly access more than two GeForce GTX 10-series GPUs, though I expect that be a very narrow window of games simply due to development costs and time.

 


 

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


On 2017-08-10 at 3:05 PM, DXMember said:

 

Yup, it shows how some people who are supposed to spread information are sometimes wrong. 


Pretty sure no one thought it couldn't be done; it just isn't worth the time and effort, seeing how old the chip is already.

Main Rig: Corsair Air 540, i7 9900K, ASUS ROG Maximus XI Hero, G.Skill Ripjaws 3600 32GB, 3090FE, EVGA 1000G5, Acer Nitro XZ3 2560 x 1440@240hz 

 

Spare Rig: Lian Li O11 AIR MINI, i7 4790K, Asus Maximus VI Extreme, G.Skill Ares 2400 32GB, EVGA 1080ti, 1080sc, 1070sc & 1060 SSC, EVGA 850GA, Acer KG251Q 1920x1080@240hz

 


7 hours ago, SirFlamenco said:

Yup, it shows how some people who are supposed to spread information are sometimes wrong. 

but you still can't run more than two GPUs for 4K because of:

NvidiaRecBridge.png

