
SLI & Crossfire in the Same PC at the Same Time??

I posted some of this on the video, but to avoid YouTube and give Linus my thoughts on the subject more directly, I'm posting here as well.

I don't really know what I expected out of this video, but I've said before that we're getting to the point where GPUs are becoming obsolete. They're big, they feel like they're getting bigger, and yet everything else seems to want to get smaller. I've been wondering what NVidia would do if APUs were much more powerful. APUs seem like a step in the right direction for the future of graphics, and I can't think of a single answer NVidia has to that problem. The best I can do is predict the death of discrete GPUs and the wide adoption of APUs, which would kill off the GeForce Experience benefits anyway.

With that being said, I'll also add that I think it's far more reasonable to assume AMD is actually taking the right route. If APUs get big, and right now it looks like they will, NVidia will be eating dirt. A lot of the features of GeForce Experience are already showing up elsewhere. It's only a matter of time before AMD is the obvious way to go, and eventually we won't have any more "X vs. X" arguments. I've said it before and I'll say it again: the latest generation of consoles came at a bad time. People knew that a lot of what made the 360/PS3 unique was already well covered on PC (online multiplayer and controllers especially, but video recording as well).

The PS4/XBO weren't unique or powerful enough to compete with the PC, which had a good, what, 7-8 years to develop even further? They failed to really take advantage of advances in technology across the board and really f---ed up on future-facing tech, which PCs are almost always prepared for. 4K/2160p isn't really out or popularized yet, but we can already use it. 1440p was practically skipped, and 1080p is child's play for a consumer-grade PC, let alone higher-end ones, which seem to be getting cheaper and easier to buy by the second. Peripherals are getting better, cheaper, and available from more companies, yet Microsoft wants to market its XBO-only controller (which I really wish were usable on PC; it's a very good controller) to console users for upwards of $60-70, if I remember correctly. Sony is a bit different in that respect, but I think their console launched with few features and far less prepared for the market. It only has games, and I've always favored quality over quantity, seeing as 90% of the general population won't buy tens or hundreds of games over their console's lifespan, unless you include trade-ins.

So that leaves us with PCs for the future, unless consoles somehow get really, really good and the next generation is released in 2016. If that happens, I expect a pointless price hike that most people won't want, while the PC will probably be set up for another big jump in sales (yet again). The Steam Machine/box/GabeCube is a weird thing, mostly because it's a console-price-point prebuilt aimed directly at gamers. That's actually a good thing, because it'll pull people out of the Mac/HP/Dell prebuilt market who only wanted to play some games (and really can't right now) on a PC. It'll also help with awareness of PC gaming, so I don't hate it or think it's a stupid idea, even though almost anyone can usually build their own computer for cheaper.

Bringing it back to AMD versus NVidia to finish the post: I said in my second paragraph that AMD is going to shut NVidia out of graphics unless NVidia changes. I'm talking super long-term, 20+-years-from-now type of stuff. People will still buy dedicated cards and build larger machines for a long while, mostly because of how long it will take for things to transition to Gigabyte Brix-sized computers (the way of the future, the way of the future...). Small isn't automatically better, but if we can lower the thermal output of devices as strong as current-day computers at a significantly smaller size, there's absolutely no reason we won't see something the size of the average modem become the "gaming" or general entertainment standard. Like I said, we're getting most or all of the GeForce Experience features for free eventually. Tack on all the new technologies we've seen in the last 6-8 months, along with upcoming ones, and I see GPUs dying off quickly. We need something smaller, more portable and more modular, and an 11.5" x 5.5" x 3"* PCI-E graphics card is definitely not going to work.

*Not sure if that's accurate; as an American, it's the first time I haven't known the imperial measurement of something rather than the metric...


Buy an R9 290X for Mantle support, then
add an NVidia card for PhysX; maybe just a 460, or even a 450, would be more than enough.

But I'm not sure about getting G-Sync; I think Mantle is more important than G-Sync, IMO.


OR CAN YOU???

 

#slickdishes

 

 

dat expression...

 

:lol: :lol: :lol:

 

 

back to the topic

 

dear linus

 

care to give us the full system part list?

 

and we need a step-by-step guide on how to get the system and drivers working

 

reminds me of the MSI Big Bang Fuzion P55, which supported two types of GPU


 


I think AMD is in the right here, as they're more willing to open-source their software to developers. They don't plan on making Mantle proprietary, and they'll allow Nvidia to use it, though we all know Nvidia will refuse to. On the other side, Nvidia's G-Sync is going to be exclusive to their GPUs and to whichever monitors end up available for gamers to use. Now I want you to think about this: I have a Korean 1440p monitor, and I can adjust its refresh rate with no special hardware; it works on AMD or Nvidia, and it's all software based. What's keeping that from being implemented so the monitor refreshes when the GPU pushes out the image? This is why I believe AMD's FreeSync may be real and could easily be implemented through software, with no extra hardware needed on monitors, not just mobile devices. Remember, hardware is only as strong as its software nowadays. What do you think? Defend your statement.
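To make the sync problem concrete, here's a rough sketch I threw together (my own illustration with made-up render times, not anything from AMD or Nvidia): with V-Sync on a fixed 60 Hz panel, a frame that misses a refresh waits for the next one, so display times snap to multiples of ~16.7 ms, while a variable-refresh panel would just show the frame when it's ready (down to its minimum refresh interval).

# Rough illustration of fixed-refresh V-Sync vs. variable refresh (assumed numbers, not vendor data).
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms per refresh on a fixed 60 Hz panel

def vsync_display_ms(render_ms):
    # With V-Sync on a fixed-refresh panel, a finished frame waits for the next refresh boundary.
    return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def variable_refresh_display_ms(render_ms):
    # A variable-refresh panel shows the frame when it's ready, but no faster than its max refresh.
    return max(render_ms, REFRESH_INTERVAL_MS)

for render_ms in (14.0, 18.0, 22.0, 30.0):   # made-up GPU render times
    print(f"render {render_ms:4.1f} ms -> fixed 60 Hz + V-Sync: {vsync_display_ms(render_ms):4.1f} ms, "
          f"variable refresh: {variable_refresh_display_ms(render_ms):4.1f} ms")

The point of the sketch: anything between ~17 ms and ~33 ms of render time gets held to 33.3 ms on a fixed 60 Hz panel with V-Sync, which is exactly the judder a variable-refresh scheme avoids.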


Hahaha oh Linus, you're like the only YouTuber with enough courage to pull this off.

i7 3770k @ 4.3GHz, Asus P8Z77-V LK, 16GB G.Skill 2133MHz, Gigabyte GTX 680 SOC @ 1306MHz Core, 7141MHz Memory, Cooler Master Storm Stryker, 256GB Sandisk SSD, 3TB Seagate HDD, Antec High Current Gamer 750W, Corsair H55 Liquid w/ Dual SP120s in Push/Pull

 


I'm thinking of it like this: Nvidia can use Mantle on their cards, since AMD has said anyone can use it freely. But can AMD have their own G-Sync? I think not without new monitors being released, while Nvidia's G-Sync monitors are around the corner. Sure, Mantle is free, but in the video you're talking about 780 Ti SLI and 290X CrossFire; if you can spend that much on GPUs, I believe you can spend enough to buy G-Sync monitors too. So, bottom line: if you're only upgrading the GPU, go AMD, because Mantle is free and AMD cards are a little cheaper than Nvidia's. If you're going for a big change, monitors and all, put in some extra $$$ and go Nvidia for G-Sync, and if Mantle turns out to be a huge deal, maybe Nvidia will support it later.


I posted some of this on the video, but to avoid YouTube and give Linus my thoughts on the subject more directly, I'm posting here as well. […]

 

Thanks for taking the time :)

 

I had already seen some of it on YouTube, but the forum is always better for discussion ;)


I posted some of this on the video, but to avoid YouTube and give Linus my thoughts on the subject more directly, I'm posting here as well. […]

I'd like to argue that Nvidia is not behind in their R&D for their own "APU" implementation. Look at their ARM processor technology, known to consumers as Tegra. ARM is undoubtedly the future in high-performance, scalable computing. Nvidia may just be marketing this incorrectly, or they may already have an enterprise-grade plan for their ARM chips. They're doing some very impressive things with GRID and their VDI implementations with VMware. Nvidia is more than ready to keep throwing punches at AMD.


There should be open-sourced technology sharing from each manufacturer. With the many board vendors still around, it would still be all about the graphics cards for some people, looks and aesthetics, but with all technologies available...

 

Some people these days, I think, don't even care about the tech; they're "just used to the control panel" or say "I know my way around Catalyst/GeForce Experience." Knowing one better than the other can influence buying decisions.

If both dominant players shared all their technologies through their control panels, enabled or disabled at the user's will, people would still buy the cards of their favorite manufacturer.

 

Take me: I'm red team at home and would love the Nvidia tech inside too, just like Nvidia folks would love it if Mantle or some other AMD tech could be enabled and working from their control panel. It'd be pretty sweet to mix and match to your own liking.

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


There should be open-sourced technology sharing from each manufacturer. […]

I completely agree, in the sense that the first vendor, AMD or Nvidia, to release their drivers and/or software as open source will be the one that "wins." If I could get both technologies on the same card... why would I not get that card? To be honest, Microsoft should maybe be working with AMD's Mantle to improve performance and get rid of the bottlenecks that exist.


It's (kind of) possible to get PhysX running while the game renders on an AMD card. It's been tested to work with various games that have PhysX (not a big list): Mafia II, the first two Batman Arkham games (haven't tried the third), Borderlands 2, and Alice: Madness Returns. The process relies on hacked custom .dll files, because Nvidia put code in their drivers to FORCE DISABLE PhysX in the presence of AMD cards for no good reason.
After thinking about it, there's not much that makes me doubt an SLI + CrossFire rig would work, since I have an AMD R9 290, an AMD 7950, and an Nvidia GT 620 in my computer. But experience shows the most common issue is one vendor not liking the presence of the other. They're party poopers.
I can imagine all technologies from each vendor working together, except G-Sync, Mantle, and adaptive V-Sync.
Maybe with a standard like Mantle being open source, we'll get somewhere, with AMD and Nvidia playing nice for ONCE and working together to make games better, not their pockets fatter.
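
If anyone wants to try a mixed-vendor box like this, a quick sanity check before fighting with .dll hacks is to make sure Windows actually sees both cards and both drivers. Rough sketch, assuming a Windows machine with wmic available (nothing here is vendor-specific):

# Quick check that Windows sees both an AMD and an NVIDIA adapter (assumes wmic is available).
import subprocess

out = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"],
    capture_output=True, text=True, check=True,
).stdout

adapters = [line.strip() for line in out.splitlines() if line.strip()][1:]  # skip the header row
print("Detected adapters:")
for adapter in adapters:
    print(" -", adapter)

names = " ".join(adapters).lower()
has_nvidia = "nvidia" in names or "geforce" in names
has_amd = "amd" in names or "radeon" in names
print("Both vendors detected." if has_nvidia and has_amd else "Only one vendor detected.")

If only one vendor shows up after both drivers are installed, that's usually the point where the install order (discussed further down the thread) starts to matter.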

 

Dat Power consumption though....Dayum...

 

Were you intending to create an oven linus 0_0

 

People are missing a detail of this setup. It's not like the game is rendering on all 4 GPUs at 100%; for every 2 GPUs rendering, the other 2 are idle, and idle power draw for each card is less than 10 W. That's the big flaw with a "have all the proprietary technologies!" computer: you have to choose which vendor you want to use on a game-by-game basis.

 

 

So wait, you made a Parallel Gaming+Mining rig?

In that situation the heat output would be high enough to throttle the Nvidia cards by at least a few percent; water cooling would be recommended for all the GPUs. Side note: when mining on non-primary cards, I get some FPS and mouse stutter even though the mining card isn't rendering. I haven't tracked down the source yet; it might be PCI-E saturation. I see a similar effect while recording screen content.


I wonder what would happen if either AMD or Nvidia went completely proprietary. Would they end up doing a Sony and start failing?


Not that this kind of setup hasn't been done before.

 

The drivers are one of the limiting factors for this setup; even if you finally get it working, you've spent a huge amount of effort.

 

nVidia and AMD drivers running together might cause system stability problems or simply refuse to work.

 

Ask how to run an nVidia card alongside an AMD card on either the AMD or nVidia forums and you'll get bashed by the fans.


 


I think AMD and Nvidia should make a driver that somehow lets PhysX, Mantle, TrueAudio, etc. work together... I know they won't, but still... why can't we dream about it? It would also stop 50% of the fights...!!! :P  :P  ;)


I don't know if Linus saw this, but there's now an article on PCPer.com about this video.

http://www.pcper.com/news/General-Tech/Linus-Brings-SLI-and-Crossfire-Together

check it out

@LinusTech

Motherboard - Gigabyte P67A-UD5 Processor - Intel Core i7-2600K RAM - G.Skill Ripjaws @1600 8GB Graphics Cards  - MSI and EVGA GeForce GTX 580 SLI PSU - Cooler Master Silent Pro 1,000w SSD - OCZ Vertex 3 120GB x2 HDD - WD Caviar Black 1TB Case - Corsair Obsidian 600D Audio - Asus Xonar DG


   Hail Sithis!


A $5,000 heater for your home, with some gaming in addition...


It's been mentioned before:

 

you can only use the AMD cards at any one time while the nVidia cards idle, and vice versa.

 

Not all 4 cards are running at the same time.

 

Power draw should be close to that of 2 cards at full load, which works out to roughly 700-800 W for the whole system (rough math below).

 

AMD cards with aftermarket coolers won't run as hot as the reference versions.
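
For a rough idea of where 700-800 W comes from, here's some back-of-the-envelope math; the load, idle, and rest-of-system figures are my own approximations, not measurements from the video:

# Back-of-the-envelope power estimate for the mixed rig (all figures are rough assumptions).
R9_290X_LOAD_W = 290      # approx. board power of a reference R9 290X under load
GTX_780TI_LOAD_W = 250    # approx. board power of a GTX 780 Ti under load
IDLE_PER_CARD_W = 10      # idle draw per unused card, per the earlier post
REST_OF_SYSTEM_W = 200    # CPU, motherboard, drives, fans, PSU losses -- guesswork

def total_draw(active_card_w):
    # Two cards of one vendor at full load, the other two idling.
    return 2 * active_card_w + 2 * IDLE_PER_CARD_W + REST_OF_SYSTEM_W

print("CrossFire pair active:", total_draw(R9_290X_LOAD_W), "W")    # ~800 W
print("SLI pair active:      ", total_draw(GTX_780TI_LOAD_W), "W")  # ~720 W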


 


I'm a bit of a PC noob. Could I have, like, a GTX 770 and an R9 280X in the same system?


I'm a bit of a PC noob. Could I have, like, a GTX 770 and an R9 280X in the same system?

Yeah, just make sure your PSU and MOBO will support it.



Yes, it can be done hardware-wise:

 

a motherboard with 2 x PCI-E x16 slots

 

and a PSU with at least 750 W.

 

The issue is more about the order in which you install the cards, followed by the order in which you install the drivers.


 


So I can run AMD and Nvidia GPUs together, but can I do it with no SLI??


Yes, as long as the mobo supports at least 2 GPUs.

 

Look for a board that's listed as SLI-capable or CrossFire-capable,

 

and use a PSU of at least 750 W.

 

 

Now, for the order in which to install the GPUs:

 

some say you need to install the nVidia card first and install the nVidia driver,

 

followed by the AMD card in the second PCI-E x16 slot and then the AMD drivers.

 

Correct me if I'm wrong on this.


 

