
SLI & Crossfire in the Same PC at the Same Time??

To answer your question about what should happen...

 

What should happen is what's happening. Console exclusives are about locking down content so someone buys your product. They add no value to the consumer; they just nerf the competitor.

 

G-Sync and Mantle are GPU companies innovating and bringing out new features. To meet your expectation, they would have to look at a new feature and say, "Well, we can't do that because the other guy doesn't have it, and it would confuse the consumer." Then these new features would just not exist.


The hardware people can give us everything they can to make better products... but in the end, the only way to get the best game experience is for the game studio to use the latest technology given to them by AMD and Nvidia and program their product to work at its best within that technology. How hard would it be for the game developers to program their games for both Nvidia and AMD on one DVD?


I never really got much out of PhysX or CUDA. Sure, PhysX looks nice, but it's never really been used for anything other than small visual-flair stuff. For me, the choice was a 770 or an R9 290. I went big red for better general performance and OpenCL performance (which I think will replace CUDA in most applications in the not-too-distant future). TrueAudio, Mantle and HSA could be cool as well, but we'll have to wait and see on that stuff.

 

That being said, I do think the amount of hardware-specific crap out there is crazy. Especially when Nvidia does stuff like disabling the ability to use a second Nvidia GPU as a dedicated PhysX processor. There's no real reason for that other than politics. You are still buying their hardware, so why should they care?

Desktop: AMD Threadripper 1950X @ 4.1Ghz Enermax 360L  Gigabyte Aorus Extreme   Zotac 1080Ti AMP Extreme  BeQuiet! Dark Base Pro 900  EVGA SuperNova 1000w G2  LG 34GK950f & ASUS PA248Q Klipsch Reference/Audeze Mobius

 

Synology Wireless AC-2600

 

 

Laptop: Alienware 17R5   Intel i7 8750H  Nvidia GTX1080   3840x2160 4k AdobeRGB IGZO Display   32GB DDR4 2133   256GB+1TB NVMe SSD    1TB Seagate SSHD   Killer 1550 Dual-Band Wireless AC

 


Intel, AMD, and Nvidia + (some motherboard vendor) + Kingston should team up, sign a contract for one project, and all make one all-inclusive hell of a computer. Yes, a computer; not swappable, not upgradable, just one thing: AMD + Intel make the permanently soldered CPU, AMD + Nvidia make a new GPU die that sits on the motherboard, and the motherboard vendor makes the motherboard + Intel chipset;

and... you get the idea: just that one piece of hardware, with all Thunderbolt connections, makes the one board with all the features you want, and then it's free for people to develop for themselves.


Intel, AMD, and Nvidia + (some motherboard vendor) + Kingston should team up, sign a contract for one project, and all make one all-inclusive hell of a computer. Yes, a computer; not swappable, not upgradable, just one thing: AMD + Intel make the permanently soldered CPU, AMD + Nvidia make a new GPU die that sits on the motherboard, and the motherboard vendor makes the motherboard + Intel chipset;

and... you get the idea: just that one piece of hardware, with all Thunderbolt connections, makes the one board with all the features you want, and then it's free for people to develop for themselves.

 

that's sounding like a console idea....


Siiiiiick! Wish I had that kind of money to throw around..

CPU: i5 3570k                                 PSU: CX650 Corsair                 SSD: 128GB Samsung 840

RAM: 16GB Corsair Vengeance     Case: Zalman Z11                    HDD: 500Gb Toshiba & 1TB Western Digital

MOBO: ASrock Extreme 4 Z77        GPU: GTX 770


The technologies Nvidia and AMD are making are highly appreciated; however, I feel like they should be going with more open implementations.

 

Nvidia had to have known variable VBLANK (FreeSync) was potentially coming to the desktop implementation of DisplayPort in the next standard; if it hadn't made it into the spec, I'd feel better about them going off and doing their own thing. However, it will likely be there, as AMD will probably push hard for it. Where I give Nvidia credit is that once you have their card and a G-Sync monitor, it works with any game, as compared with AMD's technologies that must be implemented by the developer (but this is true of VBLANK/FreeSync as well).

 

Mantle finally brings game software closer to today's hardware realities. We can't keep making CPUs with faster and faster IPC, so we are going multicore. Games use two threads at best in most cases, and the ability to use eight and potentially more is fantastic. Also, CrossFire/SLI have been around for about 10 years, and we have needed a better solution for frame pacing (which is being addressed), FPS scaling, and compatibility. What I wish AMD had done was work with OpenGL to multithread better and split workloads across GPUs the way Mantle does, rather than make their own API that will only work on a few cards.
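For illustration, here is a minimal C++ sketch of the pattern a thin API like Mantle exposes: each CPU core records its own command list independently, and submission becomes one cheap final step instead of every draw call funnelling through a single driver thread. This is a toy model, not the Mantle API; all names in it are made up.

```cpp
// Toy sketch of multithreaded command recording -- the pattern thin APIs
// like Mantle expose. All names here are hypothetical, not a real API.
#include <iostream>
#include <thread>
#include <vector>

struct Command { int drawCall; };
using CommandList = std::vector<Command>;

// Each worker records into its own list, so no locking is needed.
CommandList recordChunk(int firstDraw, int count) {
    CommandList list;
    list.reserve(count);
    for (int i = 0; i < count; ++i)
        list.push_back({firstDraw + i});
    return list;
}

int main() {
    const int workers = 8;            // e.g. one thread per CPU core
    const int drawsPerWorker = 1000;

    std::vector<CommandList> lists(workers);
    std::vector<std::thread> threads;
    for (int w = 0; w < workers; ++w)
        threads.emplace_back([&lists, w, drawsPerWorker] {
            lists[w] = recordChunk(w * drawsPerWorker, drawsPerWorker);
        });
    for (auto& t : threads) t.join();

    // Single, cheap submission step -- analogous to queueing the
    // pre-built command buffers to the GPU in order.
    std::size_t submitted = 0;
    for (const auto& list : lists) submitted += list.size();
    std::cout << "Submitted " << submitted << " draw calls recorded on "
              << workers << " threads\n";
    return 0;
}
```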

 

TrueAudio, again, is a fantastic technology for a long-standing problem. We haven't seen anything happen with digital audio in many, many years, and we certainly have the technology to make a more immersive experience. Although, again, an open standard would be much appreciated. Even in their new series of cards, not all AMD cards have the chip needed, and only their cards and APUs will have it. Adoption will be scarce.

 

Put simply, closed ecosystems for software and certain technologies hold back adoption by developers, which means there is limited content that even supports this awesome thing you paid for. In the case of G-Sync, this means few choices of panels (likely to be all TN/120+ Hz/1080p for at least a year). Also, there are some games (Star Citizen) where you can't possibly get the full experience from a single configuration (you can't run AMD and Nvidia in the same game at the same time).

 

The question you might ask after all this is, "Well, how do they make money on their investment?" AMD can't keep cards in stock and their prices are through the roof because of their support for OpenCL. They also made TressFX, which can run on Intel, Nvidia, and AMD graphics chips. It happens to run better on AMD, of course, because they made it. If Nvidia had made PhysX a library using OpenCL, I'm sure their cards would do better than AMD cards at it (or would have for the first few generations). Open standards can still be profitable. For another example, look at Linux: it's in almost any integrated computer system, and it's the back end to Android and basically all enterprise servers.

CPU: 5820k 4.5Ghz 1.28v, RAM: 16GB Crucial 2400mhz, Motherboard: Evga X99 Micro, Graphics Card: GTX 780, Water Cooling: EK Acetal CPU/GPU blocks,


240mm Magicool slim rad, 280mm Alphacool rad, D5 Vario pump, 1/4 ID 3/4 OD tubing, Noctua Redux 140/120mm fans. PSU: Evga 750w G2 SSD: Samsung 850 Pro 256GB & Seagate SSHD 2TB Audio: Sennheiser HD558s, JBL! speakers, Fiio E10k DAC/Amp Monitor: Xstar DP2710LED @ 96hz (Korean Monitor) Case: Fractal Node 804


Woah, I never even thought of this; this is awesome!

"If you do not take your failures seriously you will continue to fail"


I thought AMD had a solution for G-Sync that was based on an open standard? And there were cards that could already support it once it's officially released?


I thought AMD had a solution for G-Sync that was based on an open standard? And there were cards that could already support it once it's officially released?

It's the variable VBLANK technology that will likely be in DisplayPort 1.4, I think? Whichever version is next.

All it requires is a driver update that tells the monitor to refresh alongside the frame being sent to it. Much like G-Sync, this should work with all games.
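To make the timing difference concrete, here is a rough C++ toy model (invented frame times, not real driver code) comparing when a finished frame reaches the screen with a fixed 60 Hz refresh versus a panel that refreshes the moment the frame is ready, which is the basic idea behind G-Sync and FreeSync/variable VBLANK.

```cpp
// Toy timing model: fixed 60 Hz refresh (v-synced) vs. refresh-on-demand.
// The frame times are made-up numbers purely for illustration.
#include <cmath>
#include <cstdio>

int main() {
    const double refreshPeriodMs = 1000.0 / 60.0;   // fixed 60 Hz tick
    const double frameTimesMs[] = {14.0, 18.0, 22.0, 15.0, 30.0};

    double gpuClock = 0.0;
    for (double ft : frameTimesMs) {
        gpuClock += ft;   // the frame is finished at this point in time
        // Fixed refresh: the frame waits for the next whole refresh tick.
        double fixedDisplay = std::ceil(gpuClock / refreshPeriodMs) * refreshPeriodMs;
        // Adaptive refresh: the panel scans out as soon as the frame is done.
        double adaptiveDisplay = gpuClock;
        std::printf("frame ready %6.1f ms | fixed refresh shows it at %6.1f ms | adaptive at %6.1f ms\n",
                    gpuClock, fixedDisplay, adaptiveDisplay);
    }
    return 0;
}
```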

CPU: 5820k 4.5Ghz 1.28v, RAM: 16GB Crucial 2400mhz, Motherboard: Evga X99 Micro, Graphics Card: GTX 780, Water Cooling: EK Acetal CPU/GPU blocks,


240mm Magicool slim rad, 280mm Alphacool rad, D5 Vario pump, 1/4 ID 3/4 OD tubing, Noctua Redux 140/120mm fans. PSU: Evga 750w G2 SSD: Samsung 850 Pro 256GB & Seagate SSHD 2TB Audio: Sennheiser HD558s, JBL! speakers, Fiio E10k DAC/Amp Monitor: Xstar DP2710LED @ 96hz (Korean Monitor) Case: Fractal Node 804


AMD can mine while the Nvidias render!

Desk: monitors 3x Asus VE248h(eyefinity), Keyboard Cm Strom Trigger(mx red), Mouse Corsair m65, Headset Audio Technica ATH-M50

Black Friday 2013 Build: i7 4770k, Gigabyte Z87X UD5H, 16gb Corsair, Msi R9 290, Corsair Axi 760, Corsair 750D, 2x intel 530 240gb ssd, 2x Seagate 400gb

Older Machine amd x640, msi 760g mobo, 8gb gskillz, Sapphire 6870, Corsair hx650, Cooler master haf 922, ocz agility 3 120gb ssd || HTPC: i7 3770k, shuttle xpc z77, 16gb gskillz, Asus GTX 650 ti, intel 120gb msata ssd


If PhysX was worth using, I'd totally buy a GTX 650 for an ultra high-end AMD system. Unfortunately, it isn't. Same for 3D and triple monitor (imo). Steam's game streaming is a much more versatile (and open!) system than GameStream for what I want.

For me, that leaves Mantle, G-Sync, and the normal stuff like drivers. My last AMD card was an HD 3450, so I don't know the quirks of CCC today. From what I've heard, you need Raptr for driver updates, and Raptr is clunky crap. GeForce Experience makes it easy to disable the crap and just get driver notifications.

Mantle is still a thing, as is G-Sync. I have no intention of buying a DisplayPort-only monitor that only runs on select GPUs with the correct drivers. I have enough issues getting my Intel RAID not to crash Linux whenever I use it. I will not be buying G-Sync monitors (or any derivatives) unless they work as reliably as a normal monitor.

Mantle might become common, but a GTX 780 or R9 290 is totally within my means, and I have a 1080p monitor. For now, I can just throw hardware at the problem and use a frame-rate limiter to get low tearing sans lag.
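For anyone curious what "throw hardware at it and cap the frame rate" looks like in practice, here is a minimal, generic frame-rate limiter sketch in C++. The 60 fps budget and the fake 5 ms of render work are placeholder values; real limiters usually busy-wait the last fraction of a millisecond for tighter pacing.

```cpp
// Minimal frame-rate limiter: do the frame's work, then sleep out the rest
// of the frame budget so frames are delivered at a steady cadence.
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    using Clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::microseconds(1000000 / 60);  // ~60 fps cap

    for (int frame = 0; frame < 10; ++frame) {
        const auto start = Clock::now();

        // Placeholder for the real game update + render work.
        std::this_thread::sleep_for(std::chrono::milliseconds(5));

        const auto elapsed = Clock::now() - start;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);

        const auto totalMs = std::chrono::duration_cast<std::chrono::milliseconds>(
            Clock::now() - start).count();
        std::cout << "frame " << frame << " took ~" << totalMs << " ms\n";
    }
    return 0;
}
```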


Hey Linus, great video. Like I said in my YouTube comment, I normally shop for hardware in my price range, and since Nvidia came out with the 700 series they have been too expensive. So after careful consideration, I will shop for AMD/ATI hardware for my gaming experience, because their price ranges are ones I can afford.


ok rant mode on

 

WTF was the point in this build??????????

 

I know you're busy - anyone running a media-based company in the tech industry is. I know it's not all fun and games, no job is.

 

But you decided that this build was worth the time over other things?

 

You had a 2 second slide of what benchmarks you received, which were pretty much irrelevant and useless. No sensible consumer, nor 99.9999% of your viewers, would care to consider an SLi + Crossfire build. Sure you shared your experiences on how it worked, but again - nobody really cares, because hardly anyone would. Those that would would be doing it regardless of what anybody told them, so they wouldn't need a video to show them how/why.

 

ok so maybe you wanted to make an argument about proprietary tech...

 

in what way did your actual build help strengthen your argument? all you did was just run some gameplay footage from the computer running said game, over a voiceover - that served NO purpose other than filler content

 

all I see is a bunch of wasted time in a pointless build to segue into an argument/debate point. you could have easily started this debate without wasting your company's time. nothing annoys me more than seeing someone always say "I'm so busy" and then seeing wasted time in what you've been doing with it

 

end rant mode

 

**all intended as constructive criticism


ok rant mode on

 

WTF was the point in this build??????????

 

I know you're busy - anyone running a media-based company in the tech industry is. I know it's not all fun and games, no job is.

 

But you decided that this build was worth the time over other things?

 

You had a 2 second slide of what benchmarks you received, which were pretty much irrelevant and useless. No sensible consumer, nor 99.9999% of your viewers, would care to consider an SLi + Crossfire build. Sure you shared your experiences on how it worked, but again - nobody really cares, because hardly anyone would. Those that would would be doing it regardless of what anybody told them, so they wouldn't need a video to show them how/why.

 

ok so maybe you wanted to make an argument about proprietary tech...

 

in what way did your actual build help strengthen your argument? all you did was just run some gameplay footage from the computer running said game, over a voiceover - that served NO purpose other than filler content

 

all I see is a bunch of wasted time in a pointless build to segue into an argument/debate point. you could have easily started this debate without wasting your company's time. nothing annoys me more than seeing someone always say "I'm so busy" and then seeing wasted time in what you've been doing with it

 

end rant mode

 

**all intended as constructive criticism

 

Your constructive criticism is duly noted, but I have some counter arguments.

 

If I created a "here's why proprietary tech is bad" lecture series, it might be factual and thorough, but realistically very few would watch it. 

 

If instead I put on a show and create a spectacle AROUND the message, 75,000 people watch it in the span of 16 hours. It's called a "hook".

 

This build was an excellent use of my time if the objective is to create a show that people want to watch while delivering messages that I feel are important, which is my job.


Awesome video, not really sure why you're advertising a book with a guy getting a BJ on the cover..?

"Her tsundere ratio is 8:2. So don't think you could see her dere side so easily."


Planning to make your debut here on the forums? Read Me First!


unofficial LTT Anime Club Heaven Society


I posted some of this on the video, but for the sake of avoiding YouTube and giving Linus my thoughts on the subject more directly, I'm posting here as well.

I don't really know what I expected out of this video, but I've said before that we're getting to the point where GPUs are becoming obsolete. They're big, and it feels like they're getting bigger, yet everything else seems to want to get smaller. I've been wondering what Nvidia would do if APUs were much more powerful. It seems to be a step in the right direction when thinking about the future of GPUs, but I can't think of a single solution to that problem. The best I can do is predict the death of GPUs and APUs becoming widely adopted, which would kill off the GeForce Experience benefits anyway.

With that being said, I will also add that I think it's far more beneficial to assume that AMD is actually taking the right route. If APUs get big, and it looks like they will right now, Nvidia will be eating dirt. A lot of the features of GeForce Experience are already being offered elsewhere. It's only a matter of time before AMD is the way to go, and eventually we won't have any more "X vs X" arguments. I've said it before and I'll say it again: the latest generation of consoles came at a bad time. People knew that a lot of what made the 360/PS3 unique was very achievable on PC (namely the online multiplayer and the controllers, but recording videos as well, for example).

Bringing it back to AMD versus NVidia to finish the post up, I stated in my 2nd paragraph that AMD is literally going to shut down NVidia in graphics unless NVidia changes. I'm talking like, super long-term, 20+ years from now type of stuff. People will still buy dedicated cards and build larger machines for a long while, mostly because of the amount of time it'll take for things to transition to Gigabyte Brix-sized computers (way of the future, the way of the future..). Small isn't exactly better, but if we can lower the thermal output on devices that are as strong as current-day computers at a significantly smaller size, there's absolutely no reason we won't see something the size of the average modem being the "gaming" or general entertainment standard. Like I said, we're getting most/all of the features in GeForce Experience for free in the future, at least at some point. Tack on all the new technologies we've seen in the last 6-8 months or so along with upcoming ones, and I see GPUs dying off so quickly. We need something smaller, more portable and modular, and an 11.5"x5.5"x3"* PCI-E graphics card is definitely not gonna work.

*Not sure if accurate; first time I've not known the imperial measurement of something rather than the metric, as an American...

 

Edited out much of the consoles stuff because I won't be responding to that. 

 

I think you bring up a good point though, except in saying that everything is getting smaller. Yes, technology is allowing us to build smaller computers. However, I don't think that will spell the death of the GPU. The APU is probably beneficial somewhere like a laptop in that it might be possible to make a laptop in an ultrabook-like form factor capable of gaming, but for a desktop form factor, I don't think so. 

 

First, let's think about who even needs a GPU; the truth is, not a lot of people. Gamers, people who use CUDA and OpenCL, and certain other niche applications such as heavy floating-point and integer workloads, sure, but what about MOST people who are buying a computer? Most people who buy a PC don't need a GPU at all. They use their computer for work, internet, documents, spreadsheets, movies, music, etc., something that current-generation Intel HD Graphics is more than capable of. Thus, the way I see it, GPUs are a niche product that only serves a certain segment of the market.

 

Let's look at it this way. At the moment AMD's APUs don't make much sense. Firstly, their CPU performance is woeful. The best Kaveri APU is almost the same price as an FX-8350 or a 4670K from Intel, yet offers nowhere near the CPU performance those other two chips offer. From a GPU point of view, the APU doesn't make any sense compared to a cheap CPU, e.g. a Celeron, combined with a 7790 or 7850, which, though a little more expensive, will offer far more FPS in games. So let's think about who actually benefits from an APU.

 

1) Gamers on a budget - no - it's better to go with cheap CPU + 7790 or 7850 (i.e. budget GPU + bargain CPU)

2) Everyday users (e.g. office users, grandparents...etc.) - no - because Intel's integrated graphics are more than sufficient for that and Intel has far superior CPU performance. 

 

That is the essential problem with APUs. They simply do not provide a worthy alternative to those two options. We've been talking about the "death" of the GPU ever since integrated graphics made their way onto the motherboard chipset over 10 years ago, and it still hasn't happened.

 

I think AMD is taking the right route too, if nothing else than to make Intel brush up on its integrated graphics. But Intel's integrated graphics are not what the old "GMA graphics" used to be; they are fully capable of a good computing experience provided you don't want to play games. And let's be real here: most people who buy a computer don't game and have no interest in it.

 

In response to needing something smaller, I disagree too. Computers have been the same size for many, many years. I'm not sure if you ever owned a computer in the 386 or 486 era, but they were the same size as most mid-towers are now. I think there is a reason for that: it's an ideal size. Let's not think of computer enthusiasts. Think about an average person who wants to buy a computer to use at home for things such as browsing the internet, typing out Word documents, playing the odd game or two, doing their spreadsheets, maybe some programming, just general usage. I would probably recommend an ATX mid-tower or mATX mini-tower, not mITX, not a full tower.

 

1) Going mITX sacrifices too much - e.g. expansion slots, what if they need a TV tuner card, or a WiFi card

2) mITX is more expensive - it's true, making things smaller makes them more expensive

3) Difficult to upgrade - think about how difficult it is to install a hard drive in some mITX cases, now think about how easy it is in an average mid-tower

4) mITX is still a boutique standard at the moment, it is there for people who want a small system for whatever reason, it doesn't make much sense for a general consumer yet. 

5) I wouldn't recommend full-tower as the extra space adds nothing and it is just far too big, also more expensive.

 

That's my reasoning anyway, that the ideal size for computers is not mITX or a small form factor and that the GPU will not die. 

My Personal Rig - AMD 3970X | ASUS sTRX4-Pro | RTX 2080 Super | 64GB Corsair Vengeance Pro RGB DDR4 | CoolerMaster H500P Mesh

My Wife's Rig - AMD 3900X | MSI B450I Gaming | 5500 XT 4GB | 32GB Corsair Vengeance LPX DDR4-3200 | Silverstone SG13 White


Your constructive criticism is duly noted, but I have some counter arguments.

 

If I created a "here's why proprietary tech is bad" lecture series, it might be factual and thorough, but realistically very few would watch it. 

 

If instead I put on a show and create a spectacle AROUND the message, 75,000 people watch it in the span of 16 hours. It's called a "hook".

 

This build was an excellent use of my time if the objective is to create a show that people want to watch while delivering messages that I feel are important, which is my job.

 

few people would watch factual content? like Techquickie? or the WAN Show? people like hearing what you have to say regardless of what it is. the show people want to watch is whatever you put up; that's why they're subscribed

 

considering you had a system right in front of you to do things like back-to-back comparisons of a game running with PhysX on and off, Mantle vs DirectX, TressFX on and off, the very things you opened the video with,

 

it would have been

- Informative (especially to NEW, non-subscribed viewers - someone new to tech wants to know the differences, now people would share this video and get you more views, and potentially more subs/forum discussion)

- Insightful comparison of said technologies

- spurred discussion (I just saw ___ vs ___ tech, I'm posting what I felt was better on this forum link)

- still entertaining

- strengthened your argument/point, which would spur more comments and discussion

 

instead, the build was just there to look at....cool?

 

 

This build was an excellent use of my time if the objective is to create a show that people want to watch

 

- Which people? Because, new viewers won't find this video of any importance to them and they'll just move on. Again, if it's your subscribers, they'll watch anything you put up - that's why they're subscribers

 

 

while delivering messages that I feel are important, which is my job.

 

- And it was merely a message. Not a discussion/debate stirring video - most of the people arguing about it are the same people that have been arguing about it regardless

 

my point being - potential was wasted in this build (or the content of the video around said build - same point either way)


I want Nvidia to pick up Mantle and TrueAudio (if that is possible hardware-wise). I want Nvidia to open up PhysX and G-Sync to AMD, and AMD to adopt PhysX and G-Sync. However, sometimes I do fear that if AMD just keeps opening up technologies to Nvidia and Nvidia doesn't open up their tech to AMD, people will be like, "Why would I buy an AMD card when I can buy an Nvidia card and have PhysX, Mantle, G-Sync and TrueAudio?" or vice versa. Is that something others worry about too?

CPU amd phenom ii x4 965 @ 3.4Ghz | Motherboard msi 970a-g46 | RAM 2x 4GB Team Elite | GPU XFX Radeon HD 7870 DD | Case NZXT Gamma Classic | HDD 750 GB Hitachi | PSU ocz modxstream pro 600w


LOL, if AMD and Nvidia see this, it's the battle of the GPU titans.

$$金Trill金$$


Actually it idles at 110W... Not half bad considering...

What's it like under load? 

Where do bad folks go when they die?
They don't go to heaven where the angels fly
They go to the lake of fire and fry, Won't see them again 'till the fourth of July


Your constructive criticism is duly noted, but I have some counter arguments.

 

If I created a "here's why proprietary tech is bad" lecture series, it might be factual and thorough, but realistically very few would watch it. 

 

If instead I put on a show and create a spectacle AROUND the message, 75,000 people watch it in the span of 16 hours. It's called a "hook".

 

This build was an excellent use of my time if the objective is to create a show that people want to watch while delivering messages that I feel are important, which is my job.

Mantle's future isn't proprietary, and neither are dynamic refresh rate technologies like FreeSync.

The argument is momentary and rather lopsided, since only one of the two companies maintains a closed ecosystem; the Nvidia driver conflict is in and of itself an example of how Nvidia continues to try to lock down its platform.

To be entirely honest with you, by the time Star Citizen actually launches, Mantle will be open to everybody and FreeSync monitors will be widely available.

The lower CPU overhead enabled by Mantle also means that CPU PhysX will become more viable, even though it is already pretty much viable right now.

There is also TressFX, which I expect AMD will keep pushing.


It's a fascinating situation.

Nvidia would probably support Mantle, because it would pretty much take away AMD's primary advantage while adding it to their own proprietary crap (notably, G-Sync).

Intel would benefit as well, since they are so invested in APUs currently, and it would allow their APUs to be actually usable for gaming.

 

The only one that stands to lose from opening up Mantle right now is AMD, which is why I really wonder if they're going to follow through on the promise.

Ipsa scientia potestas est


So wait, you made a Parallel Gaming+Mining rig?

 

It wouldn't work; all four cards would throttle: the AMD cards because mining heats them up, and the Nvidia cards because gaming would heat them up.

Ipsa scientia potestas est

