SLI & Crossfire in the Same PC at the Same Time??

I wonder what the results would be if we put an Nvidia Pro card with an AMD Pro card ~ possibly all the CUDA and OpenCL performance we'd ever need for quite a few years! xD


I wonder what the results would be if we put an Nvidia Pro card with an AMD Pro card ~ possibly all the CUDA and OpenCL performance we'd ever need for quite a few years! xD

You mean the professional cards, like Quadro and FirePro?

 

But those cards cost as much as the test rig.

Budget? Uses? Currency? Location? Operating System? Peripherals? Monitor? Use PCPartPicker wherever possible. 

Quote whom you're replying to, and set the option to follow your topics. Otherwise we can't see your reply.

 


That's it, I'm getting a PS4... just kidding :D. Still, this is really frustrating.

 


This is a great topic that speaks to the dilemma I think is causing a lot of "internet fights" all over forums everywhere. From a customer perspective, we don't want to miss out on key features when we buy a game just because the developer partnered with a different GPU vendor. We don't want performance to suffer (like BL2 or BF4 being not as smooth) because we don't have the magic card/API/frame pacing, etc. I absolutely agree with the sentiment: I don't want to have to choose between G-Sync/streaming/PhysX and Mantle/TrueAudio at all. I want to choose based on other criteria and have games run on all platforms equally well, within reason. But we all realise that's impossible; certain graphical features (tessellation, particular FX) will perform better on one card than another, and which wins out often depends on those differences.

 

Most innovation comes from companies doing something on their own. I suspect Nvidia spoke to other companies about replacing V-Sync and initially got a cold response: the cost of changing the interface was incredibly high, and the benefits were unknown and likely underplayed. So they did their own thing, designed their own technology and brought it to market. Then everyone got to see how great it was, and now AMD is interested. Many monitor companies still aren't on board with G-Sync, yet they have had plenty of time to contact Nvidia and look at how they could implement it. Nvidia is being silly, though, by requiring their module and insisting they do the work of putting it into a monitor; that solution is never going to scale to the industry. AMD, on the other hand, wants to do it via the DisplayPort standard. This is both great and terrible: Nvidia's solution seems to be more complicated than just the dynamic refresh rate used in eDP, and presumably they need the polling and other features (the ultra-low-latency mode, for example, will be great in competitive games). The standards-based approach puts support back with the monitor companies, and they will keep treating it as niche until G-Sync sells enough to prove the market; then they will work with a standard and implement it.

 

This is the essence of the problem. Getting a new standard like G-Sync or Mantle established requires enormous cost that the industry as a whole is not willing to incur until it sees demand. Standards form from a few competing solutions, either with one winning out or with the best of both being combined; in the interim we are always left with these competing solutions doing different things. A great example is AMD Eyefinity: Nvidia took a while, but they came up with their own Surround implementation, and the two are equivalent now. In the end, really useful technology gets adopted by both companies. I think the challenge this round is that both vendors have innovated technologies we all want; it's not just one of them making clearly better progress than the other. We should all be frustrated with the choice, but these companies have to be allowed to innovate, otherwise you get the same thing as the consoles: no innovation, just more of the same a little faster, mostly designed to be cheap. I personally don't want to have to choose, but I'm being forced to, and it's really hard considering the trade-offs in performance/noise/frame pacing, etc. that also define the differences between these cards. I get the feeling it's more divergent now than I can ever remember with GPUs, which is both good and bad.


It would be awesome if you could do this and have Mantle and G-Sync running at the same time!

Cooler Master 690 II Advanced - i5 2500K @ 4.5GHz - GeIL Enhance Plus 1750MHz 8GB - ASUS ENGTX 570 DCUII - MSI Z68A-GD65 - Scythe Mugen 2 Rev. B - Samsung 830 128GB and Crucial M4 128GB - much other stuff not worth mentioning


Fall for Me... Best... Novel... Ever!!!!!

"Wibbly Wobbly Timey Wimey".... Yeah

-The Doctor


Back to the real discussion: I hope Nvidia uses Mantle one day AND it gets opened up for everyone.

"Wibbly Wobbly Timey Wimey".... Yeah

-The Doctor


I think Nvidia and AMD need to put this nuclear-arms-race silliness aside and come together to create a standardised solution to the DirectX problem. If Nvidia ever decides the correct option is to create a counter to Mantle, rest assured the world as we know it will end.


AMD and Nvidia should merge and build only one lineup of GPUs, making the consumer pay to unlock features like G-Sync, Mantle and so on. This would be the best solution imo, but also probably the most difficult to do. :D

who cares...


My question is: CAN you run an OpenCL render on the AMD GPUs in the background while the monitor is hooked up to the Nvidia cards and you're playing a game?

Or rather, why not just have two monitors, one for each pair of cards?
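
For what it's worth, OpenCL lets you pin compute work to a specific device regardless of which card drives the display, so in principle the background render is just a matter of device selection. A minimal sketch using pyopencl (the "AMD" platform-name match is an assumption; use whatever names your drivers actually report):

import pyopencl as cl

# List every OpenCL platform/device the installed drivers expose
# (both vendors show up if both drivers coexist).
for plat in cl.get_platforms():
    for dev in plat.get_devices():
        print(plat.name, "->", dev.name)

# Build the compute context from the AMD GPUs only, leaving the
# Nvidia cards free to drive the display. The "AMD" substring is
# a guess -- adjust it to match the listing printed above.
amd_gpus = [dev
            for plat in cl.get_platforms() if "AMD" in plat.name
            for dev in plat.get_devices(device_type=cl.device_type.GPU)]

ctx = cl.Context(devices=amd_gpus)
queue = cl.CommandQueue(ctx)  # enqueue the render kernels here

Kernels enqueued on that queue run on the AMD silicon no matter where the monitor is plugged in; the practical hurdle is getting both vendors' drivers to coexist cleanly, which is the same question the video pokes at.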

TekSyndicate Forum Moderator: https://forum.teksyndicate.com/users/njm1112

5930K @ 4.3GHz | 16GB 4x4 2133MHz | R9 290 | 6TB RAID 5 | 250GB 850 Evo | 8.1 Pro | RM750


Seriously though, Nvidia and AMD need to suck it up and make a better experience for the gamer, not just for the select people who decide to go Green or Red team. I think Nvidia should take the Mantle idea, if and when AMD allows, and use it to its full extent for the gamers, and Nvidia should also open up G-Sync to AMD platforms. Not fully open, but allow people who have AMD to use their tech; then again, AMD would have to get on board with this one as well. All I'm trying to say is, these companies are hurting us more than helping. Like Linus said in this video, people who don't know are buying these cards hoping to get all the features, but some, like PhysX, aren't even supported widespread, i.e. in every game, and only through Nvidia.

 

Message to Nvidia and AMD: if you want gamers to be happy, give us a reason to be. Make it so that everyone gets the same exciting game experience and all its features regardless of the card they choose. Make it easier for those who want that console experience, i.e. plug and play with no problems, by joining sides for once (I'm not asking for a merger or anything close to that) and allowing those on each team to experience an amazing title with all it has to offer, as seen through the eyes of the game's creator.


I wonder what the temps are, especially on that second-slot 290X.

 

As consumers, we would of course like to see all the technologies in one card. But then there would be nearly no competition besides raw performance.

For those who'd like nVidia to buy AMD's graphics department or a third party to buy up both, remember that a monopoly is the worst outcome for consumers and the market.

5.1GHz 4770k

My Specs

Intel i7-4770K @ 4.7GHz | Corsair H105 w/ SP120 | Asus Gene VI | 32GB Corsair Vengeance LP | 2x GTX 780Ti| Corsair 750D | OCZ Agility 3 | Samsung 840/850 | Sandisk SSD | 3TB WD RED | Seagate Barracuda 2TB | Corsair RM850 | ASUS PB278Q | SyncMaster 2370HD | SyncMaster P2450

Wait. Can I put an AMD card in as primary and hook the monitor up to it, then have an NVIDIA card not hooked up to anything running in the background as a dedicated PhysX card? So I can have a main AMD card with PhysX? Or am I talking bullshit?


Wait. Can I put an AMD card in as primary and hook the monitor up to it, then have an NVIDIA card not hooked up to anything running in the background as a dedicated PhysX card? So I can have a main AMD card with PhysX? Or am I talking bullshit?

Yep, just search Hybrid PhysX, or go back a couple of pages and watch the video I linked.


Seriously though, Nvidia and AMD need to suck it up and make a better experience for the gamer, not just for the select people who decide to go Green or Red team. I think Nvidia should take the Mantle idea, if and when AMD allows, and use it to its full extent for the gamers, and Nvidia should also open up G-Sync to AMD platforms. Not fully open, but allow people who have AMD to use their tech; then again, AMD would have to get on board with this one as well. All I'm trying to say is, these companies are hurting us more than helping. Like Linus said in this video, people who don't know are buying these cards hoping to get all the features, but some, like PhysX, aren't even supported widespread, i.e. in every game, and only through Nvidia.

 

Message to Nvidia and AMD: if you want gamers to be happy, give us a reason to be. Make it so that everyone gets the same exciting game experience and all its features regardless of the card they choose. Make it easier for those who want that console experience, i.e. plug and play with no problems, by joining sides for once (I'm not asking for a merger or anything close to that) and allowing those on each team to experience an amazing title with all it has to offer, as seen through the eyes of the game's creator.

GG man, that text coloring, but I can sympathize with the feeling.

 

Personally, as long as AMD can support themselves while making a transition like developing a low-level API, I'm all for it. It just concerns me that, in order to develop this new API, something else had to give, or some segment of their product lines is going to suffer because of it.

 

I mean, honestly, only Intel really has the money to just throw R&D at an issue they want fixed without many concerns over the allocation of limited assets. And taking the market as a whole into consideration, discrete GPU sales must make up just a tiny fraction of their overall revenue, so I can't fault them for not pushing the limits there. I guess what I'm trying to say is that you have to consider AMD as a competitor to Intel first and Nvidia second when you look at where their market priorities lie.

 

So what this could be is AMD fleshing out an idea through their discrete GPU market, to bring to their integrated chips at a later point for business purposes. I realize Mantle affects their APUs as it currently stands, but how many businesses are going to accept the software in its current state? Consider the possibilities for real-world applications, such as highly detailed billboards running on simple chips, where Mantle makes the performance difference that sways the choice of a conglomerate to AMD over Intel.

 

In the short term, I feel this will hurt hardware development for AMD's discrete GPU division, simply because they don't have unlimited resources to throw at the issue. But in the long term, as was mentioned previously in this thread, if AMD can flesh out this API enough to make it deployable en masse to their corporate customers, we may see a paradigm shift within the computing space.


@LinusTech, what was the purpose of this video? To send a message to NVIDIA and AMD? Or to say you can enjoy both if you spend a fuck ton of money?


I think AMD and Nvidia should license the tech to each other cheaply, or call it even and let each other use Mantle, G-Sync and PhysX. If they want some advantage, maybe that tech stays locked in the game for a month for users of the other team?


On one of the views of the cards, my friend thought it was an engine...

Main rig on profile

VAULT - File Server


Intel Core i5 11400 w/ Shadow Rock LP, 2x16GB SP GAMING 3200MHz CL16, ASUS PRIME Z590-A, 2x LSI 9211-8i, Fractal Define 7, 256GB Team MP33, 3x 6TB WD Red Pro (general storage), 3x 1TB Seagate Barracuda (dumping ground), 3x 8TB WD White-Label (Plex) (all 3 arrays in their respective Windows Parity storage spaces), Corsair RM750x, Windows 11 Education

Sleeper HP Pavilion A6137C


Intel Core i7 6700K @ 4.4GHz, 4x8GB G.SKILL Ares 1800MHz CL10, ASUS Z170M-E D3, 128GB Team MP33, 1TB Seagate Barracuda, 320GB Samsung Spinpoint (for video capture), MSI GTX 970 100ME, EVGA 650G1, Windows 10 Pro

Mac Mini (Late 2020)


Apple M1, 8GB RAM, 256GB, macOS Sonoma

Consoles: Softmodded 1.4 Xbox w/ 500GB HDD, Xbox 360 Elite 120GB Falcon, XB1X w/2TB MX500, Xbox Series X, PS1 1001, PS2 Slim 70000 w/ FreeMcBoot, PS4 Pro 7015B 1TB (retired), PS5 Digital, Nintendo Switch OLED, Nintendo Wii RVL-001 (black)


Sorry, which motherboard is he using?

CPU: Intel Core i7-4790k | CPU Cooler: Noctua NH-D15 | Motherboard: ASUS Sabertooth Z97 MARK 1 | Memory: Kingston HyperX FURY 16GB 1866MHz | GPU: Gigabyte GeForce GTX 770 4GB Windforce


Storage: Samsung 840 EVO | PSU: CM Silent Pro 720W | Case: Phanteks Enthoo Luxe | Headset: Corsair Vengeance 2100 | Keyboard: Logitech G710+ | Mouse: Razer DeathAdder Chroma


"You see, one can only be angry with those he respects." - R. Nixon


The best part of this video was the endorsement: leave it to Linus to find a porn book to plug your service. Nicely done, sir! :D

-------

Current Rig

-------


I'd like to argue that Nvidia is not behind in their R&D for their "APU" implementation. Look at their ARM processor technology, known to consumers as Tegra. ARM is undoubtedly the future of high-performance scalable computing. Nvidia may just be marketing this incorrectly, or they may already have an enterprise-grade idea for the use of their ARM processor. They are doing some very impressive things with GRID and their VDI implementations with VMware. Nvidia is more than ready to keep throwing punches at AMD.

I hope so, because competition creates innovation.

I'm not all that informed about ARM processors; the Tegra K1 looks neat at a glance, though. I still think AMD has the upper hand in compact PC hardware right now, just because Intel isn't putting forward what I consider adequate technology to advance in that market. Intel isn't as strong in the graphics processor market, but I think they're a far bigger competitor for desktops long-term than Nvidia will be. I almost want to say AMD isn't going to go far in the mobile market because of Nvidia, as well, but I find mobile devices to be very wishy-washy right now. Their future is kind of unstable to me because I really don't know what to expect besides more powerful phones and tablets than we have now. Google Glass-type technology (such as the Pebble watch or whatever) is about as crazy as it gets for me when it comes to what might replace phones and the like. Walking peripherals, if you will.

Not sure if I said this in my last post, but I was implying that AMD may be closer to some form of all-in-one desktop (or support for one) than anyone else. That's why I'm optimistic about APUs at all. I'm tired of ATX motherboards and long, large graphics cards. I want to see mini-ITX or smaller become the norm, with different ways to integrate RAM into a system and no bulk on the board itself.

External stuff like Thunderbolt is a step in the right direction, I think. It can take a little more away from motherboards while still giving large benefits. I think the biggest limitation right now is storage, and how expensive it is, both in actual capacity and in the physical size of the HDDs/SSDs you need for storing multiple TBs of data.


 

Their future is kind of unstable to me because I really don't know what to expect besides more powerful phones and tablets than we have now

 

I'm sorry, but it's anything but. I can see how you (and others, of course) arrive at that conclusion, because right now all that hardware is barely taken advantage of save for very few games. But we're moving into mobile so fast that it will eventually lead to games being released for "Steam", "Xbox Live" and "PS+" as the platforms, while the hardware would be phones, smart TVs, tablets, etc.

You might think I'm looking too far into the future, but at the rate we're improving I don't think it's that far off, maybe a decade or so.

-------

Current Rig

-------


...wait, Linus, what about FreeSync? I was really debating between a GTX 770 and an R9 280X, because the 770 has G-Sync while the 280X has Mantle (I wanna play BF4). But when I heard about FreeSync, I was like "Nvidia just lost this". To many people, isn't that the "ultimate compromise"? Nice video tho...


I hope so, because competition creates innovation.

I'm not all that informed about ARM processors; the Tegra K1 looks neat at a glance, though. I still think AMD has the upper hand in compact PC hardware right now, just because Intel isn't putting forward what I consider adequate technology to advance in that market. Intel isn't as strong in the graphics processor market, but I think they're a far bigger competitor for desktops long-term than Nvidia will be. I almost want to say AMD isn't going to go far in the mobile market because of Nvidia, as well, but I find mobile devices to be very wishy-washy right now. Their future is kind of unstable to me because I really don't know what to expect besides more powerful phones and tablets than we have now. Google Glass-type technology (such as the Pebble watch or whatever) is about as crazy as it gets for me when it comes to what might replace phones and the like. Walking peripherals, if you will.

Not sure if I said this in my last post, but I was implying that AMD may be closer to some form of all-in-one desktop (or support for one) than anyone else. That's why I'm optimistic about APUs at all. I'm tired of ATX motherboards and long, large graphics cards. I want to see mini-ITX or smaller become the norm, with different ways to integrate RAM into a system and no bulk on the board itself.

External stuff like Thunderbolt is a step in the right direction, I think. It can take a little more away from motherboards while still giving large benefits. I think the biggest limitation right now is storage, and how expensive it is, both in actual capacity and in the physical size of the HDDs/SSDs you need for storing multiple TBs of data.

 

I agree completely with your statement regarding competition creating innovation; all this talk about nVidia opening up their doors, or AMD and nVidia becoming some big conglomeration, is wishful thinking.

 

Take, for example, Titan. The topic has been beaten to death whether it was supposed to be the GTX 780 or whether the Titan release was planned all along. I believe the general consensus is that (a) it would have been the GTX 780, and (b) the only reason it isn't is that AMD dropped the ball.

It took AMD almost 10 months to play catch-up. My point here is that if nVidia had had competition at the time, we wouldn't have had a half-assed launch of the GK110 that raped and pillaged the consumer, and we might have had Maxwell a bit sooner. That's what happens when there is no competition.

 

Beyond this, the real issue is these technologies themselves. Yes, they're great, and yes, the consumer wins in the end. However, the vendors are using these technologies to differentiate themselves instead of producing a better product: TressFX, PhysX, Mantle, G-Sync. All of them are great in their own regard, but they could theoretically be done by either vendor. In the end it's about creating a marketing difference that's tangible to the layman; instead of looking at numbers and benchmarks, the shopper sees, for example, TressFX or PhysX and says "wow, the hair looks more real with this card" or "the physics with this card are super realistic".

 

In the end, when we shop for a card, what does it boil down to? Performance. If one card strictly performs better in that price range, you're going to get it. You're not going to worry about pretty hair and realistic particles if you're getting 30% fewer frames.

 

In regard to APUs and such being the future of computing, I think we're forgetting about things like GRID and Shield; I believe that will be the future of high-powered computing. One computer to rule them all, so to speak. I think nVidia releasing Shield is an attempt to get us to warm up to the idea of this kind of cloud computing. If it becomes a seamless enough experience, you can have amazing performance on a pretty, slim laptop while the real computing is done off-site, whether in your home on a personal grid or thousands of km away.

