DirectX12 AMD and Nvidia Multi-GPU Configurations Confirmed

CM2546

@WhiteSkyMage I think this (although a little sketchy) makes that seem a lot more possible, as SFR in Mantle has VRAM stacking.
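For anyone wondering what SFR actually changes, here's a rough sketch (hypothetical code, assuming a DX12-style linked-node setup where each GPU in SLI/CrossFire shows up as a "node"; MakeSfrScissors is just an illustrative helper, not a real API). Each card rasterizes only its own slice of the frame, which is why VRAM can stack instead of being mirrored like in AFR:

```cpp
// Hypothetical sketch: how split-frame rendering (SFR) carves one frame
// across N GPUs ("nodes" in DX12's linked-node model).
#include <d3d12.h>
#include <vector>

// Give each GPU node one horizontal slice of the frame. A node only
// rasterizes inside its slice, so it only needs VRAM for that slice's
// working set -- which is why SFR lets memory "stack" instead of being
// mirrored across cards the way AFR-based SLI/CrossFire mirrors it.
std::vector<D3D12_RECT> MakeSfrScissors(UINT nodeCount, LONG width, LONG height)
{
    std::vector<D3D12_RECT> slices;
    const LONG sliceHeight = height / static_cast<LONG>(nodeCount);
    for (UINT node = 0; node < nodeCount; ++node) {
        D3D12_RECT r = {};
        r.left   = 0;
        r.right  = width;
        r.top    = static_cast<LONG>(node) * sliceHeight;
        // The last node absorbs any remainder rows.
        r.bottom = (node + 1 == nodeCount) ? height : r.top + sliceHeight;
        slices.push_back(r);
    }
    return slices;
}
// Per node, roughly: commandList->RSSetScissorRects(1, &slices[node]); draw;
// then composite the slices into the final back buffer.
```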

Even if Nvidia locks this down, as long as we are able to use our CPUs' integrated graphics and older/less powerful/more powerful Nvidia GPUs, I'll be fine with it.

They could already do that without DX12. AMD does it as far as I know. They don't need a new graphics API to do what you're describing, I think.


They could already do that without DX12. AMD does it as far as I know. They don't need a new graphics API to do what you're describing, I think.

On DX11? How?

@WhiteSkyMage Even if Nvidia locks this down, as long as we are able to use our CPUs' integrated graphics and older/less powerful/more powerful Nvidia GPUs, I'll be fine with it.

Lock what down? SLI with different cards? I don't think they would lock down the VRAM stacking; that would piss off a lot of people.

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


First there was JBOD.  Now there will be JBOVC.


Brilliant, now to hope like hell that devs see the light and stop this 'optimized for 1 brand' shit.

So we'll see games optimised for cross-SLI.


So does this mean there will be a purpose for 3-4 way SLI/CrossFire in DX12 titles now? Scaling beyond two cards is just awful and a waste of money.


So this is actually awesome! But no doubt it will take Nvidia like two seconds to block it, giving some really crappy, vague reason, I expect, like "it will ensure a better experience, blah blah blah."

Hi! Please quote me in replies, otherwise I may not see your response!


Unless Nvidia or Intel says so, it will not happen; AMD would probably agree, but I doubt them too.

So stop dreaming, it's not going to happen, and even if it is possible they will probably not optimize it well enough between different vendors. What I'm crazily interested in, and no one seems to see it, is multi-GPU from the same vendor. I would kill to be able to run my GTX 670 with my old GTX 460, or even a new Maxwell midrange card. That would be fucking insane and would throw PC gaming into a new era; almost everyone would probably have a multi-GPU setup.


As long as the rumor that DX12 will treat SLI/CrossFire as one unified graphics unit (4GB + 4GB = 8GB) holds true, I'm happy.

 

Would be nice to have the additional onboard memory for future dual GTX 980 4K gaming.
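If the rumor is accurate, the 4GB + 4GB = 8GB likely won't be automatic: in DX12's explicit model the game itself decides which GPU's memory each resource lives in, via node masks. A minimal sketch of that placement, assuming a linked-node adapter (two cards exposed as one device); PlaceTextureOnNode is a hypothetical helper, just to show the mechanism:

```cpp
// Hypothetical sketch, assuming a linked-node adapter (e.g. two cards in
// SLI exposed as one ID3D12Device with GetNodeCount() == 2). "Stacked"
// VRAM just means the app can commit a resource to ONE node's memory
// instead of every node holding a mirrored copy.
#include <d3d12.h>

HRESULT PlaceTextureOnNode(ID3D12Device* device, UINT nodeIndex,
                           const D3D12_RESOURCE_DESC& desc,
                           ID3D12Resource** outTexture)
{
    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type             = D3D12_HEAP_TYPE_DEFAULT; // GPU-local memory
    heap.CreationNodeMask = 1u << nodeIndex;         // lives on this GPU
    heap.VisibleNodeMask  = 1u << nodeIndex;         // only this GPU uses it
    return device->CreateCommittedResource(
        &heap, D3D12_HEAP_FLAG_NONE, &desc,
        D3D12_RESOURCE_STATE_COMMON, nullptr,
        IID_PPV_ARGS(outTexture));
}
// Put half the scene's textures behind node 0 and half behind node 1 and
// the usable pool really is 4GB + 4GB -- at the cost of the engine having
// to manage which GPU needs which resource.
```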

CPU i5-4690K (OC to 4.4GHz) CPU Cooler NZXT Kraken x41 Motherboard MSI Z97 Gaming 5 Memory G.Skill Ripjaws X 16GB 2133 Video Card MSI GTX 1080 Gaming X Case NZXT H440 Power Supply XFX XTR 750W Modular Storage Samsung 840 EVO 250GB / Seagate Barracuda 2TB Monitor Acer XB270HU G-Sync http://pcpartpicker.com/b/3CkTwP


Never going to happen even if it becomes possible. Nvidia and their drivers will stop it. They already proved they can and will do that with setups where an AMD GPU renders the game and an Nvidia GPU does the PhysX calculations.

Ryzen 3700x -Evga RTX 2080 Super- Msi x570 Gaming Edge - G.Skill Ripjaws 3600Mhz RAM - EVGA SuperNova G3 750W -500gb 970 Evo - 250Gb Samsung 850 Evo - 250Gb Samsung 840 Evo  - 4Tb WD Blue- NZXT h500 - ROG Swift PG348Q


22:22:24, almost a perfect time :D

This sounds like a wet dream.

I'll believe it when I see it, and when I see it I will just add another, more powerful GPU to my rig. Damn, all that performance for just a little more money. PLEASE WORK PROPERLY AND BE TRUE.

Connection: 200Mbps / 12Mbps, 5GHz WiFi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16gb, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120gb, MX100 256gb, WD Blue 1TB, Cooler - Hyper Evo 212, Case - Corsair Carbide 200R, Monitor - Benq  XL2430T 144Hz, Mouse - FinalMouse, Keyboard -K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


Never going to happen even if it becomes possible. Nvidia and their drivers will stop it. They already proved they can and will do that with setups where an AMD GPU renders the game and an Nvidia GPU does the PhysX calculations.

Stupid drivers :(

I wonder how much it would hit their revenues though.

Connection: 200Mbps / 12Mbps, 5GHz WiFi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16gb, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120gb, MX100 256gb, WD Blue 1TB, Cooler - Hyper Evo 212, Case - Corsair Carbide 200R, Monitor - Benq  XL2430T 144Hz, Mouse - FinalMouse, Keyboard -K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


Very interesting! Though I guarantee Nvidia would never let this happen.

Nvidia may not like it, but they would be disabling a part of DirectX or avoiding the feature altogether through drivers, and I don't think Microsoft would take too kindly to that.

Edit: especially since DirectX 12 is supposed to be a lower-overhead API.

Ryze of the Phoenix: 
CPU:      AMD Ryzen 5 3600 @ 4.15GHz
Ram:      64GB Corsair Vengeance LPX DDR4 @ 3200Mhz (Samsung B-Die & Nanya Technology)
GPU:      MSI RTX 3060 12GB Aero ITX
Storage: Crucial P3 1TB NVMe Gen 4 SSD, 1TB Crucial MX500, Spinning Rust (7TB Internal, 16TB External - All in-use),
PSU:      Cooler Master MWE Gold 750w V2 PSU (Thanks LTT PSU Tier List)
Cooler:   be quiet! Pure Rock 2 Black Edition
Case:     Thermaltake Versa J22 TG

Passmark 10 Score: 6096.4         CPU-z Score: 4189 MT         Unigine Valley (DX11 @1080p Ultra): 5145         CryEngine Neon Noir (1080p Ultra): 9579

Audio Setup:                  Scarlett 2i2, AudioTechnica AT2020 XLR, Mackie CR3 Monitors, Sennheiser HD559 headphones, HyperX Cloud II Headset, KZ ES4 IEM (Cyan)

Laptop:                            MacBook Pro 2017 (Intel i5 7360U, 8GB DDR3, 128GB SSD, 2x Thunderbolt 3 Ports - No Touch Bar) Catalina & Boot Camp Win10 Pro

Primary Phone:               Xiaomi Mi 11T Pro 5G 256GB (Snapdragon 888)


I believe this was said last time this was in the news, but I'll say it again:

It would be highly unlikely for Nvidia or AMD to actually allow the user to use their competitor's GPUs alongside their own. This "feature" will be disabled in drivers.

 

Kinda like how you can't SLI a 680 and a 770, even though they're the same damn card.

i7 not perfectly stable at 4.4.. #firstworldproblems


Hmm, this I gotta see in a demo.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Corsair K63 Cherry MX red | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Fuck yeah, I can run my 970 with my good ole HD 7770 or with my brother's 650 Ti... Not my 480, since that's dead. I want my $40 back. :(

 


Senor Shiny: Main- CPU Intel i7 6700k 4.7GHz @1.42v | RAM G.Skill TridentZ CL16 3200 | GPU Asus Strix GTX 1070 (2100/2152) | Motherboard ASRock Z170 OC Formula | HDD Seagate 1TB x2 | SSD 850 EVO 120GB | CASE NZXT S340 (Black) | PSU Supernova G2 750W  | Cooling NZXT Kraken X62 w/Vardars
Secondary (Plex): CPU Intel Xeon E3-1230 v3 @1.099v | RAM Samsung Wonder 16GB CL9 1600 (sadly no oc) | GPU Asus GTX 680 4GB DCII | Motherboard ASRock H97M-Pro4 | HDDs Seagate 1TB, WD Blue 1TB, WD Blue 3TB | Case Corsair Air 240 (Black) | PSU EVGA 600B | Cooling GeminII S524


(Deceased) DangerousNotDell- CPU AMD AMD FX 8120 @4.8GHz 1.42v | GPU Asus GTX 680 4GB DCII | RAM Samsung Wonder 8GB (CL9 2133MHz 1.6v) | Motherboard Asus Crosshair V Formula-Z | Cooling EVO 212 | Case Rosewill Redbone | PSU EVGA 600B | HDD Seagate 1TB

DangerousNotDell New Parts For Main Rig Build Log, Señor Shiny

 


I believe this was said last time this was in the news, but I'll say it again:

It would be highly unlikely for Nvidia or AMD to actually allow the user to use their competitor's GPUs alongside their own. This "feature" will be disabled in drivers.

Kinda like how you can't SLI a 680 and a 770, even though they're the same damn card.

Having cross-vendor multi-GPU disabled is still OK, but what I am concerned about is memory stacking with SLI or CrossFire. It may already be possible with AMD's Mantle, but what will be Nvidia's answer to that memory stacking? Will they allow it? That is the thing everyone wants so we can go 4K. But then what will happen on the market if everyone suddenly has 8+ GB of VRAM from a dual-GPU config with that much compute power? That's just going to mean a performance leap, and that's on top of the 20% performance boost which Microsoft is promising... Omg...

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


Having cross-vendor multi-GPU disabled is still OK, but what I am concerned about is memory stacking with SLI or CrossFire. It may already be possible with AMD's Mantle, but what will be Nvidia's answer to that memory stacking? Will they allow it? That is the thing everyone wants so we can go 4K. But then what will happen on the market if everyone suddenly has 8+ GB of VRAM from a dual-GPU config with that much compute power? That's just going to mean a performance leap, and that's on top of the 20% performance boost which Microsoft is promising... Omg...

 

Lemme guess: they're going to do whatever gives their cards the best performance they can achieve within their target price point.

If that means stacking VRAM wherever they can, they most definitely will allow it in order to stay competitive.

i7 not perfectly stable at 4.4.. #firstworldproblems


What about Intel HD Graphics and Iris? Paired with an AMD or Nvidia card, or even an SLI/CrossFire of HD Graphics and Iris to top it all?


What about Intel HD Graphics and Iris? Paired with an AMD or Nvidia card, or even an SLI/CrossFire of HD Graphics and Iris to top it all?

TBH, with Intel graphics you need to make your own drivers first before even hoping it will run any form of DirectX properly. And with the way Intel has their iGPU set up, I highly doubt that you could use it in conjunction with a graphics card.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


TBH, with Intel graphics you need to make your own drivers first before even hoping it will run any form of DirectX properly. And with the way Intel has their iGPU set up, I highly doubt that you could use it in conjunction with a graphics card.

 

How can you create your own HD Graphics driver? I've been gaming on my laptop with Intel HD Graphics 3000 for over 3 years now, and I always find a way around it to play demanding games; it generally gives me 20 fps.

Using the Intel iGPU can't hurt performance... could it?


This is great. I recently purchased a GTX 980 to replace my 670. I won't sell the 670 now; I'll just add it back into the system :)


If this is true, then theoretically you could have an unlimited number of GPUs, as long as you have a way to connect them.

2017 Macbook Pro 15 inch


If this is true, then theoretically you could have an unlimited number of GPUs, as long as you have a way to connect them.

Yes, because along with split-frame rendering comes (hopefully) perfect scaling. You could use a mining rig that has twelve 290Xs to play a game...
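The "as many GPUs as you can connect" part is at least true at the API level: DXGI will enumerate every adapter in the box, and nothing stops an app from creating a DX12 device on each one. A quick hypothetical sketch (assuming the Windows 10 SDK headers; error handling trimmed):

```cpp
// Hypothetical sketch: enumerate every GPU in the box and create an
// independent DX12 device on each. This is the "unlinked" explicit
// multi-adapter path, so mismatched cards (iGPU + dGPU, old + new,
// even AMD + Nvidia) all show up the same way.
// Build against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"GPU %u: %s (%llu MB VRAM)\n", i, desc.Description,
                    static_cast<unsigned long long>(desc.DedicatedVideoMemory >> 20));
            devices.push_back(device); // one independent DX12 device per card
        }
    }
    // Actually scaling across `devices` (SFR splits, cross-GPU copies,
    // synchronization) is entirely the engine's job -- the API allows it,
    // it doesn't guarantee it.
    return 0;
}
```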

 

 

I believe this was said last time this was in the news, but I'll say it again:

It would be highly unlikely for Nvidia or AMD to actually allow the user to use their competitor's GPUs alongside their own. This "feature" will be disabled in drivers.

 

Kinda like how you can't SLI a 680 and a 770, even though they're the same damn card.

 

 

I don't really care if I can't throw an AMD card in my system; what I do care about is what you're talking about: using two GPUs of different power/generation together.


This is like the only piece of good news for GTX 970 SLI owners like me.

I hope that one day we could perhaps run 4K.


I will be happy if this improves SLI; anything beyond that is simply a bonus.

 

