Multi-GPU Resource Pooling - DX12 Demo

MEC-777


PC gaming is about to see a drastic change and improvement in the way multi-GPU setups are implemented. No longer will you be forced to pair dGPUs from the same architecture/family, or even the same brand. With DX12 it will be possible to combine any two or more GPUs and have their resources (GPU cores, VRAM, etc.) stacked and pooled. Whether you combine an Intel or AMD iGPU with any dGPU, or pair Nvidia and AMD dGPUs, it won't matter; their resources can be shared for additional gaming performance. No longer will the iGPU in your system go to waste. This is awesome news for all of us. I know this has been known as a possibility with DX12 for some time, but this is the first time I've seen a demo of it and a game dev talking about how it can be implemented.

 

The demo in the video uses one of AMD's latest APUs paired with an R9 290X - an unlikely combination formerly thought of as "unbalanced". But now with DX12 we not only get reduced CPU overhead - allowing this CPU/dGPU combo to actually perform optimally - we also get the APU's iGPU added to the total GPU capability of this particular system for even more performance. As mentioned, this is not exclusive to AMD hardware; they just happened to be running it on an AMD system. With the latest APUs having such strong iGPU performance, it was a good platform to demonstrate how effective this can be.

 

Currently, the way multiple GPUs are combined is that each GPU (in a dual-card config) takes turns rendering alternate frames. The disadvantages of this are that both GPUs have to be nearly identical, run at the same clock speeds, and hold the same data in each card's VRAM, resulting in redundancy and limits on which cards can be used/combined. The new approach made possible with DX12 gives devs control over which aspects of the scene are rendered by which GPU. So you can have the high-end dGPU still doing most of the heavy lifting, rendering the majority of the scene, while the iGPU takes on the remaining tasks. This means the dGPU doesn't have to render the entire scene by itself, only part of it, which, as you can guess, increases frame rates.
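To put rough numbers on that (all figures invented for illustration, none taken from the demo): if the work is split in proportion to each GPU's speed, both finish the frame together and the frame time drops accordingly. A quick Python sketch of the arithmetic:

```python
# Hypothetical figures: a dGPU that renders a frame in 20 ms on its own
# (50 fps), plus an iGPU roughly a quarter as fast. This is just the
# proportional-split math, not measured data from the demo.

def split_frame_time(dgpu_ms, igpu_relative_speed):
    """Frame time when work is divided so both GPUs finish together.
    igpu_relative_speed is the iGPU's speed relative to the dGPU (= 1.0)."""
    total_speed = 1.0 + igpu_relative_speed
    return dgpu_ms / total_speed

dgpu_alone = 20.0                            # ms per frame, dGPU by itself
shared = split_frame_time(dgpu_alone, 0.25)  # add an iGPU 1/4 as fast
print(f"dGPU alone:     {1000 / dgpu_alone:.1f} fps")   # 50.0 fps
print(f"split workload: {1000 / shared:.1f} fps")       # 62.5 fps
```

So even a modest iGPU worth a quarter of the dGPU buys about 25% more frames in this idealized model; real gains depend on how cleanly the work divides.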

 

As stressed in the video, it will be up to the devs of each game to optimize and sort out which tasks can be offloaded to an iGPU and how the work is divided. One key thing to note is that this will allow much better CrossFire/SLI or hybrid dGPU combos, since the GPUs will no longer take turns rendering each frame; instead, each frame is rendered in concert, with each GPU taking on specific elements of the scene. This will also allow VRAM stacking, so two cards with 4GB of VRAM each will provide a total of 8GB. One card can handle all the textures, while the other handles lighting, etc. The potential is there, but again, it will be strongly dependent on the devs to implement it.
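Conceptually, the developer's job becomes assigning whole render passes to specific GPUs. A toy sketch of that bookkeeping (pass names and costs are made up, not from any real engine):

```python
# Hypothetical render passes with invented costs (ms). Under explicit
# multi-GPU, the developer decides which adapter executes each pass.
passes = [
    ("geometry",     10.0, "dGPU"),
    ("shadows",       4.0, "dGPU"),
    ("post_process",  3.0, "iGPU"),   # cheap work offloaded to the iGPU
    ("ui_composite",  1.0, "iGPU"),
]

def per_gpu_cost(passes):
    """Total cost each GPU pays: only for the passes assigned to it."""
    totals = {}
    for name, cost, gpu in passes:
        totals[gpu] = totals.get(gpu, 0.0) + cost
    return totals

print(per_gpu_cost(passes))   # {'dGPU': 14.0, 'iGPU': 4.0}
```

Compared with the dGPU doing all 18 ms alone, it now finishes its share in 14 ms; the balancing act is making sure the iGPU's share never becomes the bottleneck.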

 

Here's the demo video:

 

https://www.youtube.com/watch?v=9cvmDjVYSNk

My Systems:

Main - Work + Gaming:

Spoiler

Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:

Spoiler

FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:

Spoiler

SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:

Spoiler

MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:

Spoiler

Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


What about 'crossfiring' Intel onboard with a dedicated GPU?

Spoiler

Prometheus (Main Rig)

CPU-Z Verification

Laptop: 

Spoiler

Intel Core i3-5005U, 8GB RAM, Crucial MX 100 128GB, Touch-Screen, Intel 7260 WiFi/Bluetooth card.

 Phone:

 Game Consoles:

Spoiler

Softmodded Fat PS2 w/ 80GB HDD, and a Dreamcast.

 

If you want my attention quote my post, or tag me. If you don't use PCPartPicker I will ignore your build.


What about 'crossfiring' Intel onboard with a dedicated GPU?

 

In the video they say it should be possible with any iGPU (integrated/onboard GPU) and dGPU (discrete GPU) combination. It's not brand-specific, and the only limitation will be how powerful each GPU is.



AHhhhhhhhh I wish they'd release it for hardcore people who want to test the APIs! They could have a disclaimer saying this could brick your card and everything, it'd be fine! Crowdsource your testing, companies!


We can finally mix and match different GPUs??? :o

 

I really hope devs take advantage of this.  This is f'n awesome!

Current Rig: i5 3570K @ 4.2 GHz, Corsair H100i, Asus Maximus V Gene, Asus GTX 970 Turbo, Patriot Viper Black Mamba 8 GB DDR-3 1600 (2x4GB), Sandisk 120 GB SSD + Seagate 500 GB HDD + WD Blue 1 TB HDD, XFX 750W Black Edition, Corsair 350D - https://ca.pcpartpicker.com/b/cbxrxr

 


In the video they say it should be possible with any iGPU (integrated/onboard GPU) and dGPU (discrete GPU) combination. It's not brand-specific, and the only limitation will be how powerful each GPU is.

And if the developer can be bothered developing with this in mind. The potential for extremely bad implementations of this is high :/

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


I wonder if this will improve dual-card frame latency, because as it is now, at 50fps with two cards you're actually getting 25fps-equivalent input latency, if not more.

System specs
  • Graphics card: Asus GTX 980 Ti (Temp target: 60c, fan speed: slow as hell)
  • CPU: Intel 6700k @ 4.2Ghz
  • CPU Heatsink: ThermalRight Silver Arrow Extreme
  • Motherboard: Asus Maximus Viii Gene
  • Ram: 8GB of DDR4 @ 3000Mhz
  • Headphone source: O2 + Odac 
  • Mic input: Creative X-Fi Titanium HD
  • Case: Fractal Design Arc midi R2
  • Boot Drive: Samsung 840 Pro 128GB 
  • Storage: Seagate SSHD 2TB
  • PSU: Be quiet! Dark Power Pro 550w

Peripherals

  • Monitor: Asus ROG Swift PG278Q
  • Mouse: Razer DeathAdder Chroma (16.5 inch/360)
  • Mouse surface: Mionix Sargas 900
  • Tablet: Wacom Intuos Pen
  • Keyboard: Filco Majestouch Ninja, MX Brown, Ten Keyless 
  • Headphones: AKG K7xx
  • IEMs: BrainWavs S1

I've been dreaming of something like this ever since I learned what SLI/CrossFire is, and I'm still skeptical about it. Companies are just stubborn: instead of advancing and creating insane new experiences using hardware we already have, they block it, like Nvidia PhysX. Just imagine every game with GPU PhysX, a shared memory pool, and other work offloaded to an iGPU or a second GPU. I have a GTX 460 lying around and an HD 4600 iGPU; I can only imagine the three combined.

 

Anyway, I think iGPU co-processing is still nonsense. I still see iGPUs as useless, and they need to go away from dedicated desktop PCs.

Think about it for a second: there's no way an iGPU using slow DDR3/DDR4 RAM can keep up with 7GHz GDDR5 or HBM. It's not even close in bus width to a dGPU, and with HBM's 1024-bit bus the gap is going to get even crazier.

Just let iGPUs go, we don't need them. All we need is to be able to use cross-vendor and cross-gen GPUs.

Just the thought of buying a future HBM AMD card and offloading some work to my old GTX 670 makes me "wet" - not having to throw away a good card.


Fuck hype over SWBF, BAK or any other game.

This is the real hype; can't wait to see games scale across multiple CPU cores and multiple GPUs no matter what brand!!!!!

It's only been 8 years since the first x86-based dual-core CPU launched.

HYPE!!!!!

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


So I'm going to have a 770, a 640, and an HD 4600 powering through games.

 

  1. GLaDOS: i5 6600 EVGA GTX 1070 FE EVGA Z170 Stinger Cooler Master GeminS524 V2 With LTT Noctua NFF12 Corsair Vengeance LPX 2x8 GB 3200 MHz Corsair SF450 850 EVO 500 Gb CableMod Widebeam White LED 60cm 2x Asus VN248H-P, Dell 12" G502 Proteus Core Logitech G610 Orion Cherry Brown Logitech Z506 Sennheiser HD 518 MSX
  2. Lenovo Z40 i5-4200U GT 820M 6 GB RAM 840 EVO 120 GB
  3. Moto X4 G.Skill 32 GB Micro SD Spigen Case Project Fi

 


Inb4 PSU makers (and motherboard makers to some degree) earn a lot of money.

The ability to google properly is a skill of its own. 


Great stuff. Something we can all cheer about regardless of fanboy allegiances.

 

These Oxide guys always seem to be on the cutting edge of graphics APIs. First with Mantle, then DX12, then Vulkan.

I hope their game will be as good as their tech LOL.


This explains why Nvidia is hyping NVLink so much.

 

They need to make a version of SLI so much better than DX12 shared resources that it encourages people to just buy Nvidia cards.

CPU: AMD Ryzen 7 3700X - CPU Cooler: Deepcool Castle 240EX - Motherboard: MSI B450 GAMING PRO CARBON AC

RAM: 2 x 8GB Corsair Vengeance Pro RBG 3200MHz - GPU: MSI RTX 3080 GAMING X TRIO

 


They need to make a version of SLI so much better than DX12 shared resources that it encourages people to just buy Nvidia cards.

What Oxide Games showed will work on Nvidia too. They just demoed it on AMD because that's what they have working at the moment. It's not a vendor-exclusive thing...

It's not really like SLI or CrossFire; it's about having a discrete GPU share the load with your onboard GPU.


What Oxide Games showed will work on Nvidia too. They just demoed it on AMD because that's what they have working at the moment. It's not a vendor-exclusive thing...

It's not really like SLI or CrossFire; it's about having a discrete GPU share the load with your onboard GPU.

 

The point I'm making is that this makes SLI effectively redundant.

 

NVLink is supposed to be SLI on steroids, and I'm guessing it will be better than this implementation on Nvidia hardware. If Nvidia wants to push you to buy two new matched cards for SLI, they need to offer tangible benefits over this DX12 implementation. NVLink is their plan.


 


The point I'm making is that this makes SLI effectively redundant.

 

NVLink is supposed to be SLI on steroids, and I'm guessing it will be better than this implementation on Nvidia hardware. If Nvidia wants to push you to buy two new matched cards for SLI, they need to offer tangible benefits over this DX12 implementation. NVLink is their plan.

 

I don't see NVLink coming to consumer boards. The NVLink interconnect is made to replace PCIe, but Intel probably won't make it work with their chipsets.


This explains why Nvidia is hyping NVLink so much.

 

They need to make a version of SLI so much better than DX12 shared resources that it encourages people to just buy Nvidia cards.

 

NVLink is more geared towards HPC, not consumer use. 


I wonder if this will improve dual-card frame latency, because as it is now, at 50fps with two cards you're actually getting 25fps-equivalent input latency, if not more.

 

Not really. Both cards are working in unison at 50fps, so your input latency is still that of 50fps.

 

With this new implementation, your input latency would improve by however much your frame rate improves.
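The two claims can be reconciled with a back-of-envelope model (my own simplification, not from the video): under alternate-frame rendering (AFR) each frame spends roughly two frame-intervals in flight before it is displayed, while with a split workload both GPUs finish the same frame within one interval.

```python
# Illustrative latency model, not measured data. Under alternate-frame
# rendering each GPU spends ~2 display intervals on its frame, so the
# render portion of input latency is about double the frame interval.
# When both GPUs cooperate on the same frame, it's one interval.

def afr_render_latency_ms(displayed_fps):
    interval_ms = 1000.0 / displayed_fps
    return 2 * interval_ms             # frame is ~2 intervals old when shown

def split_render_latency_ms(displayed_fps):
    return 1000.0 / displayed_fps      # frame completed within one interval

print(afr_render_latency_ms(50))       # 40.0 ms, i.e. 25fps-like latency
print(split_render_latency_ms(50))     # 20.0 ms, matching the 50fps shown
```

Which model matches reality depends on driver buffering and frame pacing, but it shows why AFR at 50fps can feel like 25fps while a cooperative split would not.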

 

 

Anyway, I think iGPU co-processing is still nonsense. I still see iGPUs as useless, and they need to go away from dedicated desktop PCs.

Think about it for a second: there's no way an iGPU using slow DDR3/DDR4 RAM can keep up with 7GHz GDDR5 or HBM. It's not even close in bus width to a dGPU, and with HBM's 1024-bit bus the gap is going to get even crazier.

Just let iGPUs go, we don't need them. All we need is to be able to use cross-vendor and cross-gen GPUs.

Just the thought of buying a future HBM AMD card and offloading some work to my old GTX 670 makes me "wet" - not having to throw away a good card.

 

iGPUs will never go away, because the majority of users don't need or use discrete graphics.

 

This new implementation of combining multiple GPUs does not work the way you're describing. The VRAM on the dGPU and the system RAM used by the iGPU do not have to be the same speed or type. How it works is that whatever data is needed for the task the iGPU is assigned gets loaded into system RAM only. This cuts down on the redundant data copying/transfers of current multi-GPU methods. The issues you described would not be a problem.
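The memory-saving argument can be sketched the same way (task names and sizes invented for illustration): each GPU only holds the data its own tasks touch, instead of both cards mirroring the full set as AFR requires.

```python
# Invented example assets and sizes (MB); the point is the bookkeeping,
# not the numbers. Each GPU keeps only the data its assigned tasks need.
assignments = {
    "dGPU": ["textures", "geometry"],
    "iGPU": ["lighting_grid"],
}
sizes_mb = {"textures": 3000, "geometry": 500, "lighting_grid": 200}

def memory_per_gpu(assignments, sizes):
    """Memory each GPU must hold: only its own tasks' data."""
    return {gpu: sum(sizes[d] for d in data)
            for gpu, data in assignments.items()}

mirrored = sum(sizes_mb.values())   # what EACH card would hold under AFR
print(memory_per_gpu(assignments, sizes_mb))   # {'dGPU': 3500, 'iGPU': 200}
print(f"AFR would mirror all {mirrored} MB on both cards")
```

In this toy model the iGPU's side of the split lives entirely in system RAM, which is why the VRAM speed mismatch the previous poster worried about doesn't apply.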



Every time I see this subject followed by "depends only on the developers," I get frustrated...

 

I'm serious (this time): in recent years we've seen dozens of examples of not-so-committed developers, and you think they will support this? I don't think so...

 

And AMD and Nvidia supporting developers to make these multi-GPU combos work? I wouldn't believe it... Life can't be this good; the world is not fair...


"up to developers" means "up to publishers"

*sigh*

"When you're in high school you should be doing things, about which you could never tell your parents!"


One major question I have about this is how the drivers will work and interact. AMD GPUs paired with Nvidia GPUs will probably be the most problematic combination and may never come to fruition. It would require both parties to work together, and I can't see them collaborating on something like that. AMD would be willing, but Nvidia? I doubt it.

 

At least we know for sure that combining an APU with a dGPU works.



This explains why Nvidia is hyping NVLink so much.

 

They need to make a version of SLI so much better than DX12 shared resources that it encourages people to just buy Nvidia cards.

That's not the reason. NVLink is for compute tasks where bandwidth is the bottleneck. IBM has had NVLink 1.0 implemented on its POWER8 motherboards for a long time already.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

