
DirectX12 Will Allow Multi-GPU Between GeForce And Radeon

Ragin Asian

 

One of the big things that we will be seeing is DirectX 12's Explicit Asynchronous Multi-GPU capabilities. What this means is that the API combines all the different graphics resources in a system and puts them all into one "bucket." It is then left to the game developer to divide the workload up however they see fit, letting different hardware take care of different tasks.

Part of this new feature set that aids multi-GPU configurations is that the frame buffers (GPU memory) won't necessarily need to be mirrored anymore. In older APIs, in order to benefit from multiple GPUs, you'd have the two work together, each one rendering an alternate frame (AFR). This required both to have all of the texture and geometry data in their frame buffers, meaning that despite having two cards with 4 GB of memory, you'd still only have a 4 GB frame buffer.
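The "4 + 4 = 4" arithmetic of AFR mirroring can be sketched in a few lines. This is a toy illustration of the concept only, not any real API; the function name is made up:

```python
def effective_vram_afr(cards_gb):
    """Usable memory under AFR-style mirroring.

    Every GPU must hold a full copy of the texture and geometry
    data, so effective capacity is bounded by the smallest card's
    VRAM rather than the sum across cards.
    """
    return min(cards_gb)

effective_vram_afr([4, 4])  # two 4 GB cards still give only 4 GB
```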

DirectX 12 will remove the 4 + 4 = 4 idea and will work with a new frame rendering method called SFR, which stands for Split Frame Rendering. Developers will be able to manually, or automatically, divide the texture and geometry data between the GPUs, and all of the GPUs can then work together to work on each frame. Each GPU will then work on a specific portion of the screen, with the number of portions being equivalent to the number of GPUs installed.
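As a rough sketch of the SFR idea described above (a hypothetical helper, not the DirectX 12 API itself), dividing a frame's scanlines into one contiguous band per GPU might look like:

```python
def split_frame(height, num_gpus):
    """Divide a frame's rows into one contiguous band per GPU.

    Returns (start_row, end_row) pairs; the number of bands equals
    the number of GPUs, as in the SFR description above.
    """
    base, extra = divmod(height, num_gpus)
    bands, start = [], 0
    for i in range(num_gpus):
        # Spread any remainder rows over the first few bands.
        rows = base + (1 if i < extra else 0)
        bands.append((start, start + rows))
        start += rows
    return bands

split_frame(1080, 2)  # [(0, 540), (540, 1080)]
```

In practice a real SFR implementation would balance bands by rendering cost rather than by raw row count, since the two halves of a scene rarely cost the same to draw.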

 

We were also told that DirectX 12 will support all of this across multiple GPU architectures, simultaneously. What this means is that Nvidia GeForce GPUs will be able to work in tandem with AMD Radeon GPUs to render the same game – the same frame, even.

This is especially interesting as it allows you to leverage the technology benefits of both of these hardware platforms if you wish to do so. If you like Nvidia's GeForce Experience software and 3D Vision, but you want to use AMD's TrueAudio and FreeSync, chances are you'll be able to do that when DirectX 12 comes around. What will likely happen is that one card will operate as the master card, while the other will be used for additional power.

What we're seeing here is that DirectX 12 is capable of aggregating graphics resources, be that compute or memory, in the most efficient way possible. Don't forget, however, that this isn't only beneficial for systems with multiple discrete desktop GPUs. Laptops with dual-graphics solutions, or systems running an APU and a GPU will be able to benefit too. DirectX 12's aggregation will allow GPUs to work together that today would be completely mismatched, possibly making technologies like SLI and CrossFire obsolete in the future.

There is a catch, however. Lots of the optimization work for the spreading of workloads is left to the developers – the game studios. The same went for older APIs, though, and DirectX 12 is intended to be much friendlier. For advanced uses it may be a bit tricky, but according to the source, implementing the SFR should be a relatively simple and painless process for most developers.

 

Take this with a grain of salt since Tom's Hardware doesn't name a source.

 

While this would be cool, I can't see Nvidia allowing it. If they do, though, we may finally be able to access all vendor-exclusive technologies at the same time.

 

Source: http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html

STOP using prime95 with Haswell CPUs!

Main Rig:

CPU: i5 4690k @ 4.6GHz, Motherboard: ASUS Z97-A, GPU: EVGA GTX 970 FTW, PSU: EVGA Supernova 1000 G1, CPU Cooler: Cooler Master Nepton 240M w/ Noctua NF-F12s, Case: NZXT S340, Memory: 16GB (4x4GB) G.Skill Ripjaws X 2133MHz, Storage: Samsung 840 Evo 256GB boot drive & Seagate Barracuda 7200 rpm 2TB

Secondary/Folding Rig:

CPU: FX 8320 @ 4.6GHz @ 1.344 V, Motherboard: ASUS M5A99FX, GPU: MSI R9 290 Gaming, PSU: Corsair HX 750, CPU Cooler: be quiet! Dark Rock Pro 3, Case: NZXT Source 530, Memory: 8GB (2x4GB) G.Skill Ares 1600MHz, Storage: Samsung 840 Evo 128GB

Who will actually do this? And who decides if your PC is G-Sync or Freesync ready?

 

Nonetheless, it's a nice option to upgrade later to any GPU you want in SLIFire. ;)

who cares...


Then again, I think Nvidia made it so that you can't even have an Nvidia GPU and an AMD GPU in the system at all.

That's what broke using an Nvidia card for PhysX on AMD systems.

Stock coolers - The sound of bare minimum


Nvidia won't allow this, and it seems like RAM: it'd be best to use two of the same type, but if it works, it works.

My arsenal: i7-9700k Gaming Rig, an iPhone, and Stupidity.


nvm..

Core i7 4820K  |  NH-D14 | Rampage IV Extreme | Asus R9 280X DC2T | 8GB G.Skill TridentX | 120GB Samsung 840 | NZXT H440  |  Be quiet! Dark Power Pro 10 650W


Highly doubt it; even if it were possible, it wouldn't make much sense tbh ^^

CPU: Xeon 1230v3 - GPU: GTX 770  - SSD: 120GB 840 Evo - HDD: WD Blue 1TB - RAM: Ballistix 8GB - Case: CM N400 - PSU: CX 600M - Cooling: Cooler Master 212 Evo

Update Plans: Mini ITX this bitch


holy shit I can see the issues coming up already.

mY sYsTeM iS Not pErfoRmInG aS gOOd As I sAW oN yOuTuBe. WhA t IS a GoOd FaN CuRVe??!!? wHat aRe tEh GoOd OvERclok SeTTinGS FoR My CaRd??  HoW CaN I foRcE my GpU to uSe 1o0%? BuT WiLL i HaVE Bo0tllEnEcKs? RyZEN dOeS NoT peRfORm BetTer wItH HiGhER sPEED RaM!!dId i WiN teH SiLiCON LotTerrYyOu ShoUlD dEsHrOuD uR GPUmy SYstEm iS UNDerPerforMiNg iN WarzONEcan mY Pc Run WiNdOwS 11 ?woUld BaKInG MY GRaPHics card fIX it? MultimETeR TeSTiNG!! aMd'S GpU DrIvErS aRe as goOD aS NviDia's YOU SHoUlD oVERCloCk yOUR ramS To 5000C18

 


Why would AMD allow this? I love this needless segregation.

AMD seems to be more open to new things.



I remember similar stuff being discussed on the WAN Show a while ago... about DirectX 12 treating multiple GPUs as a single entity...


inb4 driver block by both

this is one of the greatest thing that has happened to me recently, and it happened on this forum, those involved have my eternal gratitude http://linustechtips.com/main/topic/198850-update-alex-got-his-moto-g2-lets-get-a-moto-g-for-alexgoeshigh-unofficial/ :')

i use to have the second best link in the world here, but it died ;_; its a 404 now but it will always be here

 


Why would AMD allow this? I love this needless segregation.

AMD is far more likely to let you use a GeForce card for PhysX than Nvidia is. Just one example.


I don't see this actually happening. Even if DX12 allows it, I would imagine both AMD and Nvidia would do something to stop it from working. On top of that, I don't see a benefit unless you're keeping your old cards and doing SLI with them. I just see too many problems to get excited.


DirectX 12 will remove the 4 + 4 = 4 idea and will work with a new frame rendering method called SFR, which stands for Split Frame Rendering. Developers will be able to manually, or automatically, divide the texture and geometry data between the GPUs, and all of the GPUs can then work together to work on each frame. Each GPU will then work on a specific portion of the screen, with the number of portions being equivalent to the number of GPUs installed.

 

Sorry, how does that not mean all the textures are still being loaded into both/all cards? Or are they saying that one card does textures and the other does everything else? Either way it still looks like 4 + 4 = 4, or 4 + 0 = 4.


AMD seems to be more open to new things.

AMD is far more likely to let you use a GeForce card for PhysX than Nvidia is. Just one example.

 

And look where that got them. It would just be taking on another stupid idea to gain some karma points but fail miserably.

But seriously, even AMD wouldn't be this stupid.


And look where that got them. It would just be taking on another stupid idea to gain some karma points but fail miserably.

But seriously, even AMD wouldn't be this stupid.

That doesn't make sense.


 

And look where that got them. It would just be taking on another stupid idea to gain some karma points but fail miserably.

But seriously, even AMD wouldn't be this stupid.

Hence the term 'more likely'. Another example: what is stopping Nvidia from supporting FreeSync, or opening G-Sync to AMD cards? I still can't imagine why Nvidia locks down its own G-Sync when they could make more money from the module if it also allowed AMD cards.

 

Don't think there would be a big benefit in combining both cards anyway, though. Each game has either an Nvidia feature or an AMD one, but not both.


I'ma try this now, since the Windows 10 tech preview has DX12 in it

 

 

Wait, there's nothing out right now that uses DX12. Damnit

Please quote me or tag me if you're trying to talk to me, so I might see it through all my other notifications ^_^

The current list of dead cards is as follows: 2 EVGA GTX 980 Ti ACX 2.0, 1 EVGA GTX 980 ACX 2.0 (1600 MHz core, 2100 MHz RAM, golden-chip card... failed hardcore), 1 290X that caught fire, 1 HD 7950.

may you all rest in peaces in the giant pc in the sky


SLiFire!

If only...

Omg yes, I'd like summadat SLIfiya!

- Fresher than a fruit salad.


And look where that got them. It would just be taking on another stupid idea to gain some karma points but fail miserably.

But seriously, even AMD wouldn't be this stupid.

Wow you seem to have the ability to turn a valid argument and a positive remark into ridicule towards AMD.

 

Nvidia usually is the neighbour who puts up a big fence, crossing half of the other neighbour's yard. Through your weird fanbot language you are able to not only make Nvidia neutral in this matter, but shame AMD for being more open-oriented.

 

Yes, I meant to say fanbot


Interesting. I'm not sure how much actual use it will have but it's interesting nonetheless. 

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 

