AMD moving away from multi-GPU too?

source: http://www.gamersnexus.net/news-pc/3005-amd-moving-away-from-crossfire-with-rx-vega

 


Quote

With the Polaris launch, great emphasis was placed on dual RX 480 cards evenly embattling GTX 1080 hardware – something we later found to be of mixed virtue. This time, it seems, none of the CrossFire claims were made; in fact, "CrossFire" wasn’t once mentioned during any of the day-long media briefing. It wasn’t until media round-table sessions later in the day that the topic of CrossFire came up.

 

RX Vega 64 and RX Vega 56 will support CrossFire, technically speaking, but AMD noted that the industry is largely moving away from multi-GPU configurations. We agree with that sentiment, and have for a while, though that’s been the case since before Polaris and its heavy CF marketing language. Regardless, AMD has minimized its marketing focus on multi-GPU for RX Vega and, although the cards can technically function in multi-card arrays, AMD noted that the value is rough when considering limited developer support.

 

This aligns with nVidia’s decision to begin slowly winding-down SLI support during the Pascal 10-series launch event, where discussion of keyed 3-way SLI would be required (something later changed, though there’s no official support of >2-way SLI in games on 10-series cards).

 

this is not entirely unexpected

game developers and game engine developers have shown little to no interest in supporting multi-GPU configurations, so this shouldn't come as any surprise

the baffling thing is that with DX12, MS went out of their way to support and implement mixed-GPU setups in the API, although no one actually has a game engine that supports it

my guess is that a few generations from now, SLI and CFX will be removed altogether


5 minutes ago, Mr.Meerkat said:

Hmmmm...'glued' together GPU dies anyone? :D 

Everything they've said about Navi involves "scalability", so I wouldn't be surprised at all if that translates to multiple small/cheap Vega-based GPUs connected via Infinity Fabric.


Just now, Phate.exe said:

Everything they've said about Navi involves "scalability", so I wouldn't be surprised at all if that translates to multiple small/cheap Vega-based GPUs connected via Infinity Fabric.

Well I mean, they've found out how well IF works for Ryzen, so it's deffo a large possibility that their GPUs will get a taste of IF (or similar) as well :P 


No wonder. Hardly anyone across the entire gaming spectrum uses dual GPUs. Say you make a new Dishonored game for the 2 consoles and PC. Say PC is 1/3rd or even half the total player base. Out of those, how many actually have dual-GPU setups? 1-3%? Your graphics engine has to be programmed from the ground up to utilize multiple GPUs properly. It's just a huge cost and added complexity for hardly any benefit. The time and money spent would yield much better results elsewhere.

 

The rumours are still going that Navi will be multi-die on an interposer, like a mix between Vega and Threadripper. In that case, we might get insanely large GPUs at very low cost. That would be very exciting. Personally, I don't see the point in multi-GPU systems anymore. As a result, I think the vast majority of gamers could get by just fine with an ITX build. How many actually use more than 1 PCIe slot, 2 RAM slots and 6 SATA drives?


While I understand the limited developer support for multi-GPU setups, I have a hard time believing multi-GPU will die off completely. Supercomputers and video walls both rely on multi-GPU setups. I suspect there are other professional uses as well.

 

What I imagine AMD is doing is backing off CF marketing but not the feature itself.

Yeah, CF and SLI are probably going to be removed. Maybe there will be a resurgence of dual GPUs on a single card?

Edited by TBA
@Drak3 pointed out that CF and SLI aren't the setup I thought they were. Oops.


1 minute ago, TBA said:

Supercomputers and video walls both rely on multi-GPU setups. I suspect there are other professional uses as well.

Ah, but the thing is, they don't rely on SLI or CrossFire. Both are gaming-specific multi-GPU setups. Both scale poorly beyond 2 cards, and likely both will only support 2 cards moving forward, mostly for Nvidia Surround and AMD Eyefinity setups.

 

Whereas most professional uses don't need the GPUs in a master/slave config. And both Xfire and SLI disable the second card's outputs (making them useless for video walls).


18 minutes ago, Mr.Meerkat said:

Hmmmm...'glued' together GPU dies anyone? :D 

Yep, and possibly, if they figure out how, an InfiniteStitch replacement for CrossFire. Though it's questionable whether they could implement Infinity Fabric over those differences.


3 minutes ago, ravenshrike said:

Yep, and possibly, if they figure out how, an InfiniteStitch replacement for CrossFire. Though it's questionable whether they could implement Infinity Fabric over those differences.

It's actually a second, modified PCIe x16 connector. The bridge has a 6-pin, and powers a chipset on each card that enables really efficient communication over a dedicated PCIe 5.0 x16 connection between the cards.

 

Kappa.


26 minutes ago, Phate.exe said:

Everything they've said about Navi involves "scalability", so I wouldn't be surprised at all if that translates to multiple small/cheap Vega-based GPUs connected via Infinity Fabric.

Raja has been cagey, but it's clearly exactly where AMD is going. It also makes the most sense from a cost/design standpoint. Put 1, 2 or 4 of their Navi cores on an interposer and what do you get? Well, whatever you want to pay for.

 

This also fits with the petaflop render/server/VM device they were showing off at the end of the SIGGRAPH event. If you can scale smaller dies to whatever product you need, you open up a world of product designs without having to remake the GPU itself. Nvidia is running, what, 4 separate iterations of the Pascal die? That's a lot of design time, which costs a lot of money.

 

(There's also the issue that microarchitectures and processes have efficiency curves, and GPUs pretty much always run outside of them on the top SKUs. A x2 combination that scales at ~98% performance efficiency could easily come out 25% or more ahead in power efficiency.)
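Back-of-envelope, with the clock/voltage pair below assumed purely for illustration rather than measured:

```latex
P_{\mathrm{dyn}} \propto f V^{2}, \qquad \mathrm{perf} \propto f \\
\text{one big die at } (f,\ V):\quad \mathrm{perf} = 1,\quad P = 1 \ \text{(normalized)} \\
\text{two dies at } (0.51f,\ 0.8V),\ {\sim}98\%\ \text{scaling}:\quad
\mathrm{perf} \approx 2 \times 0.51 \times 0.98 \approx 1.0,\quad
P = 2 \times 0.51 \times 0.8^{2} \approx 0.65
```

Same performance at roughly two-thirds the power. The exact numbers depend entirely on where each die sits on its V/f curve, but that's the shape of the argument.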


Are they just moving away from CrossFire itself and letting DX12 multi-GPU take over instead (if devs want to use it, of course)? Not much point maintaining CrossFire driver support and the issues that brings with it if it can be done natively in the API.


Just now, Taf the Ghost said:

Raja has been cagey, but it's clearly exactly where AMD is going. It also makes the most sense from a cost/design standpoint. Put 1, 2 or 4 of their Navi cores on an interposer and what do you get? Well, whatever you want to pay for.

So, like, if Raven Ridge utilizes an Infinity Fabric link, just drop a crapton of those iGPUs down on an interposer until you have the performance level you want.


37 minutes ago, Drak3 said:

Ah, but the thing is, they don't rely on SLI or CrossFire. Both are gaming-specific multi-GPU setups. Both scale poorly beyond 2 cards, and likely both will only support 2 cards moving forward, mostly for Nvidia Surround and AMD Eyefinity setups.

 

Whereas most professional uses don't need the GPUs in a master/slave config. And both Xfire and SLI disable the second card's outputs (making them useless for video walls).

Agreed. Blender Cycles scales extremely well across multiple GPUs, but there the cards essentially share an identical path-tracing workload. A game engine is an entirely different beast, requiring exacting synchronization between the GPUs and the CPU (I don't know the details myself, though logic leads me to this).

 

If games were to rely heavily upon ray tracing for shading, it would be feasible for a second GPU to copy data from the first (geometry, textures, shaders, etc.), decide how to divide the scene, then use its shader cores alongside the other's (perhaps even work on its own frame, later to be composited together).
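As a toy illustration of why that kind of division works, here's a CPU-side sketch with threads standing in for GPUs (all names are made up for illustration; nothing here is a real engine API). Each "GPU" holds a full copy of the scene and renders an independent band of the frame, so the only synchronization needed is the final join:

```cpp
// Illustrative sketch: split one path-traced frame into bands, hand each band
// to a worker standing in for a GPU, then composite. Rays are independent,
// which is why path tracing scales across GPUs far better than a rasterized
// game frame with its tight CPU/GPU synchronization.
#include <thread>
#include <vector>
#include <cstdint>
#include <functional>

struct Scene { /* geometry, textures, shaders... replicated on each GPU */ };

// Stand-in for "GPU n renders this band"; in a real engine this would be a
// dispatch on that device's own command queue.
void RenderBand(const Scene& s, std::vector<uint32_t>& frame,
                int width, int y0, int y1)
{
    for (int y = y0; y < y1; ++y)
        for (int x = 0; x < width; ++x)
            frame[y * width + x] = 0xFF000000; // trace paths for this pixel here
}

int main() {
    const int W = 1920, H = 1080, gpus = 2;
    Scene scene;
    std::vector<uint32_t> frame(W * H);

    // Each "GPU" takes a disjoint horizontal band of the same frame.
    std::vector<std::thread> workers;
    for (int g = 0; g < gpus; ++g)
        workers.emplace_back(RenderBand, std::cref(scene), std::ref(frame),
                             W, H * g / gpus, H * (g + 1) / gpus);
    for (auto& t : workers) t.join();
    // frame now holds the composited result; no inter-GPU sync was needed
    // beyond the final join, unlike AFR in a game engine.
}
```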


16 minutes ago, leadeater said:

Are they just moving away from CrossFire itself and letting DX12 multi-GPU take over instead (if devs want to use it, of course)? Not much point maintaining CrossFire driver support and the issues that brings with it if it can be done natively in the API.

DX12 multi-GPU setups work in two modes, linked and unlinked - and both still require GPU manufacturers to properly support them
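roughly, here's how the two modes surface to an application - a minimal sketch, assuming the usual DXGI/D3D12 enumeration path (untested, error handling omitted): linked GPUs show up as one device with multiple nodes, unlinked ones as separate adapters the app manages itself

```cpp
// Minimal sketch (untested): how linked vs unlinked multi-GPU looks in DX12.
// Linked mode: one ID3D12Device exposing several "nodes" (GetNodeCount() > 1),
//   selected per-resource/per-queue with node masks - the SLI/CF-style path.
// Unlinked mode: every adapter is its own ID3D12Device; the app owns all
//   scheduling, copies and synchronization between them.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP

        ComPtr<ID3D12Device> dev;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&dev))))
            continue;

        // Linked (driver-paired) cards appear as ONE adapter with >1 node:
        wprintf(L"%s: %u node(s)\n", desc.Description, dev->GetNodeCount());
        devices.push_back(dev); // unlinked mode: one device per adapter
    }
    // devices.size() > 1 -> explicit unlinked multi-GPU is on the table,
    // even across vendors - but only if the application actually uses it.
}
```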


15 minutes ago, zMeul said:

DX12 multi-GPU setups work in two modes, linked and unlinked - and both still require GPU manufacturers to properly support them

Not really; as long as the GPU supports DX12, that's it, and the rest is up to the application. To be clear, I don't expect devs to take this up or anything, and it'll be the same story as SLI/CrossFire or worse, since it's now 100% on them rather than part GPU driver and part them.

 

Quote

This blog post is about explicit multi-GPU programming that became possible with the introduction of the DirectX 12 API. In previous versions of DirectX, the driver had to manage multiple SLI GPUs. Now, DirectX 12 gives that control to the application.

https://developer.nvidia.com/explicit-multi-gpu-programming-directx-12

 

I know there are different modes in DX12, but the only one worth worrying about and likely to get used is explicit unlinked, as that is the most abstracted from the GPU and allows Nvidia and AMD GPUs to be in the same system rendering the same game.
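For a sense of what "100% on them" means in practice, here's a rough, untested sketch (error handling omitted) of the kind of plumbing the app owns in explicit unlinked mode: sharing one buffer between two independently created devices, which is what lets an Nvidia and an AMD card work on the same frame.

```cpp
// Hedged sketch (untested): hand one buffer from a D3D12 device on adapter A
// to a D3D12 device on adapter B, the way explicit unlinked multi-adapter
// expects the application to do it.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Resource> ShareBufferAcrossAdapters(
    ID3D12Device* producer, ID3D12Device* consumer, UINT64 bytes)
{
    // 1) Allocate the buffer on the producer in a cross-adapter shareable heap.
    D3D12_HEAP_PROPERTIES props{ D3D12_HEAP_TYPE_DEFAULT };
    D3D12_RESOURCE_DESC desc{};
    desc.Dimension = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width = bytes;
    desc.Height = 1; desc.DepthOrArraySize = 1; desc.MipLevels = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout = D3D12_TEXTURE_LAYOUT_ROW_MAJOR; // required for buffers
    desc.Flags = D3D12_RESOURCE_FLAG_ALLOW_CROSS_ADAPTER;

    ComPtr<ID3D12Resource> src;
    producer->CreateCommittedResource(
        &props, D3D12_HEAP_FLAG_SHARED | D3D12_HEAP_FLAG_SHARED_CROSS_ADAPTER,
        &desc, D3D12_RESOURCE_STATE_COMMON, nullptr, IID_PPV_ARGS(&src));

    // 2) Export it as an NT handle and open it on the second device.
    HANDLE handle = nullptr;
    producer->CreateSharedHandle(src.Get(), nullptr, GENERIC_ALL, nullptr,
                                 &handle);

    ComPtr<ID3D12Resource> dst;
    consumer->OpenSharedHandle(handle, IID_PPV_ARGS(&dst));
    CloseHandle(handle);

    // Synchronization is likewise manual: a fence created with
    // D3D12_FENCE_FLAG_SHARED_CROSS_ADAPTER signals when GPU A is done
    // writing so GPU B may read. The driver no longer hides any of this.
    return dst;
}
```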


12 minutes ago, leadeater said:

I know there are different modes in DX12, but the only one worth worrying about and likely to get used is explicit unlinked, as that is the most abstracted from the GPU and allows Nvidia and AMD GPUs to be in the same system rendering the same game.

no one has actually done it on a mass scale

except for Ashes, which is a showcase benchmark anyways


8 minutes ago, Jito463 said:

Frankly, I've never been fond of multi-GPU setups. For as long as they've been a thing, all I've read about is the issues people have with getting them working. I'd much rather have a single card that does what I need.

But nothing looks as good as multi-GPU setups ;)


1 hour ago, zMeul said:

source: http://www.gamersnexus.net/news-pc/3005-amd-moving-away-from-crossfire-with-rx-vega

 


this is not entirely unexpected

game developers and game engine developers have shown little to no interest in supporting multi-GPU configurations, so this shouldn't come as any surprise

the baffling thing is that with DX12, MS went out of their way to support and implement mixed-GPU setups in the API, although no one actually has a game engine that supports it

my guess is that a few generations from now, SLI and CFX will be removed altogether

So Raja said in early 2015 that multi-GPU was going to see a golden age 3-5 years from "now" (then). That is 1-2 years from now (the present). It takes 3-4 years to make a new game engine from scratch, and usually another year or two on top of that to clean out all the bugs and optimize it... The point GN makes about "no engine supporting Multi Adapter" is true, but there hasn't been enough time for such a feature to be added to a pureblood DX12 engine, because no such thing exists.

 

All existing game engines are either "fixed" DX11 or Mantle-based engines. The closest we get to a DX12 engine is the one used in Ashes of the Singularity: it is DX12, but the core is based on Mantle, an AMD API... and it supports multi-vendor adapters. And it works, funny that.

 

But one engine isn't proof of adoption, or of function. While Ashes is a slow-moving RTS, a game like Battlefield 1 or CoD, with their fast-paced action, may not even work with this type of multi-adapter due to their sensitivity to latency. Yet we haven't seen properly built DX12 engines out there either; most engines just have it bolted on, not built upon.

 

Sooo, when LTT's resident specialist in spreading vitriol and fake news predicts it's dying, well, I don't have a ship large enough to carry all the salt I need for this thread.


1 hour ago, zMeul said:

 

game developers and game engine developers have shown little to no interest in supporting multi-GPU configurations, so this shouldn't come as any surprise

the baffling thing is that with DX12, MS went out of their way to support and implement mixed-GPU setups in the API, although no one actually has a game engine that supports it

my guess is that a few generations from now, SLI and CFX will be removed altogether

 

That's not true at all. Most AAA games have SLI support. Recent games off the top of my head: Dawn of War 3, Ghost Recon Wildlands, Prey, Dishonored 2, Deus Ex: Mankind Divided, Titanfall 2.


Even DX12 games like Deus Ex: Mankind Divided, Hitman and Rise of the Tomb Raider have received multi-GPU patches for the DX12 renderer. Dunno about CrossFire support these days though.

 

Lots of games like Resident Evil 7, Tekken 7 and PlayerUnknown's Battlegrounds have ways to force SLI too.


23 minutes ago, zMeul said:

no one has actually done it on a mass scale

except for Ashes, which is a showcase benchmark anyways

Because Ashes is the only engine ABLE TO HANDLE IT. No other engine is built on a foundation that allows it. Ubisoft, Frostbite, Unreal, Unity, Squeenix -> they all use modified DX11 engines, not native DX12 or Vulkan.


29 minutes ago, zMeul said:

no one has actually done it on a mass scale

except for Ashes, which is a showcase benchmark anyways

Few games support it; Civ 6 does, but I agree it's likely to be rarer than SLI/CrossFire. There are a whopping 8 in total so far.

http://www.tomshardware.com/answers/id-3142821/dx12-vulkan-games-supporting-multi-gpu.html

 

If game engines start supporting it natively in their toolsets and do the work for the devs, then it'll be far more common, but that won't be a thing from the start.


4 minutes ago, Phate.exe said:

So, like, if Raven Ridge utilizes an Infinity Fabric link, just drop a crapton of those iGPUs down on an interposer until you have the performance level you want.

GCN is the best iGPU tech in existence. The problem is they need to compete with Nvidia's dGPU tech.

 

Once you break down the market into the design priorities:

 

Intel: 2W to 10W, and terrible past that.

 

AMD: 5W to 100W, uses a lot more power after that.

 

Nvidia: 300W, and they're good at cutting down the compute cards to consumer level.

 

In a weird way, Nvidia has actually taken the biggest gamble in the GPU space over the last decade. They shifted their design into business applications that pretty much didn't exist when they moved towards CUDA rendering.


That would be kind of a shame; scaling with two GPUs is great, so I see at least that staying. I know multi-GPU is a smaller market and devs may not prioritize it as such, though it's not too much extra work, especially for AAA games.

 

