
AM5 motherboard with support for 3 GPUs

Hi, I am looking into building a new PC for gaming virtual machines and/or computation, with 3 GPUs (no need for SLI or CrossFire).

The CPU will be some 79xx/X variant, depending on the exact needs. I need the integrated GPU for the underlying Windows OS (yes, those will be Windows VMs through Hyper-V), which is why I'm set on AM5; Intel, with its mix of E and P cores, isn't a good choice for VMs.
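The plan is to hand each VM its own card through Hyper-V's Discrete Device Assignment. For reference, a minimal sketch of those steps, assuming a host SKU that actually supports DDA (Windows Server's Hyper-V); the VM name and GPU filter below are placeholders:

```powershell
# Sketch only: DDA needs a host SKU that supports it (Windows Server Hyper-V).
# "GamingVM" and the "*RTX 3080*" filter are hypothetical placeholders.
$vm  = "GamingVM"
$dev = Get-PnpDevice -Class Display -PresentOnly |
       Where-Object FriendlyName -like "*RTX 3080*" |
       Select-Object -First 1

# The PCI location path is what the DDA cmdlets key off of
$locationPath = ($dev | Get-PnpDeviceProperty -KeyName DEVPKEY_Device_LocationPaths).Data |
                Where-Object { $_ -like "PCIROOT*" } |
                Select-Object -First 1

# Pull the GPU off the host (the VM must be powered off for the assignment)
Disable-PnpDevice -InstanceId $dev.InstanceId -Confirm:$false
Dismount-VMHostAssignableDevice -LocationPath $locationPath -Force

# GPUs need extra MMIO space in the guest; figures from Microsoft's DDA guidance
Set-VM -Name $vm -AutomaticStopAction TurnOff -GuestControlledCacheTypes $true `
       -LowMemoryMappedIoSpace 3Gb -HighMemoryMappedIoSpace 33280Mb

Add-VMAssignableDevice -LocationPath $locationPath -VMName $vm
```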

Looking at the different AM5 motherboards, I was not able to find one that would fit three 2-slot (minimum) GPUs without their cooling being choked, or without having to water cool them (I might need a couple of such computers, and water cooling is not an option).

At first I thought the MSI PRO B650-P would work, but sadly 2 of the 4 PCIe x16 slots are actually only x1 (I know I don't need the entire x16; even x8 would probably be more than enough).

I am expecting to use a specific case so I can utilize the PCIe slot at the very bottom of the board; the issue is usually the spacing between slots 2 and 3.

Is there a board I am overlooking that would fit 3x air-cooled RTX 3080s or similar GPUs (a 4080 or 4090 would be nice, but I think those are just too big)? Or am I forced to go with server hardware, like some older Xeons (the prices are quite high, so I would like to avoid that)?

Thanks.


If you want AM5 it will probably have to be an X670E board. I believe some have a 3rd PCIe 4.0 x8 slot. You can just run all three cards at x8.
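For a rough sense of what those link widths give you, here's a back-of-the-envelope calc from the standard PCIe 4.0 rate (16 GT/s per lane with 128b/130b encoding); in most gaming workloads the drop from x16 to x8 is only a few percent:

```powershell
# Back-of-the-envelope usable PCIe 4.0 throughput per link width.
# 16 GT/s per lane, 128b/130b encoding, 8 bits per byte.
$perLaneGBs = 16 * (128 / 130) / 8            # ~1.97 GB/s per lane
foreach ($width in 16, 8, 4) {
    "x{0,-2}: {1:N1} GB/s" -f $width, ($perLaneGBs * $width)
}
# x16: ~31.5 GB/s, x8: ~15.8 GB/s, x4: ~7.9 GB/s
```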



If you can run them with only PCIe 4.0 x4, and get a taller case so you can fit one GPU in the very bottom slot, the ASRock B650 LiveMixer could work: https://www.asrock.com/mb/AMD/B650 LiveMixer/Specification.us.asp#Specification. It has three full-length x16 slots: the top runs x16 from the CPU, the 2nd runs 4.0 x4 from the CPU, and the 3rd runs 4.0 x4 from the chipset.


Yes, I looked at multiple of them, and I honestly do not understand who designed them: most have a huge gap for M.2 drives between slots 1 and 2, and then slot 3 immediately after, so unless you have single-slot GPUs, you can't fit more than 2...

Currently I am thinking about the ASRock X670E PG LIGHTNING or the ROG STRIX X670E-E GAMING. Both seem to have only 2 slots of space between slots 2 and 3, so I would probably need to find a specific GPU that fits and doesn't overheat.

5 minutes ago, Zando_ said:

ASRock B650 LiveMixer could work

I did look at that one, but it's sadly not available in central EU where I live (I guess I should've mentioned it, sorry).

Also, I forgot to mention: built-in 2.5 Gbit LAN would be preferred.


Just now, MiChAeLoKGB said:

Yes, I looked at multiple of them, and I honestly do not understand who designed them: most have a huge gap for M.2 drives between slots 1 and 2, and then slot 3 immediately after, so unless you have single-slot GPUs, you can't fit more than 2...

People who look at the current market. Mainstream users usually only have 1 GPU, and GPUs are often stupidly thick now, so any PCIe slot too close to the top (where the GPU always goes) would just be covered. So they shove the other slots down the bottom to make sure you can at least add a couple of single-slot PCIe cards, and put M.2 slots in the empty space to avoid wasting it.

2 minutes ago, MiChAeLoKGB said:

I did look at that one, but it's sadly not available in central EU where I live (I guess I should've mentioned it, sorry).

Darnit.

2 minutes ago, MiChAeLoKGB said:

Both seem to have only 2 slots of space between slots 2 and 3, so I would probably need to find a specific GPU that fits and doesn't overheat.

... if you're willing to go 30-series, I think ASUS made some Turbo (blower) models of those; I know they made 2080 Tis for sure. They do get hot, but blowers deal better with being sandwiched. An axial-fan card can also do okay with enough air jammed into it by intake fans: I ran my 780 Classifieds (2-slot cards, ~250-320 W depending on the OC) back to back, and the top one got toasty but didn't throttle. Otherwise you bite the bullet and slap the cards on waterblocks, but that adds ~$100 or more per card for blocks, plus the cost of the loop itself. If you go for old Xeons like you mentioned in the OP, you'll likely run into the same issue there, as 2-slot cards were the standard and most PCIe slot layouts account for that. My X58/79/99/299 boards that support 3 or 4 GPUs all use 2-slot spacing for the main GPU slots.


4 minutes ago, Zando_ said:

People who look at the current market. Mainstream users usually only have 1 GPU, and GPUs are often stupidly thick now, so any PCIe slot too close to the top (where the GPU always goes) would just be covered. So they shove the other slots down the bottom to make sure you can at least add a couple of single-slot PCIe cards, and put M.2 slots in the empty space to avoid wasting it.

True, I guess this use case is quite niche, and the 4090 is quite thicccc.

4 minutes ago, Zando_ said:

... if you're willing to go 30-series, I think ASUS made some Turbo (blower) models of those; I know they made 2080 Tis for sure. They do get hot, but blowers deal better with being sandwiched. An axial-fan card can also do okay with enough air jammed into it by intake fans: I ran my 780 Classifieds (2-slot cards, ~250-320 W depending on the OC) back to back, and the top one got toasty but didn't throttle. Otherwise you bite the bullet and slap the cards on waterblocks, but that adds ~$100 or more per card for blocks, plus the cost of the loop itself. If you go for old Xeons like you mentioned in the OP, you'll likely run into the same issue there, as 2-slot cards were the standard and most PCIe slot layouts account for that. My X58/79/99/299 boards that support 3 or 4 GPUs all use 2-slot spacing for the main GPU slots.

I think the cards will be in the 3060 to 3080 range (inclusive), since the 4xxx cards are just too hot. But the plan is to have a larger case with a lot of intake fans at the front and bottom and exhaust at the top and back, aiming for positive pressure.

Well crap, I just checked and the ASRock X670E PG LIGHTNING has its last PCIe x16 slot running only at x1. The ProArt X670E-CREATOR has its 3rd slot only at x2, so the only option I can see (among AM5 boards) is the ROG STRIX X670E-E GAMING, which has 2x PCIe 5.0 x16 slots (supporting x16, x8 and x4) and 1x PCIe 4.0 x16 slot supporting x4 mode.

The only slight issue is that the darn board costs as much as the CPU I would put in it.


Just now, MiChAeLoKGB said:

I think the cards will be in the 3060 to 3080 range (inclusive), since the 4xxx cards are just too hot. But the plan is to have a larger case with a lot of intake fans at the front and bottom and exhaust at the top and back, aiming for positive pressure.

Define "too hot". In a lot of tasks the 4090 is ~2x as fast as a 3090 at slightly more power draw. If you're going to be running multiple lower-end cards, you may end up with higher heat output than just running a single, stupidly fast card. If you're planning to pass a card or cards through to VMs, you also need to confirm the motherboard has solid IOMMU groupings that will let you pass through just the device you want.
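One quick way to eyeball that on a Windows host is to dump each GPU's PCI location path and check that each card sits behind its own root port; Microsoft's Virtualization-Documentation repo also has a SurveyDDA.ps1 script that reports per-device DDA readiness. A sketch with the stock PnpDevice cmdlets:

```powershell
# List present GPUs with their PCI location paths; for clean passthrough,
# each card should have a distinct PCIROOT(...)#PCI(...) prefix (its own root port).
Get-PnpDevice -Class Display -PresentOnly | ForEach-Object {
    $lp = ($_ | Get-PnpDeviceProperty -KeyName DEVPKEY_Device_LocationPaths).Data |
          Where-Object { $_ -like "PCIROOT*" } | Select-Object -First 1
    [pscustomobject]@{ Device = $_.FriendlyName; LocationPath = $lp }
} | Format-Table -AutoSize
```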

Reading back through the OP more carefully... why would you need a few of these machines? And why is water cooling not an option? I know it's expensive, but if running a stack of GPUs is an option, then a simple loop likely is as well. If you can explain more about what you're trying to do and why, people can probably provide better solutions, or additional perspective you won't have considered otherwise.


6 minutes ago, Caroline said:

I've seen ugly boards before but man this one is really trying.

Yep, it's horrible; someone threw condiments at a wall and decided to make a motherboard out of it. At least good ole Fractal still makes proper cases with solid side panels, so you don't have to look at it once it's done :old-laugh:.


20 minutes ago, Zando_ said:

Define "too hot".

I meant too hot to have three of them in a single case, potentially all going full bore at once.

20 minutes ago, Zando_ said:

Reading back through the OP more carefully... why would you need a few of these machines? And why is water cooling not an option? I know it's expensive, but if running a stack of GPUs is an option, then a simple loop likely is as well. If you can explain more about what you're trying to do and why, people can probably provide better solutions, or additional perspective you won't have considered otherwise.

They are basically for gaming Windows VMs (or a computation box if needed, with no VMs in that case, of course). They won't all be in one place and might not be checked constantly for leaks, etc. So water cooling would not only add cost but also liability if water leaks somewhere.

20 minutes ago, Caroline said:

VMs in Windows? You sure like to suffer, huh? I tried, and it's terrible; consider Linux for the host OS.

The slot spacing on consumer motherboards is usually done that way on purpose, so you CAN'T fit more than 2 cards at the same time. I ran into this issue with AM4 and considered AM5 too, but there was another problem: PCIe lanes. Most CPUs don't have enough of them for more than 2 cards, and motherboards will also cap the slots at x8 or even x4 depending on how the slots are used and whether an M.2 SSD is present.

The chipset adds lanes, but they're usually for x1/x4 slots and other "minor" components.

I'm using 4 cards right now, but they're all 2 slots wide, and I had to use a small riser ribbon for the sound card because there was no other way. This is a workstation mobo though (a Supermicro M12SWA-TF): it comes with 6 slots and is made for portless compute cards or single-slot-wide cards, but it works with normal graphics cards too.

There are sadly some things I can't change, and Windows as the host OS is one of them.

Looking at the boards and the requirements of a card like the RTX 3080, I don't think even x4 mode will be enough for it, so that might not even work, and I am not sure I want to go the Threadripper, EPYC or Xeon route. 2 cards is IMO not that much: if you want to run, let's say, 6 VMs, you now need 3 machines instead of just 2.


1 hour ago, MiChAeLoKGB said:

They are basically for gaming Windows VMs (or a computation box if needed, with no VMs in that case, of course). They won't all be in one place and might not be checked constantly for leaks, etc. So water cooling would not only add cost but also liability if water leaks somewhere.

That's... a really weird use case. Building a gaming-VM machine has different requirements from a compute box; typically you'd build one or the other.

1 hour ago, MiChAeLoKGB said:

if you want to run, let's say, 6 VMs, you now need 3 machines instead of just 2

If you need a GPU per VM, then you need 2-3 machines anyway. To fit 6 GPUs in any system, they'd need to be single-slot, which means either cards so slow they're basically display outputs/light accelerators, or putting everything on water.


12 minutes ago, Zando_ said:

That's... a really weird use case. Building a gaming-VM machine has different requirements from a compute box; typically you'd build one or the other.

Yeah, the main usage would be VMs, but they could be used for GPU compute if "needed" (i.e., on free hardware not used by any VMs).

12 minutes ago, Zando_ said:

If you need a GPU per VM, then you need 2-3 machines anyway. To fit 6 GPUs in any system, they'd need to be single-slot, which means either cards so slow they're basically display outputs/light accelerators, or putting everything on water.

If you are putting them into a server rack, you could have 5 or 6 GPUs no problem, but I need this to be a normal case you could have somewhere at home.

So basically it's 2 GPUs max, or go with server HW.


  • 7 months later...
On 1/24/2023 at 11:17 PM, Zando_ said:

That's... a really weird use case. Building a gaming-VM machine has different requirements from a compute box; typically you'd build one or the other.

If you need a GPU per VM, then you need 2-3 machines anyway. To fit 6 GPUs in any system, they'd need to be single-slot, which means either cards so slow they're basically display outputs/light accelerators, or putting everything on water.

I think the RTX A5000 can be used as a virtualised GPU. I'm looking at a couple of these plus some A4000s for render nodes, and while doing some deeper digging on the A5000 I learned it has some other abilities. Maybe two of these dual-slot cards would be more than enough (if the virtualisation allows their resources to be split): a lot more VRAM than 3090s, comparable to 40-series cards in terms of performance, more tensor cores, and a lot less power draw.
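If the idea is splitting one card's resources across VMs: proper vGPU on the A5000 requires NVIDIA's licensed vGPU software, while the unlicensed route on a Hyper-V host is GPU partitioning (GPU-P). A rough sketch with the stock cmdlets, where the VM name and VRAM figures are placeholders rather than tuned values:

```powershell
# Sketch: Hyper-V GPU partitioning (GPU-P). "RenderVM" is a hypothetical VM name.
# First, see which installed GPUs the host considers partitionable at all.
Get-VMHostPartitionableGpu | Select-Object Name, ValidPartitionCounts

$vm = "RenderVM"
Add-VMGpuPartitionAdapter -VMName $vm
Set-VMGpuPartitionAdapter -VMName $vm `
    -MinPartitionVRAM 80000000 -MaxPartitionVRAM 100000000 -OptimalPartitionVRAM 100000000
# Partitioned GPUs also want extra MMIO space reserved in the guest
Set-VM -Name $vm -GuestControlledCacheTypes $true `
    -LowMemoryMappedIoSpace 1Gb -HighMemoryMappedIoSpace 32Gb
```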

