
2080 Ti X 4 Redshift rendering rig - Air-cooled case suggestions?

Hello everyone,

Long time listener, first time caller. This post is very specific to self-employed VFX workstation users, machine learning developers, and air-cooled multi-GPU madlads.

 

The State of Play

I come from the film, TV, and agency freelance industry. I am a standalone 3D artist and video professional. It's been a pretty slim year, but streaming TV work has kept me going, and I'm looking to upgrade in the next 8 months or so to stay competitive with the agencies that share medium-sized CPU/GPU render-farms and spendy licenses for farm managers like Deadline. I've been saving up my pennies for this upgrade for almost five years!

I need a case that fits a lot of hot components with room to breathe! I am upgrading from a 10-core Intel 6950X to a CPU like the AMD 64-core Threadripper 3990X, if/when they lift the 256GB RAM limit at the next revision!

 

I shoot and edit 10-bit video, composite 3D shots in Nuke, animate in Adobe After Effects, cut in Premiere Pro, grade in DaVinci Resolve, make models for augmented reality, and create VR experiences in Unreal Engine, but I pay most of my bills by modelling, animating, and rendering in Autodesk Maya. Right now I'm managing to make this profitable on a single machine by using Redshift. I use a 2016-vintage Phanteks Enthoo Luxe fitted with an Asus X99-E WS/USB3.1 motherboard, an NVIDIA 1080 Ti, and a pair of NVIDIA 2080 Ti cards connected via an NVLink bridge.

 

More PCI-E lanes are more-better, but 8X vs 16X slot speed isn't currently an issue, as rendering single frames is so slow that the link speed is never the bottleneck. My three cards are in slots 1, 4, and 7. The 1080 Ti can't actually fully seat into slot 7 because the motherboard audio headers and others get in the way! Instead, it lives on a PCI-E riser extension on a little 3D-printed standoff that rests on the PSU at the bottom of the case. If I pack the GPUs next to each other in slots 1, 3, and 6, they overheat badly and the system often crashes. A gap of one PCI-E slot between the open-cooler cards drops their temps from 90°C to 70°C (194°F to 158°F).
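For anyone experimenting with slot spacing like this, a small script makes it easy to log temps per card while a render runs. This is a minimal sketch that assumes `nvidia-smi` is on the PATH; the `warn_at` threshold of 85°C is my own arbitrary pick, not an NVIDIA limit.

```python
# Quick per-GPU temperature check for multi-GPU spacing experiments.
# Assumes nvidia-smi is on PATH; warn_at is an arbitrary threshold.
import subprocess

def gpu_temps(output=None, warn_at=85):
    """Return {gpu_index: temp_C}, flagging cards at or above warn_at.

    If 'output' is None, query nvidia-smi; otherwise parse the given
    CSV text (handy for testing on a machine without NVIDIA GPUs).
    """
    if output is None:
        output = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=index,temperature.gpu",
             "--format=csv,noheader"],
            text=True)
    temps = {}
    for line in output.strip().splitlines():
        idx, temp = (field.strip() for field in line.split(","))
        temps[int(idx)] = int(temp)
    for idx, temp in temps.items():
        if temp >= warn_at:
            print(f"GPU {idx}: {temp} C - consider more slot spacing")
    return temps
```

Run it in a loop (or under `watch`) during a Redshift bucket render to see which card is suffocating.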

 

Hopeful Requirements

• A PC case that fits an SSI-CEB (phatboy ATX) motherboard

• An X399 motherboard

• Noctua NH-U14S TR4-SP3 or another extreme-high-end air cooler

• 4x NVIDIA 2080 Ti GPUs, or 2x 2080 Ti graphics cards plus 2x future 3080 Ti cards

• Air-cooling is preferable to water-cooling for the sake of longevity and service-life. An AiO CPU cooler is absolutely fine, as long as it's not made by Enermax!

 

Possible options

• Nanoxia 4 x GPU case

• Nanoxia Hydra II Rev. B 8 GPU case

 

PCI-E slot riser cables really don't bother me, as long as they're properly shielded and don't get in the way of anything crucial. The GPUs are pure computation units, and 8X speed doesn't seem to upset my other video applications. In tandem with this point, I feel that a cryptocoin mining board with non-standard connections wouldn't be suitable. Naked PCI-E connections would be better.
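One practical note on risers: a marginal cable can silently negotiate down to a narrower link. On Linux you can check the trained width against the maximum from sysfs. A minimal sketch, assuming the standard `/sys/bus/pci/devices` layout on modern kernels (the device address in the comment is just an example):

```python
# Check negotiated PCIe link widths behind riser cables (Linux sysfs).
# Assumes the standard /sys/bus/pci/devices layout; not all devices
# expose these attributes, so missing files are simply skipped.
from pathlib import Path

def link_widths(sysfs_root="/sys/bus/pci/devices"):
    """Map PCI address (e.g. '0000:01:00.0') -> (current_width, max_width)."""
    widths = {}
    for dev in Path(sysfs_root).iterdir():
        cur = dev / "current_link_width"
        mx = dev / "max_link_width"
        if cur.exists() and mx.exists():
            widths[dev.name] = (int(cur.read_text()), int(mx.read_text()))
    return widths

def degraded(widths):
    """Devices training below their maximum width (e.g. a flaky riser)."""
    return {addr: w for addr, w in widths.items() if w[0] < w[1]}
```

A GPU deliberately run at 8X will show as "degraded" here too, so it's a sanity check, not an alarm: the point is to confirm the width you expected is the width you got.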

 

The Big Ask here is for advice on building a workstation that can act as a GPU rendering server, and doesn't have to rely on 11,000 RPM bloweymatron noisemakers to feed the 4 GPUs with cool air!

 

Do I need to just suck it up and build a case out of aluminium rail and 3D printed parts to house 4 air-cooled GPUs with a bit of breathing space?!

 

If I'm being silly and asking for the moon on a stick, please tell me, as I understand this is pretty niche!


Which 1080 Ti / 2080 Ti cards are they? Generally, blower-style cards do better in many-GPU setups.

 

Other than that look for older-style cases with side panel fans, or just custom watercool the whole mess.

 

Also, why X399? The 2990WX only has 32 cores and latency issues. I also don't know which X399 boards have the space for quad-GPU.


1 minute ago, Grabhanem said:

Which 1080 Ti / 2080 Ti cards are they? Generally, blower-style cards do better in many-GPU setups.

 

Other than that look for older-style cases with side panel fans, or just custom watercool the whole mess.

 

Also, why X399? The 2990WX only has 32 cores and latency issues. I also don't know which X399 boards have the space for quad-GPU.

- 1 x 1080 Ti Founders Edition blower-style card

- 2 x 2080 Ti Founders Edition open-cooler cards

These were all bought from the NVIDIA store directly for reasons relating to warranty and GPU binning. 

I'm right there with you: a multi-core workstation CPU with PCI-E lanes for days, and the motherboards all have 4 slots(!)

 

Quote

Also, why X399? The 2990WX only has 32 cores and latency issues. I also don't know which X399 boards have the space for quad-GPU.

Whoops! I messed up, sorry! I originally posted about the 32 core 2990WX and I meant the 64 core 3990X! (I'll correct the post, and thank you for spotting that!)

As for latency, I am using software that has a viewport refresh of about 60FPS or above, which is acceptable for interactive modelling tasks. Am I making a mistake somewhere?

 

I specify the 64-core because I normally render at 3840x2160, but occasionally I have to make standees or banners at print resolution, which can be 20,000 or 30,000 pixels wide. I very quickly hit out-of-memory errors on the GPUs, so I rebuild the scene's shaders for the Arnold CPU renderer and render overnight. No memory issues, just lots of time!
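It's easy to see why those print jobs blow past 11 GB of VRAM with some back-of-envelope arithmetic. A rough sketch, assuming a single float RGBA framebuffer (4 channels at 4 bytes each); real renderers hold many more buffers plus geometry and textures, so treat these numbers as a floor:

```python
# Rough VRAM arithmetic for print-resolution renders, plus the tile
# grid needed to fit a given memory budget. Assumes one float RGBA
# framebuffer (16 bytes/pixel); real renderers need far more.
import math

def framebuffer_gib(width, height, channels=4, bytes_per_channel=4):
    """GiB for a single uncompressed framebuffer at this resolution."""
    return width * height * channels * bytes_per_channel / 2**30

def tile_grid(width, height, budget_gib, channels=4, bytes_per_channel=4):
    """Smallest n x n split whose tiles each fit within budget_gib."""
    n = 1
    while framebuffer_gib(math.ceil(width / n), math.ceil(height / n),
                          channels, bytes_per_channel) > budget_gib:
        n += 1
    return n

# A 30,000 x 20,000 banner needs ~8.9 GiB for one float RGBA buffer
# alone, before geometry and textures touch the card.
```

Splitting a huge still into render regions and stitching afterwards is one way to keep such jobs on the GPUs, at the cost of some wrangling per frame.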


It seems like AMD won't release their Threadripper Pro lineup to consumers; it'll be OEM-only. The only place you can get it atm is by buying a Lenovo ThinkStation, whose chassis is rather cramped for your demands. Other OEMs will probably sell such products eventually, but you're out of luck if you want a custom motherboard in a custom case.

 

The fact that you don't have blower-style cards makes it really hard to stack many of them, and the only products NVIDIA still seems to release in that format are their Quadros, whose buyers are far more likely to run many cards at once than the GeForce crowd.


11 hours ago, MaxyW said:

Hopeful Requirements

• A PC case that fits an SSI-CEB (phatboy ATX) motherboard

• An X399 motherboard

• Noctua NH-U14S TR4-SP3 or another extreme-high-end air cooler

• 4x NVIDIA 2080 Ti GPUs, or 2x 2080 Ti graphics cards plus 2x future 3080 Ti cards

• Air-cooling is preferable to water-cooling for the sake of longevity and service-life. An AiO CPU cooler is absolutely fine, as long as it's not made by Enermax!

Sounds like you might be interested in the newly released Enthoo Pro II:

http://www.phanteks.com/Enthoo-Pro2-TemperedGlass.html


"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

 RGB Build Post 2019 --- Rainbow 🦆 2020 --- Velka 5 V2.0 Build 2021

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- Project ITNOS

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866mhz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage Samsung EVO 250GB, Samsung EVO 1TB, WD Black 3TB, WD Black 5TB    PSU Corsair CX750M    Cooling Cryorig H7 with NF-A12x25

Link to comment
Share on other sites

Link to post
Share on other sites

12 hours ago, MaxyW said:

- 1 x 1080 Ti Founders Edition blower-style card

- 2 x 2080 Ti Founders Edition open-cooler cards

These were all bought from the NVIDIA store directly for reasons relating to warranty and GPU binning. 

I'm right there with you: a multi-core workstation CPU with PCI-E lanes for days, and the motherboards all have 4 slots(!)

 

Whoops! I messed up, sorry! I originally posted about the 32 core 2990WX and I meant the 64 core 3990X! (I'll correct the post, and thank you for spotting that!)

As for latency, I am using software that has a viewport refresh of about 60FPS or above, which is acceptable for interactive modelling tasks. Am I making a mistake somewhere?

 

I specify the 64-core because I normally render at 3840x2160, but occasionally I have to make standees or banners at print resolution, which can be 20,000 or 30,000 pixels wide. I very quickly hit out-of-memory errors on the GPUs, so I rebuild the scene's shaders for the Arnold CPU renderer and render overnight. No memory issues, just lots of time!

Stacking open-air cards generally doesn't go well, for the already-discussed reasons -- you could try it, but thermals would likely be awful.

 

If you're running the 64-core, you'll need a TRX40 board -- the pinouts of 1st/2nd-gen Threadripper are different from 3rd gen.

 

For boards, the TRX40 Designare and Creator are the best for multi-GPU, I think -- the Aorus Xtreme might support it as well, but it's more expensive for very little gain. The Designare also takes up 9 slots, whereas the Creator "only" uses 8 (of course this can be worked around with vertical mounts).

