We just finished & photographed the build of our 7-GPU Hybrid Supercomputer Render Rig with 7X RTX 2080s from EVGA. Designed for Octane Render, it's powered by a Xeon W chip on a Gigabyte server board, with 128GB of ECC RAM and 2X 1600W PSUs.
If it's not a world first of its type, it's at least quite rare -- as getting all 7 GPUs functional at their maximum theoretical PCIe bandwidth [x16/x8/x16/x8/x16/x8/x16 across the seven slots] is a huge, annoying challenge. It involves tweaking some deeply buried settings related to PCIe compatibility.
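For anyone attempting a similar build, a quick sanity check is to confirm that each card actually negotiated the link width its slot is supposed to provide. Here's a minimal Python sketch that shells out to nvidia-smi (assumed to be on the PATH); the expected-width list mirrors this board's alternating x16/x8 layout. Note that cards can drop to a narrower link at idle to save power, so check while the GPUs are under load:

```python
# Sketch: confirm each GPU negotiated its expected PCIe link width.
# Assumes nvidia-smi is on the PATH. Expected widths below follow the
# alternating x16/x8 slot layout; GPU index order may not match the
# physical slot order on your board, so treat mismatches as a prompt
# to investigate, not proof of a problem.
import subprocess

EXPECTED_WIDTHS = [16, 8, 16, 8, 16, 8, 16]  # hypothetical per-slot layout

def query_link_widths():
    """Return (index, name, current_width, max_width) for each GPU."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=index,name,pcie.link.width.current,pcie.link.width.max",
        "--format=csv,noheader",
    ], text=True)
    rows = []
    for line in out.strip().splitlines():
        idx, name, cur, mx = [field.strip() for field in line.split(",")]
        rows.append((int(idx), name, int(cur), int(mx)))
    return rows

if __name__ == "__main__":
    for idx, name, cur, mx in query_link_widths():
        expected = EXPECTED_WIDTHS[idx] if idx < len(EXPECTED_WIDTHS) else mx
        status = "OK" if cur >= expected else "DEGRADED"
        print(f"GPU {idx} ({name}): x{cur} of x{mx} -- {status}")
```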
Usually we build high-end desktops, but we couldn't resist trying this out. It's going to save the end user A LOT of money on AWS render costs.
Why liquid cool? It's going to run in an office, so keeping it extra quiet is helpful.
Why 7 GPUs? Split them across two 4-GPU rigs and you pay twice for the motherboard, case, RAM, AIO, and drive... so a single 7-GPU rig saves a few thousand bucks.
It easily outperforms an Amazon cloud server that costs MANY times more than it does.
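To make the cost argument concrete, here's a back-of-the-envelope sketch. Every number in it is a hypothetical placeholder, not a quote from this build -- plug in real hardware and cloud prices before drawing your own conclusions:

```python
# Back-of-the-envelope cost comparison. ALL prices are hypothetical
# placeholders for illustration -- substitute real quotes.

# Parts you'd have to buy twice if you split the GPUs across two rigs:
SHARED_PARTS = {
    "motherboard": 600,   # hypothetical
    "case/frame": 300,    # hypothetical
    "ram_128gb": 800,     # hypothetical
    "aio_cooler": 120,    # hypothetical
    "nvme_drive": 150,    # hypothetical
}
duplicated_cost = sum(SHARED_PARTS.values())
print(f"Duplicated parts avoided by one 7-GPU rig: ~${duplicated_cost}")

# Cloud vs. local break-even:
rig_cost = 12_000            # hypothetical all-in build cost
cloud_rate_per_hour = 3.00   # hypothetical GPU-instance hourly rate
break_even_hours = rig_cost / cloud_rate_per_hour
print(f"Rig pays for itself after ~{break_even_hours:,.0f} render hours")
print(f"...about {break_even_hours / (8 * 22):,.1f} months at 8h/day")
```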
Basic SPECS & PARTS:
7X EVGA RTX 2080 Hybrids (2,944 CUDA cores each), for a total of just over 20,000 CUDA cores.
Xeon W-2125 (any Xeon W series chip will work on this board)
Gigabyte MW51-HP0 motherboard -- dual gigabit [not 10GbE] networking and seven x16-length PCIe slots!
128GB Samsung ECC server RAM
Corsair H100 AIO cooler (large air coolers would not fit under the thick risers!)
7X Thermaltake risers
2X EVGA 1600W PSUs
Custom aluminum frame (adapted from a mining frame & modified to support the huge, heavy radiators)
3D-printed bracket and PSU adapter to join the two PSUs; the master PSU activates the slave when it's powered on.
Posted this to Reddit last week and got a lot of positive responses; a few folks suggested we post here as well! So here we are.
Questions/comments let me know...
Thanks for looking,