
Single slot water block 3090s?

Hey guys! I'm building the ultimate rendering / 3D graphics workstation.

 

I plan on using this motherboard: https://www.asus.com/Motherboards-Components/Motherboards/Workstation/Pro-WS-WRX80E-SAGE-SE-WIFI/ $1,000

With a Threadripper Pro 3975WX

 

This motherboard and CPU combo is the only one I have found that supports six PCIe 4.0 x16 GPUs plus one at x8.
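As a sanity check on that claim, here's a rough lane-budget sketch. The 128-lane figure is the Threadripper Pro (sWRX8/WRX80) platform total; the slot widths are the ones claimed above, not taken from the board's manual:

```python
# Rough PCIe lane budget for the claimed slot layout on a
# Threadripper Pro (sWRX8) board. Slot widths are assumptions
# based on the claim above, not the board's spec sheet.
PLATFORM_LANES = 128      # PCIe 4.0 lanes on Threadripper Pro

slots = [16] * 6 + [8]    # six x16 slots plus one x8 slot
gpu_lanes = sum(slots)

print(f"GPU slots use {gpu_lanes} of {PLATFORM_LANES} lanes, "
      f"leaving {PLATFORM_LANES - gpu_lanes} for NVMe/USB/chipset")
# → GPU slots use 104 of 128 lanes, leaving 24 for NVMe/USB/chipset
```

On paper the layout fits, which is why a 128-lane platform like this is needed to feed seven GPUs at those widths without PCIe switches.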

 

I plan on using risers with a custom case; however, I'm curious whether anyone can find a solution that would let me run the 3090s without risers, water cooled of course. I don't believe the FE would work because of its side power connector, so any advice/ideas you have would be awesome!

 

FYI, I'm going with 3090s for their 24 GB of VRAM, because the 12 GB in the 3080 Ti isn't enough for my workload.

 

 

Thanks!

Computers r fun


I'm not going to be able to help you, but may I ask what kind of rendering you're doing? How many 3090s are you using?

👀👀👀👀 

2 minutes ago, adarw said:

I'm not going to be able to help you, but may I ask what kind of rendering you're doing? How many 3090s are you using?

I'd like to use as many as physically possible. My company has had huge success recently in the NFT space, and I want to invest in upgrades. My current setup is a 5800X with two FE 3090s.

 

Rendering consists primarily of Maya/Blender, sometimes other software, depending on the project. However, in all the software I use, each GPU is used at 100%, so 7 cards are 7 times faster than 1.

Computers r fun


If you are doing heavy memory tasks, the issue with the 3090s is that you might need active cooling for the back of the card.

EKWB makes an active backplate, and you could get a joiner block that might sandwich both into two slots.

EKWB should be able to help you find a compatible option.

 

https://www.ekwb.com/shop/ek-quantum-vector-re-rtx-3080-3090-active-backplate-d-rgb-plexi

You would need a dual-card FC terminal to bolt across the two, and you can do in and out on each side.

 

As the example image shows, you can get custom joiner terminals to join many cards together...

[image: multiple GPUs joined with custom EK terminals]

CPU | Intel i7-8086K Overclocked 5.4Ghz | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM | 4x8GB Corsair Vengeance 3200MHz |MOTHERBOARD | Asus ROG Maximus X Formula | STORAGE | 2x Samsung Evo 970 256GB NVME  | COOLING | Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity  | MONITOR | Samsung G9 Neo | OS | Windows 10


Power requirements for seven 3090s would be crazy. Does it need to be a single machine?

Couldn't you just replicate your current 5800X + 2x 3090 setup multiple times and distribute the work?


This would be one of the cases where I'd suggest just building a couple of PCs.

 

Why?

 

The 3090 NEEDS ACTIVE COOLING ON THE BACK. It will literally DIE if it's not cooled properly, so stacking them against each other is a BIG NO.

Since you are doing GPU rendering, the CPU just does not matter.

For rendering purposes, a 4.0 x16 link is not needed at all.

Power requirements will be insane; this would be a triple-PSU setup. No case fits that.

Multiple machines will also just be cheaper.

 

You can get multiple triple-3090 + 5600G systems for less, and they'll also be functional and not a mess of a system.

 

Let the over-the-top systems Linus built be a lesson in why you really can have too much of a nice thing; it's often better to simply split up the task.

7 hours ago, Zagna said:

Power requirements for seven 3090s would be crazy. Does it need to be a single machine?

Couldn't you just replicate your current 5800X + 2x 3090 setup multiple times and distribute the work?

 

6 hours ago, jaslion said:

~SNIP~

Can't do multiple systems. The goal is to have a single system for 3D graphics development. This machine is used by 3D artists to create artwork before it gets sent off to the render farm for the long renders.

 

For power, I already have a couple of 500-watt, 12-volt-only server PSUs, the kind of thing used for Bitcoin mining.

 

As for the high bandwidth requirements: this matters because this will be an artist machine where they will be previewing changes to lighting, materials, etc., and we just want the absolute fastest possible viewport performance. I'd be okay with all cards running at x8, but the next-best option after the board described above gives significantly less than that.

 

I have already verified that viewport performance scales linearly with multiple GPUs.

 

 

Sounds like water cooling these things isn't a viable option. My original plan was to build a custom case that lets all of the cards keep their original coolers and run on PCIe extenders, placed so that they all get cool, fresh air.

 

Also, the CPU usage will be heavy because of what it takes to coordinate between the 3090s.

Computers r fun

12 hours ago, Maticks said:

~SNIP~

I think the issue here with backplates is that the GPUs won't fit in a single slot.

 

I think water cooling isn't the way to go.

Computers r fun

3 minutes ago, TheNuzziNuzz said:

 

~SNIP~

Almost all 3D viewport software shows no difference beyond margin of error between an x8 and an x16 link on 3090s.

 

If that is all you have for power, you'll need at MINIMUM 8 of those PSUs, as a 3090 is technically not satisfied by a 500 W unit. You REALLY should re-evaluate the component choices here; 500 W PSUs are not going to cut it.
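That "minimum 8" figure checks out as a back-of-the-envelope calculation. The per-card wattage and transient headroom below are ballpark assumptions, not measurements:

```python
import math

# Rough power-budget sketch. Wattages are ballpark assumptions:
# ~350 W sustained per stock 3090, with headroom for transient spikes.
GPU_COUNT = 7
GPU_WATTS = 350          # assumed sustained draw per 3090
HEADROOM = 1.5           # assumed margin for transients and efficiency
PSU_WATTS = 500          # the server PSUs mentioned above

needed = GPU_COUNT * GPU_WATTS * HEADROOM
psus = math.ceil(needed / PSU_WATTS)
print(f"{needed:.0f} W budget -> at least {psus} x {PSU_WATTS} W PSUs")
# → 3675 W budget -> at least 8 x 500 W PSUs
```

And that's before the Threadripper Pro, storage, and fans are counted.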

 

Bandwidth really just doesn't matter; the GPU takes so long to calculate a frame that it does not need that bandwidth. Same for viewport performance; bandwidth has never really been an issue there.

 

Viewport performance does scale, BUT it is limited, and at a certain point more GPUs cause a performance loss, as the CPU needs to split too many instructions between the cards. I've seen this happen personally on quad-GPU rigs. It all depends on the software.
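That diminishing-returns behaviour can be illustrated with a toy model. The fixed per-GPU dispatch overhead below is an invented number purely for illustration, not a measurement from any renderer:

```python
# Toy scaling model: each extra GPU adds a fixed CPU dispatch cost,
# so speedup flattens as the card count grows. The 2% overhead per
# extra card is an assumption for illustration only.
def speedup(gpus, dispatch_overhead=0.02):
    # time per frame = parallel work / gpus + per-GPU coordination cost
    return 1.0 / (1.0 / gpus + dispatch_overhead * (gpus - 1))

for n in (1, 2, 4, 7):
    print(n, round(speedup(n), 2))
# → 1 1.0 / 2 1.92 / 4 3.23 / 7 3.8
```

Under this (invented) overhead, seven cards deliver well under 7x; the real curve depends entirely on the software, as noted above.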

 

You also mentioned artists. So multiple people will be using this computer at a time?

 

As for the case, forget about that. Get a mining rig frame for this thing.

 

As for CPU usage: a couple of cores will be used, not many. All the CPU needs to do is send minor data; a big CPU just isn't needed.

1 hour ago, TheNuzziNuzz said:

Can't do multiple systems. The goal is to have a single system for 3D graphics development purposes. This machine is used by 3D artists to create artwork before it gets sent off to the render farm for the long renders.

Is Quadro not a viable alternative?

It's entirely possible that I misinterpreted/misread your topic and/or question. This happens more often than I care to admit. Apologies in advance.

 

珠江 (Pearl River): CPU: AMD Ryzen 7 3700X; Motherboard: ASUS TUF Gaming X570-Plus (WIFI); RAM: G.Skill TridentZ RGB 32GB (2x16GB) DDR4 @3200MHz CL16; Cooling Solution: NZXT Kraken Z53 240mm AIO, w/ 2x Lian Li ST120 RGB Fans; GPU: Nvidia GeForce RTX 3070 Founders Edition; Storage: Samsung 970 EVO, 1TB; PSU: Corsair RM850x; Case: Lian Li Lancool II Mesh RGB, Black; Display(s): Primary: ASUS ROG Swift PG279QM (1440p 27" 240 Hz); Secondary: Acer Predator XB1 XB241H bmipr (1080p 24" 144 Hz, 165 Hz OC); Case Fans: 1x Lian Li ST120 RGB Fan; Capture Card: Elgato HD60 Pro

 

翻生 (Resurrection): CPU: 2x Intel Xeon E5-2620 v2 (planning to upgrade to 2x Intel Xeon E5-2680 v2); Motherboard: ASUS Z9PR-D12; RAM: Crucial 28GB (7x4GB) DDR3 ECC RAM; Cooling Solution: 2x Cooler Master Hyper 212 EVO; GPU: EVGA RTX 3060 XC Gaming; Storage: Crucial MX500, 500GB; PSU: Super Flower Leadex III 750W; Case: Phanteks Enthoo Pro; Expansion Card: TP-Link Archer T4E AC1200 PCIe Wi-Fi Adapter; Display(s): Dell P2214HB (1080p 22" 60 Hz)

4 hours ago, jaslion said:

Almost all 3D viewport software shows no difference beyond margin of error between an x8 and an x16 link on 3090s.

Yes, 100%, x8 4.0 is plenty. However, the alternative on other motherboards that support 7 GPUs is x4 PCIe 3.0, which would be an unnecessary bottleneck when I can buy the board linked above.
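For reference, the raw link-bandwidth gap between those two options, using approximate effective per-lane rates (PCIe 3.0 ≈ 0.985 GB/s and PCIe 4.0 ≈ 1.969 GB/s per direction; real-world throughput varies with protocol overhead):

```python
# Approximate per-direction PCIe throughput comparison.
# Per-lane rates are effective GB/s after 128b/130b encoding.
GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}

x8_gen4 = 8 * GBPS_PER_LANE["4.0"]   # the linked board's slots
x4_gen3 = 4 * GBPS_PER_LANE["3.0"]   # the alternative boards
print(f"x8 PCIe 4.0 ≈ {x8_gen4:.1f} GB/s vs x4 PCIe 3.0 ≈ {x4_gen3:.1f} GB/s")
# → x8 PCIe 4.0 ≈ 15.8 GB/s vs x4 PCIe 3.0 ≈ 3.9 GB/s
```

So the gap is roughly 4x in link bandwidth, even if viewport workloads rarely saturate either.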

Computers r fun

3 hours ago, CT854 said:

Is Quadro not a viable alternative?

The price difference is just so absurdly significant. 3090s perform the same, and this workload will never make use of the double-precision performance, nor the extra VRAM; 24 GB is plenty.

Computers r fun

4 hours ago, jaslion said:

Viewport performance does scale, BUT it is limited, and at a certain point more GPUs cause a performance loss, as the CPU needs to split too many instructions between the cards. I've seen this happen personally on quad-GPU rigs. It all depends on the software.

I've already run tests, and with our workflow I know for a fact that 4 GPUs scale 100%. Even if scaling diminishes beyond that, it's worth it for the final render performance.

 

This is also why I want a beefy CPU, which any TR Pro is.

Computers r fun


A server-based GPU might be the way to go, thrown in a server case: your power requirements are easily met there, and enough cards fit on one board.

Given that you are already looking at 3090 prices, those server cards are around the same price point.

CPU | Intel i7-8086K Overclocked 5.4Ghz | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM | 4x8GB Corsair Vengeance 3200MHz |MOTHERBOARD | Asus ROG Maximus X Formula | STORAGE | 2x Samsung Evo 970 256GB NVME  | COOLING | Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity  | MONITOR | Samsung G9 Neo | OS | Windows 10

3 hours ago, Maticks said:

A server-based GPU might be the way to go, thrown in a server case: your power requirements are easily met there, and enough cards fit on one board.

Given that you are already looking at 3090 prices, those server cards are around the same price point.

Server cards?

Computers r fun

16 minutes ago, TheNuzziNuzz said:

Server cards?

NVIDIA RTX A5000

NVIDIA RTX A6000

CPU | Intel i7-8086K Overclocked 5.4Ghz | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM | 4x8GB Corsair Vengeance 3200MHz |MOTHERBOARD | Asus ROG Maximus X Formula | STORAGE | 2x Samsung Evo 970 256GB NVME  | COOLING | Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity  | MONITOR | Samsung G9 Neo | OS | Windows 10

3 hours ago, Maticks said:

NVIDIA RTX A5000

NVIDIA RTX A6000

They are not cost-effective.

 

7 3090s: $10,500

7 A6000s: $42,000

 

That's literally the price of a new car I'd be paying for the Quadro cards, to gain absolutely no benefit other than a dual-slot blower-style cooler. I could design a custom case, attempt single-slot water cooling, use risers, buy 4 industrial refrigerators, and I'd still have money left over.

Computers r fun

