twixx

CUDA Rendering Workstation - Some help needed


Posted · Original Poster (OP)

Hi Everyone!

 

I do 3D work; the business recently picked up, so I need to upgrade from my personal rig because the output isn't fast enough.

 

I do CUDA rendering.

 

Current build is:

 

  • Motherboard: Asrock B85M Pro3 - a budget 1150 board
  • CPU: Core i5-4460 Haswell Refresh running at 3.2 GHz, turbo to 3.4 GHz - nothing fancy here
  • GPU: Gigabyte G1 Gaming GeForce GTX 1080
  • RAM: 4x4 GB of DDR3 HyperX Fury 1600MHz
  • PSU: Corsair CX 600 BUILDER Series
  • Case: NZXT H440 - a fairly large case, but tuned for low noise rather than airflow, with 3x120mm front fans and 1x140mm exhaust at the back.
  • CPU cooler: be quiet! Shadow Rock Slim

Everything runs on air cooling.

 

Plans and goals for workstation build:

 

So I have 3-4 GPUs planned, starting with my existing 1080 and adding a 1080 Ti as soon as it's available. Later I'll upgrade everything to 1080 Tis.

With that said, I need a processor capable of handling 3-4 (more likely 4) GPUs. As far as I know I have two options: go with Broadwell-E (6850K and up), or get a Xeon, although I'm not familiar with the Xeon family at all. I need some advice about this.

CPU-heavy work is some video editing and encoding, but video work is rare, so it's not a first priority. There's also the occasional gaming at 4K, but that isn't so important either.

If it can be avoided, I wouldn't dish out a shit ton of money for a 6900K. I waited patiently for Ryzen, but it doesn't have enough PCI-E lanes for the GPUs. Sad.
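For reference, here's the lane math I'm working from (lane counts are from Intel ARK; the Ryzen figure is approximate, since usable slot lanes vary by board):

```python
# Lane budget check: can the CPU feed N GPUs at x8 each without a PLX switch?
# Lane counts per Intel ARK; Ryzen figure is approximate (board-dependent).
CPU_PCIE_LANES = {
    "i7-6800K": 28,
    "i7-6850K": 40,
    "Ryzen 7 1800X": 24,
}

def max_gpus_at_x8(lanes):
    """Each GPU gets at least x8 electrical from CPU lanes alone."""
    return lanes // 8

for cpu, lanes in CPU_PCIE_LANES.items():
    print(f"{cpu}: {lanes} lanes -> up to {max_gpus_at_x8(lanes)} GPUs at x8")
```

Boards with a PLX switch can present x16 to every slot from fewer CPU lanes (some X99 workstation boards do this), but I'm treating that as a board question, not a CPU question.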

 

The other thing is cooling. I am really unsure what to do there. I probably need a new case which is airflow optimised.

I plan to overclock the GPUs, not by much, but even 5% performance per GPU is a huge bonus when you have 3-4.

What I managed to gather from various sources is that putting 3-4 GPUs together with the manufacturers' regular open-air coolers is bad, because the cards blow hot air on each other, so a blower-style cooler works better here.

The other option is to buy whichever card is cheap but performs well and build a custom loop, which I have never done before, so I am a bit scared. Not to mention it would increase the costs. I think the most flexible option would be to buy cheaper blower-style cards, which can later be fitted with waterblocks?

 

Miscellaneous info:

Also, some general tips on other components would be welcome, and on what can be recycled from my current build. PSU capacity tips are welcome as well; I guess I need 1200W+.

PC would often run 10-16 hours a day.
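On the PSU guess, my rough math (TDPs are Nvidia/Intel spec values; the overhead and headroom numbers are just rules of thumb):

```python
# Back-of-envelope PSU sizing for the planned 4x 1080 Ti build.
GPU_BOARD_POWER_W = 250   # GTX 1080 Ti spec board power
NUM_GPUS = 4
CPU_TDP_W = 140           # i7-6850K TDP
REST_OF_SYSTEM_W = 100    # board, RAM, drives, fans (rough allowance)
OC_FACTOR = 1.10          # mild GPU overclock
HEADROOM = 1.2            # keep the PSU comfortably below full load

draw_w = NUM_GPUS * GPU_BOARD_POWER_W * OC_FACTOR + CPU_TDP_W + REST_OF_SYSTEM_W
print(f"estimated draw ~{draw_w:.0f} W, PSU target ~{draw_w * HEADROOM:.0f} W")
```

So my 1200W guess is probably a bit low; this points more toward a 1500W-class unit.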

 


What software do you use for 3D work? Looking at the software and its recommended specs usually helps define what components will be useful for a good build. And seriously, do you actually need 1080/Tis in a quad-SLI setup?


That is not dead which can eternal lie.  And with strange aeons even death may die. - The Call of Cthulhu

A university is not a "safe space". If you need a safe space, leave, go home, hug your teddy & suck your thumb until ready for university.  - Richard Dawkins

Posted · Original Poster (OP)
16 minutes ago, AkiraDaarkst said:

What software do you use for 3D work? Looking at the software and its recommended specs usually helps define what components will be useful for a good build. And seriously, do you actually need 1080/Tis in a quad-SLI setup?

Thanks for the comment!

Mainly Octane Render in Cinema4D and Maya.

 

I don't need SLI, GPU renderers use every GPU in the system/network separately.

Actually, with four 1080 Tis I'd still be behind a lot of other professionals. Render networks are not uncommon even in small offices that work in 3D; I've seen multiple 7x TITAN X setups.

EDIT: I more or less know what I need, I just don't know how to approach it. If some people who have experience building a workstation like this could give advice, that would be much appreciated.
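To illustrate why SLI doesn't matter here: the renderer hands each card its own buckets, so throughput just adds up. A quick sketch with made-up relative speeds (the 1080 Ti number is an assumption, not a benchmark):

```python
# GPU renderers treat each card as an independent worker, so aggregate
# throughput is roughly additive (unlike SLI frame-rate scaling).
# Speeds are in arbitrary units; the 1080 Ti figure is assumed.
SPEED = {"GTX 1080": 10, "GTX 1080 Ti": 14}

def box_throughput(cards):
    """Total render speed of a box: sum of its cards' speeds."""
    return sum(SPEED[c] for c in cards)

print(box_throughput(["GTX 1080"]))                  # current rig: 10
print(box_throughput(["GTX 1080", "GTX 1080 Ti"]))   # first upgrade: 24
print(box_throughput(["GTX 1080 Ti"] * 4))           # planned end state: 56
```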

2 minutes ago, twixx said:

Thanks for the comment!

Mainly Octane Render in Cinema4D and Maya.

 

I don't need SLI, GPU renderers use every GPU in the system/network separately.

Actually, with four 1080 Tis I'd still be behind a lot of other professionals. Render networks are not uncommon even in small offices that work in 3D; I've seen multiple 7x TITAN X setups.

 

OK then, perhaps you can use PCPartPicker to make a list of what you think might be a good build, and people here can provide suggestions. You know what kind of work you'll be doing, the performance you want, and the budget you have.

 

In terms of cooling and airflow, you certainly want a case with sufficient room inside to help maximize airflow. The biggest case I can think of is something like a Cooler Master Cosmos II. Very big and heavy, but remove all the unnecessary HDD/SSD trays and you have a case that can maximize airflow. It will probably be overkill since you're starting out with only 2 GPUs, though. It's up to you whether to get a case large enough to fit everything eventually or to take gradual steps.

 

(Other cases that will fit 4 GPUs like a 1080 would be something like a Corsair 780T or a Fractal Design Define XL; I don't know every case that is available on the market.)

 

3-4 GPUs with their stock coolers might not be ideal; I think you should (eventually) go for water cooling blocks here. I'd be careful with using a 6850K CPU with 4 GPUs, as it only has 28 PCIe lanes. If you're careful, buying something like a 1080 with a standard stock air cooler and then adding a waterblock later is easy.

 

From your current build, if you decide to get a 6850K you'll need a new motherboard, new memory, and definitely a larger-wattage PSU for 4 GPUs.


Posted · Original Poster (OP)

I'll check out those cases, big and heavy is just what I need :D thank you! I think it's better to buy a case that can support the final build, to save some work disassembling and reassembling later.

 

On Intel ARK it says the 6850K has 40 lanes and the 6800K has 28; did I miss something?

 

So you suggest going liquid cooling once I have all the GPUs in, I see.

 

Thanks for the advice!

 

PS.: I forgot to give a budget, it is about 2.5-3K USD for now.

2 hours ago, twixx said:

On Intel ARK it says the 6850K has 40 lanes and the 6800K has 28; did I miss something?

It was my mistake; I didn't verify the info on ARK and mixed up the 6800K and 6850K specs.

 

And yes, I think water cooling might be your best option with all those GPUs stacked together. A GPU like the 1080 Ti, or even the non-Ti, is two expansion slots thick. Stack 4 of them and they're sandwiched close together; a custom water block is thinner and leaves wider gaps between each card, which will help with airflow.



The motherboard is as important as the choice of CPU.

 

As you noted, putting four GPUs side by side naturally restricts airflow. You might give some thought to using hybrid GPUs. While they have a higher upfront cost, they eliminate the need for a custom loop while offering much of the benefit. Presumably the idea is to add GPUs as demand increases and budget becomes available, so the added cost gets spread out over time.

 

At the moment an i7-6850K looks to be the optimal choice. But if time permits, it may be worth waiting a few months to see whether Skylake-X offers something better.

 

You may be able to keep the H440. It should have enough locations for mounting two or three 120mm GPU radiators along with a 240mm CPU cooler radiator. If you get a motherboard like the Asus X99-E WS/USB 3.1, you may have to measure the interior to see if the motherboard area is wide enough to take the extra inch of the CEB form factor. (CEB uses ATX mounting points.)

 

You will likely need a 1500W+ PSU. Pricey. EVGA offers a number of excellent units in that range: the 1600W P2 and 1600W T2. Corsair offers the excellent AX1500i. If the system will be working as hard as suggested, an 80+ Titanium unit might be worth the extra cost.


80+ ratings certify electrical efficiency. Not quality.

 


If you are planning to go that route, pick a Xeon; reliability is what they are known for, and the same goes for new RAM. And to make it a monster machine, a dual-CPU motherboard will have you set for a while!!!

Posted · Original Poster (OP)

Thanks brob for the detailed response!

 

I'd gladly purchase hybrid GPUs, but I can't afford to wait until board partners release them; if I'm not mistaken, they usually arrive after the air-cooled versions. That makes waiting for Skylake-X an invalid option too, sadly. I have to put the rig together in 3 weeks tops, or I won't be able to meet some deadlines.

I could still make the 3rd and 4th cards hybrids, but I'm not sure it's worth it at that point.

The Asus X99 workstation-class board was my choice as well. Thanks for the tip on measuring; I have no idea if it fits right now.

 

Thanks jpms24, a Xeon is definitely interesting; I could save quite a few bucks there, with cheaper chips having 40 PCIe lanes.

 

Well it's not an easy situation, but you guys gave me some new angles to approach and think about. Thanks!


@twixx, too bad about the timing. It would still be worth adding hybrids for 3 & 4.

 

Not sure if it is worth the effort, but I'll mention the possibility: Swiftech has a 140mm radiator with an integrated pump and reservoir that can be coupled with a Titan X waterblock (compatible with the GTX 1080 Ti Founders Edition).

 


Posted · Original Poster (OP)
On 2017. 03. 16. at 9:17 PM, Jetfighter808 said:

Check out @smicha 14x 1080's render station.

 

https://www.youtube.com/channel/UCxzzySn5rLO7S6ui4D_wCwQ?&ab_channel=smicha7

 

Seems pretty similar to what you want.

 

 

Yeah, I've seen that, that is some next level stuff. I drooled all over that video haha.

 

 

On 2017. 03. 16. at 10:15 PM, brob said:

@twixx, too bad about the timing. It would still be worth adding hybrids for 3 & 4.

 

Not sure if it is worth the effort, but I'll mention the possibility: Swiftech has a 140mm radiator with an integrated pump and reservoir that can be coupled with a Titan X waterblock (compatible with the GTX 1080 Ti Founders Edition).

 

I've checked out Gamers Nexus' test with a custom 'hybrid' 1080 Ti; they report average performance gains of 5% over the FE. Article here.

Now, after having a few days to think it all over, the best plan would be to get the reference cards first, then eventually build a custom loop when more money is available (possibly with some help from experienced builders).

 

However, I've been checking various motherboards, and I've seen that the ASRock X99 WS board is about 30% cheaper than the ASUS X99 WS-E (at least where I live), and I was wondering what the reason for the difference is. Is ASUS just charging a premium for their brand? As far as I'm aware, ASRock is a solid manufacturer.

So I'm asking because if I don't have to get the ASUS board, I'd rather save 30% on it.

7 hours ago, twixx said:

Yeah, I've seen that, that is some next level stuff. I drooled all over that video haha.

 

 

I've checked out Gamers Nexus' test with a custom 'hybrid' 1080 Ti; they report average performance gains of 5% over the FE. Article here.

Now, after having a few days to think it all over, the best plan would be to get the reference cards first, then eventually build a custom loop when more money is available (possibly with some help from experienced builders).

 

However, I've been checking various motherboards, and I've seen that the ASRock X99 WS board is about 30% cheaper than the ASUS X99 WS-E (at least where I live), and I was wondering what the reason for the difference is. Is ASUS just charging a premium for their brand? As far as I'm aware, ASRock is a solid manufacturer.

So I'm asking because if I don't have to get the ASUS board, I'd rather save 30% on it.

I'm a little surprised at the custom hybrid OC results. It will be interesting to see if the commercial hybrids do any better. A custom loop will certainly help keep the noise level down.

 

Undoubtedly there is a significant premium on Asus motherboards. However, the ASRock board is a little more limited than the Asus: it has one less PCIe expansion slot, one of its slots is disabled when an M.2 drive is installed, and another slot runs at only four PCIe 2.0 lanes. It is also short a SATA port or two. The Asus motherboard includes some nifty features, like USB BIOS Flashback, that, while not necessary, do make life easier.

 

ASRock started life as a budget brand of Asus, but it has since transformed itself into a premium motherboard manufacturer.

 

 



First off, what is your workload? Do you have a professional business (professional degree or license needed)?

On 3/16/2017 at 5:06 AM, twixx said:

So I have 3-4 GPUs planned

The 1080 and 1080 Ti do not support >2-way SLI. I would suggest looking into 980 Ti SLI, as the 980 Ti has more CUDA cores than the 1080 and beats it in many CUDA-accelerated workloads. Also, if you are considering dumping that much money into GPUs, you should consider one or more Quadro or Tesla accelerators instead.

On 3/16/2017 at 5:06 AM, twixx said:

PC would often run 10-16 hours a day.

If you are running your PC for that long, ECC memory will make a world of difference.

On 3/16/2017 at 5:06 AM, twixx said:

even 5% performance per GPU is a huge bonus when you have 3-4

Not really, as SLI (even for CUDA) does not scale in a linear fashion.

 

Suggested build based on post: https://pcpartpicker.com/list/QqTg4C

1 hour ago, brob said:

SLI is used in gaming.

I do know this; I wrote a technical report on CUDA acceleration and its applications in computer-vision guidance and mechanical systems. SLI is just easier to say than "multi-GPU configuration".

 

The point still stands: if OctaneRender scales linearly with CUDA core count, the 980 Ti will still win out over the 1080 (it has 256 more CUDA cores) and costs a little more than half the price of a 1080 Ti (~$420 new, compared to $700-800).
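Putting those numbers side by side (core counts are Nvidia spec values; the 1080 price here is my own assumed street price, since it isn't quoted above):

```python
# CUDA cores per dollar, using the figures from the thread.
# Format: card -> (CUDA cores, price in USD).
CARDS = {
    "GTX 980 Ti": (2816, 420),
    "GTX 1080": (2560, 550),    # assumed street price
    "GTX 1080 Ti": (3584, 700),
}

for name, (cores, price) in CARDS.items():
    print(f"{name}: {cores / price:.2f} CUDA cores per dollar")
```

This assumes per-core throughput is comparable across Maxwell and Pascal, which is the same linear-scaling assumption as above.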

 

980ti build (1262 more CUDA cores):

https://pcpartpicker.com/list/h39HsJ

 

1080ti build:

https://pcpartpicker.com/list/cmvGNN

 

(Parts other than the GPUs are arbitrary and can, and likely should, be changed, but I need to go to sleep.)


@Qwweb, the OP indicates one GTX 1080 has already been purchased. Do you have any experience with mixed-GPU environments, i.e. a GTX 1080 plus a GTX 1080 Ti?


