Allptraum1989

Would 4x RTX 2080 Ti beat a TR 3990X in Blender?

Recommended Posts

Posted · Original Poster (OP)

Hypothetical question:

What would render faster in Blender Cycles?

- Four RTX 2080 Ti cards with OptiX enabled

- One Threadripper 3990X

Some info:

When rendering with OptiX, Blender ray traces on the RTX cards' hardware RT cores instead of in pure software.

On my RTX 2070 Super, enabling this cut render times in half.

Blender can make full use of multiple GPUs.

E.g. four RTX 2080 Tis are roughly four times faster than one.

The TR 3990X has 64 cores | 128 threads @ 2.9 GHz base | 4.3 GHz boost.

Let's say we have 3600 MHz RAM.

Tile sizes would be 256×256 for the GPUs and 16×16 for the CPU, respectively.
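For reference, the two passes described above could be scripted with Blender's Python API. This is only a sketch, assuming Blender 2.8x (`tile_x`/`tile_y` and the `cycles` add-on preferences as they existed then; property names changed in later versions), and it only runs inside Blender itself:

```python
import bpy  # only available inside a running Blender instance

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
prefs = bpy.context.preferences.addons['cycles'].preferences

# Pass 1: GPU only, OptiX backend, large tiles.
prefs.compute_device_type = 'OPTIX'
prefs.get_devices()
for dev in prefs.devices:
    dev.use = (dev.type == 'OPTIX')  # enable only the RTX cards
scene.cycles.device = 'GPU'
scene.render.tile_x = scene.render.tile_y = 256
bpy.ops.render.render(write_still=True)

# Pass 2: CPU only, small tiles.
scene.cycles.device = 'CPU'
scene.render.tile_x = scene.render.tile_y = 16
bpy.ops.render.render(write_still=True)
```

Timing each `render.render()` call (e.g. with `time.perf_counter()`) would give the head-to-head numbers the question asks about.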

Posted · Original Poster (OP)
Just now, Bombastinator said:

Neither. Both would be zero. One doesn't have a CPU to power it, the other doesn't have a GPU to do anything with.

Both would be in one rig.

We render twice:

The first render is set to "GPU only", the second to "CPU only".

And of course we set the tile sizes accordingly.

5 minutes ago, Allptraum1989 said:

Both would be in one rig.

We render twice:

The first render is set to "GPU only", the second to "CPU only".

And of course we set the tile sizes accordingly.

Well then why are you asking? Do it and find out. My money is on the GPUs, though; it's too much like mining. You may have issues getting 4 PCIe 3.0 GPUs onto the AMD motherboard, though. Last I heard the maximum was 3 for X570, though there may be hacks to get around that.

Edited by Bombastinator
Added x570 qualifier

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

Posted · Original Poster (OP)
18 minutes ago, Bombastinator said:

Well then why are you asking? Do it and find out. My money is on the GPUs, though; it's too much like mining. You may have issues getting 4 PCIe 3.0 GPUs onto the AMD motherboard, though. Last I heard the maximum was 3 for X570, though there may be hacks to get around that.

Then let's say the GPUs are in a rig that supports 4x GPU.

 

Sure, I'll just spend 15k on that rig, no biggie. 

21 minutes ago, Bombastinator said:

You may have issues getting 4 PCIe 3.0 GPUs onto the AMD motherboard, though. Last I heard the maximum was 3 for X570, though there may be hacks to get around that.

You know TRX40 is not X570, right?

Also, it's not really that hard to get 4 GPUs on X570...

Also, you are looking at the technicality of doing that test, not the test itself.

I haven't seen benchmarks of it, so I don't really have an answer.

1 hour ago, GoldenLag said:

You know TRX40 is not X570, right?

Also, it's not really that hard to get 4 GPUs on X570...

Also, you are looking at the technicality of doing that test, not the test itself.

I haven't seen benchmarks of it, so I don't really have an answer.

Re: TRX40 vs X570

Yes. Games are being played with the PCIe 3.0-to-4.0 conversion, though. AMD did it with X570. I wouldn't put it past them to try it twice. Something to watch out for.

Re: also

"Not that hard" is sort of subjective. It's a hack. Whether it was an intentional attempt to sell PCIe 4.0 cards in the future or not is unknown.

Re: benchmarks

I don't think they're available at all. Ryzen 4 is still only barely out. I still have my money on the GPUs, though.

Edited by Bombastinator
Number correction

Just now, Bombastinator said:

Games are being played with the PCIe 3.0-to-4.0 conversion, though

You don't need to multiplex anything.

And changing between PCIe revisions has been a thing since 1.0 and 2.0 were a thing.

 

2 minutes ago, Bombastinator said:

Re: also

"Not that hard" is sort of subjective. It's a hack.

All you need to do is plug in some PCIe 3.0 slot adapters and you are done.

And you can get those on eBay, or from Supermicro or many others.

You have 20 lanes to work with, and you can run GPUs in x4 mode when rendering.

TR4 or TRX40 makes it super easy to get 4 GPUs, as you have 48 lanes to work with, IIRC.
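The lane budget in this post can be sanity-checked with a bit of arithmetic. The numbers below follow the post itself (~20 usable CPU lanes on mainstream AM4, each GPU dropped to x4 mode) rather than any spec, so treat them as assumptions:

```python
# Back-of-envelope PCIe lane budget, using the numbers from the post above.
CPU_LANES = 20        # usable CPU lanes claimed for mainstream AM4
LANES_PER_GPU_X4 = 4  # each GPU limited to x4 mode while rendering

def gpus_that_fit(total_lanes: int, lanes_per_gpu: int) -> int:
    """How many GPUs fit if each one only gets `lanes_per_gpu` lanes."""
    return total_lanes // lanes_per_gpu

print(gpus_that_fit(CPU_LANES, LANES_PER_GPU_X4))  # prints 5
```

So four GPUs at x4 fit with lanes to spare; rendering is mostly compute-bound once the scene is uploaded, which is why the narrow x4 links are tolerable here.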

12 minutes ago, GoldenLag said:

You don't need to multiplex anything.

And changing between PCIe revisions has been a thing since 1.0 and 2.0 were a thing.

All you need to do is plug in some PCIe 3.0 slot adapters and you are done.

And you can get those on eBay, or from Supermicro or many others.

You have 20 lanes to work with, and you can run GPUs in x4 mode when rendering.

TR4 or TRX40 makes it super easy to get 4 GPUs, as you have 48 lanes to work with, IIRC.

Re: multiplex and changing

Yes. And every time there are problems.

Re: all you need...

So yes, a hack. One where you buy additional parts, even.

Lanes:

Of course there are enough lanes. There are enough lanes on the X570 too.


Posted · Original Poster (OP)
37 minutes ago, Bombastinator said:

Re: TRX40 vs X570

Yes. Games are being played with the PCIe 3.0-to-4.0 conversion, though. AMD did it with X570. I wouldn't put it past them to try it twice. Something to watch out for.

Re: also

"Not that hard" is sort of subjective. It's a hack. Whether it was an intentional attempt to sell PCIe 4.0 cards in the future or not is unknown.

Re: benchmarks

I don't think they're available at all. Ryzen 3 is still only barely out. I still have my money on the GPUs, though.

Ryzen 3 is barely out?! It's almost outdated! AMD already announced the 4800U.

1 hour ago, Allptraum1989 said:

Hypothetical question:

What would render faster in Blender Cycles?

- Four RTX 2080 Ti cards with OptiX enabled

- One Threadripper 3990X

Some info:

When rendering with OptiX, Blender ray traces on the RTX cards' hardware RT cores instead of in pure software.

On my RTX 2070 Super, enabling this cut render times in half.

Blender can make full use of multiple GPUs.

E.g. four RTX 2080 Tis are roughly four times faster than one.

The TR 3990X has 64 cores | 128 threads @ 2.9 GHz base | 4.3 GHz boost.

Let's say we have 3600 MHz RAM.

Tile sizes would be 256×256 for the GPUs and 16×16 for the CPU, respectively.

Assuming you had at least something so the parts would work, the GPUs would probably outstrip the CPU (you're basically comparing dedicated hardware to general-purpose hardware). However, it would probably not be a good investment.

For example, V-Ray can make use of OpenCL on CPUs and GPUs, and V-Ray will soon have access to the RT cores on RTX boards. https://www.chaosgroup.com/blog/v-ray-gpu-adds-support-for-nvidia-rtx

That is still a little way off, but the point is that the ideal option is "all cores, on all devices".
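The "all cores, on all devices" point can be illustrated with a toy throughput model: parallel devices simply add their sample rates, so CPU + GPUs together always finishes before either group alone. The rates below are made up purely for illustration, not benchmark figures:

```python
def render_time(total_samples: float, device_rates: list) -> float:
    """Finish time when devices render tiles in parallel: throughputs add."""
    return total_samples / sum(device_rates)

# Hypothetical rates in samples/second, for illustration only.
GPU_RATE = 100.0  # one made-up 2080 Ti with OptiX
CPU_RATE = 120.0  # one made-up 3990X

t_gpus_only = render_time(48_000, [GPU_RATE] * 4)                 # 120.0 s
t_all_devices = render_time(48_000, [GPU_RATE] * 4 + [CPU_RATE])  # ~92.3 s
print(t_gpus_only, t_all_devices)
```

Whatever the real per-device numbers turn out to be, the combined run is bounded above by the faster of the two pools, which is why hybrid rendering is the ideal once engines support it.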

27 minutes ago, Allptraum1989 said:

Ryzen 3 is barely out?! It's almost outdated! AMD already announced the 4800U.

Wording error. Worth catching. Fixed now. I hate autocorrect. Ryzen 4, then. Definition of terms: Ryzen 3 is Zen 2. The Zen+ thing made everything messed up.

As far as meaning goes: you know what I'm talking about, I know what I'm talking about.


Posted · Original Poster (OP)
2 hours ago, Bombastinator said:

Wording error. Worth catching. Fixed now. I hate autocorrect. Ryzen 4, then. Definition of terms: Ryzen 3 is Zen 2. The Zen+ thing made everything messed up.

As far as meaning goes: you know what I'm talking about, I know what I'm talking about.

IKR, what lunatic decided to call it Zen+?

Posted · Original Poster (OP)
2 hours ago, Flying Sausages said:

Should have gotten 4x RTX Titan to flex more.

Might actually be a bit faster. I could imagine they are binned.

3 hours ago, Bombastinator said:

So yes, a hack. One where you buy additional parts, even.

Some boards you don't.

There is no hack involved...

The parts are for splitting slots if need be. And if an M.2 expansion card is a hack, then I'm not sure you understand the purpose of the flexibility PCIe offers.

Edit: also, PCIe has a good record of negotiating to the PCIe speed of the device.

Also, multiplexing and adapting between PCIe versions are two different things.

43 minutes ago, Allptraum1989 said:

Might actually be a bit faster. I could imagine they are binned.

TITAN cards have a few Quadro features and some small driver-level changes that make them more compelling than the 2080 Ti for workstation workloads... it's not just the few extra CUDA cores and double the memory that set them apart.


1 hour ago, GoldenLag said:

Some boards you don't.

There is no hack involved...

The parts are for splitting slots if need be. And if an M.2 expansion card is a hack, then I'm not sure you understand the purpose of the flexibility PCIe offers.

Edit: also, PCIe has a good record of negotiating to the PCIe speed of the device.

Also, multiplexing and adapting between PCIe versions are two different things.

Goes to the definition of "hack". It's a vague word.


15 minutes ago, Bombastinator said:

Goes to the definition of "hack". It's a vague word.

Off the top of my head I don't know the real definition of "hack", but I wouldn't be surprised if it technically is one by the literal definition.

But I really wouldn't consider an adapter a hack any more than I would consider a fan splitter a hack.


Of course a rig with 4x modern GPUs will beat a single modern processor in graphics rendering tasks. That's just common sense...

Regardless of any implementation details, 4x high-end GPUs is quite a bit of rendering horsepower, and quite expensive as well. Are you sure you actually need, or can make use of, such a machine?



I just want to add this piece of stupidity because the title is playing on my mind:

[attached image]



I want to revive this thread because there's no clear answer yet: would four 2080 Tis beat a Threadripper 3990X for rendering, let's say, the classroom scene in Blender using Cycles?


I don't think it's answered because no one really cares.

@BigPig


7 minutes ago, BigPig said:

I want to revive this thread because there's no clear answer yet: would four 2080 Tis beat a Threadripper 3990X for rendering, let's say, the classroom scene in Blender using Cycles?

This question reminds me of one that was bandied about in my high school long ago:


“If you’re driving down the road and your station wagon loses a tire how many pancakes does it take to cover up a dog house?”


9 minutes ago, Bombastinator said:

This question reminds me of one that was bandied about in my high school long ago:


“If you’re driving down the road and your station wagon loses a tire how many pancakes does it take to cover up a dog house?”

Dude! I don't know if you realize it, but you're only polluting this thread with your egocentric, useless replies. With all due respect, if you don't have a valid answer, let's leave it for somebody else.

Maybe someone in this forum who already owns a TR 3990X and 4x 2080 Ti build can run a render of an available Blender scene and provide us with the render time.

Some people, including myself, have already tried GPU rendering and aren't pleased with it, so we need to know a couple of things before investing $4,000 in a CPU.

