Would 4x RTX 2080ti beat a TR 3990x in blender?

Allptraum1989

Hypothetical question:

What would render faster in blender cycles?

 

- Four RTX 2080 Ti with OptiX enabled

- One Threadripper 3990X

 

Some info:

When rendering with OptiX, Blender ray traces on the RTX cards' hardware RT cores instead of doing it purely in software.

On my RTX 2070 Super, enabling this cut render times in half.

Blender can make full use of multiple GPUs.

E.g., four RTX 2080 Tis are four times faster than one RTX 2080 Ti.

 

The TR 3990X has 64 cores | 128 threads @ 2.9 GHz base | 4.3 GHz boost.

Let's say we have 3600 MHz RAM.

Tile sizes would be 256x256 for the GPUs and 16x16 for the CPU, respectively.
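For a rough sense of scale, here is a back-of-envelope sketch using only the claims above (OptiX roughly halving a CUDA render time, and near-linear multi-GPU scaling). The 120-second baseline is an arbitrary placeholder, not a benchmark:

```python
# Back-of-envelope estimate using only the numbers claimed in this thread:
# - OptiX roughly halves a CUDA render time (the 2070 Super observation)
# - Cycles scales near-linearly across identical GPUs
def estimated_render_time(cuda_time_one_gpu, num_gpus, optix=True):
    """Estimated wall-clock render time for a multi-GPU Cycles render."""
    t = cuda_time_one_gpu / num_gpus  # assume linear multi-GPU scaling
    if optix:
        t /= 2                        # assume the ~2x OptiX speedup holds
    return t

# If one 2080 Ti takes 120 s in CUDA, four of them with OptiX would land
# around 15 s under these (optimistic) assumptions.
print(estimated_render_time(120, 4))  # 15.0
```

Whether the 3990X can beat that is exactly the open question; these assumptions say nothing about the CPU side.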


Neither. Both would be zero. One doesn't have a CPU to power it, the other doesn't have a GPU to do anything with.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Just now, Bombastinator said:

Neither. Both would be zero. One doesn't have a CPU to power it, the other doesn't have a GPU to do anything with.

Both would be in one rig. 

We render two times:

First render is set to "GPU only", the second to "CPU only".

 

And of course we set the tile sizes
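For reference, the two passes described above can be scripted from Blender's Python API. This is a minimal sketch for Blender 2.8x of that era (the `tile_x`/`tile_y` properties were later removed in 3.0), assuming an OptiX-capable driver is installed; it only runs inside Blender's bundled interpreter:

```python
import bpy  # only available inside Blender

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Pass 1: GPU only, rendering through OptiX (Blender 2.81+)
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'OPTIX'
scene.cycles.device = 'GPU'
scene.render.tile_x = 256  # large tiles suit GPUs
scene.render.tile_y = 256

# Pass 2: CPU only (swap these in and render again)
# scene.cycles.device = 'CPU'
# scene.render.tile_x = 16  # small tiles suit CPUs
# scene.render.tile_y = 16

bpy.ops.render.render(write_still=True)
```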


5 minutes ago, Allptraum1989 said:

Both would be in one rig. 

We render two times

First render is set to "gpu only", second one to "cpu only".

 

And of course we set the tile sizes

Well then why are you asking? Do it and find out. My money is on the GPUs though. It's too much like mining. You may have issues getting four PCIe 3.0 GPUs onto the AMD motherboard though. Last I heard the maximum was 3 for X570, though there may be hacks to get around that.

Edited by Bombastinator
Added X570 qualifier


18 minutes ago, Bombastinator said:

Well then why are you asking? Do it and find out. My money is on the GPUs though. It's too much like mining. You may have issues getting four PCIe 3.0 GPUs onto the AMD motherboard though. Last I heard the maximum was 3 for X570, though there may be hacks to get around that.

Then let's say the GPUs are in a rig that supports 4x GPU.

 

Sure, I'll just spend 15k on that rig, no biggie. 


21 minutes ago, Bombastinator said:

You may have issues getting four PCIe 3.0 GPUs onto the AMD motherboard though. Last I heard the maximum was 3 for X570, though there may be hacks to get around that.

You know TRX40 is not X570, right?

Also, it's not really that hard to get 4 GPUs on X570...

Also, you are looking at the technicality of doing that test, not the test itself.

I haven't seen benchmarks of it, as such I don't really have an answer.


1 hour ago, GoldenLag said:

You know TRX40 is not X570, right?

Also, it's not really that hard to get 4 GPUs on X570...

Also, you are looking at the technicality of doing that test, not the test itself.

I haven't seen benchmarks of it, as such I don't really have an answer.

Re: TRX40 vs X570

Yes. Games are being played with the PCIe 3.0-to-4.0 conversion though. AMD did it with X570. I wouldn't put it past them to try it twice. Should be watched out for.

Re: also

"Not that hard" is sort of subjective. It's a hack. Whether it was an intentional attempt to sell PCIe 4.0 cards in the future or not is unknown.

Re: benchmarks

I don't think they're available at all. Ryzen 4 is still only barely out. I still have my money on the GPUs though.

Edited by Bombastinator
Number correction


Just now, Bombastinator said:

Games are being played with the pcie 3.0-4.0 conversion though

You don't need to multiplex anything.

And changing between PCIe revisions has been a thing since 1.0 and 2.0 were around.

2 minutes ago, Bombastinator said:

Re: also

"Not that hard" is sort of subjective. It's a hack

All you need to do is plug in some PCIe 3.0 slot adapters and you are done.

And you can get those on eBay, or from Supermicro, or many others.

You have 20 lanes to work with, and you can run the GPUs in x4 mode when rendering.

TR4 or TRX40 makes it super easy to get 4 GPUs, as you have 48 lanes to work with, IIRC.
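To make the lane math explicit, here is a tiny sketch that checks whether a GPU count fits a lane budget. The lane counts are the ones quoted in this thread and vary by platform and board, so treat them as assumptions:

```python
def fits(total_lanes, num_gpus, lanes_per_gpu):
    """True if the GPUs fit within the CPU's usable PCIe lane budget."""
    return num_gpus * lanes_per_gpu <= total_lanes

# Mainstream X570 with ~20 usable CPU lanes: four GPUs fit at x4, not at x8.
print(fits(20, 4, 4))  # True
print(fits(20, 4, 8))  # False
```

Since Cycles mostly uploads the scene once and renders, x4 links cost far less than they would in gaming, which is why the x4 arrangement is workable at all.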


12 minutes ago, GoldenLag said:

You don't need to multiplex anything.

And changing between PCIe revisions has been a thing since 1.0 and 2.0 were around.

All you need to do is plug in some PCIe 3.0 slot adapters and you are done.

And you can get those on eBay, or from Supermicro, or many others.

You have 20 lanes to work with, and you can run the GPUs in x4 mode when rendering.

TR4 or TRX40 makes it super easy to get 4 GPUs, as you have 48 lanes to work with, IIRC.

Re: multiplex and changing

Yes. And every time, there are problems.

Re: all you need...

So yes. A hack. One where you buy additional parts, even.

Re: lanes

Of course there are enough lanes. There are enough lanes on the X570 too.


37 minutes ago, Bombastinator said:

Re: TRX40 vs X570

Yes. Games are being played with the PCIe 3.0-to-4.0 conversion though. AMD did it with X570. I wouldn't put it past them to try it twice. Should be watched out for.

Re: also

"Not that hard" is sort of subjective. It's a hack. Whether it was an intentional attempt to sell PCIe 4.0 cards in the future or not is unknown.

Re: benchmarks

I don't think they're available at all. Ryzen 3 is still only barely out. I still have my money on the GPUs though.

Ryzen 3 is barely out?! It's almost outdated! AMD already announced the 4800U.


1 hour ago, Allptraum1989 said:

Hypothetical question:

What would render faster in blender cycles?

 

- Four RTX 2080 Ti with OptiX enabled

- One Threadripper 3990X

 

Some info:

When rendering with OptiX, Blender ray traces on the RTX cards' hardware RT cores instead of doing it purely in software.

On my RTX 2070 Super, enabling this cut render times in half.

Blender can make full use of multiple GPUs.

E.g., four RTX 2080 Tis are four times faster than one RTX 2080 Ti.

 

The TR 3990X has 64 cores | 128 threads @ 2.9 GHz base | 4.3 GHz boost.

Let's say we have 3600 MHz RAM.

Tile sizes would be 256x256 for the GPUs and 16x16 for the CPU, respectively.

Assuming you had at least something so the parts would work, the GPUs would probably outstrip the CPU (you're basically comparing dedicated hardware to general-purpose hardware); however, it would probably not be a good investment.

For example, V-Ray can make use of OpenCL on CPUs and GPUs, and V-Ray will soon have access to the RT cores on RTX boards: https://www.chaosgroup.com/blog/v-ray-gpu-adds-support-for-nvidia-rtx

That is still a little ways off, but the point is that the ideal option is "all cores, on all devices".


27 minutes ago, Allptraum1989 said:

Ryzen 3 is barely out?! Almost outdated! Amd already announced the 4800U

Wording error. Worth catching. Fixed now. I hate autocorrect. Ryzen 4, then. Definition of terms: Ryzen 3 is Zen 2. The Zen+ thing made everything messed up.

As far as meaning goes: you know what I'm talking about, I know what I'm talking about.


2 hours ago, Bombastinator said:

Wording error. Worth catching. Fixed now. I hate autocorrect. Ryzen 4, then. Definition of terms: Ryzen 3 is Zen 2. The Zen+ thing made everything messed up.

As far as meaning goes: you know what I'm talking about, I know what I'm talking about.

Ikr, what lunatic decided to call it Zen+?


2 hours ago, Flying Sausages said:

Should have gotten 4x RTX Titans to flex more.

Might actually be a bit faster. I could imagine they are binned


3 hours ago, Bombastinator said:

so yes.  A hack. One where you buy additional parts even.

Some boards you don't.

There is no hack involved...

The parts are for splitting slots if need be. And if an M.2 expansion card is a hack, then I'm not sure you understand the purpose of the flexibility PCIe offers.

Edit: also, PCIe has a good record of negotiating down to the PCIe speed of the device.

Also, multiplexing and adapting between PCIe versions are two different things.


43 minutes ago, Allptraum1989 said:

Might actually be a bit faster. I could imagine they are binned

TITAN cards have a few Quadro features in them and some small changes at the driver level that make them more compelling than the 2080 Ti for workstation workloads... it's not just the few extra CUDA cores and double the memory that set them apart.


1 hour ago, GoldenLag said:

Some boards you don't.

There is no hack involved...

The parts are for splitting slots if need be. And if an M.2 expansion card is a hack, then I'm not sure you understand the purpose of the flexibility PCIe offers.

Edit: also, PCIe has a good record of negotiating down to the PCIe speed of the device.

Also, multiplexing and adapting between PCIe versions are two different things.

Goes to the definition of "hack". It's a vague word.


15 minutes ago, Bombastinator said:

Goes to definition of hack.  It’s a vague word.

Off the top of my head I don't know the real definition of "hack", but I wouldn't be surprised if it is technically a hack by the literal definition.

But I really wouldn't consider an adapter a hack any more than I would consider a fan splitter a hack.


Of course a rig with 4x modern GPUs will beat a single modern processor in graphics rendering tasks. That's just common sense...

Regardless of any implementation details, 4x high-end GPUs is quite a bit of rendering horsepower, and quite expensive as well. Are you sure you actually need, or can make use of, such a machine?

ENCRYPTION IS NOT A CRIME


I just want to add this piece of stupidity, because the title is playing on my mind:

[attached image]


  • 2 months later...

I want to revive this thread because there's no clear answer yet: would 4x 2080 Ti beat a Threadripper 3990X for rendering (let's say) the Classroom scene in Blender using Cycles?
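For anyone set up to produce that number, here is a minimal sketch of the comparison from the command line, assuming `blender` (2.81+) is on the PATH and the official Classroom demo scene has been saved locally as `classroom.blend` (that filename is an assumption):

```shell
# Render one frame on the CPU, then on the GPUs via OptiX, and compare times.
time blender -b classroom.blend -E CYCLES -f 1 -- --cycles-device CPU
time blender -b classroom.blend -E CYCLES -f 1 -- --cycles-device OPTIX
```

The `--cycles-device OPTIX` pass needs an RTX card and a recent NVIDIA driver; per-pass tile sizes would still have to be set in the .blend file or via a small Python expression.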


I don't think it's answered because no one really cares.

@BigPig


7 minutes ago, BigPig said:

I want to revive this thread because there's no clear answer yet: would 4x 2080 Ti beat a Threadripper 3990X for rendering (let's say) the Classroom scene in Blender using Cycles?

This question reminds me of one that was bandied about in my high school long ago:


“If you’re driving down the road and your station wagon loses a tire how many pancakes does it take to cover up a dog house?”


9 minutes ago, Bombastinator said:

This question reminds me of one that was bandied about in my high school long ago:


“If you’re driving down the road and your station wagon loses a tire how many pancakes does it take to cover up a dog house?”

Dude! I don't know if you realize it, but you're only polluting this thread with your egocentric, useless replies, with all due respect... if you don't have a valid answer, let's leave it for somebody else.

Maybe someone on this forum who already owns a TR 3990X and 4x 2080 Ti build can run a render of an available Blender scene and give us the render time.

Some people, including myself, have already tried GPU rendering and aren't pleased with it, so we need to know a couple of things before investing $4,000 in a CPU.
