
Will PCIe 3.0 have an impact on distributed computing projects?

Hey everyone,

One last question for today: Does PCIe 3.0 make a difference in a rig used for distributed computing when equipped with the latest high-end GPU, or will the x16 lanes just never be saturated in these projects?

Thanks in advance!

Folding stats

Vigilo Confido

 


No, there is plenty of bandwidth even with 2.0.

[Out-of-date] Want to learn how to make your own custom Windows 10 image?

 

Desktop: AMD R9 3900X | ASUS ROG Strix X570-F | Radeon RX 5700 XT | EVGA GTX 1080 SC | 32GB Trident Z Neo 3600MHz | 1TB 970 EVO | 256GB 840 EVO | 960GB Corsair Force LE | EVGA G2 850W | Phanteks P400S

Laptop: Intel M-5Y10c | Intel HD Graphics | 8GB RAM | 250GB Micron SSD | Asus UX305FA

Server 01: Intel Xeon D 1541 | ASRock Rack D1541D4I-2L2T | 32GB Hynix ECC DDR4 | 4x8TB Western Digital HDDs | 32TB Raw 16TB Usable

Server 02: Intel i7 7700K | Gigabyte Z170N Gaming5 | 16GB Trident Z 3200MHz


2.0 is fine for a LOT of modern GPUs (even at x8)

 

3.0 should be PLENTY


The difference between 3.0 and 2.0 is negligible. It's the reason some individuals use riser cards and ribbons to run multiple GPUs off a single board, even when each slot is only at x4 or x8 and not x16.

 

Like a BOINC setup such as this one.

[image: open-frame multi-GPU BOINC rig]

2023 BOINC Pentathlon Event

F@H & BOINC Installation on Linux Guide

My CPU Army: 5800X, E5-2670V3, 1950X, 5960X J Batch, 10750H *lappy

My GPU Army: 3080Ti, 960 FTW @ 1551MHz, RTX 2070 Max-Q *lappy

My Console Brigade: Gamecube, Wii, Wii U, Switch, PS2 Fatty, Xbox One S, Xbox One X

My Tablet Squad: iPad Air 5th Gen, Samsung Tab S, Nexus 7 (1st gen)

3D Printer Unit: Prusa MK3S, Prusa Mini, EPAX E10

VR Headset: Quest 2

 

Hardware lost to Kevdog's Law of Folding

OG Titan, 5960X, ThermalTake BlackWidow 850 Watt PSU


3 hours ago, Nicnac said:

Hey everyone,

One last question for today: Does PCIe 3.0 make a difference in a rig used for distributed computing when equipped with the latest high-end GPU, or will the x16 lanes just never be saturated in these projects?

Thanks in advance!

Like everything: it depends. For most projects it's probably not a problem, but I can see some random BOINC project having high bandwidth requirements. Generally, you'll be fine with 3.0 x8 or 2.0 x16 and above.
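The "3.0 x8 is roughly equal to 2.0 x16" rule of thumb follows straight from the published per-lane rates: PCIe 2.0 runs at 5 GT/s with 8b/10b encoding (500 MB/s per lane), while 3.0 runs at 8 GT/s with the much leaner 128b/130b encoding (~985 MB/s per lane). A quick back-of-the-envelope sketch:

```python
# Theoretical one-direction PCIe bandwidth, from published per-lane rates:
# 2.0: 5 GT/s with 8b/10b encoding  -> 500 MB/s per lane
# 3.0: 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane
PER_LANE_MBPS = {
    "2.0": 500,
    "3.0": 8000 * 128 / 130 / 8,  # ~984.6 MB/s
}

def bandwidth_gbps(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a gen/lane combo."""
    return PER_LANE_MBPS[gen] * lanes / 1000

for gen, lanes in [("2.0", 16), ("3.0", 8), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {bandwidth_gbps(gen, lanes):.1f} GB/s")
# PCIe 2.0 x16: 8.0 GB/s
# PCIe 3.0 x8:  7.9 GB/s
# PCIe 3.0 x16: 15.8 GB/s
```

So 3.0 x8 and 2.0 x16 land within a rounding error of each other, which is why either is a safe floor for compute workloads that mostly keep data resident on the GPU.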

Want to help researchers improve the lives of millions of people with just your computer? Then join World Community Grid distributed computing, and start helping the world solve its most difficult problems!

 


14 minutes ago, Ithanul said:

The difference between 3.0 and 2.0 is negligible. It's the reason some individuals use riser cards and ribbons to run multiple GPUs off a single board, even when each slot is only at x4 or x8 and not x16.

 

Like a BOINC setup such as this one.

[image: open-frame multi-GPU BOINC rig]

Woah.......


 


14 hours ago, Nicnac said:

Woah.......

I get to see quite a few of those on the different forum I frequent.

 

Some are all air-cooled; others have crazy amounts of AIOs or full custom loops. The only thing those computers do is crunch 24/7.



13 minutes ago, Ithanul said:

I get to see quite a few of those on the different forum I frequent.

 

Some are all air-cooled; others have crazy amounts of AIOs or full custom loops. The only thing those computers do is crunch 24/7.

What CPU is powering all those? 

Do you think my old Phenom 1090T could support a 1070 and a 970 just running BOINC or folding? Obviously it would be bottlenecking in games.


 


Just now, Nicnac said:

What CPU is powering all those? 

Do you think my old Phenom 1090T could support a 1070 and a 970 just running BOINC or folding? Obviously it would be bottlenecking in games.

You'd be fine running two GPUs with a 1090T. I do so myself with my folder/BOINC rig at the moment, which has a 1090T with a GTX 980 and GTX 960 all running under Mint MATE 18. The 1090T has six threads, so you will be fine. I actually fold with both cards while three of the 1090T's threads do BOINC projects.
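The setup above is really just a thread-budget calculation: each Nvidia card folding pins one CPU thread, and whatever is left over can run BOINC CPU tasks. A hypothetical sketch of that arithmetic (the function name is made up for illustration):

```python
# Hypothetical thread-budget check for a folding/BOINC box, assuming
# each Nvidia GPU doing compute work pins one CPU thread (as described above).
def spare_threads(cpu_threads: int, nvidia_gpus: int, boinc_tasks: int) -> int:
    """Threads left over after feeding the GPUs and running BOINC CPU tasks."""
    return cpu_threads - nvidia_gpus - boinc_tasks

# Phenom II X6 1090T: 6 threads, two folding GPUs, three BOINC CPU tasks
print(spare_threads(6, 2, 3))  # -> 1 thread left for the OS
```

By the same count, a 1070 + 970 pair on a 1090T leaves four threads free for BOINC or the OS, which is why big multi-GPU rigs reach for high-thread-count Xeons.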

 

For rigs such as the one pictured, those types tend to run Xeons since Nvidia cards require one thread of the CPU for each card to do compute work.



On 23/10/2016 at 1:32 PM, Ithanul said:

For rigs such as the one pictured, those types tend to run Xeons since Nvidia cards require one thread of the CPU for each card to do compute work.

Which is kinda BS on the software part, given that the required thread doesn't actually do anything... but yeah, F@H needs a thread / core per Nvidia GPU.


 

