
Folding on different PCIe slots


Will folding on an x1 or x4 PCIe slot make a difference to a card's performance?


not really.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt (NAS): ASUS Z9PR-D12, 2x E5 2620V2, 8x 4gb, 24x 3TB HDD, F80 800gb cache, TrueNAS, 2x 12-disk RAIDZ3 striped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


No, it doesn't have to transfer massive amounts of data; it just calculates and sends the results back.

Gaming HTPC:

R5 5600X - Cryorig C7 - Asus ROG B350-i - EVGA RTX2060KO - 16gb G.Skill Ripjaws V 3333mhz - Corsair SF450 - 500gb 960 EVO - LianLi TU100B


Desktop PC:
R9 3900X - Peerless Assassin 120 SE - Asus Prime X570 Pro - Powercolor 7900XT - 32gb LPX 3200mhz - Corsair SF750 Platinum - 1TB WD SN850X - CoolerMaster NR200 White - Gigabyte M27Q-SA - Corsair K70 Rapidfire - Logitech MX518 Legendary - HyperXCloud Alpha wireless


Boss-NAS [Build Log]:
R5 2400G - Noctua NH-D14 - Asus Prime X370-Pro - 16gb G.Skill Aegis 3000mhz - Seasonic Focus Platinum 550W - Fractal Design R5 - 
250gb 970 Evo (OS) - 2x500gb 860 Evo (Raid0) - 6x4TB WD Red (RaidZ2)

Synology-NAS:
DS920+
2x4TB Ironwolf - 1x18TB Seagate Exos X20

 

Audio Gear:

Hifiman HE-400i - Kennerton Magister - Beyerdynamic DT880 250Ohm - AKG K7XX - Fostex TH-X00 - O2 Amp/DAC Combo - 
Klipsch RP280F - Klipsch RP160M - Klipsch RP440C - Yamaha RX-V479

 

Reviews and Stuff:

GTX 780 DCU2 // 8600GTS // Hifiman HE-400i // Kennerton Magister
Folding all the Proteins! // Boincerino

Useful Links:
Do you need an AMP/DAC? // Recommended Audio Gear // PSU Tier List 


9 hours ago, GDRRiley said:

not really.

6 hours ago, FloRolf said:

No, it doesn't have to transfer massive amounts of data; it just calculates and sends the results back.

Oh yeah?

 

Then please, I would love to see recent test results backing up those claims. If you actually have a link to such tests, I'm interested, because I can't find any no matter how much I search.

 

On the other hand, the most recent test I found (from the Core 15/17 era) showed the exact opposite. The tester grabbed a GTX 970, the latest card at the time, and showed a small performance loss at x4 and a MASSIVE loss at x1.

 

@WhatComesAround, this has been a topic of debate since the dawn of time. According to @whaler_99, people did testing many years ago and found no difference. That's probably the source of the two responses above me. But both GPUs and software have changed a lot since then, and (as I've said) there have been reports of performance losses. The fact that keeping a CPU core free to feed each GPU is recommended only supports that claim.

 

So who's right here? I honestly have no idea. Again, it comes back to that lack of testing I mentioned.

 

For peace of mind, though, I would avoid folding on x4 if your GPU is a powerful one, and never fold on x1 unless it's a really potato one.

Want to help researchers improve the lives of millions of people with just your computer? Then join World Community Grid distributed computing and start helping the world solve its most difficult problems!

 


1 hour ago, Imakuni said:

[...]

For peace of mind, though, I would avoid folding on x4 if your GPU is a powerful one, and never fold on x1 unless it's a really potato one.

Thank you very much for answering so many of my recent questions; you're really helping me out. :D


1 hour ago, Imakuni said:

Then please, I would love to see recent test results backing up those claims. [...] The tester grabbed a GTX 970, the latest card at the time, and showed a small performance loss at x4 and a MASSIVE loss at x1.

Well, people also do mining on x1 risers. I think that maybe the tests you're talking about used a riser without Molex power, because obviously x1 doesn't deliver all the power x16 could, which is a massive 75 watts.

Additionally, the argument about one free CPU core ONLY applies to Nvidia GPUs. AMD does NOT need a free core per GPU.

 

I haven't done much research, but I will, so we can have peace of mind. Sadly, I can't really test this right now.



It will reduce your PPD by around 5-8% if you use PCIe 2.0 instead of PCIe 3.0.
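For what it's worth, the raw bandwidth gap between the two generations is easy to quantify. Here's a quick sketch using the per-lane transfer rates and line encodings from the published PCIe specs (the function name is just for illustration):

```python
# Rough theoretical per-lane PCIe bandwidth in MB/s, accounting only
# for line-code overhead (not protocol/packet overhead).

def lane_bandwidth_mb_s(gt_per_s, encoded_bits, payload_bits):
    """Usable bytes/s per lane: transfer rate scaled by line-code efficiency."""
    return gt_per_s * 1e9 * (payload_bits / encoded_bits) / 8 / 1e6

pcie2 = lane_bandwidth_mb_s(5, 10, 8)      # PCIe 2.0: 5 GT/s, 8b/10b  -> 500 MB/s
pcie3 = lane_bandwidth_mb_s(8, 130, 128)   # PCIe 3.0: 8 GT/s, 128b/130b -> ~985 MB/s

for lanes in (1, 4, 16):
    print(f"x{lanes}: PCIe 2.0 {pcie2 * lanes:.0f} MB/s, PCIe 3.0 {pcie3 * lanes:.0f} MB/s")
```

So PCIe 3.0 roughly doubles the bandwidth per lane; whether that actually translates into PPD depends on how chatty the folding core is over the bus.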


18 minutes ago, FloRolf said:

Well, people also do mining on x1 risers.

Pointless. Mining is mining, folding is folding; the two are completely different things. Just as an example I'm familiar with: geneferocl (for prime number testing) can go from next to no GPU usage with full CPU load to screen-lagging GPU load with heavy vRAM use and next to no CPU, just by changing the type of number being tested, without even changing the processing algorithm.

18 minutes ago, FloRolf said:

I think that maybe the tests you're talking about used a riser without Molex power, because obviously x1 doesn't deliver all the power x16 could, which is a massive 75 watts.

Quick explanation about how PCIe works:

[Image: diagram of PCIe slots comparing x1/x4/x8/x16 connectors]

The first part of the slot is ONLY for power, while the second is ONLY for data.

 

Now, notice how, regardless of the size (x1 or x16), the first part is exactly the same? That's because it really is! Assuming the slot follows the spec, it WILL be able to provide 75 W, no matter whether it's x1 or x16. In fact, it can provide even more than that, as seen by the RX 480 blowing up mobos with its over-75 W PCIe power draw.

 

Speaking of which, the 970 pulls less than 75 W (even when OCed) anyway:

[Image: GTX 970 power draw chart]

 

The reason risers tend to have Molex connectors attached is not that the slot can't provide enough power. It's that motherboards aren't designed to power six or seven graphics cards at once, which is exactly where x1 risers are likely to be used; the Molex connector is added to keep the mobo from burning out.

18 minutes ago, FloRolf said:

Additionally, the argument about one free CPU core ONLY applies to Nvidia GPUs. AMD does NOT need a free core per GPU.

I guess I forgot to explicitly mention that, given that I had the 970 test in mind when I was writing my post. Oops.

 

But without evidence to back it up, I'll still question that statement for recent usage.


 


I've tried running a lowly 750 Ti via a powered PCIe x1 riser, and it never passed 60k PPD. Directly in the x16 slot, it regularly hit 100k. So there: unscientific real-world data, completely unconfirmed :D

 

Running the Heaven benchmark actually gave a smaller performance hit. Go figure; I would have thought that having to display textures and such would make things worse. I can't remember the details, but the card did pull a predictable amount of power through that riser (it has a Kill A Watt on it), so I don't think the performance decrease was power-related. It also boosted to the same clocks.
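Those rounded PPD figures work out to a much bigger drop than the 5-8% generation gap mentioned earlier in the thread; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope PPD loss from the 750 Ti numbers above (rounded figures).
ppd_x16 = 100_000   # directly in the x16 slot
ppd_x1 = 60_000     # via the powered x1 riser

loss = (ppd_x16 - ppd_x1) / ppd_x16
print(f"PPD loss on the x1 riser: {loss:.0%}")  # -> 40%
```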

