
290X Crossfire on an x16 and a x4 PCI-E 2.0. Is that bad?

crystal6tak

Motherboard is the GA-970A-UD3.

 

Just realized the 2 slots are an x16 and an x4. Will there be an issue of them slowing down the cards? The Hawaii GPU doesn't have a CrossFire bridge and relies entirely on PCIe, and on top of that it's x4... Should I be worried?


One card in an x8 slot would be acceptable. But x4?...

 

"My opinion is that your opinion is wrong." - AlwaysFSX    CPU I5 4690k MB MSI Gaming 5 RAM 2 x 4GB HyperX Blu DDR3 GPU Asus GTX970 Strix,  Case Corsair 760T Storage 1 x 120GB 840EVO 1 x 1TB WD Blue, 1 x 500GB Toshiba  

 The cave/beast v2 (OLD) http://imgur.com/a/8AmeH                                  PSU 600W Raidmax RX600AF Displays ASUS VS278Q-P x2, BenQ Xl2720z Cooling Dark Rock 3, 4 AP120s Keyboard Logitech G710+ Mouse Razer Deathadder 

 


I think CrossFire is able to run down at x4; SLI requires x8 minimum... Give it a shot, but you may be lacking performance on that card.

Intel I9-9900k (5Ghz) Asus ROG Maximus XI Formula | Corsair Vengeance 16GB DDR4-4133mhz | ASUS ROG Strix 2080Ti | EVGA Supernova G2 1050w 80+Gold | Samsung 950 Pro M.2 (512GB) + (1TB) | Full EK custom water loop |IN-WIN S-Frame (No. 263/500)


If it were an x8 slot, I wouldn't worry, but that's an x4... :(

 


Senor Shiny: Main- CPU Intel i7 6700k 4.7GHz @1.42v | RAM G.Skill TridentZ CL16 3200 | GPU Asus Strix GTX 1070 (2100/2152) | Motherboard ASRock Z170 OC Formula | HDD Seagate 1TB x2 | SSD 850 EVO 120GB | CASE NZXT S340 (Black) | PSU Supernova G2 750W  | Cooling NZXT Kraken X62 w/Vardars
Secondary (Plex): CPU Intel Xeon E3-1230 v3 @1.099v | RAM Samsun Wonder 16GB CL9 1600 (sadly no oc) | GPU Asus GTX 680 4GB DCII | Motherboard ASRock H97M-Pro4 | HDDs Seagate 1TB, WD Blue 1TB, WD Blue 3TB | Case Corsair Air 240 (Black) | PSU EVGA 600B | Cooling GeminII S524


(Deceased) DangerousNotDell- CPU AMD AMD FX 8120 @4.8GHz 1.42v | GPU Asus GTX 680 4GB DCII | RAM Samsung Wonder 8GB (CL9 2133MHz 1.6v) | Motherboard Asus Crosshair V Formula-Z | Cooling EVO 212 | Case Rosewill Redbone | PSU EVGA 600B | HDD Seagate 1TB

DangerousNotDell New Parts For Main Rig Build Log, Señor Shiny

 


Why oh why would you possibly spend so much money on GPUs and then put them in a budget board? Also, to answer your question: Nvidia cards just straight-up refuse to run SLI at x4 bandwidth, so that should give you a pretty good idea.

Core i7 4820K  |  NH-D14 | Rampage IV Extreme | Asus R9 280X DC2T | 8GB G.Skill TridentX | 120GB Samsung 840 | NZXT H440  |  Be quiet! Dark Power Pro 10 650W


A PCIe 2.0 x4 slot is equivalent in bandwidth to a PCIe 3.0 x2 slot. That's a big bottleneck right there.
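For context, here's the napkin math behind that (my own sketch, using the usual effective per-lane rates after encoding overhead):

```python
# Approximate effective PCIe bandwidth per lane, in GB/s
# (8b/10b encoding for gen 1/2, 128b/130b for gen 3).
PER_LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985}

def pcie_bandwidth(gen: int, lanes: int) -> float:
    """One-directional bandwidth of a PCIe link, in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(pcie_bandwidth(2, 4))   # 2.0 GB/s  -> the x4 slot in question
print(pcie_bandwidth(3, 2))   # ~1.97 GB/s -> roughly the same pipe
print(pcie_bandwidth(2, 16))  # 8.0 GB/s  -> the primary x16 slot
```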


From what I've heard, it'll work, but there will be a performance hit that keeps the 2nd card from reaching its full potential.

Asus B85M-G / Intel i5-4670 / Sapphire 290X Tri-X / 16GB RAM (Corsair Value 1x8GB + Crucial 2x4GB) @1333MHz / Coolermaster B600 (600W) / Be Quiet! Silent Base 800 / Adata SP900 128GB SSD & WD Green 2TB & SG Barracuda 1TB / Dell AT-101W / Logitech G502 / Acer G226HQL & X-Star DP2710LED


Let me put this straightforwardly: you are wasting the second GPU.

C++, Assembly, Reverse Engineering, Penetration Testing, Malware Analysis


 


Do you have a question? Are you interested in programming? Do you like solving complex problems? Hit me up with a PM.


You will probably see a small difference in performance compared to running x16/x8, but can you not do x8/x8, or does your motherboard only have an x4 PCIe slot free?


It'll be a bottleneck, but it won't be that big.

This.

I did not notice a difference when I moved my card from x16 to x4 (2.0).

n0ah1897, on 05 Mar 2014 - 2:08 PM, said:  "Computers are like girls. It's whats in the inside that matters.  I don't know about you, but I like my girls like I like my cases. Just as beautiful on the inside as the outside."


It'll be a bottleneck, but it won't be that big.

 

This.

I did not notice a difference when I moved my card from x16 to x4 (2.0).

Do you know any rough numbers? Like is it 10% or 20% worse than no bottleneck?


It will function, just not nearly at 100%.

I know from personal experience... I'm running two CrossFired 7850s, with one on a PCIe 2.0 x16 (which is totally acceptable) and the second one on a PCIe 1.1 x4 (lol). Gotta love dat Z68 chipset though xD

CPU: AMD FX8350 @4.4GHz | MOBO: ASUS Sabertooth 990FX R2.0, 990FX chipset | RAM: 16GB (4x4) dual channel Patriot Xtreme series DDR3 @1866MHz @1.65V | GPU: Asus Radeon R9 Fury Strix| PSU: Corsair AX860i 860 watt | CPU cooling: Noctua NH-D15S + additional Noctua NF-F12 | Case:Corsair C70 Black | Storage: 3x 128GB Samsung 840 pro SSDs; 1 for the OS, 2 in RAID 0 for games. 3x WD Red 3TB HDDs in raid 5 for bulk storage | Displays: 1x Dell 3007WFP 30 inch 2560x1600 IPS LCD. 1x I-Inc IH253DPB 25 inch 1920x1080 TN LCD | Keyboard: Corsair K70 with Cherry MX brown switches + Blue LED backlight | Headphones: Sennheiser HD 280 Pro | Mouse: Logitech G600 @1100 DPI | OS: Win 10 Pro 64 bit | 

Mfg/model number: Clevo/W355SSQ | CPU: Intel i7 4710MQ @3.5GHz  | MOBO: W35xSS_370SS, HM87 Chipset | RAM: 16GB (2x8) dual channel Crucial Ballistix DDR3 @1866MHz | GPU: GTX860m 2GB Gddr5 | Battery: 76,960mW/h 8 Cell battery, 3 Hrs full on a full charge | Storage: 1x 128GB Samsung 840 pro SSD for the OS. 1x WD Red 1TB HDD For storage and games | Displays: 1x 15.6" 1080p LCD | Keyboard: Full 103 key back-lit keyboard | Mouse: Logitech M510 | OS: Win 10 Pro 64 bit |


I was running 2 R9 280Xs on an H97 chipset (the motherboard was "crossfire ready"). The difference between x8/x8 and my x16/x4 was like 5%. But you are talking about 2 R9 290Xs, so the performance loss will be noticeably higher (I'm estimating around 12 to 15%, or even higher in some games).


You may get lower results in benchmarks, but for most games I can't see there being much of a difference. You're really just at the mercy of data being transferred from system memory to GPU memory, which for PCIe 2.0 x4 is about 2 GB/s.
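A rough sketch of what that 2 GB/s means for CrossFire specifically: with xDMA, the second card's finished frames have to cross that link. Napkin math only, assuming raw 32-bit 1080p framebuffers and ignoring the texture and sync traffic that also competes for the link:

```python
# Ceiling on how many finished 1080p frames per second fit through
# a PCIe 2.0 x4 link, ignoring every other kind of traffic on it.
link_bytes_per_s = 2e9               # ~2 GB/s for PCIe 2.0 x4
frame_bytes = 1920 * 1080 * 4        # one 32-bit 1080p framebuffer, ~8.3 MB

print(link_bytes_per_s / frame_bytes)  # ~241 frames/s, best case
```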

 

This comparison should put your mind at ease (it seems that in this case the AMD card relies less on PCIe bandwidth than Nvidia for single-monitor 1200p gaming, which makes sense given that the 680 in this example has less video memory):

 

[Image: crysis_5760_1080.png — Crysis PCIe scaling comparison chart]

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Motherboard is the GA-970A-UD3.

 

Just realized the 2 slots are an x16 and an x4. Will there be an issue of them slowing down the cards? The Hawaii GPU doesn't have a CrossFire bridge and relies entirely on PCIe, and on top of that it's x4... Should I be worried?

 

AMD suggests a minimum of PCIe 3.0 x8/x8 for R9 290 CrossFire. You will have better performance than with a single card; however, it will be limited, for two reasons:

 

1) not enough bandwidth for the cards to share info

2) high latency between the cards (assuming the 970A runs the x4 slot via the chipset)
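If you're on Linux, you can verify what speed and width each slot actually negotiated before buying anything. A quick sketch that scrapes `lspci -vv` (the LnkSta line format shown is the typical one, but treat the parsing as an assumption, and run as root so the capability blocks are visible):

```python
# Print the negotiated PCIe link speed/width for any VGA device.
# lspci -vv emits lines like:  LnkSta: Speed 5GT/s, Width x4, ...
import re
import subprocess

out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
device = None
for line in out.splitlines():
    if line and not line[0].isspace():   # unindented line = new device header
        device = line
    m = re.search(r"LnkSta:\s+Speed\s+(\S+),\s+Width\s+(x\d+)", line)
    if m and device and "VGA" in device:
        print(f"{device}\n  -> negotiated {m.group(1)}, {m.group(2)}")
```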

 

I experienced lower performance and audio stutters when running my R9 290 CrossFire via an x16/x4 PCIe 3.0 system. See the results below:

 

- this one was a Z97 Gigabyte board with x16/x4 PCIe 3.0

[Image: post-144788-0-64635700-1418672845.png — benchmark result]

 

- this one was a Z87 ASUS board with x8/x8 PCIe 3.0

[Image: post-144788-0-41973400-1418672861_thumb. — benchmark result]

 

 

I also got about 16k GPU points when I ran PCIe x16/x4 on the Z87 board.

 

I went from about 95 fps to 99 fps in Valley

 

I noticed no difference in Tomb Raider, but Far Cry 3 was a huge difference.

You may get lower results in benchmarks, but for most games I can't see there being much of a difference. You're really just at the mercy of data being transferred from system memory to GPU memory, which for PCIe 2.0 x4 is about 2 GB/s.

 

This comparison should put your mind at ease (it seems that in this case the AMD card relies less on PCIe bandwidth than Nvidia for single-monitor 1200p gaming, which makes sense given that the 680 in this example has less video memory).

 

 

 

R9 290s are a little different... they use xDMA to share data entirely over the PCIe bus. Depending on the mobo this can be via the chipset, introducing huge latency in accessing memory. The above test considers neither bridgeless cards nor PCIe routed over the chipset.
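To make the chipset point concrete (illustrative numbers only, and the uplink name varies by platform; on Intel boards of this era it's DMI 2.0, itself roughly a PCIe 2.0 x4 link):

```python
# A chipset-hung x4 slot doesn't even get its nominal 2 GB/s: the
# chipset's single uplink to the CPU is shared by everything below it.
uplink_gbps = 2.0                                         # ~DMI 2.0 total
other_traffic_gbps = {"SATA SSD": 0.5, "USB 3.0": 0.4, "NIC": 0.125}

left_for_gpu = uplink_gbps - sum(other_traffic_gbps.values())
print(f"~{left_for_gpu:.2f} GB/s left for the second GPU under load")
```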

Sim Rig:  Valve Index - Acer XV273KP - 5950x - GTX 2080ti - B550 Master - 32 GB ddr4 @ 3800c14 - DG-85 - HX1200 - 360mm AIO


Long Live VR. Pancake gaming is dead.


So I take it that the second PCIe x16 (actually x4 gen 2) slot on my dead H87M-Pro would have been effectively useless for CrossFire (which was part of its marketing).

 

 

You may get lower results in benchmarks, but for most games I can't see there being much of a difference. You're really just at the mercy of data being transferred from system memory to GPU memory, which for PCIe 2.0 x4 is about 2 GB/s.

 

This comparison should put your mind at ease (it seems that in this case the AMD card relies less on PCIe bandwidth than Nvidia for single-monitor 1200p gaming, which makes sense given that the 680 in this example has less video memory):

 

[Image: crysis_5760_1080.png — Crysis PCIe scaling comparison chart]

Thanks for the chart; by the looks of things my P5K-VM will be just fine with my GTX 970 (PCIe 1.0, full x16), and will be relevant for quite some time.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


So I take it that the second PCIe x16 (actually x4 gen 2) slot on my dead H87M-Pro would have been effectively useless for CrossFire (which was part of its marketing).

Fine for CrossFire cards with a bridge. The R9 290 is the first card to use xDMA, and the first to (basically) require an Nvidia SLI-certified board.

AMD still lets you use x16/x4 for R9 290s, but it's not optimal.

Nvidia is just like "nope", and when you hack it with HyperSLI you lose similar levels of performance.

Sim Rig:  Valve Index - Acer XV273KP - 5950x - GTX 2080ti - B550 Master - 32 GB ddr4 @ 3800c14 - DG-85 - HX1200 - 360mm AIO


Long Live VR. Pancake gaming is dead.


So xDMA is where cards in CrossFire actually use the link to share data, instead of relying heavily on the PCIe slots? Might be worth looking into AMD for graphics if Nvidia are going to be slack. (I know about some of the current 'issues' people have with AMD cards, but just like all of my previous computers, if it works and does what I want when I want, nothing else matters, i.e. I'm not a fanboy.)

 

Edit: looked it up; it probably will still affect performance on motherboards with a cheaply made PCIe setup, because there is no real excuse, other than a slightly higher cost, for not having an x16-sized slot capable of delivering full x16, instead of x2 or x4 with all other PCIe slots disabled (aka, there goes my Wi-Fi). Perhaps if the bridge were dedicated to certain tasks it could take some of the load off the PCIe slots?

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


What of the EXTRA latency it brings that destroys performance smoothness? Meaning non-native 2.0 x4 provided by the chipset, not the CPU.

 

I get that native PCIe 2.0 x4 is mostly fine, but what about the secondary chipset latency?

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


What of the EXTRA latency it brings that destroys performance smoothness? Meaning non-native 2.0 x4 provided by the chipset, not the CPU.

There is a measured increase in frame time variance in the Metro: Last Light benchmark.

I get that native PCIe 2.0 x4 is mostly fine, but what about the secondary chipset latency?

Sim Rig:  Valve Index - Acer XV273KP - 5950x - GTX 2080ti - B550 Master - 32 GB ddr4 @ 3800c14 - DG-85 - HX1200 - 360mm AIO


Long Live VR. Pancake gaming is dead.


 

What of the EXTRA latency it brings that destroys performance smoothness? Meaning non-native 2.0 x4 provided by the chipset, not the CPU.

There is a measured increase in frame time variance in the Metro: Last Light benchmark.

I get that native PCIe 2.0 x4 is mostly fine, but what about the secondary chipset latency?

 

I've been reading around on it a bit more; there are a few cases of dual-GPU setups being worse off than a single GPU because of this variance in frame time delivery :(

One of the reasons I am now grabbing a new x8/x8 PCIe 3.0 mobo first, before another 290 for my existing x16 (3.0) / x4 (non-native 2.0) board.
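For anyone wanting to put a number on that stutter: "frame time variance" is just the spread of per-frame render times, and two logs with the same average fps can feel completely different. A minimal sketch with made-up numbers:

```python
# Two frametime logs (ms) with near-identical average fps but very
# different smoothness. Illustrative values, not measured data.
import statistics

smooth  = [16.7] * 8                          # steady ~60 fps
stutter = [10, 23, 11, 24, 10, 23, 11, 22]    # also ~60 fps on average

for name, times in (("smooth", smooth), ("stutter", stutter)):
    fps = 1000 / statistics.mean(times)
    print(f"{name}: {fps:.0f} fps avg, "
          f"stdev {statistics.pstdev(times):.1f} ms, "
          f"worst frame {max(times)} ms")
```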

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 

