
Negatives about AMD CPUs

If you plan to upgrade again in the near future anyway, just wait for Zen... In the meantime, assuming you've already settled on AM3+, try to find an old Phenom II: those have better single-core performance than the FX chips and will be a lot cheaper, which matters for something that sounds like it will only ever be a stop-gap solution. Whatever you do, I certainly wouldn't buy a brand-new CPU now only to buy another brand-new one before the end of the year...

Main Rig "Melanie" (click!) -- AMD Ryzen7 1800X • Gigabyte Aorus X370-Gaming 5 • 3x G.SKILL TridentZ 3200 8GB • Gigabyte GTX 970 G1 Gaming • Corsair RM750x • Phanteks Enthoo Pro --

HTPC "Keira" -- AMD Sempron 2650 • MSI AM1I • 2x Kingston HyperX Fury DDR3 1866 8GB • ASUS ENGTX 560Ti • Corsair SF450 • Phanteks Enthoo EVOLV Shift --

Laptop "Abbey" -- AMD E-350 • HP 646982-001 • 1x Samsung DDR3 1333 4GB • AMD Radeon HD 6310 • HP MU06 Notebook Battery • HP 635 case --


Also, what user experience?

Your own?

So far, the people who bitch about this stuff are usually the type who can be described as "the grass is always greener on the other side".

Been there, done that. Guess how much it cost me: 550 USD.

Guess how much performance I gained: 20% in games.

Guess how much my old FX 8320 bottlenecked the 295X2 OUTSIDE of The Witcher 3 (which didn't have CF profiles; the reason I changed to Intel in the first place was to play TW3 and heavily modded Skyrim at full details).

The answer is: not a whole lot...

I'll ask my friend if I can have my FX and Sabertooth back for some testing over the coming weekend; he still hasn't built his rig.

Then I'll redo some benches with my FX running at its proper speeds.

I'll have to mod my Kraken X60 back to life, or run the stock cooler on the FX, though...

Intel, so far, has provided me with three benefits with their Z97 chipset:

PCIe Gen 3

M.2

Support for high-speed memory.

What did I lose vs my old 990FX chipset?

4 USB 2.0 ports

2 USB 3.0 ports

2x eSATA ports

Triple CF

2x SATA ports

So I lost shitloads of connectivity. Which is neat, since I own 2x external HDDs, a webcam, mouse, KB + some more.

Was it worth spending 550 USD to get an i7? No. It wasn't.

My old setup ran 2x HD 7950 in CF without issues. Yes, I tested them with my i7; they perform a few FPS better, but that is more likely due to recent driver improvements, since on average I only got about 5 FPS more in games I KNOW can use all 8 threads of a CPU.

So stop spreading bullshit. You haven't owned an FX, and you never will.

Just like with FM2+ vs the i3: if you did a blind test without any FPS counters, just pure gaming with no evidence of which was which, I dare say you wouldn't be able to tell them apart.

Remember, just because a game uses ONE core doesn't mean the load is heavy. The CPU workload in games is usually pretty light: intense, but not complex. Complexity is what kills FX. Synthetics use complex code; that's why Intel wins them. The workload itself exposes FX's weakness, which is fine, because a synthetic benchmark should do exactly that.

However, let me ask you this.

Haswell is, core for core, 70% faster than FX Piledriver cores.

So in a game that is purely CPU-bound with a Titan X, running at 640x480, we should in theory see 70% higher performance.

BUT WE DON'T. We see at most 30%.

So what is all that superior IPC doing? What are the lower cache and sub-cache latencies doing?

Or more importantly, WHY THE FUCK IS FX PERFORMING THAT WELL???? It should be getting smashed, and rightfully so. It loses, but nowhere near as badly as it SHOULD.

Now, I will not claim AMD traded away 70% core performance so they could add some voodoo or witchcraft to their architecture. But in the end, when you boil it down, most of the benefit Intel has over AMD is PURELY IN YOUR HEAD.

Once you get over the "giddy" period where "everything new is so much better", you realize it's just like it was.

So at this point, I am more interested in finding out why Intel processors perform so poorly compared to FX.

Because any way you look at it, a 970 is a 970... if you have 70% more raw CPU power, that should by all means produce MORE than a 10 FPS difference...

To give you an idea of how much power that is:

Skylake i5 6600K vs Sandy Bridge i5 2500K -> Skylake is 30% stronger, core vs core, and it yields roughly 30% better performance when both use RAM of equal speed. So why the fuck can FX, which is notably slower by every measurable metric, be within 30% of ANY FUCKING INTEL CPU???
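
To put rough numbers on the argument (my own back-of-the-envelope sketch; the FPS figure is made up, the percentages are the ones quoted in this thread):

# Sketch of the scaling argument: if games were purely core-bound,
# a 70% faster core should give ~70% more FPS; the benches show ~30%.
fx_fps = 60.0               # hypothetical FX-8350 result in a CPU-bound scene
per_core_advantage = 0.70   # "Haswell is ~70% faster per core"
observed_gain = 0.30        # what game benchmarks actually tend to show

print(f"Expected if purely core-bound: {fx_fps * (1 + per_core_advantage):.0f} FPS")
print(f"Typically observed:            {fx_fps * (1 + observed_gain):.0f} FPS")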

Other people on this forum are posting screenshots in this topic showing the GPU load after playing games.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


It is shameful when an 8-core at 4+ GHz loses to a dual-core i3 6100 with HT at 3.7 GHz in more titles than it wins. I just don't think it's right to spread the misinformation that the FX 8350 is competition for the i5 6600K in gaming when it is losing to the i3.

 

Even in DX12.


Other people on this forum are posting screenshots in this topic showing the GPU load after playing games.

Screenshots are useless; they only prove what you want to show.

Say 20 seconds out of 5 minutes of play show clear signs of a bottleneck. OK, you don't like FX, or you want to prove that it bottlenecks, so you screenshot that moment.

This is why I swear by YouTube videos WITH an FPS overlay. If there are bottlenecks, they are much more noticeable. If there aren't, the video will clearly show that too.

My point still stands though: why does a 5960X, which core vs core is 70% stronger, only beat an FX 8350 by 30%???

What is all that superior performance doing? Because often it's not even a 30% difference, it's LESS.


The E-series chips are massively underclocked; they only turbo if they have really low temps.

An 8320E loses to a non-E 8320 in every fucking benchmark:

http://anandtech.com/bench/product/1402?vs=698

In gaming benchmarks it's even worse, because not all games are able to trigger turbo on AMD CPUs. The load they put on the CPU simply isn't heavy enough for turbo to engage all the time, unlike Intel, which turbos almost all the time.

Here, 8370E vs 8320:

http://anandtech.com/bench/product/1340?vs=698

The 8370E has higher turbo speeds than the 8320, so if it turbos, it should win every fucking time.

The 8320E that AT got was a golden sample: it hit 4.8 GHz AND could be undervolted from stock... that is fucking insane. So even a golden chip vs the 8320, stock vs stock, IT STILL LOSES.

Yet when we run the 8320E @ 4.8 GHz, things are different. VERY different.

http://anandtech.com/bench/product/1403?vs=698

Their 8320E at 4.8 GHz beats out a 9590. That's how good their chip was/is.

In gaming an 8320E is equal to or WORSE than an FX 6300... yeah, that's how bad it is.

Those CPUs are meant as an upgrade for users on older/lesser chipsets, like the 880 and 970. Yet in practice they're mostly a wash, because the clock speed is notably lower than normal.

IF, however, you spend the time to OC these E chips, they will offer higher clock speeds at lower voltages. AND THAT IS ONLY IF YOU BOTHER TO MANUALLY OC AND UNDERVOLT.

Just like I managed to get 4 GHz at 1.28 V on my 8320: undervolted by 0.08 V, yet it works fine at the full 4 GHz clock.


Screenshots are useless; they only prove what you want to show.

Say 20 seconds out of 5 minutes of play show clear signs of a bottleneck. OK, you don't like FX, or you want to prove that it bottlenecks, so you screenshot that moment.

This is why I swear by YouTube videos WITH an FPS overlay. If there are bottlenecks, they are much more noticeable. If there aren't, the video will clearly show that too.

My point still stands though: why does a 5960X, which core vs core is 70% stronger, only beat an FX 8350 by 30%???

What is all that superior performance doing? Because often it's not even a 30% difference, it's LESS.

It was more than a few minutes. And they were trying to prove there was no bottleneck with their FX8350 and GTX 970. Oh and I'd like to see where that 5960X is only beating an FX 8350 by 30%.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


It was more than a few minutes. And they were trying to prove there was no bottleneck with their FX8350 and GTX 970. Oh and I'd like to see where that 5960X is only beating an FX 8350 by 30%.

Go find a benchmark. Like any of them.


Go find a benchmark. Like any of them.

Uh uh, the burden of proof is on you.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Other people on this forum are posting screenshots in this topic showing the GPU load after playing games.

They also provide videos and such, but the ignorant will ignore those too.

This video is less than a few months old... Gameplay starts at 1m:40s.

I play this every day on my GTX 970, so I know how well it should run, and I also saw the same thing with an HD 7950 when I had one and put it in my mate's FX-6.

See this FX-9 playing WoT with his GTX 970... Yes, it's playable... it has ample FPS for playability... the problem is...

That GPU should have him at a locked 60 FPS with Vsync, or 70-110 FPS ALL THE TIME without it.

He's not getting the EFFICIENT use of his hardware that he should be expecting.

Yes, I'm giving one example, and there are examples of FX playing games fine, but the issue is there in some select games that THOUSANDS of people play.

I saw it first when I put MY HD 7950 into my mate's FX-6 and the FPS I was used to seeing... was halved! Only spiking up to where it should be, then dragging itself back down...

Also... I'd rather buy a CPU you DIDN'T NEED to OC to get the performance you're after...

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


-snip-

Considering my Phenom II N970 can handle WOT.....

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Considering my Phenom II N970 can handle WOT.....

Thing is, many 1-4 core games have these FPS drops due to the single-core performance of FX. People write off F2P games, but they're so popular... it should count for something.

Thank god more modern games are doing better, but the thing is, many thousands of people play these lighter titles, and that should count...

But people just see me saying World of Tanks and go "it's a shit game" (I'm just using it as an example), even though the issue is still present in other games when it comes to MIN FPS and the drops.

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


-snip-

I honestly don't think the people who say that offhand were ever players who could actually work with the rest of their team in WoT. It's always those who are used to single-player games (or games like COD/CSGO/BF) who dislike WoT, and though it does have its faults, it's far from a bad or shit game. Games like WoT are also good because they are designed to run well even on low-end hardware: it even lets you choose between the original engine and the new one. Though WT even runs on the Intel HD 4600, so almost any modern laptop or OEM desktop can run it properly.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Uh uh, the burden of proof is on you.

Talk about a missed opportunity. I gave you the chance to dig up ANY counter-claim, no matter how biased or cherry-picked. You should have taken my offer.

Either way, here you go.

All my % values are based on the average % difference. The reason for this is that with a plain % difference you have to choose which value is the "reference", and whichever way you do it, picking the Intel or the AMD value as the reference will be BIASED AS FUCK. So you have to use the average difference.

The method used for average % difference is as follows:

Step 1: Calculate the difference (subtract one value from the other); ignore any negative sign.
Step 2: Calculate the average (add the values, then divide by 2).
Step 3: Divide the difference by the average.
Step 4: Convert that to a percentage (multiply by 100 and add a "%" sign).
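
In code form, if anyone wants to rerun these numbers themselves (a quick sketch; the example FPS values are made up, not from any chart below):

def avg_pct_difference(a, b):
    """Symmetric percent difference: |a - b| divided by the mean of a and b."""
    diff = abs(a - b)        # Step 1: difference, sign ignored
    avg = (a + b) / 2        # Step 2: average of the two values
    return diff / avg * 100  # Steps 3 and 4: ratio, expressed as a percentage

# Example with made-up FPS numbers:
print(round(avg_pct_difference(104, 95), 2))  # -> 9.05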

 

 

First off, from the 5960X review done by TechReport:

[Crysis 3 FPS chart]

Difference in Crysis 3: 8.88%

[Watch Dogs FPS chart]

Difference in Watch Dogs: 12.19%

[Arkham Origins FPS chart]

(Here I will use the i7 4790K.)

Difference in Arkham Origins: 16.26%

[Thief FPS chart]

Thief and BF4 using Mantle show a 1 FPS difference between all the Intel and AMD CPUs, so I won't bother with Mantle.

Difference in Thief (DX11): 25.21%

 

 

Now over to AnandTech:

[F1 2013 chart]

(Here we will use the 8350, because it is the most-bought FX CPU, and the 4960X, since it scored the highest.)

Difference in F1 2013: 34.24%. FINALLY, A SINGLE GAME TO PROVE ME WRONG!!!!

[Bioshock Infinite chart]

(Here we will use the 8350 and the 4790K.)

Difference in Bioshock Infinite: 18.58%

[benchmark chart]

No point calculating this one; less than 5% difference.

[Sleeping Dogs chart]

(5960X vs 8350)

Difference in Sleeping Dogs: 11.83%

[BF4 chart]

(5960X vs 8350)

Difference in BF4: 14.00%

 

BUT WAIT, THERE IS MORE. LET'S REDO ALL OF THOSE ANANDTECH BENCHMARKS, NOW LOOKING ONLY AT MINIMUMS!!!!!!

[minimum FPS chart]

Difference in minimums: 46.1%. EHRMAGAWD, INTEL IS ACTUALLY SHOWING ITS SUPERIORITY...

[minimum FPS chart]

Difference in minimums: 80.19% <- FINALLY WE SEE THOSE 70% STRONGER CORES DO SOMETHING...

[minimum FPS chart]

Uhm... FX is better here. Right...

[minimum FPS chart]

Difference in minimums: 26.93%

[minimum FPS chart]

Difference in minimums: 30.96%

So in the end, only in 4 cases do we see Intel pulling ahead notably. Bear in mind, I made no distinction whether it is an i5 4590 or a 5960X scoring the highest for Intel; I take Intel's highest score and ONLY calculate against the FX 8350. Which does "hurt my argument", as using the 9590 would help the numbers for FX. However, only idiots buy a 9590, and the 8350 is the highest-grossing FX CPU at the moment. Sooooo, I'll shoot myself in the foot and go with the 8350 even if it performs worse than other FX options.

 

In the end, if we average all of those numbers, we end up with a 27.11% difference...
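
For anyone who wants to check that average, here is the exact calculation (just the twelve percentages from above dropped into Python; the two charts I skipped are left out):

# The per-game differences calculated above: TechReport averages first,
# then the AnandTech averages, then the AnandTech minimum-FPS runs.
diffs = [8.88, 12.19, 16.26, 25.21,
         34.24, 18.58, 11.83, 14.00,
         46.10, 80.19, 26.93, 30.96]

print(f"{sum(diffs) / len(diffs):.2f}%")  # -> 27.11%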

 

So my question still stands: what the fuck is going on with Intel CPUs?

They should perform better than this. Or, what the fuck is going on with AMD CPUs?

They should perform worse than this...

70% core performance does not explain the differences we see here, especially since we include 8-core, 16-thread Intel CPUs, which should be vastly superior core vs core AND in thread count. Things simply do not add up.

So perhaps the 70% benefit we see in pure synthetic workloads really doesn't translate into how these pieces of hardware perform in real life. Bottom line: something is off, and it ain't my math. Something is underperforming or overperforming; I don't know which of the "two sides" it is, but something doesn't seem right.

On the flip side:

An AMD FX 8350 + 990FX board is about 250-300 bucks.

A 5960X is around 1000 bucks, X99 boards around 150...

The gaming performance difference, as we can see, measure, and calculate with MATH, is less than 30% on average.

In that sense you pay roughly 850 USD to increase your performance by 30%... 850 USD... I can build a GOOD second gaming PC for that kind of money.
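
To make the price point concrete (a quick sketch using the ballpark prices above; I'm taking the middle of the 250-300 range):

fx_platform = 275       # FX 8350 + 990FX board, midpoint of 250-300 USD
intel_platform = 1150   # 5960X (~1000 USD) + X99 board (~150 USD)
avg_gain = 27.11        # average gaming difference from the charts above, in %

extra = intel_platform - fx_platform
print(f"~{extra} USD extra for ~{avg_gain:.0f}% more gaming performance")
print(f"That works out to about {extra / avg_gain:.0f} USD per percentage point")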


You know the CPU market has gone down the drain when people start recommending i3s in gaming builds O_O

A shadowy flight into the dangerous world of a man who does not exist.

 

Core 4 Quad Not Extreme, only available on LGA 557 at your local Circuit City


You know the CPU market has gone down the drain when people start recommending i3s in gaming builds O_O

[benchmark screenshots]

-snip-

I am not going to pay an extra $100 to get 10-20 FPS in games. If anything, I would get a last-generation i5 or Xeon; the price-to-performance just isn't there with the i3s, and trying to do anything else while gaming on them is impossible. Hopefully the CPU market improves, but like I said, it's in the toilet right now if you are buying new.

Edit: Also, the i3s are not going to age well IMO when it comes to games. I said this in another post too: i3s are like taking a plane with 4 engines and having 2 die; the plane can still fly on those 2 engines, but you're not going to fly around the globe on them. I will not be surprised if game developers start requiring 4 true (physical) cores in future games. Not likely, but if it does happen, well, you're in a bad position.

A shadowy flight into the dangerous world of a man who does not exist.

 

Core 4 Quad Not Extreme, only available on LGA 557 at your local Circuit City


-snip-

 

Still better than buying an 8-core AMD IMO. The (desktop) CPU industry is shit thanks to the monopoly, thanks to the lack of competition, thanks to AMD.


Still better than buying an 8-core AMD IMO. The (desktop) CPU industry is shit thanks to the monopoly, thanks to the lack of competition, thanks to AMD.

Yeah, I would not recommend the 8xxx series anymore. As for whose fault it is, Intel has not been helping all that much either, but then again they don't need to, because they know they can make money either way.

A shadowy flight into the dangerous world of a man who does not exist.

 

Core 4 Quad Not Extreme, only available on LGA 557 at your local Circuit City


Yeah, I would not recommend the 8xxx series anymore. As for whose fault it is, Intel has not been helping all that much either, but then again they don't need to, because they know they can make money either way.

I have an FX-8320 at 4.73 GHz, and in Cinebench R15 I can OC it to 5 GHz and come close to a 4770K. But I know that AMD has one shot with their next-gen CPUs, or they're toast lmao.

Main Rig:

| 13900K@6.1/4.7 w/TVB | Corsair h150i Elite LCD | Sapphire NITRO SE+ 6900XT@2710MHz | ASUS Z790 Strix-E | Corsair DDR5 Dominator Platinum 2x16GB@6200MT/s | Lian Li O11 Dynamic EVO |

My Folding Stats

 


PCIe 3 / M.2 / Bottlenecking / Heat / Single-core performance / FPS in games

PCIe 3 doesn't matter, and there are boards with M.2 support. Do not spread misinformation.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


-snip-

OK then... now find a comparison with them at the same clock speed and see the negatives of the FX 8350.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


PCIe 3 doesn't matter, and there are boards with M.2 support. Do not spread misinformation.

It does when you want to run SLI or Crossfire at x8, and especially with Crossfire if you need to run it at x4 for 3-way.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I have an FX-8320 at 4.73 GHz, and in Cinebench R15 I can OC it to 5 GHz and come close to a 4770K. But I know that AMD has one shot with their next-gen CPUs, or they're toast lmao.

Looks like you got a good chip :D Yeah, I have not pushed my 6300 all that much, but it's a decent overclocker from what I've seen so far.

A shadowy flight into the dangerous world of a man who does not exist.

 

Core 4 Quad Not Extreme, only available on LGA 557 at your local Circuit City


It does when you want to run SLI or Crossfire at x8, and especially with Crossfire if you need to run it at x4 for 3-way.

PCIe 2.0 x8 does not bottleneck modern GPUs AFAIK; PCIe 2.0 x4 does though, by 10-15%, so as long as you don't go below PCIe 2.0 x8 you're good. There is even a new 970-chipset motherboard out that's SLI certified, from ASUS, and lots of 990FX mobos are as well. No triple-SLI option available AFAIK, but still, who would run that with an FX CPU :P

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


PCIe 2.0 x8 does not bottleneck modern GPUs AFAIK; PCIe 2.0 x4 does though, by 10-15%, so as long as you don't go below PCIe 2.0 x8 you're good. There is even a new 970-chipset motherboard out that's SLI certified, from ASUS, and lots of 990FX mobos are as well. No triple-SLI option available AFAIK, but still, who would run that with an FX CPU :P

I know that PCIe 2.0 x8 doesn't bottleneck in most games, but with 3-way Crossfire you wouldn't have PCIe 2.0 x8/x8 and then PCIe 3.0 x4. No matter what you think, PCIe Gen 3 is useful, especially for PCIe SSDs.
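
For reference, the approximate per-direction bandwidth of the configurations being argued about (a rough sketch using the standard figures of ~500 MB/s per lane for Gen 2 and ~985 MB/s per lane for Gen 3):

# Approximate usable PCIe bandwidth per direction, in GB/s.
PER_LANE_GBPS = {2: 0.5, 3: 0.985}

for gen, lanes in [(2, 16), (2, 8), (2, 4), (3, 8), (3, 4)]:
    bw = PER_LANE_GBPS[gen] * lanes
    print(f"PCIe {gen}.0 x{lanes}: ~{bw:.1f} GB/s")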

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

