
Why is the AMD FX series so hated?

I honestly don't understand why these chips seem to be so hated. I built a LOT of FX systems, including my own gaming rig with the FX-9590. I absolutely loved it; it was air cooled, and it ran relatively cool and fast, even under heavy load.


Because its per-core performance was abysmal.

Plenty of games at the time (and for years after) were still single- or dual-threaded, and per core it was slower than an aged, overclocked Core 2 Duo.

 

Multithreaded performance, for its time, was fine enough.

Single-threaded performance was badly gimped, though.

 

And that's with the cores not getting full, independent FPUs, only shared FPU resources, and all the other stuff aside.
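To put rough numbers on why per-core speed mattered so much for games of that era, here's a minimal Amdahl's-law sketch. The figures are my own made-up illustration, not measurements: a 70%-serial workload, a quad-core rival at 1.0x per-core speed, and an FX-style part with 8 threads at 0.6x per-core speed.

```cpp
// Illustrative Amdahl's-law sketch (hypothetical numbers, not benchmarks):
// a mostly-serial game loop rewards per-core speed over core count.
#include <cstdio>

// Relative speed of a chip, where `serial` is the fraction of the
// workload that cannot be parallelized (Amdahl's law).
double effective_speed(double per_core, int cores, double serial) {
    return per_core / (serial + (1.0 - serial) / cores);
}

int main() {
    const double serial = 0.7; // assume a 70% serial game workload
    // Hypothetical: quad core at 1.0x per-core speed vs. an
    // FX-style 8-thread part at 0.6x per-core speed.
    std::printf("quad core @ 1.0/core: %.2f\n", effective_speed(1.0, 4, serial));
    std::printf("8 threads @ 0.6/core: %.2f\n", effective_speed(0.6, 8, serial));
}
```

With those assumed numbers, the quad core lands around 1.29x the single-core baseline and the 8-thread part around 0.81x, despite twice the threads; only when the workload is close to fully parallel (roughly 95%+ with these figures) does the extra core count win out.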

Maximums - Asus Z97-K w/ i5 4690, BCLK @ 106.9MHz x 39 = 4.17GHz, 8GB of 2600MHz DDR3, Gigabyte GTX 970 G1-Gaming @ 1550MHz

 


In addition to the bad single-core performance, there was controversy over whether the CPUs really had the advertised core count. In a number of multi-core tasks, the FX-8150 struggled against the Phenom II X6 1090T. For a newer 8c/8t CPU against an older 6c/6t processor, that doesn't make much sense.

 

Basically, whether a Bulldozer module is better thought of as two cores or as a single core with simultaneous multithreading depends on the workload. If the work requires floating-point operations, the CPU effectively has only half as many cores.
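A rough way to see the module design for yourself is a thread-scaling microbenchmark. Here's a minimal sketch (my own, with made-up workloads, nothing from AMD): the expectation on a 4-module FX chip is that integer throughput keeps improving from 4 to 8 threads (8 integer cores) while FP throughput improves much less (4 shared FPUs). Whether the gap shows up clearly depends on how hard each thread actually loads the shared FPU.

```cpp
// Minimal thread-scaling sketch: compare 4-thread vs. 8-thread times
// *within* each workload, not integer vs. FP absolute times.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

volatile unsigned long long int_sink; // sinks keep the loops from being optimized away
volatile double fp_sink;

// Integer ALU work: a serial LCG chain per thread.
void int_work(long long iters) {
    unsigned long long x = 1;
    for (long long i = 0; i < iters; ++i)
        x = x * 6364136223846793005ULL + 1442695040888963407ULL;
    int_sink = x;
}

// FP work: several independent mul/add chains per thread to keep the FPU busy.
void fp_work(long long iters) {
    double a = 1.0, b = 2.0, c = 3.0, d = 4.0;
    for (long long i = 0; i < iters; ++i) {
        a = a * 1.0000001 + 0.1;
        b = b * 1.0000002 + 0.2;
        c = c * 1.0000003 + 0.3;
        d = d * 1.0000004 + 0.4;
    }
    fp_sink = a + b + c + d;
}

// Time `threads` copies of `f` running in parallel, in seconds.
template <typename F>
double timed(F f, int threads, long long iters) {
    auto t0 = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (int i = 0; i < threads; ++i) pool.emplace_back(f, iters);
    for (auto& t : pool) t.join();
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
}

int main() {
    const long long iters = 100000000;
    for (int n : {4, 8})
        std::printf("%d threads: int %.2fs, fp %.2fs\n",
                    n, timed(int_work, n, iters), timed(fp_work, n, iters));
}
```

On a part with truly independent FPUs per core, both workloads should scale from 4 to 8 threads about equally; on a module design, the FP times would be expected to flatten out first.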

 

Now, a 4c/8t chip was still top-of-the-line while FX was current, so that isn't in and of itself the problem; the problem is that it wasn't advertised that way.


I was fine with them because they were dirt cheap; I even built a couple of systems with them myself. However, they had a ton of drawbacks.

 

For the time, power consumption was bad and temps ran really hot (by modern standards those temps sound cool, but TJmax on those chips was only around 70°C).

Single-threaded performance was actually slower than the previous-generation Phenom CPUs. When the previous generation of CPUs is faster than the current one, you know you did something wrong.

It also stuck around way longer than it should have: FX CPUs were current for five years. They were a bit slow when released, and five years later they had barely had a refresh, plus a couple of extra SKUs like the 9590 that were basically just pre-overclocked versions of the 8350, so AMD could say "we're not dead!" Even AMD themselves admitted FX was terrible: when they went to launch one of their GPUs (I think it was the Fury), they actually benchmarked it on an Intel system because they knew their own CPUs were likely to be a bottleneck.

 

If it works for you, go ahead and keep using it. It's just a platform that only made sense to use if you got it for next to nothing. That's half the reason that 6-7 years ago you could get an 8320 for $110; at that price it was hard to beat unless you were running something like GTX 980 SLI, and it let you put in a much higher-end GPU than if you had gone for something like an i5 4690K. It's just that if you needed to do actual CPU work, you'd rather have that i5 any day of the week.


The FX name goes back to Socket 939, when it meant being fast.

Then came Phenom, during the TLB-errata days of the early Agena chips; most were lucky to hit 3.0-3.2GHz. Those were almost called FX.

So they shelved the name. Then the Bulldozer FX line was born. And it was not fast.

And it made the FX name mud.

The end.


Lots of power and heat, but still embarrassingly slower than its competition.

 

Like 40% slower.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge WiFi // 32GB DDR5 G.SKILL Ripjaws S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850W // 1TB ADATA XPG SX8200 Pro / 1TB Teamgroup MP33 / 2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


short story: they are bad

I use my knowledge as a business owner and self-taught technician, as well as AI, to help people. AI might be controversial, but it actually works pretty well 90% of the time.


1 hour ago, SavageNeo said:

short story: they are bad

They are indeed bad, but at the same time they're good too.
FX is kinda weird that way, but that's how it is.

Inefficient to the point of pain in single-threaded applications, but they do OK in multi-threaded ones.

I'll explain it like this (actually, there are at least two ways I can think of to do that):
Think of a dragster at the strip during a run; that's the FX chip. It comes off the line burning the slicks, off the line and even down the track, while the other car hooks up and goes, so the other car wins the race.
Both have the power to get it done, but if you can't put all that power to the ground, it's just wasted. That's the thing with FX: it does have the power, but it's so inefficient that it's trying and frying the slicks from the word "go" and gets nowhere fast versus the other guy.

As for why they sold well enough to keep AMD afloat: once folks saw how high in MHz these chips could go, they became enthusiast chips overnight, with extreme setups routinely hitting 8GHz and beyond.

It's the same basic thing as benching a GPU to see how many FPS you can get - folks LOVE high(er) numbers even if there's no practical application for them.

The point here is that even if your card can top 200FPS in a game or benchmark, it really doesn't mean a lot, since the human eye can only distinguish differences in framerate smoothness up to about 150FPS; after that, it's just a number on the screen when benching. In a game, unless something is telling you what FPS it's running at, you can't tell any difference between 160FPS and 210FPS; it looks the same.

These numbers are what folks love to see, and the higher, the better, of course.

In the case of FX, it was the high MHz numbers that made it a favorite, plus they were cheap enough to buy and reliable enough to make it happen.
It also helps that, unlike Intel chips, they have no cold bug / cold-boot bug (CB/CBB), making them XOC-friendly for those who would run them on LN2 for maximum results.
You can top off the pot and it just keeps going; most Intel chips won't do that without CB/CBB issues kicking in.
That's one reason FX did well enough to let AMD hang on long enough to get Ryzen to market.

Would I personally suggest one today for a daily/gaming machine?
No.
Would I suggest one to mess around with for tinkering/OC'ing purposes?
Yes.

Hope this helps clear a few things up.

As for the efficiency issues, there is a utility that helps with the problem.
Be sure to scan the file BEFORE unzipping; I believe it to be clean, but scan it anyway to be sure, and follow the directions for use when done.
It's self-contained, so there's no actual install: just create a desktop shortcut to the .exe, click it, and go.
BDC_R1.03B.zip

"If you ever need anything please don't hesitate to ask someone else first"..... Nirvana
"Whadda ya mean I ain't kind? Just not your kind"..... Megadeth
Speaking of things being "All Inclusive", Hell itself is too.

 


I appreciate all the responses and info. I was curious about this because, while I was running mine in my gaming rig a while back, in everyday use and gaming it performed better than the best i7 available at the time, with the other components in the build roughly the same. True, it ran hotter than the Intel, but it was also DESIGNED to do so. Performance-wise, what I and my friends saw was mildly better performance at the low end and massively better at the high end. I long ago concluded that benchmarks are essentially useless, since they can be skewed to provide whatever result you want. What I experienced with these chips FAR outstripped what the benchmarks said.

 

As a side note, yes, I am an AMD fanboy. Through MANY years of using their products, I have always found them, FOR THE PRICE, the better processor... though at times my faith has been stretched rather thin, to be sure, lol.

 

I also want to point out that, for the most part, I don't see a point in overclocking these days. CPUs are so powerful that I literally cannot see the difference between an overclocked and a stock system. That said, I DO realize that for certain tasks overclocking DOES have its uses. However, your average gamer won't see enough of a difference to make it worth the extra cost and risk.

 

I also don't want anyone to think I believe I'm right and everyone else is wrong. I'm very interested to hear all sides, and, as usual, I'll come to my own conclusions about what I'd want to do with my own system.

 

I currently run a Ryzen 7, 32GB of RAM, and a 1TB SSD with an RTX 5500. I was debating chucking my old 9590 into a case with an older but still half-decent GPU and selling it as a 'gaming' rig, since I recently lost my job and need rent money. >.<

