
The Worst CPU Ever Made

Plouffe
On 7/20/2023 at 4:12 PM, LinusTech said:

FX-7* was literally so bad throughout its entire life that no other motherboard manufacturer bothered to make motherboards for it. It wasn't an exclusivity deal. It was just stillborn. Everyone from AMD and ASUS who remembers this absolute shit show knows it was terrible. If you think you know better than them, you're wrong.

 

Yes, but it was never meant to be a successful mainstream product. 

 

It was a placeholder, an attempt to have something, anything, to call a HALO product.

 

Yes, you had to be completely out of the loop at that time to think it actually was anything like an efficient, sublime, or even serious solution.

 

Everyone knew it!

 

The REVISIONISM is you claiming anyone took it as a serious effort.

 

 

 

 


On 7/20/2023 at 5:12 PM, LinusTech said:

This thread is baffling. I remember how this went down just fine.

 

How many benchmarks did you want us to run on it to determine that it is, in fact, terrible? Would a couple more sticks of RAM have taken it from 10/10 terrible to 9.8/10 terrible? Maybe. Who cares? It was just a fun little retrospective look at a shoddy product, not a personal attack on you or your favorite company. AMD is doing fine now. We can laugh about this.

 

FX-7* was literally so bad throughout its entire life that no other motherboard manufacturer bothered to make motherboards for it. It wasn't an exclusivity deal. It was just stillborn. Everyone from AMD and ASUS who remembers this absolute shit show knows it was terrible. If you think you know better than them, you're wrong.

 

No offense, but reading this thread lowered my IQ.

It would have been funny if you'd pulled up Cinebench R15 and set it next to the Intel machine you referenced in this video, just to see it get stomped. But besides that, no complaints here, for what it's worth (obviously you should consider DANK_AS_gay's opinion, I see no reason for someone named that to be dismissed :P). No point doing a full review on a system that old, especially when it wasn't notable at the time for anything more than sucking.


16 hours ago, mdk777 said:

It was a placeholder, an attempt to have something, anything, to call a HALO product.

Even the 11900K was a halo product for the sake of having a halo product. That doesn't prevent anyone from calling it absolutely terrible, godawful, and all the terms associated with that.


6 hours ago, WolframaticAlpha said:

Even the 11900K was a halo product for the sake of having a halo product. That doesn't prevent anyone from calling it absolutely terrible, godawful, and all the terms associated with that.

Correct, but that doesn't mean there weren't dozens if not hundreds of similar examples.

"Hey, you know what was really horrible? This car made in 1946."

Ah yeah, they were pretty much all horrible at that time. 

Just seems like a pointless waste of time.

Hard to pick out the worst when so many were equally bad.

"Worst CPU ever made"? No, worse than some competitors at that time, but something like 1,000 times better than anything that existed 10 years before and 10,000 times better than anything 20 years before.

Get some perspective.
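For what it's worth, the rough arithmetic behind "N times better than X years before" claims can be sanity-checked with a simple doubling model. This is a sketch, not a measurement: the 2-year doubling period is only a Moore's-law rule of thumb, and `improvement_factor` is a name invented here for illustration.

```python
# Sanity check on "N times better than X years before" claims, using a
# simple doubling model: capability doubles every `doubling_period` years.
# The 2-year default is a Moore's-law rule of thumb, not measured data.

def improvement_factor(years: float, doubling_period: float = 2.0) -> float:
    """How many times 'better' after `years`, under steady doubling."""
    return 2.0 ** (years / doubling_period)

print(improvement_factor(10))  # 2**5  = 32x over 10 years
print(improvement_factor(20))  # 2**10 = 1024x over 20 years
```

Under those assumptions the "1,000x in 20 years" part holds up, while the "1,000x in 10 years" figure is closer to 30x; either way the point stands that any old CPU looks terrible against enough elapsed generations.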

 

Generating pointless content for clicks.

 

"NOT making anyone more intelligent for reading or watching." 

 

 

 

 

 


1 hour ago, mdk777 said:

"Worst CPU ever made"? No, worse than some competitors at that time, but something like 1,000 times better than anything that existed 10 years before and 10,000 times better than anything 20 years before.

Iteration times in CPUs meant that 8-9 generations would come and go in 10 years.


1 hour ago, WolframaticAlpha said:

Iteration times in CPUs meant that 8-9 generations would come and go in 10 years.

????

They were roasting a CPU from 2006... that was 17 years ago.

So, yeah, pointless for today. And pointless even a few generations/years after... which was still a long time ago.

What is your point again?


12 hours ago, mdk777 said:

????

They were roasting a CPU from 2006... that was 17 years ago.

So, yeah, pointless for today. And pointless even a few generations/years after... which was still a long time ago.

What is your point again?

It was really, really, really bad. It's like the Juicero: while it technically worked and did what it was supposed to, it was a failure in every possible way otherwise.

 

Put it into perspective: this is from the same company that, in recent memory, had released the Athlon 64 Venice CPUs, an absolutely awesome value-for-money CPU. Additionally, around the same time, some of the competitive former-generation Intel Xeons started appearing on the refurb market with affordable dual- or quad-CPU motherboards, and they could be had for less than the monstrosity you're trying to defend. And, since electricity costs back then were reasonable, that was a road a lot of enthusiasts went down.


12 hours ago, ImorallySourcedElectrons said:

 Additionally, at the same time some of the competitive former generation Intel Xeons started appearing on the refurb market with affordable dual or quad CPU motherboards, and they could be had for less than the monstrosity you're trying to defend.

Never tried to defend it.

Simply stated that attacking the failings of 17-year-old technology is pointless.

I'm well aware of the economics and options that were available at the time.

I was involved in Folding@home and looked at many options to generate the best FLOPS for the buck.

 

As a point of fact, I was involved in the first iterations of using GPUs instead of CPUs for math processing.

 

The list of massively overpriced CPUs and GPUs that have hit the market at one time or another is a huge one.

Once you pay 4x-10x over the sweet spot for percentage gains in performance... well, that all goes in the same basket for me.

 

I could probably find a list of government and academic installations that were over budget, delayed, and failed to even work, much less were 10 years antiquated. Worst CPU?

Yesterday's technology.


On 7/16/2023 at 1:20 PM, Radium_Angel said:

"Worst CPU evah!" LTT

 

Fine, where's the proof? We got a couple of dodgy GPUs, 1.5 games tried, no benchmarks, no hard numbers, no attempts at anything for proof, just "take my word for it!"

 

For the record, CraftComputing did a video on the Opteron CPUs and did it way better than this one, with proof of just how bad these CPUs are, but I only know about that video because I'm subscribed to CC.

 

It's 2023, gang, up your standards.

And he pimps good beers to us too 😉

 


54 minutes ago, mdk777 said:

Never tried to defend it.

Simply stated that attacking the failings of 17-year-old technology is pointless.

Nah, it's a pretty good example of what not to do for us in the electronics industry: it's a product made by a company that should have known better, at a time when they were capable of much more. They had just come off a very successful series of products, and then they proceeded to screw it up and stay on the back burner for over ten years. AMD would have been better off never releasing this thing.

 

And sure, there were objectively worse CPUs, just off the top of my memory:

  • SPARC in the Java era: Compare the CPU design with the JVM design, and then realise they marketed these machines to run Java garbage. SPARC itself wasn't a bad design, though it's not nearly as capable for general purpose computing as many made it out to be.
  • Intel 80860: Let's start with the fact that this thing is horribly named: the Intel 8086 is what kicked off x86 and led to the i386, etc., while the 80860 has absolutely nothing to do with x86. It was a RISC fever dream, so bad that Intel gave up on RISC and licensed ARM IP. The architecture was horrible and implemented many of the features the current pro-RISC crowd still loves to drool/theorize about. I wouldn't be surprised if someone has written entire books about how horrible it was, but it did lead to the hilarious case of Intel buying a license to ARM's instruction set and base architecture during a period when ARM could use a cash injection, meaning Intel can roll their own ARM CPUs. 😂
  • A large portion of AMD's line-up during the late 90s/early 2000s lacked anything along the lines of thermal protection and other basic system-management quality-of-life features, leading to many cracked dies and magic-smoke events.
  • DEC/HP Alpha (for reference: PDP-11 --> VAX --> PRISM / Alpha): History lesson time, and let me start with a hot take: the PDP-11 was so popular because everyone in the 70s and 80s was high on illegal substances; if you don't understand why, I urge you to consult its instruction set, specifically the part about data storage formats. (I've had the "pleasure" of having to port PDP-11 code to a modern system, and I think it took a couple of years off my life.) Anyhow, VAX took this and tried to make it slightly saner, but simultaneously expanded upon it with some really cool features, which actually led to a surprisingly popular product. But some of the eggheads at DEC wanted to make a RISC architecture, because that was the popular thing at the time (see Intel trying to make the 80860, etc.), so DEC came up with PRISM, which was quickly cancelled for various quite valid reasons (see the internet for extensive explanations on this one). But folks at DEC kept tinkering with and changing PRISM, which led to Alpha, which was meant to replace the highly successful VAX architecture, and they just kept tinkering with it instead of saying enough-is-enough and pushing it out the door. The end result was a CPU with a lot of design-by-committee aspects, which was only somewhat compatible with the pre-existing VAX code, and which entered a market where Intel, AMD, and Cyrix were trading blows with each other and running really high volumes. It ended up killing DEC, leading to their purchase by Compaq, which was then bought up by the Fiorina-era HP Enterprise horrible-CPU-architecture graveyard. So while this ain't a specific part, I'd nominate the entire range of DEC/Compaq/HP Alpha CPUs as some of the worst ones in existence.
  • Everything that touched SGI could probably get its own section in this entire wall of text.
  • MIPS-based CPUs didn't go anywhere for plenty of reasons, and this is also why RISC-V will never achieve anything. (Which is a hot take, but a valid one in my opinion.)
  • The countless Motorola attempts to recapture the 68k crowd, each more ill-suited than its predecessor.

But the absolute place of honour goes to the transputer: it was a bad idea in the 80s, and it's a bad idea now. It sounds like a good idea when you run into a technological wall, because you can just slap on more CPUs (cores) to get more performance - and this is definitely not something AMD and Apple are doing right now. But it has a serious problem: it depends heavily on point-to-point links or a bus interface, and due to physical constraints these always saturate a couple of years down the road; by then you've designed yourself into a corner, because you put CPU core development on the back burner and went all-in on parallelization. The transputer killed the company that made it, and made a whole pile of taxpayer money go the way of the dodo with it. And I sincerely hope AMD stops going down this route, because Intel is really going to need some competition ten years from now.
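The saturation argument above can be put in back-of-envelope terms: a shared interconnect has a roughly fixed total bandwidth, while aggregate demand grows with core count. A minimal sketch, with purely illustrative numbers (the bus capacity and per-core demand figures here are assumptions, not data from any real part):

```python
# Back-of-envelope sketch of the interconnect-saturation argument:
# a shared bus has roughly fixed total bandwidth, so each core's share
# shrinks as cores are added, while aggregate demand keeps growing.
# All numbers below are illustrative assumptions.

def per_core_bandwidth(total_gbps: float, cores: int) -> float:
    """Each core's share of a fixed shared-bus bandwidth."""
    return total_gbps / cores

def total_demand(cores: int, gbps_per_core: float) -> float:
    """Aggregate bandwidth the cores would like to consume."""
    return cores * gbps_per_core

BUS_GBPS = 64.0        # hypothetical shared-bus capacity
DEMAND_PER_CORE = 8.0  # hypothetical per-core demand

for cores in (2, 4, 8, 16):
    share = per_core_bandwidth(BUS_GBPS, cores)
    demand = total_demand(cores, DEMAND_PER_CORE)
    saturated = demand > BUS_GBPS
    print(f"{cores:2d} cores: {share:5.1f} Gb/s each, "
          f"demand {demand:5.1f} Gb/s, saturated={saturated}")
```

With these made-up figures the bus is fully consumed at 8 cores and oversubscribed at 16, which is the "designed yourself into a corner" moment: adding cores past that point buys nothing unless the interconnect itself scales.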

 

I could continue for a while, but I hope you get the point?

 

2 hours ago, mdk777 said:

The list of massively overpriced CPUs and GPUs that have hit the market at one time or another is a huge one.

Once you pay 4x-10x over the sweet spot for percentage gains in performance... well, that all goes in the same basket for me.

This isn't just about price; AMD CPUs from this era had major performance issues that went far beyond instructions per buck spent.

 

2 hours ago, mdk777 said:

I could probably find a list of government and academic installations that were over budget, delayed, and failed to even work, much less were 10 years antiquated. Worst CPU?

Yesterday's technology.

See above, I got started on said list.


4 hours ago, ImorallySourcedElectrons said:

They had just come off a very successful series of products, and then they proceeded to screw it up and stay on the back burner for over ten years. AMD would have been better off never releasing this thing.

Again, well aware.

I had something like 20,000 shares of stock that were worth about 1/10 of what I paid for them.

Except, they didn't have the luxury of releasing nothing.

 

The problem was they were at a juncture where they didn't have that next good product in the pipeline yet.

They did need to survive those ten years wandering in the desert somehow.

 

Since manna was not falling from the sky during those years... they spent a great deal of time retreading, repurposing, and hyping.

 

Yes, it was crap.

 

Worst crap?

Meh, as we agree... there is plenty of it to go around.

 

 

 

 


3 hours ago, mdk777 said:

Again, well aware.

I had something like 20,000 shares of stock that were worth about 1/10 of what I paid for them.

Except, they didn't have the luxury of releasing nothing.

 

The problem was they were at a juncture where they didn't have that next good product in the pipeline yet.

They did need to survive those ten years wandering in the desert somehow.

 

Since manna was not falling from the sky during those years... they spent a great deal of time retreading, repurposing, and hyping.

 

Yes, it was crap.

 

Worst crap?

Meh, as we agree... there is plenty of it to go around.

 

 

 

 

No, the economics of this thing make absolutely no sense at all. They'd have been better off not releasing anything at all versus this specific SKU. It was these sorts of things that led to them being mediocre at best; in fact, it almost killed them.


7 hours ago, ImorallySourcedElectrons said:

No, the economics of this thing make absolutely no sense at all. They'd have been better off not releasing anything at all versus this specific SKU. It was these sorts of things that led to them being mediocre at best; in fact, it almost killed them.

You apparently don't follow many capital intensive cyclical industries.

Your option is to sell the crap you have today, or cease to exist forever.


14 hours ago, mdk777 said:

You apparently don't follow many capital intensive cyclical industries.

Your option is to sell the crap you have today, or cease to exist forever.

Do you actually understand anything about the electronics industry? This thing was a giant money sink; they'd literally have had more money (both on the balance sheet and in hard cash) in the end by not making these. This wasn't "crap you have today"; this was something that was expensive to develop and manufacture, no one really wanted it, no one needed it, and anyone with half a brain saw it was a horrible, horrible idea, but someone in AMD management had an aneurysm or something and pushed it through anyway. This CPU belongs in the same category as 3dfx's final products, where the company would have been better off not making those products and pushing them onto the market, and instead putting resources into developing an actually competitive product. And AMD is lucky to have survived several bad moves like this.


6 hours ago, ImorallySourcedElectrons said:

Do you actually understand anything about the electronics industry? This thing was a giant money sink; they'd literally have had more money (both on the balance sheet and in hard cash) in the end by not making these.

Wrong, 

You don't just say, "Oh snap, this isn't working... hold the presses."

It is exactly the multi-year planning and development, the capital investment, that makes that impossible.

You make a wrong turn in the electronics business (or any capital-intensive industry) and it takes a long time to change course.

 

AIRBUS... yeah, we thought people wanted huge planes... but it turns out that they like smaller planes flying more frequently.

No problem, we'll just cancel the A380 now (2010) and eat the $25 billion investment... no problem.

https://en.wikipedia.org/wiki/Airbus_A380

Nope, that is never the way anything works.

You've spent billions in development and you do your best to sell the end product, regardless of whether it is the optimal or most competitive one.

So yeah, it is a profit loss... you lose BILLIONS over the years. But you do not lose $25 BILLION all at once.

 

So, you may understand electronics, but you don't seem to understand the business end very well.

Yes, it was a long slog to turn AMD around. But your advice would have caused them to go straight into Chapter 7 total liquidation.

 


On 7/27/2023 at 7:53 PM, mdk777 said:

Wrong, 

You don't just say, "Oh snap, this isn't working... hold the presses."

It is exactly the multi-year planning and development, the capital investment, that makes that impossible.

You make a wrong turn in the electronics business (or any capital-intensive industry) and it takes a long time to change course.

 

AIRBUS... yeah, we thought people wanted huge planes... but it turns out that they like smaller planes flying more frequently.

No problem, we'll just cancel the A380 now (2010) and eat the $25 billion investment... no problem.

https://en.wikipedia.org/wiki/Airbus_A380

Nope, that is never the way anything works.

You've spent billions in development and you do your best to sell the end product, regardless of whether it is the optimal or most competitive one.

So yeah, it is a profit loss... you lose BILLIONS over the years. But you do not lose $25 BILLION all at once.

 

So, you may understand electronics, but you don't seem to understand the business end very well.

Yes, it was a long slog to turn AMD around. But your advice would have caused them to go straight into Chapter 7 total liquidation.

 

Are you really this dense? If we were talking about a single-socket CPU I'd agree, but we're not; we're talking about the Quad FX. It was a stupid idea back then, and it's a stupid idea now. You don't just slap two CPUs onto a motherboard and have them "magically" work together; they wasted a lot of R&D that went exactly nowhere. And before you claim it's relevant for multi-core or even multi-die architectures, it's not: the problems you encounter on multi-socket systems are entirely different. No, what they did was spend a lot of time integrating server technology onto a desktop platform with zero expected demand for said solution. All they had to do was take some high-end Opterons, make them compatible with a desktop chipset and socket, disable some of the fancy features (or just leave them out of the box), and call it an enthusiast-grade Athlon64 FX; instead we got Quad FX, which required massive changes from everyone in the supply chain for a one-off.

 

And that's such a false comparison: the A380 is a design from the late 80s/early 90s, when airlines had highly centralized networks. You were flown to a hub in a narrow-body, after which a wide-body (747 or A380) took you to the hub nearest your destination, after which you again transferred to an aircraft to take you to your final destination. When Airbus entered the final design phases and production, they had signed orders from customers. A couple of years into production, the market changed drastically due to various factors. So Airbus released a product that had demand and actual customers willing to put down a couple of billion euro in orders, and the market changed during its lifetime, leading to its early cancellation.

 

Please do explain: how is that anything like AMD making a CPU that for all intents and purposes required massive development costs, with no expected demand except for the rare idiot who was easy to separate from their money and enthusiasts who just buy stuff because it looks cool?


5 hours ago, ImorallySourcedElectrons said:

They wasted a lot of R&D that went exactly nowhere. And before you claim it's relevant for multi-core or even multi-die architectures, it's not: the problems you encounter on multi-socket systems are entirely different. No, what they did was spend a lot of time integrating server technology onto a desktop platform with zero expected demand for said solution. All they had to do was take some high-end Opterons, make them compatible with a desktop chipset and socket, disable some of the fancy features (or just leave them out of the box), and call it an enthusiast-grade Athlon64 FX; instead we got Quad FX, which required massive changes from everyone in the supply chain for a one-off.

No, they hardly wasted any R&D.

Yes, they were taking server technology, throwing it at consumers, and hoping for the best.

YES, it was a one-off HALO attempt.

YES, it was always going to be an extremely small percentage of the market even if it was successful.

You keep contradicting yourself in your arguments. Yes, the Quad FX was a joke that they knew would only sell in the hundreds, perhaps a couple thousand if successful. The massive investment, the mistake that almost destroyed them, was the entire Bulldozer line.

5 hours ago, ImorallySourcedElectrons said:

Please do explain: how is that anything like AMD making a CPU that for all intents and purposes required massive development costs

The comparison was to the entire Bulldozer line... which was the root cause, and which had similarly long lead times and required investments predicting where you thought the market would be years in the future. The false comparison is you thinking that the Quad FX had any serious development or investment behind it.

AMD made the BET (the prediction, the investment) that since they could not win the single-core speed race (which was rapidly flattening out anyway), they would invest in Bulldozer. If you watch the link, it gives a pretty unbiased and comprehensive history.

They were not wrong, just early (which in business is the same thing as wrong). Games didn't take advantage of multiple cores, and single-core speed remained paramount. Latency still needed to improve, and programs in general needed to catch up to the massive problem of exploiting parallelism.

 

AIRBUS made the same long-term BET (a prediction, an investment in where they thought the industry would be years in advance).

 

Actually, you are correct that the comparison does fail in one regard.

 

AMD was actually RIGHT. The entire industry has gone in the direction they predicted: massive core counts and massive parallelism are indeed where everyone, AMD, Intel, and Apple, has ended up. AMD was early and needed to wait for the technology (production execution) to catch up.

 

AIRBUS was simply wrong. People hate the hub-and-spoke system. Given any choice at all, they will always take a single direct flight over going out of their way to a hub and then flying the most painful planes known to exist to their final destination (after considerable delays, chances of missing connections, losing luggage, etc.).

 

So the comparison of investments and lead times is on point. However, the results are reversed.

 

AMD correctly predicted the future but failed, at that time, to produce a quality product.

Subsequently the vision was achieved, by AMD and everyone else.

AIRBUS mispredicted the future but delivered a quality product that turned out to have no sustainable market.

No one else is still attempting their vision. 

 

 

 

 

 


On 7/16/2023 at 8:17 PM, Plouffe said:

AMD and Intel have been competing for years and they've both had their ups and downs. Back in 2006, AMD may as well have been at the bottom of the ocean with how down they were when Quad FX launched. How bad is it now?

 

 

People forget the AMD E1-1200, and honestly that one is VERY bad.

