
AMD disappointed me

Message added by Morgan MLGman

Please keep the thread civil. I will be monitoring in case the thread gets out of hand; if it does, it will be locked, so keep that in mind. We do not want to start a flame war.

When Threadripper 2 launched, I thought back to what AMD has done over the years:

 

  1. AMD shamed Intel when Intel launched the Pentium D.
  2. AMD launched the Black Edition, later followed by the K and X suffixes.
  3. AMD shamed Intel for splitting their line-up into two segments (I'm not sure about this one).
  4. AMD stated that they would no longer focus on CPU-only parts in the consumer space, and that APUs were the way to go.
  5. AMD launched the six-core Phenom II for the price of Intel's quad-core.
  6. AMD launched the controversially core-counted Faildozer series, the FX-8100, for the price of Intel's quad-core. (I bet it was their experiment to create something like Hyper-Threading, just a version you can't turn off.)

 

I argued that AMD is basically copying Intel in some areas, and even a minority of the hardcore AMD fanboys agreed to a certain extent. I remember the days when AMD was like Apple, offering a small range of CPUs, yet beating Intel (which had a wider range) on performance and pricing.

 

When Ryzen launched, I was hoping that:

  1. It would be an APU right out of the box
  2. It would offer 16 cores for the price of Intel's quad-core

Did we get that? No. Their early AM4 line-up was basically based on the last Faildozer iteration, paired with the R5/R7 graphics series. I was even expecting their R-series (they copied Intel's Core i-series naming, so I derived from that) to be APUs with R5/R7 graphics, but all we got were CPU-only dies glued together on the package.

 

The argument that the iGPU is just a waste of space doesn't hold. I use Intel's Quick Sync for encoding/decoding in productivity apps, so my 6700K's HD 530 is not wasted. Quick Sync is also more widely recognised than AMD's VCE. Before I got a 1070 (I had a 55/70), I used Intel's iGPU to record my own gameplay.

 

I was expecting their highest-end R7 to be 16 cores, or 12 cores (just to be a little hopeful), but NOPE, they copied Intel's segmentation, basically NOT giving the consumer more than 12 cores on the desktop. I was expecting the R7 to be 16 cores, the R5 to be 8, the R3 to be 6, and the A(thlon)-Ryzen to be 6 (6c/6t) or 4. Do note that in this scenario, they would all have an iGPU.

 

Then come arguments like:

  1. Who is going to use a 16-core chip for US$400 (based on Amazon pricing, tiered against the i7-8700K), especially without a dGPU?
  2. A 16-core chip is already workstation tier, so motherboards would cater for quad-channel memory. What about the R5, R3, and A-Ryzen? They are better suited to dual channel, yet I would have to buy quad-channel memory for all of them, even the cheap entry-level A-Ryzen! A waste of money to buy all four sticks!

On the first point: it really depends on the user's needs. Sure, there will be some pre-builts with those 16 cores, some of them without even a dGPU, but what if they were bought to be repurposed as rendering boxes?

 

On the second point: YOU DON'T NECESSARILY NEED FOUR RAM STICKS! If you want dual-channel operation, just look at the motherboard's manual and see which slots to populate for dual-channel memory operation. Even if you don't want to read the manual, motherboard manufacturers would surely release boards physically limited to dual-channel operation. Six and eight cores not suitable for quad-channel memory? We already have exactly that on Intel's HEDT platform and in the server space. And did you know that the more memory channels a CPU has to deal with, the more stress there is on the memory controller and the more latency it adds?

 

Even if AMD released their chips with only dual-channel memory controllers, AMD could still push the industry to make a 1x16GB stick affordable, basically bringing it down to the price of a 1x8GB stick (US$80, Amazon pricing).

 

I'm not sure why the Faildozer series, the FX series, the CPU-only line-up, had NO suffix to denote that the chips are overclockable (it could be that they were marketed as HEDT, making the previous generation the budget option, and ignoring the latecomers, the FX-9370 and FX-9590). Basically, AMD did the right thing there, saying that all of their chips are overclockable and equal, just subject to the silicon lottery. That changed when they released the APUs, copying Intel's K suffix just to show that a chip with that suffix overclocks better than the rest. This continues in the R-series line-up.

 

What are your thoughts? Do you think AMD should have kept their ego and their word, and released a 16-core chip with an iGPU?


3 minutes ago, AlfaProto said:

When Ryzen launched, I was hoping that:

  1. It would be an APU right out of the box
  2. It would offer 16 cores for the price of Intel's quad-core

 

1. I don't recall AMD ever saying it was an APU out of the box.

2. Niggawut

Ryzen 5 1600 @ 3.9 GHz  | Gigabyte AB350M Gaming 3 |  Palit GTX 1050 Ti  |  8GB Kingston HyperX Fury @ 2933 MHz  |  Corsair CX550M  |  1TB WD Blue HDD


Inside some old case I found lying around.

 


8 minutes ago, AlfaProto said:

When Ryzen launched, I was hoping that:

  1. It would be an APU right out of the box
  2. It would offer 16 cores for the price of Intel's quad-core

This here quite well summarizes what I see in your post: you are delusional and have completely out-of-proportion expectations.

Hand, n. A singular instrument worn at the end of the human arm and commonly thrust into somebody’s pocket.


1 minute ago, WereCatf said:

This here quite well summarizes what I see in your post: you are delusional and have completely out-of-proportion expectations.

Not really, just if AMD had followed their own trend.


5 minutes ago, AlfaProto said:

AMD launched the controversially core-counted Faildozer series, the FX-8100, for the price of Intel's quad-core. (I bet it was their experiment to create something like Hyper-Threading, just a version you can't turn off.)
 

AMD was relying on CMT rather than SMT (i.e. Hyper-Threading) for the Bulldozer family.

 

As for 16 cores, what the hell kind of expectation is that? Intel was only offering 4 cores on their mainstream platform, and AMD showed up with 8 cores. Twice as many isn't enough for you?

 

Regarding quad channel memory, when you're running workloads that actually benefit from that many cores, chances are you also need the extra memory bandwidth.

 

10 minutes ago, AlfaProto said:

I'm not sure why the Faildozer series, the FX series, the CPU-only line-up, had NO suffix to denote that the chips are overclockable (it could be that they were marketed as HEDT, making the previous generation the budget option, and ignoring the latecomers, the FX-9370 and FX-9590). Basically, AMD did the right thing there, saying that all of their chips are overclockable and equal, just subject to the silicon lottery. That changed when they released the APUs, copying Intel's K suffix just to show that a chip with that suffix overclocks better than the rest. This continues in the R-series line-up.

What? The Ryzen chips are all overclockable regardless of suffix.


Just now, Sakkura said:

As for 16 cores, what the hell kind of expectation is that? Intel was only offering 4 cores on their mainstream platform, and AMD showed up with 8 cores. Twice as many isn't enough for you?

What? The Ryzen chips are all overclockable regardless of suffix.

Like I stated: it depends on the user's needs.

 

I am stating that the FX chips didn't need anything to show that they are overclockable; earlier chips had the Black Edition, and after FX came the K and X suffixes, which denote that those chips overclock better than the non-suffixed ones.


2 minutes ago, AlfaProto said:

Like I stated: it depends on the user's needs.

 

I am stating that the FX chips didn't need anything to show that they are overclockable; earlier chips had the Black Edition, and after FX came the K and X suffixes, which denote that those chips overclock better than the non-suffixed ones.

The "X" means that those chips have higher clock speeds out of the box, and it also means those chips should be binned better; that's it. It's no guarantee of overclocking headroom, and I've seen plenty of 1700s overclock higher than 1700Xs. All Ryzen chips are overclockable, and the X suffix doesn't mean anything with regard to overclocking. Would it change anything for you if there were an R7 1700 and an R7 1750 instead of a 1700X? Personally, I find the current nomenclature easier to remember.

Also, I added a message at the top of the thread asking everyone to keep it civil; I will be monitoring to make sure it doesn't get out of control.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


3 minutes ago, AlfaProto said:

Like I stated: it depends on the user's needs.

 

I am stating that the FX chips didn't need anything to show that they are overclockable; earlier chips had the Black Edition, and after FX came the K and X suffixes, which denote that those chips overclock better than the non-suffixed ones.

You're just displaying total misunderstanding of several concepts. The more you post, the further you're stuffing your foot in your mouth.



7 minutes ago, AlfaProto said:

When Ryzen launched, I was hoping that:

  1. It was an APU right out of the box
  2. Offering 16-cores for the price of Intel's quad-core

1. Why would that be useful? Quick Sync is and always will remain Intel-only, so it IS wasted space on the chip. A GPU also makes the chip more complicated, which means worse yields and significantly higher prices. Also higher power consumption and heat output. Low-end graphics in a high-end CPU don't make sense at all.

2. If they did that, they'd be bankrupt. Most software barely utilizes more than 4 threads; even Premiere doesn't scale that well beyond 8-10 cores. You'd end up with a CPU with no profit margin that performs like another Faildozer.

 

10 minutes ago, AlfaProto said:
  1. AMD launched the controversially core-counted Faildozer series, the FX-8100, for the price of Intel's quad-core. (I bet it was their experiment to create something like Hyper-Threading, just a version you can't turn off.)

Nah, that wasn't it. They made it so that a single FPU was shared between the two cores in each Bulldozer module, which cut per-core FPU throughput in half. Nothing to do with HT/SMT.

 

12 minutes ago, AlfaProto said:

I was expecting their highest-end R7 to be 16 cores, or 12 cores (just to be a little hopeful), but NOPE, they copied Intel's segmentation, basically NOT giving the consumer more than 12 cores on the desktop. I was expecting the R7 to be 16 cores, the R5 to be 8, the R3 to be 6, and the A(thlon)-Ryzen to be 6 (6c/6t) or 4. Do note that in this scenario, they would all have an iGPU.

Again, that many cores on a consumer board is utterly useless. Software can't make use of them, and at a low price point AMD can't make money on the chips, so no more development. An underperforming chip doesn't sell, period; that would kill the company and let Intel sell their i7 models at double the price.

 

Quote

Even if AMD released their chips with only dual-channel memory controllers, AMD could still push the industry to make a 1x16GB stick affordable, basically bringing it down to the price of a 1x8GB stick (US$80, Amazon pricing).

I don't think you really understand how this industry works. If you think AMD and Intel have the power to push DRAM development, you're sadly mistaken. They're dependent on companies like Micron, SK Hynix, Samsung, etc.; without their memory, there is no point in making a memory controller to use it. And RAM pricing is a supply-and-demand matter: memory is cheap when supply is high and demand is low. Simple economics.

 

18 minutes ago, AlfaProto said:

What are your thoughts? Do you think AMD should have kept their ego and their word, and released a 16-core chip with an iGPU?

Ego and word? Lmao. AMD made a smart move. There is no use case for mass-produced 16-core consumer processors with a low-end integrated graphics solution. This is why servers have super basic video: high-core-count chips are for compute only, where graphics usually don't matter, and when they do, dedicated chips are used. This is how the world works, and it is out of reach for consumer computing.

PC Specs - AMD Ryzen 7 5800X3D MSI B550M Mortar - 32GB Corsair Vengeance RGB DDR4-3600 @ CL16 - ASRock RX7800XT 660p 1TB & Crucial P5 1TB Fractal Define Mini C CM V750v2 - Windows 11 Pro

 


42 minutes ago, NelizMastr said:

1. Why would that be useful? Quick Sync is and always will remain Intel-only, so it IS wasted space on the chip. A GPU also makes the chip more complicated, which means worse yields and significantly higher prices. Also higher power consumption and heat output. Low-end graphics in a high-end CPU don't make sense at all.

2. If they did that, they'd be bankrupt. Most software barely utilizes more than 4 threads; even Premiere doesn't scale that well beyond 8-10 cores. You'd end up with a CPU with no profit margin that performs like another Faildozer.

 

Again, that many cores on a consumer board is utterly useless. Software can't make use of them, and at a low price point AMD can't make money on the chips, so no more development. An underperforming chip doesn't sell, period; that would kill the company and let Intel sell their i7 models at double the price.

 

I don't think you really understand how this industry works. If you think AMD and Intel have the power to push DRAM development, you're sadly mistaken. They're dependent on companies like Micron, SK Hynix, Samsung, etc.; without their memory, there is no point in making a memory controller to use it. And RAM pricing is a supply-and-demand matter: memory is cheap when supply is high and demand is low. Simple economics.

AMD has ZeroCore Power for their GCN GPUs. Then again, this scenario chip would be possible if they glued the CPU and GPU together on a single package, rather than using a monolithic die.

 

There's also the 8-core R5 scenario chip, and the story would be like the Sandy Bridge days: you didn't need an i7-2600K for gaming, you just needed an i5-2500K.

 

Actually, that push would mostly come from servers and datacentres. If I can have 1x32GB instead of 4x8GB, I can reduce the motherboard footprint, especially on multi-socket boards in a rack.


8 minutes ago, AlfaProto said:

AMD has ZeroCore Power for their GCN GPUs. Then again, this scenario chip would be possible if they glued the CPU and GPU together on a single package, rather than using a monolithic die.

 

There's also the 8-core R5 scenario chip, and the story would be like the Sandy Bridge days: you didn't need an i7-2600K for gaming, you just needed an i5-2500K.

 

Actually, that push would mostly come from servers and datacentres. If I can have 1x32GB instead of 4x8GB, I can reduce the motherboard footprint, especially on multi-socket boards in a rack.

Everything you're saying is possible for AMD to do; however, it's not viable for them to do.

Remember, they're a company and they've got to earn money to be able to develop new products and technologies.

Keep in mind that AMD released an 8-core, 16-thread CPU for a $329 MSRP on the 2nd of March 2017. At the time, Intel's cheapest consumer-grade 8-core CPU cost $1,000 with similar performance (IPC, clock speeds); the main thing that differed was the interconnect, with AMD's CCX-based design linked over Infinity Fabric instead of a traditional, Intel-like ring bus.



8 minutes ago, AlfaProto said:

If I can have 1x32GB instead of 4x8GB, I can reduce the motherboard footprint, especially on multi-socket boards in a rack.

Even older Opterons can use 16GB and 32GB DDR3 ECC DIMMs without issue, and Epyc can also use 32GB DIMMs, so I don't see how this is relevant. Consumer chips can't be run in multi-socket boards anyway.


 


3 minutes ago, valdyrgramr said:

So basically, OP wanted a bunch of illogical things that would bankrupt any company, while making assumptions about what they would do?

 

 

Yes. AMD is basically shit because they refuse to kill themselves with products that nobody wants or needs.


 


Your standards are low; I want 64 cores with Vega graphics at the price of a Pentium! They can't do that...? What a disappointment! Shame on you, FailMD.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


7 minutes ago, Sauron said:

Your standards are low; I want 64 cores with Vega graphics at the price of a Pentium! They can't do that...? What a disappointment! Shame on you, FailMD.

That "APU" better have ray-tracing capabilities.


As much as I would love for Ryzen to come with even a Vega 1 or some shit out of the box, it didn't happen, and that's fine. I would've liked a higher core count too, but that would make the CPUs more expensive by nature, and the segment of the market Ryzen is aimed at doesn't require that many cores. Memory compatibility issues at launch were bad, yeah, but if you're still experiencing them, you probably have a dead board. All these complaints are fairly stupid, and you're asking far too much.

 

Just because Ryzen wasn't the second coming of Christ doesn't mean it was bad. The fact that it doesn't completely smash Intel is good. It creates some fair competition, which is always refreshing in this market.


18 minutes ago, NMS said:

That "APU" better have ray-tracing capabilities.

Ryzen AMDRipper 3990WXRTG, 128c/256t with Vega 128 for $2.50. BUY IT NOW! ONLY 2500 WATTS!


 


2 hours ago, AlfaProto said:

When Threadripper 2 launched, I thought back to what AMD has done over the years:

 

  1. AMD shamed Intel when Intel launched the Pentium D.
  2. AMD launched the Black Edition, later followed by the K and X suffixes.
  3. AMD shamed Intel for splitting their line-up into two segments (I'm not sure about this one).
  4. AMD stated that they would no longer focus on CPU-only parts in the consumer space, and that APUs were the way to go.
  5. AMD launched the six-core Phenom II for the price of Intel's quad-core.
  6. AMD launched the controversially core-counted Faildozer series, the FX-8100, for the price of Intel's quad-core. (I bet it was their experiment to create something like Hyper-Threading, just a version you can't turn off.)

 

I argued that AMD is basically copying Intel in some areas, and even a minority of the hardcore AMD fanboys agreed to a certain extent. I remember the days when AMD was like Apple, offering a small range of CPUs, yet beating Intel (which had a wider range) on performance and pricing.

 

When Ryzen launched, I was hoping that:

  1. It would be an APU right out of the box
  2. It would offer 16 cores for the price of Intel's quad-core

Did we get that? No. Their early AM4 line-up was basically based on the last Faildozer iteration, paired with the R5/R7 graphics series. I was even expecting their R-series (they copied Intel's Core i-series naming, so I derived from that) to be APUs with R5/R7 graphics, but all we got were CPU-only dies glued together on the package.

 

The argument that the iGPU is just a waste of space doesn't hold. I use Intel's Quick Sync for encoding/decoding in productivity apps, so my 6700K's HD 530 is not wasted. Quick Sync is also more widely recognised than AMD's VCE. Before I got a 1070 (I had a 55/70), I used Intel's iGPU to record my own gameplay.

 

I was expecting their highest-end R7 to be 16 cores, or 12 cores (just to be a little hopeful), but NOPE, they copied Intel's segmentation, basically NOT giving the consumer more than 12 cores on the desktop. I was expecting the R7 to be 16 cores, the R5 to be 8, the R3 to be 6, and the A(thlon)-Ryzen to be 6 (6c/6t) or 4. Do note that in this scenario, they would all have an iGPU.

Your expectations are unrealistic and ridiculous. You wanted 16 cores in a chip for the price of a QUAD-CORE chip; what kind of drugs are you on? Isn't twice the number of cores for the price of a quad-core good enough for you? You say early AM4 CPUs were based upon Bulldozer, but they weren't: Zen was a completely different architecture. Ever heard of it? Only the early AM4 APUs (Bristol Ridge) were based upon the Bulldozer family, and even then they performed closer to Zen than to Bulldozer.

There are 10 types of people in this world. Those that understand binary and those that don't.

Current Rig (Dominator II): 8GB Corsair Vengeance LPX DDR4 3133 C15, AMD Ryzen 3 1200 at 4GHz, Cooler Master MasterLiquid Lite 120, ASRock B450M Pro4, AMD R9 280X, 120GB TCSunBow SSD, 3TB Seagate ST3000DM001-9YN166 HDD, Corsair CX750M Grey Label, Windows 10 Pro, 2x Cooler Master MasterFan Pro 120, Thermaltake Versa H18 Tempered Glass.

 

Previous Rig (Black Magic): 8GB DDR3 1600, AMD FX6300 OC'd to 4.5GHz, Zalman CNPS5X Performa, Asus M5A78L-M PLUS /USB3, GTX 950 SC (former, it blew my PCIe lane so now on mobo graphics which is Radeon HD 3000 Series), 1TB Samsung Spinpoint F3 7200RPM HDD, 3TB Seagate ST3000DM001-9YN166 HDD (secondary), Corsair CX750M, Windows 8.1 Pro, 2x 120mm Red LED fans, Deepcool SMARTER case

 

My secondary rig (The Oldie): 4GB DDR2 800, Intel Core 2 Duo E8400 @ 3GHz, Stock Dell Cooler, Foxconn 0RY007, AMD Radeon HD 5450, 250GB Samsung Spinpoint 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management. UPDATE: SPECS UPGRADED DUE TO CASEMOD, 8GB DDR2 800, AMD Phenom X4 9650, Zalman CNPS5X Performa, Biostar GF8200C M2+, AMD Radeon HD 7450 GDDR5 edition, Samsung Spinpoint 250GB 7200RPM HDD, Antec HCG 400M 400W Semi Modular PSU, Windows 8.1 Pro, 80mm Cooler Master fan, Dell Inspiron 530 Case modded for better cable management and support for non Dell boards.

 

Retired/Dead Rigs: The OG (retired) (first PC I ever used, at 3 years old back in 2005). Current specs: 2GB DDR2, Pentium M 770 @ 2.13GHz, 60GB IDE laptop HDD, ZorinOS 12 Ultimate x86. Originally 512MB DDR2, Pentium M 740 @ 1.73GHz, 60GB IDE laptop HDD, and a single-boot XP Pro. The Craptop (dead): 2GB DDR3, Celeron N2840 @ 2.1GHz, 50GB eMMC chip, Windows 10 Pro. Nightrider (dead and cannibalized for Dominator II): Ryzen 3 1200, Gigabyte A320M HD2, 8GB DDR4, XFX Ghost Core Radeon HD 7770, 1TB Samsung Spinpoint F3 (2010), 3TB Seagate Barracuda, Corsair CX750M Green, Deepcool SMARTER, Windows 10 Home.


The discussion doesn't seem to be going anywhere.

Thread locked.



Guest
This topic is now closed to further replies.
