Intel Rebranding "Core i" Brand to "Core Ultra"

1 hour ago, MageTank said:

This still does not address the issue of AMD intentionally confusing customers with chipset naming conventions similar to Intel's. The only thing that separates AMD and Intel is a single letter/digit, and customers may forget which one is which. The easier solution would be using I and A to designate Intel vs AMD chipsets: I370 vs A370. Still separated by a single letter change, but you can simply tell customers "Look for I if you need an Intel board, look for A if you need an AMD board". You can use additional numbers or letters at the very end of the chipset name to designate different performance tiers, lol.

I don't disagree it could be better, as with most things. An I/A prefix would certainly be nice, but again, if a consumer isn't going to spend an hour researching such basic things, they shouldn't be building their own computer. It's really not hard to know which CPUs go with which motherboards. In fact, I'd argue it's hard NOT to, and you'd almost have to try to mess it up. Companies already do a lot to make it clear, and they also need to do what it takes, within reason, to sell their products. Marketing with numbers similar to the competition's, so consumers will consider their products too rather than dismissing different numbers as somehow inferior, is part of that. Again, I find it interesting how people take issue with this in the CPU space but not the GPU space. The main reason for that seems to be potential compatibility issues, which don't really exist with GPUs, but again, all it takes is a tiny bit of research.

 

1 hour ago, MageTank said:

Most reviews are simply clickbait when new hardware launches. Every time Intel launches a new product, the title is always "AMD is DEAD!!!!" or "Intel is BACK". When AMD launches a new product, it's the exact same thing in reverse. It's okay to be a fan of AMD, but it's not okay to ignore objective facts and substitute them with personal opinions.

I guess it depends on what you consider a "review" and where you're going for them. I look at various sites, both at reviews and general articles. If you're going to places with titles like that, I can understand why you feel the way you do. I've seen plenty of good and bad about both, but again, the general trend is that AMD has been outperforming Intel overall lately. They have better multi-core, better efficiency, and better graphics, all typically for a lower price. It's okay to be a fan of Intel, but it's not okay to ignore objective facts and substitute them with personal opinions, lol.

 

As for the rest, as I said, if you can't understand how loss of profits today affects a company's performance tomorrow, I can't help you.


I just don't understand why brands have to change their names every time I start to understand them.

CPU: Ryzen 5950X Ram: Corsair Vengeance 32GB DDR4 3600 CL14 | Graphics: GIGABYTE GAMING OC RTX 3090 |  Mobo: GIGABYTE B550 AORUS MASTER | Storage: SEAGATE FIRECUDA 520 2TB PSU: Be Quiet! Dark Power Pro 12 - 1500W | Monitor: Acer Predator XB271HU & LG C1

 


22 minutes ago, vertigo220 said:

Yes. They released Pentium in the early 90s, and then didn't do much for the next decade, essentially making incremental improvements and eventually releasing the Pentium D, which was pretty bad. Whether or not they were competing with AMD or not at the time is irrelevant, and I never said they were. They didn't have sufficient competition to push them along, so they were complacent. Only when AMD released the Athlon64 and then the Athlon 64 x2 (which is what pushed Intel to go multi-core as well) did Intel get their act together and release the Core 2 Duo, etc, followed by the i-series chips.

That time period saw quite a battle between AMD and Intel. Clock speeds were increasing at quite a pace, and it was AMD that won the marketing-milestone race to 1 GHz. That clock race contributed to Intel's decision to make the Pentium 4 design, which didn't do so great compared to the post-GHz AMD CPUs of the time.

 

It took a while before Intel's desktop CPUs reset from the P4 route, taking inspiration from their mobile designs and leading to the start of Core.

 

While AMD's X2 chips were the first dual-core CPU offerings, Intel already had HT giving a second thread years before that, and we had various dual-CPU mobos like the Abit BP6 that gave a two-core experience even earlier.
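To software, the two routes can look alike: a Pentium 4 with HT and a true dual core each expose two logical CPUs to the operating system. A minimal stdlib sketch of checking that count (separating physical cores from SMT siblings needs extra help, e.g. the third-party psutil package, which isn't assumed here):

```python
import os

# os.cpu_count() reports *logical* CPUs: an SMT/Hyper-Threading sibling
# counts the same as a real core, so a 1C/2T P4 with HT and a dual-core
# Athlon 64 X2 would both report 2 here.
logical = os.cpu_count()
print(f"Logical CPUs visible to the OS: {logical}")
```

That ambiguity is part of why HT helped so much with responsiveness at the time: the OS scheduler simply saw a second place to run a thread.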

 

22 minutes ago, vertigo220 said:

Then early last decade, they stagnated again, with multiple generations being only single-digit improvements over the last, then once AMD came out with Ryzen, which was a substantial improvement and started offering real competition, Intel started making bigger improvements again with larger IPC increases, better iGPUs, etc.

It wasn't that Intel didn't do anything; they couldn't execute due to their process woes and the 14nm meme that resulted. They had the architecture designs, but not the manufacturing. They still haven't completed their recovery path yet. We had a glimpse of what could have been with Ice Lake, the first post-Skylake architecture, in 2019. Had manufacturing not been a problem, it would have gone up against Zen 2, which was barely any better than Skylake. Of course, it's easy to talk about what could have been. They didn't have the manufacturing, the products didn't happen, and it's history now.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


4 minutes ago, vertigo220 said:

Yes. They released Pentium in the early 90s, and then didn't do much for the next decade, essentially making incremental improvements and eventually releasing the Pentium D, which was pretty bad. Whether or not they were competing with AMD or not at the time is irrelevant, and I never said they were. They didn't have sufficient competition to push them along, so they were complacent. Only when AMD released the Athlon64 and then the Athlon 64 x2 (which is what pushed Intel to go multi-core as well) did Intel get their act together and release the Core 2 Duo, etc, followed by the i-series chips. Then early last decade, they stagnated again, with multiple generations being only single-digit improvements over the last, then once AMD came out with Ryzen, which was a substantial improvement and started offering real competition, Intel started making bigger improvements again with larger IPC increases, better iGPUs, etc.

Pentium 4 and Pentium D are both NetBurst and share almost no similarities with Pentium III and earlier. And to say the improvements from the 386 to the 486 to the 586 (Pentium) to the Pentium II (P6, as in 686) and III (P6) were incremental is a wild take. There was a massive difference between Pentium and everything else called Pentium. We went back to P6 plus AMD64 with Core.


3 hours ago, leadeater said:

 

And that is basically never going to be possible. The more generations apart two products are, the more different they're going to be, and there isn't really any naming system that can account for that. The simple fact is that if a product is a generation newer, the expectation should be that any older SKU, bottom to top, could potentially be slower than any newer one.

 

A 12100 could be faster than an 11900K for some workloads (in CB 1T they are nearly identical). In gaming it almost was; it's close in many cases, and that should tell you everything you need to know: the generation part of the model number actually matters a lot, sometimes more than any other part of it.

We seem to be having a circular argument in this thread, where I'm saying "it would be nice to know that a product is indeed better than the previous generation" if you want to stay at the same performance level or TDP design so no liquid cooler is required.

 

There is a very specific reason I picked the 11700K, and prior to that the 4700 non-K parts, for the respective builds I had envisioned: I wanted to avoid liquid cooling solutions. One part down from the top of the stack seemed the correct option when the TDP wasn't capable of doubling under turbo boost.

 

What will that next chip be? Well, based on the trajectory it was going to be the Ryzen 9 7900X, or X3D, but then two issues popped up: melting CPUs and dying Intel Ethernet parts. The latter is actually easier to avoid by picking something like the TUF instead of the ROG board, but if there's a good possibility of stuff dying when correctly installed, I want to wait till that stuff is out of the retail channel. If a 14th gen comes out by then, hopefully you won't need an "anti-bending" buckle for it like you do for 12th and 13th. There are enough teething problems to safely wait it out unless you need something *now*.

 

So right now I'm just really like "there are enough problems already to write off buying additional hardware". Waiting out several generations (going from the last DDR2 model (2006) to the last DDR3 model (2013) to the last DDR4 model (2021)) worked out pretty well, because by then all the issues with the platforms had been worked out.

 


19 minutes ago, porina said:

That time period saw quite a battle between AMD and Intel. Clock speeds were increasing at quite a pace, and it was AMD that won the marketing-milestone race to 1 GHz. That clock race contributed to Intel's decision to make the Pentium 4 design, which didn't do so great compared to the post-GHz AMD CPUs of the time.

 

It took a while before Intel's desktop CPUs reset from the P4 route, taking inspiration from their mobile designs and leading to the start of Core.

 

While AMD's X2 chips were the first dual-core CPU offerings, Intel already had HT giving a second thread years before that, and we had various dual-CPU mobos like the Abit BP6 that gave a two-core experience even earlier.

 

It wasn't that Intel didn't do anything; they couldn't execute due to their process woes and the 14nm meme that resulted. They had the architecture designs, but not the manufacturing. They still haven't completed their recovery path yet. We had a glimpse of what could have been with Ice Lake, the first post-Skylake architecture, in 2019. Had manufacturing not been a problem, it would have gone up against Zen 2, which was barely any better than Skylake. Of course, it's easy to talk about what could have been. They didn't have the manufacturing, the products didn't happen, and it's history now.

This is interesting, as it's in direct contradiction with WolframaticAlpha's claim that Intel didn't compete with AMD in the 90s, though your account is more in line with what I recall from the time. Though according to you (and I'm not calling it into question), AMD beat Intel on frequency, and then on performance as a result of that frequency, while I recall AMD being competitive with, and possibly better than, Intel at lower frequencies due to higher IPC in the mid-90s.

 

I didn't realize Intel had HT that early on. My first build was with an Athlon 64, which was generally regarded as better than Intel's offerings at the time as well as cheaper, but I constantly struggled due to the 1C/1T design, and after a couple of years I replaced it with an X2, which was night and day. Hard to say now, but who knows, maybe an Intel with HT would have been the better option from the start, though I'm guessing it was prohibitively expensive, especially considering it was another several years before HT became standard on all their CPUs rather than something you had to pay more for.

 

As for the process issues, that's fair, though I wonder how hard they were really trying, or whether they didn't push as hard and spend as much as they could have due to not having much competition. It just seems awfully coincidental that they suddenly started making massive improvements, especially finally replacing the extremely old iGPU with a halfway decent one, only after AMD started becoming a real threat again. Just like how they came out with the C2D/C2Q and then the i-series, all of which were substantial improvements over what they had been doing, only when the Athlons started taking away a decent amount of market share, and even then it took them a couple of years, indicating they weren't even ready.

 

In any event, whether they were anti-competitive so they wouldn't have to try as hard, or because they were struggling with manufacturing, or for any number of other reasons, their actions were both reprehensible and harmful. If it was done due to manufacturing issues, to keep from falling behind because of them rather than simply to allow complacency, that's just as bad if not worse. Though I don't know the exact timeline, and it seems their fab struggles happened toward the end of all that.

 

25 minutes ago, starsmine said:

pentium 4 and pentium D are both netburst and share almost no similarities with pentium 3 and earlier. and to say the improvements from 386 to 486 to 586 (pentium) to pentium II(p6 (as in 686)) and 3(p6) were incremental is a wild take. There was a massive difference between pentium and everything else called pentium. we went back to p6+AMD64 with core.

Pentium D was bad. The 386, 486, and 586 were late-80s/early-90s. Maybe I'm wrong, but I just don't recall the Pentium II and III being that significant, and the Pentium 4 was a decent improvement, but then it took another five years or so to get the Pentium D, which again wasn't good, as evidenced by the fact that they released the C2D only a year later, which was a massive improvement and set them off on a several-year run of very good gains until they started to stagnate again. Perhaps that more recent case was due to manufacturing issues, as mentioned by @porina, though as I said in my reply to them above, I seriously wonder whether they tried as hard as they could/should have to resolve those issues, versus figuring they didn't need to because they didn't have the competition to force them to.


3 hours ago, MageTank said:

This still does not address the issue of AMD intentionally confusing customers with their similar chipset naming conventions to Intel. The only thing that separates AMD and Intel is a single letter/digit and customers may forget which one is which. The easier solution would be using I and A for Intel vs AMD designation on chipsets. I370 vs A370. Still separated by a single letter change, but you can simply tell customers "Look for I if you need an Intel board, look for A if you need an AMD board". You can use additional numbers or letters at the very end of the chipset to designate different performance tiers.

 

Most reviews are simply click bait when the new hardware launches. Every time Intel launches a new product, the title is always "AMD is DEAD!!!!" or "Intel is BACK". When AMD launches a new product, it's the exact same thing in reverse. It's okay to be a fan of AMD, but it's not okay to ignore objective facts and substitute them with personal opinions.

 

No, it really doesn't. The very last antitrust ruling was back in 2010, and nothing has come since. Are you telling me Intel violating antitrust law back in 2010 is hurting AMD's laptop sales now? You realize that back in 2010, AMD's laptops were objectively awful, right? The best mobile AMD processor back then was the Phenom II X920 BE, which had to compete against the i7-940XM (Clarksfield). This was not a fair fight at all. The 940XM had twice the threads, significantly higher clock speeds, better IPC, and nearly 4x the cache. Even without antitrust violations, it would not have been hard to market against AMD with performance numbers like that backing your product up. AMD wasn't legitimately competitive in the mobile segment until Ryzen launched.

 

Are you claiming that antitrust violations from 13 years ago are still hurting AMD now? Do you have any evidence to back that claim up? At the very least, cite a single source, as I've been kind enough to do for you when making claims.

 

I need a source for this claim too. Let's look at AMD's own roadmap back in 2010:

[Image: AMD's 2010 roadmap, promising four new platforms]

"Increase both notebook performance and battery life by at least 25%". What do we know about Bulldozer's launch? Well... it certainly wasn't a 25% performance boost: https://arstechnica.com/gadgets/2011/10/can-amd-survive-bulldozers-disappointing-debut/. One could argue it was a regression in performance. Did Intel's antitrust practices cause this? Or was it AMD intentionally straying from Jim Keller's design in the hope that software development would follow suit? They made a bad gamble on Bulldozer and marketed it deceptively themselves: https://www.anandtech.com/show/14804/amd-settlement

 

Intel did not hurt AMD. What hurt AMD back then was trying to compete in two very competitive markets simultaneously. They acquired ATI back in '06, retired the ATI brand in 2010 (a heavy push into the GPU market), and subsequently put their eggs in that basket while their CPU business suffered. Was this Intel's fault too? Did they tell AMD it was a good deal to buy ATI and put their money toward marketing that instead?

 

I am going to join you in suspending logic here. Let's say you are right (ignoring AMD violating Intel's copyrights in their processors several times, and also violating the x86-64 license agreement a few times throughout the mid-90s and early 2000s). That means AMD was awarded damages for the impact Intel's antitrust violations had, right? Said damages should cover the R&D cost and should have allowed AMD to return to prominence in the market, right? How come, in the last 13 years, that has not happened? You've never answered the question of why your history lessons matter in the present (again, ignoring the misinformation). You keep ignoring evidence and are instead substituting your opinion as fact. When having a disagreement with someone, it's best to support your claims with proof so others can actually adjust their view on the subject. Start doing that.

 

Oh, I wouldn't want them to attempt to make PC building idiot-proof. The problem with attempting to make something idiot-proof is that you end up building a better idiot; it's an exercise in futility. I am simply asking that they not go out of their way to make things more difficult for consumers, as with the chipset naming conventions currently used by AMD and Intel.

 

It's not just about PC building, either. Let's say you do research online and learn that you need a B650 platform because it has certain features you need and fits your budget. You go to the store and speak to a salesperson, only to forget the exact name of the chipset. You say "oh, it was B-something," they say "B660?", and you say "Yeah! That was it," not realizing that B650 and B660, while sounding the same, are not the same. You leave with a B660 system, and it's missing a feature you were specifically looking for. You can't blame that on the consumer, when a slight difference in name could completely resolve the confusion without hurting the products in any way.

 

BTW, that was a hypothetical. Please do not research the differences between the B650 and B660 chipsets and roast me for it, lol.
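The I/A prefix scheme proposed earlier in this post is simple enough to sketch in a few lines; a toy illustration, using the hypothetical chipset names from the post (I370 and A370 are not real products):

```python
def board_vendor(chipset: str) -> str:
    """Map a chipset name to a CPU vendor under the proposed
    I-for-Intel / A-for-AMD prefix scheme (hypothetical names)."""
    prefix = chipset.strip().upper()[:1]
    return {"I": "Intel", "A": "AMD"}.get(prefix, "unknown")

print(board_vendor("I370"))  # Intel
print(board_vendor("A370"))  # AMD
print(board_vendor("B650"))  # unknown under this scheme
```

The point of the scheme is exactly what the one-character lookup shows: a shopper only has to remember the first letter, not which vendor currently owns "B" plus a three-digit number.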

Yeah, I will give you that. Honestly, I think AMD mimicking Intel's naming conventions was probably both good and bad. The main reason they did so is that anyone familiar with Intel's naming could easily grasp AMD's new lineup. On one hand, I don't see much of a problem with the R3/R5/R7/R9 naming scheme, even though it's clearly a copy of Intel's, simply because it's easy to differentiate between, say, an i7 and an R7. Even X670 is sort of OK, because it's still distinguishable, though more confusing than the CPU naming. The real problem is the B-series motherboards. They could easily have used a different letter and we probably wouldn't have as much confusion, but instead they went with the exact same naming convention, which was stupid and sort of petty, as they knew Intel would have to move its B-series numbering away from, say, 650 to what they use now, 660.


I just remembered Intel has also used the name Max for Xeon CPUs, calling them Xeon Max. The difference between this and the Core branding is that Xeon Max is a new product lineup, not a rebrand: there are still the other Xeon families, and then there is Xeon Max, which comes with 64GB of HBM.

 

Side note: speaking of Max, they even have a Max series of GPUs, but it's mainly for datacenters.

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


1 hour ago, vertigo220 said:

Though according to you (and I'm not calling it into question), AMD beat Intel on frequency, and then on performance as a result of that frequency, while I recall AMD being competitive with, and possibly better than, Intel at lower frequencies due to higher IPC in the mid-90s.

It was a long time ago, so my memory is somewhat unclear. I don't recall who was ahead in the clock wars at any given point, likely leapfrogging, but it was AMD that hit 1 GHz first. That's the memorable part of it.

 

I can't remember AMD offerings around the early Pentium 2 era. I recall owning Intel around then so whatever AMD did wasn't interesting to me.

 

Pentium 4 specifically was designed for clock, at the cost of lower IPC. The AMD CPUs of that era had higher IPC.

 

1 hour ago, vertigo220 said:

I didn't realize Intel had HT that early on. My first build was with an Athlon64, due to being generally regarded as better than Intel's offerings at the time as well as cheaper, but I constantly struggled due to the 1C/1T design, and after a couple years or so replaced it with an X2 which was night-and-day. Hard to say now, but who knows, maybe an Intel with HT from the start would have been the better option, but I'm guessing it was prohibitively expensive, especially considering it was another several years or more before HT became standard on all their CPUs and not something you had to pay more for.

Looking it up, the Pentium 4 with HT was from 2003; the Athlon 64 X2 was 2005. The cheapest HT model at the time was a 2.4 GHz part at $178. Similarly clocked ones of the same core without HT cost more, but they did come out a year or two earlier.

 

HT did help with system responsiveness in those early days of multi-tasking. I even went all out around then with a dual-socket Xeon system, giving me 2 cores and 4 threads in 2004, back in the days when consumer chipsets were hacked by mobo manufacturers to do things you wouldn't be allowed to now. That was before the X2 was released.



3 hours ago, vertigo220 said:

This is interesting, as it's in direct contradiction with WolframaticAlpha's claim that Intel didn't compete with AMD in the 90s

I didn't claim it didn't compete. I claimed its major priority wasn't AMD; it didn't compete with AMD in the sense that it had bigger fish to fry at the time.

5 hours ago, vertigo220 said:

As for fabs, I'm curious why you'd think they shouldn't have invested in them and didn't/don't need them

I take blame for the confusion here. I was talking about overinvesting in their fabs for excess capacity that didn't materialize.

5 hours ago, vertigo220 said:

I don't deny, and never said, that there aren't other reasons for AMD's struggles. Also, Intel didn't just do what they did in 2009; they did it for years and agreed to stop in 2009. So while yes, AMD made mistakes of their own, being hamstrung for years absolutely hurt them, causing a loss of profits AND market share, both of which have long-lasting detrimental effects. I don't know the exact timelines or their financials, so this is conjecture, but it certainly seems possible the loss of income from Intel's anti-competitive dominance in the laptop market contributed to the lackluster products you mentioned, as AMD had less money for R&D and less time to produce good products. People don't seem to understand the domino effect revenue (or the lack thereof) has on a company. As for video cards, I don't follow them as closely, but my impression was that ATI was behind Nvidia when they were acquired by AMD, so it's not that AMD mishandled things and caused Nvidia to pull ahead, but that they started out behind and just haven't managed to catch up (though they're getting close, save for ray tracing). I could be wrong on that, though.

 

Reread what I said. Laptops weren't the major cause of AMD falling behind. They made bad technological and financial decisions, overspent in the wrong places, and critically failed to compete with Intel in the server space for too long. Intel was there with its overhauled Xeons when PowerPC, SPARC, MIPS, etc. all started fading away. AMD's products were delayed, and when they finally came, they had worse performance than advertised, had to be recalled, and had their performance degraded post-relaunch.

 

>  but my impression was that ATi was behind Nvidia
ATI was lagging, but not by much. AMD's woes affected them, and the gap between Nvidia and ATI/AMD widened considerably post-acquisition. To put it plainly, AMD couldn't keep up.

 

The main issue for AMD was timing. Bad decisions like these might have been absorbed if spread over a larger timeframe. Instead, all of these missteps landed at the same time as Intel released Core, which was a massive success, and during the financial crisis of 2007-08.

 

Also, the reason AMD was first with 64-bit x86 processors and dual cores was mainly that Intel was betting Itanium would catch on and considered x86 a dead-end platform for the server and enterprise markets.

 

3 hours ago, vertigo220 said:

As for the process issues, that's fair, though I wonder how hard they were really trying, or if they didn't push as hard and spend as much as they could have due to not having much competition. It just seems awfully coincidental that they suddenly started having massive improvements, especially finally replacing the extremely old iGPU with a halfway decent one, only after AMD started becoming a real threat again. Just like how they came out with the C2D/C2Q and then the i-series, all of which were substantial improvements over what they had been doing, when the Athlons started taking away a decent amount of market share, and even then it took them a couple years, indicating they weren't even ready.

 

Sorry for parroting what their CEO says, but it is a concise summary of why Intel fell behind on process technology:

Quote

Right. This is what I was saying. I think it wasn’t a risky enough bet on EUV.

We were betting against it. We had taken a lot of risk in Intel 10 when we were like, “Hey, we don’t need EUV. We will go to advanced quad patterning of the lithography.” We were doing other things to avoid needing EUV, and those things just weren’t panning out. It might have been a good decision when we did it, but as those things slipped, we were on the wrong side of EUV. TSMC grabbed EUV because of that. By the way, Intel drove the creation of it. How did we not monetize and leverage something that we created? At a minimum, we should have had a parallel program on EUV that said, “If we get this wrong… If we get quad patterning or the other techniques we’re doing in this self-aligning wrong…” We should have had a program for that, but we didn’t. We were betting against it. How stupid could we be? 

Pat Gelsinger came back to turn Intel around — here’s how it’s going - The Verge


13 minutes ago, WolframaticAlpha said:

I didn't claim it didn't compete. I claimed its major priority wasn't AMD. It didn't compete with AMD in the sense that it had bigger fish to fry at the time.

Sorry, I took your statement as meaning they weren't really competitors, not that they were but that AMD just wasn't a "major" one. That makes more sense, but it still doesn't seem to align with what porina said or with what I recall. I don't even remember IBM being very big at that time, as they seem to have fallen out of the consumer space at least by then, and I don't recall SPARC at all. All I remember is Intel vs. AMD. That's not to say there wasn't more going on that I wasn't aware of, but it definitely does seem Intel was competing with AMD a good bit.

 

39 minutes ago, WolframaticAlpha said:

I take blame for the confusion here. I was talking about overinvesting in their fabs for excess capacity that didn't materialize.

OK, but I'm sure that's hard to know. I imagine they weren't just blindly investing; they thought they would have need of the capacity. Had the opposite happened, where they didn't invest and then needed it, as seems to be the case now, they could be equally criticized for that. Of course, it's also possible they just didn't do their due diligence and it was a case of irresponsible management.

 

45 minutes ago, WolframaticAlpha said:

Reread what I said. Laptops weren't major cause for why AMD fell behind. They made bad technological and financial decisions, overspent in the wrong places and critically failed to compete with Intel in the server space for too long. Intel was there with it's overhauled Xeons when PowerPC, SPARC, MIPS etc all started fading away. AMDs products were delayed, and when they finally came, had worse performance than advertised and had to be recalled and had performance degraded post relaunch.

Interesting info, and I have no doubt that all played a role; it may even have been the majority of it. All I'm saying is that Intel using anti-competitive tactics to gain/maintain laptop market share over AMD didn't help. It cost AMD both money in lost sales and name recognition, and both of those kept costing them down the road. It's absolutely possible that's a drop in the bucket compared to the other factors you mentioned and I'm overestimating its impact, but it doesn't change the fact that Intel engaging in those behaviors was harmful and egregious, which was my original/main point, and I still suspect the lost revenue had at least some impact on those other things.

 

59 minutes ago, WolframaticAlpha said:

ATI was lagging, but not by much. AMD's woes affected them, and the gap between Nvidia and ATI/AMD widened considerably post-acquisition. To put it plainly, AMD couldn't keep up.

Now that I think about it, I seem to recall feeling back then that maybe AMD bit off more than they could chew. Instead of being able to put their resources behind ATI to make it better, it just spread them too thin, especially, as you mentioned, due to Intel releasing the Core product.

 

1 hour ago, WolframaticAlpha said:

Also, the reason AMD was first with 64-bit x86 processors and dual cores was mainly that Intel was betting Itanium would catch on and considered x86 a dead-end platform for the server and enterprise markets.

 

Sorry for parroting what their CEO says, but it is a concise summary of why Intel fell behind on process technology:

Pat Gelsinger came back to turn Intel around — here’s how it’s going - The Verge

So basically they made a couple of bad bets, one after the other. That certainly explains a lot. Hindsight is 20/20, so it's easy for him to say that now, but back then, I suppose it wasn't so obvious. What's their reasoning, though, for keeping a crappy iGPU for years and only improving it when AMD did?


4 hours ago, porina said:

I can't remember AMD's offerings around the early Pentium 2 era. I recall owning Intel around then, so whatever AMD did wasn't interesting to me.

 

Pentium 4 specifically was designed for clock, at the cost of lower IPC. The AMD CPUs of that era had higher IPC.

AMD wasn't really competing with Intel on performance until the later Pentium III era, and then did much more, even surpassing Intel, in the Pentium 4 era. So basically 2000 onward.

 

I had a socket 939 motherboard and the original FX processors; those were sweet. I did move back to Pentium 4 HT though.

 

AMD was riding high on its 64-bit and then dual-core technology leadership, but then made successively worse architectures that spelt doom. Phenom (K10) wasn't that good, and then came Bulldozer, so yea, two bad moves in a row.


18 hours ago, vertigo220 said:

What's their reasoning though for keeping a crappy iGPU for years and only improving it when AMD did?

Well, Intel's approach to making GPUs was pretty weird for many years. Intel saw the line that Nvidia had taken, with GPUs essentially transitioning from specialized processors to general-purpose processors with a few optimizations for traditional workloads, and extended it. And so came the doomed Larrabee (Larrabee (microarchitecture) - Wikipedia) and Xeon Phi cards. Also, Xe didn't really come about because AMD threatened them in the GPU space, though that might've played a part. The major cause behind Intel going all-in on GPUs was that they wanted to compete with Nvidia in the rapidly expanding enterprise GPU market. Intel's, AMD's, and Nvidia's first priority is always enterprise parts, and most of their decision-making revolves around it.


44 minutes ago, WolframaticAlpha said:

Well, Intel's approach to making GPUs was pretty weird for many years. Intel saw the line that Nvidia had taken, with GPUs essentially transitioning from specialized processors to general-purpose processors with a few optimizations for traditional workloads, and extended it. And so came the doomed Larrabee (Larrabee (microarchitecture) - Wikipedia) and Xeon Phi cards. Also, Xe didn't really come about because AMD threatened them in the GPU space, though that might've played a part. The major cause behind Intel going all-in on GPUs was that they wanted to compete with Nvidia in the rapidly expanding enterprise GPU market. Intel's, AMD's, and Nvidia's first priority is always enterprise parts, and most of their decision-making revolves around it.

Makes sense, but I find the timing too coincidental. Intel released Xe shortly after AMD started pushing greatly improved iGPUs, which tells me they had it ready to go (possibly for the reasons you mentioned, i.e. development aimed at competing with Nvidia) and only actually started using it when they had to, as a reaction to AMD, rather than taking it upon themselves to improve it beforehand. I suppose it's possible they just weren't ready yet, or it was already in the pipeline, but I find the timing a bit suspicious.


The whole "i" thing was getting long in the tooth, but this is just bizarre.



17 hours ago, vertigo220 said:

Makes sense, but I find the timing too coincidental. Intel released Xe shortly after AMD started pushing greatly improved iGPUs

Stuff like this takes YEARS (3-4 years or more) to develop. Intel didn't just flick a switch to make their iGPUs decent. Realistically they had been working on this for years, and a number of companies, especially Apple, had been telling them to get better in the iGPU space for years. My guess is that Alder Lake and Xe had realistically been in the works since 2017-2018.


30 minutes ago, WolframaticAlpha said:

My guess is that Alder Lake and Xe had realistically been in the works since 2017-2018.

In the (non-PC) tech company I used to work at, there were multiple layers of work going towards a product on sale. I'd imagine large tech companies in general do something similar.

 

Considering expected future user needs:

1. Look at future technologies that may become useful, or create them if they don't exist

2. Look at new existing technologies

3. Actually design the product

4. Get manufacturing sorted

5. Ship it!

 

A formal product project team is one thing, but work contributing towards it would have started even earlier.



6 hours ago, WolframaticAlpha said:

Stuff like this takes YEARS (3-4 years or more) to develop. Intel didn't just flick a switch to make their iGPUs decent. Realistically they had been working on this for years, and a number of companies, especially Apple, had been telling them to get better in the iGPU space for years. My guess is that Alder Lake and Xe had realistically been in the works since 2017-2018.

Yeah, I realize they had to have been working on it for years already and couldn't just come up with it at the last minute. What I was saying is it seems like they may have had it ready to go and just sat on it, since they didn't have much competition in that regard. But I can also see that may not be the case, given Apple was pushing them to improve it; based on that, it seems it was perhaps a case of too little, too late, with them getting there just as Apple was breaking off the partnership.


On 5/1/2023 at 11:48 PM, LAwLz said:

I think rebranding their processors like this would have been better:

Intel Core i3 => Intel Bronze

Intel Core i5 => Intel Silver

Intel Core i7 => Intel Gold

Intel Core i9 => Intel Platinum

I kind of like this naming scheme; it would be so much easier to identify and understand where a chip stands in the product stack. It would work well for cross-gen comparisons as well. Maybe instead of wholly renaming it, make it the background colour of each tier on the sticker.
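The appeal of that proposal is that it's a flat one-to-one lookup rather than a decoder ring of SKU digits. A toy sketch in Python (the metal names are LAwLz's hypothetical proposal, not an actual Intel scheme):

```python
# Hypothetical rebrand proposed above: old "Core i" tier -> metal-tier name.
TIER_RENAME = {
    "Core i3": "Intel Bronze",
    "Core i5": "Intel Silver",
    "Core i7": "Intel Gold",
    "Core i9": "Intel Platinum",
}

def rename(old_name: str) -> str:
    """Return the proposed metal-tier name, or the input unchanged if unknown."""
    return TIER_RENAME.get(old_name, old_name)

print(rename("Core i7"))  # Intel Gold
```

The point of the sketch is that the mapping is generation-independent: a buyer only needs the tier word, not the full SKU, to place a chip in the stack.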

On 5/1/2023 at 6:55 PM, soldier_ph said:

Great, way to go Intel. Now you're gonna confuse non-techies, and especially techies, even more with your utterly stupid naming schemes. I really liked the "Core i" branding; imo it was legendary. Stop fixing things that actually work and invest that time into fixing things that are actually broken, like your high-end chips, which are a real pain to cool.

Honestly, half the reason I own a Ryzen PC is that it's SO much simpler to navigate the product stack, especially across generations. It's not that Intel makes bad chips (previous gens) or that they run as hot as the sun and need a Dyson Sphere cooler, but I can't be arsed to look up the SKU EVERY DAMN TIME I WANT TO KNOW HOW MANY CORES AND THREADS IT HAS.

On 5/1/2023 at 6:50 PM, OhioYJ said:

Really? A bunch of people sat around a table and that's the best thing they came up with...

Don't forget that they get paid $who-knows-how-many-Gs.
Honestly, they could have gone with something a bit more unique or understandable.



On 5/1/2023 at 12:50 PM, OhioYJ said:

Really? A bunch of people sat around a table and that's the best thing they came up with...

Not just people, Intel employees. It's not surprising. At all.

 

...back to you Steve! 


