Why do AMD/nVidia/Intel even bother innovating?

Raytsou

Each of these companies is part of a duopoly in its market. What stops them from simply not releasing any new chips for 5-10 years? No other company could catch up to them in that time frame. They could even offer drivers as a subscription so they'd still have a source of revenue without people upgrading their hardware.

This is a signature.

Intel is currently in that very phase of not releasing a new chip for ~5-10 years. Because of it, AMD is currently slaughtering them in year-over-year market gains. (Read: Intel has market-share dominance, but they're losing ground to AMD's sales performance.)

Nvidia continues to innovate because they push technologies beyond consumer hardware, and it trickles down. You're also seeing premiums for their products because they currently hold the "if you want the best, it's us" position, with AMD only recently beginning to compete again via their "Big Navi" RDNA2 launch, what limited stock there's been.

~Remember to quote posts to continue support on your thread~
-Don't be this kind of person-

CPU:  AMD Ryzen 7 5800x | RAM: 2x16GB Crucial Ripjaws Z | Cooling: XSPC/EK/Bitspower loop | MOBO: Gigabyte x570 Aorus Master | PSU: Seasonic Prime 750 Titanium  

SSD: 250GB Samsung 980 PRO (OS) | 1TB Crucial MX500 | 2TB Crucial P2 | Case: Phanteks Evolv X | GPU: EVGA GTX 1080 Ti FTW3 (with EK Block) | HDD: 1x Seagate Barracuda 2TB

13 minutes ago, Raytsou said:

Each of these companies is part of a duopoly in its market. What stops them from simply not releasing any new chips for 5-10 years? No other company could catch up to them in that time frame. They could even offer drivers as a subscription so they'd still have a source of revenue without people upgrading their hardware.

The only way that happens is if all the big players make a pact to stop innovating, which is... not gonna happen. It's extremely unlikely they'd be able to make much money off subscription drivers as opposed to actual products. Even then, there's the risk that smaller companies, or even individuals, eventually catch up. It's not nearly as impossible as you claim, even within a timeframe of 5-10 years.

I am NOT a professional and a lot of the time what I'm saying is based on limited knowledge and experience. I'm going to be incorrect at times. 

Motherboard Tier List
How many watts do I need?
Best B550 Motherboards
Best Intel Z490 Motherboards
PC Troubleshooting
You don't need a big PSU
PSU Tier List
Common PC building mistakes
PC BUILD Guide! (POV)
How to Overclock your CPU

 

In the long run they know they'll make more money by innovating, because gamers want more fps at higher detail settings whilst enterprises want faster number-crunching with less power consumption and heat output.

Well, for one, it's not a true duopoly. Intel has the lion's share of the market in desktops, laptops, and servers alike. AMD is playing the role of the upstart right now, stealing market share in all three, specifically because of their R&D and innovation. Intel is losing ground specifically because they sat on their laurels for too long. For a true duopoly, there has to be roughly equal market share. Otherwise, one is always going to be trying to steal the other's toys.

 

Besides, there's the looming threat of ARM to both. Apple is making its own ARM desktop and mobile chips now, and mobile (except for laptops) was completely lost to ARM long ago. Nvidia is making a play for servers with their ARM CPUs, and Microsoft has been trying (unsuccessfully) to bring ARM to PC laptops. Eventually some company is going to make a breakthrough there, especially now that Apple has lit the way.

 

That leaves desktops as the last bastion for x86. That will hold out longer, because power efficiency is not such an issue there. However, many countries are enforcing power efficiency standards that may at least make ARM very enticing on the desktop for business machines, and there may come a point where ARM is just better for performance as well. There's already research showing some serious potential in that direction. Both Intel and AMD will have to stay innovative just to prevent the inevitable flank by ARM.

CPU: AMD Ryzen 9 5900X · Cooler: Arctic Liquid Freezer II 280 · Motherboard: MSI MEG X570 Unify · RAM: G.Skill Ripjaws V 2x16GB 3600MHz CL16 (2Rx8) · Graphics Card: ASUS GeForce RTX 3060 Ti TUF Gaming · Boot Drive: 500GB WD Black SN750 M.2 NVMe SSD · Game Drive: 2TB Crucial MX500 SATA SSD · PSU: Corsair White RM850x 850W 80+ Gold · Case: Corsair 4000D Airflow · Monitor: MSI Optix MAG342CQR 34” UWQHD 3440x1440 144Hz · Keyboard: Corsair K100 RGB Optical-Mechanical Gaming Keyboard (OPX Switch) · Mouse: Corsair Ironclaw RGB Wireless Gaming Mouse

27 minutes ago, Raytsou said:

Each of these companies is part of a duopoly in its market. What stops them from simply not releasing any new chips for 5-10 years? No other company could catch up to them in that time frame. They could even offer drivers as a subscription so they'd still have a source of revenue without people upgrading their hardware.

Because they compete against *each other*. Should they collude to stop competing with each other, it would all but certainly end in an antitrust suit.

 

They also still compete against *themselves*. If they stop making new products, their only customers will be people who need either additional CPUs or replacement CPUs; there would no longer be people looking for *upgrade CPUs*. No one will pay to upgrade their hardware if they can only buy the exact same hardware.

Desktop: Ryzen 9 3950X, Asus TUF Gaming X570-Plus, 64GB DDR4, MSI RTX 3080 Gaming X Trio, Creative Sound Blaster AE-7

Gaming PC #2: Ryzen 7 5800X3D, Asus TUF Gaming B550M-Plus, 32GB DDR4, Gigabyte Windforce GTX 1080

Gaming PC #3: Intel i7 4790, Asus B85M-G, 16GB DDR3, XFX Radeon R9 390X 8GB

WFH PC: Intel i7 4790, Asus B85M-F, 16GB DDR3, Gigabyte Radeon RX 6400 4GB

UnRAID #1: AMD Ryzen 9 3900X, Asus TUF Gaming B450M-Plus, 64GB DDR4, Radeon HD 5450

UnRAID #2: Intel E5-2603v2, Asus P9X79 LE, 24GB DDR3, Radeon HD 5450

MiniPC: BeeLink SER6 6600H w/ Ryzen 5 6600H, 16GB DDR5 
Windows XP Retro PC: Intel i3 3250, Asus P8B75-M LX, 8GB DDR3, Sapphire Radeon HD 6850, Creative Sound Blaster Audigy

Windows 9X Retro PC: Intel E5800, ASRock 775i65G r2.0, 1GB DDR1, AGP Sapphire Radeon X800 Pro, Creative Sound Blaster Live!

Steam Deck w/ 2TB SSD Upgrade

Also IBM is still lingering in the shadows... 

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used: Corsair Link (Anime Edition), MSI Afterburner, OpenRGB, Lively Wallpaper, OBS Studio, Shutter Encoder, Avidemux, FSResizer, Audacity, VLC, WMP, GIMP, HWiNFO64, Paint, 3D Paint, GitHub Desktop, Superposition, Prime95, Aida64, GPUZ, CPUZ, Generic Logviewer

Just now, Mark Kaine said:

Also IBM is still lingering in the shadows... 

Don't forget Cyrix!
I'm sure they'll come back soon...

Any day now...

 

come back Cyrix all is forgiven

elephants

3 minutes ago, FakeKGB said:

Don't forget Cyrix!
I'm sure they'll come back soon...

I do wonder though, why is IBM making 2nm chips... what's their motivation... what are they doing anyways?

 

Cyrix somehow went right past me... but I always thought IBM would never lose their position as innovator (and market leader).

Still waiting on CELL 2 personally! 👀

Lawsuits

🌲🌲🌲

◒ ◒

11 hours ago, Raytsou said:

Each of these companies is part of a duopoly in its market.

If this were fully true and they started working together to maximize profits, it would become a cartel, which is against the law in any country with free and open markets. It's the same reason monopolies are watched very strictly where they are allowed at all.

 

Competition is why companies need to innovate. If one company just stopped, the other would bring better products, creating a situation where even dropping prices would not be enough to win market share back. This is why even Microsoft, which has long held a dominant place in the consumer OS space, has had to keep improving its software. It's also why even companies producing basic goods, bakeries for example, need to innovate and bring out new products: not just to keep old customers happy, but to bring in new customers.

^^^^ That's my post ^^^^
<-- This is me --- That's your scrollbar -->
vvvv Who's there? vvvv

14 hours ago, Raytsou said:

What stops them from simply not releasing any new chips for 5-10 years

The fact that in that case no one would buy a new CPU because their current one remains top of the line for 10 years?

13 hours ago, Mark Kaine said:

I do wonder though, why is IBM making 2nm chips... what's their motivation... what are they doing anyways?

They are doing a lot of research and licensing out the patents. For example, Intel recently announced that they will collaborate more with IBM.

13 hours ago, FakeKGB said:

Don't forget Cyrix!
I'm sure they'll come back soon...

Any day now...

 

come back Cyrix all is forgiven

Don't forget VIA

I suppose the answer at this point is "it's illegal" and "there's too much interest in more computation power."

Coolios.

14 hours ago, Chris Pratt said:

Well, for one, it's not a true duopoly. Intel has the lion's share of the market in desktops, laptops, and servers alike. AMD is playing the role of the upstart right now, stealing market share in all three, specifically because of their R&D and innovation. Intel is losing ground specifically because they sat on their laurels for too long. For a true duopoly, there has to be roughly equal market share. Otherwise, one is always going to be trying to steal the other's toys.

 

Besides, there's the looming threat of ARM to both. Apple is making its own ARM desktop and mobile chips now, and mobile (except for laptops) was completely lost to ARM long ago. Nvidia is making a play for servers with their ARM CPUs, and Microsoft has been trying (unsuccessfully) to bring ARM to PC laptops. Eventually some company is going to make a breakthrough there, especially now that Apple has lit the way.

 

That leaves desktops as the last bastion for x86. That will hold out longer, because power efficiency is not such an issue there. However, many countries are enforcing power efficiency standards that may at least make ARM very enticing on the desktop for business machines, and there may come a point where ARM is just better for performance as well. There's already research showing some serious potential in that direction. Both Intel and AMD will have to stay innovative just to prevent the inevitable flank by ARM.

 

13 hours ago, Mark Kaine said:

Also IBM is still lingering in the shadows... 

I suppose an extension of my question would be: what if Qualcomm, Apple, AMD, Nvidia, Intel, IBM, etc. formed a cartel and just stopped innovating? Which, I guess, is still very illegal.

I was talking with my friend, and we came to the conclusion that the military-industrial complex probably pushes for increases in computation power, and that naturally trickles down to consumers.

 

13 hours ago, FakeKGB said:

Don't forget Cyrix!
I'm sure they'll come back soon...

Any day now...

 

come back Cyrix all is forgiven

IIRC, Cyrix got bought by VIA. I actually interned at VIA's headquarters in Taipei last summer. I asked my supervisor about their x86 division and was told they gave up, but they currently license their x86 patents to a Chinese company called ZhaoXin. I haven't looked into them much, but I recall reading a forum post on here a few years ago about a leaked engineering sample running at <2.0 GHz with Skylake-level IPC.

12 hours ago, Mark Kaine said:

I do wonder though, why is IBM making 2nm chips... what's their motivation... what are they doing anyways?

 

Cyrix somehow went right past me... but I always thought IBM would never lose their position as innovator (and market leader).

Still waiting on CELL 2 personally! 👀

Ooh! I think I'm qualified to answer this. Moore's Law tells us that the number of transistors per chip doubles every two years, but it doesn't say how we get there. CPUs are more powerful when they have more transistors, but there is a maximum number of transistors you can cram onto a chip before manufacturing costs become prohibitive. Lithography methods improve and become more accurate every year, and Dennard scaling tells us how to use that improvement to make each transistor faster.

 

In short, a transistor is a gate where current above a threshold is a 1 and below is a 0. This current flows from a drain to a source via a channel of silicon. Dennard scaling says: decrease the width and the length of the channel each by 0.7x, and design the voltages such that the electrons flow across this channel at the same velocity. The total channel area is reduced by 0.7 * 0.7 ≈ 0.5, so twice as many transistors fit in the same area, and constant electron velocity over a 0.7x-shorter channel cuts the transistor delay to 0.7x, so the clock can run roughly 1.4x faster each generation. Not only that, but in doing so we maintain our power density, so heat is well managed.
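To make that arithmetic concrete, here is a minimal sketch in Python. The 0.7x factor and the C*V^2*f switching-power model are the classic Dennard assumptions from the paragraph above, not figures for any specific process node:

# One generation of ideal Dennard scaling, per the 0.7x rule above.
k = 0.7                     # linear scaling factor per generation

area = k * k                # channel area: ~0.49x, so ~2x transistors fit per mm^2
delay = k                   # constant velocity over a 0.7x channel: 0.7x delay
clock = 1 / k               # ~1.43x clock speed
density = 1 / (k * k)       # ~2.04x transistors in the same silicon area

# Switching power per transistor ~ C * V^2 * f.
# Capacitance scales with k, voltage with k (so V^2 with k^2), frequency with 1/k:
power_per_transistor = k * k**2 * (1 / k)        # = k^2, ~0.49x

# The density gain and the per-transistor power reduction cancel exactly,
# which is why power density (heat per unit area) stays constant.
power_density = density * power_per_transistor   # = 1.0

print(f"area x{area:.2f}, delay x{delay:.2f}, clock x{clock:.2f}, "
      f"density x{density:.2f}, power density x{power_density:.2f}")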

 

Thus, every other year the transistor industry has tried to follow the 0.7x rule (22*0.7 = 15.4, 14*0.7 = 9.8, 10*0.7 = 7, 7*0.7 = 4.9, 5*0.7 = 3.5, 3*0.7 = 2.1). Unfortunately, this has become increasingly hard to do, as making the device smaller means electrons just tunnel through the channel, which causes leakage current and excessive heat (resistive power is proportional to current squared). We know that silicon transistors as we've been building them aren't really feasible at 2nm, and so IBM figuring out 2nm with nanosheet (gate-all-around) transistors is a p big deal.
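For reference, the node-name arithmetic in those parentheses is just the same 0.7x rule applied to each marketing name in turn; a tiny Python loop reproduces it (node "names" are marketing labels, so each result gets rounded to the next familiar number before the rule is applied again):

# The 0.7x node progression quoted above.
for node in (22, 14, 10, 7, 5, 3):
    print(f"{node}nm * 0.7 = {node * 0.7:.1f}nm")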

 

12 hours ago, Caroline said:

That's how the system works.

 

Who would win? The company making light bulbs that last 20,000 hours, or the one making bulbs that only last 1,000?

 

Think about it: if everyone made lasting, efficient stuff without pursuing infinite wealth, this economic system would simply fall apart. There would be no reason to ravage resources to create "new" stuff, millions of jobs wouldn't exist, and most importantly: software developers would have to actually optimise code to make it compatible with existing hardware, instead of relying on, e.g., phone manufacturers to simply increase memory, or CPU foundries to find a way to make MoRe CoReS!!! :old-laugh:

Funny you say that...

 

27 minutes ago, akio123008 said:

The fact that in that case no one would buy a new CPU because their current one remains top of the line for 10 years?

Nothing stops them from baking planned obsolescence into the next decade of CPUs, then throwing their hands up and saying "we hit a wall". Imagine if Intel had been able to maintain the status quo from 5 years ago for another 20+ years...

7 minutes ago, Raytsou said:

We know that silicon transistors as we've been building them aren't really feasible at 2nm

I know!

https://www.google.com/amp/s/www.technologyreview.com/2020/02/24/905789/were-not-prepared-for-the-end-of-moores-law/amp/

 

RIP

 

“It’s over. This year that became really clear,” says Charles Leiserson, a computer scientist at MIT and a pioneer of parallel computing, in which multiple calculations are performed simultaneously. The newest Intel fabrication plant, meant to build chips with minimum feature sizes of 10 nanometers, was much delayed, delivering chips in 2019, five years after the previous generation of chips with 14-nanometer features. Moore’s Law, Leiserson says, was always about the rate of progress, and “we’re no longer on that rate.” Numerous other prominent computer scientists have also declared Moore’s Law dead in recent years. In early 2019, the CEO of the large chipmaker Nvidia agreed.

 

13 minutes ago, Raytsou said:

IBM figuring out 2nm with nanosheet (gate-all-around) transistors is a p big deal.

I see, but I was more asking why they aren't making consumer chips anymore (afaik). Did the failed Sony cooperation on CELL really break their neck? (And yes, I know it's not really a feasible concept, but it was still a breakthrough technology for multi-core processing, which wasn't really a thing back then.)

 

16 minutes ago, Raytsou said:

We hit a wall"

Well, they kinda did...! Until we get those 2nm IBM nanosheets, all we're seeing is really fake innovation: packing more things into bigger packages so people think "oh, Moore's Law isn't really dead at all!"...

¯\_(ツ)_/¯

 

1 hour ago, Mark Kaine said:

https://www.google.com/amp/s/www.technologyreview.com/2020/02/24/905789/were-not-prepared-for-the-end-of-moores-law/amp/

Bro I had to set up a connection to my college's VPN to get access to that article.

 

1 hour ago, Mark Kaine said:

I see, but I was more asking why they aren't making consumer chips anymore (afaik). Did the failed Sony cooperation on CELL really break their neck? (And yes, I know it's not really a feasible concept, but it was still a breakthrough technology for multi-core processing, which wasn't really a thing back then.)

I wouldn't really be able to comment on the management decisions of IBM and frankly I have no idea what they're doing.

 

1 hour ago, Mark Kaine said:

Well, they kinda did...! Until we get those 2nm IBM nanosheets, all we're seeing is really fake innovation: packing more things into bigger packages so people think "oh, Moore's Law isn't really dead at all!"...

¯\_(ツ)_/¯

That's not fair to say imo. There are a lot of other ways to innovate in chip design beyond lithography. The transistor itself has been more or less redesigned every generation. We just need people to come up with creative solutions, whether in materials science or electrical engineering.


AMD did that (well, kinda): Intel took over.

Intel did that (once again, kinda): AMD took over.

Nvidia had one bad generation: AMD caught up a huge amount.

AMD stopped making good cards back in the early 2000s: NVIDIA got a two-decade-long lead.

This never goes well for any company. Companies that aren't AMD, Intel, or Nvidia tried this, which is why there are only two consumer companies in each market.

I could use some help with this!

Please PM me if you would like to contribute to my GPU BIOS database (includes overclocking BIOSes, stock BIOSes, and upgrades to GPUs via modding):

Bios database


23 minutes ago, Raytsou said:

That's not fair to say imo

I wasn't trying to be fair or not; I'm not saying they aren't creative or not trying, more that they really hit a wall: 7nm, 5nm... if Moore's Law still worked we should be at like 1nm already or something, but it doesn't, which is where something like IBM's nanosheets comes in. Maybe it's just not lucrative for companies like Intel, AMD, and Nvidia to be truly innovative anymore (as in actually finding a replacement for outdated silicon); they're more innovating around that problem currently. (That's my impression, I mean, I don't really know what they're doing behind closed doors, obviously.)

 

 

24 minutes ago, Raytsou said:

Bro I had to set up a connection to my college's VPN to get access to that article.

I'm sorry, it used to be free! MIT be like broke apparently. n_n

Intel has been innovating in the last 5 years? Besides the dedicated graphics cards, what have they done for innovation in their historic field?

I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900k, Motherboard Gigabyte Aorus Elite AX, RAM CORSAIR Vengeance 4x16gb 5200 MHZ, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2tb, CORSAIR Force Series MP510 1920GB NVMe, CORSAIR Force Series MP510 960GB NVMe, PSU CORSAIR HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiator, Displays Odyssey G9, LG 34UC98-W 34-Inch, Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6xx headphones, Go XLR

Oppbevaring

CPU i9-9900k, Motherboard ASUS Rog Maximus Code XI, RAM 48GB Corsair Vengeance LPX 3200 MHz (2x16GB)+(2x8GB), GPUs Asus ROG Strix 2070 8gb, PNY 1080, Nvidia 1080, Case Mining Frame, 2x Storage Samsung 860 Evo 500 GB, PSU Corsair RM1000x and RM850x, Cooling Asus Rog Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 
