Lord Szechenyi

What's the future of CPUs in the next 5 years?

Recommended Posts

Posted · Original Poster (OP)

Since we're getting close to the maximum (or rather, minimum) feasible size of CPU dies when using silicon, will we see some other material, will systems with more than one CPU become popular, or will CPUs simply get bigger?

1 minute ago, Lord Szechenyi said:

Since we're getting close to the maximum (or rather, minimum) feasible size of CPU dies when using silicon, will we see some other material, will systems with more than one CPU become popular, or will CPUs simply get bigger?

The future?

Intel and AMD will go down in flames.

Via will rise again.

Along with Matrox.

 

Hey, if I could predict the future, I'd be a lot richer than I am now.



Posted · Original Poster (OP)
1 minute ago, Radium_Angel said:

The future?

Intel and AMD will go down in flames.

Via will rise again.

Along with Matrox.

 

Hey, if I could predict the future, I'd be a lot richer than I am now.

That doesn't make any sense.

EVERYONE would be affected by this, since (AFAIK) all CPUs are manufactured in the same way.


But don't worry, I still found your joke funny.

 

3 minutes ago, Lord Szechenyi said:

will we see some other material

How do you expect anyone to be able to answer this? We have no way of knowing until someone comes up with a new, suitable material!



1 minute ago, Lord Szechenyi said:

That doesn't make any sense.

EVERYONE would be affected by this, since (AFAIK) all CPUs are manufactured in the same way.


But don't worry, I still found your joke funny.

 

Personally, I'd love to see the resurrection of custom iron like the SGI and Sun systems. It won't happen, but I'd kill to have a modern non-x86 SGI on my desk.



Posted · Original Poster (OP)
1 minute ago, Radium_Angel said:

Personally, I'd love to see the resurrection of custom iron like the SGI and Sun systems. It won't happen, but I'd kill to have a modern non-x86 SGI on my desk.

I had no idea what you were talking about, so I checked it out, and WOW, I was impressed.

But the problem is, they stopped researching that kind of CPU decades ago (probably for the best), so even if we did make these now, it would take a lot of research time to get remotely close to the level of today's CPUs.


IMO they will switch to some other semiconductor like germanium (unlikely).



 

5 minutes ago, Lord Szechenyi said:

I had no idea what you were talking about, so I checked it out, and WOW, I was impressed.

But the problem is, they stopped researching that kind of CPU decades ago (probably for the best), so even if we did make these now, it would take a lot of research time to get remotely close to the level of today's CPUs.

Yeah, even today those old SGI and Sun systems go for prime dosh on eBay, despite being hopelessly outclassed by even 2nd-gen Intel chips. Shame, really; there was an elegance to SGI systems you won't find today.



Posted · Original Poster (OP)
5 minutes ago, Biomecanoid said:

Since everybody is talking about their hopes and dreams :P, I want 3DFX to rise from the dead and, instead of Direct3D and OpenGL, for Glide to become the norm.

GOD PLEASE NO

We already have issues nowadays with old games not running correctly; I don't want today's games to have compatibility issues in the future.

Posted · Original Poster (OP)
21 minutes ago, Ian Greenhalgh said:

Considering how little mainstream CPUs have changed in the last 5 years, I don't expect much to change radically in the next 5 years.

Fair point, but something quite big is going to (and must) happen to make better CPUs possible.


Some say graphene is the future. Graphene chips can in theory run at frequencies of 1000+ GHz, use less power, dissipate heat faster, and conduct electricity better than current materials. It's all at the experimental stage, so there's no telling if it will work out. It could just be hype.

Posted · Original Poster (OP)
23 minutes ago, Internet Person said:

Some say graphene is the future. Graphene chips can in theory run at frequencies of 1000+ GHz, use less power, dissipate heat faster, and conduct electricity better than current materials. It's all at the experimental stage, so there's no telling if it will work out. It could just be hype.

Well, there really isn't any other viable option.


Ryzen 103 7900000XX

I52 142450000k

 

 

/s

 

Should be interesting to see. Especially with where we've come in the last 5 years.



14 minutes ago, Lord Szechenyi said:

Well, there really isn't any other viable option.

 

Yeah, the other option is stagnation. We hit the GHz barrier, and we already have 16+ core consumer CPUs, so I don't see much room in that direction. New materials are the only way.


5-year time frame:

I'm speculating:


Hybrid architectures (handful of big cores, large number of small cores)
More specialized hardware/accelerators (think video accelerators, AI inferencing accelerators, etc.)
An extra level in the memory hierarchy (eDRAM? HMC/HBM? Maybe the last level of RAM is going to be Optane?)
 


 

I also wouldn't be surprised to see more stuff out of ARM. We could end up in a market where AMD/Intel serve high-performance needs and ARM is adequate for everything else. Why does my mother need anything more than a video-conferencing box + a handful of 20-year-old games + Office + taxes?



Posted · Original Poster (OP)
2 minutes ago, comander said:

5-year time frame:

I'm speculating:


Hybrid architectures (handful of big cores, large number of small cores)
More specialized hardware/accelerators (think video accelerators, AI inferencing accelerators, etc.)
An extra level in the memory hierarchy (eDRAM? HMC/HBM? Maybe the last level of RAM is going to be Optane?)
 


 

I also wouldn't be surprised to see more stuff out of ARM. We could end up in a market where AMD/Intel serve high-performance needs and ARM is adequate for everything else. Why does my mother need anything more than a video-conferencing box + a handful of 20-year-old games + Office + taxes?

I'm doubtful about all of the ones you mentioned except ONE.

AI may very well be the answer; after all, given what we are seeing with Nvidia, it's very likely that AI is the best solution.


Only 5 years ahead? People might still be using 3rd Gen Ryzen lol.

 

What of 20 years from now? 5 years we can already see from here....



Posted · Original Poster (OP)
Just now, ShrimpBrime said:

Only 5 years ahead? People might still be using 3rd Gen Ryzen lol.

 

What of 20 years from now? 5 years we can already see from here....

No, I mean that we are about two CPU generations away from hitting a peak (at least with silicon).

I'm sure most people will keep using current CPUs, but will applications stop getting better? Will there be nothing else after ray tracing?

1 minute ago, Lord Szechenyi said:

No, I mean that we are about two CPU generations away from hitting a peak (at least with silicon).

I'm sure most people will keep using current CPUs, but will applications stop getting better? Will there be nothing else after ray tracing?

Depends.

 

Humans are starting to let AI engineer stuff. Perhaps the way we compute may totally change; x128 instead of x64 could even be a possibility.

 

As for the hardware, because of something like the above, the physical processor might change greatly.

 

One thing is certain: the software is a bit behind.

 

Either way, I'm in for the ride!



6 minutes ago, Lord Szechenyi said:

I'm doubtful about all of the ones you mentioned except ONE.

AI may very well be the answer; after all, given what we are seeing with Nvidia, it's very likely that AI is the best solution.

Intel already launched such a part fairly recently, and this setup was commonplace in the ARM world YEARS ago.
Beyond that, Intel has more accelerators on their roadmap and is making a big deal out of it. Also, anecdotally, I went to a holiday party and ended up chatting with a CPU designer from a big company that designs processors... they're making accelerators. That's the future: figure out 10 use cases and make dedicated hardware that does those 10 things VERY, VERY well at very low die-area/energy cost. This overlaps with neural network inferencing.

Same idea with more cache levels - Intel already has it (Broadwell, and also Xeon with Optane).

 

For most of these, it's more a question of where, how, and to what extent - not whether it'll happen. Some of these things will make sense in SOME use cases but not others (e.g. server vs. laptop vs. workstation vs. desktop all have different performance needs and cost targets).

 

Just now, ShrimpBrime said:

Humans are starting to let AI engineer stuff. Perhaps the way we compute may totally change; x128 instead of x64 could even be a possibility.

The main benefit of going from 32-bit to 64-bit was the ability to address more RAM. There aren't really any performance benefits, and there are probably performance drawbacks due to increased overhead.

2^32 = 4,294,967,296 bytes, aka 4 GB of RAM.
2^64 ≈ 1.8446744e+19 bytes, aka roughly 16 billion GB of RAM.

I have yet to find a use case for normal computers that needs more than 16 billion GB.
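For what it's worth, here's that arithmetic spelled out as a minimal Python sketch; it only restates the 2^32 and 2^64 figures quoted above, nothing more.

```python
# Flat address-space limits for 32-bit vs 64-bit addressing.
GIB = 2 ** 30  # bytes per GiB

for bits in (32, 64):
    limit = 2 ** bits  # bytes addressable with an address `bits` wide
    print(f"{bits}-bit addressing: {limit:,} bytes = {limit // GIB:,} GiB")

# 32-bit addressing: 4,294,967,296 bytes = 4 GiB
# 64-bit addressing: 18,446,744,073,709,551,616 bytes = 17,179,869,184 GiB (16 EiB)
```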
 



Posted · Original Poster (OP)
9 minutes ago, comander said:

Intel already launched such a part fairly recently, and this setup was commonplace in the ARM world YEARS ago.
Beyond that, Intel has more accelerators on their roadmap and is making a big deal out of it. Also, anecdotally, I went to a holiday party and ended up chatting with a CPU designer from a big company that designs processors... they're making accelerators. That's the future: figure out 10 use cases and make dedicated hardware that does those 10 things VERY, VERY well at very low die-area/energy cost. This overlaps with neural network inferencing.

Same idea with more cache levels - Intel already has it (Broadwell, and also Xeon with Optane).

For most of these, it's more a question of where, how, and to what extent - not whether it'll happen. Some of these things will make sense in SOME use cases but not others (e.g. server vs. laptop vs. workstation vs. desktop all have different performance needs and cost targets).

The main benefit of going from 32-bit to 64-bit was the ability to address more RAM. There aren't really any performance benefits, and there are probably performance drawbacks due to increased overhead.

2^32 = 4,294,967,296 bytes, aka 4 GB of RAM.
2^64 ≈ 1.8446744e+19 bytes, aka roughly 16 billion GB of RAM.

I have yet to find a use case for normal computers that needs more than 16 billion GB.
 

Honestly, it was also for storage,

so the max file size doesn't have to be 4 GB (or whatever the number was).

18 minutes ago, Lord Szechenyi said:

Honestly, it was also for storage,

so the max file size doesn't have to be 4 GB (or whatever the number was).

I don't believe there's any direct connection between HDD/SSD size and register width on a CPU. The same goes for file size. It's possible I'm off there; it's not my expertise, and I haven't done a deep dive into that area in around 15 years.

 

If you're talking about file sizes directly, that SHOULD have more to do with the file system than the register width of the CPU. 

As an aside, there were workarounds (PAE) for addressing more RAM, but those workarounds had drawbacks.

 

https://en.wikipedia.org/wiki/64-bit_computing#32-bit_vs_64-bit
 

 

EDIT: Looks like I'm wrong about file size limitations. Being able to address a large file DOES potentially rely on having a larger word size to some extent. Switching to NTFS was the main thing that handled it, though, even on 32-bit CPUs/OSes.

 

I will go on record and say that I see 0 need for 16 billion GB files in the next 5 years, even for enterprise use cases. 
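To put rough numbers on the file-size and PAE points, here's a minimal Python sketch. The FAT32 and PAE figures are standard published limits and the 64-bit line is plain integer arithmetic; none of it is taken from the posts above.

```python
# Limits relevant to the file-size / PAE discussion above.
GIB = 2 ** 30  # bytes per GiB

# FAT32 stores file sizes in a 32-bit field, so the largest single file is 2^32 - 1 bytes.
fat32_max_file = 2 ** 32 - 1
print(f"FAT32 max file size:        {fat32_max_file:,} bytes (~4 GiB)")

# With a signed 64-bit file offset (the common convention in modern file APIs),
# a file can in principle reach 2^63 - 1 bytes, far beyond any drive you can buy.
signed_offset_max = 2 ** 63 - 1
print(f"Signed 64-bit offset limit: {signed_offset_max:,} bytes")

# PAE widens *physical* addressing on 32-bit x86 from 32 to 36 bits (64 GiB of RAM),
# while each process still sees a 32-bit (4 GiB) virtual address space.
print(f"PAE physical memory limit:  {2 ** 36 // GIB} GiB")
```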



28 minutes ago, comander said:

 

I have yet to find a use case for normal computers that needs more than 16 billion GB.
 

Interesting way to put it.... lol 

 

Who said a processor's main focus was consumer-related?

 

 

 



