
Are we going to see the first terahertz CPU in the future?

Basically, when you get up to these frequencies, the propagation speed of electrical signals actually matters. Light travels about 300,000,000 meters in 1 second. So, if you have a 1GHz CPU, doing 1,000,000,000 cycles in 1 second, light can travel 0.3m (30cm) before the next CPU operation occurs. In a 10GHz CPU, light could only travel 3cm before the next CPU cycle occurs. That means the maximum distance (so, the diagonal) across a 10GHz CPU die can theoretically be no greater than 3cm, and less in practice, though perhaps it could be stretched a bit with some clever engineering, ensuring that one side of the CPU is never required to signal to the opposite corner or something... It would be a whole new level (nightmare) in CPU architecting... Anyway, the limit would be reduced to 1.5cm for a 20GHz CPU, 3mm for a 100GHz CPU, and so on.

So, to increase clock speeds we would be limited to a very small die size, and we would basically rely on further transistor gate shrinks to fit more transistors in that maximum area. Transistors, though, are approaching the scale where the size of atoms, along with electron leakage, restricts any further gate shrinks. Carbon atoms, such as in graphene, are smaller than silicon atoms, so using graphene would allow shrinking further than silicon can, beyond the 1nm barrier, but the minimum size limit would only be pushed back a little until we are constrained by the size of carbon atoms too.
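If you want to sanity-check those numbers, here's a quick back-of-the-envelope script (my own sketch, using the exact speed of light rather than the rounded 300,000,000 m/s above; real on-chip signals travel slower than this, so these are upper bounds):

```python
# How far light travels in one clock cycle at various frequencies.
C = 299_792_458  # speed of light in a vacuum, m/s

for freq_ghz in (1, 10, 20, 100, 1000):
    cycle_time_s = 1 / (freq_ghz * 1e9)   # seconds per clock cycle
    distance_cm = C * cycle_time_s * 100  # distance covered in one cycle, in cm
    print(f"{freq_ghz:>5} GHz -> {distance_cm:7.3f} cm per cycle")
```

That works out to roughly 30cm at 1GHz, 3cm at 10GHz, 1.5cm at 20GHz, 3mm at 100GHz, and 0.3mm at 1THz, which is where the die-size limits above come from.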

 

At that point, the entire foundation of computer technology would need a successor that is no longer based on semiconductor transistors in order to advance any further, such as quantum computing. Or black hole computing xD That's a fun one...


But what about those processors that would use light instead of electricity to transfer data? I can't remember what they're called, photonic processors? Anyway, like I said, it's a processor which would use light to process data instead of electricity. That would eliminate crosstalk interference and drastically reduce heat output, and it would enable the use of more light beams in parallel. That's all I remember; it's a concept, but a very plausible one.

Read Glen's post.

 

At that point, the entire foundation of computer technology would need a successor that is no longer based on semiconductor transistors in order to advance any further, such as quantum computing. Or black hole computing xD That's a fun one...

That is pretty much it.

AAAAAAAAAGGGGGGGHHHHH!!!!


We'll be doing multiple activities at once in the future, so I'd reckon that for commercial use we'll see more cores and possibly lower clock speeds to save power.


http://www.technologyreview.com/view/518426/how-to-save-the-troubled-graphene-transistor/

 

Took me a second to find it again, but 400GHz transistors. Very simple transistors, but still. I don't see silicon being viable much after 2020, so they are going to have to find something to replace it eventually.


My guess is that we will have quantum computers before we even get near to those figures.

We are so close to making a breakthrough in quantum computing. My mum works at UNSW and she's always like QUANTUM COMPUTERS ARE THE FUTURE (yet she knows like nothing about hardware in PCs).

But yeah, we are close to a breakthrough.


Don't forget that as CPU speed increases, the memory-CPU performance gap also increases. Throwing more on-die cache at the problem won't help because that would require a larger die.
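To put a rough number on that gap (my own illustration, assuming a ballpark ~60ns main-memory latency that stays flat while the clock speed climbs):

```python
# Cycles lost per main-memory access as clock speed increases, assuming
# DRAM latency stays roughly constant (~60 ns is an assumed ballpark figure).
DRAM_LATENCY_NS = 60

for freq_ghz in (1, 4, 10, 100, 1000):
    cycle_ns = 1 / freq_ghz                    # nanoseconds per clock cycle
    stall_cycles = DRAM_LATENCY_NS / cycle_ns  # cycles spent waiting on one access
    print(f"{freq_ghz:>5} GHz CPU: ~{stall_cycles:>8,.0f} cycles per DRAM access")
```

At 4GHz that's already a few hundred wasted cycles per cache miss; at 100GHz or 1THz it would be thousands to tens of thousands, which is why cranking the clock alone doesn't help if the memory can't keep up.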


Basically, when you get up to these frequencies, the propagation speed of electrical signals actually matters. Light travels about 300,000,000 meters in 1 second. So, if you have a 1GHz CPU, doing 1,000,000,000 cycles in 1 second, light can travel 0.3m (30cm) before the next CPU operation occurs. In a 10GHz CPU, light could only travel 3cm before the next CPU cycle occurs. That means the maximum distance (so, the diagonal) across a 10GHz CPU die can theoretically be no greater than 3cm, and less in practice, though perhaps it could be stretched a bit with some clever engineering, ensuring that one side of the CPU is never required to signal to the opposite corner or something... It would be a whole new level (nightmare) in CPU architecting... Anyway, the limit would be reduced to 1.5cm for a 20GHz CPU, 3mm for a 100GHz CPU, and so on.

So, to increase clock speeds we would be limited to a very small die size, and we would basically rely on further transistor gate shrinks to fit more transistors in that maximum area. Transistors, though, are approaching the scale where the size of atoms, along with electron leakage, restricts any further gate shrinks. Carbon atoms, such as in graphene, are smaller than silicon atoms, so using graphene would allow shrinking further than silicon can, beyond the 1nm barrier, but the minimum size limit would only be pushed back a little until we are constrained by the size of carbon atoms too.

 

At that point, the entire foundation of computer technology would need a successor that is no longer based on semiconductor transistors in order to advance any further, such as quantum computing. Or black hole computing xD That's a fun one...

 

Further, it is not solely the size of the atoms that constrains the size of the classical transistor. At smaller and smaller transistor sizes, the distance between the source and the drain of the transistor becomes so small that it gets harder and harder to control the flow of current due to quantum tunneling.
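To put some very rough numbers on that (a sketch of my own, using a simple WKB-style estimate for a 1eV rectangular barrier, not a real device model):

```python
import math

# Approximate electron tunneling probability through a rectangular barrier:
# T ~ exp(-2 * kappa * d), with kappa = sqrt(2 * m * (V - E)) / hbar.
HBAR = 1.054571817e-34   # J*s
M_E  = 9.1093837015e-31  # kg, electron mass
EV   = 1.602176634e-19   # J per eV

def tunneling_probability(width_nm, barrier_ev=1.0):
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

for d in (5.0, 2.0, 1.0, 0.5):  # barrier widths in nm
    print(f"{d:>4.1f} nm barrier -> T ~ {tunneling_probability(d):.1e}")
```

Going from a 5nm barrier to a 1nm one raises the tunneling probability by something like 18 orders of magnitude in this toy model, which is why leakage current becomes such a headache at tiny gate lengths.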

 

However, I have heard of some promising research using photonics (much faster clock speeds, less electromagnetic crosstalk, significantly lower temperatures, etc.).

 

We are so close to making a breakthrough in quantum computing. My mum works at UNSW and she's always like QUANTUM COMPUTERS ARE THE FUTURE (yet she knows like nothing about hardware in PCs).

But yeah, we are close to a breakthrough.

 

Quantum computers are not a replacement for classical computers. In fact, on heavily serialized workloads, a quantum computer will perform far worse than a classical one. Quantum computers are built for parallelism, and that is where they shine (understatement).
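One standard example of the kind of speedup quantum computers do offer (my own illustration, not something from the posts above) is unstructured search, where Grover's algorithm needs only on the order of the square root of the number of items in queries:

```python
import math

# Query counts for searching N unsorted items: ~N classical queries in the
# worst case vs ~(pi/4)*sqrt(N) for Grover's algorithm. These are query
# counts only, not wall-clock time on real hardware.
def classical_queries(n):
    return n

def grover_queries(n):
    return math.ceil(math.pi / 4 * math.sqrt(n))

for n in (10**6, 10**9, 10**12):
    print(f"N = {n:>15,}: classical ~{classical_queries(n):>15,} "
          f"vs Grover ~{grover_queries(n):>9,} queries")
```

For a serial, branch-heavy workload there is no such structure to exploit, which is exactly the point above.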

