The End of CPU Advancement on our Doorstep (Moore's Law and the 7nm Barrier) Discussion

The argument is moot because, by the time we actually reach the point discussed by the OP, processing will be done off-site in the cloud and streamed to the home device.


13 minutes ago, veli2501 said:

The argument is moot because, by the time we actually reach the point discussed by the OP, processing will be done off-site in the cloud and streamed to the home device.

What makes you think that?

As far as my knowledge extends, there isn't any evidence to suggest that asynchronous wide-area network speeds will ever approach what's possible on-die for *most* applications. Perhaps you should look into network-on-chip (NoC) designs.
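For a sense of scale, here's a rough back-of-the-envelope sketch. The figures are my own order-of-magnitude assumptions (typical published ballpark latencies), not measurements of any particular system:

```python
# Order-of-magnitude latency comparison (rough, illustrative figures only).
# Shows why moving latency-sensitive work off-die and across a WAN is a hard sell:
# a network round trip is millions of times slower than on-die communication.

latencies_ns = {
    "L1 cache hit (on die)": 1,                      # ~1 ns
    "L3 cache hit (on die)": 40,                     # tens of ns
    "Main memory (DRAM)": 100,                       # ~100 ns
    "WAN round trip to a data centre": 30_000_000,   # ~30 ms, varies wildly
}

baseline = latencies_ns["L1 cache hit (on die)"]
for name, ns in latencies_ns.items():
    print(f"{name:32s} {ns:>12,} ns  ({ns / baseline:>12,.0f}x an L1 hit)")
```

Even if the link had unlimited bandwidth, that latency gap is what keeps interactive workloads local.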

ENCRYPTION IS NOT A CRIME


Quantum computers 

Corsair 4000D RGB

Asus B550 Tuf Gaming II

Asus 7700XT Tuf Gaming

AMD 5600x3d

32GB 3200MHz G.Skill

 


18 hours ago, veli2501 said:

The argument is moot because, by the time we actually reach the point discussed by the OP, processing will be done off-site in the cloud and streamed to the home device.

Yeah, because we have such nice internet infrastructure. Cloud computing will never take off until everyone has a decent connection. The fact is, there are millions of Americans stuck on slow-as-shit DSL or satellite connections. Lucky for me, I'm not one of them.

 

 

 

Now my question is: why don't they just make CPUs physically bigger and add more cores? I'm sure they'll eventually find a new material to build CPUs out of.

I just want to sit back and watch the world burn. 


16 minutes ago, Donut417 said:

make CPUs physically bigger and add more cores

To be perfectly honest, this is what I'm expecting.

 

Who knows, in the next 10 years or so, consumer-grade chips might be the same physical size as the Extreme Edition chips.

Quote or tag me( @Crunchy Dragon) if you want me to see your reply

If a post solved your problem/answered your question, please consider marking it as "solved"

Community Standards // Join Floatplane!


19 minutes ago, Donut417 said:

Now my question is: why don't they just make CPUs physically bigger and add more cores? I'm sure they'll eventually find a new material to build CPUs out of.

Probably cuts into their profit margins since yields are lower.


Just now, tjcater said:

Probably cuts into their profit margins since yields are lower.

Have you seen Threadripper?

I just want to sit back and watch the world burn. 


19 hours ago, veli2501 said:

The argument is moot because, by the time we actually reach the point discussed by the OP, processing will be done off-site in the cloud and streamed to the home device.

Internet infrastructure aside, the total amount of processing still has to happen somewhere, which means data centre CPUs would have to take on all the extra load.

20 minutes ago, Donut417 said:

Now my question is: why don't they just make CPUs physically bigger and add more cores? I'm sure they'll eventually find a new material to build CPUs out of.

 

I don't think much bigger dies are possible, given power density and the difficulty of getting the heat out. More dies on one substrate looks like the answer, as both AMD and Intel have played around with that.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


Just now, M.Yurizaki said:

Threadripper is two Zeppelin dies (I mean, four if you want to include the dead ones) on a single package. It's not a single CPU.

 

I can't wait to see this in GPUs in the next few years.


On 3/16/2018 at 7:41 AM, wcreek said:

Hasn't it been known for some years now that silicon by itself is getting close to the point where gallium, or some gallium compound, will be needed to get smaller than 7nm?

Samsung has a 10nm process. It's interesting that Intel has been struggling to get below 14nm.

AMD has plans for a 12nm and eventually a 10nm process. I'm guessing Samsung will likely be the fab for at least the future 10nm chips. I don't know if TSMC or GloFo can go below 14nm yet, since I'm guessing those are still AMD's two major fabs.

 

Edit: Not a lot of info on the foundries that fab AMD chips lol

But it looks like GloFo and TSMC will be able to get small enough.

If I had to guess, Intel is having a difficult time getting 10nm to reach clock speeds sufficient for desktop and high-end laptop use.

 

That's the primary difference I can think of between what Intel's 10nm is aimed at and what Samsung's 10nm is aimed at.

 

My eyes see the past…

My camera lens sees the present…


5 minutes ago, tjcater said:

I can't wait to see this in GPUs in the next few years.

There have been GPUs with more than one processing unit on a single card, which is essentially Threadripper in GPU form.

Quote or tag me( @Crunchy Dragon) if you want me to see your reply

If a post solved your problem/answered your question, please consider marking it as "solved"

Community Standards // Join Floatplane!


24 minutes ago, Donut417 said:

Now my question is why dont they just make the CPU's physically bigger and just add more cores? Im sure eventually they will find a new material to build CPU's out of. 

There are a few problems with this:

  • The power density of CPUs is very large, on the order of something like 290 W per square inch. The square-cube principle makes cooling processors that grow in size a lot harder.
  • A larger die means fewer processors per silicon wafer. Considering how complicated processors are, and thus how likely a manufacturing defect is, it's in the manufacturer's best interest not to have larger dies (see the sketch after this list).
  • Multi-module approaches can be done, but they have their own headaches. Threadripper needs a special mode that shoves games onto one die, because games are not designed for NUMA-based architectures. Sure, that's a software problem, but it may not even benefit games to be NUMA-aware.
  • More cores don't automatically improve performance. Almost every application you use in everyday life is I/O-bound, meaning it doesn't matter how many processor cores you throw at it; performance won't improve.
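Here is a minimal sketch of the yield point in the list above, assuming a simple Poisson defect model and made-up numbers for wafer size and defect density (illustrative only, not real fab data):

```python
import math

# Toy model of why bigger dies hurt the economics.
# Yield is modelled as exp(-defect_density * die_area), a common first-order
# approximation; the defect density and wafer size are assumptions.

WAFER_DIAMETER_MM = 300
DEFECTS_PER_CM2 = 0.1  # assumed defect density

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude estimate: usable wafer area / die area, ignoring edge losses."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

def poisson_yield(die_area_mm2: float) -> float:
    """Fraction of dies expected to be defect-free."""
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100.0)

for area in (100, 200, 400, 800):  # die area in mm^2
    good = dies_per_wafer(area) * poisson_yield(area)
    print(f"{area:4d} mm^2 die: ~{dies_per_wafer(area):4d} candidates, "
          f"yield {poisson_yield(area):4.0%}, ~{good:5.0f} good dies per wafer")
```

Even with these toy numbers, doubling the die area roughly halves the number of candidate dies and cuts the yield on top of that, which is why giant monolithic dies end up as expensive halo parts.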

Just now, Crunchy Dragon said:

There have been GPUs with more than one processing unit on a single card, which is essentially Threadripper in GPU form.

Whoops, forgot about the x2 GPUs. I was just excited thinking of MCM.


Just now, tjcater said:

Whoops, forgot about the x2 GPUs. I was just excited thinking of MCM.

Remember the R9 295X2? That was a beast of a card; I remember Linus had to borrow Austin Evans' sample for their review xD

Quote or tag me( @Crunchy Dragon) if you want me to see your reply

If a post solved your problem/answered your question, please consider marking it as "solved"

Community Standards // Join Floatplane!


1 minute ago, tjcater said:

Whoops, forgot about the x2 GPUs. I was just excited thinking of MCM.

 

Just now, Crunchy Dragon said:

Remember the R9 295X2? That was a beast of a card; I remember Linus had to borrow Austin Evans' sample for their review xD

How could you forget this monstrosity?

[Image: 3dfx Voodoo 5 6000]


Just now, M.Yurizaki said:

There are a few problems with this:

  • The power density of CPUs is very large, on the order of something like 290 W per square inch. The square-cube principle makes cooling processors that grow in size a lot harder.
  • A larger die means fewer processors per silicon wafer. Considering how complicated processors are, and thus how likely a manufacturing defect is, it's in the manufacturer's best interest not to have larger dies.
  • Multi-module approaches can be done, but they have their own headaches. Threadripper needs a special mode that shoves games onto one die, because games are not designed for NUMA-based architectures. Sure, that's a software problem, but it may not even benefit games to be NUMA-aware.
  • More cores don't automatically improve performance. Almost every application you use in everyday life is I/O-bound, meaning it doesn't matter how many processor cores you throw at it; performance won't improve.

But what if this is the only answer they have for a few years? It's not like Intel or AMD can go years and years without bringing out a new chip. AMD has kind of done that, and it didn't work out well. The fact is, carbon nanotubes aren't there yet and quantum computing isn't there either. I don't see any other choice. Software devs will have to get off their lazy asses and learn how to code for these new chips.

I just want to sit back and watch the world burn. 


8 minutes ago, Donut417 said:

But what if this is the only answer they have for a few years? It's not like Intel or AMD can go years and years without bringing out a new chip. AMD has kind of done that, and it didn't work out well. The fact is, carbon nanotubes aren't there yet and quantum computing isn't there either. I don't see any other choice. Software devs will have to get off their lazy asses and learn how to code for these new chips.

Even if we make CPUs bigger with more cores, Amdahl's Law and Gustafson's Law will kick in. And even then, you can only improve an algorithm so much before improving it further becomes a monumental task that isn't worth it, unless you'd like to pay for it.
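For anyone curious, a minimal sketch of what Amdahl's Law implies (the parallel fractions below are arbitrary examples, not measurements of any real workload):

```python
# Amdahl's Law: speedup(N) = 1 / ((1 - p) + p / N),
# where p is the fraction of the work that can be spread across N cores.

def amdahl_speedup(p: float, n_cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / n_cores)

for p in (0.50, 0.90, 0.99):
    row = ", ".join(f"{n} cores: {amdahl_speedup(p, n):.1f}x" for n in (4, 16, 64, 1024))
    print(f"parallel fraction {p:.0%} -> {row}")
```

Even with 99% of the work parallelized, 1024 cores top out around a 91x speedup, and a 90%-parallel workload can never exceed 10x no matter how many cores you add.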

 

Quantum computing won't really solve anything. It's more beneficial for a class of problems, not every problem.

 

And as much as software developers would like to get off their "lazy butts", there are a ton of other issues that have to be solved beyond just writing the code. And again, some applications simply aren't sensitive enough to CPU performance. Open up your task manager; I'm almost certain all but a handful of apps aren't consistently using more than 1% of CPU time.

 

If the solution were that easy, then none of us would be having this conversation. But the problem is the solution isn't that easy.

Edited by M.Yurizaki

7 minutes ago, Donut417 said:

Software devs will have to get off their lazy asses and learn how to code for these new chips.

There is a limit to what can be done, especially in tasks where each step depends on the previous output for its input; in games this is often the case with AI and physics. Granted, some things could be optimized for better core utilization, but there is a limit and a large factor of diminishing returns (and a high potential for stuttering, too).
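As a toy illustration of that kind of dependency (a made-up physics step, not code from any real engine), each iteration below needs the previous iteration's output before it can start, so the steps themselves can't be spread across cores:

```python
# Loop-carried dependency: frame N+1 cannot be computed until frame N is done,
# so adding cores doesn't speed up this chain at all.

def physics_step(state: float, dt: float) -> float:
    # toy integrator: the next state is a function of the current one
    return state + dt * (1.0 - state)

state = 0.0
for frame in range(5):
    state = physics_step(state, dt=0.1)  # must wait for the previous frame
    print(f"frame {frame}: state = {state:.4f}")
```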

 

Honestly, though, it isn't remotely worth the extra investment at the moment, so nothing will improve beyond standard incremental changes.

https://linustechtips.com/main/topic/631048-psu-tier-list-updated/ Tier Breakdown (My understanding)--1 Godly, 2 Great, 3 Good, 4 Average, 5 Meh, 6 Bad, 7 Awful

 


On 3/10/2018 at 10:55 AM, Scheer said:

I truly don't mean to sound like I'm attacking you here, but this whole thing is very close-minded IMO. Just because you can't think of how something can get better doesn't mean it can't get better. This is going to sound snide, but as your OP is snide itself, I find it fitting.

 

Advancement WILL slow down, I'm not arguing that, but it's not going to suddenly come to a halt. It will take longer to come out with CPUs that have smaller transistors, as history has shown, but eventually it will happen, or something else will come out that allows faster CPUs.

 

There was no reason to upgrade the Pony Express, then no reason to upgrade the telegraph, then no reason to upgrade phone lines, then bag phones were amazing and it couldn't possibly get better than that... do you see where I'm going with this? 20 years from now we will look back at the iPhone X, S9, etc. and laugh at how out of date they are.

 

I mean, 70 years ago we were able to make cars that drove faster than the speed limit, yet we kept improving them...

 

You mention flying cars will never exist because of how expensive they are going to be. Do you have any idea how expensive the first computers were? Any idea how much power they used? I'm sure glad they didn't give up there. Rockets are another great example: compare how much we spent first getting to space to what it takes now. I'll bet you $20 it's cheaper now.

 

If military/business tech moves forward, so does consumer tech; no rational business would get something newer/better/cheaper and decide not to market it to consumers.

 

Personally, I find it easy to see individual pixels on a 27" 1440p panel from ~25" away. Not that I'm going to complain about it right now, as it's not bad, but I'll surely upgrade at some point. My brother actually has a KU6500, and in my humble opinion it doesn't even come close to being as good as the not-even-that-good Dell S2716DG monitor I use for gaming.

 

The Xbox refresh just had more cores thrown onto it because it was WAY cheaper for them to do that than to redesign the console and possibly have to charge even more than $500 for it. If you honestly believe the Xbox One X is powerful enough and has nowhere to go, you are on the wrong forum.

 

 

Sorry for this being in such a random order; I kept facepalming and losing my spot.

I agree with all of this, but I believe you are missing the point, or at least some of it, or possibly just misinterpreting it.

 

I'm not even sure how to explain it, because in my head what you are saying makes sense for very good reasons, but what I'm saying also makes sense for very good reasons.

 

Ahhhhhhhhh!!! Lol, the technicalities are getting intense in here lol!

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


On 3/10/2018 at 2:26 AM, WallacEngineering said:

Yes, they are real vision scientists. I need to keep searching for another article I found a while back; it makes a lot more scientific sense than the article in the OP.

 

That makes perfect sense for 1080p, especially if it's a larger display. My old Sanyo TV was a 60" 1080p set; up close you could definitely see the pixels. However, you are never going to be constantly sitting that close to the display.

 

I'm talking about standard viewing distances: 5-10 feet away from a TV, maybe 2-3 feet away from a big monitor. It doesn't matter if you can see pixels from a few inches to a foot away from a screen if that's not how you're actually going to use the display. That just doesn't make sense as a comparison.

 

A 4K monitor will be a good upgrade for you from 1080p. But as soon as you do it, you will think to yourself, "wow, it's better, but it isn't game-changing."

 

Then one day you might upgrade to 8K, and I guarantee your first thought will be "wow, I wasted my money" lol.

 

By the way, you may not care about HDR and color support, but they make a MASSIVE difference to a display's quality.

 

My uncle still uses his 65" Sony Bravia 1080p TV in his basement. Man, what a beauty that thing is. He spent like 2 grand on it a long time ago, and honestly it looks nearly as good as my new 4K TV. That's just how good the Bravia is.

Not everyone uses TVs as monitors, bud... I sit maybe 2 feet away from my monitors, and they still look pixel-y AF... I think that in your situation, you are right: there is no difference. But the closer you are, the easier it is to tell the difference. 1080p used to cost a fortune, and now it costs 100 dollars. My friends have some 4K ultrawides, and they look so good compared to my 1440p one. Not from color, from pixel density.

Case: InWin 303 Motherboard: Asus TUF X570-Plus Processor: Ryzen R9-3900x GPU: Gigabyte RTX 3070 Ram: 32 GB DDR4 3000 MHZ

 PSU: Corsair CX750M Storage: 1TB Intel 660p NVMe SSD and a 2TB Seagate 7200RPM HDD Mouse: Logitech G600 Keyboard: Razer Blackwidow Ultimate 2014 Headphones: Steelseries Arctis 7 Audio: Shure PGA58 with a Focusrite Scarlett Solo 3rd Gen

 


4 hours ago, jtmoseley said:

Not everyone uses TVs as monitors, bud... I sit maybe 2 feet away from my monitors, and they still look pixel-y AF... I think that in your situation, you are right: there is no difference. But the closer you are, the easier it is to tell the difference. 1080p used to cost a fortune, and now it costs 100 dollars. My friends have some 4K ultrawides, and they look so good compared to my 1440p one. Not from color, from pixel density.

Well, you know, 4K 21:9 isn't really 4K yet. They call it 4K because it's roughly 4K wide by 1440 tall. So really it's "stretched 1440p" or "4K by 1440p" or whatever you want to call it (21:9 makes naming schemes quite difficult lol).

 

This is why earlier in the comments I said "well, there's one more non-VR resolution bump that MIGHT be meaningful that I forgot about: TRUE 4K ultrawide (~5000x2160)," which is actually 5K by... damn you, ultrawides...
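For reference, here's a quick pixel-count comparison of the resolutions being thrown around (these are standard panel resolutions; the grouping and labels are mine):

```python
# Pixel counts and aspect ratios for the panels under discussion.
resolutions = {
    "16:9 1440p (2560x1440)": (2560, 1440),
    '21:9 "4K" ultrawide (3440x1440)': (3440, 1440),
    "16:9 4K UHD (3840x2160)": (3840, 2160),
    "21:9 5K2K ultrawide (5120x2160)": (5120, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name:34s} {w * h / 1e6:5.1f} MP, aspect ratio {w / h:.2f}:1")
```

Going from 3440x1440 to 5120x2160 is roughly a 2.2x jump in pixel count, which is why "true 4K ultrawide" is a much bigger ask of the GPU.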

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


8 hours ago, Donut417 said:

Yeah, because we have such nice internet infrastructure. Cloud computing will never take off until everyone has a decent connection. The fact is, there are millions of Americans stuck on slow-as-shit DSL or satellite connections. Lucky for me, I'm not one of them.

 

 

 

Now my question is: why don't they just make CPUs physically bigger and add more cores? I'm sure they'll eventually find a new material to build CPUs out of.

I'm stuck at 10 Mb/s down and 2 Mb/s up (internet included with rent).

 

I can barely stream 4K video :(
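Rough arithmetic on why that hurts (the bitrates below are commonly quoted streaming recommendations, in the ballpark of what the big services publish, not measurements of my connection):

```python
# Does a given stream fit in a 10 Mb/s connection? (Illustrative bitrates.)
connection_mbps = 10

typical_stream_mbps = {
    "1080p stream": 5,
    "4K stream": 25,
}

for name, mbps in typical_stream_mbps.items():
    verdict = "fits" if mbps <= connection_mbps else "does not fit"
    print(f"{name}: needs ~{mbps} Mb/s, {verdict} in a {connection_mbps} Mb/s connection")
```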

 

EDIT: Oh, also, I updated the OP with a section discussing the idea of physically larger CPUs.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 

