
The End of CPU Advancement on our Doorstep (Moore's Law and the 7nm Barrier) Discussion

4 minutes ago, straight_stewie said:

You know, a lot of people think Moore's law will end when we hit the smallness limit of transistors. Unfortunately for those people, there are two ways to increase the number of transistors on a die...

These are all very good points, probably the best explanation/argument I have seen so far. There are, however, teething issues with these ideas as well.

 

For one, as the articles specifically state, transistors smaller than the 7nm mark are going to be next to impossible, so that idea just doesn't work.

 

Splitting the die, hmmmm. Might help, but splitting the die means the two separate dies now need a way to communicate with one another. AMD's Infinity Fabric seems to work well, but it is slower than just having a single physical die. Therefore, if the communication bridge isn't any better than the CPU enhancements themselves, then it's redundant to split the die. I suppose we can hope that Infinity Fabric improves with time.

 

I will agree that exploring new ISAs is probably the only real solution, at least for now. Maybe we could see a new ISA that enables our current-gen CPUs to double their throughput. This would solve the problem of CPU performance for quite a while, but then AMD and Intel would still have no physical way of improving silicon-based CPUs other than increasing the physical CPU size, which, again, poses quite a few issues.

 

Expecting manufacturers to create entirely different sockets, motherboards, CPU cooling mounts, and other accessories for CPUs that physically change every single year seems a bit unrealistic, but I suppose it is possible.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


6 minutes ago, WallacEngineering said:

Splitting the die, hmmmm. Might help, but splitting the die means the two separate dies now need a way to communicate with one another. AMD's Infinity Fabric seems to work well, but it is slower than just having a single physical die. Therefore, if the communication bridge isn't any better than the CPU enhancements themselves, then it's redundant to split the die. I suppose we can hope that Infinity Fabric improves with time.

Who said anything about splitting the die? Keep the small mask size, just make the die bigger... 
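
To put rough numbers on that idea: at a fixed process node, transistor count just scales with die area. A quick Python sketch, with a made-up density figure purely for illustration (none of these numbers come from the thread):

```python
# Rough sketch: at a fixed process node, transistor count scales with die area.
# The density and die sizes below are invented purely for illustration.

def transistor_count(width_mm: float, height_mm: float, density_per_mm2: float) -> float:
    """Estimate transistor count as die area times transistor density."""
    return width_mm * height_mm * density_per_mm2

density = 100e6  # hypothetical ~100 million transistors per mm^2

small = transistor_count(10, 10, density)   # 100 mm^2 die
large = transistor_count(20, 20, density)   # 400 mm^2 die

print(f"100 mm^2 die: {small:.2e} transistors")
print(f"400 mm^2 die: {large:.2e} transistors ({large / small:.0f}x)")
```

Doubling each dimension quadruples the area, and therefore the transistor budget, without shrinking the mask features at all (yield on huge dies is a separate problem, of course).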

ENCRYPTION IS NOT A CRIME


2 minutes ago, straight_stewie said:

Who said anything about splitting the die? Keep the small mask size, just make the die bigger... 

Oh, my bad, I read it wrong. You said "cut the size in half," not "cut the die in half."

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


23 minutes ago, WallacEngineering said:

Oh, my bad, I read it wrong. You said "cut the size in half," not "cut the die in half."

Well, the term "die" is somewhat ambiguous. It's used both to mean the size of the transistor, and to generally refer to the piece of silicon the circuit is "printed" on.

In reality, the term "die" just means the lithography mask that is used to "print" the parts onto the silicon. 

The term is wholly ambiguous, so I differentiated in a way that made sense to me. I suppose I could work on describing those differentiations a little better.

ENCRYPTION IS NOT A CRIME


13 minutes ago, straight_stewie said:

Well, the term "die" is somewhat ambiguous. It's used both to mean the size of the transistor, and to generally refer to the piece of silicon the circuit is "printed" on.

In reality, the term "die" just means the lithography mask that is used to "print" the parts onto the silicon. 

The term is wholly ambiguous, so I differentiated in a way that made sense to me. I suppose I could work on describing those differentiations a little better.

Nah, I wholly understand now, I just read the word wrong. I think your explanation was very detailed and correct.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


30 minutes ago, Canada EH said:

Some do, some don't. If you want extra heat for the winter, it's great.

Lol, that's what I liked about my old Sanyo TV. It put out so much heat that half of the time I would have to run a fan, even in the winter with a foot of snow outside my window.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


2 hours ago, Damascus said:

Never going to happen; Intel just won't release impressively better hardware. Mainstream chips will stay on 7nm and just get larger and larger. After a while, new technology will inevitably come to fruition (Intel will keep throwing billions of dollars at it until something works) and meaningful progress will resume.

I suppose Intel does have the funding to pretty much do whatever the hell they feel like.

 

"Got a problem?"; "Yea, we cant improve the next generation of CPUs, what should we do?"; "Heres 10 Billion USD, fix it, or your knee caps may be at risk"

 

Lol, good old Intel

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


I can definitely see the difference between 1440p and 4K on, for example, a 27" monitor.

So I don't know what the f you are talking about with that.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


Also, there are a lot of other games than FPS games.

Your opinion about VR is really subjective.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


21 minutes ago, Mihle said:

Also, there are a lot of other games than FPS games.

Your opinion about VR is really subjective.

Well, yes, I know. That's why I used words like "in my opinion" and "to me, this..."

 

The difference you are seeing in monitor resolution could also be better color support, fancy HDR stuff, or other things.

 

Even if 4K might look ever so slightly sharper to you, even THEN, would you find the difference enough to justify spending the money on a new monitor?

 

And even if people can claim that 4K makes a difference in comparison to 1440p (not sure how, when vision doctors and scientists say it's impossible, and I'm going to believe them over anyone else), so what? That technology has already been around for a few years. It's not like 4K is a new thing.

 

So what then? 8K next? Just how much longer are people going to buy into the marketing crap before they finally realize "oh hey, this makes no difference compared to my old display"?

 

Pretty much everyone already believes that 8K is a total gimmick and won't be worth the cost. So regardless of whether or not you think 4K is any better, we are still at the end of MEANINGFUL display resolution bumps.

 

EDIT: There MIGHT be one last meaningful display bump to come out that I forgot about.

 

Possibly TRUE 4K Ultrawide 21:9 (somewhere around 4900x2160 pixels)

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


2 minutes ago, WallacEngineering said:

Well, yes, I know. That's why I used words like "in my opinion" and "to me, this..."

 

The difference you are seeing in monitor resolution could also be better color support, fancy HDR stuff, or other things.

 

Even if 4K might look ever so slightly sharper to you, even THEN, would you find the difference enough to justify spending the money on a new monitor?

 

And even if people can claim that 4K makes a difference in comparison to 1440p (not sure how, when vision doctors and scientists say it's impossible, and I'm going to believe them over anyone else), so what? That technology has already been around for a few years. It's not like 4K is a new thing.

 

So what then? 8K next? Just how much longer are people going to buy into the marketing crap before they finally realize "oh hey, this makes no difference compared to my old display"?

 

Pretty much everyone already believes that 8K is a total gimmick and won't be worth the cost. So regardless of whether or not you think 4K is any better, we are still at the end of MEANINGFUL display resolution bumps.

SOME "scientist". I have seen a lot that doesnt say that. even if an scientist say something doesnt mean its true. if every scientist said it its something else.

I would say it's more than just a "slight" difference, but the word "slight" isn't really a specific amount, and its actual "amount" is subjective. And how much better something has to be to be worth it is also highly subjective.

 

At a normal distance (arm's length) on a 24" 1080p monitor I can see the pixels. Well, in the way that I can, for example, see that the lines in a W in text don't look like completely straight slanted lines; they look sort of zigzag.
I'm talking about pixels, not colour and HDR and stuff.

When I get a new computer at some point, I will upgrade to a 27" or 30" 4K monitor.


“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


25 minutes ago, Mihle said:

SOME "scientist"...

Yes, they are real vision scientists. I need to keep searching for another article I found a while back; it makes a lot more scientific sense than the article in the OP.

 

That makes perfect sense for 1080p, especially if it's a larger display. My old Sanyo TV was a 60" 1080p. Up close, you could definitely see the pixels. However, you are never going to be constantly sitting that close to the display.

 

I'm talking about standard viewing distance. So 5-10 feet away from a TV, maybe 2-3 feet away from a big monitor. You know, standard viewing distances. It doesn't matter if you can see pixels from a few inches to a foot away from a screen if that's not actually the way you are going to use the display. That just doesn't make any sense to compare.
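
As a rough way to check the viewing-distance argument, here is a short Python sketch of pixels-per-degree for a 27" panel at about 2.5 feet. The ~60 pixels-per-degree figure for 20/20 acuity is the usual rule of thumb, not something taken from the articles in the OP:

```python
import math

def pixels_per_degree(diagonal_in: float, res_w: int, res_h: int, distance_in: float) -> float:
    """Horizontal pixels per degree of visual angle for a 16:9 panel."""
    width_in = diagonal_in * res_w / math.hypot(res_w, res_h)  # physical panel width
    px_per_inch = res_w / width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))  # size of one degree at that distance
    return px_per_inch * inches_per_degree

for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    ppd = pixels_per_degree(27, w, h, 30)  # 27" panel viewed from ~30 inches
    print(f'27" {name} at 30 in: {ppd:.0f} pixels per degree')
```

With ~60 ppd as the rough 20/20 threshold, 27" 1440p at 2.5 feet lands right around the line and 4K lands past it, which is why the answer swings so much on how far away you actually sit.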

 

A 4K monitor will be a good upgrade for you from 1080p. But as soon as you do it, you will think to yourself, "wow, it's better, but it isn't game changing."

 

Then one day, you might upgrade to 8K and I guarantee you that your first thought will be "wow, I wasted my money" lol.

 

By the way, you may not care about HDR and color support, but they make a MASSIVE difference to a display's quality.

 

My uncle still uses his 65" Sony Bravia 1080p TV in his basement. Man, what a beauty that thing is. He spent like 2 grand on it a long time ago, and honestly it looks nearly as good as my new 4K TV. That's just how good the Bravia is.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


I have been speculating for a while now that Intel will move to a new architecture where they strip out anything that is not necessary (like legacy support and old instructions that aren't really needed) in order to streamline the design process.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


2 minutes ago, mr moose said:

I have been speculating for a while now that Intel will move to a new architecture where they strip out anything that is not necessary (like legacy support and old instructions that aren't really needed) in order to streamline the design process.

Hmmmm, probably. That would free up some space to work with. The only real problem would be software support. Man, devs are going to be pissed if Intel strips away something like x86 (32-bit) and runs on x64 only. That will suck REALLY hard lol.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


20 minutes ago, WallacEngineering said:

Hmmmm, probably. That would free up some space to work with. The only real problem would be software support. Man, devs are going to be pissed if Intel strips away something like x86 (32-bit) and runs on x64 only. That will suck REALLY hard lol.

I started a thread about this a while back in the CPU section; people expressed the same concerns, but the reality is: are we going to hold onto all this legacy forever? I think a new OS and a new CPU uArch (maybe purely 64-bit with only essential instructions) could be a start.

 

EDIT: I mean, people are going to have to accept that 32-bit x86 won't be around forever, and everyone should start planning. Who knows how many issues are lurking in the older stuff just waiting to be exploited. And we need CPU real estate if we can't find an alternative to silicon.
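
If anyone wants to gauge how much 32-bit legacy is actually still installed on a Linux box, here is a hedged little Python sketch that counts 32-bit vs 64-bit ELF binaries by reading the ELF class byte. The /usr/bin path is just an example directory, not anything specific from this thread:

```python
import os

def elf_class(path: str):
    """Return '32-bit', '64-bit', or None if the file is not an ELF binary."""
    try:
        with open(path, "rb") as f:
            header = f.read(5)
    except OSError:
        return None
    if len(header) < 5 or header[:4] != b"\x7fELF":
        return None
    return {1: "32-bit", 2: "64-bit"}.get(header[4])  # byte 4 is EI_CLASS

counts = {"32-bit": 0, "64-bit": 0}
for root, _dirs, files in os.walk("/usr/bin"):  # swap in any directory you want to survey
    for name in files:
        kind = elf_class(os.path.join(root, name))
        if kind in counts:
            counts[kind] += 1

print(counts)
```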

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


6 hours ago, WallacEngineering said:

Now, the reason 7nm seems to be the physical limit is because beyond 7nm, the transistors would produce more heat than the power they would output, making the idea of 5nm CPUs an oxymoron, or paradox.

What do you measure "the power a CPU outputs" in, Watts? A CPU simply outputs heat. It's not like nuclear fusion, where you can measure energy spent vs. energy generated.

The limits of shrinking transistors have to do with "quantum tunneling" occurring at small enough scales, which nullifies the role of the transistor (since the electrons can get through regardless of the 0/1 state of the transistor). 
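
For a feel of why that matters, here is a hedged back-of-the-envelope Python sketch using the simple rectangular-barrier approximation T ≈ exp(-2kd) with k = sqrt(2mΦ)/ħ. The 1 eV barrier height and the equating of the "node name" with the barrier width are deliberate simplifications for illustration only:

```python
import math

HBAR = 1.0545718e-34    # reduced Planck constant, J*s
M_E  = 9.10938356e-31   # electron mass, kg
EV   = 1.602176634e-19  # joules per electron-volt

def tunneling_probability(barrier_ev: float, width_nm: float) -> float:
    """Crude rectangular-barrier estimate: T ~ exp(-2 * k * d)."""
    k = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * k * width_nm * 1e-9)

for width in (7, 5, 3, 1):
    print(f"{width} nm barrier: T ~ {tunneling_probability(1.0, width):.1e}")
```

The point is the exponential sensitivity: every nanometre shaved off the barrier multiplies the leakage probability by a large factor, which is why tunneling leakage, rather than "heat vs. power output", is what eventually kills further shrinks.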

(btw, in any case being inefficient wouldn't be an oxymoron nor a paradox).

 

6 hours ago, WallacEngineering said:

Now you may be thinking: "Well, why not just increase the physical size of CPUs so more 7nm transistors can fit?" Well, that is technically a great idea, but it's got a few issues. One, there needs to be room between the CPU and RAM for cooler mount clearance, so going much bigger than Threadripper isn't really possible

I don't see why: the distance between CPU and RAM can be kept constant regardless.

6 hours ago, WallacEngineering said:

and if we start pushing the RAM slots out, then we start changing the ATX form factor standard, which, trust me, isn't going to happen.

And no one will ever need more than 640K of memory....

 

6 hours ago, WallacEngineering said:

This would mean all cases, power supplies, and other accessories would need to be completely redesigned to fit these new motherboards,

You don't need to re-design power supplies just because you changed the size of the motherboard. Right now, we have ITX, mATX, ATX, E-ATX, BTX, and all the custom OEM and server boards, and while specialized PSUs are made when needed, you can mostly use the same PSU with any of them. PSU incompatibility arises when you change the power feed needs and layout, not the shape of the motherboard.

Other than that, manufacturers would love to have a good excuse to change standards and make sure everyone needs a new version of anything, rather than recycling any component from one build to the other. As we speak, Microsoft tries to make sure you can't use newer CPUs if you don't migrate to Win 10, Intel tells you you need Kaby Lake for certain video streams or Optane, and the PSU ATX standard is on its 2.3x revision. Some revision just introduced support for I don't remember which CPU power state, and boy PSU manufacturers couldn't wait to put "Haswell ready!" all over their packaging :P 

 

6 hours ago, WallacEngineering said:

and all this work would be done for a technology that will only last a few more years anyway. The largest issue, however, would be heat output. The current i7-8700K is already a monster of a heat producer, and that's just a mainstream CPU. Imagine a CPU with more than double the 8700K's processing power!

That largely depends on the design, though. For example, Threadripper's size makes it much easier to dissipate heat, and while it does require more cooling than an i7, normalized by performance (say, heat per Cinebench point :P) it is substantially lower. And the i7 itself only gets that hot when overclocked; at stock it's not a big deal. If anything, transistor shrink has reduced heat output over time, certainly heat per performance unit. Custom loops for consumer-grade hardware have largely gone from cooling solutions to cosmetic devices.
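
To make the "normalized by performance" point concrete, a tiny Python sketch with hypothetical wattage and Cinebench scores (the numbers are invented for illustration; substitute real measurements):

```python
# Hypothetical figures purely to illustrate "heat per Cinebench point";
# the power draws and scores below are made up, not measured.
chips = {
    "mainstream i7 (stock)": {"watts": 95,  "score": 1400},
    "16-core Threadripper":  {"watts": 180, "score": 3000},
}

for name, d in chips.items():
    print(f"{name}: {d['watts'] / d['score']:.3f} W per Cinebench point")
```

The bigger chip dumps more total heat into the case, but per unit of work it can come out ahead, and its larger die also gives the cooler more surface area to pull that heat through.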

 

6 hours ago, WallacEngineering said:

There is more evidence that we can observe for ourselves that the inevitable end is approaching. Let's take a look at console gaming. Why do you suppose that the Xbox One and PS4 are still around? Why did Microsoft rename the One the "X" and simply fit the console with a few more Jaguar cores and AMD graphics compute units, rather than create an entirely new platform as they have in the past? Well, this is simply because the technology has NOWHERE TO GO, it's over!

You have to be joking. Consoles are so far behind the technological frontier that the only reasons not to update them are keeping them relatively cheap and the limited demand in a market that isn't based on counting FPS (basically, a market that doesn't care about increased computational power anyway).

 

6 hours ago, WallacEngineering said:

Now let's take a look at automobiles.

(...)

Now let's look at current displays.

I struggle to see the link to "the end of CPU Advancement" o.O

 

6 hours ago, WallacEngineering said:

Build a badass $3000+ PC around 2020 using a 7nm CPU and make it last you for the next 20 years as technology comes to a standstill.

Now you are adding assumptions: your premise is not only that nothing better will ever be developed, but also that manufacturing technology will stall, so we won't be able to reduce the cost of manufacturing the same thing either. Even if I accepted the first premise, I would see no basis for the second. Building a $3000 home PC will probably be as bad an idea as it ever was.

 


33 minutes ago, WallacEngineering said:

Yes, they are real vision scientists. I need to keep searching for another article I found a while back; it makes a lot more scientific sense than the article in the OP.

 

That makes perfect sense for 1080p, especially if it's a larger display. My old Sanyo TV was a 60" 1080p. Up close, you could definitely see the pixels. However, you are never going to be constantly sitting that close to the display.

 

I'm talking about standard viewing distance. So 5-10 feet away from a TV, maybe 2-3 feet away from a big monitor. You know, standard viewing distances. It doesn't matter if you can see pixels from a few inches to a foot away from a screen if that's not actually the way you are going to use the display. That just doesn't make any sense to compare.

 

A 4K monitor will be a good upgrade for you from 1080p. But as soon as you do it, you will think to yourself, "wow, it's better, but it isn't game changing."

 

Then one day, you might upgrade to 8K and I guarantee you that your first thought will be "wow, I wasted my money" lol.

I do sit at arm's length from my monitor at all times. Like, if I reach my arm out, my index fingertip touches the screen: about 66 cm. Totally normal viewing distance. And that's also the distance where diagonal lines, like the lines in a W, don't look completely straight but more zigzag. It's a 24" 1080p and I can see the zigzag; a 27" 1440p wouldn't be that much better, just a little bit.
 

Of course it isn't game changing; almost nothing is, and it doesn't need to be either. But it IS a noticeable difference, and when I get a stable, predictable income I will get it. I personally think it would be worth it.


By the way, a 24" screen at about 61 cm distance is about the same as a 120" one at 3 meters.
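
That equivalence checks out if you compare visual angles; a quick Python sketch, assuming both are 16:9 panels:

```python
import math

def visual_angle_deg(diagonal_in: float, distance_m: float) -> float:
    """Horizontal visual angle of a 16:9 screen, in degrees."""
    width_m = (diagonal_in * 0.0254) * 16 / math.hypot(16, 9)
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

print(f'24" at 0.61 m: {visual_angle_deg(24, 0.61):.1f} degrees wide')
print(f'120" at 3.0 m: {visual_angle_deg(120, 3.0):.1f} degrees wide')
```

Both come out at roughly 47-48 degrees of horizontal field of view, so the pixel-visibility comparison holds as long as the resolutions match.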

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


Oops, ignore/delete this post.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


9 minutes ago, SpaceGhostC2C said:

I don't see why: the distance between CPU and RAM can be kept constant regardless.

 

Not only that, but back in the P3 days Intel made Slot 1 so they could mount larger external cache (that was purely a size issue), as well as motherboards that ran two P3 chips (because back then two chips were a more viable option than increasing core count). I don't see why we can't go back down that path if the chip can't be made better on the inside, and have a dual i5/i7 or dual Ryzen 7 mobo.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, SpaceGhostC2C said:

What do you measure "the power a CPU outputs" in, Watts? A CPU simply outputs heat. It's not like nuclear fusion, where you can measure energy spent vs. energy generated.

The limits of shrinking transistors have to do with "quantum tunneling" occurring at small enough scales, which nullifies the role of the transistor (since the electrons can get through regardless of the 0/1 state of the transistor). 

(btw, in any case being inefficient wouldn't be an oxymoron nor a paradox).

 

I don't see why: the distance between CPU and RAM can be kept constant regardless.

And no one will ever need more than 640K of memory....

 

You don't need to re-design power supplies just because you changed the size of the motherboard. Right now, we have ITX, mATX, ATX, E-ATX, BTX, and all the custom OEM and server boards, and while specialized PSUs are made when needed, you can mostly use the same PSU with any of them. PSU incompatibility arises when you change the power feed needs and layout, not the shape of the motherboard.

Other than that, manufacturers would love to have a good excuse to change standards and make sure everyone needs a new version of anything, rather than recycling any component from one build to the other. As we speak, Microsoft tries to make sure you can't use newer CPUs if you don't migrate to Win 10, Intel tells you you need Kaby Lake for certain video streams or Optane, and the PSU ATX standard is on its 2.3x revision. Some revision just introduced support for I don't remember which CPU power state, and boy PSU manufacturers couldn't wait to put "Haswell ready!" all over their packaging :P 

 

That largely depends on the design, though. For example, Threadripper's size makes it much easier to dissipate heat, and while it does require more cooling than an i7, normalized by performance (say, heat per Cinebench point :P) it is substantially lower. And the i7 itself only gets that hot when overclocked; at stock it's not a big deal. If anything, transistor shrink has reduced heat output over time, certainly heat per performance unit. Custom loops for consumer-grade hardware have largely gone from cooling solutions to cosmetic devices.

 

You have to be joking. Consoles are so far behind the technological frontier that the only reasons not to update them are keeping them relatively cheap and the limited demand in a market that isn't based on counting FPS (basically, a market that doesn't care about increased computational power anyway).

 

I struggle to see the link to "the end of CPU Advancement" o.O

 

Now you are adding assumptions: your premise is not only that nothing better will ever be developed, but also that manufacturing technology will stall, so we won't be able to reduce the cost of manufacturing the same thing either. Even if I accepted the first premise, I would see no basis for the second. Building a $3000 home PC will probably be as bad an idea as it ever was.

 

Yes, consoles are behind, but there is a reason for this: they only cost $500. If they want to go any further in the next few years, they are probably going to have to raise the prices, probably by a lot, and then people won't really want them anymore. The Xbox One X already does 4K gaming smoothly, so how much farther can they go anyway?

 

Yeah, Threadripper is larger and therefore cools better than other CPUs, but that theory has its limits as well.

 

Yes, you would have to redesign power supplies. Not how they work, but their form factor, to match new mounting holes on new cases designed for new-size motherboards.

 

I'm gonna add this next part to the OP, but since when do we absolutely need new CPUs anyway? Your old 2000- or 3000-series i7 is more than enough computing power for the games or software of today. My Phenom II X4 970 is STILL enough for everything but gaming. The only reason I even picked up my Ryzen 5 1600X is because my current system fails Fire Strike physics tests.

 

There are more problems too. How do we create VRMs strong enough to feed more power-hungry CPUs that grow in physical size past Threadripper? How do we route all the traces when a CPU is so large that it takes up all the available space? Where will we move the first PCIe slot to? Where will we move the RAM slots to?

 

If we are stuck at 7nm, then efficiency can't increase, so power draw will get higher and higher on these larger and larger CPUs, along with heat output. So how do we effectively cool them using traditional and affordable methods once they put out so much heat that even custom water loops struggle with it?

 

Yes, my guesses at price are just that: guesses. It's the future; nobody knows what the pricing will be. Why do you think I said "this part is purely my own speculation"? Look at it this way: the guy from GlobalFoundries in that YouTube video stated that they were looking into particle accelerator technology to help manufacture smaller transistors. Do you really think that a CPU made with the help of a particle accelerator is going to be cheap? Probably not EVER.

 

Sure, manufacturers would love to make us do all of those specialized things, but do you really think any of us who know better will fall for it and actually spend all that money, knowing that the improvements are minimal and the compromises vast, needing to switch to an entirely new form factor with very limited support for things like radiators and CPU air coolers? I know I wouldn't.

 

Comparisons to automobiles and displays are good ways to look at technology advancement not only because they scale with CPUs in the same way, but also because they are related. Automobiles have used computers ever since fuel injection was invented. That's what an ECU is: effectively a computer that controls ignition timing, fuel injection, throttle response, etc. So if microchips can't improve due to physical limitations, neither can ECUs.

 

OK, so perhaps my wording is a little bit off. Here is another article explaining why going further than 7nm is an impossibility:

 

[Attached screenshot: Screenshot_Chrome_20180310-021816.png]

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


OK, OP edited with a few more bits of information.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


Dude, you're getting hung up on 7/5nm way too much. I'm willing to bet right now that before any kind of price inflation happens we're going to see die size go up, followed by more efficient CPU designs, followed by a massive surge into some new technology (quantum computing, perhaps).

 

There will never be a $1000 Pentium (in 2018 dollars, accounting for inflation of course) because no one will buy it and Intel only cares about sales. They are more likely to stop selling new CPUs for a decade than sell ludicrously overpriced ones.

Want to custom loop?  Ask me more if you are curious

 


I truly don't mean to sound like I'm attacking you here, but this whole thing is very close-minded IMO. Just because you can't think of how something can get better doesn't mean it can't get better. This is going to sound snide, but as your OP is snide itself, I find it fitting.

 

Advancement WILL slow down, I'm not arguing that, but it's not going to suddenly come to a halt. It will take longer to come out with CPUs that have smaller transistors, history has shown this, but eventually it will happen or something else will come out that allows faster CPUs.

 

There was no reason to upgrade the Pony Express, then there was no reason to upgrade the telegraph, then no reason to upgrade phone lines, then "bag phones are amazing, it can't possibly get better than this"... do you see where I am going with this? 20 years from now we will look back at the iPhone X, S9, etc., laughing at how out of date they are.

 

I mean, 70 years ago we were able to make cars that drove faster than the speed limit, yet we kept improving them...

 

You mention flying cars will never exist because of how expensive they are going to be, but do you have any idea how expensive the first computers were? Any idea how much power they used? I'm sure glad they didn't give up there. Rockets are another great example: compare how much we spent first getting to space to what it takes now. I'll bet you $20 it's cheaper now.

 

If military/business tech moves forward, so does consumer tech; no rational business would get something newer/better/cheaper and decide not to market it to consumers.

 

Personally, it is easy to see individual pixels on a 27" 1440p panel from ~25" away. Not that I'm going to complain about it right now, as it's not bad, but I'll surely upgrade at some point. My brother actually has a KU6500, and in my humble opinion it doesn't even come close to being as good as my not-even-that-good Dell S2716DG monitor I use for gaming.

 

The Xbox refresh just had more cores thrown onto it because it was WAY cheaper for them to do that than redesign the console and possibly have to charge even more than $500 for it. If you honestly believe the Xbox One X is powerful enough and has nowhere to go, you are on the wrong forum.

 

 

Sorry for it being in such a random order; I kept facepalming and losing my spot.


16 hours ago, WallacEngineering said:

There is more evidence that we can observe for ourselves that the inevitable end is approaching. Let's take a look at console gaming. Why do you suppose that the Xbox One and PS4 are still around? Why did Microsoft rename the One the "X" and simply fit the console with a few more Jaguar cores and AMD graphics compute units, rather than create an entirely new platform as they have in the past? Well, this is simply because the technology has NOWHERE TO GO, it's over! I believe we may see one more new generation of console from each manufacturer before the world of console gaming comes to a halt, just as PC technology is supposed to.

The mid-gen console refreshes happened because 4K TVs got really cheap at the same time GPUs were able to be manufactured on 14nm nodes. The Xbox One X doesn't have any more Jaguar cores than the original Xbox One.

