The End of CPU Advancement on our Doorstep (Moore's Law and the 7nm Barrier) Discussion

I would like to start a discussion about traditional PCs and CPU performance improvements, and why I feel the final major CPU releases are right around the corner. To be clear up front: while I have quite a few sources to back up my theory, this is still pure speculation, and the sources I will provide are partly speculative as well.

 

So what is my theory exactly? That around 2020-2022, people will stop buying new PCs, and from there, like a domino effect, technology advancements for CONSUMERS will come to an end (let's be clear that I am not talking about military technology or technology as a whole; I am convinced those areas will continue growing).

 

So let's start off with why I even think this is possible. "Moore's Law" is an observation first recorded by Gordon Moore, whose 1965 paper predicted that the number of transistors in an integrated circuit would double every year for at least a decade. He was slightly off on the timeline but correct otherwise: he later revised the pace to roughly every two years, and transistor counts in our CPUs have doubled at about that rate, roughly every 18 months to two years, ever since. This growth in transistor count is what lets CPUs gain more cores and higher clock speeds with each generation. The problem is that there is a physical limit beyond which adding more transistors to a CPU becomes impossible. According to most computer scientists, enthusiasts like you and me, and the tech community in general, those physical impossibilities will begin to occur after 7nm-based CPUs hit the shelves. If you simply google "Moore's Law" you will find a wide variety of articles explaining the definition, discussing its physical limits, and predicting when we will no longer be able to advance traditional CPUs. Most of them come to the same conclusion: Moore's Law will reach its physical limit sometime in the 2020s. Take a minute to read the Intel statement below:

 

[Attached screenshot: Intel statement on Moore's Law]

 

So INTEL THEMSELVES said in 2016 that transistors can only continue to shrink for about five more years. Guess what AMD has already announced for 2020-2021? Zen 3, their first 7nm+ CPU. Put two and two together and you get... four. It's as simple as that.
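
To put rough numbers on that timeline, here is a minimal back-of-the-envelope sketch in Python. It assumes the linear feature size shrinks by about 0.7x per node (which roughly doubles transistor density) every ~2.5 years, starting from 14nm around 2015, and compares that against the size of a silicon atom (~0.2nm). The cadence, starting point, and cutoff are illustrative assumptions on my part, not figures from Intel or AMD:

```python
# Back-of-the-envelope: how many full node shrinks fit before feature sizes
# approach atomic dimensions. A ~0.7x linear shrink per node roughly doubles
# transistor density. All numbers here are illustrative assumptions.

SHRINK_FACTOR = 0.7      # assumed linear scaling per node
YEARS_PER_NODE = 2.5     # assumed cadence between nodes
SILICON_ATOM_NM = 0.2    # approximate diameter of a silicon atom

node_nm, year = 14.0, 2015   # assumed starting point

while node_nm > SILICON_ATOM_NM * 5:   # stop when a feature is only a few atoms wide
    node_nm *= SHRINK_FACTOR
    year += YEARS_PER_NODE
    print(f"~{year:.0f}: ~{node_nm:.1f} nm-class node")
```

Run it and you get only a handful of shrinks between today's nodes and features just a few atoms wide, which is the basic math behind the "sometime in the 2020s" predictions. Keep in mind that node names stopped matching actual physical gate lengths years ago, so treat this purely as an illustration of an exponential running into a hard floor.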

 

Now the reason 7nm seems to be the physical limit is that beyond 7nm, the transistors would produce more heat than useful output power, making the idea of 5nm CPUs something of a paradox. I believe there is a small chance that scientists will figure out a way to produce 5nm CPUs, but the technology would be so difficult and expensive to manufacture that none of us consumers would be interested in upgrading from 7nm to 5nm.
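
For anyone wondering why heat is the sticking point, the standard first-order model for CMOS switching power is P ≈ α·C·V²·f. Here is a minimal sketch, with invented component values, showing why packing twice as many transistors into the same die area without dropping the voltage roughly doubles the heat you have to pull out of the same few square centimetres; the specific wattages are illustrative, only the scaling matters:

```python
# First-order CMOS dynamic power: P ~= activity * capacitance * V^2 * frequency.
# The component values below are invented for illustration; the point is the
# scaling behaviour, not the exact wattage.

def dynamic_power_w(n_transistors, cap_per_transistor_f, voltage_v, freq_hz, activity=0.1):
    """Total switching power in watts for a simplified, uniform chip."""
    return activity * n_transistors * cap_per_transistor_f * voltage_v ** 2 * freq_hz

baseline = dynamic_power_w(2e9, 1e-16, voltage_v=1.0, freq_hz=4e9)
doubled  = dynamic_power_w(4e9, 1e-16, voltage_v=1.0, freq_hz=4e9)  # 2x density, same V and f

print(f"baseline chip: ~{baseline:.0f} W, doubled transistor count: ~{doubled:.0f} W")
```

Historically, voltage dropped with every shrink (Dennard scaling), which kept power density flat; that stopped working years ago, and leakage gets worse as transistors get smaller. That is the textbook version of the thermal wall, but the practical conclusion is the same as the paragraph above.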

 

This next source I cannot provide because I can no longer find it. About 18 months ago I saw a YouTube video where a man who had worked at GlobalFoundries was finally legally allowed to speak about his time at the fab and the difficulties they expected in the coming years. He said something about particle-accelerator technology being required to create transistors at around 7nm or smaller. I know, I don't understand how a particle accelerator could have anything to do with the FinFET manufacturing process either, but that's what he said, and that's why he makes more money than I do lol.

 

This next part is purely my own speculation, but it's about the price of 5nm CPUs if we ever get there. If we could purchase 5nm CPUs, they are likely to be so expensive that we would keep our 7nm CPUs instead. You think GPU pricing is bad right now? Try $1000+ entry-level Pentium CPUs, or i9 Extreme CPUs that cost tens of thousands of dollars. Yeah, I don't know about you, but I am staying WELL away from 5nm CPUs.

 

Now you may be thinking: "Well, why not just increase the physical size of CPUs so more 7nm transistors can fit into them?" That is technically a great idea, but it has a few issues. One, there needs to be room between the CPU and the RAM for cooler mounting clearance, so going much bigger than Threadripper isn't really possible, and if we start pushing the RAM slots outward, then we start changing the ATX form factor standard, which, trust me, isn't going to happen. All cases, power supplies, and other accessories would need to be completely redesigned to fit these new motherboards, and all that work would be for a technology that will only last a few more years anyway. The biggest issue, however, would be heat output. The current i7-8700K is already a monster of a heat producer, and that's just a mainstream CPU. Imagine a CPU with more than double the 8700K's processing power! Heat production would likely be so intense that even the most expensive custom water-loop solutions would struggle to cool it, and don't even THINK about tying your GPU(s) into that loop, not gonna happen, especially as GPUs are likely to face the same issues as CPUs.

 

Another issue that needs to be discussed is whether or not CPU upgrades are even necessary anymore, even TODAY! It is widely accepted that 8 threads of processing power are more than enough for even the most demanding games, and while productivity and content creation software may be slower on something like my old Phenom II X4 970, it still gets the job done.

 

The only reason, and I mean the ONLY reason, I just bought a Ryzen 5 1600X and related components is that the X4 970 has only 4 threads of compute power and sits on terrible green 1333 MHz RAM that is NOT overclockable. This means that while MOST of my daily computer use is still INSTANTANEOUS thanks to my SSD, such as loading programs, watching Netflix, or transferring files, it falls apart in gaming physics. Firestrike graphics tests run very smoothly at 45 FPS, but as soon as the physics test starts I tank straight down to around 10 FPS. CPU-intensive games like Kerbal Space Program and From the Depths simply aren't capable of running at decent FPS no matter how low I set the graphical settings.

 

That, and I bought every single component used, so I managed to get a Ryzen 5 1600X system with 16GB of DDR4-3000 RAM, an R9 390X, a Samsung 960 EVO 250GB NVMe boot drive, and a Samsung 850 EVO 500GB 2.5" SSD, while reusing my PSU and case, for about $600. Now THAT is what makes this purchase justifiable.

 

Say it's the year 2020. Zen 3 is out and you decide to pick up the new (just guessing on the specs here) Threadripper 4950X with 20 cores and 40 threads that can reach 6 GHz with some overclocking. How much of that processing power are you EVER going to use? When do you think that kind of compute performance will become obsolete and REQUIRE replacement? Software is a good two decades behind fully utilizing high-end hardware as things currently sit, and that's not even talking about something like Threadripper. It's likely to take two decades to fully utilize the i7-8700K in just about ANY software that isn't related to content creation or rendering!
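
There is actually a classic formula for the "software can't use it" problem: Amdahl's law, speedup = 1 / ((1 − p) + p/n), where p is the fraction of the work that can run in parallel and n is the number of cores. A quick sketch of what that means for a hypothetical 20-core, 40-thread chip (the parallel fractions are assumptions I picked for illustration):

```python
# Amdahl's law: the serial fraction of a program caps the speedup you get
# from adding cores, no matter how many you throw at it.

def amdahl_speedup(parallel_fraction, n_cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

for p in (0.50, 0.80, 0.95):   # assumed parallelizable fraction of the workload
    row = ", ".join(f"{c} cores: {amdahl_speedup(p, c):.1f}x" for c in (4, 8, 20, 40))
    print(f"p = {p:.0%} -> {row}")
```

Even if 95% of a program parallelizes perfectly, 40 threads only buys you about a 14x speedup over a single core, and most games and desktop software are nowhere near 95%. That is the math behind the feeling that a hypothetical 20-core Threadripper would spend most of its life waiting on the serial parts of everyday software.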

 

If you bought a CPU as powerful as the Threadripper 4950X I described, then there's a very good chance that, as long as the CPU physically survives and doesn't fail, you wouldn't need to replace it at ANY point in the remainder of your lifetime.

 

EDIT/UPDATE:

Let's examine this issue from another perspective. A lot of comments suggest that carbon nanotubes or quantum computing could be the solution, and that technology will not stop but will continue on. Well, keep in mind I never said technology will stop; what I am suggesting is that we, as consumers, will not find these new technologies to be justifiable purchases, at least not at first.

 

Assuming these new technologies are extremely expensive in their first few years (which is likely, as every new technology is expensive when first introduced), we can expect most PC consumers, including enthusiasts, to have no reason to purchase them until they improve, mature, and come down in cost sometime in the future.

 

Take a look at every single "big leap forward" in mankind's history. When the microchip was first invented, it took many years of development and research before everyday people like us could get an affordable computer in our homes. When the automobile was first invented, it took many years before everyday people could afford one. It took roughly two decades of testing space rockets before we actually put an astronaut on the Moon.

 

And that is exactly the point. When a REALLY big, game-changing technology is invented, it usually takes at least a decade, if not longer, of research, testing, maturation, and manufacturing improvements before the vast majority of the consumer market ever gets a chance to actually own that technology for themselves.

 

History repeats itself, we all know this. Intel JUST invested $5 billion USD in 10nm manufacturing. This suggests that more than likely they EITHER aren't even researching what to do after transistors reach their physical limit, or their R&D on the subject is very limited at this time, which means that come the 2020s, whatever replaces them probably won't be ready, and we will be forced to wait it out.

 

So what we are talking about, then, is a strong possibility of another one of mankind's "big leaps" as far as computing is concerned, probably in the early 2020s, which is only a few years from now.

 

The "Lets just add more transistors and create physically bigger and more power hungry CPUs" idea is NOT "Forward-Thinking" This is more relatable to Jeremy Clarkson from Top Gear UK/ The Grand Tour's take on automobile performance: "More Power!!!" With companies as advanced as CPU manufacturers like Intel and AMD, I would probably guess that neither of them would be okay with using this approach. The motto of world of computing always has been and always will be "Smaller, Faster, Smarter"

 

There is more evidence that the inevitable end is approaching, and we can observe it for ourselves. Let's take a look at console gaming. Why do you suppose the Xbox One and PS4 are still around? Why did Microsoft rename the One the "X" and simply fit the console with a few more Jaguar cores and AMD graphics compute units, rather than create an entirely new platform as they have in the past? Simply because the technology has NOWHERE TO GO; it's over! I believe we may see one more generation of console from each manufacturer before the world of console gaming comes to a halt, just as PC technology is expected to.

 

Now let's take a look at automobiles. High-end cars today use heads-up displays that are basically high-definition televisions. Then you have insane technologies such as Range Rover's Terrain Response system, which can detect whether you are on asphalt, mud, snow, or sand and adjust power output to each individual wheel to keep you going pretty much no matter what. You can also actively adjust your ride height and suspension stiffness in a lot of cars today. There isn't much more we can do with a machine that uses four round pieces of rubber to move around. So what's next, flying cars? Well yeah, they already exist and have existed for about a decade or so. But there are a number of MASSIVE problems with this technology that make me think it will NEVER, EVER become a reality for 99.9% of people. Firstly, it's EXTREMELY EXPENSIVE. It uses pivoting jet engines just like a real military VTOL aircraft does; do you have any idea how much those cost to buy, or to run? Let's just not even talk about it. Plus, do you really think your federal government is going to let you just fly around wherever you want? Ever heard of 9/11? Now imagine millions of idiot drivers behind the wheel of flying machines; 9/11 would look like a holiday in comparison. So NO, it will NOT EVER HAPPEN. On top of this, Google's self-driving cars already work. They were successfully tested in Arizona in 2017 across a few thousand test runs with no accidents at all; they are now a reality, and we should be seeing them on the streets by 2020.

 

Now let's look at current displays. Displays have already hit their "Moore's Law wall"! Research has shown that even the healthiest human eye cannot detect the difference between 1440p and 4K displays at typical viewing distances. So unless you like to sit so close to your monitor or TV that you can't even see half the screen and want to burn out your retinas, 4K is simply a marketing gimmick with no real benefit whatsoever. Some of you may know that I actually own a Samsung MU6500 Curved Glass 4K Smart TV that I use as my monitor, and you may be wondering why. Well, my previous TV was a 7-year-old, crappy, low-quality, cheap Sanyo 1080p flat screen. The image quality was terrible, the colors were abysmal, and it was so old that I figured it would die soon, so I sold it before it kicked the bucket on me. Plus I bought the Samsung on Black Friday 2017, so $500 for a $1000 TV? Sure, sign me right up. This display is also WAY better in terms of gaming capability. When you select "Game Console" as the source, it disables "Motion Rate" interpolation and most of the image-processing options, and it even lowers input lag. I can honestly say I can't tell the difference between this TV and any high-refresh gaming monitor I have ever played on. It's THAT fast and smooth, way to go Samsung! Anyways, here is the article that describes the research:

 

https://www.nbcnews.com/technology/enough-pixels-already-tvs-tablets-phones-surpass-limits-human-vision-2d11691618
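
The argument in that article comes down to simple geometry: a person with 20/20 vision resolves detail down to roughly one arcminute, so once a single pixel subtends less than that at your seating position, extra pixels stop being visible. Here is a minimal sketch of that calculation; the one-arcminute figure and the 49-inch, 16:9 example panel are my assumptions, not numbers taken from the article:

```python
import math

# Distance beyond which a single pixel subtends less than ~1 arcminute
# (a common rule of thumb for 20/20 visual acuity), i.e. the point where
# a higher resolution stops being distinguishable.

ARCMIN_RAD = math.radians(1 / 60)

def blend_distance_inches(diagonal_in, horizontal_px, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)      # physical screen width
    pixel_pitch_in = width_in / horizontal_px          # physical size of one pixel
    return pixel_pitch_in / math.tan(ARCMIN_RAD)

for name, px in (("1080p", 1920), ("1440p", 2560), ("4K", 3840)):
    print(f'49" {name}: individual pixels vanish beyond ~{blend_distance_inches(49, px):.0f} inches')
```

On a 49-inch panel, by this estimate, the extra pixels of 4K over 1440p only matter if you sit closer than roughly five feet, which is the geometry behind the article's conclusion about typical living-room viewing distances.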

 

I do understand that virtual reality still needs to increase in pixel count and has a while to go before its technology is maxed out. However, I, like many other gamers, prefer a regular display over VR and find VR to be an expense that simply isn't worth the money. For one thing, how do you go from playing a high-intensity, fast-paced FPS on a monitor to a VR headset where aiming is so clumsy that it suddenly takes ages to aim and fire your virtual weapon? Personally, any game that relies on the VR motion controllers is pretty much worthless to me. I do agree, however, that some games look tempting in VR, games like Elite Dangerous or Forza Motorsport. Since you are in a vehicle, you do not need the stupid motion controllers, and being inside the cockpit gives you a whole new level of immersion. But regardless, as discussed, GPUs are facing the same issues as CPUs, so good luck with 16K.

 

So here we are then: all evidence points to the end of technology advancement as we know it, and it's expected very soon. What are we to do? Well, if you do that Moore's Law Google search, you will find that most articles end by describing how quantum computing will take over the world of computing, but if you have seen Linus's video touring a quantum computing company's headquarters, that technology is far from being available to the masses. So what do we do until then? Well, honestly? Nothing. Build a badass $3000+ PC around 2020 using a 7nm CPU and make it last you for the next 20 years as technology comes to a standstill. Just replace components as they fail and deal with what you have. It's what the rest of us are going to be doing anyway, so it's not like you will be running an outdated or slow PC. I know I plan to build my "FINAL PC EVER" build when Zen 3 hits the shelves.

 

So what do you guys think about all of this? Do you agree? Disagree? What do you think is next?

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


I can't wait to see what 1nm carbon nanotube CPUs will be like! Besides quantum computers, will we find some other material to make CPUs out of? What if we get rid of transistors entirely?

 

Anyways, I doubt that consumer technology will slow down that much, and they can always try to make efficiency improvements.

Computer engineering grad student, cybersecurity researcher, and hobbyist embedded systems developer

 

Daily Driver:

CPU: Ryzen 7 4800H | GPU: RTX 2060 | RAM: 16GB DDR4 3200MHz C16

 

Gaming PC:

CPU: Ryzen 5 5600X | GPU: EVGA RTX 2080Ti | RAM: 32GB DDR4 3200MHz C16


9 minutes ago, thegreengamers said:

I can't wait to see what 1nm carbon nanotube CPUs will be like! Besides quantum computers, will we find some other material to make CPUs out of? What if we get rid of transistors entirely?

 

Anyways, I doubt that consumer technology will slow down that much, and they can always try to make efficiency improvements.

True, efficiency improvements are certainly a plus, but I get electricity included in rent and my rent has never changed, so why would I care about efficiency? I just don't think there will be any reason to upgrade from my Zen 3 build for a VERY long time.

 

Getting rid of transistors entirely is an idea, I guess, but I'm sure we are decades away from any such technology. Transistors are a fundamental part of how a CPU or GPU even functions, so how would we get rid of them?

 

I am also pretty sure that the nanotube CPUs will be extremely expensive and nobody will buy them, at least not at first.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


interesting.

 

larger CPUs may be a bit better than you say though, it's not too difficult to dissipate ~1-2 kilowatts of power with a custom loop and phase change/water chillers are also an option.
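
For a sense of scale on that 1-2 kW figure, the steady-state water temperature rise across the blocks in a loop is ΔT = P / (ṁ·c_p). A quick sketch with an assumed flow rate (the flow rate and heat loads are illustrative, not measurements):

```python
# Steady-state coolant temperature rise across the heat sources in a loop:
# delta_T = power / (mass_flow * specific_heat). Flow rate and loads below
# are assumptions for illustration.

WATER_CP_J_PER_KG_K = 4186.0
FLOW_L_PER_MIN = 4.0                      # assumed typical loop flow rate
mass_flow_kg_s = FLOW_L_PER_MIN / 60.0    # ~1 kg per litre of water

for load_w in (500, 1000, 2000):
    delta_t = load_w / (mass_flow_kg_s * WATER_CP_J_PER_KG_K)
    print(f"{load_w} W -> water warms ~{delta_t:.1f} °C passing through the blocks")
```

The water side really is forgiving; the harder parts are having enough radiator area to dump that heat into room air and getting kilowatt-class heat out of a couple of square centimetres of die in the first place.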

i hope we see some stupidly large extreme edition CPUs


2 minutes ago, Prqnk3d said:

interesting.

 

larger CPUs may be a bit better than you say though, it's not too difficult to dissipate ~1-2 kilowatts of power with a custom loop and phase change/water chillers are also an option.

i hope we see some stupidly large extreme edition CPUs

I don't think most people have custom water loops or phase-change coolers, and they aren't likely to get them anytime soon.

CPU: Core i9 12900K || CPU COOLER : Corsair H100i Pro XT || MOBO : ASUS Prime Z690 PLUS D4 || GPU: PowerColor RX 6800XT Red Dragon || RAM: 4x8GB Corsair Vengeance (3200) || SSDs: Samsung 970 Evo 250GB (Boot), Crucial P2 1TB, Crucial MX500 1TB (x2), Samsung 850 EVO 1TB || PSU: Corsair RM850 || CASE: Fractal Design Meshify C Mini || MONITOR: Acer Predator X34A (1440p 100hz), HP 27yh (1080p 60hz) || KEYBOARD: GameSir GK300 || MOUSE: Logitech G502 Hero || AUDIO: Bose QC35 II || CASE FANS : 2x Corsair ML140, 1x BeQuiet SilentWings 3 120 ||

 

LAPTOP: Dell XPS 15 7590

TABLET: iPad Pro

PHONE: Galaxy S9

She/they 


1 hour ago, WallacEngineering said:

So what do you guys think about all of this? Do you agree? Disagree? What do you think is next?

I would have to disagree. For one, the near impossibility of further increasing transistor density does not equal the end of advancements. CPUs today are far from perfect, and clock speeds and transistor count are not the whole equation.

1 hour ago, WallacEngineering said:

There is more evidence that the inevitable end is approaching, and we can observe it for ourselves. Let's take a look at console gaming. Why do you suppose the Xbox One and PS4 are still around? Why did Microsoft rename the One the "X" and simply fit the console with a few more Jaguar cores and AMD graphics compute units, rather than create an entirely new platform as they have in the past? Simply because the technology has NOWHERE TO GO; it's over! I believe we may see one more generation of console from each manufacturer before the world of console gaming comes to a halt, just as PC technology is expected to.

You do know that console producers try to keep things cheap, and in doing so they relied on custom Jaguar cores from AMD (a design that is very old, let alone comparable to Zen/Coffee Lake). In Microsoft's case, they wanted the processor to be architecturally similar to the one in the original Xbox One so that devs wouldn't have to make changes (there's an article, but I would have to find it). You likely won't find a console that is truly on par with the latest CPU designs and nodes, as that would be far more expensive to do (unless they decide to target the rich and ignore most of their player base, which is a bad idea in terms of market share, number of devs, and total profit margins). Another note is that the One X uses the exact same number of cores as previous versions (and the PS4 Slim/Pro are no different). Console gamers typically don't want to have their game collection divided by hardware, which is part of why we have large generational gaps for consoles in the first place.

1 hour ago, WallacEngineering said:

So here we are then: all evidence points to the end of technology advancement as we know it, and it's expected very soon. What are we to do? Well, if you do that Moore's Law Google search, you will find that most articles end by describing how quantum computing will take over the world of computing, but if you have seen Linus's video touring a quantum computing company's headquarters, that technology is far from being available to the masses. So what do we do until then? Well, honestly? Nothing. Build a badass $3000+ PC around 2020 using a 7nm CPU and make it last you for the next 20 years as technology comes to a standstill. Just replace components as they fail and deal with what you have. It's what the rest of us are going to be doing anyway, so it's not like you will be running an outdated or slow PC. I know I plan to build my "FINAL PC EVER" build when Zen 3 hits the shelves.

To end this, I just want to reiterate that not being able to increase transistor density does not mean that improvements can't be made to the underlying architectures. Intel has a from-scratch design coming around ~2022. What I think is more of a limiting factor is that CPU design is hard, and coming up with improvements even more so.

 

One extra part:

Other areas you could advance in are materials that are better suited for the job. Early processors were germanium-based but transitioned to silicon as it was significantly cheaper; now alternative materials are being looked at again. Another area for improvement is the use of meta-materials to better insulate heat from other components of the chip. (Some materials may allow near-zero transfer too; materials science has seen a huge boom in recent years.)


3 minutes ago, thegreengamers said:

I can't wait to see what 1nm carbon nanotube CPUs will be like! Besides quantum computers, will we find some other material to make CPUs out of? What if we get rid of transistors entirely?

 

Anyways, I doubt that consumer technology will slow down that much, and they can always try to make efficiency improvements.

There's a good chance we'll find something other than silicon to make processing units out of; there's only so much silicon in the world...

 

Agreed, efficiency could always improve. Or more CPUs will be designed for multi-processor environments, and we can add to the number of processing units in our systems.

 

17 minutes ago, WallacEngineering said:

Build a badass $3000+ PC around 2020 using a 7nm CPU and make it last you for the next 20 years as technology comes to a standstill. Just replace components as they fail and deal with what you have. It's what the rest of us are going to be doing anyway, so it's not like you will be running an outdated or slow PC. I know I plan to build my "FINAL PC EVER" build when Zen 3 hits the shelves.

I doubt technology will just reach a standstill for 20 years when 7nm processing units reach the market. I also doubt your "final PC ever" will be Zen 3-based. Like @Prqnk3d said, the surface area of a CPU could increase to accommodate a higher transistor count. What if in the next 10 years, consumer-grade CPUs are similar in size to the higher-end Xeons, with upwards of 20 cores?

Quote or tag me( @Crunchy Dragon) if you want me to see your reply

If a post solved your problem/answered your question, please consider marking it as "solved"

Community Standards // Join Floatplane!


1 minute ago, WallacEngineering said:

True, efficiency improvements are certainly a plus, but I get electricity included in rent and my rent has never changed, so why would I care about efficiency?

Greater efficiency can mean greater performance at a lower heat output, and where is heat really an issue? Laptops! There are also people like me who actually pay our own electric bills, and we want to lower them any way we can.

Computer engineering grad student, cybersecurity researcher, and hobbyist embedded systems developer

 

Daily Driver:

CPU: Ryzen 7 4800H | GPU: RTX 2060 | RAM: 16GB DDR4 3200MHz C16

 

Gaming PC:

CPU: Ryzen 5 5600X | GPU: EVGA RTX 2080Ti | RAM: 32GB DDR4 3200MHz C16


6 minutes ago, orbitalbuzzsaw said:

Yeah gimme my 250-core one-terahertz i9.

Right??? Could you even imagine what a CPU like that could be capable of? Most software today apart from content creation struggles to make use of more than 8 threads of processing power, so my new Ryzen 5 1600X is already technically overpowered.

 

Do you know of that one weird theory that life is all just a giant simulation inside of a mega computer?

 

I'm pretty sure this CPU would be capable of it lol.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


Just now, WallacEngineering said:

Right??? Could you even imagine what a CPU like that could be capable of? Most software today apart from content creation struggles to make use of more than 8 threads of processing power, so my new Ryzen 5 1600X is already technically overpowered.

 

Do you know of that one weird theory that life is all just a giant simulation inside of a mega computer?

 

I'm pretty sure this CPU would be capable of it lol.

Yep. Then we wait till the simulations develop their own simulations, and so on, until Windows 5000 crashes.

CPU: Core i9 12900K || CPU COOLER : Corsair H100i Pro XT || MOBO : ASUS Prime Z690 PLUS D4 || GPU: PowerColor RX 6800XT Red Dragon || RAM: 4x8GB Corsair Vengeance (3200) || SSDs: Samsung 970 Evo 250GB (Boot), Crucial P2 1TB, Crucial MX500 1TB (x2), Samsung 850 EVO 1TB || PSU: Corsair RM850 || CASE: Fractal Design Meshify C Mini || MONITOR: Acer Predator X34A (1440p 100hz), HP 27yh (1080p 60hz) || KEYBOARD: GameSir GK300 || MOUSE: Logitech G502 Hero || AUDIO: Bose QC35 II || CASE FANS : 2x Corsair ML140, 1x BeQuiet SilentWings 3 120 ||

 

LAPTOP: Dell XPS 15 7590

TABLET: iPad Pro

PHONE: Galaxy S9

She/they 


I mostly agree with what you have said, but I believe some genius out there will eventually find a way to create something new that is 10x better (or however much better) than what we have today or will have in the next few decades. I also want to add that if we had truly efficient cooling, we would have supercomputers at home; we can see this from the extreme overclockers who use liquid nitrogen and the like to cool and overclock. But of course, that is not practical, because consumers don't have access to that technology (or the money) to buy a product that can do the things stated above.

 

Also, this was one of the first "long articles" I have read on this forum; good job making it interesting (or maybe I'm just not interested in other things).

"May your frame rates be high and your temperatures low"

I misread titles/posts way too often--correct me if I don't.

 


9 minutes ago, Crunchy Dragon said:

There's a good chance we'll find something other than silicon to make processing units out of, there's only so much silicon in the world...

 

Agreed, efficiency could always improve. Or more CPUs will be designed for multi-processor environments and we can add to the amount of processing units in our systems.

 

I doubt technology will just reach a standstill for 20 years when 7nm processing units reach the market. I also doubt your "final PC ever" will be Zen 3-based. Like @Prqnk3d said, the surface area of a CPU could increase to accommodate higher transistor count. What if in the next 10 years, consumer-grade CPUs are similar in size to the higher end Xeons with upwards of 20 cores?

Well, not really a complete standstill. I did say that I expect military technology to continue advancing. But for us, regular consumers who just want a faster PC? Threadripper and i9 can already crush any software out there, so are we really going to say "Yes, my Threadripper 3950X is totally outdated and needs to be replaced" in the next 10-20 years? I doubt it. I'm still fine with my AM3 Phenom II X4 970 right now. With its upgraded SSD, most of my tasks are nearly instantaneous, and the only reason I'm building a Ryzen 5 1600X right now is that the 970 has begun to fail at physics in modern games.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


Interesting thoughts. 

 

What I have had in mind for some time now is that CPU and GPU tasks such as gaming are reaching the point where there is little room for improvement. There are several game engines that can produce life-like virtual environments using current processing power. Fast-forward two years, and you will have most games (except CoD) looking more real than real life.

 

So maybe we will reach equilibrium in technology.

 

Right now monitors are pushing up against the limits of what the human eye can resolve. Yes, we're getting higher and higher-resolution monitors, but those have a very negative effect on the human eye due to insane pixel density.

 

 

Main system: Ryzen 7 7800X3D / Asus ROG Strix B650E / G.Skill Trident Z5 NEO 32GB 6000Mhz / Powercolor RX 7900 XTX Red Devil/ EVGA 750W GQ / NZXT H5 Flow


1 minute ago, PopsicleHustler said:

Interesting thoughts. 

 

What I have had in mind for some time now is that CPU and GPU tasks such as gaming are reaching the point where there is little room for improvement. There are several game engines that can produce life-like virtual environments using current processing power. Fast-forward two years, and you will have most games (except CoD) looking more real than real life.

 

Right now monitors are pushing up against the limits of what the human eye can resolve. Yes, we're getting higher and higher-resolution monitors, but those have a very negative effect on the human eye due to insane pixel density.

 

 

Pretty much exactly what I was trying to say with this thread. There's just no reason to upgrade after 7nm.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


8 minutes ago, orbitalbuzzsaw said:

Yep. Then we wait till the simulations develop their own simulations, and so on, until Windows 5000 crashes.

LOL!

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


lol you completely lost it at the 4k vs 1440p argument. Credibility gone. Sorry.

 

If the test subjects in that study had 20/100 or worse vision, maybe that would make sense, but as a general claim it's objectively wrong.

      __             __
   .-'.'     .-.     '.'-.
 .'.((      ( ^ `>     )).'.
/`'- \'._____\ (_____.'/ -'`\
|-''`.'------' '------'.`''-|
|.-'`.'.'.`/ | | \`.'.'.`'-.|
 \ .' . /  | | | |  \ . '. /
  '._. :  _|_| |_|_  : ._.'
     ````` /T"Y"T\ `````
          / | | | \
         `'`'`'`'`'`

4 minutes ago, violentnumeric said:

lol you completely lost it at the 4k vs 1440p argument. Credibility gone. Sorry.

 

If the test subjects in that study had 20/100 or worse vision, maybe that would make sense, but as a general claim it's objectively wrong.

What are you talking about? It's right there in the article. Also, have you seen a noticeable difference between a nice 1440p monitor and a 4K one? I sure haven't.

 

Remember, 3840x2160 is 8,294,400 pixels. Can you really sit there and tell me that your eyes are capable of picking out eight million different objects on your 40-or-so-inch monitor? Even on an 80" TV? Try loading 8 million different images onto your screen at once and tell me if you can see anything at all.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


Hilarious. 4k doesn't even remotely look "perfect" to anyone with reasonably good vision at a distance that actually takes up any real part of their FOV, and you're trying to say 1440p is better. Kek

      __             __
   .-'.'     .-.     '.'-.
 .'.((      ( ^ `>     )).'.
/`'- \'._____\ (_____.'/ -'`\
|-''`.'------' '------'.`''-|
|.-'`.'.'.`/ | | \`.'.'.`'-.|
 \ .' . /  | | | |  \ . '. /
  '._. :  _|_| |_|_  : ._.'
     ````` /T"Y"T\ `````
          / | | | \
         `'`'`'`'`'`

24 minutes ago, violentnumeric said:

Hilarious. 4k doesn't even remotely look "perfect" to anyone with reasonably good vision at a distance that actually takes up any real part of their FOV, and you're trying to say 1440p is better. Kek

Nah, not that it's better, just that there's no reason to spend the money to upgrade to 4K if you have 1440p, especially if you have a nice ultrawide (which I guess is technically about 4K wide anyway).

 

EDIT: OK, here's a good real-world example. I'm typing on my 4K TV right now. My current AM3 build runs a GTX 670, so it isn't really capable of 4K (even video lags a bit), and I had turned my Nvidia Control Panel settings down to a 1920x1080 desktop resolution. Let me switch to 3840x2160... Okay, a slight improvement, but nothing I would consider worth spending money on. And that's coming from 1080p, not 1440p. So let me change my settings to 2560x1440... Nope, no noticeable difference WHATSOEVER.

 

The TV is the Samsung MU6500 49-inch Curved Glass 4K TV, and I am positioned exactly 79 inches from the center of the screen (the deepest point of any curved display).
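
Plugging this setup's numbers into the same one-arcminute acuity rule of thumb used earlier in the thread (again an assumption, not a measurement of anyone's eyes):

```python
import math

# Angular size of one pixel on a 49" 16:9 panel viewed from 79 inches,
# for each tested desktop resolution. ~1 arcminute is a common estimate
# of what 20/20 vision can resolve.

DIAGONAL_IN, DISTANCE_IN = 49, 79
width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)

for name, horizontal_px in (("1080p", 1920), ("1440p", 2560), ("4K", 3840)):
    pixel_arcmin = math.degrees(math.atan((width_in / horizontal_px) / DISTANCE_IN)) * 60
    print(f"{name}: one pixel spans ~{pixel_arcmin:.2f} arcmin at 79 inches")
```

At 79 inches, even 1080p pixels on this panel sit right around the one-arcminute threshold, so it is not surprising that the jump to 1440p or 4K is hard to spot from this particular seat; sit much closer and the comparison would change.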

 

[Attached photo]

 

This is with the settings turned back down to 1080p after testing. The image is extremely sharp, nearly perfect, although taking a picture of a display with your phone never does it justice, so you will have to take my word for it.

 

[Attached photo]

 

Please excuse the mess, I am moving soon and things are just sort of "laying around" lol

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


59 minutes ago, WallacEngineering said:

Try $1000+ entry-level Pentium CPUs, or i9 Extreme CPUs that cost tens of thousands of dollars. Yeah, I don't know about you, but I am staying WELL away from 5nm CPUs

Never going to happen; Intel just won't release impressively better hardware. Mainstream chips will stay on 7nm and just get larger and larger. After a while, new technology will inevitably come to fruition (Intel will keep throwing billions of dollars at it until something works) and meaningful progress will resume.

Want to custom loop?  Ask me more if you are curious

 


Most software is at a standstill due to inefficiencies in programming; a lot of stuff barely makes use of more than a few cores. It's just now starting to accelerate rapidly.

 

The next 7nm chips have a 10nm base as well. They can easily get 32 cores/64 threads per die and put 4 dies on a different base material that can hit 30 GHz.

 

Making it in 3d like the memory is doing???

 

Plenty of room to grow before going Quantum. 

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


11 minutes ago, ewitte said:

Most software is at a standstill due to inefficiencies in programming; a lot of stuff barely makes use of more than a few cores. It's just now starting to accelerate rapidly.

 

The next 7nm chips have a 10nm base as well. They can easily get 32 cores/64 threads per die and put 4 dies on a different base material that can hit 30 GHz.

 

Making it in 3d like the memory is doing???

 

Plenty of room to grow before going Quantum. 

Huh, didn't know the first 7nm chips were going to be built on 10nm bases. Thanks for that.

 

I like the idea of carbon nanotubes myself, but I can't help but feel that for the first 5 years they will be too expensive to justify the purchase.

Top-Tier Air-Cooled Gaming PC

Current Build Thread:

 


You know, a lot of people think Moore's Law will end when we hit the limit of how small transistors can get. Unfortunately for those people, there are two ways to increase the number of transistors on a die:

  1. Make the transistors smaller. If you cut the area each one takes in half, you can fit twice as many.
  2. Increase the size of the chip: a chip that is twice as large can hold twice as many transistors.

Does this solve the problem indefinitely? No, not at all. Can this solve the problem for many, many years? Yes, yes it can. 
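
To put rough numbers on option 2 from the list above: transistor count is roughly density times die area, so doubling the area doubles the budget without shrinking anything. The density and die sizes below are ballpark assumptions for a hypothetical 7nm-class process, not official figures:

```python
# Transistor budget ~= density * die area. Doubling the area doubles the
# count without any shrink. All figures below are rough assumptions.

DENSITY_MTR_PER_MM2 = 100.0    # assumed ~100 million transistors per mm^2

for label, area_mm2 in (("mainstream die", 150), ("HEDT-sized die", 300), ("huge die", 600)):
    billions = DENSITY_MTR_PER_MM2 * area_mm2 / 1000
    print(f"{label} ({area_mm2} mm^2): ~{billions:.0f} billion transistors")
```

The catch is that bigger dies yield worse (one defect kills more silicon) and run into the packaging and cooling problems discussed earlier in the thread, which is why splitting the area across several smaller dies on one package tends to be the compromise.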

There is another way to increase performance besides throwing transistors at it: you can build problem-specific components and include them in the architecture. Get ready to see better integrated graphics and better FPUs. You can design an ASIC around basically any software problem, so get ready to see common ones appearing on chips (like cryptography; I'm excited to see a prime-number checker on die).
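
As a loose software analogy for what problem-specific hardware buys you, compare a generic interpreted loop against handing the same work to specialized, heavily optimized machinery; here NumPy's vectorized dot product stands in for dedicated silicon, and the array size is arbitrary:

```python
import time
import numpy as np

# Generic path: plain Python loop, analogous to solving everything with
# general-purpose scalar code.
def dot_generic(a, b):
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

n = 1_000_000
a, b = np.random.rand(n), np.random.rand(n)

t0 = time.perf_counter()
dot_generic(a, b)                 # generic route
t1 = time.perf_counter()
float(a @ b)                      # "specialized" route: optimized vectorized kernel
t2 = time.perf_counter()

print(f"generic loop: {t1 - t0:.3f} s, specialized routine: {t2 - t1:.5f} s")
```

The gap is typically one to two orders of magnitude on the same CPU, and that is the same kind of win the post is predicting once common workloads get their own on-die accelerators.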

Keep in mind that the above mostly applies to CISC architectures. RISC architectures still have a lot of catching up to do to reach the performance of current CISC processors.

Also get prepared to see an exploration of other ISAs. There are very few popular instruction set architectures today. When throwing more transistors at the same problem stops increasing performance, you will start to see people experiment with creating completely new designs. I find this to be a good thing. In my opinion, the alleged backward (really forward) compatibility of x86 is both the best and worst thing to ever happen to the world of computers.

ENCRYPTION IS NOT A CRIME

