Search the Community
Showing results for tags 'moores law'.
-
Nvidia CEO Jensen Huang said at the GTC conference in Beijing that Moore's Law is dead and that GPUs will replace CPUs. http://www.pcgamer.com/nvidia-ceo-says-moores-law-is-dead-and-gpus-will-replace-cpus/ Intel, meanwhile, denies that Moore's Law is dying and continues to stand behind it. It's an interesting thing to consider, GPUs replacing CPUs. We already see it happening a lot in modern gaming consoles. What does everyone else think? Are CPUs failing? -Ken
51 replies · Tagged with: nvidia, moores law (and 2 more)
-
In the last WAN Show @James used the term "singularity" the way it is so often used in the tech world, and threw around a few terms that might have confused people. So here is the shortest possible explanation of the singularity I can give: the singularity is when self-improvement time reaches 0, making a machine improve infinitely quickly and thus become infinitely powerful. But that can't happen, for the reasons outlined below, so people now use the term more loosely.

Here is a version with a tiny bit more detail: when scientists plotted the computing power available over time (back in the 80s, I think; I couldn't find the paper anymore) and drew a trendline through it, they did not find logistic or exponential growth, as you would expect from Moore's law and physical considerations. The best-fitting curve had what we maths people call a singularity, meaning nothing more than that the value goes to infinity at a finite point. This is not always crazy: think of the length of a small line segment very near the poles on a typical map, which gets stretched to infinity as well. But there are also "bad" or "real" singularities, like 1/0 on the 1/x plot. The singularity in the paper was a bad singularity.

Scientists generally agree that something like that can't happen with classical computers, simply because infinite processes require an infinite amount of energy (there is a limit to how efficient a computer can be) and the energy in the universe is limited. In fact there are plenty of other physics reasons why it can't happen, but they require thermodynamics, and everyone hates thermodynamics.

Now, why is everyone talking about it? Basically, the media picked up the term from the paper, which obviously got a lot of citations in the press, and it transformed into a term referring to a time when machines can self-improve infinitely. This also can't happen, because of the same physical limits, so there is a (albeit REALLY LARGE) limit on the processing power machines can ever have in the universe. Thus the only definition of the singularity that can actually happen is that machines manage to self-improve better than humans can improve them. And this definition is super weak; technically the new AlphaGo hit that goal, since it self-improved better than the version of AlphaGo humans fed it. So if you still want the singularity to be somewhat scary, you need to add another qualifier: "The singularity is when artificial general intelligences can self-improve quicker than humans can improve them, and reach a level of intelligence incomprehensible to humans." Without these qualifiers the singularity is either already reached, not scary, or physically impossible.

At this point you can easily see that this has nothing at all to do with a real singularity, and that a 0 also has nothing to do with it; it is just a mystical misnomer with a weird etymology.
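To make the "finite-time singularity" idea concrete, here is a sketch of the two growth laws being contrasted; the notation is mine, not from the paper the post mentions:

```latex
% Exponential growth (the Moore's-law expectation):
% finite at every finite time t, however large k is.
N_{\mathrm{exp}}(t) = N_0 \, e^{kt}, \qquad k > 0

% Hyperbolic growth (the best-fit form described above):
% diverges at the finite "critical time" t = t_c.
N_{\mathrm{hyp}}(t) = \frac{C}{t_c - t}, \qquad
N_{\mathrm{hyp}}(t) \to \infty \ \text{as}\ t \to t_c^{-}
```

A logistic curve, by contrast, levels off at a finite ceiling, which is exactly what the energy-limit argument above predicts for real machines.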
7 replies · Tagged with: learning, singularity (and 1 more)
-
Source: Institute of Electrical and Electronics Engineers (IEEE)

TL;DR: We're WAY behind on Moore's law, by a factor of 15 compared to where we'd be if it still held. It's time to focus on optimizing software instead of hardware.

Media:

Quotes/Excerpts:

My Thoughts: We're going to start seeing more and more ASICs as time progresses. As Linus mentions in the video below, application-specific processors and languages will increase in popularity as clock speeds and general-purpose performance stagnate. Machine learning cores will be just one of the ASICs to come out of this stagnation.
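As a minimal sketch of the kind of software-level optimization gain the article is pointing at (the example and its numbers are mine, not from the article, and the speedup is machine-dependent):

```python
import time
import numpy as np

n = 256
a = [[1.0] * n for _ in range(n)]
b = [[1.0] * n for _ in range(n)]

# Naive pure-Python matrix multiply: O(n^3) work done in the interpreter.
start = time.perf_counter()
c = [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
     for i in range(n)]
naive = time.perf_counter() - start

# The same mathematical operation through an optimized BLAS backend.
x, y = np.ones((n, n)), np.ones((n, n))
start = time.perf_counter()
z = x @ y
fast = time.perf_counter() - start

print(f"pure Python: {naive:.3f}s, NumPy/BLAS: {fast:.5f}s, "
      f"speedup ~{naive / fast:.0f}x")
```

Same hardware, same arithmetic, orders-of-magnitude difference, which is the "optimize software instead of hardware" argument in miniature.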
-
I would like to start an interesting discussion regarding traditional PCs and CPU performance enhancements, and why I feel the final major CPU releases are right around the corner. To start, I should explain that while I have quite a few resources to back up my theory, this is all still pure speculation, and the sources I will provide are based on speculation as well.

So what is my theory exactly? That around 2020-2022, people will stop buying new PCs, and from there, like a domino effect, technology advancement for CONSUMERS will come to an end (let's make it clear that I am not talking about military technology or technology as a whole; I am convinced those areas will continue growing).

So let's start off with why I even think this is possible. "Moore's Law" is an observation first recorded by Gordon Moore, whose 1965 paper described the number of transistors in an integrated circuit doubling every year for at least a decade. He was slightly off on the timeline but correct otherwise: the number of transistors in our CPUs has been doubling roughly every 18 months to two years. This growth in transistor count is what allows our CPUs to get more cores and higher clock speeds with each generation. The issue is that there is a physical limit beyond which increasing the number of transistors within a CPU becomes impossible. According to most computer scientists, computer enthusiasts like you and me, and the tech community in general, those physical impossibilities will begin to occur after 7nm-based CPUs hit the shelves. If you simply Google "Moore's Law" you will come across a wide variety of articles explaining the definition and then discussing the physical limits of Moore's Law and when we will no longer be able to advance traditional CPUs. Most of these articles come to the same conclusion: "Moore's Law will reach its physical limit sometime in the 2020s." Take time to read the article below:

So INTEL THEMSELVES said in 2016 that transistors can only continue to shrink for about five more years. Guess what AMD has already announced for 2020-2021? Zen 3, their first 7nm+ CPU. Put two and two together and you get... four. It's as simple as that.

The reason 7nm appears to be the practical limit is that, beyond it, transistors would lose more energy to leakage and heat than they could usefully switch, which makes the whole idea of 5nm CPUs self-defeating. I believe there is a small chance that scientists will figure out a way to produce 5nm CPUs, but the technology would be so difficult and expensive to manufacture that none of us consumers would be interested in upgrading from 7nm to 5nm.

This next source I cannot provide because I can no longer find it. About 18 months ago I saw a YouTube video where a man who had worked at GlobalFoundries was finally legally allowed to speak about his time at the fab and the difficulties they expected in the coming years. He said something about particle-accelerator technology being required to create transistors around 7nm or smaller. I know, I don't understand how a particle accelerator could have anything to do with the FinFET manufacturing process either, but that's what he said, and that's why he makes more money than I do lol.

This next part is purely my own speculation, but it's regarding the price of 5nm CPUs if we ever get there. If we could purchase 5nm CPUs, they are likely to be so expensive that we will keep our 7nm CPUs instead.
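A quick aside on the doubling arithmetic above (my own back-of-envelope numbers, not from any of the articles): starting from the Intel 4004's widely cited ~2,300 transistors in 1971, a fixed doubling period compounds like this:

```python
# Back-of-envelope Moore's-law compounding: transistor count after a
# number of years, given a starting count and a doubling period.

def transistors(start: float, years: float, doubling_years: float) -> float:
    return start * 2 ** (years / doubling_years)

n0 = 2_300  # Intel 4004, 1971 (a common reference point)
for period in (1.5, 2.0):  # 18-month vs 2-year doubling
    estimate = transistors(n0, years=2020 - 1971, doubling_years=period)
    print(f"doubling every {period} years -> ~{estimate:.2e} transistors by 2020")
```

Even small changes in the assumed doubling period swing the projection by orders of magnitude, which is part of why predictions about exactly when the scaling stops vary so much.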
You think GPU pricing is bad right now? Try $1000+ entry-level Pentium CPUs, or i9 Extreme CPUs that cost tens of thousands of dollars. Yeah, I don't know about you, but I am staying WELL away from 5nm CPUs.

Now you may be thinking: "Why not just increase the physical size of CPUs so more 7nm transistors can fit?" That is technically a great idea, but it has a few issues. One, there needs to be room between the CPU and RAM for cooler-mount clearance, so going much bigger than Threadripper isn't really possible, and if we start pushing the RAM slots out then we start changing the ATX form-factor standard, which, trust me, isn't going to happen. All cases, power supplies, and other accessories would need to be completely redesigned to fit these new motherboards, and all that work would be done for a technology that will only last a few more years anyway. The largest issue, however, would be heat output. The current i7-8700K is already a monster of a heat producer, and that's just a mainstream CPU. Imagine a CPU with more than double the 8700K's processing power! Heat production would likely be so intense that even the most expensive custom water-loop solutions would struggle to cool it, and don't even THINK about tying your GPU(s) into that loop; not gonna happen, especially as GPUs are likely to face the same issues as CPUs.

Another question that needs to be discussed is whether CPU upgrades are even necessary anymore, even TODAY. It is widely accepted that 8 threads of processing power is more than enough for even the most demanding games, and while productivity and content-creation software may be slower on something like my old Phenom II X4 970, it's still usable. The only reason, and I mean the ONLY reason, I just bought a Ryzen 5 1600X and related components is that the X4 970 has only 4 threads of compute power and sits on terrible green 1333 MHz RAM that is NOT overclockable. This means that while MOST of my daily computer use is still INSTANTANEOUS, such as loading programs, watching Netflix, or transferring files, thanks to my SSD, it falls over in gaming physics. Fire Strike graphics tests run very smoothly at 45 FPS, but as soon as the physics test starts I tank straight down to around 10 FPS. CPU-intensive games like Kerbal Space Program and From the Depths simply aren't capable of running at decent FPS no matter how low I set the graphical settings. That, and I bought every single component used, so I managed to get a Ryzen 5 1600X system with 16GB of DDR4-3000 RAM, an R9 390X, a Samsung 960 EVO 250GB NVMe boot drive, and a Samsung 850 EVO 500GB 2.5" SSD, while reusing my PSU and case, for about $600. Now THAT is what makes this purchase justifiable.

Say it's the year 2020. Zen 3 is out and you decide to pick up the new (just guessing on the specs here) Threadripper 4950X with 20 cores and 40 threads that can reach 6GHz with some overclocking. How much of that processing power are you EVER going to use? When do you think that kind of compute performance will become obsolete and REQUIRE replacement? Software is a good two decades behind fully utilizing high-end hardware as things currently sit, and that's not even talking about something like Threadripper. It's likely to take two decades to fully utilize the i7-8700K in just about ANY software that isn't related to content creation or rendering!
If you bought a CPU as powerful as the Threadripper 4950X (as I specified), then there's a very good chance that, as long as the CPU physically survives and doesn't fail, you wouldn't need to replace it at ANY point in the remainder of your lifetime.

EDIT/UPDATE: Let's examine this issue from another perspective. A lot of comments suggest that carbon nanotubes or quantum computing could be the solution, and that technology will not stop but continue on. Keep in mind I never said technology will stop; what I am suggesting is that we, as consumers, will not find these new technologies to be justifiable purchases, at least not at first. Assuming these new technologies are extremely expensive in their first few years (which is likely, as every new technology is expensive when first introduced), we can expect most PC consumers, including enthusiasts, to have no reason to purchase them until they improve, mature, and come down in cost sometime in the future.

Take a look at every single "big leap forward" in mankind's history. When the microchip was first invented, it took many years of development and research before everyday people like us could get an affordable computer in our homes. When the automobile was first invented, it took many years before everyday people could afford one. It took 20 years of testing space rockets before we actually put an astronaut on the moon. And that is exactly the point: when a REALLY big game-changing technology is invented, it usually takes at least a decade, if not longer, of research, testing, maturing, and manufacturing efficiency before the vast majority of the consumer market ever gets a chance of actually owning that technology. History has proven itself to repeat; we all know this.

Intel JUST invested $5 billion USD in 10nm manufacturing. This suggests that more than likely they EITHER aren't even researching what to do after transistors reach their physical limit, or their R&D on the subject is very limited at this time, which means that come the 2020s it probably won't be ready, and we will be forced to wait it out. So what we are talking about is a strong possibility of another one of mankind's "big leaps" as far as computing is concerned, probably in the early 2020s, which is only a few years from now. The "let's just add more transistors and create physically bigger and more power-hungry CPUs" idea is NOT forward-thinking. It is more relatable to Jeremy Clarkson's take on automobile performance on Top Gear UK / The Grand Tour: "More Power!!!" With companies as advanced as CPU manufacturers like Intel and AMD, I would guess that neither of them would be okay with that approach. The motto of the world of computing always has been and always will be "Smaller, Faster, Smarter."

There is more evidence of the approaching end that we can observe for ourselves. Let's take a look at console gaming. Why do you suppose the Xbox One and PS4 are still around? Why did Microsoft rename the One the "X" and simply fit the console with a few more Jaguar cores and AMD graphics compute units, rather than create an entirely new platform as they have in the past? Simply because the technology has NOWHERE TO GO; it's over!
I believe we may see one more new generation of console from each manufacturer before the world of console gaming comes to a halt, just as PC technology is supposed to.

Now let's take a look at automobiles. High-end cars today use heads-up displays that are basically high-definition televisions. Then you have insane technologies such as Range Rover's Terrain Response system, which can detect whether you are on asphalt, mud, snow, or sand and adjust power output to each individual wheel to keep you going pretty much no matter what. You can also actively adjust your ride height and suspension stiffness in a lot of cars today. There isn't much more we can do with a machine that uses four round pieces of rubber to move around. So what's next, flying cars? Well, yeah, they already exist and have existed for about a decade or so. But there are a number of MASSIVE problems with this technology that make me think it will NEVER, EVER become a reality for 99.9% of people. Firstly, it's EXTREMELY EXPENSIVE. It uses pivoting jet engines just like a real military VTOL aircraft does; do you have any idea how much those cost to buy, or to run? Let's just not even talk about it. Plus, do you really think your federal government is going to let you just fly around wherever you want? Ever heard of 9/11? Now imagine millions of idiot drivers behind the wheel of flying machines. Yeah, 9/11 would look like a holiday in comparison. So NO, it will NOT EVER HAPPEN. On top of this, Google's self-driving cars already work. They were successfully tested in Arizona in 2017 over a few thousand test runs, with no accidents AT ALL; they are now a reality, and we should be seeing them on the streets by 2020.

Now let's look at current displays. Displays have already hit their "Moore's Law wall"! A scientific test revealed that even the healthiest human eye cannot detect the difference between 1440p and 4K displays from more than a few inches away from the screen. So unless you like to sit so close to your monitor or TV that you can't even see half the screen, and like to burn out your retinas, 4K is simply a marketing gimmick with no real benefit whatsoever. Some of you may know that I actually own a Samsung MU6500 curved 4K Smart TV that I use as my monitor, and you may be wondering why. Well, my previous TV was a seven-year-old, crappy, low-quality, cheap Sanyo 1080p flat screen. The image quality was terrible, the colors were abysmal, and it was so old that I figured it would die soon, so I sold it before it kicked the bucket on me. Plus, I bought the Samsung on Black Friday 2017, so $500 for a $1000 TV? Sure, sign me right up. This display is also WAY better in terms of gaming capability. When you select "Game Console" as the source, it disables "Motion Rate" interpolation and most of the graphics-processing options, and even lowers input lag. I honestly can't tell the difference between this TV and any high-refresh gaming monitor I have ever played on. It's THAT fast and smooth; way to go, Samsung! Anyway, here is the article that explains the scientific test: https://www.nbcnews.com/technology/enough-pixels-already-tvs-tablets-phones-surpass-limits-human-vision-2d11691618

I do understand that virtual reality still needs to increase in pixel count and has a while to go before its technology is maxed out. However I, like many other gamers, prefer a display over VR, and find VR to be an expense that simply isn't worth the money.
Firstly, how do you go from playing a high-intensity, fast-paced FPS to heading into VR, where the aiming is so difficult that you suddenly take forever to aim and fire your virtual weapon? Personally, any game that uses the VR motion controllers is pretty much worthless to me. I do agree, however, that some games look tempting in VR, games like Elite Dangerous or Forza Motorsport: since you are in a vehicle you do not need the stupid motion controllers, and being inside the vehicle gives you a whole new level of immersion. But regardless, as discussed, GPUs are facing the same issues as CPUs, so good luck with 16K.

So here we are: all evidence points to the end of technology advancement as we know it, and it's expected very soon. What are we to do? If you do the Moore's Law Google search, you will find that at the end of most articles they describe how quantum computing will take over the known world of computers, but if you have seen Linus's video where he tours a quantum-computing headquarters, that technology is far from being available to the masses. So what do we do until then? Honestly? Nothing. Build a badass $3000+ PC around 2020 using a 7nm CPU and make it last you the next 20 years as technology comes to a standstill. Just replace components as they fail and deal with what you have. It's what the rest of us are going to be doing anyway, so it's not like you will be running an outdated or slow PC. I know I plan to build my "FINAL PC EVER" when Zen 3 hits the shelves.

So what do you guys think about all of this? Do you agree? Disagree? What do you think is next?
183 replies · Tagged with: moores law, 7nm (and 1 more)
-
I have been planning on upgrading my currently VERY outdated PC (see my signature for more info) once the Ryzen 2 refresh CPUs hit the shelves in late April to early May 2018. However, while researching the Ryzen 2 CPUs and what to expect, I came across a very interesting article that may prompt me to simply hold off and deal with my older hardware until 2019 or even 2020. The article: https://www.pcworld.com/article/3246211/computers/amd-reveals-ryzen-2-threadripper-2-7nm-navi-and-more-in-ces-blockbuster.html

There is one important point I want to make clear from the start: the 2000-series AMD CPUs coming in April 2018 are NOT Zen 2 architecture CPUs, but rather Zen+, a refresh of the Zen 1 CPUs such as the Ryzen 7 1800X. The official Zen 2 architecture, according to AMD, is not due to release until NEXT year, 2019. The article above suggests that the Zen 3 architecture of 2020 will be the one to implement 7nm manufacturing, but some sites rumor the 7nm process will come to Zen 2 in 2019.

So why does this information make me want to wait for Zen 2 or Zen 3? The answer is simple: Moore's Law. For those of you who are not aware, Moore's Law is the observation that transistor counts double roughly every two years; the worry is that one day the manufacturing process will get so small that consumer-grade CPUs and other electronics will no longer advance. That wall is expected once 7nm-manufactured CPUs hit the market, because manufacturing anything smaller than 7nm transistors would require such insanely expensive technologies that 99% of consumers would find the cost too high and simply keep their older hardware. For example, the 5nm process is expected to LITERALLY require a freaking particle accelerator to create transistors that small. This would translate to EXTREMELY HIGH costs for CPUs, GPUs, and other hardware based on the 5nm process. If you are complaining about $1000+ GTX 1080 Ti GPUs due to cryptocurrency today, you probably won't believe your eyes when you see just how expensive 5nm technology turns out to be.

Pricing on these future technologies can only be guessed at, as there aren't even rumors yet, but here's my guess: Intel Celeron or Pentium-grade CPUs costing more than $400, with higher-end i7 and Ryzen 7 CPUs costing more than $1200. As a reference, imagine an Intel i3-8350K that costs around $600, or an i9-7900X that costs around $2500. Now imagine a Ryzen 5 2400G that costs around $600, or a Ryzen Threadripper 1950X that costs about $2500. At these prices, just about anyone would be reluctant to upgrade at all. This is the point: it's not that technology will stop advancing, of course not, but rather that it becomes so expensive that MOST consumers basically stop upgrading their desktop and laptop computers. This would also translate into other areas of life stopping their advancement a few years later. Higher-end cars already have wide-screen HD TVs as their dashboards and can do things like actively raise or lower your ride height depending on your needs; plus, Google successfully tested its self-driving car last year.
There was also a recent scientific test showing that the human eye is unable to see the difference between 1440p and 4K resolutions from anything more than a few inches away from the screen, meaning that purchasing anything higher than a 1440p display is technically just a waste of money (a quick back-of-envelope check on viewing distance is sketched below).

Another piece of evidence that points to Moore's Law being on our doorstep is that Intel recently struck a deal in which Intel supplies $5 billion USD, and federal governments supply another $5 billion USD, in research and development funding to GlobalFoundries for the 10nm manufacturing process, because GlobalFoundries themselves admitted to "having issues" manufacturing 10nm transistors. This will probably translate into 10nm-based CPUs seeing a substantial price increase over previously "normal" generation-to-generation increases. The point is, consumer-grade electronics are getting to the point where soon there will be nowhere left to go, or at least nowhere to advance that is a worthwhile investment for the average consumer.

So why would I want to hold off on building my new PC? Once again, the answer is simple: value. If I hold off until 10nm or 7nm CPUs are available and then build around that technology, I could, theoretically, build a PC that remains relevant, up to date, and able to keep pace with the VAST MAJORITY of the PC enthusiast community for 10, 20, or even 30 years. For those of you concerned with price-to-performance, this will be your time to shine. I would personally recommend you dish out the extra cash and build a higher-end Ryzen 7 / Intel Core i7 or i9 build in 2019-2020 to get the absolute maximum future-proofing while keeping costs RELATIVELY acceptable (the $2000-$3000 range). This build should also feature other high-end components, such as NVMe M.2 SSD(s) and 32GB+ of RAM, to ensure that upcoming software runs flawlessly.

Of course, all of this is pure speculation, but it is widely accepted throughout the tech community that the 7nm manufacturing process is the most likely candidate for the official "Moore's Law wall" or "barrier" that marks the end of consumer upgrades. What are your thoughts regarding Moore's Law and the 7nm manufacturing process? I am happy, or at least tolerant, of my current rig's performance, so should I wait for Zen 2/3? I think this discussion could get very interesting...
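On that viewing-distance point, here is a minimal sketch of the standard angular-resolution arithmetic, assuming roughly one arcminute of visual acuity for a healthy eye; the 27-inch screen size is an illustrative example of mine, not from the test the post cites:

```python
import math

# A viewer with ~1 arcminute of acuity stops resolving individual pixels
# once the pixel pitch subtends less than one arcminute.
ARCMINUTE = math.radians(1 / 60)

def blend_distance_m(diagonal_in: float, width_px: int, height_px: int) -> float:
    """Viewing distance (meters) beyond which single pixels are unresolvable."""
    diagonal_px = math.hypot(width_px, height_px)
    pixel_pitch_m = (diagonal_in * 0.0254) / diagonal_px
    return pixel_pitch_m / math.tan(ARCMINUTE)

for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    d = blend_distance_m(27, w, h)  # e.g. a 27-inch monitor
    print(f'27" {name}: pixels blend beyond ~{d:.2f} m')
```

Whether 4K buys you anything then comes down to how your actual viewing distance compares to these figures for your particular screen size.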
-
While Nvidia may feel like they are fixing the problem of Moore's law with Volta, they are essentially just adding another processor. If it really were the solution to quantum tunneling, I feel like it probably would have come from a team of physicists rather than a group of Nvidia employees. What is your opinion on 'GPU extension'?
-
This is quite a nice thing to hear, so with any luck Intel will return to the tick-tock cycle. I don't know about you guys, but there's no way I would get a new CPU just for efficiency boosts; if they're going to increase their spending like this, I want to see a serious performance boost that will make me want to upgrade from my 3770K. http://www.pcworld.com/article/3040381/computers/intel-eyes-a-path-to-get-back-in-line-with-moores-law.html
11 replies · Tagged with: moores law, intel (and 4 more)
-
The chip industry is no longer going to treat Gordon Moore's law as the target to aim for.

Moore's law is the observation that the number of transistors in a dense integrated circuit doubles approximately every two years. The observation is named after Gordon E. Moore, the co-founder of Intel and Fairchild Semiconductor, whose 1965 paper described a doubling every year in the number of components per integrated circuit and projected this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years. (Wikipedia)

SOURCE: Ars Technica UK
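As a compact restatement of the two forecasts quoted above (the notation is mine):

```latex
% Transistor count after t years, starting from N_0, with doubling period T:
N(t) = N_0 \cdot 2^{t/T}
% Moore's 1965 forecast: T = 1 year; his 1975 revision: T = 2 years.
```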