
ReLightz

Member
  • Posts

    10
  • Joined

  • Last visited

Reputation Activity

  1. Funny
    ReLightz reacted to TheRandomness in LISTEN UP LINUS   
    -Thread locked. Shitposting of this level has never been encountered before (also begging isn't allowed).
  2. Informative
    ReLightz reacted to Felix-pod2 in Headset advice   
    I would personally go for the Cloud. I got the Cloud X's myself and they really are good; recommended if you can get them. As for 7.1, I'm not that into it, and I would always take better hardware-based improvements in audio over virtual or software ones, but I heard it's great for gaming, not so much for music.
  3. Funny
    ReLightz reacted to Syntaxvgm in [Update] Security flaws discovered in AMD zen processors : AMD's meltdown?   
    Shame your post isn't. 
  4. Agree
    ReLightz reacted to Streetguru in Upgrade question   
    It would be a complete waste of money to upgrade to an i5 7600K since it's only 4c/4t, whereas you could get a 6-core Ryzen chip for the same price, or even a 4c/8t Ryzen chip because the motherboard is so cheap, unless you're getting the i5 for like $100.

    If you don't already have the 9590, for sure don't buy it.
     
    PCPartPicker part list: https://pcpartpicker.com/list/jrpzRG
    Price breakdown by merchant: https://pcpartpicker.com/list/jrpzRG/by_merchant/
    CPU: AMD - Ryzen 5 1400 3.2GHz Quad-Core Processor  ($149.99 @ Amazon)
    Motherboard: ASRock - AB350M Pro4 Micro ATX AM4 Motherboard  ($74.49 @ SuperBiiz)
    Total: $224.48
    Prices include shipping, taxes, and discounts when available
    Generated by PCPartPicker 2018-03-12 03:19 EDT-0400
  5. Informative
    ReLightz reacted to WallacEngineering in The End of CPU Advancement on our Doorstep (Moore's Law and the 7nm Barrier) Discussion   
    I would like to start an interesting discussion regarding traditional PCs and CPU performance enhancements and why I feel the final major CPU releases are right around the corner. Now to start, I should explain that while I have quite a few resources to back up my theory, this is all still pure speculation, and the sources I will provide are based on speculation as well.
     
    So what is my theory exactly? That around 2020-2022, people will stop buying new PCs, and from there, like a domino effect, technology advancements for CONSUMERS will come to an end (let's make it clear that I am not talking about military technology or technology as a whole; I am convinced those areas will continue growing).
     
    So let's start with why I even think this is possible. "Moore's Law" is an observation first recorded by Gordon Moore, whose 1965 paper described that the number of transistors in an integrated circuit would double every year for at least a decade. He was slightly off about the timeline but correct otherwise: the transistor count in our CPUs has been doubling roughly every 15-18 months. That increase in transistor count is what allows CPUs to gain more cores and faster clock speeds with each generation. The issue is that there is a physical limit beyond which packing more transistors into a CPU becomes impossible. According to most computer scientists, enthusiasts like you and me, and the tech community in general, those physical limits will begin to show up after 7nm-based CPUs hit the shelves. If you simply google "Moore's Law" you will come across a wide variety of articles explaining the definition and then discussing the physical limits of Moore's Law and when they predict we will no longer be able to advance traditional CPUs. Most of these articles come to the same conclusion: "Moore's Law will reach its physical limit sometime in the 2020s." Take time to read the article below:
     

     
    So INTEL THEMSELVES said in 2016 that transistors can only continue to shrink for about five more years. Guess what AMD has already announced for 2020-2021? Zen 3, their first 7nm+ CPU. Put two and two together and you get... four. It's as simple as that.
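     
    To put the doubling math in perspective, here is a back-of-the-envelope Python sketch (my own illustration, not from any of the sources above; the starting count and the 18-month cadence are assumptions) showing how quickly transistor counts would balloon if the doubling kept running:
     
    # Illustrative only: project transistor counts under an assumed doubling period.
    start_year = 2018
    start_transistors = 19_200_000_000   # assumed starting point, roughly a big 2018 server chip
    doubling_months = 18                 # the 15-18 month cadence mentioned above

    for year in range(start_year, 2031, 2):
        doublings = (year - start_year) * 12 / doubling_months
        count = start_transistors * 2 ** doublings
        print(f"{year}: ~{count:.2e} transistors")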
     
    Now the reason 7nm seems to be the physical limit is that beyond 7nm, the transistors would produce more heat than the power they output, making the idea of 5nm CPUs an oxymoron, or a paradox. I believe there is a small chance that scientists will figure out a way to produce 5nm CPUs, but the technology would be so difficult and expensive to manufacture that none of us consumers would be interested in upgrading from 7nm to 5nm CPUs.
     
    This next source I cannot provide because I can no longer find it. I saw a YouTube video about 18 months ago in which a man who had worked at GlobalFoundries was finally legally allowed to speak about his time at the fab and what difficulties they expected to see in the coming years. He said something about particle accelerator technology being required to create transistors around 7nm or smaller. I know, I don't understand how a particle accelerator could have anything to do with the FinFET manufacturing process either, but that's what he said, and that's why he makes more money than I do lol.
     
    This next part is purely my own speculation, but it concerns the price of 5nm CPUs if we ever get there. If we could purchase 5nm CPUs, they are likely to be so expensive that we would keep our 7nm CPUs instead. You think GPU pricing is bad right now? Try $1000+ entry-level Pentium CPUs, or i9 Extreme CPUs that cost tens of thousands of dollars. Yeah, I don't know about you, but I am staying WELL away from 5nm CPUs.
     
    Now you may be thinking: "Well, why not just increase the physical size of CPUs so more 7nm transistors can fit?" That is technically a great idea, but it has a few issues. One, there needs to be room between the CPU and RAM for cooler mount clearance, so going much bigger than Threadripper isn't really possible, and if we start pushing the RAM slots outward we start changing the ATX form factor standard, which, trust me, isn't going to happen. All cases, power supplies, and other accessories would need to be completely redesigned to fit these new motherboards, and all that work would be done for a technology that will only last a few more years anyway. The largest issue, however, would be heat output. The current i7-8700K is already a monster of a heat producer, and that's just a mainstream CPU. Imagine a CPU with more than double the 8700K's processing power! Heat production would likely be so intense that even the most expensive custom water loops would struggle to cool it, and don't even THINK about tying your GPU(s) into that loop, not gonna happen, especially as GPUs are likely to face the same issues as CPUs.
     
    Another issue that needs to be discussed is whether or not CPU upgrades are even necessary anymore, even TODAY! It is widely accepted that 8 threads of processing power is more than enough for even the most demanding games, and while productivity and content creation software may be slower on something like my old Phenom II X4 970, it's still possible.
     
    The only reason, and I mean the ONLY reason, I just bought a Ryzen 5 1600X and related components is that the X4 970 suffers from having only 4 threads of compute power and sits on terrible green 1333 MHz RAM that is NOT overclockable. This means that while MOST of my daily computer use is still INSTANTANEOUS, such as loading programs, watching Netflix, or transferring files thanks to my SSD, it fails in gaming physics. Firestrike graphics tests run very smoothly at 45 FPS, but as soon as the physics test starts I tank straight down to around 10 FPS. CPU-intensive games like Kerbal Space Program and From the Depths simply aren't capable of running at decent FPS no matter how low I set the graphical settings.
     
    That, and I bought every single component used, so I managed to get a Ryzen 5 1600X system with 16GB of DDR4 3000 MHz RAM, an R9 390X, a Samsung 960 EVO 250GB NVMe boot drive, and a Samsung 850 EVO 500GB 2.5" SSD, while reusing my PSU and case, for about $600. Now THAT is what makes this purchase justifiable.
     
    Say it's the year 2020. Zen 3 is out and you decide to pick up the new (just guessing on the specs here) Threadripper 4950X with 20 cores and 40 threads that can reach 6 GHz with some overclocking. How much of that processing power are you EVER going to use? When do you think that kind of compute performance will become obsolete and REQUIRE replacement? Software is a good two decades behind fully utilizing high-end hardware as things currently sit, and that's not even talking about something like Threadripper. It's likely to take two decades to fully utilize the i7-8700K in just about ANY software that isn't related to content creation or rendering!
     
    If you bought a CPU as powerful as the Threadripper 4950X (as I specified), then there's a very good chance that, as long as the CPU physically survives and doesn't fail, you wouldn't need to replace it at ANY point in the remainder of your lifetime.
     
    EDIT/UPDATE:
    Let's examine this issue from another perspective. A lot of comments suggest that carbon nanotubes or quantum computing could be the solution, and that technology will not stop but will continue on. Well, keep in mind I never said technology will stop; what I am suggesting is that we, as consumers, will not find these new technologies to be justifiable purchases, at least not at first.
     
    Assuming that these new technologies that computer science will try to implement are extremely expensive in their first few years (which is likely, as every new technology is expensive when first introduced), we can basically expect most PC consumers, including enthusiasts, to have no reason to purchase them until they improve, mature, and come down in cost sometime in the future.
     
    Take a look at every single "big leap forward" in mankind's history. When the microchip was first invented, it took many years of development and research before everyday people like us could get an affordable computer in our homes. When the automobile was first invented, it took many years before everyday people could afford one for themselves. It took us 20 years of testing space rockets before we actually put an astronaut on the moon.
     
    And that is exactly the point. When a REALLY big game-changing technology is invented, it usually takes at least a decade, if not longer, of research, testing, maturing, and manufacturing efficiency before the vast majority of the consumer market ever gets a chance to actually own that technology for themselves.
     
    History has proven itself to repeat; we all know this. Intel JUST invested $5 billion USD in 10nm manufacturing. This suggests that more than likely they EITHER aren't even researching what to do after transistors reach their physical limit, or their R&D on the subject is very limited at this time, which means that come the 2020s it probably won't be ready, and we will be forced to wait it out.
     
    So what we are talking about, then, is a strong possibility of another one of mankind's "big leaps" as far as computing is concerned, probably in the early 2020s, which is only a few years from now.
     
    The "Lets just add more transistors and create physically bigger and more power hungry CPUs" idea is NOT "Forward-Thinking" This is more relatable to Jeremy Clarkson from Top Gear UK/ The Grand Tour's take on automobile performance: "More Power!!!" With companies as advanced as CPU manufacturers like Intel and AMD, I would probably guess that neither of them would be okay with using this approach. The motto of world of computing always has been and always will be "Smaller, Faster, Smarter"
     
    There is more evidence of the inevitable end approaching that we can observe for ourselves. Let's take a look at console gaming. Why do you suppose the Xbox One and PS4 are still around? Why did Microsoft rebrand the One as the "X" and simply fit the console with a few more Jaguar cores and AMD graphics compute units, rather than create an entirely new platform as they have in the past? Simply because the technology has NOWHERE TO GO; it's over! I believe we may see one more new generation of console from each manufacturer before the world of console gaming comes to a halt, just as PC technology is expected to.
     
    Now let's take a look at automobiles. High-end cars today use heads-up displays that are basically high-definition televisions. Then you have insane technologies such as Range Rover's Terrain Response system, which can detect whether you are on asphalt, mud, snow, or sand and adjust power output to each individual wheel to keep you going pretty much no matter what. You can also actively adjust your ride height and suspension stiffness in a lot of cars today. There isn't much more we can do with a machine that uses four round pieces of rubber to move around. So what's next, flying cars? Well, yeah, they already exist and have existed for about a decade or so. But there are a number of MASSIVE problems with this technology that make me think it will NEVER, EVER become a reality for 99.9% of people. Firstly, it's EXTREMELY EXPENSIVE. It uses pivoting jet engines just like a real military VTOL aircraft; do you have any idea how much those cost to buy, or to run? Let's just not even talk about it. Plus, do you really think your federal government is going to let you just fly around wherever you want? Ever heard of 9/11? Now imagine millions of idiot drivers behind the wheel of flying machines; 9/11 would look like a holiday in comparison. So NO, it will NOT EVER HAPPEN. On top of that, Google's self-driving cars already work. They were successfully tested in Arizona in 2017 across a few thousand test runs with no accidents AT ALL; they are now a reality and we should be seeing them on the streets by 2020.
     
    Now let's look at current displays. Displays have already hit their "Moore's Law wall"! A scientific test revealed that even the healthiest human eye cannot perceive the difference between 1440p and 4K displays from more than a few inches away from the screen. So unless you like to sit so close to your monitor or TV that you can't even see half the screen and like to burn out your retinas, 4K is simply a marketing gimmick with no real benefit whatsoever. Some of you may know that I actually own a Samsung MU6500 curved 4K smart TV that I use as my monitor, and you may be wondering why. Well, my previous TV was a seven-year-old, crappy, low-quality, cheap Sanyo 1080p flat screen. The image quality was terrible, the colors were abysmal, and it was so old that I figured it would die soon, so I sold it before it kicked the bucket on me. Plus I bought the Samsung on Black Friday 2017, so $500 for a $1000 TV? Sure, sign me right up. This display is also WAY better in terms of gaming capability. When you select "Game Console" as the source, it disables "Motion Rate" interpolation and most of the graphics-processing options, and even lowers input lag. I can honestly say I can't tell the difference between this TV and any high-refresh gaming monitor I have ever played on. It's THAT fast and smooth; way to go, Samsung! Anyways, here is the article that explains the scientific test:
     
    https://www.nbcnews.com/technology/enough-pixels-already-tvs-tablets-phones-surpass-limits-human-vision-2d11691618
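     
    The "can't tell the difference" point comes down to simple angular-resolution arithmetic. Here is a rough Python sketch (my own numbers, not from the article; it assumes roughly 1 arcminute of visual acuity and a 55-inch 16:9 panel) that estimates the distance beyond which individual pixels can no longer be resolved:
     
    import math

    # Illustrative estimate only: assumes ~1 arcminute acuity and a 55-inch 16:9 panel.
    diagonal_in = 55
    acuity_rad = math.radians(1 / 60)      # ~1 arcminute in radians
    width_in = diagonal_in * 16 / math.hypot(16, 9)

    for name, horizontal_px in [("1440p", 2560), ("4K", 3840)]:
        pixel_pitch_in = width_in / horizontal_px
        # Small-angle approximation: distance at which one pixel subtends ~1 arcminute
        distance_in = pixel_pitch_in / acuity_rad
        print(f"{name}: individual pixels blend together beyond ~{distance_in / 12:.1f} ft")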
     
    I do understand that virtual reality still needs to increase in pixel count and has a while to go before its technology is maxed out. However, I, like many other gamers, prefer a regular display over VR, and find VR to be an expense that simply isn't worth the money. Firstly, how do you go from playing a high-intensity, fast-paced FPS to heading into VR, where aiming is so difficult that you suddenly take far longer to aim and fire your virtual weapon? Personally, any game that uses the VR motion controllers is pretty much worthless to me. I do agree, however, that there are some games that look tempting in VR, like Elite Dangerous or Forza Motorsport. Since you are in a vehicle you do not need the stupid motion controllers, and being inside the vehicle gives you a whole new level of immersion. But regardless, as discussed, GPUs are facing the same issues as CPUs, so good luck with 16K.
     
    So here we are then: all evidence points to the end of technology advancement as we know it, and it's expected very soon. What are we to do? Well, if you do the Moore's Law Google search, you will find that at the end of most articles they describe how quantum computing will take over the known world of computers, but if you have seen Linus's video where he tours a quantum computing headquarters, that technology is far from being available to the masses. So what do we do until then? Well, honestly? Nothing. Build a badass $3000+ PC around 2020 using a 7nm CPU and make it last you the next 20 years as technology comes to a standstill. Just replace components as they fail and deal with what you have. It's what the rest of us are going to be doing anyway, so it's not like you will be running an outdated or slow PC. I know I plan to build my "FINAL PC EVER" when Zen 3 hits the shelves.
     
    So what do you guys think about all of this? Do you agree? Disagree? What do you think is next?