Showing results for tags '7nm'.
-
OK, since apparently only Samsung and TSMC are able to make any 7nm to 8nm chips, it feels like the market is in a situation where the two cannot manufacture enough of them. It's also maybe not only due to Covid but also due to supply & demand. If so, why don't Samsung or TSMC license the 8nm and 7nm recipes to someone like GlobalFoundries? Samsung even did that in the past with 14nm, right?
-
Summary: from the Sapphire Nation Podcast. Quotes. My thoughts: I am sorry to say it, but this may last all year, if not longer... UwU. Sources
-
Summary. Quotes. My thoughts: the GPU market has seen a lot of action lately; the price drop, the Nvidia 4000 series launching soon, and other players trying to bring some competition are kind of a good thing. Though they claim big numbers for AI workloads, hands-on testing might show different results, especially in gaming or other tasks. Still, these chips are a way for China to move independently from the West and build their own things, growing their self-reliance. It's not just hardware, either: they also have BIRENSUPA, a software development platform similar to Nvidia's CUDA family. So it is a fresh GPU, and it didn't get much attention at Hot Chips 34, so I had to share it here. Sources: https://wccftech.com/china-most-powerful-gpu-birentech-br100-77-billion-transistors-7nm-faster-ai-than-nvidia-a100/ https://www.nextplatform.com/2022/08/25/china-launches-the-inevitable-indigenous-gpu/ https://www.hpcwire.com/2022/08/22/chinese-startup-biren-detail-br100-gpu/ https://hc34.hotchips.org/ https://www.birentech.com/BR100.html
-
A review was published "by accident" by a Polish site called Benchmark.pl. The review was conducted with early drivers, and apparently these drivers also don't support overclocking; therefore, the results may not correspond to the final performance. Nonetheless, behind the spoiler tags lie the results: benchmarks in Shadow of the Tomb Raider, Far Cry 5, Wolfenstein II: The New Colossus, 3DMark Time Spy and 3DMark Fire Strike. The last graph is full-system power consumption with an i9-9900K, and power consumption looks relatively decent compared to the RTX 2070. Lastly, the RX 5700 looks like it will be a great price-to-performance offering. Source: https://videocardz.com/newz/amd-radeon-rx-5700-xt-and-rx-5700-review-leaks-out
-
This is clearly not a subtle way of pointing out that while AMD might have been the first to create a 7nm graphics card, it can't really compete with Nvidia's 12nm Turing. In fairness, even the most ardent Team Red fan would find it difficult to argue with that! These statements are, however, pretty hefty shots fired, and with AMD set to release their new graphics card range sometime this summer, let's hope he doesn't end up eating those words! Source: https://www.eteknix.com/nvidia-fire-shots-at-amds-7nm-graphics-technology/ Intel's entrance into the dGPU space can't come soon enough (2020). It's one thing smacking around AMD with your product stack's performance alone. It's another thing entirely, being egotistical about it and getting too comfortable or content. Either way, I'm wondering if this overconfidence is a bit of foreshadowing of what to expect from another future architecture from NVIDIA, like the supposed Ampere.
- 199 replies
-
Hi, in the Radeon VII video Linus mentioned that the FP64 performance of the Radeon VII will be crippled. Is this based on anything (was any info on this revealed), or is it just a "reasonable" assumption?
-
Do we have any idea when these will be out? Or at least when we can get more info about them?
-
Summary: Is Intel CPU development in even more trouble than we thought...?! An ex-employee reports as much... My thoughts: As mentioned many times before, we benefit the most from (at least) two strong CPU companies leading the development, not from one winning and one in trouble... "RetiredEngineer" (Twitter handle) reports serious trouble in development going back several CPU generations and points to problems with the working environment, the wrong people in charge of development, and, worst of all in my opinion, the right people leaving Intel as the causes of what seems to be an even worse situation than Intel is trying to portray. Jim Keller's time at Intel is described as a "breath of fresh air", but he did not stay long enough to cause a positive, long-lasting change for the better. Can we get a Linus Tech Tips probe into Intel/AMD CPU development?!? Photos. Sources: https://di1it.cz/clanek/byvaly-pracovnik-intelu-zverejnil-co-stoji-za-problemy-spolecnosti
-
I'm thinking of buying a desktop gaming PC with Whiskey Lake and RTX coming up. But then again, I'm reading about Cannon Lake aiming for 10nm and 7nm GPUs being up and coming. Do people have any idea how big the leap in performance will be, just based on the change in architecture? Any reason to wait for the new standard? I'm kind of torn, because I could use a desktop PC (I only have a low-performing gaming laptop) and there is always something new coming, but it seems kind of sad to buy new gear right before a big leap (if that is what it is). Any thoughts?
-
SOURCE: https://wccftech.com/amd-7nm-vega-gpu-zen-2-cpu-mass-production-tsmc/ TSMC has just confirmed that they've begun mass production on their 7nm lithography, confirming that AMD will be contracting them to produce their Vega+ GPUs as part of the Radeon Instinct and Radeon Pro lineups. These GPUs are expected to hit the market during H2 2018. AMD is also expected to use TSMC for their upcoming 7nm Zen 2 processors. TSMC's production capacity on their 7nm lithography is expected to increase three-fold by the end of the year; this is what WCCFTech says about it: "This means that TSMC will be able to deliver a mass total of 1.1 million units by the next year, a three-fold increase over the current year. The AMD 7nm OEM orders are expected to begin production in Q4 2018, and TSMC expects 7nm to account for up to 20% of their Q4 2018 revenue. As for other nodes, TSMC will be beginning mass production on 7nm+ using EUV technology next year, which will further reduce power consumption by 10% compared to 7nm."
-
Today, GlobalFoundries announced that it is immediately halting all 7nm development, will not move forward with any 7nm node production, and will instead focus on specialized processes. Additionally, AMD has announced that it's moving all of its 7nm production to TSMC in the wake of this announcement by GF. I think it's interesting to see them make such a rapid turn in such a short time. Whether this will mean better processors or other products, though, is still yet to be seen (and probably won't be for a couple of years). I hope AMD can continue pushing forward with their 7nm products at TSMC without much issue. Article: https://www.anandtech.com/show/13277/globalfoundries-stops-all-7nm-development Edit: Re-read the article, and it looks like AMD designed their 7nm parts for TSMC's process anyway, so this shouldn't have much, if any, impact on their 7nm plans (hopefully)
- 78 replies
-
Tagged with: globalfoundries, tsmc (and 2 more)
-
Not really news at this point, but it seems like a good "don't worry, we've got you" sort of thing, and if Lisa Su is actually the one who said this, it's a confirmation of something most of us already knew. Now, before anyone starts: due to limited web access, WCCFTECH is my only source at the time of writing. Tom's Hardware mentions it in one of their earlier posts as well (link below if you're interested), but this is the latest article I have access to right now and the only one actually claiming a confirmation of this. Now, I for one will choose to give this article the benefit of the doubt, especially after considering this little tidbit from the AMD site itself: so if Vega is expected at 7nm, then the next gen has to be at 7nm as well. Agree or disagree at your leisure, but I am excited to see what they come up with for Navi. Sources used: https://wccftech.com/amd-radeon-rx-7nm-graphics-cards-gamers-2019/ https://www.amd.com/en-us/press-releases/Pages/pushing-boundaries-for-2018jun05.aspx https://www.tomshardware.com/news/amd-7nm-gpu-vega-gaming,37228.html
-
https://www.anandtech.com/show/12677/tsmc-kicks-off-volume-production-of-7nm-chips News is actually from yesterday; it surprised me that it hadn't made it onto the forum yet. This is definitely one of those "big deal" announcements. With all of the trouble the foundries have been dealing with over the last 10 years of nodes, this is going to be a fairly sizable move for the entire industry. By TSMC's own numbers, the move from the 16FF+ node (Nvidia's Pascal is on this node) to 7FF is over a 70% area reduction. (The same number of transistors fits in 30% of the previous space. That's a massive improvement.) There are also a few other important areas this affects. We've had those PS5 rumors running around, and both the Xbox One & PS4 are manufactured by TSMC using AMD designs. If the 7nm node is in high-volume production, then a PS5 could actually launch this year. The other really interesting tidbit is the part I bolded about "server CPU". Is this going to be Qualcomm CPUs for ARM servers, or are we going to see AMD moving Zen 2 production to TSMC? This level of area shrink & performance uplift means the next 2 years of products should see a large jump in performance. The last important possibility that comes to mind is that Nvidia will be skipping their 12nm node (customized for them for Volta) and moving to 7nm for their next GPUs, which would launch later in the year.
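To put that area number in perspective, here's a quick back-of-envelope sketch. (The 70% figure is TSMC's own claim from the article; the node-name comparison at the end is purely my own illustration, since node names stopped tracking literal feature sizes long ago.)

```python
# What TSMC's claimed "70% area reduction" from 16FF+ to 7FF implies.
# The 0.70 figure is from the article; the rest is illustrative math.

area_reduction = 0.70
area_ratio = 1 - area_reduction      # same design now uses 30% of the area
density_gain = 1 / area_ratio        # relative transistors per mm^2

print(f"Same chip occupies {area_ratio:.0%} of its 16FF+ footprint")
print(f"=> roughly {density_gain:.1f}x transistor density")

# For contrast, naive scaling from the node names alone (node names have
# not tracked literal feature sizes for years, so treat this as a bound):
ideal_gain = (16 / 7) ** 2
print(f"Naive 16nm -> 7nm scaling would suggest {ideal_gain:.1f}x")
```

So a full-node jump like this is worth roughly 3x density before any frequency or power gains even enter the picture.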
-
I would like to start an interesting discussion regarding traditional PCs and CPU performance enhancements, and why I feel the final major CPU releases are right around the corner. Now, to start, I should explain that while I have quite a few resources to back up my theory, this is all still pure speculation, and the sources I will provide are based on speculation as well. So what is my theory exactly? That around 2020-2022, people will stop buying new PCs, and from there, like a domino effect, technology advancements for CONSUMERS will come to an end (let's make it clear that I am not talking about military technology or technology as a whole; I am convinced those areas will continue growing).

So let's start off with why I even think this is possible. "Moore's Law" is an observation first recorded by Gordon Moore, whose 1965 paper described that the number of transistors in an integrated circuit would double every year for at least a decade. Now, he was slightly incorrect about the timeline but correct otherwise: the number of transistors has been doubling about every 15-18 months in our CPUs (there's a quick back-of-envelope sketch below). The increase in transistors is what allows our CPUs to get more cores and faster clock speeds with each generation. The issue is that there is a physical limit where increasing the number of transistors within a CPU becomes a physical impossibility. Now, according to most computer scientists, computer enthusiasts such as you and I, and the tech community in general, the physical impossibilities will begin to occur after 7nm-based CPUs hit the shelves. If you simply google "Moore's Law" you will come across a wide variety of articles explaining the definition and then discussing the physical limits of Moore's Law and when they predict we will no longer be able to advance traditional CPUs. Most of these articles come to the same conclusion: "Moore's Law will reach its physical limitation sometime in the 2020s". Take time to read the article below: so INTEL THEMSELVES said in 2016 that transistors can only continue to shrink for about 5 more years. Guess what AMD has already announced for 2020-2021? Zen 3, their first 7nm+ CPU. Put two and two together and you get... four. It's as simple as that.

Now, the reason 7nm seems to be the physical limit is that beyond 7nm, the transistors would produce more heat than useful output power, making the idea of 5nm CPUs an oxymoron, or paradox. Now, I believe there is a small chance that scientists will figure out a way to produce 5nm CPUs, but the technology would be extremely difficult and expensive to manufacture, so much so that none of us consumers would be interested in upgrading from 7nm to 5nm CPUs.

This next source I cannot provide, because I can no longer find it. I saw a YouTube video about 18 months ago where a man who actually worked at GlobalFoundries was now legally allowed to speak about his time at the fab and what difficulties they were expecting to see in the coming years. He said something about particle accelerator technology being required to create transistors around 7nm or smaller. I know, I don't understand how a particle accelerator could have anything to do with the FinFET manufacturing process either, but that's what he said, and that's why he makes more money than I do lol.

This next part is purely my own speculation, but it's regarding the price of 5nm CPUs if we get there. If we could purchase 5nm CPUs, they are likely to be so expensive that we will keep our 7nm CPUs instead.
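Side note: the doubling cadence is easy to sanity-check yourself. A minimal sketch (the Intel 4004's roughly 2,300 transistors is the only historical input; the 18- and 24-month periods are just the two cadences people commonly quote, not anything from a roadmap):

```python
# Project transistor counts under Moore's-Law-style doubling.
# Historical anchor: the Intel 4004 (1971) had ~2,300 transistors.

def transistors(start_count, start_year, year, months_per_doubling):
    """Compound doubling: count * 2^(elapsed months / doubling period)."""
    elapsed_months = (year - start_year) * 12
    return start_count * 2 ** (elapsed_months / months_per_doubling)

for period_months in (18, 24):
    projected = transistors(2_300, 1971, 2020, period_months)
    print(f"{period_months}-month doubling -> ~{projected:.1e} transistors in 2020")

# Real 2020-era CPUs sit around 10^10 transistors, so the 24-month
# cadence from Moore's revised 1975 prediction fits far better than 18.
```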
You think GPU pricing is bad right now? Try $1000+ entry-level Pentium CPUs, or i9 Extreme CPUs that cost tens of thousands of dollars. Yeah, I don't know about you, but I am staying WELL away from 5nm CPUs.

Now you may be thinking: "Well, why not just increase the physical size of CPUs so more 7nm transistors can fit?" That is technically a great idea, but it's got a few issues. One, there needs to be room between the CPU and RAM for cooler mount clearance, so going much bigger than Threadripper isn't really possible, and if we start pushing the RAM slots out, then we start changing the ATX form factor standard, which, trust me, isn't going to happen. It would mean all cases, power supplies, and other accessories would need to be completely redesigned to fit the new motherboards, and all this work would be done for a technology that will only last a few more years anyway. The largest issue, however, would be heat output. The current i7-8700K is already a monster of a heat producer, and that's just a mainstream CPU. Imagine a CPU with more than double the 8700K's processing power! Heat production would likely be so intense that even the most expensive custom water loop solutions would struggle to cool it, and don't even THINK about tying your GPU(s) into that loop; not gonna happen, especially as GPUs are likely to face the same issues as CPUs.

Another issue that needs to be discussed is whether or not CPU upgrades are even necessary anymore, even TODAY! It is widely accepted that 8 threads of processing power is more than enough for even the most demanding games, and while productivity and content creation software may be slower on something like my old Phenom II X4 970, it's still usable. The only reason, and I mean the ONLY reason, I just bought a Ryzen 5 1600X and related components is that the X4 970 suffers from only having 4 threads of compute power and sits on terrible green 1333 MHz RAM that is NOT overclockable. This means that while MOST of my daily computer use is still INSTANTANEOUS, such as loading programs, watching Netflix, or transferring files, thanks to my SSD, it fails in gaming physics. Fire Strike graphics tests run very smoothly at 45 FPS, but as soon as the physics test starts, I tank straight down to around 10 FPS. CPU-intensive games like Kerbal Space Program and From the Depths simply aren't capable of running at decent FPS no matter how low I set the graphical settings. That, and I bought every single component used, so I managed to get a Ryzen 5 1600X system with 16GB of DDR4 3000 MHz RAM, an R9 390X, a Samsung 960 EVO 250GB NVMe boot drive, and a Samsung 850 EVO 500GB 2.5" SSD, while reusing my PSU and case, for about $600. Now THAT is what makes this purchase justifiable.

Say it's the year 2020. Zen 3 is out, and you decide to pick up the new (just guessing on the specs here) Threadripper 4950X with 20 cores and 40 threads that can reach 6GHz with some overclocking. How much of that processing power are you EVER going to use? When do you think that kind of compute performance will become obsolete and REQUIRE replacement? Software is a good two decades behind fully utilizing high-end hardware as things currently sit, and that's not even talking about something like Threadripper. It's likely to take two decades to fully utilize the i7-8700K in just about ANY software that isn't related to content creation or rendering!
If you bought a CPU as powerful as the Threadripper 4950X (as I specified), then there's a very good chance that, as long as the CPU physically survives and doesn't fail, you wouldn't need to replace it at ANY point in the remainder of your lifetime.

EDIT/UPDATE: Let's examine this issue from another perspective. A lot of comments suggest that carbon nanotube or quantum computing could be the solution, and that technology will not stop but continue on. Well, keep in mind I never said technology will stop; what I am suggesting is that we, as consumers, will not find these new technologies to be justifiable purchases, at least not at first. Assuming that these new technologies that computer science will try to implement are extremely expensive in their first few years (which is likely, as every new technology is expensive when it is first implemented), we can basically expect most PC consumers, including enthusiasts, to no longer have any reason to purchase these new technologies until they improve, mature, and come down in cost sometime in the future.

Take a look at every single "big leap forward" in mankind's history. When the microchip was first invented, it took many years of development and research before everyday people like us were able to get an affordable computer in our homes. When the automobile was first invented, it took many years before everyday people could afford one for themselves. It took us 20 years of testing space rockets before we actually put an astronaut on the moon. And that is exactly the point: when a REALLY big game-changing technology is invented, it usually takes at least a decade, if not longer, of research, testing, maturing, and manufacturing efficiency before the vast majority of the consumer market ever gets a chance of actually owning that technology for themselves. History has proven itself to repeat; we all know this.

Intel JUST invested $5 billion USD in 10nm manufacturing. This suggests that more than likely they EITHER aren't even researching what to do after transistors reach their physical limit, or their R&D on the subject is very limited at this time, which means that come the 2020s, it probably won't be ready, and we will be forced to wait it out. So what we are talking about, then, is a strong possibility of another one of mankind's "big leaps" as far as computing is concerned, probably in the early 2020s, which is only a few years from now.

The "let's just add more transistors and create physically bigger and more power-hungry CPUs" idea is NOT "forward-thinking". It is more relatable to Jeremy Clarkson's take on automobile performance from Top Gear UK / The Grand Tour: "More Power!!!" With companies as advanced as CPU manufacturers like Intel and AMD, I would guess that neither of them would be okay with that approach. The motto of the world of computing always has been and always will be "smaller, faster, smarter".

There is more evidence of the approaching, inevitable end that we can observe for ourselves. Let's take a look at console gaming. Why do you suppose the Xbox One and PS4 are still around? Why did Microsoft rename the One the "X" and simply fit the console with a few more Jaguar cores and AMD graphics compute units, rather than create an entirely new platform as they have in the past? Well, this is simply because the technology has NOWHERE TO GO; it's over!
I believe we may see one more new generation of console from each manufacturer before the world of console gaming comes to a halt, just as PC technology is supposed to.

Now let's take a look at automobiles. High-end cars today use heads-up displays that are basically high-definition televisions. Then you have insane technologies such as Range Rover's Terrain Response system, which can detect whether you are on asphalt, mud, snow, or sand and adjust power output to each individual wheel to keep you going pretty much no matter what. You can also actively adjust your ride height and suspension stiffness in a lot of cars today. There isn't much more we can do with a machine that uses four round pieces of rubber to move around. So what's next, flying cars? Well, yeah, they already exist and have existed for about a decade or so. But there are a number of MASSIVE problems with this technology that make me think it will NEVER, EVER become a reality for 99.9% of people. Firstly, it's EXTREMELY EXPENSIVE. It uses pivoting jet engines just like a real military VTOL aircraft does; do you have any idea how much those cost to buy, or to run? Let's just not even talk about it. Plus, do you really think your federal government is going to let you just fly around wherever you want? Ever heard of 9/11? Well, imagine millions of idiot drivers behind the wheel of flying machines. Yeah, 9/11 would look like a holiday in comparison. So NO, it will NOT EVER HAPPEN. On top of this, Google's self-driving cars already work. They were successfully tested in Arizona in 2017 over a few thousand test runs, with no accidents AT ALL; they are now a reality, and we should be seeing them on the streets by 2020.

Now let's look at current displays. Displays have already hit their "Moore's Law wall"! A scientific test was conducted which suggested that even the healthiest human eye cannot perceive the difference between 1440p and 4K displays from more than a few inches away from the screen. So unless you like to sit so close to your monitor or TV that you can't even see half the screen and like to burn out your retinas, 4K is simply a marketing gimmick and has no real benefit whatsoever (there's a quick viewing-distance sketch at the end of this post). Some of you may know that I actually own a Samsung MU6500 curved-glass 4K smart TV that I use as my monitor, and you may be wondering why. Well, my previous TV was a 7-year-old, crappy, low-quality, cheap Sanyo 1080p flat screen. The image quality was terrible, the colors were abysmal, and it was so old that I figured it would die soon, so I sold it before it kicked the bucket on me. Plus, I bought the Samsung on Black Friday 2017, so $500 for a $1000 TV? Sure, sign me right up. This display is also WAY better in terms of gaming capability. When you select "Game Console" as the source, it disables "Motion Rate" interpolation and most of the graphics processing options, and it even lowers input lag. I can honestly say I can't tell the difference between this TV and any high-refresh gaming monitor I have ever played on. It's THAT fast and smooth; way to go, Samsung! Anyway, here is the article that covers the scientific test: https://www.nbcnews.com/technology/enough-pixels-already-tvs-tablets-phones-surpass-limits-human-vision-2d11691618

I do understand that virtual reality still needs to increase in pixel count and has a while to go before its technology is maxed out. However, I, like many other gamers, prefer a display over VR, and find VR to be an expense that simply isn't worth the money.
Firstly, how do you go from playing a high-intensity, fast-paced FPS and then head into VR, where the aiming is so difficult that you are suddenly taking far more time to aim and fire your virtual weapon? Personally, for me, any game that uses the VR motion controllers is pretty much worthless. I do agree, however, that there are some games that look tempting in VR, games like Elite Dangerous or Forza Motorsport: since you are in a vehicle, you do not need the stupid motion controllers, and being inside the vehicle gives you a whole new level of immersion. But regardless, as discussed, GPUs are facing the same issues as CPUs, so good luck with 16K.

So here we are then: all evidence points to the end of technology advancement as we know it, and it's expected to be very soon. What are we to do? Well, if you do the Moore's Law Google search, you will find that at the end of most articles they describe how quantum computing will be taking over the known world of computers, but if you have seen Linus's video where he takes a tour of a quantum computing headquarters, that technology is far from being available to the masses. So what do we do until then? Well, honestly? Nothing. Build a badass $3000+ PC around 2020 using a 7nm CPU and make it last you for the next 20 years as technology comes to a standstill. Just replace components as they fail and deal with what you have. It's what the rest of us are going to be doing anyway, so it's not like you will be running an outdated or slow PC. I know I plan to build my "FINAL PC EVER" build when Zen 3 hits the shelves. So what do you guys think about all of this? Do you agree? Disagree? What do you think is next?
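On the 1440p-vs-4K claim above: you can approximate it with simple viewing-geometry math instead of a lab test. A minimal sketch, assuming the common rule of thumb that 20/20 vision resolves about 60 pixels per degree; the 27-inch panel size and the two viewing distances are just example numbers I picked, not anything from the article:

```python
import math

# Rough visual-acuity check: when can the eye resolve extra pixels?
# Rule of thumb: 20/20 vision resolves roughly 60 pixels per degree (PPD).
ACUITY_PPD = 60

def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    """Average pixels per degree across the screen width."""
    fov_deg = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov_deg

WIDTH_27 = 23.5  # approx. width in inches of a 27" 16:9 panel

for distance in (18, 36):  # inches from the screen
    for name, h_res in (("1440p", 2560), ("4K", 3840)):
        ppd = pixels_per_degree(h_res, WIDTH_27, distance)
        verdict = "above" if ppd >= ACUITY_PPD else "below"
        print(f'{name} at {distance}": {ppd:.0f} PPD ({verdict} ~60 PPD)')
```

At the farther, desk-like distance both panels already exceed the ~60 PPD limit, which supports the spirit of the claim, though the crossover on a panel this size works out to a few feet rather than a few inches.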
- 183 replies
-
Tagged with: moores law, 7nm (and 1 more)
-
I was looking up any information on new Radeon GPUs (like I do every day) and found this article talking about Vega 20. Vega 20 is for AI and machine learning; however, with this leak, maybe we can expect new GPUs from AMD sooner than I had hoped. "AMD's 7nm Vega 20 Pops up in Linux Patch: A peculiar piece of code has been spotted in a recent Linux patch that makes reference to Vega 20 by name. The new patch appears to introduce support for more than 50 new Vega-specific hardware-level features that were previously absent from the Linux kernel or had only been partially implemented. This indicates that AMD could be approaching final post-silicon testing and validation of Vega 20." I've been holding out to see when new Radeon Pro GPUs might be released. Article: https://wccftech.com/amd-7nm-vega-20-ai-gpu-leaked-in-linux-patch/ https://twitter.com/AMDNews/status/989258151345229825?s=20
-
Just wanted to put this out here. I've got a friend who works at AMD and who happened to be on the team working on 7nm until today (he didn't get fired, he's just no longer on that project), and he was telling me that it's going super well. The 7nm design is coming along great with the Vega graphics GPU.
-
I have been planning on upgrading my currently VERY outdated PC (see my signature for more info) once the Ryzen 2 refresh CPUs hit the shelves in late April to early May 2018. However, upon doing research on the internet about the Ryzen 2 CPUs and what to expect, I came across a very interesting article that may prompt me to simply hold off and deal with my older hardware until 2019 or even 2020. The article: https://www.pcworld.com/article/3246211/computers/amd-reveals-ryzen-2-threadripper-2-7nm-navi-and-more-in-ces-blockbuster.html

Now, there is one important point I want to make clear from the start: the 2000-series AMD CPUs coming in April 2018 are NOT Zen 2 architecture CPUs, but rather Zen+, a refreshed version of the Zen 1 CPUs such as the Ryzen 7 1800X. The official Zen 2 architecture, according to AMD, is not due to release until NEXT year, 2019. The article above suggests that the Zen 3 architecture of 2020 will be the one to implement 7nm manufacturing, but some sites rumor the 7nm process to come to Zen 2 in 2019.

So why does this information make me want to wait for Zen 2 or Zen 3? The answer is simple: Moore's Law. For those of you who are not aware, Moore's Law is the observation that transistor counts double on a regular cadence, and the worry is that one day the transistor manufacturing process will get so small that consumer-grade CPUs and other electronics can no longer advance. That wall is expected to be hit once 7nm-manufactured CPUs reach the market, and the reason is that manufacturing anything smaller than 7nm transistors would require insane technologies that will be so expensive to implement that 99% of consumers would find the cost too high and simply decide to keep their older hardware. For example, the 5nm process is expected to LITERALLY require a freaking particle accelerator to create transistors that small. This would translate to EXTREMELY HIGH costs for CPUs, GPUs, and other hardware based on the 5nm process. If you are complaining about the $1000+ GTX 1080 Ti GPUs due to cryptocurrency today, you probably won't believe your eyes when you see just how expensive 5nm technology is expected to be.

Now, pricing on these future technologies can only be guessed at, as there aren't even rumors yet about what it could be, but I have a guess: try Intel Celeron or Pentium grade CPUs costing more than $400, with higher-end i7 and Ryzen 7 CPUs costing more than $1200. As a reference, imagine an Intel i3-8350K that costs around $600, or an i9-7900X that costs around $2500. Now imagine a Ryzen 5 2400G that costs around $600, or a Ryzen Threadripper 1950X that costs about $2500. At these prices, just about anyone would be reluctant to upgrade at all. This is the scenario I'm describing: it's not that technology will stop advancing, of course not, but rather that it becomes so expensive that MOST consumers basically stop upgrading their desktop and laptop computers. This would also translate into other items in life stopping their advancements a few years later. Cars already have wide-screen HD TVs as their dashboards in higher-end vehicles and can do things like actively raise or lower your ride height depending on your needs; plus, Google already successfully tested its self-driving car last year.
There was also a scientific test done recently which suggests that the human eye is unable to see the difference between 1440p and 4K resolutions from anything more than 3-4 inches away from the screen, meaning that purchasing anything higher than a 1440p display is technically just a waste of money.

Another piece of interesting evidence that points to the likelihood of Moore's Law being on our doorstep is the fact that Intel recently struck a deal in which Intel supplies $5 billion USD, and federal governments supply another $5 billion USD, in research and development funding to GlobalFoundries for the 10nm manufacturing process, because GlobalFoundries themselves admitted to "having issues" with manufacturing 10nm transistors. This will probably translate into 10nm-based CPUs seeing a substantial price increase over previously "normal" generation-to-generation CPU price increases. The point is, consumer-grade electronics are getting to the point that soon there will be nowhere left to go, or at least nowhere to advance that is considered a worthwhile investment by the average consumer.

So why, then, would I want to hold off on building my new PC? Once again, the answer is simple: value. If I hold off until 10nm or 7nm CPUs are available and then complete my build based on that technology, I could, theoretically, build a PC that will remain relevant, up to date, and able to keep pace with the VAST MAJORITY of the PC enthusiast community for 10, 20, or even 30 years. For those of you concerned with price-to-performance ratio, this will be your time to shine. I would personally recommend you dish out the extra cash and build a higher-end Ryzen 7 / Intel Core i7 or i9 build in 2019-2020 to get the absolute maximum future-proofness while keeping costs RELATIVELY acceptable (the $2000-$3000 range). This build should also feature other high-end components, such as NVMe M.2 SSD(s) and 32GB+ of RAM, to ensure that upcoming software can run flawlessly.

Now, of course, all of this is pure speculation, but it is widely accepted throughout the tech community that the 7nm manufacturing process is most likely to become the official "Moore's Law" or "wall" or "barrier" that marks the end of consumer upgrades. What are your thoughts regarding Moore's Law and the 7nm manufacturing process? I am happy with, or at least tolerant of, my current rig's performance, so should I wait for Zen 2/3? I think this discussion could get very interesting...
-
Maybe we’ll finally get that Intel GPU! https://wccftech.com/intel-ceo-beyond-cpu-7nm-more/
-
Apparently AMD is working on a 7nm processor with 48 cores and 96 threads for servers, codenamed Starship. It's also stated that a consumer model with fewer cores is planned. The crazy thing is: AMD wants to release these beasts as soon as 2018, even though it's still just a concept. Good luck with that. Source
-
I am asking myself which materials could be used to make CPUs after Intel reaches the 7nm limit for silicon. Furthermore: do we have to expect massive problems and higher prices with those new technologies for at least three years? I heard something about indium-based CPUs, but what about heat issues and stability; are there any research results so far? Please stay friendly, and no Intel vs. AMD stuff, as both companies will have the physical limitation of silicon to deal with.
-
"As per the rumor mill"Since skylake won't be that much of a jump over haswell and would be still using DDR3 i am thinking about selling my 2500 before its value depreciates even more and buying a xeon 1220 v3 for its 8mb cache over the 4690k's 6mb .Should i camp out or should i make the jump ? These are rumors so don't hate on me.
-
Intel is saying that 10nm will be the last process to use silicon and is currently investigating alternative materials. This could have deep-reaching implications. Silicon has been used in circuitry and processors ever since Intel released what could be considered the first commercial general-purpose programmable microprocessor, the Intel 4004. I'm curious what this could mean for Intel's competition, and what kinds of benefits it could bring. You would assume it would certainly bring a reasonable increase in clock speeds, so I checked a second source, but there's a catch: the second source is a bit dated (2012), though hopefully the price has come down since then. This could revolutionize processing the same way the Intel 4004 did, way back in 1971. Silicon has been the building block of computers for so long that it will be interesting to see if other chip makers can adapt. Sources: http://arstechnica.com/gadgets/2015/02/intel-forges-ahead-to-10nm-will-move-away-from-silicon-at-7nm/ http://www.theregister.co.uk/2012/12/11/mit_non_silicon_transistor_tech/
- 49 replies