
Intel 10nm potentially delayed to 2018

Source: http://www.fool.com/investing/general/2016/02/12/intel-corps-10-nanometer-technology-may-be-delayed.aspx

It seems Intel could be in big trouble. 

They recently posted a job listing that stated 10nm production would start in two years. 

Quote

Notice that the listing says that the site "will begin production on 10nm in approximately two years." The listing went live on Jan. 21, 2016, so exactly two years from that point would peg production start in Jan. 2018. However, if we give Intel the benefit of the doubt (although frankly following the 14-nanometer debacle they do not deserve it), let's suppose that they really mean 1.5 years.

This would still peg production start in the middle of 2017, implying that products won't really make it to customers in high volumes until late in 2017/early 2018.

Even in the best-case scenario, this suggests Intel is pretty far behind.
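
If you want to sanity-check the article's date math, here's a quick Python sketch (stdlib only; the 18-month case is just the article's charitable reading):

from datetime import date

def add_months(d: date, months: int) -> date:
    # add whole calendar months; the day (the 21st) is valid in every result here
    y, m = divmod(d.month - 1 + months, 12)
    return d.replace(year=d.year + y, month=m + 1)

listing = date(2016, 1, 21)      # the 10nm job listing went live
print(add_months(listing, 24))   # "approximately two years" -> 2018-01-21
print(add_months(listing, 18))   # the charitable 1.5-year reading -> 2017-07-21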

Perhaps something will follow Kaby Lake before Cannonlake, making it a tick-tick-tick-tock cycle.

TSMC is supposed to start 7nm in 2018, which would put them ahead of Intel.

This would also put AMD's Zen on a more level playing field, as Zen is on 14nm and so would be competing with 14nm parts instead of 10nm ones.

This could really play to AMD's advantage.

 

Also, kudos to myself for typing this all out on a phone lol. I don't have my computer with me right now, so I had to do it all on this. That's also the reason there are no tags; I couldn't get those to work on here. I will add them once I have time.

 

EDIT: Added the tags :)

Make sure to quote me or tag me when responding to me, or I might not know you replied! Examples:

 

Do this:

Quote

And make sure you do it by hitting the quote button at the bottom left of my post, and not the one inside the editor!

Or this:

@DocSwag

 

Buy whatever product is best for you, not what product is "best" for the market.

 

Interested in computer architecture? Still in middle or high school? P.M. me!

 

I love computer hardware, so feel free to ask me anything about that (or phones). I especially like SSDs. But please do not ask me anything about networking, programming, command-line stuff, or any relatively hard software stuff. I know next to nothing about that.

 

Compooters:

Spoiler

Desktop:

Spoiler

CPU: i7 6700k, CPU Cooler: be quiet! Dark Rock Pro 3, Motherboard: MSI Z170A KRAIT GAMING, RAM: G.Skill Ripjaws 4 Series 4x4GB DDR4-2666MHz, Storage: SanDisk SSD Plus 240GB + OCZ Vertex 180 480GB + Western Digital Caviar Blue 1TB 7200RPM, Video Card: EVGA GTX 970 SSC, Case: Fractal Design Define S, Power Supply: Seasonic Focus+ Gold 650W Yay, Keyboard: Logitech G710+, Mouse: Logitech G502 Proteus Spectrum, Headphones: B&O H9i, Monitor: LG 29UM67 (2560x1080 75Hz FreeSync)

Home Server:

Spoiler

CPU: Pentium G4400, CPU Cooler: Stock, Motherboard: MSI H110I Pro Mini AC, RAM: HyperX Fury DDR4 1x8GB 2133MHz, Storage: PNY CS1311 120GB SSD + two Seagate 4TB HDDs in RAID 1, Video Card: Does Intel Integrated Graphics count?, Case: Fractal Design Node 304, Power Supply: Seasonic 360W 80+ Gold, Keyboard+Mouse+Monitor: Does it matter?

Laptop (I use it for school):

Spoiler

Surface Book 2 13" with an i7-8650U, 8GB RAM, 256GB storage, and a GTX 1050

And if you're curious (or a stalker), I have a Just Black Pixel 2 XL 64GB

 


Not like it matters too much anyway. Lots of games are leaning on the GPU, at least the non-AI-heavy kinds. If the performance of CPUs comes to a screeching halt, all you'd really have to worry about is bottlenecking.

Please notice me senpai

 

Xeon x3430 - GTX 660SC - Core 1000 - CX430 - DQ57TM - 4gb DDR3


Well, we're hitting the wall anyway. Maybe developers and software can finally catch up to CPUs' actual capabilities during such a stall.


Motley Fool's a bit behind. Fab 42 is already being repurposed for 10nm gen 1, since 14nm demand didn't explode the way Intel originally predicted in 2010. Tapeout will begin at the end of this year pretty much as expected. However, 10nm is going to be a long-term node, since EUV is nowhere near where it needs to be for mass production. In terms of cost per transistor it's a huge step backward for now.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1 hour ago, patrickjp93 said:

Motley Fool's a bit behind. Fab 42 is already being repurposed for 10nm gen 1, since 14nm demand didn't explode the way Intel originally predicted in 2010. Tapeout will begin at the end of this year pretty much as expected. However, 10nm is going to be a long-term node, since EUV is nowhere near where it needs to be for mass production. In terms of cost per transistor it's a huge step backward for now.

So what you're saying is that CPUs bought now, or a generation ago, will likely be relevant for even longer than previously thought.

"If you ain't first, you're last"


Hopefully this will give AMD time to bring out Zen+ and really put them on a level playing field with Intel on IPC.

Pixelbook Go i5 Pixel 4 XL

7 hours ago, Memories4K said:

So what you're saying is that CPUs bought now, or a generation ago, will likely be relevant for even longer than previously thought.

10nm will be relevant for a very long time. However, depending on how game programming evolves in the coming years, I'm not prepared to guess how long Haswell will stay relevant.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Good news for AMD I guess.

And congratulations on typing this on a phone; the site is barely usable right now and I can't wait to see how my post is gonna turn out.

 

 

 

Why is SpongeBob the main character when Patrick is the star?


If true, AMD might have a chance.

Error: 451                             

I'm not copying helping, really :P


Well, the next generation is probably not going to be faster anyway, so does it really matter? As said, it looks like we are hitting a wall, and that's fine tbh for most of us.

I mean, most CPUs are more than fast enough, so I think it's time for programmers to get efficient again and write more efficient software.

They can't rely on CPUs getting faster and faster, so it looks like they don't even have another choice :P

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


On February 13, 2016 at 10:47 AM, That Norwegian Guy said:

AMD to the rescue

Because Intel could never make meaningful architecture improvements at 14nm :eyeroll:

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1 hour ago, ONOTech said:

I'm pretty sure we need to switch away from Silicon to see any major improvements from here on out.

At 10nm we may see a switch to FD-FF from Intel, which should provide a boost in clock speeds even if architecturally there isn't much going on. At 7nm, Intel switches to either silicon-germanium or a III-V material, which could have a multitude of effects.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


3 hours ago, patrickjp93 said:

Because Intel could never make meaningful architecture improvements at 14nm :eyeroll:

Without actual competition, Intel has no reason to put serious manpower into anything at all.

If Intel wanted to, they could have had 10nm out by now. But without competition, they can just not give a fuck and paddle along at their own Jurassic pace, doing a shitload of side quests rather than the main quest.


6 minutes ago, Prysin said:

Without actual competition, Intel has no reason to put serious manpower into anything at all.

If Intel wanted to, they could have had 10nm out by now. But without competition, they can just not give a fuck and paddle along at their own Jurassic pace, doing a shitload of side quests rather than the main quest.

No, yield issues and tech issues are real. Most thought EUV would be necessary for 10nm. Intel and others are currently trying to go without it and use multipatterning to get around the enormous costs. EUV is not ready for prime time, and that's affecting everyone. Moore's Law is also a point of pride for Intel. They wouldn't give up on it until they absolutely had to. Competition has nothing to do with it, and if anything competition for HPC accelerators should be making Intel accelerate to 10nm for Knight's Hill. Your premises are way too shallow. Intel is innovating faster than anyone else in the industry, and yet they're treated like the worst of the bunch.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


12 minutes ago, patrickjp93 said:

No, yield issues and tech issues are real. Most thought EUV would be necessary for 10nm. Intel and others are currently trying to go without it and use multipatterning to get around the enormous costs. EUV is not ready for prime time, and that's affecting everyone. Moore's Law is also a point of pride for Intel. They wouldn't give up on it until they absolutely had to. Competition has nothing to do with it, and if anything competition for HPC accelerators should be making Intel accelerate to 10nm for Knight's Hill. Your premises are way too shallow. Intel is innovating faster than anyone else in the industry, and yet they're treated like the worst of the bunch.

You are basing your theory on the premise that Intel CANNOT go faster. However, we are in a peculiar situation where Intel HAS no competition. AMD's market share is what, below 10% now in the CPU market? Let's not even talk about HPC, where they aren't worth mentioning in the same breath.

IBM is the only "competitor" Intel has for desktop CPUs, but we all know IBM only cares about enterprise and professional markets. They do not give a flying fuck about consumers anymore.

Intel still cares about consumers, as they are, for the time being, an essential part of their revenue.

Everyone is buying Intel because it's Intel, rather than buying Intel because price to performance makes more sense. And no, we are not talking literally here, we are talking figuratively. There is no reason to buy AMD above i3 levels, ever. There just isn't, the end. So for the vast majority of products, consumers have only TWO options: Intel or ARM (including derivatives such as Apple, Samsung, Qualcomm, etc.). Because either you buy a proper PC or you buy a tablet/phone, and in today's society either one can do the trick for MOST of us.

So since Intel has zero fear of losing even 0.01% market share to AMD at this point, why should they actually SPEND money on more engineers, better engineers (assuming there is no one better out there is foolish and arrogant; there is ALWAYS someone better), or better manufacturing processes?

Intel has no motivation.

Intel will not waste money for the sake of wasting money when there is ZERO profit to be had from it.

Intel only cares about market share, and at over 90%, they have so little left to gain that it isn't even funny to try for more.

Your argument cannot be true, because it is based on a situation that is in fact not true. On the contrary, history has shown over and over that competition drives innovation (and, subsequently, costs). And I dare you to find me a situation where competition has actually led to degradation of product development and performance rather than improvements and innovations.


On 2/13/2016 at 6:47 PM, Superdoge said:

Not like it matters too much anyway. Lots of games are leaning on the GPU, at least the non-AI-heavy kinds. If the performance of CPUs comes to a screeching halt, all you'd really have to worry about is bottlenecking.

Not everyone buys a CPU to play games...

PROJECT Simplify: i5-4690K OC'd @ 4.2 GHz | GTX 760 | Ballistix Sport 8GB | ASRock Z97M-ITX/AC | Corsair H80i GT | SilverStone SG13B

UPCOMING: RX Series GPU | Mechanical Keyboard | Mirrorless Camera

VIDEO KIT: Canon SX520 HS | Opteka "Glidecam" | Zoom H5 | Shure SM58 | 

I'm 16 and I like to think I'm good at videography.


4 hours ago, Prysin said:

You are basing your theory on the premise that Intel CANNOT go faster. However, we are in a peculiar situation where Intel HAS no competition. AMD's market share is what, below 10% now in the CPU market? Let's not even talk about HPC, where they aren't worth mentioning in the same breath.

IBM is the only "competitor" Intel has for desktop CPUs, but we all know IBM only cares about enterprise and professional markets. They do not give a flying fuck about consumers anymore.

Intel still cares about consumers, as they are, for the time being, an essential part of their revenue.

Everyone is buying Intel because it's Intel, rather than buying Intel because price to performance makes more sense. And no, we are not talking literally here, we are talking figuratively. There is no reason to buy AMD above i3 levels, ever. There just isn't, the end. So for the vast majority of products, consumers have only TWO options: Intel or ARM (including derivatives such as Apple, Samsung, Qualcomm, etc.). Because either you buy a proper PC or you buy a tablet/phone, and in today's society either one can do the trick for MOST of us.

So since Intel has zero fear of losing even 0.01% market share to AMD at this point, why should they actually SPEND money on more engineers, better engineers (assuming there is no one better out there is foolish and arrogant; there is ALWAYS someone better), or better manufacturing processes?

Intel has no motivation.

Intel will not waste money for the sake of wasting money when there is ZERO profit to be had from it.

Intel only cares about market share, and at over 90%, they have so little left to gain that it isn't even funny to try for more.

Your argument cannot be true, because it is based on a situation that is in fact not true. On the contrary, history has shown over and over that competition drives innovation (and, subsequently, costs). And I dare you to find me a situation where competition has actually led to degradation of product development and performance rather than improvements and innovations.

It's not a theory. Money is being spent to improve their foundries at a scale only Samsung can match. Intel has to get yields up on a new node with progressively larger die sizes to be able to compete with AMD and Nvidia in accelerators. They are not going any slower than they have to. That is a huge market to gain ground in.

 

Intel does not give a rat's rear end about desktop. Mobile and server are the growing markets. Regardless, for server SKUs Intel has to get yields high enough before it starts making those large-die chips (the E5 2699 V3 is 662 mm², bigger than any but the largest dGPU dies), and then there are the 720 mm² Xeon Phi chips. Intel HAS to keep up its node advantage to keep pushing core counts on those. And people buy Intel because of price/performance; the server world does not work any other way. For gaming, Intel has been the logical choice on that front as well until only recently, now that a 9590 can match a 2600K in gaming performance on the most modern titles.
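
To put rough numbers on why yield gates those big dies: under the classic Poisson yield model, yield falls off exponentially with die area. A quick Python sketch (the defect density is purely illustrative, not a published fab figure):

import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    # Classic Poisson yield model: Y = exp(-A * D0)
    area_cm2 = die_area_mm2 / 100.0  # 1 cm^2 = 100 mm^2
    return math.exp(-area_cm2 * defects_per_cm2)

d0 = 0.2  # defects per cm^2 -- an assumed, illustrative density for a maturing node

for name, area in [("small mobile die", 100), ("E5 2699 V3", 662), ("Xeon Phi", 720)]:
    print(f"{name} ({area} mm^2): yield ~ {poisson_yield(area, d0):.1%}")

At that same assumed defect density, the two big server dies come out around 24-27% yield versus roughly 82% for a small 100 mm² die, which is why the large chips have to wait for the node to mature.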

 

Intel is after other markets. It has competitors other than AMD. That's where you're falling behind and why your argument doesn't hold up.

 

Actually, that's not hard. Any diseconomy-of-scale situation does that. You can find a list just by googling the term.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

