TSMC on Track to Begin Volume Production of 3nm Chips in 2022

Spindel

Summary

TSMC is on track to begin risk production of its 3-nanometer fabrication process in the second half of this year, at which point the foundry will be capable of processing 30,000 wafers built using the more advanced technology, according to a new report.

 

Quotes

Quote

TSMC reportedly plans to expand its 3nm process capacity to 55,000 units monthly in 2022, thanks to Apple's order commitment, and will further scale up the output to 105,000 units in 2023. The 3nm process yields 30 percent and 15 percent power consumption and performance improvements over the 5nm process.

Quote

Meanwhile, TSMC plans to scale up its 5nm process manufacturing capacity throughout the year to meet increasing demands from its major customers. According to today's report, TSMC will upscale to 105,000 wafers monthly in the first half of 2021, up from 90,000 units in fourth-quarter 2020, with plans to further expand the process capacity to 120,000 units in the second half of this year.

My thoughts

Apple-centric source, but still more chips for everyone (at least at 5nm).
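
For rough scale, my own back-of-the-envelope arithmetic from the quoted figures (treating the monthly wafer counts as exact):

$\frac{120{,}000 - 90{,}000}{90{,}000} \approx 33\%$ more 5nm wafer starts per month in the second half of 2021 than in Q4 2020, and $\frac{105{,}000 - 55{,}000}{55{,}000} \approx 91\%$ more 3nm wafer starts per month in 2023 than in 2022.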

 

Sources

https://www.macrumors.com/2021/03/01/tsmc-3nm-chip-volume-production-2022/


The question is, are they upscaling production in 5nm because they are going to have more factories, or more fab space? Or are they cannibalizing fab space currently allocated to other processes?
Cause if it's the latter, then it might not be as nice as it sounds.


Does anyone know how they managed to deal with quantum tunnelling? I can't find any solid info online. I know they switched to EUV lithography, but my understanding was that alone wouldn't get them below 5nm.


1 hour ago, Rauten said:

The question is, are they upscaling production in 5nm because they are going to have more factories, or more fab space? Or are they cannibalizing fab space currently allocated to other processes?
Cause if it's the latter, then it might not be as nice as it sounds.

As far as I know, TSMC is not reducing 7nm capacity, and they still have fab lines for nodes larger than 7nm. It's probably a bit of both: expanding into existing floor space and winding down low-demand fab lines.


46 minutes ago, Master Disaster said:

Does anyone know how they managed to deal with quantum tunnelling? I can't find any solid info online. I know they switched to EUV lithography, but my understanding was that alone wouldn't get them below 5nm.

I would assume the gate size didn't shrink as much as the node name suggests, but the density scales up as if it had shrunk by about 1.4 times.

Maybe stacking or something? I don't know.
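
A back-of-the-envelope version of that, assuming the "1.4 times" refers to a uniform linear shrink factor $s$ (my reading, not anything TSMC has stated): if every linear dimension scales by $1/s$, the area per transistor scales by $1/s^2$, so the density ratio is

$s^2 \approx 1.4^2 \approx 2,$

i.e. roughly a doubling of transistors per unit area without the physical gate length itself having to shrink by a full node's worth.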

-sigh- feeling like I'm being too negative lately


My understanding is there is a water issue. This may have a larger effect than space cannibalization. If they can't make enough chips to fill the space, it doesn't matter how it's allocated.


7 minutes ago, Bombastinator said:

My understanding is there is a water issue. This may have a larger effect than space cannibalization. If they can't make enough chips to fill the space, it doesn't matter how it's allocated.

It's a seasonal issue, but yeah, it's worse than normal. The drought couldn't have come at a worse time. SMH


3 minutes ago, StDragon said:

It's a seasonal issue, but yeah, it's worse than normal. The drought couldn't have come at a worse time. SMH

Seasonal stuff is sort of up in the air with global warming.   Don’t know what things are going to do.  One would think that water would be reusable.  Makes one wonder what is in the used water and what is being done with it.


8 minutes ago, Bombastinator said:

One would think that water would be reusable.  Makes one wonder what is in the used water and what is being done with it.

I would agree. Once the water has been processed for purity, I too would think refiltering it for reuse would be a lot cheaper and require less energy. In addition, at least the plant can account for its water inventory during this period.

 

Hopefully an expert in fab production can chime in. When it comes to plant operation, stuff gets complicated.


Crazy to think we've come to 3nm chips already. What will happen 10 years from now? Will we be using quantum physics as the norm in fabricating new tech soon?


24 minutes ago, Rym said:

Crazy to think we've come to 3nm chips already. What will happen 10 years from now? Will we be using quantum physics as the norm in fabricating new tech soon?

I doubt it. Once the limits are hit, they'll probably stack CPUs/GPUs (there are mentions of Intel already planning for that), refine architectures to squeeze every bit of performance out of them (basically similar to what Intel has done with 14nm+++++), and see whether other materials can be used for semiconductors.


25 minutes ago, Rym said:

Crazy to think we've come to 3nm chips already. What will happen 10 years from now? Will we be using quantum physics as the norm in fabricating new tech soon?

This sort of thing has happened before in other industries. Quite often a solution has already been thought of that doesn't make sense to use yet. Back in the '80s this same worry popped up and the concept of germanium chips instead of silicon was floated. SOI got invented instead, so it turned out not to be needed then, but germanium still exists. There has also been research into optical computing and quantum computing. Quantum computing is the current darling, in that it is showing itself to be possibly useful before silicon even runs itself out completely. Other options exist, though; they're just not being pursued proactively right now because silicon is relatively cheap.


4 minutes ago, Bombastinator said:

This sort of thing has happened before in other industries. Quite often a solution has already been thought of that doesn't make sense to use yet. Back in the '80s this same worry popped up and the concept of germanium chips instead of silicon was floated. SOI got invented instead, so it turned out not to be needed then, but germanium still exists. There has also been research into optical computing and quantum computing. Quantum computing is the current darling, in that it is showing itself to be possibly useful before silicon even runs itself out completely. Other options exist, though; they're just not being pursued proactively right now because silicon is relatively cheap.

There's room for a lot of software optimization. While code-bloat (excessive reliance on libraries and copy-paste code) is more of an issue in RAM utilization, it still impacts CPU cycles to a lesser extent.

 

I think CPU performance at some point will plateau with focus on gains being more on code optimization.
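
To make that concrete, here's a toy example (my own illustration, not something from the article) of the kind of easy win pure code optimization can deliver: the same result computed with a careless O(n²) approach versus an O(n) one.

```python
# Toy illustration of the "slack" software optimization can reel in:
# counting duplicate items with a quadratic scan vs. a set-based pass.
import time


def count_duplicates_naive(items):
    # Re-scans an ever-growing slice for every element: O(n^2) CPU work.
    count = 0
    for i, x in enumerate(items):
        if x in items[:i]:  # linear scan (plus a slice copy) per element
            count += 1
    return count


def count_duplicates_fast(items):
    # Set lookups are O(1) on average, so the whole pass is O(n).
    seen = set()
    count = 0
    for x in items:
        if x in seen:
            count += 1
        seen.add(x)
    return count


if __name__ == "__main__":
    data = list(range(5_000)) * 2  # 10,000 items, every value appears twice

    t0 = time.perf_counter()
    naive = count_duplicates_naive(data)
    t1 = time.perf_counter()
    fast = count_duplicates_fast(data)
    t2 = time.perf_counter()

    assert naive == fast  # identical results, very different CPU cost
    print(f"naive: {t1 - t0:.3f}s  fast: {t2 - t1:.4f}s  duplicates: {fast}")
```

Same hardware, same answer; the only difference is how the code spends its cycles.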


1 minute ago, StDragon said:

There's room for a lot of software optimization. While code-bloat (excessive reliance on libraries and copy-paste code) is more of an issue in RAM utilization, it still impacts CPU cycles to a lesser extent.

 

I think CPU performance at some point will plateau with focus on gains being more on code optimization.

There are just a lot of places for slack to be reeled in. I don't see it as a problem likely to prove insurmountable.


Here's hoping the chip shortage is fixed by then.

