
Billions of transistors are sooo last year: Adaptive transistors could cut CPU sizes by up to 85%

BachChain

Summary

Researchers have developed a new germanium-based transistor whose main feature is a "control electrode" that allows the transistor's behavior to be modified on the fly. This additional flexibility enables logic to be implemented with significantly fewer transistors, up to 85% fewer than in traditional ICs. Fewer transistors mean lower power consumption, and smaller die areas could cut manufacturing costs. The new design can be made using existing techniques, so it should be relatively straightforward to begin wide-scale production.

 

Quotes

Quote

The death of Intel's tick-tock strategy showcases how transistor density advancements are becoming more difficult to achieve. And while materials and design research have devised many ways to improve transistors, their fundamental design remains unchanged. And where there's a lack of change, there's an opportunity: what benefits could come from redesigning the transistor?

 

Quote

A team of researchers with the Vienna University of Technology have evolved computing's most fundamental unit: the transistor. Tapping into the element Germanium (Ge), they've developed a new, adaptive transistor design that can change its configuration on the fly, according to the workload requirements.

 

Quote

"The fact that we use germanium is a decisive advantage," explains Dr. Sistani. "This is because germanium has a very special electronic structure: when you apply voltage, the current flow initially increases, as you would expect. After a certain threshold, however, the current flow decreases again – this is called negative differential resistance. With the help of the control electrode, we can modulate at which voltage this threshold lies. This results in new degrees of freedom that we can use to give the transistor exactly the properties that we need at the moment."

 

Quote

"Arithmetic operations, which previously required 160 transistors, are possible with 24 transistors due to this increased adaptability. In this way, the speed and energy efficiency of the circuits can also be significantly increased," explained Prof. Walter Weber, a member of the team. In other words, the new adaptive transistors can reduce the number of transistors required for a given workload by up to 85%. Furthermore, with fewer transistors doing the same work, power consumption, temperatures, and leakage points are reduced across the design, which in turn would allow for higher frequency scaling and performance.

 

My thoughts

Perhaps this will start a new CPU race and make ops per transistor a marketing term. I also don't envy the engineering students who might have to learn dynamic transistor logic.
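To get my own head around the negative differential resistance described in the quotes, here's a toy Python model (purely my own illustration, nothing from the paper): current rises with voltage up to a threshold and then falls, and the control electrode moves where that threshold sits.

```python
import numpy as np

def ndr_current(v, v_peak):
    """Toy I-V curve with negative differential resistance:
    current rises until v_peak, then falls off again.
    Purely illustrative, not the device physics from the paper."""
    return v * np.exp(-v / v_peak)

# The control electrode (per the quote) shifts where the threshold sits.
volts = np.linspace(0.0, 3.0, 7)
for v_peak in (1.0, 1.5):  # two hypothetical control-electrode settings
    print(f"peak at {v_peak} V:", np.round(ndr_current(volts, v_peak), 3))
```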

 

Sources

https://www.tomshardware.com/news/researchers-develop-intelligent-transistors-uses-85-percent-fewer-transistors


how does this adapt per se?

 



I just understood less...



1 hour ago, BachChain said:

Researchers have developed a new germanium-based transistor whose main feature is a "control electrode" that allows the transistor's behavior to be modified on the fly.

I heard about this almost 10 years ago for the first time, so I feel like it's not really new.

It was an electrically reconfigurable dual metal-gate and a different approach, but this sounds really similar. 😉


1 hour ago, adarw said:

how does this adapt per se?

 

 

From what's said, it functions as a NOT-IF gate. A given transistor is normally a simple IF gate: IF A = B THEN Y ELSE N. This one, however, can also function as a NOT gate by changing the input voltage on one side; roughly, something like the sketch below.
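A toy sketch of that idea (my own illustration, not the actual device physics): one cell whose logic function flips depending on the control input.

```python
def adaptive_gate(a, control):
    """Toy model of a polarity-controlled transistor: with control = 0
    it passes the input through (IF), with control = 1 it inverts
    it (NOT-IF). Illustrative only, not the actual device physics."""
    return a if control == 0 else 1 - a

for control in (0, 1):
    name = "IF (buffer)" if control == 0 else "NOT-IF (inverter)"
    table = [(a, adaptive_gate(a, control)) for a in (0, 1)]
    print(f"control={control} -> {name}: {table}")
```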


Cool. 

 

I can't wait to see them make something with this in 30 years and think back fondly on this thread. 


2 hours ago, dizmo said:

Cool. 

 

I can't wait to see them make something with this in 30 years and think back fondly on this thread. 

Yeah, I think we've reached the point technologically where any improvements are either iterative or take an astronomical amount of effort for the leap in capability.

For example: graphene transistors were supposed to be the shit as well... How long have we been waiting?


Just like every new fancy battery and storage tech we see every year.... I'm looking forward to never seeing this in a consumer product within my lifetime.


8 hours ago, BachChain said:

Researchers have developed a new germanium-based transistor whose main feature is a "control electrode" that allows the transistor's behavior to be modified on the fly. This additional flexibility enables logic to be implemented with significantly fewer transistors, up to 85% fewer than in traditional ICs. Fewer transistors mean lower power consumption, and smaller die areas could cut manufacturing costs.

As exciting as this is, this will never come to the consumer tech space in its current form for one reason: cost.

 

As I'm sure you know, normal semiconductors are made using cheap, abundant silicon. These funky new transistors, however, were built using germanium. And germanium isn't cheap.


Silicon metal is currently selling at ~$3.5k per ton, although this price has been very volatile over the last few months. But germanium? You're talking ~$1k per kilogram, a ~300x increase in raw materials cost per kilogram, which even the theoretical maximum space savings of 85% advertised here wouldn't be able to offset. Also, germanium is ~2.5x as dense as silicon, so a germanium chip utilising this technology would only use ~60% less material by weight than a comparable silicon chip. Add this all together and you get each germanium-based chip costing ~120x as much as the comparable silicon chip, at least when you're talking raw resources. Ouch.
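A quick back-of-the-envelope in Python, using my own rough numbers from above (these are estimates, not authoritative figures):

```python
# Sanity check of the numbers above (prices are rough estimates).
si_price_per_kg = 3500 / 1000        # ~$3.5k per metric ton -> $3.5/kg
ge_price_per_kg = 1000.0             # ~$1k per kg
price_ratio = ge_price_per_kg / si_price_per_kg   # ~286x, call it ~300x

mass_ratio = (1 - 0.85) * 2.5        # 85% fewer transistors, Ge ~2.5x denser
cost_ratio = price_ratio * mass_ratio

print(f"price ratio ~{price_ratio:.0f}x per kg")       # ~286x
print(f"mass ratio ~{mass_ratio:.3f}")                 # ~0.375 -> ~60% less material
print(f"raw material cost ratio ~{cost_ratio:.0f}x")   # ~110x, ballpark of ~120x
```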

 

And this cost increase will, of course, just get passed down the chain. If the raw material costs 300x more for the foundry (eg TSMC) to buy, you can bet your ass that they will sell that wafer to AMD, Nvidia etc. for a buttload more than a silicon wafer. This price increase would of course get passed down to the board partners who, due to their complete lack of profit margins, would have no choice but to pass it on to us, the consumers. Yay.

 

Note: germanium is currently used in industry as an alloy with silicon (SiGe) in products such as photovoltaic cells and LEDs. This is not the form used by this research - this research uses pure germanium. Germanium is usable by these other industries despite its price partially because a large amount of the cost of germanium actually comes from the cost of purifying it. Silicon is required to be at a very high level of purity for wafer production - 99.9999999% (9N) is normal pre-doping - but this level of material purity is generally not required by other applications, thereby allowing them to purchase lower-grade materials at a lower cost. They also generally use a very small quantity of germanium (eg a thin layer atop a silicon wafer) so the cost increase vs pure silicon is not particularly high.

 

If this technology can be applied to SiGe-based semiconductors then this could get very interesting. Similarly, if someone manages to make germanium drop in price by a few orders of magnitude, that would be great. Until that point, however, I don't see this getting out of a lab. I also wonder if it has more of an application in FPGAs than in conventional CPUs/GPUs thanks to its ability to change configuration at runtime, and question how capable they would be vs conventional transistors at reaching the high frequencies used by modern electronics. Reducing the size of our CPUs by 85% is useless if it means reducing their maximum frequency by 85% as well.


2 hours ago, tim0901 said:

Silicon metal is currently selling at ~$3.5k per ton, although this price has been very volatile over the last few months. But germanium? You're talking ~$1k per kilogram, a ~300x increase in raw materials cost per kilogram, which even the theoretical maximum space savings of 85% advertised here wouldn't be able to offset. Also, germanium is ~2.5x as dense as silicon, so a germanium chip utilising this technology would only use ~60% less material by weight than a comparable silicon chip. Add this all together and you get each germanium-based chip costing ~120x as much as the comparable silicon chip, at least when you're talking raw resources. Ouch.

You do know that wafers are made out of mono-crystalline silicon (which is not a metal) with the least possible amount of impurities, and that the wafers are cut to a specific crystal plane? Your $3500/ton is just an arbitrary number for some form of silicon, not for the stuff that is actually used to make state-of-the-art electronics.

And you don't need the entire wafer to be made out of Gallium. The substrate can be almost anything. White LEDs (or the blue ones) use a sapphire substrate and MOVPE to get the GaN layer. Your calculation doesn't make sense. Material costs are quite frankly almost negligible compared to the price of the rest of the process.


The chip size generally isn't an issue. The issue is heat concentration. We cram more transistors into chips than ever and the chips themselves are smaller than they were 10 years ago in a lot of cases because of node shrinking.

 

What that creates is an issue of its own: heat concentration. Chips generate more heat within a smaller surface area, which means you can have a 480mm AIO with the best block in existence and it doesn't really help much, because the contact area between the CPU and the block is so tiny that you can only pump so much heat through it.

 

In a way I wish chips had more dummy silicon around them to increase the surface area of the actual chip, but we know that won't happen, because wafer yields demand cramming as many chips into a single wafer as possible to maximize profits and availability. But just imagine if the actual chip had extra silicon around it creating a bigger contact surface area. Chips are not 2D; they have height, and all the contact on the sides would pull heat out of the chip to the sides and then into the copper block above it. Silicon has a thermal conductivity of 148 W/mK, which is still a huge number.
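For a rough sense of scale, a one-dimensional conduction estimate (all numbers here are my own illustrative guesses, not from any datasheet):

```python
# Rough 1D conduction estimate: delta_T = P * t / (k * A).
# Doubling the contact area halves the temperature drop.
k_si = 148.0          # W/(m*K), bulk silicon thermal conductivity
power = 150.0         # W, a typical desktop CPU package
thickness = 0.5e-3    # m, ~0.5 mm of silicon above the hot spots

for area_mm2 in (100.0, 200.0):       # contact area in mm^2
    area_m2 = area_mm2 * 1e-6
    delta_t = power * thickness / (k_si * area_m2)
    print(f"{area_mm2:.0f} mm^2 contact: ~{delta_t:.1f} K drop across the silicon")
```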


3 hours ago, tim0901 said:

germanium is currently used in industry as an alloy with silicon (SiGe) in products such as photovoltaic cells and LEDs

 

3 hours ago, tim0901 said:

If this technology can be applied to SiGe-based semiconductors then this could get very interesting.

 

Already is/has, long ago.

 

Quote

IBM may have sold its fabrication plants to GlobalFoundries, but the company remains committed to long-term semiconductor investment and research. Today, the firm is announcing the first test silicon on the 7nm process node. The new chips were built in partnership with GlobalFoundries, Samsung, and IBM’s equipment providers at the SUNY Polytechnic Institute’s College of Nanoscale Science and Engineering. Not only are these the first 7nm chips that we’ve heard of, they’re the first to use Extreme Ultraviolet Lithography (EUV) and the first to use silicon germanium (SiGe).

https://www.extremetech.com/extreme/209523-ibm-announces-7nm-breakthrough-builds-first-test-chips-on-new-process-with-euv

 

It's been risk/mass production ready since 2015.

 

Oh and just because this is super amusing

[attached screenshot of an old comment]

Well this comment aged well didn't it lol


Neat. We'll see once something of the sort comes into general mainstream use for chips. An actual breakthrough with new materials for chips, now that would be incredible to see.


4 hours ago, leadeater said:

Already is/has, long ago. It's been risk/mass production ready since 2015.

Well this comment aged well didn't it lol

By "this technology" I meant this new type of adaptive transistor - the one described by this article that has only just been developed.

 

And yeah that aged like milk. 🤣

5 hours ago, HenrySalayne said:

You do know that wafers are made out of mono-crystalline silicon (which is not a metal) with the least possible amount of impurities, and that the wafers are cut to a specific crystal plane? Your $3500/ton is just an arbitrary number for some form of silicon, not for the stuff that is actually used to make state-of-the-art electronics.

The raw ingredient for mono-crystalline silicon is known as 'silicon metal'. No form of silicon is a metal by the physics definition (the name is dumb), but that's what it's called. This is then used by companies, through something like the Czochralski process, to grow the boules of mono-crystalline silicon from which the wafers are cut.

5 hours ago, HenrySalayne said:

And you don't need the entire wafer to be made out of Gallium. The substrate can be almost anything. White LEDs (or the blue ones) use a sapphire substrate and MOVPE to get the GaN layer. Your calculation doesn't make sense. Material costs are quite frankly almost negligible compared to the price of the rest of the process.

This uses germanium, not gallium. Very different elements. Current uses of both indeed involve a small layer of the substance atop a standard silicon wafer. This new tech, the adaptive transistors in the main article, does not. Read the actual paper and it explains this was indeed done in a pure germanium semiconductor, a silicon replacement, not a layer on silicon. Whether it is possible to create these adaptive transistors using a thin layer of Ge (as in the SiGe used by industry) is still unknown.


30 minutes ago, tim0901 said:

By "this technology" I meant this new type of adaptive transistor - the one described by this article that has only just been developed.

Sweet as, my bad. This one is probably ages away from ever getting used, if ever 🤷‍♂️


Fewer components = less heat. There are a few options here:

  • Smaller processors. One of the biggest consumers of space on a motherboard could finally be heavily shrunk. ITX lovers could finally rejoice.
  • Cooler processors, which means more energy-efficient processors. 30W TDP K series, anyone?
  • MOAR CORES.


4 hours ago, leadeater said:

Sweet as, my bad. This one is probably ages away from ever getting used, if ever 🤷‍♂️

Given how we're approaching the node size limit, chip makers may not have much choice and could adopt it sooner. We'll still have tons of chips made on old transistor tech because of volume and reliability, but cutting-edge products may very well start using it soon.


One thing to remember about the raw materials price is that part of it comes from the great deal of effort that has been put into cheaply supplying the large volumes of Si necessary for modern chip production. The same hasn't been done for most alternative materials, so it's unlikely current prices are representative of where things would fall out.


5 hours ago, RejZoR said:

Given how we're approaching the node size limit, chip makers may not have much choice and could adopt it sooner. We'll still have tons of chips made on old transistor tech because of volume and reliability, but cutting-edge products may very well start using it soon.

Well the issue is whether this can actually be utilized in a chip for a CPU or GPU, and I wouldn't assume that's the case. It wouldn't be of much use if you lost a huge amount of switching frequency or power/current handling, or pretty much anything else that gets rather deep into things you'd have to be in the field to know about.

 

Graphene promised the world 20 years ago as well, look how far that's gotten.


6 hours ago, leadeater said:

Well the issue is whether this can actually be utilized in a chip for a CPU or GPU, and I wouldn't assume that's the case. It wouldn't be of much use if you lost a huge amount of switching frequency or power/current handling, or pretty much anything else that gets rather deep into things you'd have to be in the field to know about.

 

Graphene promised the world 20 years ago as well, look how far that's gotten.

Well, clocks aren't everything, as has been shown time and time again. Remember Athlon XP CPUs? They had way lower clocks yet performance beyond competition with higher clocks. Remember Core and Core 2 CPUs? Same story: lower clocks, way higher performance. Same with Ryzen to this very day, actually: lower clocks than Intel's offerings and, in the worst case, the same performance most of the time.

 

The thing with graphene is that it's not exactly a semiconductor by itself like silicon is; it has no natural bandgap. One can engineer one in, but I think that was the issue, which is why it was never utilized for this purpose (yet). We'll see how this pans out.


1 hour ago, RejZoR said:

Well, clocks aren't everything, as has been shown time and time again

True, however we are talking about transistors that operate very differently from current ones, and we have no idea what that does to switching time or logic path changes. It's going to be really painful performance-wise if it takes multiple clock cycles to reconfigure transistors and logic: that's an entire period of processing stall where nothing is happening at all (in that portion of the chip). Having parts of the chip stall means other parts of the pipeline will inevitably stall as well.

 

You cannot IPC your way out of a pipeline stall, or out of a much more severe switching speed reduction. If these transistors have to regress back to ~1GHz in CPU use cases then I cannot see any feasible way to work around that sort of limitation with current software, or with anything 10 years from now.
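Quick toy numbers on why IPC can't rescue a clock regression like that (the IPC figures are made-up, deliberately generous assumptions):

```python
# Toy throughput model: performance ~ IPC * frequency. Even a generous
# 2x IPC gain (hypothetical) can't offset a 5 GHz -> 1 GHz regression.
baseline = 1.0 * 5e9    # IPC 1.0 at 5 GHz
adaptive = 2.0 * 1e9    # IPC 2.0 at 1 GHz (being very generous)
print(f"relative performance: {adaptive / baseline:.2f}x")  # -> 0.40x
```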

 

So like I said, it's all well and good, however it actually has to be usable for what we want to use it for, which I would not assume to be the case.

 

1 hour ago, RejZoR said:

The thing with graphene is that it's not exactly a semiconductor by itself like silicon is. One can make it such, but I think that was the issue, which is why it was never utilized for such purpose (yet). We'll see how this pans out.

IBM was creating R&D graphene chips a little over a decade ago and everyone was literally touting that to replace silicon. It didn't happen, for two reasons. The first was that it was RF research for the military (it's in use today now), and the second was that the transistors could only handle very low power. So that graphene research was ultimately useless for CPUs and GPUs, even though it was still valid transistor technology that's in use today. If you believed the headlines back then we would have 50GHz CPUs today; since we don't, I wouldn't go assuming we'll have ~85% fewer transistors in GPUs and CPUs in 10 years' time, nor ones where a majority of transistors can be reconfigured.

 

For all we know (and this would still be a great best/worst case), these transistors could only be used for SRAM and not for anything to do with execution units.

 

And yes, graphene transistors can switch at 50GHz and 100GHz, and silicon transistors can switch at 40GHz; we don't have 40GHz CPUs either. Not all transistors are equal.

 

Edit:

Oh, and I forgot to mention that transistor switching speed isn't the same thing as CPU operating frequency, nor necessarily all that directly related either.
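A crude illustration of the difference (the gate-depth numbers below are assumptions, just to show the shape of it):

```python
# A core's clock is limited by how many gate delays are chained in one
# pipeline stage, not by how fast a single transistor can switch.
# Stage depths here are illustrative assumptions.
f_switch = 40e9   # Hz, fast silicon transistor switching (per above)
for gate_delays_per_stage in (10, 20, 30):
    f_core = f_switch / gate_delays_per_stage
    print(f"{gate_delays_per_stage} gate delays/stage -> "
          f"~{f_core / 1e9:.1f} GHz core clock")
```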


@leadeater

Well, we've seen silicon-based chips hit heavy interference issues at certain GHz speeds at ambient temperatures. Around 5GHz is what's really still usable; anything beyond that just doesn't work well and is only achievable at subzero temperatures, where everything behaves differently.


1 hour ago, leadeater said:

IBM was creating R&D graphene chips a little over a decade ago and everyone was literally touting that to replace silicon. It didn't happen, for two reasons. The first was that it was RF research for the military (it's in use today now), and the second was that the transistors could only handle very low power. So that graphene research was ultimately useless for CPUs and GPUs, even though it was still valid transistor technology that's in use today. If you believed the headlines back then we would have 50GHz CPUs today; since we don't, I wouldn't go assuming we'll have ~85% fewer transistors in GPUs and CPUs in 10 years' time, nor ones where a majority of transistors can be reconfigured.

They made CNT transistors in the lab at my university 10 years ago. Because the CNTs need to be grown with the correct size and length, at the correct spacing, with the correct orientation, they had a yield of only 60% (of transistors, not even dice), but were quite happy with that result. This technology needs a very long time to be fine-tuned for commercial applications. A defect rate of 1 in a million (per transistor) is the lower limit for smaller applications; for larger chips they'll need at least 1 in a billion. The coming years will be quite interesting in this regard and we'll probably see more and more graphene- and CNT-based ICs.
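The yield math shows why those defect rates matter (a simple independent-defect model, just for illustration):

```python
# If each transistor fails independently with probability p, a chip
# with n transistors works with probability (1 - p)**n.
def die_yield(p, n):
    return (1.0 - p) ** n

print(f"{die_yield(1e-6, 10**6):.1%}")   # 1-in-a-million, 1M transistors -> ~36.8%
print(f"{die_yield(1e-9, 10**9):.1%}")   # 1-in-a-billion, 1B transistors -> ~36.8%
print(f"{die_yield(1e-6, 10**9):.3%}")   # 1-in-a-million, 1B transistors -> ~0%
```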

 

 


39 minutes ago, RejZoR said:

@leadeater

Well, we've seen silicon-based chips hit heavy interference issues at certain GHz speeds at ambient temperatures. Around 5GHz is what's really still usable; anything beyond that just doesn't work well and is only achievable at subzero temperatures, where everything behaves differently.

Correct, just note that this is true for logic chips rather than for RF switching transistors. RF switching transistors in the 40 GHz range are super common; they are also extremely tiny, simple ICs.

 

The problem is that there is a lot of research in the field of transistors and fabrication, and because it's all really complicated, results can get (and have gotten) confused about where they are applicable and where they are not, like that graphene research I mentioned.

 

Quote

The BFP640 is a RF bipolar transistor based on SiGe:C technology that is part of Infineon’s established sixth generation transistor family. Its transition frequency fT of 42 GHz and high linearity characteristics at low currents make this device particularly suitable for energy efficiency designs at frequency as high as 8 GHz. It remains cost competitive without compromising on ease of use.

https://gr.mouser.com/datasheet/2/196/Infineon_BFP640_DS_v03_00_EN-2309171.pdf

 

Applications for the above are things like SatNav/GPS, FM Radio, Terrestrial TV, Radar etc. All things radio frequency (RF) basically. Also note the difference between the switching frequency and the actual circuit operating frequency.

 

Not trying to be a downer btw, just advising caution alongside the optimism.


On 12/27/2021 at 5:47 PM, PocketNerd said:

Yeah, I think we've reached the point technologically where any improvements are either iterative or take an astronomical amount of effort for the leap in capability.

For example: graphene transistors were supposed to be the shit as well... How long have we been waiting?

Turns out, graphene is really hard to make.

