
Google AI designs* chip in 6 hours

WolframaticAlpha

Summary

Google has developed an artificial intelligence program that can produce a chip 'floorplan' in six hours, a task that takes months when done by human engineers. The floorplan is the basic physical layout of a silicon system-on-chip, determining the optimal positioning of blocks such as the CPU, GPU, and memory on the die. This method is being used to design Google's much-awaited next-gen TPUs.
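
For anyone unfamiliar with the term, floorplanning boils down to an optimization problem: place blocks on the die so that objectives like wirelength, congestion, and power are minimized. Here's a toy Python sketch of one standard proxy objective, half-perimeter wirelength (HPWL); the net names and pin coordinates are made up for illustration, and the real objective also folds in congestion and density terms.

```python
# Toy illustration of half-perimeter wirelength (HPWL), a standard
# proxy objective in floorplanning. All net names and pin coordinates
# below are invented for illustration.

def hpwl(pins):
    """Half-perimeter of the bounding box around a net's pins."""
    xs = [x for x, y in pins]
    ys = [y for x, y in pins]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

# Each net is a list of (x, y) pin positions on placed blocks.
nets = {
    "cpu_to_l2":  [(10, 20), (35, 22)],
    "l2_to_dram": [(35, 22), (80, 60), (15, 70)],
}

total = sum(hpwl(pins) for pins in nets.values())
print(f"total HPWL: {total}")  # lower is better
```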

 

Quotes

Quote

 

This usually takes months for a chip-design engineer to come up with, while Google's new AI can design it in six hours' time. The developers fed over 10,000 chip floorplans to help the AI system learn the task and come up with an optimal floorplan of a chipset that can deliver improved performance while using less power.

 

My thoughts

This looks cool. My guess is that AI will lead to me getting fired in 2040, as I know batshit about AI.

 

Sources

A graph placement methodology for fast chip design | Nature

Google's AI Software Can Design a Chipset in Just Six Hours | Beebom

Google is using AI to design its next generation of AI chips more quickly than humans can - The Verge

Now Google is using AI to design chips, far faster than human engineers can do the job | ZDNet


Not surprising. There's a plethora of jobs that can and will be replaced by AI. Other jobs will pop up that we can't even imagine. It's going to be a tough transition for some.

 

AI is also superb at designing physical objects that are strong yet incredibly light, in ways human engineers simply can't match, at least not economically.


I know AI is awesome, but the hype makes me hate it.

Engineers spend that much time because the layout is optimized against a multitude of goals, and there are well-known algorithms to help with that.

Beware of people saying something is "optimal", especially when dealing with AI. It's usually fundamentally wrong: the result may be great/good/reasonable, but not optimal.
In this specific case, that would be "easily" verifiable by comparing against deterministic optimization algorithms that do produce optimal results (not really "easily", since multi-objective optimization is NP-hard to verify).

I didn't read the article or the links, but I'd bet it was the crackhead journalists claiming something to be optimal, once again.
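
To make the multi-objective point concrete, here's a minimal sketch (all numbers invented): two candidate layouts, each better on a different objective. Which one is "optimal" flips depending on how you weight the objectives, so there is no single optimum to claim.

```python
# Why "optimal" is ill-defined for multi-objective problems.
# The two candidate layouts and all metric values are invented.

candidates = {
    "layout_a": {"wirelength": 100.0, "power": 8.0},
    "layout_b": {"wirelength": 120.0, "power": 5.0},
}

def weighted_cost(m, w_wire, w_power):
    return w_wire * m["wirelength"] + w_power * m["power"]

for w_wire, w_power in [(1.0, 1.0), (0.1, 5.0)]:
    best = min(candidates,
               key=lambda n: weighted_cost(candidates[n], w_wire, w_power))
    print(f"weights (wire={w_wire}, power={w_power}) -> best: {best}")

# Neither layout dominates the other, so each weighting picks a
# different "optimal" design.
```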


We've been supposed to use technology so we can work less since the beginning of time. And yeah, new jobs will appear; YouTubers, cinema actors, and professional athletes weren't a thing a century ago.

Anyway, this basically looks like AI creating AI, and that's kinda scary 😄


This feels a lot more like an "easy headline" and a "we couldn't hire good enough engineers to do the work" issue than some great advancement. A massive amount of chip design is already automated, and layout isn't just about the design itself: the node and the objectives of the silicon matter quite a lot. I could see this being useful after a design team (note: it's going to be an entire team) has figured out all of the requirements. That would allow the program to find extra efficiencies.

 

This sounds a lot more like a precursor to "Google TPU delayed again". lol

 

@WolframaticAlpha AI isn't going to change as much as everyone is currently projecting. Frankly, it'll only really work well on fully digitized information sets. Further, all it's really good at is optimizing within the framework it has been given and on the information that is available. It'll change some things a lot, but it'll actually put even more demand on "Systems" Engineers.

 

It's a lot like 3D printing: really important, but it has its place, rather than taking over everything.


1 minute ago, Taf the Ghost said:

@Forbidden Wafer AI will be good for optimization runs on many designs and engineering tasks [...]

I believe it will end up being used as a heuristic, to speed up deterministic optimization or Monte Carlo simulations.
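
Something like this hypothetical division of labor (both "models" below are stand-ins, nothing here is a real tool): a cheap learned scorer prunes the candidate pool, and only the survivors go through the expensive exact evaluation.

```python
import random

def learned_score(candidate):
    # Stand-in for an ML model's fast, noisy estimate of quality.
    return sum(candidate) + random.gauss(0, 1)

def exact_cost(candidate):
    # Stand-in for a slow deterministic evaluation
    # (e.g. a full timing/congestion analysis run).
    return sum(candidate)

candidates = [[random.random() for _ in range(8)] for _ in range(1000)]

# Keep only the 5% the heuristic thinks are best, then verify exactly.
shortlist = sorted(candidates, key=learned_score)[:50]
best = min(shortlist, key=exact_cost)
print("best exact cost:", exact_cost(best))
```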


Can someone give me a TL;DR on what's new? AI chip design by Google isn't new: https://www.nextplatform.com/2020/02/20/google-teaches-ai-to-play-the-game-of-chip-design/

I wonder what the actual news here is.

 

37 minutes ago, Forbidden Wafer said:

Beware of people saying something is "optimal", especially when dealing with AI. It's usually fundamentally wrong: the result may be great/good/reasonable, but not optimal. [...]

It may not be optimal, but it should be good enough to fit all of the needed requirements; that's how GAs/evolutionary strategies work (and the result may be the "optimal" one for a specific scenario).
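
As a concrete illustration, here's a minimal (1+1) evolution strategy on the Rastrigin function, a classic multimodal test objective (the step size and iteration count are placeholders): it reliably lands on a good solution, with no guarantee it's the global optimum.

```python
import math
import random

def rastrigin(x):
    # Classic multimodal benchmark; global optimum is 0 at the origin.
    return sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

parent = [random.uniform(-5.12, 5.12) for _ in range(4)]
for _ in range(5000):
    child = [xi + random.gauss(0, 0.2) for xi in parent]
    if rastrigin(child) <= rastrigin(parent):  # greedy (1+1) selection
        parent = child

# "Good enough", but it may well be stuck in a local minimum.
print("final cost:", round(rastrigin(parent), 3))
```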

 

 

 


13 minutes ago, igormp said:

It may not be optimal, but it should be good enough to fit all of the needed requirements; that's how GAs/evolutionary strategies work (and the result may be the "optimal" one for a specific scenario).

Good and optimal are different things.

Evolutionary strategies are often used as heuristics to speed up the search for optimal results. That's exactly where I think AI will be used.


Reject humans. Embrace automation. 
 

On a more serious note, I’m surprised technology like this hasn’t already been in use. I didn’t think people would be designing chips by hand in 2021. 


39 minutes ago, Forbidden Wafer said:

Good and optimal are different things.

Evolutionary strategies are often used as heuristics to speed up the search for optimal results. That's exactly where I think AI will be used.

Yeah, but the whole ML branch of AI ends up being just a heuristic in the end, and that's good enough for us, even if it gets stuck in some local minima.

If optimal, deterministic algorithms were available and not so costly (most of those problems are NP-hard, as you said yourself), we would be using them instead.


Nothing created by an AI should be patentable. 


I am reminded that "calculator" used to be a profession, before it was a thing that fit in your pocket, before it was an app on a thing you put in your pocket.


5 hours ago, DrMacintosh said:

Reject humans. Embrace automation. 
 

On a more serious note, I’m surprised technology like this hasn’t already been in use. I didn’t think people would be designing chips by hand in 2021. 

The design phase is done on systems, as you have to design your units. However, the construction of those units and the layout are automated processes. Mostly. (Obviously, each design team works a little differently.) Something about wasting 10 million USD on a mistake you made, rather than letting the automated systems do their job, comes to mind.


6 hours ago, Taf the Ghost said:

[...] It's a lot like 3D printing: really important, but it has its place, rather than taking over everything.

The problem with 3D printing is the range of materials it can print, but metal printers are slowly becoming a thing (a cheaper thing), and then many more use cases will appear. Though if you really want to, you can already use 3D printers to cast metal parts; all you need is some sand.

5 hours ago, DrMacintosh said:

Reject humans. Embrace automation. 
 

On a more serious note, I’m surprised technology like this hasn’t already been in use. I didn’t think people would be designing chips by hand in 2021. 

People make the blocks, but most of the layout is probably already done with auto-routing software. I remember hearing some time ago that this was one of Intel's advantages: they did more of the layout manually, so they ended up with better-optimized layouts. Not sure how true that is, of course.


Very cool. 


Step one of Judgement Day: AI that can create its own CPUs...


AI can create a lot of things; that doesn't mean they're useful or high quality.

 


3 hours ago, Heliian said:

AI can create a lot of things; that doesn't mean they're useful or high quality.

 

They're supposedly faster and more efficient than their 2017 TPUs. That, in my book, is GOOD.


14 hours ago, poochyena said:

is it actually as good or better than human designs though?

We've had computer-generated chips before, and they were pretty crap. Besides, chips aren't just layout.


14 hours ago, poochyena said:

is it actually as good or better than human designs though?

This is the question lol


Rough idea as to what the design of the chip looked like:

 

https://www.techpowerup.com/img/t6dw81UctSMnDoaw.jpg

 

Quote

the study explains that the system was trained on over 10,000 microchip designs, and that when faced with a selection of macro blocks to arrange in the floorplanning stage of microchip design, novelty iterations of components were found to outperform those designed by teams of human engineers.

Sauce: https://www.techpowerup.com/283235/ai-designed-microchips-now-outperform-human-designed-ones
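
For a rough feel of the sequential-placement framing the study describes, here's a heavily simplified sketch: macros get placed one at a time onto a grid, each placement scored by a proxy cost. The real system uses a trained policy network to pick positions; this sketch just greedily scores every free cell, and the grid size, macro list, and connectivity are all made up.

```python
# Heavily simplified sketch of sequential macro placement. The learned
# policy is replaced by a greedy search over free grid cells; the grid,
# macros, and connectivity below are invented.

GRID = 8
macros = ["cpu", "gpu", "mem0", "mem1"]          # placed in this order
connected = {("cpu", "mem0"), ("gpu", "mem1"), ("cpu", "gpu")}

def proxy_cost(placed):
    # Manhattan distance between connected, already-placed macros.
    return sum(abs(placed[a][0] - placed[b][0]) +
               abs(placed[a][1] - placed[b][1])
               for a, b in connected if a in placed and b in placed)

placed = {}
for m in macros:
    free = [(x, y) for x in range(GRID) for y in range(GRID)
            if (x, y) not in placed.values()]
    # Greedy stand-in for the learned policy: try every free cell.
    placed[m] = min(free, key=lambda cell: proxy_cost({**placed, m: cell}))

print(placed, "-> proxy cost:", proxy_cost(placed))
```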


1 minute ago, DildorTheDecent said:

Rough idea as to what the design of the chip looked like:

 

https://www.techpowerup.com/img/t6dw81UctSMnDoaw.jpg

 

Sauce: https://www.techpowerup.com/283235/ai-designed-microchips-now-outperform-human-designed-ones

This reminds me of chess and Go, or the whole machine-compilation thing: it is possible to compile stuff by hand, and there were people who claimed they could do it better.


"AI" = Algorithm.

 

AI doesn't exist yet; it's very annoying seeing the media misuse the term.

