
Google AI designs* chip in 6 hours

Summary

Google has developed an artificial intelligence program that can design a chip 'floorplan' in six hours, something that takes months when done by human engineers. The floorplan is the basic physical layout of a silicon system-on-chip, determining the optimal positioning of the CPU, GPU, and memory blocks on the die. This method is being used to design Google's much-awaited next-generation TPUs.
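To make the floorplanning objective concrete, here is a toy sketch of half-perimeter wirelength (HPWL), one standard cost a placer tries to minimize. This is illustrative only: the block names, coordinates, and nets below are invented, and Google's actual system optimizes many more metrics than wirelength.

```python
def hpwl(placement, nets):
    """Half-perimeter wirelength: for each net, the half-perimeter of the
    bounding box around its connected blocks. placement: {block: (x, y)};
    nets: list of lists of block names."""
    total = 0
    for net in nets:
        xs = [placement[b][0] for b in net]
        ys = [placement[b][1] for b in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

# Hypothetical three-block floorplan with three two-pin nets.
placement = {"cpu": (0, 0), "gpu": (4, 0), "mem": (2, 3)}
nets = [["cpu", "mem"], ["gpu", "mem"], ["cpu", "gpu"]]
print(hpwl(placement, nets))  # → 14
```

Real placers trade wirelength off against congestion, timing, and power at the same time, which is what makes the search space so hard.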

 

Quotes

Quote

 

This usually takes months for a chip-design engineer to come up with, while Google’s new AI can design it in six hours’ time. The developers fed over 10,000 chip floorplans to help the AI system learn the task and come up with an optimal floorplan of a chipset that can deliver improved performance by using less power

 

My thoughts

This looks cool. My guess is that AI will lead to me getting fired in 2040, as I know batshit about AI.

 

Sources

A graph placement methodology for fast chip design | Nature

Google's AI Software Can Design a Chipset in Just Six Hours | Beebom

Google is using AI to design its next generation of AI chips more quickly than humans can - The Verge

Now Google is using AI to design chips, far faster than human engineers can do the job | ZDNet


Not surprising. There's a plethora of jobs that can and will be replaced by AI. Other jobs will pop up that we can't even imagine. It's going to be a tough transition for some. 

 

AI is also superb at designing material objects to be strong yet incredibly light in ways human engineers simply can't. At least not in any economical capacity. 

Current PC:

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


I know AI is awesome, but the hype makes me hate it.

Engineers spend that much time because the layout is optimized for a multitude of goals, and there are known algorithms that help.

Beware of people saying something is "optimal", especially when dealing with AI. It is usually fundamentally wrong: the result may be great/good/reasonable, but not optimal.
In this specific case, that would be "easily" verifiable by comparing against deterministic optimization algorithms that do produce optimal results (not really, as multi-objective optimization is NP-hard to verify).

Didn't look at the article or the links, but I believe it was the crackhead journalists claiming something to be optimal, once again.
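Forbidden Wafer's good-versus-optimal point can be made concrete with a toy example (the blocks and wire weights below are invented): for a handful of blocks we can brute-force the provably optimal 1-D placement, but the search space grows factorially, which is why real tools settle for "good" rather than verified-optimal.

```python
from itertools import permutations
import math

blocks = ["A", "B", "C", "D"]
# hypothetical wire weights between pairs of blocks
wires = {("A", "B"): 3, ("B", "C"): 2, ("A", "D"): 1, ("C", "D"): 4}

def cost(order):
    """Total weighted wire distance for a 1-D placement order."""
    pos = {b: i for i, b in enumerate(order)}
    return sum(w * abs(pos[u] - pos[v]) for (u, v), w in wires.items())

# Brute force: provably optimal, but only feasible for tiny instances.
best = min(permutations(blocks), key=cost)
print(best, cost(best))

# The same exhaustive check for just 50 blocks is hopeless:
print(math.factorial(50))  # ~3e64 candidate placements
```

Verifying optimality means beating every one of those placements, which is exactly the kind of check that blows up at real chip scale.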


We were supposed to use technology to work less since the beginning of time. And yeah, new jobs will appear: YouTubers, cinema actors, and sports players weren't a thing a century ago.

Anyway, this seems like AI creating AI, and that's kinda scary 😄


This feels a lot more like an "easy headline" and a "we couldn't hire good enough engineers to do the work" issue than some great advancement. A massive amount of chip design is already automated, and layout isn't just about the design itself. The node and the objectives of the silicon matter quite a lot. I could see this being useful after a design team (note: it's going to be an entire team) has figured out all of the requirements; that would allow the program to find extra efficiencies. 

 

This sounds a lot more like a precursor to "Google TPU delayed again". lol

 

@WolframaticAlpha AI isn't going to change as much as everyone is currently projecting. Frankly, it'll only really work well on fully digitized information sets. Further, all it's really good at is optimizing within the framework it has been given and upon the information that is available. It'll change some things a lot, but it'll actually put even more demand on "Systems" Engineers.

 

It's a lot like 3D printing: really important, but it has its place, rather than taking over everything.


@Forbidden Wafer AI will be good for optimization runs on many designs and engineering tasks, but AI is always limited by the framework into which it is set. So it'll be good at chess but terrible at judging sports outcomes.

1 minute ago, Taf the Ghost said:

@Forbidden WaferAI will be good for optimization runs on many designs and engineering tasks [...]

I believe it will end up being used as a heuristic, to speed up deterministic optimization or Monte Carlo simulations.
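A minimal sketch of that warm-start pattern (the cost function and `heuristic_guess` below are invented stand-ins; in practice a learned model would supply the guess): the same greedy descent reaches the tolerance band in far fewer steps from a heuristic start than from a cold one.

```python
def steps_to_converge(x, target=3.0, step=0.01, tol=0.05):
    """Greedy descent toward `target`; returns the iteration count."""
    n = 0
    while abs(x - target) > tol:
        x += step if x < target else -step  # move toward the target
        n += 1
    return n

cold_start = 0.0           # no prior knowledge of the solution
heuristic_guess = 2.8      # hypothetical prediction from a learned model

# The warm start converges in a small fraction of the steps.
print(steps_to_converge(cold_start), steps_to_converge(heuristic_guess))
```

The deterministic refinement still does the final, verifiable work; the model just shrinks how much of the search space it has to cover.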


Can someone give me a tl;dr on what's new? AI chip design by Google isn't new: https://www.nextplatform.com/2020/02/20/google-teaches-ai-to-play-the-game-of-chip-design/

I wonder what's actually the news here.

 

37 minutes ago, Forbidden Wafer said:

I know AI is awesome, but the hype makes me hate it.

Engineers spend that much time because the layout is optimized for a multitude of goals, and there are known algorithms that help.

Beware of people saying something is "optimal", especially when dealing with AI. It is usually fundamentally wrong: the result may be great/good/reasonable, but not optimal.
In this specific case, that would be "easily" verifiable by comparing against deterministic optimization algorithms that do produce optimal results (not really, as multi-objective optimization is NP-hard to verify).

Didn't look at the article or the links, but I believe it was the crackhead journalists claiming something to be optimal, once again.

It may not be optimal, but it should be good enough to meet all of the requirements; that's how GAs/evolutionary strategies work (and that may be the "optimal" case for a specific scenario).
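As a sketch of what igormp means, here is a minimal (1+1) evolution strategy, the simplest member of the GA/ES family, minimizing a toy sphere function (the function, starting point, and parameters are invented for illustration). It reliably finds a good solution quickly, but carries no optimality guarantee.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def es_minimize(cost, x0, sigma=0.5, iters=2000):
    """(1+1)-ES: mutate the current solution, keep the child if it's
    no worse than the parent."""
    x, c = list(x0), cost(x0)
    for _ in range(iters):
        child = [xi + random.gauss(0, sigma) for xi in x]
        cc = cost(child)
        if cc <= c:               # greedy survivor selection
            x, c = child, cc
    return x, c

sphere = lambda v: sum(xi * xi for xi in v)   # true minimum at the origin
x, c = es_minimize(sphere, [5.0, -3.0])
print(c)   # small, but not provably the global optimum
```

"Good enough to meet the requirements" is exactly the acceptance test here: you check the final cost against a spec, not against a proof of optimality.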

 

 

 

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga

13 minutes ago, igormp said:

It may not be optimal, but should be good enough to fit all of the needed requirements, that's how GAs/evolutionary strategies work (and that may be the "optimal" case for a specific scenario).

Good and optimal are different things.

Evolutionary strategies are often used as heuristics to speed up the search for optimal results. That's exactly where I think AI will be used.


Reject humans. Embrace automation. 
 

On a more serious note, I’m surprised technology like this hasn’t already been in use. I didn’t think people would be designing chips by hand in 2021. 

Laptop: 2020 13" MacBook Pro i5, 512GB, G7 Graphics, 16GB LPDDR4X | Phone: iPhone 8 Plus 64GB | Wearables: Apple Watch Sport Series 2 | CPU: R5 2600 | Mobo: ASRock B450M Pro4 | RAM: 16GB 2666 | GPU: ASRock RX 5700 8GB | Case: Apple PowerMac G5 | OS: Win 10 | Storage: 480GB PNY SSD & 2TB WD Green HDD | PSU: Corsair CX600M | Display: Dell 27 Gaming Monitor S2719DGF 1440p @155Hz, Dell UZ2215H 21.5" 1080p, ViewSonic VX2450wm-LED 23.6" 1080p | Cooling: Wraith Prism | Keyboard: G610 Orion Cherry MX Brown | Mouse: G303 | Audio: Audio Technica ATH-M50X & Blue Snowball
39 minutes ago, Forbidden Wafer said:

Good and optimal are different things.

Evolutionary strategies are often used as heuristics to speed up the search for optimal results. That's exactly where I think AI will be used.

Yeah, but the whole ML branch of AI ends up being just a heuristic in the end, and that's good enough for us, even if it gets stuck in some local minimum.

If optimal, deterministic algorithms were available/not so costly (most of those are NP-hard, as you said yourself), we would be using them instead.
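The local-minimum point can be illustrated with simulated annealing (the toy cost function below is invented for the example): by occasionally accepting a worse move, the search can hop out of a local minimum that pure hill climbing would be stuck in forever.

```python
import math
import random

random.seed(1)  # fixed seed so the run is reproducible

def anneal(cost, x, temp=2.0, cooling=0.999, iters=5000):
    """1-D simulated annealing; returns the best point and cost seen."""
    c = cost(x)
    xbest, cbest = x, c
    for _ in range(iters):
        y = x + random.uniform(-0.5, 0.5)
        cy = cost(y)
        # accept a worse move with probability exp(-delta / temp)
        if cy < c or random.random() < math.exp((c - cy) / temp):
            x, c = y, cy
            if c < cbest:
                xbest, cbest = x, c
        temp *= cooling  # cool down: fewer uphill moves over time
    return xbest, cbest

# local minimum of cost 1 near x = -1; global minimum of cost 0 at x = 2
cost = lambda x: min((x + 1) ** 2 + 1, (x - 2) ** 2)
x, c = anneal(cost, -1.0)   # deliberately start inside the local basin
```

With the temperature at zero this degenerates into hill climbing and stays trapped at cost ~1; the early high-temperature phase is what buys the escape.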



Nothing created by an AI should be patentable. 

Cor Caeruleus Reborn v6

Spoiler

CPU: Intel - Core i7-8700K

CPU Cooler: be quiet! - PURE ROCK 
Thermal Compound: Arctic Silver - 5 High-Density Polysynthetic Silver 3.5g Thermal Paste 
Motherboard: ASRock Z370 Extreme4
Memory: G.Skill TridentZ RGB 2x8GB 3200/14
Storage: Samsung - 850 EVO-Series 500GB 2.5" Solid State Drive 
Storage: Samsung - 960 EVO 500GB M.2-2280 Solid State Drive
Storage: Western Digital - Blue 2TB 3.5" 5400RPM Internal Hard Drive
Storage: Western Digital - BLACK SERIES 3TB 3.5" 7200RPM Internal Hard Drive
Video Card: EVGA - 970 SSC ACX (1080 is in RMA)
Case: Fractal Design - Define R5 w/Window (Black) ATX Mid Tower Case
Power Supply: EVGA - SuperNOVA P2 750W with CableMod blue/black Pro Series
Optical Drive: LG - WH16NS40 Blu-Ray/DVD/CD Writer 
Operating System: Microsoft - Windows 10 Pro OEM 64-bit and Linux Mint Serena
Keyboard: Logitech - G910 Orion Spectrum RGB Wired Gaming Keyboard
Mouse: Logitech - G502 Wired Optical Mouse
Headphones: Logitech - G430 7.1 Channel  Headset
Speakers: Logitech - Z506 155W 5.1ch Speakers

 


I am reminded that “calculator” used to be a profession before it was a thing that fit in your pocket before it was an app that fit on a thing you put in your pocket.

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

5 hours ago, DrMacintosh said:

Reject humans. Embrace automation. 
 

On a more serious note, I’m surprised technology like this hasn’t already been in use. I didn’t think people would be designing chips by hand in 2021. 

The design phase is done on systems, as you have to design your units. However, the construction and layout are automated processes. Mostly. (Obviously, each design team works a little differently.) Something about wasting 10 million USD on a mistake you made, rather than letting the automated systems do their job, comes to mind.

6 hours ago, Taf the Ghost said:

This feels a lot more like an "easy headline" and a "we couldn't hire good enough engineers to do the work" issue than some great advancement. A massive amount of chip design is already automated, and layout isn't just about the design itself. The node and the objectives of the silicon matter quite a lot. I could see this being useful after a design team (note: it's going to be an entire team) has figured out all of the requirements; that would allow the program to find extra efficiencies. 

 

This sounds a lot more like a precursor to "Google TPU delayed again". lol

 

@WolframaticAlpha AI isn't going to change as much as everyone is currently projecting. Frankly, it'll only really work well on fully digitized information sets. Further, all it's really good at is optimizing within the framework it has been given and upon the information that is available. It'll change some things a lot, but it'll actually put even more demand on "Systems" Engineers.

 

It's a lot like 3D printing: really important, but it has its place, rather than taking over everything.

The problem with 3D printing is the materials it can print, but metal printers are slowly becoming a thing (a cheaper thing), and then many more use cases appear. Though if you really want to, you can already use 3D printers to cast metal parts; all you need is some sand.

5 hours ago, DrMacintosh said:

Reject humans. Embrace automation. 
 

On a more serious note, I’m surprised technology like this hasn’t already been in use. I didn’t think people would be designing chips by hand in 2021. 

People make the blocks, but most of the layout is probably already done with auto-routing software. I remember hearing some time ago that this was one of Intel's advantages: they did more of the layout manually, so they ended up with better-optimized layouts. Not sure how true that is, of course.
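The classic core of such auto-routers is Lee's maze-routing algorithm: a breadth-first wave expansion over the routing grid that always finds a shortest path if one exists. A minimal sketch (the grid and pin positions are invented for illustration):

```python
from collections import deque

def lee_route(grid, start, goal):
    """Lee's algorithm: BFS wave over a grid (1 = blocked cell).
    Returns the shortest path length in steps, or None if unroutable."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    q = deque([start])
    while q:
        r, c = q.popleft()
        if (r, c) == goal:
            return dist[(r, c)]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in dist):
                dist[(nr, nc)] = dist[(r, c)] + 1
                q.append((nr, nc))
    return None  # the wave never reached the goal

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall forces the route around the right side
        [0, 0, 0]]
print(lee_route(grid, (0, 0), (2, 0)))  # → 6
```

Production routers layer a lot on top of this (multiple metal layers, via costs, rip-up and re-route), but the wavefront idea is the same.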


Very cool. 

Ryzen 7 3800X | X570 Aorus Elite | G.Skill 16GB 3200MHz C16 | Radeon RX 5700 XT | Samsung 850 PRO 256GB | Mouse: Zowie S1 | OS: Windows 10


Step one of Judgement Day: AI that can create its own CPUs...

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


AI can create a lot of things; that doesn't mean they're useful or high quality.

 

3 hours ago, Heliian said:

Ai can create a lot of things, doesn't mean they're useful or quality.

 

They're supposedly faster and more efficient than their 2017 TPUs. That, in my book, is GOOD.

14 hours ago, poochyena said:

is it actually as good or better than human designs though?

We've had computer-generated chips and they were pretty crap. Besides, chips aren't just layout.

AMD Ryzen 7 5800X | ASUS Strix X570-E | G.Skill 32GB 3733MHz CL16 | PALIT RTX 3080 10GB GamingPro | Samsung 850 Pro 2TB | Seagate Barracuda 8TB | Sound Blaster AE-9 MUSES Edition | Altec Lansing MX5021 Nichicon/MUSES Edition

14 hours ago, poochyena said:

is it actually as good or better than human designs though?

This is the question lol

My PCs:

VALENTINIAN : CPU: Ryzen 7 2700X || CPU COOLER : Corsair H115i Pro || MOBO : MSi B450 Tomahawk Max || GPU: ASUS GTX 1080 Ti Strix OC || RAM: 4x8GB Corsair Vengeance (3200) || SSDs: Samsung 970 Evo 250GB, Samsung 850 Evo 1TB x2 || PSU: EVGA G2 850W w/ Cablemod Black & White Cables || CASE: NZXT H510 White || MONITOR: Acer Predator X34A (1440p 100hz), HP 27yh (1080p 60hz) || KEYBOARD: GameSir GK300 || MOUSE: Logitech G502 Hero || AUDIO: HyperX Cloud Alpha, Logitech C920 || CASE FANS : 2x Corsair ML140, 1x BeQuiet SilentWings 3 120 ||

DIOCLETIAN III (HTPC) : CPU: Ryzen 5 1600 || CPU COOLER : Cooler Master Hyper 212 Black Edition || MOBO : MSi X370 Gaming Pro Carbon || GPU: ASUS GTX 1080 Strix OC || RAM: 2x8GB G.SKILL Ripjaws V (3200) || SSDs: Crucial P1 500GB, Crucial MX500 1TB || HDD: Seagate Barracuda 2TB || PSU: Seasonic 650W w/ Black & Red Extensions || CASE: Phanteks P300 || Monitor: Samsung Q60 65" QLED (4K 60hz) || KEYBOARD: Logitech G613 || Mouse: Logitech G305 || CONTROLLER: Xbox One Controller x2 || AUDIO: Samsung Q60R Soundbar || Case Fans : 2x Cooler Master Masterfan Pro 120, Noctua NF-F12 iPPC-2000 ||

JUSTINIAN - Dell XPS 15": CPU: Core i7-9750H || GPU: GTX 1650 || RAM: 2*8GB 2666MhZ DDR4 SODIMM || SSD: 1TB M.2 PCIe || CASE: 15.6" Laptop with dBrand skin || MONITOR: 15" 1920 * 1080 IPS || KEYBOARD: Dell Keyboard || MOUSE: Logitech G305 White || AUDIO: HyperX Cloud II ||

OTHER : Dell Latitude (i7-6600U, 16GB RAM, 500GB SSD) ||

MOBILE : Galaxy S9 (64GB + 64GB uSD) || Galaxy S7 (32GB) || FitBit Blaze || iPad 7th Generation ||

CONSOLE : Nintendo Switch (Pro Controller x3, Joy-Con Grip x1, PowerA Enhanced Controller x1) ||


Rough idea as to what the design of the chip looked like:

 

https://www.techpowerup.com/img/t6dw81UctSMnDoaw.jpg

 

Quote

the study explains that the system was trained on over 10,000 microchip designs, and that when faced with a selection of macro blocks to arrange in the floorplanning stage of microchip design, novelty iterations of components were found to outperform those designed by teams of human engineers.

Sauce: https://www.techpowerup.com/283235/ai-designed-microchips-now-outperform-human-designed-ones

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III

1 minute ago, DildorTheDecent said:

Rough idea as to what the design of the chip looked like:

 

https://www.techpowerup.com/img/t6dw81UctSMnDoaw.jpg

 

Sauce: https://www.techpowerup.com/283235/ai-designed-microchips-now-outperform-human-designed-ones

This reminds me of chess and Go, or the whole machine-compilation thing. It is possible to compile stuff by hand, and there were people who claimed they could do it better.



"AI" = Algorithm.

 

AI doesn't exist yet; it's very annoying seeing the media misuse the term.

System Specs:

CPU: Ryzen 9 3900X Stock

GPU: Gigabyte RTX 3080 Gaming OC

RAM: 16GB 3000MHz

HDD: 1TB Sabrent NVMe -  WD 1TB Black - WD 2TB Green -  WD 4TB Blue

MB: Gigabyte  B550 Gaming X- RGB Disabled

PSU: ThermalTake 750W Smart BX1 - RGB Disabled

Case: BeQuiet! Silent Base 801 Black

Cooler: DeepCool Castle 360EX

 

 

 

