
Sam Altman seeking $5-7 TRILLION in backing for OpenAI chip creation

tkitch

Did Altman provide details of why he needs $5 trillion?

 

How will he be held accountable, considering the timespan for a $5 trillion endeavor to yield results?

 

Chris Roberts is that you?


Quote
  • Altman has said AI chip limitations hinder OpenAI's growth, and as this project would increase chip-building capacity globally, he is in talks with investors, including the United Arab Emirates government, per the Journal.

To me this sounds very much like a MASSIVE increase in production capacity... To do that he needs to either create an ASML competitor or invest heavily in ASML to get them to increase production.

 

Then you have silicon production: the actual making of the blank wafers, interposers, fabbing, packaging, testing, researching nodes or licensing TSMC or Samsung tech to get started, and getting the people. There is a shortage of people who know how to actually do most of this outside of the likes of Asia, and even there they are in high demand... 

 

Yeah, if he wants to actually increase the GLOBAL production capacity by any significant amount, it's going to cost a lot of money...

 

Quote

On Wednesday, Altman posted on X that OpenAI believes "the world needs more ai infrastructure--fab capacity, energy, datacenters, etc--than people are currently planning to build."

Yeah... Fabs, energy, datacenters, "etc", so that confirms he's talking about the WHOLE supply chain.

 

Spoiler

He added that "building massive-scale AI infrastructure, and a resilient supply chain, is crucial to economic competitiveness" and that OpenAI would try to help.

This pretty much confirms the idea that he's talking about the whole shebang.

 

Source: https://www.cnbc.com/2024/02/09/openai-ceo-sam-altman-reportedly-seeking-trillions-of-dollars-for-ai-chip-project.html


19 hours ago, ToboRobot said:

If you compare the expense to GDP it's comparable to the Manhattan project for the people who think it's unreasonable.

I'm too lazy to do the math. sorry.

i'll do the maths then.

 

the quoted number is pretty much 1000 years of nvidia R&D budget, at their current rate.

 

basically, what can be concluded is that this quote to make this thing happen is equal to the work nvidia could do over the course of 1000 years. y'know.. nvidia, who are building some of the most powerful compute chips available, and make a whole new generation that's even more powerful every few years.

 

also - nvidia spends 232 dollars per second on R&D, just to pull this back to numbers that are easier to grasp.

 

then, according to google, the manhattan project took 26 billion, adjusted for inflation.

manhattan project took 4 years, which reduces to 6.5 billion per year, or only about 206 bucks per second.

 

let's assume this 7 trillion quote is a "200 year plan" to design and build a new platform from the ground up, integrate it into a compute product, and turn it into a corporation that can maintain itself over the next 200 years...

this is still over 1000 dollars per second at the 7T mark, or 790-ish dollars per second at the 5T mark.

 

and that assumes that over the next 200 years this business has no foreseeable form of income past the investment budgets...

 

i'll toss in a very different comparison. let's say one theoretical employee costs a business $10k per month (wage cost, office space, benefits, taxes, etc.); 7T is enough to support the full career (starting at 18, and retiring at 75) of over a million people.

 

essentially, the quoted budget is so far into the unreasonable that there are endless comparisons to show how ridiculous it is, to the point this *has* to be a mistake...
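For anyone who wants to check or tweak these figures, here's a minimal Python sketch of the arithmetic above, using the numbers quoted in this post (~$232/second of nvidia R&D, a ~$26B inflation-adjusted Manhattan Project over 4 years, a $10k/month fully-loaded employee cost). The seconds-per-year constant and the rounding are assumptions of the sketch, not figures from the reporting.

```python
# Back-of-the-envelope checks for the comparisons above.
# All dollar figures are the rough values quoted in the post, not audited numbers.

SECONDS_PER_YEAR = 365 * 24 * 3600            # ~31.5 million seconds

ask_low, ask_high = 5e12, 7e12                # the $5T / $7T figures being discussed

# Nvidia R&D: ~$232/second as quoted above -> ~$7.3B/year
nvidia_rnd_per_year = 232 * SECONDS_PER_YEAR
print(f"Nvidia R&D per year: ~${nvidia_rnd_per_year / 1e9:.1f}B")
print(f"$7T in years of Nvidia R&D: ~{ask_high / nvidia_rnd_per_year:.0f} years")

# Manhattan Project: ~$26B (inflation adjusted) over 4 years
manhattan_per_second = 26e9 / 4 / SECONDS_PER_YEAR
print(f"Manhattan Project spend: ~${manhattan_per_second:.0f}/second")

# A hypothetical "200 year plan": spend rate per second at each ask
for ask in (ask_low, ask_high):
    rate = ask / 200 / SECONDS_PER_YEAR
    print(f"${ask / 1e12:.0f}T over 200 years: ~${rate:.0f}/second")

# Employee comparison: $10k/month fully loaded, career from 18 to 75 (57 years)
cost_per_career = 10_000 * 12 * (75 - 18)
print(f"Full careers covered by $7T: ~{ask_high / cost_per_career:,.0f}")
```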


2 minutes ago, manikyath said:

i'll do the maths then.

 


I'm sorry, but you missed the target: the math was meant to compare the ratio of the project's cost to GDP, and I don't see any calculations for that. Your calculations on R&D spending per second are unfortunately not useful. 


2 minutes ago, ToboRobot said:

I'm sorry, but you missed the target: the math was meant to compare the ratio of the project's cost to GDP, and I don't see any calculations for that. Your calculations on R&D spending per second are unfortunately not useful. 

but the USA's GDP is irrelevant to the matter.

 

as for the calculations per second not being useful.. it's to convert this very theoretical number into something the average joe can understand as a unit of money. yes, that number means nothing, but it's an example of just how far into ridiculous territory this goes.

 

and to address your one example.. let's put this in the context of US GDP...

 

manhattan project was 0.36% of GDP for 4 years, or 1.4% if the whole project cost was compacted into one year, in 1942.

 

5T in 2023 is 18.27% of USA's GDP.

7T in 2023 is 25.58% of USA's GDP.

 

or in other words, we'd have to spread out this budget over a timespan of 50-70 years, during which this setup has no other form of income whatsoever, in order to make it equal in cost relative to US GDP to the manhattan project.

 

or - like i said:

15 minutes ago, manikyath said:

essentially, the quoted budget is so far into the unreasonable that there are endless comparisons to show how ridiculous it is, to the point this *has* to be a mistake...

 

as a sidenote - the R&D budgets of apple, nvidia, and TSMC combined only add up to about 36 billion per year.

 

we could also consider a dozen fabs. TSMC seems to quote about $20 billion per fab; let's double this to be on the safe side.. that only brings us to 480 billion, for a DOZEN fabs...
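As a rough sketch of the GDP comparison above (assuming the ~$27.4T 2023 US GDP implied by the quoted percentages, and the 0.36%-per-year Manhattan Project figure), the same numbers can be reproduced like this:

```python
# Rough GDP-ratio comparison using the figures quoted above.
US_GDP_2023 = 27.36e12         # implied by "5T = 18.27%"; an assumption, not an official figure
MANHATTAN_PCT_PER_YEAR = 0.36  # Manhattan Project's share of GDP per year, as quoted above

for ask in (5e12, 7e12):
    pct_of_gdp = ask / US_GDP_2023 * 100
    # years the spend would need to be spread over to match the Manhattan
    # Project's per-year share of GDP
    years_to_match = pct_of_gdp / MANHATTAN_PCT_PER_YEAR
    print(f"${ask / 1e12:.0f}T: {pct_of_gdp:.2f}% of 2023 US GDP, "
          f"~{years_to_match:.0f} years to match the Manhattan Project's annual share")

# Fab construction sanity check: a dozen fabs at ~$20B each, doubled to be safe
print(f"A dozen fabs, doubled: ${12 * 20e9 * 2 / 1e9:.0f}B")
```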


2 hours ago, manikyath said:

the quoted number is pretty much 1000 years of nvidia R&D budget, at their current rate.

Nvidia's R&D budget doesn't account for hyper-expensive fab infrastructure because they simply don't have any. If this were only about chip design and paying someone else to make it, as Nvidia does, then looking at Nvidia's R&D budget would be more useful, but this simply is not about that, so it's really not useful at all.

 

Just in owned physical land, buildings and equipment, TSMC has $3 trillion worth, so if OpenAI wants to even attempt to come close to TSMC's wafer production, there is your absolute minimum cost, and that's not even close to everything actually required.

 

The money being sought is not outlandish, wanting to be another TSMC+Nvidia out the gate is.


18 hours ago, tkitch said:

how do you figure?
TSMC has a total market valuation of like 1/10th of that?  

Or would he be trying to build something on a scale to dwarf TSMC and Intel combined?

 

I was referring to what it would take if he wanted to create an entirely new fab industry. That'd require building a completely parallel system that, even with licensing, would need R&D investment to the tune of a trillion USD. Duplicating the most advanced supply chain of the most advanced production lines in human history is really, really expensive. A big build-out of new fabs over a decade would cover most of that first 1 trillion USD.


2 hours ago, leadeater said:

Nvidia's R&D budget doesn't account for hyper-expensive fab infrastructure because they simply don't have any. If this were only about chip design and paying someone else to make it, as Nvidia does, then looking at Nvidia's R&D budget would be more useful, but this simply is not about that, so it's really not useful at all.

 

Just in owned physical land, buildings and equipment, TSMC has $3 trillion worth, so if OpenAI wants to even attempt to come close to TSMC's wafer production, there is your absolute minimum cost, and that's not even close to everything actually required.

 

The money being sought is not outlandish, wanting to be another TSMC+Nvidia out the gate is.

i was looking at nvidia's R&D budget because, aside from apple, they're the highest in the space. nvidia actually spends more on R&D than TSMC does. besides that, TSMC's quotes for how much they invest every time they build a new fab are relatively low when speaking in this price bracket (to the tune of $20B per fab).

 

even for this outlandishly ridiculous goal, it is difficult to see what 7 trillion dollars would go into before such a business would start generating revenue and become self-sufficient in its growth. - and that's really my point here... if he's looking for 7T in investment, that would imply he needs that much money to get this going before it starts becoming profitable in any way at all, and this figure is so far beyond what even the giant players in the space are investing into their own growth that it's difficult to imagine any sensible human being would think this big.

 

the point is not that the money "can't" be well spent, the point is that it's genuinely very difficult for "a single entity" to spend that much money within the timeframe of "an investment".

 

15 minutes ago, Taf the Ghost said:

A big build-out of new fabs over a decade would cover most of that first 1 trillion USD.

but be honest with yourself.. it makes no sense to "build all of it, at once, in one investment" - because that's the only way you're realistically getting to this figure. i'd assume you proof-of-concept this at a sensible scale and expand from there... but maybe common sense and AI don't mix well.

 

it's sort of as if framework were looking for 200x the investment money they were initially looking for, so they could bring every shape and size of laptop to market all at once. that's not how a sensible business works, and that's not how a sensible investment works.


18 minutes ago, manikyath said:

the point is not that the money "can't" be well spent, the point is that it's genuinely very difficult for "a single entity" to spend that much money within the timeframe of "an investment".

And spending it that fast will just lead to many things you wish you hadn't done, or that you find could have been done better. No such thing as perfect plans and perfect implementation. 


1 hour ago, manikyath said:

but maybe common sense and AI don't mix well

They never did


18 hours ago, Salted Spinach said:

Did Altman provide details of why he needs $5 trillion?

 

How will he be held accountable, considering the timespan for a $5 trillion endeavor to yield results?

 

Chris Roberts is that you?

Indeed.

 

Even if the expansion were warranted to the extent claimed, Sam Altman shouldn't be trusted with the keys to this manufacturing.

 

We need ML models to be more open, not closed, and even if Sam Altman were able to lead the expansion, he could make accelerators that ONLY work with OpenAI models via homomorphic encryption, locking all his competitors out of the race.

 

Sam Altman is the last person who should be entrusted with strategic chip-making leadership. I already don't believe he can be trusted with ML development, since as soon as what he discovered proved to be valuable, he jettisoned the Open part of OpenAI.


59 minutes ago, 05032-Mendicant-Bias said:

Indeed.

 

Even if the expansion were warranted to the extent claimed, Sam Altman shouldn't be trusted with the keys to this manufacturing.

We need ML models to be more open, not closed, and even if Sam Altman were able to lead the expansion, he could make accelerators that ONLY work with OpenAI models via homomorphic encryption, locking all his competitors out of the race.

Sam Altman is the last person who should be entrusted with strategic chip-making leadership. I already don't believe he can be trusted with ML development, since as soon as what he discovered proved to be valuable, he jettisoned the Open part of OpenAI.

Feels more like just a play to get into the fabrication market. AI/ML etc. won't be the hot thing forever and will change vastly over time, while fabricators don't change and can make whatever chips are in demand at the moment.

 

You get into AI/ML for today; you get into fabrication for tomorrow/forever.


1 hour ago, leadeater said:

Feels more like just a play to get into the fabrication market. AI/ML etc. won't be the hot thing forever and will change vastly over time, while fabricators don't change and can make whatever chips are in demand at the moment.

You get into AI/ML for today; you get into fabrication for tomorrow/forever.

Right up till they develop a process that doesn't use current lithography tech.  Then you have a few trillion dollars worth of scrap no one wants.  People are still trying to develop biological matter for information storage and who knows where quantum will take us.

 

 


26 minutes ago, mr moose said:

Right up till they develop a process that doesn't use current lithography tech.  Then you have a few trillion dollars worth of scrap no one wants.  People are still trying to develop biological matter for information storage and who knows where quantum will take us.

That's the same for everyone else too, and older processes aren't useless, not even after a decade. That's really the point: anyone can design chips, few can actually make them.


19 hours ago, leadeater said:

That's the same for everyone else too, and older processes aren't useless, not even after a decade. That's really the point: anyone can design chips, few can actually make them.

But what if a new technology obsoletes ALL silicon based chips?


2 hours ago, mr moose said:

But what if a new technology obsoletes ALL silicon based chips?

Doesn't matter; like I said, it's no different for TSMC etc. Nobody is going to be any worse or better off. In fact, if/when that does happen, so long as it's not ASML again selling to everyone, then they/we will be better off.

 

But until then, building the cleanroom facilities comes first; then you buy equipment from ASML, whatever that may be at the time.

 

We are a long way off non-silicon fabrication at mass scale with end-chip performance better than silicon-based designs. Not a huge concern for anyone, really.


Sam....who?

 

is he like the Phil Spencer of nobodies? 


2 hours ago, mr moose said:

But what if a new technology obsoletes ALL silicon based chips?

well, they've been trying for decades... with "quantum computing"... it doesn't seem to be going anywhere; a calculator from 1960 is more powerful lol.


13 minutes ago, Mark Kaine said:

Sam....who?

 

is he like the Phil Spencer of nobodies? 

The guy runs OpenAI, the group that made ChatGPT.


13 minutes ago, Mark Kaine said:

well, they've been trying for decades... with "quantum computing"... it doesn't seem to be going anywhere; a calculator from 1960 is more powerful lol.

huh?
While I get the sentiment that it's always been the "next big thing",
quantum computing has grown in leaps and bounds over the last few years. We only just started building semi-usable ones in 2019. 


1 minute ago, starsmine said:

The guy runs OpenAI, the group that made ChatGPT.

ah, mb... i honestly thought it was some game dev, not the next Elon Musk... 

 

2 minutes ago, starsmine said:

huh?
While I get the sentiment that it's always been the "next big thing",
quantum computing has grown in leaps and bounds over the last few years.

it's going nowhere though... the theory is sound, apparently they just can't find the right materials. 


31 minutes ago, leadeater said:

We are a long way off non-silicon fabrication at mass scale with end chip performance better than silicon based. Not a huge concern for anyone really.

Yup, steps to obsolete silicon:

Find something "better", find a way to make it at scale, then actually scale it. Even if they found something today that would clearly beat silicon it could be decades before it scales up.

 

1 minute ago, starsmine said:

Quantum computing the last few years has grown in leaps and bounds.

This. They are getting bigger and more complex all the time.


17 minutes ago, Mark Kaine said:

well, they've been trying for decades... with "quantum computing"... it doesn't seem to be going anywhere; a calculator from 1960 is more powerful lol.

Quantum computing is not meant to be a replacement for traditional computing; it's an application-specific technology that is multiple orders of magnitude better than traditional computing at those specific tasks.

2 hours ago, mr moose said:

But what if a new technology obsoletes ALL silicon based chips?

Currently there's no reason to believe this will happen in the near future, nor has Altman provided any.


On 2/9/2024 at 10:19 PM, Skipple said:

I... what? This has to be a mistake. $7 trillion is more money than the entire US federal operating budget. It's a quarter of the US GDP. The concept doesn't make sense. Like, $7 trillion on a future valuation I can maybe see, but what could possibly be the scope of a project that costs $7 trillion?

In the world, we have about $30 trillion... It wouldn't be easy to find $7 trillion.

 

https://www.sunnyavenue.co.uk/insight/how-much-money-is-in-the-world


1 hour ago, X-System said:

In the world, we have about $30 trillion... It wouldn't be easy to find $7 trillion.

 

https://www.sunnyavenue.co.uk/insight/how-much-money-is-in-the-world

you're getting how it works wrong:

The US has a GDP of over 25 Trillion a year.

 

Sam doesn't need to have 7 Trillion sitting in his account at once.  He needs it over a period of time.

 

Money that's spent gets circulated around and spent multiple times, and each time it's used it registers in GDP.

So the money Sam spends could end up circulating through OpenAI at least twice.
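To make the "spent over a period of time" point concrete, here's a minimal sketch assuming the "over 25 Trillion a year" US GDP figure above and a purely hypothetical 5-15 year spending window (the window length is an assumption for illustration, not anything Altman has announced):

```python
# How a $7T ask looks as an annual spend over a hypothetical multi-year window.
US_GDP_PER_YEAR = 25e12   # "over 25 Trillion a year", as stated above
TOTAL_ASK = 7e12

for years in (5, 10, 15):  # hypothetical spending windows, not from any stated plan
    per_year = TOTAL_ASK / years
    print(f"{years:>2} years: ~${per_year / 1e9:.0f}B/year, "
          f"~{per_year / US_GDP_PER_YEAR * 100:.1f}% of annual US GDP")
```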
