Sam Altman seeking $5-7 TRILLION in backing for OpenAI chip fabrication

tkitch
12 hours ago, tkitch said:

Sam doesn't need to have 7 Trillion sitting in his account at once.  He needs it over a period of time.

What he needs to do is put down the crackpipe.

 

7 Trillion. See, this is why drugs are bad!

Un-believable! 


14 hours ago, X-System said:

In the world, we have about $30 trillion... It wouldn't be easy to find $7 trillion.

 

https://www.sunnyavenue.co.uk/insight/how-much-money-is-in-the-world

That's... not how money works.



On 2/12/2024 at 7:49 PM, leadeater said:

Doesn't matter; like I said, it's not any different for TSMC etc. Nobody is going to be any worse or better off. In fact, if/when that does happen, so long as it's not ASML again selling to everyone, then they/we will be better off.

 

But until then building the clean facilities comes first, then you buy equipment off ASML, whatever that may be at the time.

 

We are a long way off non-silicon fabrication at mass scale with end-chip performance better than silicon-based parts. Not a huge concern for anyone, really.

 

On 2/12/2024 at 8:08 PM, Mark Kaine said:

well, they've been trying for decades... with "quantum computing"... it doesn't seem to be going anywhere; a calculator from 1960 is more powerful lol.

 

On 2/12/2024 at 8:28 PM, Sauron said:

Quantum computing is not meant to be a replacement for traditional computing; it's an application-specific technology that is multiple orders of magnitude better than traditional computing at those specific tasks.

Currently there's no reason to believe this will happen in the near future, nor has Altman provided any.

 

 

I wasn't arguing it was going to happen, just that silicon may not be a "forever" investment, especially at the cost of 4T. Don't forget that there are only two countries with GDP worth more than 5T.



1 hour ago, mr moose said:

I wasn't arguing it was going to happen, just that silicon may not be a "forever" investment, especially at the cost of 4T. Don't forget that there are only two countries with GDP worth more than 5T.

I wasn't saying silicon fabrication though; once you are a chip fabricator, you can do whatever the industry moves towards. Simply being one is wildly expensive, which is why there are so few.

 

The counterpoint was that your comment applies exactly the same to TSMC, Intel, GloFo, Samsung, Micron, UMC etc., so I don't really see how it matters a whole lot. All of these are essentially forever companies, while just designing AI chips for someone else to make definitely is not, and I'm pretty sure OpenAI knows that.

 

Edit:

Also, why is everyone so fixated on GDP? It's a totally irrelevant metric to be looking at. Again, TSMC's company value alone is nearly 17T, and that is one fabrication company. Notice how completely irrelevant GDP is. It's not like actual, real 5T-7T money is going to be spent, and certainly not in one year or five years.


i do believe we need something other than silicon in the future, also i think the whole qubits thing is a good thing, but i don't think we can go smaller than 1 atom...? i also think a lot could be done with optimization and better programming languages, like come on, games have reached several hundred GBs now and barely look above PS3 levels... no one uses their own engines anymore, it's all UE molasses......

 

on the other hand, maybe we have reached peak and no one really needs more computational power?  🤔


48 minutes ago, leadeater said:

All of these are essentially forever companies

ya, until someone comes along and makes something better and they become obsolete overnight?  wouldn't be the first time (*scrambles to find historical examples*)

 

like, i dunno,  kodak, nokia... 

 

ps: Boeing seems on their way out too! i never thought that'd be possible...  

 

 


6 hours ago, leadeater said:

Does money "work" at all? 🙃

It does when you got lots of it all in one place 😉 



1 minute ago, SimplyChunk said:

It does when you got lots of it all in one place 😉

counterpoint: you don't need money if you use paypal "pay later"!  😉


13 minutes ago, Mark Kaine said:

ps: Boeing seems on their way out too! i never thought that'd be possible...  

Nah, bad press and a few problems aren't going to make them go anywhere.

 

https://en.wikipedia.org/wiki/Boeing_Defense,_Space_%26_Security

 

They will be more than fineeeeeee

 

13 minutes ago, Mark Kaine said:

ya, until someone comes along and makes something better and they become obsolete overnight?  wouldn't be the first time (*scrambles to find historical examples*)

 

like, i dunno,  kodak, nokia... 

Those aren't very good examples. Those are end-product companies whose products and technologies are easily replaced. Fabricators are like banks, pharmaceutical, and food/agriculture companies: essential, foundational companies that are on the list of the oldest. While fabricators are "new", I don't see computer-driven society going anywhere, so as long as we need to "compute stuff", TSMC etc. will continue to exist.


1 minute ago, Mark Kaine said:

counterpoint: you don't need money if you use paypal "pay later"!  😉

That reminds me of an old Two Stupid Dogs episode...



42 minutes ago, Mark Kaine said:

i do believe we need something other than silicon in the future, also i think the whole qubits thing is a good thing, but i don't think we can go smaller than 1 atom...? i also think a lot could be done with optimization and better programming languages, like come on, games have reached several hundred GBs now and barely look above PS3 levels... no one uses their own engines anymore, it's all UE molasses......

 

on the other hand, maybe we have reached peak and no one really needs more computational power?  🤔

This feels like a piss take. UE5 games all look leaps and bounds better than UE3 games, with the tools it provides devs to realize their creative visions. It has some asset-loading stutter issues under the hood, but as a whole it's some hella optimized code.

I'm not sure what you mean by going smaller than an atom. No one is trying to, and that's not on the horizon. Node names have not made sense since before FinFET, and we are moving to GAAFET. We then have high-NA after that, and there is research on a new transistor shape to change to after that. Yes, there will come a time when silicon transistors can no longer continue. That's when a multi-trillion investment into a new material to completely replace silicon will happen for classical computing. It's not like there aren't a dozen candidates for that, but the money hasn't been put there because we are already so far up the silicon maximum that it would cost... well, trillions. Replacing silicon won't happen for another 40 years, and pivoting from silicon for a fab with all that knowledge is a lateral move.

Or we just make a cheap version of p-type GaN, and then it's all just GaN and 10+ GHz.
https://pubs.aip.org/aip/jap/article/128/9/090901/157804/Progress-on-and-challenges-of-p-type-formation-for

37 minutes ago, Mark Kaine said:

ya, until someone comes along and makes something better and they become obsolete overnight?  wouldn't be the first time (*scrambles to find historical examples*)

 

like, i dunno,  kodak, nokia... 

 

ps: Boeing seems on their way out too! i never thought that'd be possible...  

Kodak is a chemical company and doing just fine; they are solidly and consistently profitable and still a market leader in many niche areas. No, they are no longer a juggernaut trying to outcompete the old GE, but they don't try to be. Yes, they missed the boat on digital imaging because they didn't want to compete with themselves, and that caused problems, but the last decade has been very good for them.


6 minutes ago, Mark Kaine said:

ya, until someone comes along and makes something better and they become obsolete overnight?  wouldn't be the first time (*scrambles to find historical examples*)

 

like, i dunno,  kodak, nokia... 

 

ps: Boeing seems on their way out! i never thought that'd be possible!  

 

 

Big companies don't disappear overnight. They all have time to adjust to the new market, but some choose not to. It takes inflexible management to kill a big company, not a changing market. 

 

Boeing will be fine, eventually. They are still profitable and the US government wouldn't let them collapse for various reasons. Also, their main competitor, Airbus, can't ramp up production fast enough to take all of Boeing's business. They have a pretty captive market as one half of a near duopoly.

 


25 minutes ago, Monkey Dust said:

Big companies don't disappear overnight.

well, no, overnight wasn't to be taken literally, but it can happen...!

 

So theoretically: someone makes a new quantum computing device, or something similar not based on silicon... sure, the big companies will try to steal it, buy it, kill it, anything that keeps them on top of the chain, so this isn't very likely, but it's possible!

people always say this can't happen,  until it does.

 

(cars, planes, etc...)

 

 


40 minutes ago, starsmine said:

I'm not sure what you mean by going smaller than an atom

quantum computing works with gates only a few atoms wide, so what i was saying is even that has its limits.

 

ps: basically that's what I said above, i can't imagine gates smaller than an atom, it may not be possible after all, but at some point someone will come up with something new that's probably simpler yet superior. (time frame: unknown)

 


18 minutes ago, Mark Kaine said:

well, no, overnight wasn't to be taken literally, but it can happen...!

WeWork lol

 

Quote

In April 2023, WeWork faced delisting on the New York Stock Exchange as its stock price had fallen below a $1.00 threshold. The company was valued at $360.9 million, down from its $47 billion valuation in 2019.

RIP


Neuromorphic computing

 

The world's first supercomputer capable of simulating networks at the scale of the human brain has been announced by researchers from the International Centre for Neuromorphic Systems (ICNS) at Western Sydney University.

 

DeepSouth uses a neuromorphic system which mimics biological processes, using hardware to efficiently emulate large networks of spiking neurons at 228 trillion synaptic operations per second - rivalling the estimated rate of operations in the human brain.
 

Key Benefits of DeepSouth:

  • Super-fast, large scale parallel processing using far less power:  Our brains are able to process the equivalent of an exaflop — a billion-billion (1 followed by 18 zeros) mathematical operations per second — with just 20 watts of power. Using neuromorphic engineering that simulates the way our brain works, DeepSouth can process massive amounts of data quickly, using much less power, while being much smaller than other supercomputers.
  • Scalability: The system is also scalable, allowing for the addition of more hardware to create a larger system or scaling down for smaller portable or more cost-effective applications.
  • Reconfigurable: Leveraging Field Programmable Gate Arrays (FPGA) facilitates hardware reprogramming, enabling the addition of new neuron models, connectivity schemes, and learning rules—overcoming limitations seen in other neuromorphic computing systems with custom-designed hardware. DeepSouth will be remotely accessible with a front end that allows description of the neural models and design of the neural networks in the popular programming language Python. The development of this front-end enables researchers to use the platform without needing detailed knowledge of the hardware configuration.
  • Commercial Availability: Leveraging commercially available hardware ensures continual improvements of the hardware, independent of the team designing the supercomputer, overcoming limitations seen in other neuromorphic computing systems with custom-designed hardware. Custom chips take a large amount of time to design and manufacture and cost tens of millions of dollars each. Using commercial off-the-shelf configurable hardware means that the prototype would be easy to replicate at data centres around the world.
  • Artificial Intelligence: By mimicking the brain, we will be able to create more efficient ways of undertaking AI processes than our current models.

 

Now this is interesting research to watch out for. Brute-force AI is probably not where it's at.
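
For anyone curious what "describing a spiking neural model in Python" might even look like, here is a minimal leaky integrate-and-fire (LIF) neuron sketch. It is only an illustration of the kind of model a neuromorphic front end could let you express; it is not DeepSouth's actual API, and the function name and every parameter value here are made up.

# Minimal leaky integrate-and-fire (LIF) neuron, plain Python/NumPy.
# Illustrative only - not DeepSouth's interface; all constants are assumptions.

import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=20e-3, v_rest=-65e-3,
                 v_reset=-65e-3, v_thresh=-50e-3, r_m=10e6):
    """Simulate one LIF neuron driven by an input current array (amps).

    Membrane equation: tau * dV/dt = -(V - v_rest) + r_m * I
    A spike is emitted whenever V crosses v_thresh, then V resets.
    """
    v = v_rest
    spikes = []
    trace = np.empty(len(input_current))
    for i, current in enumerate(input_current):
        dv = (-(v - v_rest) + r_m * current) * (dt / tau)
        v += dv
        if v >= v_thresh:          # threshold crossing -> spike
            spikes.append(i * dt)  # record spike time in seconds
            v = v_reset            # reset membrane potential
        trace[i] = v
    return np.array(spikes), trace

# Drive the neuron with a constant 2 nA current for 200 ms.
steps = 200
spike_times, membrane = simulate_lif(np.full(steps, 2e-9))
print(f"{len(spike_times)} spikes in {steps} ms")

The appeal of neuromorphic hardware is that networks of units like this fire sparsely and asynchronously, rather than multiplying dense matrices every step the way current accelerators do.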


15 minutes ago, StDragon said:

Neuromorphic computing

[snip]

 

Now this is interesting research to watch out for. Brute-force AI is probably not where it's at.

I really wish I better understood the whole AI space. I've heard of this but not looked closer before. "AI" is still a growing field and leading techniques in use today are different from those even a year or two ago. Maybe this is the one beyond that? But for now, it is still implemented in silicon.

 

I also found myself wondering what "brute-force AI" even means. Many of the AI applications now rely on showing it enough training data so it knows what to do. Even if there are alternate implementations, they'll still need training.



2 hours ago, Mark Kaine said:

ps: Boeing seems on their way out too! i never thought that'd be possible...  

I'm not sure that's true and even if it were, which "new technology" would have been responsible for that?

20 minutes ago, StDragon said:

Artificial Intelligence: By mimicking the brain, we will be able to create more efficient ways of undertaking AI processes than our current models.

BIG [citation needed] on that one. Comparing neural networks to the human brain has always been a stretch and CNNs have lost even the superficial appearance of human neurons.

2 minutes ago, porina said:

"AI" is still a growing field and leading techniques in use today are different from those even a year or two ago.

Are they though? Recent "breakthroughs" have mostly been the result of iteration on ideas introduced several years ago. The differences in performance between GPT generations are in large part due to simply adding more data to the training sets. Similarly, afaik hardware has gotten faster mainly by adding more stuff rather than radically changing the way processing is done, at least in the last few years.

2 hours ago, Mark Kaine said:

i also think a lot could be done with optimization and better programming languages,  like come on, games have reached several hundred GBs

This has absolutely nothing to do with programming or optimization. The vast majority of data in a modern videogame is assets like textures and character models. The more detailed they are, the larger they get - it's just the inescapable reality of things. Better compression could help a little, but don't expect an orders-of-magnitude difference since (almost) everything is already compressed.

 

Poor code optimization can lead to bad performance, not to large size on disk.
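
To put some rough numbers on the asset-size point, here is a back-of-the-envelope sketch; every figure in it is an assumption for illustration, not a measurement from any particular game, and the helper name is made up.

# Back-of-the-envelope texture budget: why assets, not code, dominate install size.
# All numbers below are illustrative assumptions.

def texture_size_mb(width, height, bytes_per_pixel=0.5, mip_overhead=1.33):
    """Approximate on-disk size of one texture in MB.

    bytes_per_pixel=0.5 assumes a block-compressed format like BC1/DXT1;
    mip_overhead accounts for the full mipmap chain (~+33%).
    """
    return width * height * bytes_per_pixel * mip_overhead / 1e6

one_4k_texture = texture_size_mb(4096, 4096)
print(f"one 4K texture: {one_4k_texture:.1f} MB")  # ~11 MB even compressed

# A modern AAA game can ship thousands of unique materials,
# each with several texture maps (albedo, normal, roughness, ...).
materials = 5000
textures_per_material = 3
total_gb = materials * textures_per_material * one_4k_texture / 1000
print(f"texture budget alone: ~{total_gb:.0f} GB")  # ~167 GB with these made-up numbers

Even with aggressive block compression, a few thousand 4K materials already lands in the hundreds of gigabytes, which is why install size tracks asset resolution rather than how well the code is optimized.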



2 hours ago, Sauron said:

Poor code optimization can lead to bad performance, not to large size on disk

why not both.gif

 

 

seriously idk *why* games are so bloated nowadays, i just know it's not texture quality etc... most games rely heavily on upscaling tech ala dlss, and don't look that much better for it. not to mention physics, destructible environments etc take a huge backseat.

 

i made comparisons between ps3 and ps4 games where ps3 clearly looked better/ sharper and was accused of "photoshopping" ... that's how big the cope really is.

 

i say it again games have no reason to be this big, most of it is due to lack of optimization and bloated engines ala unreal.  also the idiotic need for "real time" lighting doesn't help when prebaked would free up so many resources and look just as good if not better! 

 

There's literally nothing impressive about "reflections", ps2 games could already do that 🙄 


3 hours ago, porina said:

I really wish I better understood the whole AI space. I've heard of this but not looked closer before. "AI" is still a growing field and leading techniques in use today are different from those even a year or two ago. Maybe this is the one beyond that? But for now, it is still implemented in silicon.

 

I also found myself wondering what "brute-force AI" even means. Many of the AI applications now rely on showing it enough training data so it knows what to do. Even if there are alternate implementations, they'll still need training.

 

Most of the stuff you see called AI atm isn't AI at all. It's an Expert System: a program designed to do a specific task and highly flexible within that, but incapable of stepping outside it. And some things being called that don't even qualify as Expert Systems; they're a step down from that.
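
For illustration, here is a toy example of what "expert system" classically means: hand-written domain rules that are flexible within one task and useless outside it. It is purely a sketch under that classic definition, with made-up rules and names, and not a claim about how any shipping "AI" product is implemented.

# Toy rule-based "expert system": fixed domain knowledge, no learning.
# Purely illustrative; rules and thresholds are invented for the example.

RULES = [
    # (condition over facts, conclusion)
    (lambda f: f["temp_c"] > 90 and f["fan_rpm"] == 0, "fan failure likely"),
    (lambda f: f["temp_c"] > 90,                        "thermal throttling likely"),
    (lambda f: f["fan_rpm"] > 3000,                     "fan curve too aggressive"),
]

def diagnose(facts):
    """Fire the first matching rule; a real system would chain many rules."""
    for condition, conclusion in RULES:
        if condition(facts):
            return conclusion
    return "no rule matched - outside this system's domain"

print(diagnose({"temp_c": 95, "fan_rpm": 0}))     # -> fan failure likely
print(diagnose({"temp_c": 40, "fan_rpm": 1200}))  # -> no rule matched ...

The point of the example is the last line: anything not covered by the hand-written rules simply falls outside the system's competence, which is the "can't step outside its task" property described above.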


34 minutes ago, CarlBar said:

 

Most of the stuff you see called AI atm isn't AI at all. It's an Expert System: a program designed to do a specific task and highly flexible within that, but incapable of stepping outside it. And some things being called that don't even qualify as Expert Systems; they're a step down from that.

these systems are totally ok, a logical evolution, it's just absolutely disgraceful to market them as "AI", they're dumb as bread lol.


52 minutes ago, CarlBar said:

Most of the stuff you see called AI atm isn't AI at all.

I'll leave it at this: if the industry in general calls something AI, it's AI. Personally, I'm not interested in getting into definitions of what AI is or should be.


2 hours ago, Mark Kaine said:

seriously idk *why* games are so bloated nowadays, i just know it's not texture quality etc...

Yes it is. Well, textures, models, audio etc.

2 hours ago, Mark Kaine said:

most games rely heavily on upscaling tech ala dlss

Not always the case, and DLSS upscales the whole output image, not individual textures and models. It's possible that sometimes assets are needlessly high resolution or not well compressed and end up being mostly wasted space; I think there was controversy a while back because a COD game did not bother compressing the audio and ended up with ~50 GB of FLAC files... but that's not a given. And again, it has nothing to do with code optimization.

2 hours ago, Mark Kaine said:

not to mention physics, destructible environments etc take a huge backseat.

Pretty irrelevant for size.

2 hours ago, Mark Kaine said:

i say it again games have no reason to be this big, most of it is due to lack of optimization and bloated engines ala unreal.  also the idiotic need for "real time" lighting doesn't help when prebaked would free up so many resources and look just as good if not better! 

Prebaked lighting would occupy more space on disk, not less. It would make the game run faster but definitely not less "bloated".
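
A rough sketch of why baked lighting eats disk space (all numbers are assumptions for illustration, not measurements from any engine, and the helper name is made up):

# Estimate baked-lightmap storage for one level. Illustrative assumptions only.

def lightmap_gb(level_area_m2, texels_per_meter=16, bytes_per_texel=4):
    """Estimate lightmap storage for one level.

    texels_per_meter: lightmap density (a middling quality setting)
    bytes_per_texel: e.g. HDR color or a few directional coefficients
    """
    texels = level_area_m2 * texels_per_meter ** 2
    return texels * bytes_per_texel / 1e9

# A large open level with ~2 km^2 of lit surface area:
print(f"~{lightmap_gb(2_000_000):.1f} GB of baked lightmaps for one level")

And that is per level, often duplicated per quality tier, whereas real-time GI computes lighting at runtime and ships comparatively little data.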

1 hour ago, CarlBar said:

Most of the stuff you see called AI atm isn't AI at all.

I agree with the sentiment but it kind of depends on what "AI" is even understood to mean. Just a few years back you would use "AI" to refer to simple game NPC behavior.


24 minutes ago, Sauron said:

Yes it is. Well, textures, models, audio etc.

 

 

I agree with Mark Kaine. It's not that simple. There are many examples of games out there that seem to have equal fidelity to others, if not more, and can have vastly smaller file sizes. Games have released patches that drastically reduce install size after the fact, etc. It can depend on a lot of factors. There's no good reason that Warzone needs to be as big as it is, for example, as far as I'm concerned.

