
OpenAI unveils "Sora." A prompt-based short video generator with amazing results

On 2/20/2024 at 11:34 AM, Kisai said:

Really, the problem here is that an AI that consumes that much power is only going to be justified if it's constantly in use (hence the cloud applications.)

I shun Crypto for its power consumption (as much as 1% of the world's energy use), so it's only right that I also criticize IT infrastructure for using as much as 1.5% of the world's energy.

 

Now, Crypto uses 1% of the world's energy to defraud marks, so it's a humongous liability. With datacenters, you do get utility out of them: the internet is useful, and I even count entertainment as utility, so I feel the energy use is more justified.

 

Still, you are right: training ML models is making the climate change problem worse. I think ML will overall improve our odds of reversing the problem, but that's wishful thinking on my part. So far it's basically just more cat pictures.

 

23 hours ago, leadeater said:

Nobody has to do anything; that's kind of beside the point really. My pessimistic view is that the majority of deployed computational power in the world is consumed by non-productive tasks that have or deliver low social value.

That's likely the case. E.g. all the bandwidth used to stream TikTok videos.

 

16 hours ago, Kisai said:

True, it doesn't need to be an LLM, but many of these chatbots are simply "here's our webpage on (thing)". Good grief, if you've ever tried to get "help" from Google you've probably experienced this kind of "it seems you are asking about (thing), here's a completely irrelevant page that contains only one of the words you said", and Google is getting worse at it.

Google has gotten pretty bad at being a search engine, just like Amazon has gotten pretty bad at surfacing relevant stuff to buy.

 

I advocate for Local Generative AI. Rather than centralized servers, models should run on end-user machines; that might fix a lot of the problems with centralized loss leaders and realign incentives.

 

Microsoft just cannot keep subsidizing Bing Chat. My prediction is that within five years, all new laptop CPUs will be able to run sparse models, and Windows Update will let you download an 80 GB local multimodal generative AI model that replaces Cortana, Copilot, and Windows Search, giving you a virtual assistant that runs locally, has no privacy problems, and is aligned to provide a good user experience.
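
Purely as a sketch of what "local" already looks like today (this assumes the llama-cpp-python bindings and a quantized GGUF model file you have downloaded yourself; the file name is made up, and it's an illustration rather than anything Microsoft would actually ship):

```python
# Illustrative only: a fully local assistant loop using llama-cpp-python.
# Assumes a quantized GGUF model on disk; the path below is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="./local-assistant-q4.gguf",  # hypothetical local model file
    n_ctx=4096,     # context window
    n_threads=8,    # run entirely on the laptop's CPU
)

while True:
    question = input("You: ")
    if not question:
        break
    # Nothing leaves the machine: the prompt and the reply stay local.
    reply = llm.create_chat_completion(
        messages=[{"role": "user", "content": question}],
        max_tokens=256,
    )
    print("Assistant:", reply["choices"][0]["message"]["content"])
```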


16 hours ago, Kisai said:

True, it doesn't need to be an LLM, but many of these chatbots are simply "here's our webpage on (thing)". Good grief, if you've ever tried to get "help" from Google you've probably experienced this kind of "it seems you are asking about (thing), here's a completely irrelevant page that contains only one of the words you said", and Google is getting worse at it. It sends completely unhelpful emails about services I do and do not pay for, and can't determine which client the email is for.

That's just Google though, and it's a chatbot for something extremely broad. Ours is actually quite helpful since it's narrowly focused on helping you enroll in courses, so it's a lot easier for it to give genuinely helpful information.

 

My insurance company also has a chatbot on their website for support, and it's so much faster and better than trying to call them, and the wait times aren't even bad. I refused to use it for so long because "they are always bad", but good ones aren't; they can only be good, quick, and easy to use when very, very narrowly focused.

 

Google support has to answer questions about literally anything, including questions that are, well... you know. So its difficulty curve is vastly higher; even humans struggle to do Google support effectively, if you can ever get to them.

 

16 hours ago, Kisai said:

I feel the reality is that every company feels compelled to jump on the generative AI bandwagon, even if that means it erodes customer trust.

But if we don't do it we might be left behind; lost opportunity is really important 🙃

 

16 hours ago, Kisai said:

Banks are pretty damn stupid about this too. My bank sends me new "terms and conditions" by post every time they change something, but never says what changed from the last version (hint: interest rates), so it's like "why waste printing and mailing this at all?" Because legally they are covering their ass by doing so, yet nearly everyone just throws the thing in the trash.

If it's like here, they are required by law to notify you, and they don't tell you what the changes are in the letter for information-accuracy reasons: by the time you get and read the letter, further changes could have been made. Directing you to the website ensures you only see the current, up-to-date information.


23 minutes ago, 05032-Mendicant-Bias said:

That's likely the case. E.g. all the bandwidth used to stream TikTok videos.

Or playing video games.

 

 

23 minutes ago, 05032-Mendicant-Bias said:

Microsoft just cannot keep subsidizing Bing Chat. My prediction is that within five years, all new laptop CPUs will be able to run sparse models, and Windows Update will let you download an 80 GB local multimodal generative AI model that replaces Cortana, Copilot, and Windows Search, giving you a virtual assistant that runs locally, has no privacy problems, and is aligned to provide a good user experience.

I am not so sure about that. Microsoft puts a lot of focus on Azure these days and is also investing a lot of money into building out their AI capabilities in Azure. I think Microsoft is more interested in building a cloud-based solution that can give them money in some way (subscription or whatnot) rather than giving it away for free and having it run locally in a private manner. It just doesn't sound like Microsoft to me.

They probably want to offload some things to the clients because it lowers their costs, but I doubt it will be used for more than some specific functions. I think the model we see today, the cloud- and subscription-based one, is here to stay.


2 hours ago, LAwLz said:

They probably want to offload some things to the clients because it lowers their costs, but I doubt it will be used for more than some specific functions. I think the model we see today, the cloud- and subscription-based one, is here to stay.

Microsoft 365 plus Copilot is how it is now, and that's how it will be for anything else. The demos of Copilot in Excel etc. look really good, but of course it costs extra to use, per month. I'd expect nothing else at this point from anyone.


2 hours ago, leadeater said:

My insurance company also has a chatbot on their website for support, and it's so much faster and better than trying to call them, and the wait times aren't even bad. I refused to use it for so long because "they are always bad", but good ones aren't; they can only be good, quick, and easy to use when very, very narrowly focused.

I'm not trying to pry, but... really?! Chatbots are at best lazy parsers that regurgitate FAQ info.

Specifically for insurance (automotive or medical), navigating a website isn't the issue. If I need their help, it's always going to be some odd corner-case situation that's as unique as the individuals involved in it. Disputes especially will always require a human being to assess how they're best handled on the back end with internal staff (bill coding, errors in entries, etc.).


20 minutes ago, StDragon said:

I'm not trying to pry, but... really?! Chatbots are at best lazy parsers that regurgitate FAQ info.

They are good when you ask questions that aren't basic but are buried in policy documents that I'm too lazy to read through to find the bit I need. Or if I want to cancel a policy, though I can now do that with an account login and the policy numbers linked to my account.

 

E.g. if I hit a sheep on a country road and only have Third Party, Fire and Theft, is this covered? The answer is yes, on the condition that you can prove negligence in maintaining fencing and keeping gates closed. You won't find that in the FAQ on the website, and you can get the answer in less than a minute, which you can't by calling them or by reading the policy document. I'd know, I asked that question 😉

 

The whole point is to be a lazy parser that actually is useful. I am lazier than a computer.
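
Very roughly, a narrow bot like that is just retrieval over a small set of policy documents, with a model on top to phrase the reply. A toy sketch (the snippets and the keyword scoring are made up for illustration, not my insurer's actual system):

```python
# Toy sketch of a narrowly scoped "lazy parser": pick the most relevant
# policy snippet for a question. Snippets and scoring are illustrative only.
POLICY_SNIPPETS = [
    "Third Party, Fire and Theft: collisions with livestock are covered only "
    "where owner negligence (fencing, gates) can be shown.",
    "Cancellation: policies linked to your account can be cancelled online.",
    "Windscreen damage: covered under comprehensive policies with an excess.",
]

def score(question: str, snippet: str) -> int:
    """Count words shared between the question and a snippet (crude relevance)."""
    return len(set(question.lower().split()) & set(snippet.lower().split()))

def answer(question: str) -> str:
    # A real bot would hand the best snippet to an LLM to phrase a reply;
    # here we just return the snippet itself.
    return max(POLICY_SNIPPETS, key=lambda s: score(question, s))

print(answer("I hit a sheep on a country road, am I covered under fire and theft?"))
```

Because the document set is tiny and on-topic, even something this crude mostly lands on the right snippet; that's the whole advantage of staying very narrowly focused.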


1 hour ago, leadeater said:

They are good when you ask questions that aren't basic but are buried in policy documents that I'm too lazy to read through to find the bit I need. Or if I want to cancel a policy, though I can now do that with an account login and the policy numbers linked to my account.

 

This, basically. LLMs can answer basic questions for which the answer has been figured out by someone somewhere, AND they can maybe provide the "dots" for a person to connect. With so much crap in internet search results and so much information to sift through, they can make it all intelligible.

 

The fear is when these LLMs and AIs get so good that people want to replace most human workers with them. Every economic model and philosophy in existence is based on the idea that there are two basic elements of an economy: resources and labor. If AIs and robotics can (eventually) do the labor, then what is there for humans to do? How do we then decide who gets what resources?

A concrete technical issue that even near-future sci-fi energy generation/collection won't solve is this: https://environmentalsystemsresearch.springeropen.com/articles/10.1186/s40068-020-00169-2#change-history, a scientific paper about the waste heat that data centers and other technologies generate. All the heat generated by using the technology can itself be an issue. Even if we generated the electricity from solar power collected in space using asteroid-mined resources and beamed it down to the surface, it may be that running computers and EVs on even power like that could be a problem.
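
To put a rough number on it (the figures below are illustrative assumptions, not taken from the paper): essentially every watt a data center draws ends up as heat that has to be rejected somewhere.

```python
# Back-of-the-envelope waste heat estimate. The IT load and PUE below are
# assumed example values, not figures from the linked paper.
it_load_mw = 100   # assumed IT load of a large data center, in megawatts
pue = 1.4          # assumed Power Usage Effectiveness (total power / IT power)

total_power_mw = it_load_mw * pue   # total electrical draw, including cooling
heat_output_mw = total_power_mw     # essentially all of it is dissipated as heat
heat_per_year_gwh = heat_output_mw * 24 * 365 / 1000

print(f"Total draw:  {total_power_mw:.0f} MW")
print(f"Waste heat: ~{heat_per_year_gwh:.0f} GWh per year to reject")
```

Scale that across thousands of facilities and the heat itself starts to matter, regardless of how cleanly the electricity was generated.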


7 hours ago, leadeater said:

The whole point is to be a lazy parser that actually is useful. I am lazier than a computer.

But not being satisfied, I would require an AI to obey my command with a single action. *SNAP*


