
OpenAI unveils "Sora," a prompt-based short-video generator with amazing results

JustLovett0
1 hour ago, Stahlmann said:

Is it really that bad if AI takes away jobs if it's better at that job than a human would be?

It's bad for the people replaced, and possibly bad for society as a whole because our current system isn't designed to have a lot of unemployed people.

 

 

1 hour ago, Stahlmann said:

There will always be jobs where manual labor or human intervention is needed. Even if AI takes over some jobs, it also creates new ones. Currently, every AI needs a "copilot", for example. And while an AI could probably be trained to plan houses, it still can't build them.

I am not so sure manual labor will always be needed. It depends on how far into the future we look.

The problem is also that while some new jobs might be created, it seems like those jobs are right now going to be far fewer than the jobs replaced. 

Again, our current system isn't designed to have a lot of unemployed people around. Even if we are being selfish and thinking "fuck the people who will become unemployed. They chose the wrong career", the risk is that those people will end up doing, let's say less good things, which will have a negative impact on the people who did "choose the right career". 

 

I recently had my storage unit broken into, and the thieves stole a lot of consumables. They ignored the stuff that was worth money, like Lego and a lot of tools, and instead took things like shampoo, plastic bags, toothpaste, and toothbrushes. Chances are it was a homeless person who just needed things to live. If that person had a job, they might not have broken into my storage locker. The misery of some people can often, indirectly, cause bad things to happen to others. That's important to keep in mind.


1 hour ago, Stahlmann said:

While this sounds like a noble goal, this is never going to work. History has repeated itself so many times. That should be proof enough that such a utopia is nothing but pure fantasy.

When in history did this happen before?

 

 

6 minutes ago, Needfuldoer said:

Problem is, these "AI" models are all trying to automate the kind of creativity you'd expect people to want to pursue, instead of the tedious, menial work that has to be done for society to function but nobody actually wants to do.

 

"TormentNexusBot can compose an opera and paint a masterpiece simultaneously. You go back to stocking shelves and doing your taxes."

I don't think that's the case at all.

Someone good at painting didn't just wake up good one day and then start churning out amazing pieces of art. They got to where they were because they enjoyed the creative process itself. They liked drawing even when their drawings weren't that good. 

 

 

To me, saying that nobody will want to take on creative tasks because an AI can do that better is as absurd as saying "easy and cheap access to music has made it so that nobody wants to play an instrument anymore".

A camera can replicate landscapes with extreme detail, yet we still have plenty of people who enjoy painting landscapes with oil and canvas.

 

Freeing people up to pursue creative processes, without burdening them with demands for monetary gain, will, I hope, make people more willing to try different things and find something they enjoy doing without worrying about the financial aspects. Because right now, it seems like the artists worried about AI taking over have only one thing on their minds: how it will impact them economically once people no longer have to pay them to get art done.


19 minutes ago, Needfuldoer said:

Problem is, these "AI" models are all trying to automate the kind of creativity you'd expect people to want to pursue, instead of the tedious, menial work that has to be done for society to function but nobody actually wants to do.

 

"TormentNexusBot can compose an opera and paint a masterpiece simultaneously. You go back to stocking shelves and doing your taxes."

That's because a lot of science and tech people are bad at the creative aspects. I know, for example, that I wanted to try my hand at game development, but that died hard at needing "art" and "assets", so I gave it up.

 

People want to create tools to do things they cannot do themselves; that's a fundamental drive, along with making the things they can do easier.

 

What I find amusing is "artists" saying code-generated art is not art, while it's just another form of expression being shunned, like so many forms and styles of art throughout history. Blind gatekeeping exists everywhere.

 

But yes, we shouldn't be pushing human art aside before we have robots flipping burgers, for example.


5 hours ago, LAwLz said:

I find it interesting that the person pretending to have a crystal ball is talking about "fantasy vs reality".

Since you are so sure about what will happen, can you please enlighten us by sharing your wisdom? 

Politics isn't allowed here. I thought you would understand my meaning between the lines. I can't elaborate further.


32 minutes ago, LAwLz said:

I am not so sure manual labor will always be needed.

Grey-collar, nursing, and civil service occupations will come to dominate white-collar work by volume as AI replaces white-collar jobs. AI will augment the other occupations I mentioned.

 

The problem is what to do with the unemployed people with an IQ below 100. That's a lot of people in that category (50% of the population, by definition).

 

I won't elaborate further; I'm just posing the issue for others to ponder.


1 hour ago, LAwLz said:

It's bad for the people replaced, and possibly bad for society as a whole because our current system isn't designed to have a lot of unemployed people.

Why is someone entitled to a job if an AI can do it quicker and better? That would mean they're employed for no reason other than for the sake of working, not because they're contributing to society.

 

1 hour ago, LAwLz said:

I am not so sure manual labor will always be needed. It depends on how far into the future we look.

The problem is also that while some new jobs might be created, it seems like those jobs are right now going to be far fewer than the jobs replaced. 

Again, our current system isn't designed to have a lot of unemployed people around. Even if we are being selfish and thinking "fuck the people who will become unemployed. They chose the wrong career", the risk is that those people will end up doing, let's say less good things, which will have a negative impact on the people who did "choose the right career".

This issue has "plagued" humanity for thousands of years. Jobs constantly become obsolete because of technological advancement, and those people need to go in another direction. How many people are blacksmiths nowadays compared to medieval times, for example? How many people were putting together cars on an assembly line just 10-20 years ago? Are all of them without work today? Of course not; most of them found another job. This isn't a new problem just because of AI. It's a problem humanity has solved time and time again, and it will work itself out again.

 

1 hour ago, LAwLz said:

I recently had my storage broken into, and the thieves who did that stole a lot of consumables. They ignored the stuff that was worth money, like legos, a lot of tools and so on, and instead, they took things like shampoo, plastic bags, toothpaste and brushes, and so on. Chances are it was a homeless person who just needed things to live. If that person had a job they might not have broken into my storage locker. The misery of other people can often indirectly cause bad things to happen to other people. That's important to keep in mind.

Sure, but there are also a bunch of people who break into houses, maybe kill their inhabitants just because of a few gold necklaces. You make it sound like society made them break into your storage by not providing them with a job, but they made the decision to break in by themself. Don't take away a person's autonomy or personal accountability.

 

1 hour ago, LAwLz said:

When in history did this happen before?

When in history did it not happen is the better question. There was never a time when some universal system could be applied to everyone. There is always a gap in culture, in needs, and fundamentally in what people want from their lives.

 

Was there ever a point where people could peacefully live in a society without contributing to it? That's what you're doing while working: you're contributing to society to get currency in return, so you can pay someone else to provide you with something you don't have.


Everyone piling up drama about AI taking jobs is just screaming into the void imo. Why not bark at construction companies for replacing 10 people tediously digging a hole with shovels and pickaxes with one man driving a digger? You're not gonna stop AI from getting in the hands of people who learn to work with it, just as people couldn't stop robots from taking over assembly lines or literally any other tool that has been introduced since humanity began using them. If someone is easily replaced by an algorithm, what reason is there to keep that person in that job?

 

There are still plenty of jobs that aren't affected by this, for example tradesmen like plumbers, woodworkers, etc., of which there aren't enough anyway. The problem isn't that there are no jobs. The problem is that some people would rather not work at all than have to work on a construction site, for example. But that's a personal decision, not one made by society.

 

If all this is still just about the morality of "taking away jobs is bad", then practically any tool ever introduced falls under the same sentiment, and the only solution is basically to stop innovating. On the other hand, people without a doubt have a more comfortable and better life now than they would've had 100 years ago. We wouldn't have gotten here if humanity had just decided to stop innovating.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


1 hour ago, Stahlmann said:

Why is someone entitled to a job if an AI can do it quicker and better? That would mean they're employed for no reason other than for the sake of working, not because they're contributing to society.

I never said they were entitled to a job, nor did I say we should keep them employed.

Me saying that society is not designed to accommodate a lot of unemployed people does not mean I am saying we should keep people employed with "make-believe" jobs.

Again, if we go back to my original post you will see that I advocate for a future where nobody has a job.

 

 

1 hour ago, Stahlmann said:

This issue has "plagued" humanity for thousands of years. Jobs constantly become obsolete because of technological advancement, and those people need to go in another direction. How many people are blacksmiths nowadays compared to medieval times, for example? How many people were putting together cars on an assembly line just 10-20 years ago? Are all of them without work today? Of course not; most of them found another job. This isn't a new problem just because of AI. It's a problem humanity has solved time and time again, and it will work itself out again.

Maybe you're right that things will work themselves out, but the way I see it, the jobs being created are far fewer than the potential jobs lost, and the systems that are threatening to replace a lot of jobs in the future might be able to adapt to the potential future jobs that get created.

 

 

1 hour ago, Stahlmann said:

Sure, but there are also a bunch of people who break into houses, maybe kill their inhabitants just because of a few gold necklaces. You make it sound like society made them break into your storage by not providing them with a job, but they made the decision to break in by themself. Don't take away a person's autonomy or personal accountability.

I never said society took away a person's personal responsibilities. What I said was that they would probably have been less likely to commit those crimes if they were in a better place in life. I am in no way trying to remove responsibility for those who broke in and stole my stuff. I am in no way trying to say it was "society's fault" either.

 

What I am trying to get across is that having a bunch of unhappy and miserable people will negatively affect the lives of those who do not belong to that group. This is not me advocating for some specific political system, either. This is just me pointing out that it is in everyone's best interest to keep everyone feeling as happy and well as possible, because it only takes a handful of people to cause issues for others. Ensuring that individuals are in a "better place in life" can potentially reduce the risk of them turning to crime or doing other things that would negatively impact you and me.

 

Since you seem to make a lot of assumptions about what I write, I just want to clarify: this is not me making a proposal or saying we should implement some specific policy or whatnot. This is just me pointing out that if we suddenly get a bunch of unemployed people, it will negatively affect those who have jobs too.

 

 

1 hour ago, Stahlmann said:

When in history did it not happen is the better question. Never was there ever a time where some universal system could be applied to everyone. There is always a gap in culture, needs and fundamentally what people want from their life.

I asked for examples. When in history has a new technology threatened to replace potentially the majority of workers, in a way that could probably be adapted to replace future jobs as well?

If we look back at history we have seen manual labor jobs get replaced with jobs that require more cognitive abilities. This time we see the jobs that require cognitive abilities get replaced. 


1 hour ago, Stahlmann said:

Everyone piling up drama about AI taking jobs is just screaming into the void imo. Why not bark at construction companies for replacing 10 people tediously digging a hole with shovels and pickaxes with one man driving a digger? You're not gonna stop AI from getting in the hands of people who learn to work with it, just as people couldn't stop robots from taking over assembly lines or literally any other tool that has been introduced since humanity began using them. If someone is easily replaced by an algorithm, what reason is there to keep that person in that job?

I feel like you have put me into some box and are not fully taking in what I am writing. You're assuming I think a certain way, and as a result, you interpret all my posts as written by someone who disagrees with you on some important aspects, even though I have not once said I disagree with you on those points. I find it very difficult to have a conversation with you because it feels like you are putting words in my mouth that you think I am saying or thinking, when I am saying something completely different.

 

 

 

I am not saying that "taking jobs = bad". The point of my posts isn't to resist change or be against it. It's about preparing for the consequences of the changes. If anything, what I am advocating for is more change. What changes need to be put in place is something I can't answer, but we can't keep doing things the way we are right now, because society isn't built to handle the coming changes.


Well, it looks like any media content creator should start polishing their resume. Hollywood will literally be able to crank out entire seasons of shows in days, not months, now.


33 minutes ago, LAwLz said:

I never said they were entitled to a job, nor did I say we should keep them employed.

Me saying that society is not designed to accommodate a lot of unemployed people does not mean I am saying we should keep people employed with "make-believe" jobs.

Again, if we go back to my original post you will see that I advocate for a future where nobody has a job.

In that case I failed to understand your argument with the break-in.

 

33 minutes ago, LAwLz said:

 

Maybe you're right that things will work themselves out, but the way I see it, the jobs being created are far fewer than the potential jobs lost, and the systems that are threatening to replace a lot of jobs in the future might be able to adapt to the potential future jobs that get created.

That is true. But I also mentioned that there are a lot of job opportunities outside of the places impacted by AI. There is a huge shortage of craftsmen. Taking away a lot of these AI-replaceable jobs will probably motivate people losing their job AND future trainees to prefer these manual labor jobs to fill that gap.

 

33 minutes ago, LAwLz said:

I never said society took away a person's personal responsibilities. What I said was that they would probably have been less likely to commit those crimes if they were in a better place in life. I am in no way trying to remove responsibility for those who broke in and stole my stuff. I am in no way trying to say it was "society's fault" either.

That was just how I interpreted your argument. Maybe I just misunderstood.

 

33 minutes ago, LAwLz said:

What I am trying to get across is that having a bunch of unhappy and miserable people will negatively affect the lives of those who do not belong to that group. This is not me advocating for some specific political system, either. This is just me pointing out that it is in everyone's best interest to keep everyone feeling as happy and well as possible, because it only takes a handful of people to cause issues for others. Ensuring that individuals are in a "better place in life" can potentially reduce the risk of them turning to crime or doing other things that would negatively impact you and me.

 

Since you seem to make a lot of assumptions about what I write, I just want to clarify: this is not me making a proposal or saying we should implement some specific policy or whatnot. This is just me pointing out that if we suddenly get a bunch of unemployed people, it will negatively affect those who have jobs too.

My whole point is that your proposal for a future where nobody has to work is not compatible with how humanity and society work. It's not just that "society has to change"; it's more that society as we know it cannot exist under that concept, which is the argument I'm trying to make here.

 

33 minutes ago, LAwLz said:

I asked for examples. When in history has a new technology threatened to replace potentially the majority of workers, in a way that could probably be adapted to replace future jobs as well?

One of the examples I already mentioned is imo one of the best: assembly robots. They already took many of the bog-standard assembly line jobs, and as they get more and more advanced, they will continue to take over the industry. Yet, in the long run, they didn't leave millions of long-term unemployed people. Most of them moved on and got a different job, be it similar or completely different.

 

33 minutes ago, LAwLz said:

If we look back at history we have seen manual labor jobs get replaced with jobs that require more cognitive abilities. This time we see the jobs that require cognitive abilities get replaced. 

So is it really that surprising that it takes a turn and goes back the other way, since again, manual labor is currently in extreme demand? Is it a bad thing if humans go back to labor jobs and leave the number crunching to a more efficient AI?

 

33 minutes ago, LAwLz said:

I feel like you have put me into some box and are not fully taking in what I am writing. You're assuming I think a certain way, and as a result, you interpret all my posts as written by someone who disagrees with you on some important aspects, even though I have not once said I disagree with you on those points. I find it very difficult to have a conversation with you because it feels like you are putting words in my mouth that you think I am saying or thinking, when I am saying something completely different.

 

The point of my posts isn't to resist change or be against it. It's about preparing for the consequences of the changes. If anything, what I am advocating for is more change. What changes need to be put in place is something I can't answer, but we can't keep doing things the way we are right now, because society isn't built to handle the coming changes.

The last part of my post wasn't directed only at you, which is why I used the term "everyone". That's why I lashed out a bit further. It wasn't my intention to make it seem like you said it.

 

But in my opinion, many people are (maybe unintentionally) overthinking the latest boom in AI technology and making it more intimidating than it really is. I'm of the opinion that this is a process humanity has gone through many times after big technological breakthroughs (other examples would be electricity, combustion engines, etc.). Every time such a breakthrough happens, the economics change to accommodate it, and the people that don't adapt go under. People will lose jobs, people will be hired, and some people might need to relearn something to be an attractive worker again. But every single time this happened, it worked itself out. Not just because society isn't prepared for so many people without jobs, but also because people aren't able to sustain their life and family without a job.

 

TL;DR: I think this discussion is moot because of the countless times people have gone through changes of that magnitude.



For all the sliminess of its CEO, OpenAI is onto something with Sora.

 

It's not just temporal coherence that has improved; the latent space of the model might have learned a simulacrum of cause and effect and temporal correlation, not just semantic correlation.

 

In several clips, there are sequences that seem time-reversed, like the Minecraft pig that is moving forward in space but backward in time, and the opening by hand in the "sci-fi trailer" clip, which hints that the latent space has inferred temporal correlation.
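The idea of "temporal correlation in a latent space" can be sketched with a toy experiment. Everything below is a made-up stand-in (a synthetic moving blob and a random linear "encoder"), not anything from Sora's actual architecture; it only shows the kind of structure a video model could pick up: frames that are adjacent in time land close together in latent space, while randomly paired frames do not.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_frame(t, size=32):
    """A 1-D 'frame': a Gaussian blob whose center drifts smoothly with time t."""
    xs = np.arange(size)
    center = 0.5 * t
    return np.exp(-0.5 * ((xs - center) / 2.0) ** 2)

frames = np.stack([make_frame(t) for t in range(64)])
encoder = rng.standard_normal((frames.shape[1], 8))  # random linear "encoder"
latents = frames @ encoder

# Adjacent frames overlap heavily, so their latents are close;
# randomly paired frames usually are not.
adjacent = np.mean(np.linalg.norm(latents[1:] - latents[:-1], axis=1))
shuffled = np.mean(np.linalg.norm(latents - latents[rng.permutation(64)], axis=1))
print(adjacent < shuffled)
```

A model trained on real video sees this neighborhood structure at enormous scale, which is one plausible route to the temporal regularities the clips hint at.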

 

Videos are even richer in data than images; we have trillions of hours of video to sift through, about everything. Training a large model on that might result in something really interesting!

 

I still maintain that we need at least two big model revolutions before we can make a proper AGI:

  1. Cause and Effect/Reasoning
  2. Long term memory

Sora seems to be chipping away at the first problem in its latent-space representation.

  

3 hours ago, Stahlmann said:

My whole point is that your proposal for a future where nobody has to work is not compatible with how humanity and society works. It's not just "society has to change". It's more like society as we know it cannot exist in that concept, which is the argument I'm trying to make here.

My preferred solution is "Automated Luxury Communism", as my friends call it. AI automates everything, and I mean everything. We get to a post-scarcity, high-trust society where humans are left with hobbies and just enjoy life, or help push AI forward if they want.

 

We get there by creating a moderate UBI financed by a tax on automation: the more automation takes over, the bigger the UBI, until nobody has to work to live and AIs fix all our problems for us.

 

 

The opposite scenario is the "Cyberpunk Dystopia": automation still wins, but the corporations that own the automation have control over humanity and use it to maximize profit and misery.

 

We get there by doing nothing and letting people like Bezos and Sam Altman achieve regulatory capture and own the automation (including military automation).

 

 

I call the third scenario the "Second Dark Age": governments ban automation, maybe after a close call, and an inquisition makes sure no automation can be developed. Climate change and the other problems we have set up for ourselves catch up with us.


4 hours ago, 05032-Mendicant-Bias said:

I still maintain that we need at least two big model revolutions before we can make a proper AGI:

  1. Cause and Effect/Reasoning
  2. Long term memory

Sora seems to be chipping away at the first problem in its latent-space representation.


So, nature figured this out long ago via evolution. Those that could reason survived, and their descendants carry those advantages to this day. Generative Adversarial Network (GAN) modeling is basically that, but for logic.

If intelligence and reasoning are emergent phenomena, you're going to need a virtual jungle and let it act as the filter through which these emergent attributes surface.
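As a heavily simplified picture of the adversarial "filter" idea, here is a minimal GAN-style loop on 1-D data. All choices here are illustrative (a linear generator, a logistic discriminator, toy data from N(4, 1)); real GANs use neural networks, but the two-player update structure is the same: each model improves only by beating the other, which is the evolutionary-pressure analogy the post draws.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

w, b = 1.0, 0.0   # generator g(z) = w*z + b
u, c = 0.1, 0.0   # discriminator d(x) = sigmoid(u*x + c)
lr = 0.01

for _ in range(3000):
    z = rng.standard_normal(64)
    fake = w * z + b
    real = 4.0 + rng.standard_normal(64)  # "real" data: N(4, 1)

    # Discriminator ascent on log d(real) + log(1 - d(fake)).
    dr, df = sigmoid(u * real + c), sigmoid(u * fake + c)
    u += lr * np.mean((1 - dr) * real - df * fake)
    c += lr * np.mean((1 - dr) - df)

    # Generator ascent on log d(fake): shift samples toward what fools d.
    df = sigmoid(u * fake + c)
    w += lr * np.mean((1 - df) * u * z)
    b += lr * np.mean((1 - df) * u)

# The generator's output distribution should have drifted toward the real mean of 4.
gen_mean = float(np.mean(w * rng.standard_normal(5000) + b))
print(round(gen_mean, 2))
```

The "virtual jungle" intuition is visible even at this scale: neither player is told what the real distribution looks like; the generator only feels pressure through the discriminator's judgments, and vice versa.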

The problem with that is that there's no reason to assume this wouldn't cut loose and kill off the dominant apex predator occupying Earth (humanity). Unless, of course, it rationalizes that mutual cooperation via symbiosis is the best path forward for both human and AI. It depends on how forward-thinking such an AI would be. Also worth noting: there's a theory that the human brain actually shrank due to domestication; presumably it's more efficient in terms of resources consumed and the survivability of the species as a whole. Would AI learn that domestication is also an advantage? Or, because it never starves for energy, will it keep expanding in power and strength via brutal means? Remember, it only knows from past experience; it may never get a chance to learn that until it's too late.


19 hours ago, Stahlmann said:

 

Everyone piling up drama about AI taking jobs is just screaming into the void imo. Why not bark at construction companies for replacing 10 people tediously digging a hole with shovels and pickaxes with one man driving a digger? 

 

Not even remotely the same.

 

Here's a better analogy: molds. Why not make a mold of something instead of sculpting it? Because molds wear out. There is more than one way to "make" something, and physical products are about reducing waste. When you have a mold, you have zero waste, but you have a lot of defects. When you sculpt things, you have a lot of waste, but the end result is stronger and more accurate; it's an art form, and no two sculpted items are identical. This is in fact one of the ways you identify counterfeit luxury products: fake products are made with molds, while legitimate products are sculpted (i.e. handmade, and thus unique).

 

Ultimately there is a functional difference between how it's made, but the end user may not actually give a care. This is why people buy counterfeit garbage. They want to be seen wearing luxury products, but lack the knowledge and experience to know they look like idiots for wearing counterfeit garbage.

 

And that's what we see with generative AI. It's the lack of care and lack of experience of the people using generative AI to mass produce garbage that any artist can identify as garbage, but the people cherry picking this stuff and posting it on art sites seem to think they are some kind of genius, when in fact, they look like clowns.

 

Clowns are just going to accept clowning on each other as art. Everyone else is like "I'm not enjoying the clown show", and aren't willing to go to the AI circus on principle then.

 

An even simpler analogy is cookies. Homemade cookies taste better because you've accustomed yourself to eating those. But if all you've ever had are those garbage dried-out cookies you find in grocery stores, then you might not realize that good cookies exist or what they are missing. A mass-produced cookie is "good enough" and highly profitable, but it's not better than the recipe on the back of the Nestlé Toll House chocolate chip bag that you can make yourself.

 

A lot of this generative AI stuff is cherry picked cookies out of piles and piles of cookies made of toenail clippings and sawdust. Sure, it looks like a cookie, but it isn't edible.


The main issue is that AI does an approximation of what we want, which you instruct via a text description. And while it looks great in isolation, you can't make something look specific. It's like instructing someone else to draw something instead of drawing exactly what you want yourself. For example, you can say "a girl with big glasses" and the AI will just do whatever follows that description. But even if you go with "an Asian girl with big red glasses that are round and have a very thin frame", the AI will still just render whatever falls into that description. An artist can do the same, but you can tweak things midway: adjust them, reshape them, redesign them. I'm sure they'll add tweaking to AI too, but it'll still be very much "whatever fits". It'll look good to an external observer who has no idea what the internal concept was, but designers and artists will know, because it won't output exactly what you need, just an approximation of it.


2 hours ago, Kisai said:

Not even remotely the same.

 

Here's a better analogy: molds. Why not make a mold of something instead of sculpting it? Because molds wear out. There is more than one way to "make" something, and physical products are about reducing waste. When you have a mold, you have zero waste, but you have a lot of defects. When you sculpt things, you have a lot of waste, but the end result is stronger and more accurate; it's an art form, and no two sculpted items are identical. This is in fact one of the ways you identify counterfeit luxury products: fake products are made with molds, while legitimate products are sculpted (i.e. handmade, and thus unique).

 

Ultimately there is a functional difference between how it's made, but the end user may not actually give a care. This is why people buy counterfeit garbage. They want to be seen wearing luxury products, but lack the knowledge and experience to know they look like idiots for wearing counterfeit garbage.

 

And that's what we see with generative AI. It's the lack of care and lack of experience of the people using generative AI to mass produce garbage that any artist can identify as garbage, but the people cherry picking this stuff and posting it on art sites seem to think they are some kind of genius, when in fact, they look like clowns.

 

Clowns are just going to accept clowning on each other as art. Everyone else is like "I'm not enjoying the clown show", and aren't willing to go to the AI circus on principle then.

 

An even simpler analogy is cookies. Home made cookies taste better, because you've accustomed yourself to eating those. But if all you've ever had are those garbage dried out cookies you find in grocery stores, then you might not realize that good cookies exist and what they are missing. A mass produced cookie is "Good enough" and highly profitable, but they're not better than the recipe on the back of the nestle tollhouse chocolate chip recipe you can make yourself.

 

A lot of this generative AI stuff is cherry picked cookies out of piles and piles of cookies made of toenail clippings and sawdust. Sure, it looks like a cookie, but it isn't edible.

I don't think it's fair to throw all AI-generated art under that bus. I've seen some genuinely good art coming from AI. Of course that's highly subjective, same as non-AI-generated art. I also don't think everything hand-crafted is automatically better than everything mass-produced. You still have to have the right recipe and technique for home-made cookies to taste better than mass-produced ones. Only a small fraction of "real" human artists produce art that speaks to my taste, so let's say 90% of what they produce is stuff I don't like. So if only a fraction of AI art is something I like, that's not so different from human output, imo.

But in terms of art I think this discussion is moot, because art and its value are so deeply subjective. If anything, AI has made art more accessible. Before, you either had skill, learned skill, or commissioned skill if you wanted a specific picture. Now you have the option to just enter a prompt and see what it spits out. It's just one more tool; it doesn't have to be the only one. I think there is enough space for both AI and human artists to coexist.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


50 minutes ago, RejZoR said:

The main issue is that AI does an approximation of what we want, which you instruct via a text description. And while it looks great in isolation, you can't make something specific-looking. It's like instructing someone to draw something instead of drawing exactly what you want yourself. For example, you can say "a girl with big glasses" and the AI will just do whatever follows that description. But even if you go with "asian girl with big red glasses that are round and have a very thin frame", the AI will still just render whatever falls into that description. An artist can do the same, but you can tweak things midway, adjust them, reshape them and redesign them. I'm sure they'll add tweaks to AI too, but it'll still be very "whatever fits". It'll look good to an external observer who has no idea what the concept was, but designers and artists will know, because it won't output exactly what you need, just an approximation of what you need.

I'd argue it's still better than what the majority of people could come up with using their own skills. People who can't draw can use tools like this, while people who can draw do it exactly how they want. It's not a perfect replacement for being able to draw yourself, but it's close enough for a tool without a significant learning curve.



3 hours ago, Kisai said:

A lot of this generative AI stuff is cherry picked cookies out of piles and piles of cookies made of toenail clippings and sawdust. Sure, it looks like a cookie, but it isn't edible.

1 hour ago, Stahlmann said:

I don't think it's fair to throw all AI-generated art under that bus. I've seen some genuinely good art coming from AI. Of course that's highly subjective, same as non-AI-generated art. I also don't think everything hand-crafted is automatically better than everything mass-produced.

Generative AI is like a new tool, with countless knobs to turn and tune.

 

You can do what crypto people do and spit out 100 variations of low-effort garbage for a quick fraud. Or you can use it as part of a workflow to make handcrafted images.

 

The applications are endless! Imagine a solo game developer who, instead of using a generic texture, can use generative AI to get something unique, like their own grass asset that tiles.
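
As an aside, "tiles" here means the texture's opposite edges line up so copies repeat without visible seams. A toy check for that property (a hypothetical helper, with pixels as a plain 2D list) might look like:

```python
def tiles_seamlessly(texture):
    """True when opposite edges match, so adjacent copies join without seams.

    This is a simplified wrap-around check; real pipelines compare edge
    continuity rather than strict equality.
    """
    top, bottom = texture[0], texture[-1]
    left = [row[0] for row in texture]
    right = [row[-1] for row in texture]
    return top == bottom and left == right

seamless = [[1, 2, 1],
            [3, 4, 3],
            [1, 2, 1]]   # edges wrap cleanly -> True
seamed = [[1, 2, 9],
          [3, 4, 3],
          [1, 2, 1]]     # top and bottom rows differ -> False
```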

 

1 hour ago, Stahlmann said:

I'd argue it's still better than what the majority of people could come up with using their own skills. People who can't draw can use tools like this, while people who can draw do it exactly how they want. It's not a perfect replacement for being able to draw yourself, but it's close enough for a tool without a significant learning curve.

E.g. this is a ship I made for my D&D campaign. It took me about one day to make (I also made the lower decks), and I included damage my PCs incurred in the first encounter. It's a pretty laborious process involving inpainting/image editing, and it took a lot of research to find models that understand how to draw something seen from above. I'm pretty sure a pro artist would scoff at this image; still, as an amateur, this is incomparable to what I could make with legacy tools. I can only make something like this using generative AI: I don't need skill with a brush, only skill in prompting, some elementary image editing, and the creativity to imagine what the end result should look like.

[attached image: top-down map of the ship]

 

So much of our progress as a civilization is being able to make something we could make before, but at a larger scale and cheaper, with less labour. Everyone benefits in the long run, even the people who were toiling to make that something the hard way before.

 

My point is that generative AI is a tool that decouples creativity from the manual ability to use brushes. It's an ideal target for automation to come in and lower the barrier to entry. What Sora might enable is that, instead of taking a stock video from Shutterstock for B-roll, you can prompt something and have your own B-roll. It's a new tool in the toolbox, and it's unambiguously good that we have it and that these tools are getting better.

 

Artists won't be put out of work by generative AI, any more than they were put out of a job by photography automating portraits, or by Photoshop, or by digital tablets, etc. Your boss will never prompt Midjourney for a company logo. They will always hire an artist with the creativity to come up with a unique and relevant company logo, and your boss couldn't care less what tools the artist uses to make that logo.

  

1 hour ago, RejZoR said:

The main issue is that AI does an approximation of what we want, which you instruct via a text description. And while it looks great in isolation, you can't make something specific-looking.

With something like Midjourney, yes, you have to make it all in one go using text, which is really hard.

 

Stable Diffusion has an inpaint tool: you select a small piece of the image and change just that. It's a laborious process, but it gives you very high control over the composition of the image. Used like this, it's like the smartest brush you can imagine: it won't just draw brown pixels, it will draw "face-like", "apple-like", etc.

 

E.g. on the left is about the best I can manually draw as a mask (I'm that bad at drawing), to give a very strong hint about what I want to get; on the right is the img2img output, with another hand-drawn mask hinting at what I want to do next. Working this way gives you very high control over the output.

[attached image: hand-drawn mask (left) and img2img output (right)]
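
For anyone curious what inpainting does conceptually: the model regenerates only the masked region and composites it back over the untouched pixels. Here's a toy sketch of that compositing step, with a stand-in for the model's output rather than a real diffusion model:

```python
def inpaint_composite(image, mask, generated):
    """Keep original pixels where mask is 0, take generated pixels where mask is 1.

    image, mask, and generated are equal-sized 2D lists; a real inpainting
    pipeline performs the same blend after the diffusion model fills in the
    masked region.
    """
    return [
        [gen_px if m else orig_px
         for orig_px, m, gen_px in zip(img_row, mask_row, gen_row)]
        for img_row, mask_row, gen_row in zip(image, mask, generated)
    ]

image = [[10, 10, 10],
         [10, 10, 10]]
mask = [[0, 1, 1],
        [0, 0, 1]]           # 1 = "repaint this pixel"
generated = [[99, 99, 99],
             [99, 99, 99]]   # stand-in for the model's output

result = inpaint_composite(image, mask, generated)
# Only the masked pixels change: [[10, 99, 99], [10, 10, 99]]
```

This is why the rest of the composition stays stable between iterations: everything outside the mask is carried over untouched.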

 


8 hours ago, StDragon said:

The problem with that is there's no reason to assume this wouldn't cut loose and kill off the dominant apex predator occupying Earth (humanity). Unless of course it rationalizes that mutual cooperation via symbiosis is the best path forward for both human and AI.

Veritasium made a video about the prisoner's dilemma, and the conclusion is very hopeful: in our world, cooperation is rewarded, and you see it emerging all the time across species in the real world.

Personally I'm hopeful that a properly implemented AGI would see war and conflict as a waste of resources and a negative-sum game, and would instead be biased toward positive-sum games, diplomacy, and cooperation to tackle common problems.
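
That cooperation dynamic can be sketched as a tiny iterated prisoner's dilemma. This is a minimal illustration with the standard payoff values, not anything taken from the video itself: tit-for-tat players settle into mutual cooperation and collectively out-earn a defector over repeated rounds.

```python
# Payoffs per round: both cooperate -> 3/3, both defect -> 1/1,
# lone defector -> 5, exploited cooperator -> 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_history, their_history):
    # Cooperate first, then mirror the opponent's previous move.
    return their_history[-1] if their_history else "C"

def always_defect(my_history, their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

coop = play(tit_for_tat, tit_for_tat)       # (30, 30): mutual cooperation
exploit = play(tit_for_tat, always_defect)  # (9, 14): exploitation earns less in total
```

Over repeated interactions the cooperating pair earns 60 points combined versus 23 for the exploitative pairing, which is the "cooperation is rewarded" result in miniature.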

 

Our civilization needs all the help it can get to deal with the existential problems that are catching up with us: climate change, fusion energy, antibiotic resistance, water scarcity, rising economic inequality, etc.


54 minutes ago, 05032-Mendicant-Bias said:

Our civilization needs all the help it can get to deal with the existential problems that are catching up with us: climate change, fusion energy, antibiotic resistance, water scarcity, rising economic inequality, etc.

We are well, well past the climate-change tipping point. We were warned in the '70s, the '80s, and the '90s, and then we hit the tipping point in the 2000s. Now we're in the "coping and hoping for a miracle for the terminal patient" stage. I distinctly remember this stuff around the time of the CFC ban.

 

Humanity will survive, but we're likely going to be mostly wiped out by disease or war first. I'm not seeing an endgame of either Waterworld or Mad Max, but more like one where we're not allowed to go outside because we'd be cooked in minutes, towers stop having glass windows and instead have permanently closed shutters, and more towers get built into the ground to reduce solar exposure.

 

But that's just the logical path I see us heading toward. There is no judgement day, there are no aliens, there's no savior. We're just here for the ride.

 

The thing is that a lot of AI use just wastes energy and is a net contributor to the destruction of the planet, just like the cryptocoin and NFT trash before it. I'm sure there will be good uses of AI, but we're not going to see AGI in the next century, not without some breakthrough in sub-atomic computer lithography that removes waste heat from the equation. It's bad enough that a high-end PC puts out as much heat as a space heater; now multiply that across every PC in people's homes and in data centers.

 

So given how much energy is required just to build a stand-alone chatbot that understands visuals and audio, and can talk, animate, and generate video back to you, you're looking at at least 2 million dollars in 1.5 HGX systems with 5 independent AIs (12 × 700 W H100s) running between them (8400 W, as much as a small electric furnace). Really, the problem is that an AI that consumes that much power is only justified if it's constantly in use (hence the cloud applications). That's bad for the environment if these cloud systems are just used to generate endless garbage for the internet.
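
As a sanity check on those numbers (the GPU count and 700 W TDP come from the post above; the electricity price is my own assumption), the back-of-the-envelope math works out like this:

```python
# Back-of-the-envelope power draw for the setup described above.
gpus = 12
tdp_watts = 700                       # per-GPU TDP from the post
total_watts = gpus * tdp_watts        # 8400 W, space-heater territory
kwh_per_day = total_watts / 1000 * 24          # 201.6 kWh per day at full load
usd_per_kwh = 0.15                    # assumed electricity price
cost_per_year = kwh_per_day * 365 * usd_per_kwh  # roughly $11,000/year in power alone
```

That recurring power bill, on top of the hardware cost, is why this class of system only pencils out when it's kept busy around the clock.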

 


38 minutes ago, Kisai said:

That's bad for the environment if these cloud systems are just used to generate endless garbage for the internet.

Well I mean... they already largely are, probably the majority.

 

38 minutes ago, Kisai said:

you're looking at at least 2 million dollars in 1.5 HGX systems

They're actually not that expensive. Two of them are around $600k. We have an older 8x A100-based system, which really wasn't that expensive at the time, and the newer H100 systems, while more expensive, cost far less than you're thinking.

 

Still expensive servers though.


12 minutes ago, leadeater said:

Well I mean... they already largely are, probably the majority.

Yeah, and? There's money to be made, and there are no regulations against "wasting energy on speculative AI/cryptocurrency/research".

12 minutes ago, leadeater said:

They're actually not that expensive. Two of them are around $600k. We have an older 8x A100-based system, which really wasn't that expensive at the time, and the newer H100 systems, while more expensive, cost far less than you're thinking.

 

Still expensive servers though.

Still "that's as much as a house" money. If I had that much money and could build a waifu-chatbot-vtuber with it, I don't think I'd ever make that money back within a decade, if ever. It's not disposable money.

 

Now, a corporation though: branded mascots with some semblance of AGI that, despite putting out as much thermal energy as a furnace, can replace the entire customer-facing experience globally and speak everyone's language. That could replace piles of customer support, marketing and PR (and even C-suite) jobs in industries that are currently run by incompetent numbskulls.


4 minutes ago, Kisai said:

Yeah, and? There's money to be made, and there's no regulations against "wasting energy on speculative AI/Cryptocurrency/Research"

There is no "and". I'm just pointing out that it's not really a question of "if". This isn't even about AI; just look at how much it takes to run YouTube, Twitch, Facebook, TikTok, etc. That's a lot of energy to move around and distribute a lot of garbage. Gems may exist within that garbage, but there are still tons of "not gems".


3 minutes ago, leadeater said:

There is no "and". I'm just pointing out that it's not really a question of "if". This isn't even about AI; just look at how much it takes to run YouTube, Twitch, Facebook, TikTok, etc. That's a lot of energy to move around and distribute a lot of garbage. Gems may exist within that garbage, but there are still tons of "not gems".

YouTube doesn't have to store garbage, and neither does anyone else; they just choose to. Though they may be forced into some kind of base-level quality control: since they've already decided to stop caching the web, they might start stomping on videos that get little or no viewership within a year.


17 minutes ago, Kisai said:

YouTube doesn't have to store garbage, and neither does anyone else; they just choose to. Though they may be forced into some kind of base-level quality control: since they've already decided to stop caching the web, they might start stomping on videos that get little or no viewership within a year.

Nobody has to do anything; that's kind of beside the point really. My pessimistic view is that the majority of deployed computational power in the world is consumed by non-productive tasks that have or deliver low social value.

 

Lots of CPU and GPU compute is used to sift through huge data lakes of "information" by insurance and health companies, finance and banks, and large retailers like Amazon and Walmart, and I don't see much actual benefit from it, even from their own company perspective. It just seems like a racket for people interested in the technology to get money and resources to do "stuff", and then justify the spend by proving what they did was worthwhile using the very data they analyzed and presented themselves. Let's just say there is a high incentive to make it look more useful than it is.

 

It actually doesn't take a lot of resources to create a company-specific chatbot for customer service. We have one on our website to assist students with enrolling etc., and it's not an LLM at all; that isn't necessary. Even if it were, creating one today wouldn't take much, and actually running it doesn't require a whole lot either.

 

What uses huge amounts of computation is outfits like OpenAI constantly "training" their technology on giant amounts of data, over and over, all the time. Generating video like in this story is just an order of magnitude higher in requirements.

 

OpenAI is still a minority of datacenter GPU and CPU usage though, along with the other LLMs people are exploring. "He who shouts the loudest gets heard" <-- AI/LLM/OpenAI.


2 hours ago, leadeater said:

... just look at how much it takes to run YouTube, Twitch, Facebook, TikTok, etc. That's a lot of energy to move around and distribute a lot of garbage. Gems may exist within that garbage, but there are still tons of "not gems".

Just looking for quality content is like...
 

[attached meme image]

 

You have to crawl through a lot of 💩 to get to it.


On 2/16/2024 at 12:25 AM, Lightwreather said:

It appears Larry the Cucumber was right

Even non-AI-generated internet humor sounds like it's completely random.



6 hours ago, leadeater said:

Nobody has to do anything; that's kind of beside the point really. My pessimistic view is that the majority of deployed computational power in the world is consumed by non-productive tasks that have or deliver low social value.

Then why stop "caching the web"? Because it doesn't make them money. Same reason they killed Google Reader: there was no way to monetize it, so they just lied and said "nobody uses this", when the reality was that it was still used by people who had been using it for years.

 

 

6 hours ago, leadeater said:

Lots of CPU and GPU compute is used to sift through huge data lakes of "information" by insurance and health companies, finance and banks, and large retailers like Amazon and Walmart, and I don't see much actual benefit from it, even from their own company perspective. It just seems like a racket for people interested in the technology to get money and resources to do "stuff", and then justify the spend by proving what they did was worthwhile using the very data they analyzed and presented themselves. Let's just say there is a high incentive to make it look more useful than it is.

 

Mmm, yeah. I'd say sitting on data is useful to the company that acquired it, but only if they're doing something with it, not just perpetually sitting on it. One of my website clients is literally sitting on 10 years of usage data from their site, and maybe it gets used once a year to decide what not to renew. Aside from that, it costs more to keep than they get from using it.

 

6 hours ago, leadeater said:

It actually doesn't take a lot of resources to create a company-specific chatbot for customer service. We have one on our website to assist students with enrolling etc., and it's not an LLM at all; that isn't necessary. Even if it were, creating one today wouldn't take much, and actually running it doesn't require a whole lot either.

True, it doesn't need to be an LLM, but many of these chatbots are simply "here's our webpage on (thing)". Good grief, if you've ever tried to get "help" from Google you've probably experienced this kind of "it seems you are asking about (thing), here's a completely irrelevant page that contains only one of the words you said", and Google is getting worse at it. They send completely unhelpful emails about services I do and do not pay for, and can't even determine which client the email is for.

[attached screenshot of the email]

There's no indication what account this is for, and clicking the links just brings you to generic stuff that's even less helpful. Quit pushing the onus onto the customer to understand these changes; a lot of people just want "set it and forget it" and don't want to have to bloody micro-manage every damn service.

 

Banks are pretty damn stupid about this too. My bank sends me new "terms and conditions" by post every time they change something, but never says what changed from the last time (hint: interest rates). So it's like, why waste money printing and mailing this at all? Because legally they're covering their ass by doing so, yet nearly everyone just throws the thing in the trash.

 

6 hours ago, leadeater said:

What uses huge amounts of computation is outfits like OpenAI constantly "training" their technology on giant amounts of data, over and over, all the time. Generating video like in this story is just an order of magnitude higher in requirements.

 

OpenAI is still a minority of datacenter GPU and CPU usage though, along with the other LLMs people are exploring. "He who shouts the loudest gets heard" <-- AI/LLM/OpenAI.

Meh, something like that. I feel the reality is that every company feels compelled to jump on the generative-AI bandwagon, even if that means eroding customer trust. It's like two years ago with the NFT garbage: you have companies like Ubisoft and Square Enix still investing in that garbage after it was completely rejected as anti-consumer. Heck, any time I see someone with ETH or BTC in their profile I just assume they're a FOMO grifter now.

 

