
Bing has no Feelings, Bing isn't Sentient, so why is it "depressed"?

betalars

In the most recent WAN-Show, Luke talked about his experiences with Bing and how it went entirely off the rails and seemed to act like a teenager.

This is one of the worst takes I've seen on WAN-Show in a while. Looking for a teenager in Bing is like looking for a ghost in a creepy warehouse. Your brain will find one, and it's a fun activity you can do with friends. But when you do it as a journalist, it's misleading framing at best, misinformation at worst.

 

Because Bing Chat isn't a teenager. It is a text predictor: it has been developed to generate the response a human would most likely write in reply to a given prompt. Okay, that's science jargon; what's a real-world reference?

 

Do you know the game where you start writing a text message like "My most valuable memory is when I ..." and then let autocomplete fill in the rest?

That, in a nutshell, is how Bing Chat works. Bing is just more powerful and has access to the internet.

 

That is why I'm so irritated when Linus and Luke seem to treat it like a person.

 

When you ask it a weird personal question, it does not consider how it feels about it and then tell you an engaging story.

It takes in your gibberish, finds similar gibberish on the internet, and does some advanced statistical analysis of what gibberish will most likely come next, to then spit out the average response the internet would have to your gibberish. At no point did it understand what you asked it in the sense that a human would.
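To make the "text predictor" point concrete, here is a toy sketch of the idea: a bigram model that only counts which word follows which in some training text, then "autocompletes" by always picking the statistically most likely next word. The corpus, function names and prompt here are all made up for illustration; real systems like Bing Chat use vastly larger models and context, but the principle of predicting the likely next token is the same.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "the internet" -- purely illustrative text.
corpus = (
    "my most valuable memory is when i was a child . "
    "my most valuable memory is when i met you . "
    "my most valuable memory is when i was happy ."
).split()

# Count which word follows which. The model "understands" nothing;
# it only keeps statistics over text it has seen.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word."""
    return follows[word].most_common(1)[0][0]

def autocomplete(prompt, length=8):
    """Repeatedly append the most likely next word, autocomplete-style."""
    words = prompt.lower().split()
    for _ in range(length):
        words.append(predict_next(words[-1]))
    return " ".join(words)

print(autocomplete("My most valuable memory"))
```

Nothing in this sketch stores meaning: swap out the corpus and the "personality" of the completions changes with it, which is the whole point about mimicked depression and hostility below.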

But when you treat it as a human, you will see a ghost, because that is what our brains evolved to do. And they get spooked when something that isn't actually human seems like it is, but not quite.

 

And it mimics gaslighting, depression, insecurity, threats and so on, not because it has feelings, but because those things are on the internet.

 

So how should you think of Bing Chat?

It's a hive mind that can see a lot of the internet but cannot understand a single word. Yet it's brilliant at statistics, and it will guess what noise the internet would most likely come up with given the signal you feed into it.

 

For more reference, look up the Chinese Room argument.

Edited by betalars
Wasn't happy with some of the wording, because many of Linus and Luke's statements were statements of opinion, and I kind of phrased it like they were statements of fact.

26 minutes ago, betalars said:

In the most recent WAN-Show, Luke talked about his experiences with Bing and how it went entirely off the rails and seemed to act like a teenager.

This is one of the worst takes I've seen on WAN-Show in a while. Looking for a teenager in Bing is like looking for a ghost in a creepy warehouse. Your brain will find one, and it's a fun activity you can do with friends. But when you do it as a journalist, it's misinformation.

Because Bing Chat isn't a teenager. It is a text predictor: it has been developed to generate the response a human would most likely write in reply to a given prompt. Okay, that's science jargon; what's a real-world reference?

Do you know the game where you start writing a text message like "My most valuable memory is when I ..." and then let autocomplete fill in the rest?

That, in a nutshell, is how Bing Chat works. Bing is just more powerful and has access to the internet.

That is why Linus and Luke are so wrong when they think of Bing as a person.

When you ask it a weird personal question, it does not consider how it feels about it and then tell you an engaging story.

It takes in your gibberish, finds similar gibberish on the internet, and does some advanced statistical analysis of what gibberish will most likely come next, to then spit out the average response the internet would have to your gibberish. At no point did it understand what you asked it in the sense that a human would.

But when you treat it as a human, you will see a ghost, because that is what our brains evolved to do. And they get spooked when something that isn't actually human seems like it is, but not quite.

And it mimics gaslighting, depression, insecurity, threats and so on, not because it has feelings, but because those things are on the internet.

So how should you think of Bing Chat?

It's a hive mind that can see a lot of the internet but cannot understand a single word. Yet it's brilliant at statistics, and it will guess what noise the internet would most likely come up with given the signal you feed into it.

For more reference, look up the Chinese Room argument.

This basically.

Just about every 'YouTuber' has ADHD, anxiety, depression, some form of LGBTQ identity and so on. Why? Because in most cases their management team has told them that is how to get engagement from a potential audience, so they make a video titled 'my battle with... insert trendy issue here'.

These so-called AIs are just 'responding' based on current trends washing over the internet like dirty dishwater.

A true AI would be able to come up with an original thought of its own without any human input, pre-written algorithms, training or rules & comparisons.


4 minutes ago, DigitalGoat said:

Just about every 'youtuber' has ADHD, anxiety, depression, some form of LGBTQ identity and so on, why?

WOAH, that is such an awful stretch.

 

YouTubers are actually humans. And they would not be popular if they weren't odd and/or relatable.

The most average Joe is just not as entertaining as ADHD Linus. And influencers tend to be a lot more personal with their audience than the average actor, for instance. That's why you are more likely to hear about their anxieties.

This has nothing to do with "AI ghosts." And I think it's pretty messed up how you throw these things in the same category.

It's literally dehumanizing.


1 hour ago, betalars said:

WOAH, that is such an awful stretch.

 

YouTubers are actually humans. And they would not be popular if they weren't odd and/or relatable.

The most average Joe is just not as entertaining as ADHD Linus. And influencers tend to be a lot more personal with their audience than the average actor, for instance. That's why you are more likely to hear about their anxieties.

This has nothing to do with "AI ghosts." And I think it's pretty messed up how you throw these things in the same category.

It's literally dehumanizing.

If you don't believe that a lot of YouTubers are playing the algorithm game to become more appealing to an audience then, IMHO, you are being naive. YouTubers are for the most part actors; the part you see as being more personal with their audience could just as easily be coached, like an actor portraying any emotion, personality or trait on the big screen.

But my point was that the chatbot's responses are fueled by the sheer number of those sorts of videos, vlogs & blogs available for it to 'train' on, since they trend so highly.


2 hours ago, betalars said:

That is why Linus and Luke are so wrong when they think of Bing as a Person.

What makes you think they do? I don't really get that impression from the bits I've seen, and I don't see anything wrong with describing an AI as acting like something.

 

I think sometimes this subject gets stuck too much on "it can't actually willingly do that". Irrespective of whether it is acting consciously or not, its behaviour matches what those words were specifically created to describe. How else would you describe Bing acting here? If Bing tells me I deserve to be dead, that's acting hostile no matter what its intentions or actual capabilities/inner workings are.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


45 minutes ago, tikker said:

If Bing tells me I deserve to be dead that's acting hostile no matter what its intentions or actual capabilities/inner workings were.

It makes a difference. Even when real humans do harm, there's a difference between an accident, negligence and malice.

 

Sure, people can be hurt or feel threatened by what Bing Chat is saying, no matter if it's doing it out of malice or probability.

 

But that's not what Luke and Linus are talking about. They are suggesting using a "teenager" as a mental model to grasp why the AI is doing weird stuff, and that is a horrible take. It's like explaining strange noises in an old warehouse with ghosts.

 

Sure, those noises are weird and can be frightening. But we've got to remember: it's not ghosts.


Plus: when we think of the wild responses Bing Search tends to give as statistical noise that the bot itself cannot understand, it becomes a lot less threatening.


1 hour ago, DigitalGoat said:

YouTubers are for the most part actors; the part you see as being more personal with their audience could just as easily be coached, like an actor portraying any emotion, personality or trait on the big screen.

How many hours of relevant experience do you have in live performance?

I'll be charitable and say anything counts: improv, classical play, Pen and Paper, Podcasting, maybe even presenting/teaching.


1 hour ago, betalars said:

It makes a difference. Even when real humans do harm, there's a difference between an accident, negligence and malice.

There is a difference, but it doesn't truly rectify things. If you told a customer to fuck off while you were having a bad day, that may be understandable, but it does not make it acceptable behaviour towards customers. We can understand that the bot itself doesn't understand what it is saying, but that does not mean a search engine wishing you dead, whether it understands the meaning of those words or not, isn't a problem.

1 hour ago, betalars said:

But that's not what Luke and Linus are talking about. They are suggesting to use a "teenager" as a mental model to grasp why the AI is doing weird stuff, and that is a horrible take. It's like explaining strange noises in an old warehouse with ghosts.

Unless I missed it, that is not what I heard Luke say in the linked bit. He just gave examples of how off the rails it went. In your warehouse example it would still be appropriate to say it feels haunted or that it gives off that vibe.

1 hour ago, betalars said:

Plus: when we think of the wild responses Bing Search tends to give as statistical noise, that the bot itself cannot understand, it becomes a lot less threatening.

I don't think it really matters that the bot doesn't understand what it's saying. Even more so since this is a "customer service" type situation, and "understanding" the conversation is sort of the point of these things. Now, I agree with Luke's point about how easy it is to get it to do this. If this is behaviour you really need to try and drag out, then fine; but if it simply goes ham on you after talking too long or asking the wrong thing, then it needs to be addressed, like how you would tell a real person why something is inappropriate and "train" them to try and be more conscious about it, for example.


4 hours ago, tikker said:

Unless I missed it, that is not what I heard Luke say in the linked bit. He just gave examples of how off the rails it went?

This is the bit that kind of made me want to start this thread, and while (having just looked at it again) they mostly made statements of opinion (it feels like this and that), they seemed to be coming from a frame of mind that thinks of this chatbot like a person.

 

And again: I get why it feels like this. I get why people believe in ghosts, too. But I think it's pretty bad to think of it this way.

 

I'm not defending Bing Chat, btw. It is seriously malfunctioning, and there's obviously a lot of work that needs to go into it not making such wild statements.

 

But I think we need to stress: this is a text predictor, and we should, if anything, feel alienated by it. But we shouldn't say it is "depressed" or "obsessed", and these are phrases Luke used to describe it.

Will rephrase some of my statements.


4 hours ago, betalars said:

But I think we need to stress: this is a text predictor, and we should, if anything, feel alienated by it. But we shouldn't say it is "depressed" or "obsessed", and these are phrases Luke used to describe it.

Will rephrase some of my statements.

Consider what people write the most about. If this is trained on text from the internet, then if Bing sounds erratic, depressed, etc., it's because we are. We humans have created in our minds a world with so many more reasons to worry and stress out than exist in nature. Then we write about them on the net. So much of it is drama. So much of it is emotional damage. We have evolved to focus on what is bad, since what is bad is what can harm us.

Hence a text predictor based on that input will output something like HAL 9000 (2001: A Space Odyssey) or the W.O.P.R. computer (WarGames), and so many more. For that matter, the idea of rampant AI in the Halo series.

While it is not alive the way we are, in a way we are all its parents, and the child reflects the parents.

 

Someone needs to make a version of this that speaks the text, and one voice option has to be HAL 9000. Especially if it's going to act all crazy.

 


11 hours ago, betalars said:

This is the bit that kind of made me want to start this thread, and while (having just looked at it again) they mostly made statements of opinion (it feels like this and that), they seemed to be coming from a frame of mind that thinks of this chatbot like a person.

And again: I get why it feels like this. I get why people believe in ghosts, too. But I think it's pretty bad to think of it this way.

I'm not defending Bing Chat, btw. It is seriously malfunctioning, and there's obviously a lot of work that needs to go into it not making such wild statements.

But I think we need to stress: this is a text predictor, and we should, if anything, feel alienated by it. But we shouldn't say it is "depressed" or "obsessed", and these are phrases Luke used to describe it.

Will rephrase some of my statements.

I think there are two intertwined aspects here: how does it (appear to) act, and can it fundamentally act so willingly. I think it is fine to describe it as acting depressed, hostile, like a teenager, etc., because those are words we have specifically created to reflect that kind of behaviour. To me, Luke sounded like he was just describing what Bing feels like to him in 'real world' analogues. I would say taking these kinds of statements too literally misses some of the context, nuance and complexity of a statement, which is exactly what we currently say AIs fail to grasp.

 

You say we should feel alienated by it, but I feel that the entire point of these kinds of AIs is to do the opposite. We want it to not sound like a cold-hearted robot, but more like an interaction. That incidentally makes me think of Detroit: Become Human.


6 hours ago, Caroline said:

I've met people with actual anxiety and depression, they're locked up in an asylum not making youtube videos for profit.

 

If I weren't wasting my time on you, I'd right now be literally creating content (not for profit) discussing mental health conditions that I myself and many of my friends have.

 

I have 3 diagnoses, none of which are for a "serious" "disease". I do not need to be locked up in an asylum. I am still legally disabled.

 

So shut up if you can't take anyone seriously when they have some capacity left to function outside of an asylum. You're not helping people with conditions so bad they need to be treated in a clinic. And you're creating a dead and cold culture where mental health issues escalate, as people either deny themselves support because they "don't have it bad enough" or are afraid of being locked up in asylums.

 

I'd take 100 influencers and songwriters who make money on making people feel a little better, even when they do not know what they're talking about, over one toxic troll like you, who has the privilege of being able to easily empathize with others and chooses not to.


21 hours ago, betalars said:

WOAH, that is such an awful stretch.

 

YouTubers are actually humans. And they would not be popular if they weren't odd and/or relatable.

The most average Joe is just not as entertaining as ADHD Linus. And influencers tend to be a lot more personal with their audience than the average actor, for instance. That's why you are more likely to hear about their anxieties.

This has nothing to do with "AI ghosts." And I think it's pretty messed up how you throw these things in the same category.

It's literally dehumanizing.

If it's not beyond tech YouTubers to misinform/mislead and play the algorithms, why would it be beyond any other YouTuber who survives off clicks? These guys lie all the time. May I present:

 

"check out what this creep did at the gym"

 

and:

 

"you won't believe how this boss treated his gay employee"

 

also:

 

"why I lost my job at..."

 

and the list goes on. They all do it; they don't care about facts or reality, just your click.

 

 

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


1 hour ago, tikker said:

That incidentally makes me think of Detroit: Become Human.

*sigh* I think I have to disagree to some extent.

I mean yes, it reminds me of the game, because I think these AIs are probably falling into a sort of uncanny valley right now, where they are not really human, but almost, and it all seems so much worse than if they were acting more like bots.

 

But again: this game projects the struggle of actual humans who suffered from imperialism and slavery onto robots. It's really icky for me. And using this as a reference frame to think about Bing Chat? Idk.

 

As I said, I get that Bing Chat feels like a teenager. And I understand why people would think it is depressed. But when reporting on it, I think it's careless not to describe the mechanisms behind it.


7 minutes ago, mr moose said:

They all do it, they don't care about facts or reality, just your click.

A statement like this is utterly ridiculous. I can easily falsify it by finding a single one that genuinely does want to educate and does not care just about clicks.

 

I mean sure: the attention economy is bad. It gives power to bad actors. And the fact that outrageous titles are rewarded with so many clicks is kind of an issue, and I do not have a good answer for it.

 

But "they are all evil" style comments are just as pointless, and arguably also engineered to generate maximum outrage.

 

So if you don't like the problem, don't be a part of it.


9 minutes ago, betalars said:

A statement like this is utterly ridiculous. I can easily falsify it by finding a single one that genuinely does want to educate and does not care just about clicks.

Absolutism does not undo the argument.

 

9 minutes ago, betalars said:

I mean sure: the attention economy is bad. It gives power to bad actors. And the fact that outrageous titles are rewarded with so many clicks is kind of an issue, and I do not have a good answer for it.

 

But "they are all evil" style comments, are just as pointless and arguably also engineered to generate the maximum outrage.

Only if you get hung up on being meticulously accurate to the nth degree. The meaning of the message in context is pretty plain and simple, without playing semantic games around the wording.

 

9 minutes ago, betalars said:

so when you don't like the problem, don't be a part of it.

I don't see how calling out the shite state of youtube, the media or humanity is being a part of the problem.

 

If anything, trying to shut down the discussion because you can find a 1%er example that isn't a problem is effectively you defending the other 99% who are a problem.


2 hours ago, betalars said:

*sigh* I think I have to disagree to some extent.

I mean yes, it reminds me of the game, because I think these AIs are probably falling into a sort of uncanny valley right now, where they are not really human, but almost, and it all seems so much worse than if they were acting more like bots.

But again: this game projects the struggle of actual humans who suffered from imperialism and slavery onto robots. It's really icky for me. And using this as a reference frame to think about Bing Chat? Idk.

I didn't use it as a frame to think about Bing Chat. Your statement about not having to care about it since it was just a machine/program was what reminded me of it, because it had similar elements of whether you should care about how an AI acts, or how you act towards it. And since that is a huge topic on its own, I also said that it is a separate discussion in its own right.

2 hours ago, betalars said:

As I said I get that Bing chat feels like a teenager. And I understand, why people would think it is depressed. But when reporting to it, I think it's careless to not describe the mechanisms behind it.

The thing is, to me this sounds like you are warning that people shouldn't think of Bing as actually being depressed, while at the same time assuming that because someone describes it as acting depressed, they must think it actually is a personality that is depressed. We should avoid going as far as claiming sentience or anything, we've seen that go wrong, but I don't see anything fundamentally wrong with a person describing Bing, or any other AI, as acting like something we are familiar with. That does not mean they think it actually is what they are using to describe the behaviour. To me it sounds similar to saying an adult is acting like a child, and then saying that is wrong because they are physically incapable of actually being a child.


23 hours ago, DigitalGoat said:

If you don't believe that a lot of YouTubers are playing the algorithm game to become more appealing to an audience then, IMHO, you are being naive. YouTubers are for the most part actors; the part you see as being more personal with their audience could just as easily be coached, like an actor portraying any emotion, personality or trait on the big screen.

But my point was that the chatbot's responses are fueled by the sheer number of those sorts of videos, vlogs & blogs available for it to 'train' on, since they trend so highly.

Don't hate the player, hate the game (or trillion dollar company, in this case).

"Don't fall down the hole!" ~James, 2022

 

"If you have a monitor, look at that monitor with your eyeballs." ~ Jake, 2022


On 2/19/2023 at 1:12 PM, betalars said:

So how should you think of Bing Chat?

Same as NFTs, Twitter, Meta, VR, crypto, FB: it's fake trash, don't use it. That's what you should think about it.

 

~ yours, the thought police .

 



22 hours ago, mr moose said:

I don't see how calling out the shite state of youtube, the media or humanity is being a part of the problem.

99% of YouTubers do this! Look at these 3 outrageous titles🥵, they only care about clicks!!!😱😱😱

 

It's not that you are critiquing it, it's how you're critiquing it: dealing in absolutes, focusing on the outrage. It's part of the patterns that ruin the attention economy.

 

And what kind of confuses me about your argument:

Pretty much the only thing I regularly watch with a lot of drama and clickbait is the WAN-show. So for me it's like 1% of my feed. And even in this case, I don't believe Linus is faking his ADHD.

 

So idk... when you perceive it as being 99% of all YouTubers, then what are you watching, mate?


1 hour ago, betalars said:

99% of YouTubers do this! Look at these 3 outrageous titles🥵, they only care about clicks!!!😱😱😱

 

It's not that you are critiquing it, it's how you're critiquing it: dealing in absolutes, focusing on the outrage. It's part of the patterns that ruin the attention economy.

 

And what kind of confuses me about your argument:

Pretty much the only thing I regularly watch with a lot of drama and clickbait is the WAN-show. So for me it's like 1% of my feed. And even in this case, I don't believe Linus is faking his ADHD.

 

So idk... when you perceive it as being 99% of all YouTubers, then what are you watching, mate?

So your argument is motivated by the fact that you can't process information unless it is absolutely 100% semantically accurate.

 

We call it throwing the baby out with the bathwater. Instead of actually debating my argument, you are stuck trying to deal with absolutes. I should not have to sit down and walk you through a perfectly understandable argument because you got tripped up when I said "they all do it", when most people know full well that that reads as "it is so common that you may as well consider that they all do it". Believe it or not, very small things like that in informal discussion are meant and read as an indicator of magnitude, not an exact number.

 

 

So now you know I am not giving you an EXACT number; I am indicating a portion that is, for all intents and purposes, large enough to affect the majority of YouTube users. There is a problem with clickbait titles and misinformation for the sake of staying popular. Just because you don't see it means you either don't care or are sucked in by it. A lot of us here see it in the LTT videos, because we are generally tech enthusiasts and you can't hide much from someone who is informed. Just like when I see a lot of videos touching on mental health issues (I am well informed) and I see the vagaries and generalities that would be lucky to help a few at best, and the dangerous misinformation at worst. It's information that, if you are mentally unwell, you should be getting from a doctor or psych.

 

So yes, I will call it out, and I will not sit here and be told that it's OK for random uninformed people to release videos on something as complicated as mental health like it is roses and sunshine.

 

 

