
AI developed by Facebook will scan your posts for suicidal thoughts

Source: Facebook Newsroom via CNET

 

Quote

[Screenshot: support for a concerned friend]

If you're going through a difficult time, Facebook wants to help.

 

Its artificial intelligence software can now use pattern recognition to scan Facebook posts and live videos for suicidal thoughts.

Once the technology has identified a problem post through what the company calls proactive detection, it alerts a team of human moderators who specialize in dealing with suicide and self harm. These specialists can send mental health resources to the user at risk or to that person's friends and alert the authorities if necessary. The program could flag problem posts before users report them, accelerating the alert process. Help could be sent to people in real time.

Facebook has been testing the technology for nine months in the US. It will extend the AI's reach to the rest of the world, except for the European Union, because of General Data Protection Regulation privacy laws, which restrict all forms of profiling. Exact timing for when the AI will go worldwide is unclear, but we've reached out to Facebook to ask.
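Purely as an illustration of the flow CNET describes above (pattern recognition scores a post, flagged posts land in a queue for trained human specialists, and the most concerning ones get escalated), here is a minimal sketch in Python. Everything in it is an assumption for illustration: the keyword heuristic standing in for a trained model, the threshold, and all the names are made up, not Facebook's actual system.

# Hypothetical sketch of the "proactive detection" flow described above.
# This is not Facebook's real code; the phrase list, scores, thresholds and
# names are invented stand-ins for a trained model and its infrastructure.

from dataclasses import dataclass, field
import heapq

RISK_PHRASES = ("want to die", "kill myself", "end it all")  # toy stand-in for a classifier

def score_post(text: str) -> float:
    """Return a crude 0..1 risk score; a real system would use a trained model."""
    text = text.lower()
    hits = sum(phrase in text for phrase in RISK_PHRASES)
    return min(1.0, hits / len(RISK_PHRASES))

@dataclass(order=True)
class ReviewItem:
    sort_key: float                      # negated score, so the riskiest post pops first
    post_id: str = field(compare=False)
    text: str = field(compare=False)

review_queue: list[ReviewItem] = []      # consumed by trained human specialists

def handle_post(post_id: str, text: str, flag_threshold: float = 0.3) -> None:
    """Flag a post for human review if the model score crosses the threshold."""
    score = score_post(text)
    if score >= flag_threshold:
        heapq.heappush(review_queue, ReviewItem(-score, post_id, text))

# A specialist would then review queued items and decide whether to send mental
# health resources to the user or their friends, or alert first responders.
handle_post("p1", "i just want to end it all")
handle_post("p2", "great game last night!")
if review_queue:
    top = heapq.heappop(review_queue)
    print("needs human review first:", top.post_id)

In reality the "pattern recognition" would be a model trained on reported posts, and the queue would feed Facebook's Community Operations specialists rather than a Python list; the sketch only shows the shape of the flag-then-review pipeline the article describes.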

I think many people, myself included, tend to rail against Facebook given their history of privacy-invasive practices, and just a few weeks ago I found their plan for combating "revenge porn" problematic.

 

But as someone who has experienced depression, I'm actually in favor of this, so thank you, Facebook :x. I remember last year someone on this forum made a thread about how they were going to commit suicide, and thankfully many people jumped in and told him no. I don't know what happened to that kid, and I think the thread has since been deleted by the mods. I know exactly how it feels to be isolated, to feel like everyone hates you, to have a mind full of self-loathing and helplessness. Many people, especially young kids and teenagers, go to social media to express those feelings. Adults tend to be more secretive, and if they do post something that suggests suicide, it is usually very subtle.

 

There's an off-topic thread about people's experiences with depression, and I actually shared mine there.

Quote

Today, we are sharing additional work we’re doing to help people who are expressing thoughts of suicide, including:

  • Using pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide, and to help respond to reports faster
  • Improving how we identify appropriate first responders
  • Dedicating more reviewers from our Community Operations team to review reports of suicide or self harm

Over the last month, we’ve worked with first responders on over 100 wellness checks based on reports we received via our proactive detection efforts. This is in addition to reports we received from people in the Facebook community. We also use pattern recognition to help accelerate the most concerning reports. We’ve found these accelerated reports—that we have signaled require immediate attention—are escalated to local authorities twice as quickly as other reports. We are committed to continuing to invest in pattern recognition technology to better serve our community.

Expanding our use of proactive detection

  • We are starting to roll out artificial intelligence outside the US to help identify when someone might be expressing thoughts of suicide, including on Facebook Live. This will eventually be available worldwide, except the EU.
  • This approach uses pattern recognition technology to help identify posts and live streams as likely to be expressing thoughts of suicide. We continue to work on this technology to increase accuracy and avoid false positives before our team reviews.
  • We use signals like the text used in the post and comments (for example, comments like “Are you ok?” and “Can I help?” can be strong indicators). In some instances, we have found that the technology has identified videos that may have gone unreported.

Improving how we identify first responders and dedicating more reviewers

  • Our Community Operations team includes thousands of people around the world who review reports about content on Facebook. The team includes a dedicated group of specialists who have specific training in suicide and self harm.
  • We are also using artificial intelligence to prioritize the order in which our team reviews reported posts, videos and live streams. This ensures we can get the right resources to people in distress and, where appropriate, we can more quickly alert first responders.
  • Context is critical for our review teams, so we have developed ways to enhance our tools to get people help as quickly as possible. For example, our reviewers can quickly identify which points within a video receive increased levels of comments, reactions and reports from people on Facebook. Tools like these help reviewers understand whether someone may be in distress and get them help.
  • In addition to those tools, we’re using automation so the team can more quickly access the appropriate first responders’ contact information.

[Screenshot: support for a broadcaster]
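To make the "signals" and prioritization bullets in the Newsroom post above a bit more concrete, here is a hedged guess at how comment text (e.g. "Are you ok?") and report counts could be folded into a single review-ordering score. The phrases, weights, and field names are assumptions for illustration, not Facebook's implementation.

# Hypothetical illustration of the "signals" idea quoted above: concerned
# comments and user reports bump a post up the review queue. The phrases and
# weights below are invented for illustration only.

CONCERN_PHRASES = ("are you ok", "can i help", "please call me")

def review_priority(model_score: float, comments: list[str], report_count: int) -> float:
    """Combine a model score with social signals into one ranking value."""
    concerned = sum(
        any(phrase in c.lower() for phrase in CONCERN_PHRASES) for c in comments
    )
    # Arbitrary example weights; a real system would learn or tune these.
    return model_score + 0.1 * concerned + 0.2 * report_count

posts = [
    {"id": "a", "model_score": 0.4, "comments": ["Are you ok??", "dm me"], "reports": 1},
    {"id": "b", "model_score": 0.6, "comments": ["nice photo"], "reports": 0},
]
ordered = sorted(
    posts,
    key=lambda p: review_priority(p["model_score"], p["comments"], p["reports"]),
    reverse=True,
)
print([p["id"] for p in ordered])  # highest-priority post is reviewed first -> ['a', 'b']

If something along these lines is what Facebook means by prioritization, it would also fit the quoted claim about accelerated reports reaching local authorities faster: the riskiest items simply sit at the front of the reviewers' queue.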

My concern, then, would be: if someone posts something potentially suicidal on Facebook and the AI picks it up, will it automatically trigger a call from the National Suicide Prevention Lifeline, or will it just be a friend sending a message, as if tapping the depressed person on the shoulder? I'm guessing the AI's signals would range from posts quoting lyrics of sad songs to pictures of knives or guns pointed at the temple, or anything else that might suggest self-harm. It pisses me off when someone says depression isn't real, that it's just in your head and you'll get over it; those narrow-minded people will never understand depression until it hits them. I wish Facebook and other social media services like Twitter and Instagram all had something like this to guide people through the dark moments in their lives, show them they're not alone, and put an end to the atrocious and disturbing practice of inviting other people to commit suicide.

 

Edited by hey_yo_

There is more that meets the eye
I see the soul that is inside

 

 


Am I the only one who now wants to go emo on FB to screw up FB's AI?

Intel HEDT and Server platform enthusiasts: Intel HEDT Xeon/i7 Megathread 

 

Main PC 

CPU: i9 7980XE @4.5GHz/1.22v/-2 AVX offset 

Cooler: EKWB Supremacy Block - custom loop w/360mm +280mm rads 

Motherboard: EVGA X299 Dark 

RAM:4x8GB HyperX Predator DDR4 @3200Mhz CL16 

GPU: Nvidia FE 2060 Super/Corsair HydroX 2070 FE block 

Storage:  1TB MP34 + 1TB 970 Evo + 500GB Atom30 + 250GB 960 Evo 

Optical Drives: LG WH14NS40 

PSU: EVGA 1600W T2 

Case & Fans: Corsair 750D Airflow - 3x Noctua iPPC NF-F12 + 4x Noctua iPPC NF-A14 PWM 

OS: Windows 11

 

Display: LG 27UK650-W (4K 60Hz IPS panel)

Mouse: EVGA X17

Keyboard: Corsair K55 RGB

 

Mobile/Work Devices: 2020 M1 MacBook Air (work computer) - iPhone 13 Pro Max - Apple Watch S3

 

Other Misc Devices: iPod Video (Gen 5.5E, 128GB SD card swap, running Rockbox), Nintendo Switch


3 minutes ago, Sauron said:

I don't use facebook, problem solved :P 

Is this the new/old "I'm vegan?" /s

 

Even if you were using facebook, since they already know everything about you, this shouldn't make a difference for you personally. I for one support this. I know people with depression, and it's a bitch. 

The ability to google properly is a skill of its own. 


I'll eat my hat if this will actually be used for ONLY this purpose. Facebook is not your friend. They are doing this for some other purpose under the guise of looking all helpful and innocent.


This is what would make Facebook's data collection useful: preventing bad things from happening to people, not trying to sell me product XYZ. Given Google's and Facebook's location collection methods (they track you even if you disable it), the data could also be rather useful in crime prevention and resolution. For example, if someone repeatedly shows up near a location they had never previously visited, and a theft or break-in happens there a while later, that could be used to narrow down the suspect list (or fill it). It wouldn't be a perfect system, since thieves could simply turn off their phones, but given how many people never leave their phone behind these days, it would be somewhat reliable.


1 hour ago, Bouzoo said:

Is this the new/old "I'm vegan?" /s

 

Even if you were using facebook, since they already know everything about you, this shouldn't make a difference for you personally. I for one support this. I know people with depression, and it's a bitch. 

Honestly, if I were depressed I wouldn't post about it on social media, and I wouldn't want people finding out through analysis of what I post. Doctors take an oath that includes confidentiality about their patients' conditions; Facebook never did. Besides, imagine getting a random visit from a first response team and finding out that Facebook sent them because it misinterpreted your posts as indicative of suicidal thoughts... it's a strange world if Facebook is more effective at anticipating your suicide than your family or friends.

 

But hey, if it actually saves lives I guess why not... I just wouldn't want this "service" for myself.

16 minutes ago, HarryNyquist said:

I'll eat my hat if this will actually be used for ONLY this purpose. Facebook is not your friend. They are doing this for some other purpose under the guise of looking all helpful and innocent.

Maybe, maybe not - after all, the data they use for this is already at their disposal. My guess is that this was relatively easy to implement and grants them a PR boost, so they figured it was worth the investment. It certainly isn't as creepy as when they asked for your nudes.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


I know there are definitely people in trouble who really need help, but in my experience a lot of people are joking when they say they are going to kill themselves.

 

What I see a lot is people saying things like "If I don't get a good grade on my test, I'm going to kill myself." Of course that's just an example and I know they're joking, but I still hear and see it a lot. How is Facebook going to tell the difference between people who are joking and people who are seriously in trouble?

 

Edit: I do think that this is a good idea though

Edited by Bigbootyjudy

The only reason I'm here is that I have homework that I don't want to do

 

PC Specs: CPU: Intel Celeron N3060 | GPU: Intel HD Graphics 400 | RAM: 2 GB | Storage: 16 GB

 

 

It took me half an hour to find where to change my signature :(


As someone who has actually tried to kill themselves, even I don't know how I feel about this. I never posted about how depressed I was on any social media because I didn't want people to feel bad for me. But when a friend reached out to me because he noticed I wasn't the same, it really helped a lot.

 

I like the idea; I just don't want it to have a negative impact on people who are suicidal.


Yeah, but they'll never take me alive.

- ASUS X99 Deluxe - i7 5820k - Nvidia GTX 1080ti SLi - 4x4GB EVGA SSC 2800mhz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -


1 hour ago, mynameisjuan said:

 

 

I like the idea; I just don't want it to have a negative impact on people who are suicidal.

 

Or what happens when there's yet another fuckin' data leak, insurance companies get hold of this information, and then they deny you coverage based on it?


So... not only will FB help you track down your nudes, but it will find out if you're suicidal and then attempt to sell you the rope to hang yourself with... Effin' brilliant!


Scary. We are heading towards a future where machines know what you want and how you feel better than you do. This suicide AI is just the beginning. I would be surprised if they aren't already testing all kinds of AIs on our data.

Laptop: Acer V3-772G | CPU: i5 4200M | GPU: GT 750M | SSD: Crucial MX100 256GB
Desktop: CPU: R7 1700x | GPU: RTX 2080 | SSD: Samsung 860 Evo 1TB


12 minutes ago, JuztBe said:

Scary. We are heading towards a future where machines know what you want and how you feel better than you do. This suicide AI is just the beginning. I would be surprised if they aren't already testing all kinds of AIs on our data.

Maybe this post was from an AI posing as a person to distract us from the truth?!

 

**Puts on tin foil hat**

 

But seriously, wouldn't it be crazy if we actually let robots control our lives like that? I mean, we already let them into our daily lives as it is (I'm looking at you, Google, Siri, Cortana, and Alexa)...

 

Why do people continue to use a platform like this? I guess I don't understand the social media addiction.


Post this one and you get flagged: ;)

On topic: Doing things like this is good, but it's a shame it's necessary. It also feels like an excuse to have an AI look at all your data (and profile you for advertising).

Mineral oil and 40 kg aluminium heat sinks are a perfect combination: 73 cores and a Titan X, Twenty Thousand Leagues Under the Oil


Forcing help onto people like this doesn't really work. If someone wants to die, they should be allowed to die; ideally while they're rational and not intoxicated, but that should be a choice they're allowed to make.


The only reason I could see for Facebook doing this is to squeeze more money out of people.

 

They are a company after all and it's not in their best interest to treat us with respect and dignity.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022),

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


5 hours ago, Sauron said:

it's a strange world if Facebook is more effective at anticipating your suicide than your family or friends.

Nowadays a lot of people (that I know) tend to share more online than they do with their families. What a time to live in.

The ability to google properly is a skill of its own. 


facebook vs bluewhale

who would win 

One day I will be able to play Monster Hunter Frontier in French/Italian/English on my PC, it's just a matter of time... 4 5 6 7 8 9 years later: It's finally coming!!!

Phones: iPhone 4S/SE | LG V10 | Lumia 920 | Samsung S24 Ultra

Laptops: Macbook Pro 15" (mid-2012) | Compaq Presario V6000

Other: Steam Deck

<>EVs are bad, they kill the planet and remove freedoms too some/<>


6 hours ago, JoseGuya said:

If Twitter implemented this, the servers would crash from so many flagged posts.

Twitter nowadays is a cesspool of toxic political rants and whining politicians. If Twitter added a similar AI, it might flag certain politicians' diplomatic talk, or their rants about something, as cries for help.

2 hours ago, suicidalfranco said:

facebook vs bluewhale

who would win 

I hope Facebook. It’s the lesser of the two evils.

There is more that meets the eye
I see the soul that is inside

 

 


As if it weren't bad enough that half the internet thinks it can diagnose and fix medical conditions, now we have an early-development AI system working inside an advertising-fueled social media platform doing it too.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

