Facebook Is Using Artificial Intelligence To Reduce Livestreams Of Suicide


Eric Lieberman Managing Editor

Facebook announced Monday that it is using artificial intelligence (AI) to reduce the number of suicides broadcast live on the platform.

By applying automated “pattern recognition” to posts and live videos, the social media company can better detect when a person is expressing thoughts of suicide, which can lead to a faster response and an offer of help.

Broadly, AI is the concept that machines can display human-like knowledge by learning from and understanding their surrounding environment. An artificially intelligent system, for example, can perform near-cognitive functions like problem solving, which often requires adapting to unique circumstances in real time.
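To make the idea of pattern-recognition triage concrete, here is a minimal, purely illustrative sketch. Facebook's actual models are far more sophisticated and are not public; the distress phrases, weights, and function names below are assumptions invented for this example, not the company's method.

```python
# Hypothetical sketch: score reported posts by matching simple distress-related
# phrases, then sort so the highest-risk reports are reviewed first.
# The phrase list and weights are illustrative only.

DISTRESS_PATTERNS = {
    "are you ok": 1,
    "can i help": 2,
    "please don't do this": 3,
}

def risk_score(text: str) -> int:
    """Sum the weights of every known distress phrase found in the text."""
    lowered = text.lower()
    return sum(w for phrase, w in DISTRESS_PATTERNS.items() if phrase in lowered)

def prioritize(reports: list[str]) -> list[str]:
    """Order reported posts so the highest-scoring ones are reviewed first."""
    return sorted(reports, key=risk_score, reverse=True)

posts = [
    "check out my new puppy!",
    "friends keep commenting 'please don't do this' and 'can i help'",
]
print(prioritize(posts)[0])  # the post drawing concerned comments comes first
```

A real system would use trained statistical models rather than a fixed phrase list, but the triage idea is the same: rank reports by estimated risk so human reviewers see the most urgent ones first.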

Facebook is also working to automate the process of identifying the most appropriate first responders, and to increase the number of human content reviewers on its staff.

“We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports,” Guy Rosen, vice president of product management, writes in an official blog post. “We provide people with a number of support options, such as the option to reach out to a friend and even offer suggested text templates … Our approach was developed in collaboration with mental health organizations such as the National Suicide Prevention Lifeline and Forefront Suicide Prevention and with input from people who have had personal experience thinking about or attempting suicide.”

The deliberate broadcasting of suicides to the public appears to have grown in recent months and years.

An Alabama man reportedly took his own life in April while broadcasting on Facebook Live, and local authorities said they had never seen anything like it.

“This was a first for us,” Anthony Lowery, assistant chief deputy with the Baldwin County Sheriff’s Office, told USA Today. “I hope this isn’t a trend starting. It’s one thing to commit suicide. It’s another thing to victimize other people.” (RELATED: Nine People Shot In Philly While Reportedly Dancing On Facebook Live)

Two young girls, both under 15, took their own lives while using live-streaming services. One did so in her family’s front yard.

A young aspiring actor died by suicide in Los Angeles earlier this year while using Facebook Live. The man had been arrested on suspicion of sexual assault shortly before his death.

Facebook announced in early March that it planned on introducing several new features to help prevent suicides, including embedding “live chat support from crisis support organizations” on its own Messenger app.

With the addition of AI, Facebook’s suicide-prevention mechanisms have been ramped up. (RELATED: Many People Are Thankful For Artificial Intelligence This Thanksgiving, Says Poll)

“Using artificial intelligence to prioritize the order in which our team reviews reported posts, videos and live streams,” Rosen explains, “… ensures we can get the right resources to people in distress and, where appropriate, we can more quickly alert first responders.”


All content created by the Daily Caller News Foundation, an independent and nonpartisan newswire service, is available without charge to any legitimate news publisher that can provide a large audience. All republished articles must include our logo, our reporter’s byline and their DCNF affiliation. For any questions about our guidelines or partnering with us, please contact