Facebook announced Wednesday it is testing the ability for AI to identify potential “suicide or self injury” posts in the United States, but not yet in Canada.
Facebook is seeking a balance between enabling live, shared experiences and providing a safe environment for its users. This morning the company announced new suicide-prevention tools.
While the vast majority of Facebook Live users engage with the tool to broadcast relatively innocent moments, like weddings and news broadcasts, the social media conglomerate has had to adjust to hard-hitting live news moments, like when Diamond Reynolds live-streamed video on Facebook Live after her boyfriend, Philando Castile, was shot and killed by police.
The company announced Wednesday it is testing the ability for AI to identify potential “suicide or self injury” posts, using pattern recognition trained on posts that have previously been flagged on the site. Its community operations team will then review the posts to decide whether Facebook should surface crisis resources to the user.
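Facebook has not published details of its classifier, but the announcement describes the general idea: score new posts against language patterns drawn from previously flagged posts, and queue high-scoring posts for human review. A minimal, purely illustrative sketch of that idea, with all names, thresholds, and example phrases invented for this example:

```python
# Hypothetical sketch only: Facebook's actual model is not public.
# We score a post by its word overlap with previously flagged posts.
from collections import Counter
import re

def tokenize(text):
    """Lowercase a post and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def build_flag_counts(flagged_posts):
    """Count in how many previously flagged posts each word appears."""
    counts = Counter()
    for post in flagged_posts:
        counts.update(set(tokenize(post)))
    return counts

def risk_score(post, flag_counts, total_flagged):
    """Average, over the post's words, of each word's flagged-post frequency."""
    words = set(tokenize(post))
    if not words:
        return 0.0
    overlap = sum(flag_counts[w] / total_flagged for w in words if w in flag_counts)
    return overlap / len(words)

# Invented training examples standing in for previously flagged posts.
flagged = [
    "i can't go on anymore",
    "this is goodbye everyone",
    "i want to end it all",
]
counts = build_flag_counts(flagged)

# Posts scoring above an (arbitrary) threshold would be routed to the
# community operations team for review, mirroring the announced workflow.
post = "goodbye all, i can't do this anymore"
print(risk_score(post, counts, len(flagged)) > 0.1)
```

In practice a production system would use a far richer model than word overlap; the point here is only the two-stage shape the announcement describes: automated pattern matching first, human judgment second.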
Editor’s Note: In 2016, I flagged a friend’s progressively ominous status updates three times with Facebook, and Facebook deemed them “not in conflict with their community standards.” I tried one more time before he posted a “goodbye letter” on Facebook and attempted suicide. Two friends read his status update in time and prevented his death, with moments to spare. He is in recovery and therapy today, and will be fine. Facebook’s announcement today is a step in the right direction, but I was disappointed at the time that his clearly dangerous status updates were deemed non-threatening when, in fact, they were not.
Unfortunately, this feature is currently being tested only in the United States and not yet in Canada. Users can now chat directly, via Facebook Messenger, with several support organizations through their pages: National Eating Disorder Association, National Suicide Prevention Lifeline and Crisis Text Line. While it hadn’t been publicly announced, the integration between Crisis Text Line and Facebook Messenger has quietly been in place for a while.
A few months ago, Facebook reacted too slowly to the terrifying implications of suicide on Facebook Live when a 12-year-old girl died by suicide on a live broadcast; the company took two weeks to remove the video.
Facebook CEO Mark Zuckerberg acknowledged those stories, and the important role of early detection, in January.
“There have been terribly tragic events — like suicides, some live streamed — that perhaps could have been prevented if someone had realized what was happening and reported them sooner,” he wrote. “Going forward, there are even more cases where our community should be able to identify risks related to mental health, disease or crime.”