Facebook rolled out new online tools Wednesday to help those who may be struggling with suicidal thoughts.
The new toolkit lets users flag content that they find concerning. This means that if someone on Facebook sees a direct threat of suicide, they can click on an arrow on the post to report it, the Los Angeles Times reported.
The social media giant partnered with mental health organizations Forefront, Now Matters Now, the National Suicide Prevention Lifeline, Save.org and others on these updates, in addition to consulting people with lived experience of self-injury or suicide.
"One of the first things these organizations discussed with us was how much connecting with people who care can help those in distress," Rob Boyle, a Facebook product manager, and Nicole Staubli, a Facebook community operations safety specialist, wrote in a post.
Facebook will have teams around the world reviewing reports 24/7. The teams prioritize the most serious reports, such as those involving self-injury, and send help and resources to those in distress.
They also will contact the person reporting the posts, "providing them with options to call or message the potentially suicidal friend, or to also seek the advice of a trained professional," Time reported.
The new approach is an update to a clunkier initiative the social media giant launched in late 2011, "that required users to upload links and screenshots to the official Facebook suicide prevention page," Time reported.
The updates will roll out to everyone who uses Facebook in the United States over the next couple of months. The social network is also working to improve its tools for users outside the United States.