Facebook Updates Its Efforts to Prevent Suicide and Self-Harm – Adweek


Facebook used World Suicide Prevention Day Tuesday to provide an update on its efforts to prevent suicide and self-harm by users of its platforms.

Global head of safety Antigone Davis wrote in a Newsroom post, “Earlier this year, we began hosting regular consultations with experts from around the world to discuss some of the more difficult topics associated with suicide and self-injury. These include how we deal with suicide notes, the risks of sad content online and newsworthy depictions of suicide. Further details of these meetings are available on Facebook’s new Suicide Prevention page in our Safety Center.”

She also detailed some steps the company took in recent months.

In February, Facebook said it would continue to allow people to share admissions of self-harm and suicidal thoughts, but content that promotes such actions would be removed.

And Instagram stressed that it has never allowed posts that promote or encourage suicide or self-harm, and that it would no longer allow any graphic images of self-harm, even in cases where those images would previously have been permitted as an admission of self-harm.

The Facebook-owned photo- and video-sharing network also said at the time that non-graphic content related to self-harm, such as healed scars, would not be shown in search, hashtags or the Explore tab, and it would not be recommended.

In July, Facebook updated its standards on content related to eating disorders and began sending resources to people who posted content promoting eating disorders or self-harm.

Davis went on to discuss further changes that are in the works.

Facebook plans to hire a health and wellbeing expert to join its safety policy team and focus exclusively on the impact of the social network’s applications and policies.

The company is also providing academic researchers with access to social media monitoring tool CrowdTangle so that they can analyze how people discuss suicide on its platform.

And Orygen’s #chatsafe guidelines were added to Facebook’s Safety Center and to resources on Instagram when someone searches for suicide or self-injury content.

Davis concluded, “We’ll continue to invest in people, technology and resources so that we can do more to protect people on our apps. Visit our Suicide Prevention Resource page to learn more about what’s available.”
