Facebook Content Policy Allows Self-harm Videos to be Livestreamed, Leaked Documents Show


According to a leaked set of about 11 policy documents from Facebook, the company’s policy on self-harm allows such content to be livestreamed until “there’s no longer an opportunity to help the person.”

According to the documents, posted images and livestreamed videos containing such content can be removed unless the incident has “news value.”

The platform’s livestreaming feature, Facebook Live, lets any user broadcast live video to Facebook’s nearly 2 billion users around the world. Facebook Live is now a big part of the Facebook experience, but the Wall Street Journal reports that at least 50 acts of violence have been broadcast since the feature launched.

Self-harm, even more than violence targeted at others, appears to be the prevailing type of such content. The Guardian reviewed one of the leaked documents, which shows 4,531 reports of self-harm during a two-week period last year; the same period this year saw 5,431.

In an internal policy update to its content moderators, Facebook said this:

“We’re now seeing more video content — including suicides — shared on Facebook. We don’t want to censor or punish people in distress who are attempting suicide. Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers.
“However, because of the contagion risk, what’s best for the safety of people watching these videos is for us to remove them once there’s no longer an opportunity to help the person. We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up.”

This is undoubtedly a difficult matter for any company to handle, and because Facebook’s content is all user-generated, the issue becomes even harder to tackle. Facebook already has a team of more than 4,500 content moderators who review reported content every week, and earlier this month, CEO Mark Zuckerberg said the company would hire an additional 3,000 moderators.

To be clear, there’s no way to review the millions of posts, photos, videos, and status updates generated every day. Facebook’s current system reviews only content that other users have reported.

The company has confirmed that it is working to make the review process simpler and clearer and to invest in better tools for the platform, though it did not confirm the authenticity of the leaked documents. In a statement, Monika Bickert, Facebook’s head of global policy management, said:

“Keeping people on Facebook safe is the most important thing we do. In addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”
