Leaked Facebook documents reveal the company walks a fine line between free speech and violent or hateful content.

The Guardian newspaper says it obtained the “more than 100 internal training manuals, spreadsheets and flowcharts” outlining how the social media giant decides what content can stay and what gets taken down.

According to the documents, Facebook does allow certain posts that contain violent language. For example, it’s OK to post “let’s beat up fat kids,” but prohibited to post “someone shoot Trump.”

“People commonly express disdain or disagreement by threatening or calling for violence in generally facetious and unserious ways,” reads one of the documents.

Images showing non-sexual physical abuse or bullying of children are allowed as long as there is not a “sadistic or celebratory element.” Live streams of people harming themselves are also allowed, the documents say, because Facebook doesn’t want to “censor or punish people in distress.”

A Facebook representative said the company’s top priority is keeping users safe.

“We work hard to make Facebook as safe as possible while enabling free speech,” said Monika Bickert, Facebook’s head of global policy management. “This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”

Facebook has been under increased pressure to prevent violent content from appearing after a string of violent videos stayed on the site for hours before being deleted.

One particularly gruesome video showed the brutal murder of Robert Godwin, a Cleveland grandfather, in a crime posted on Facebook Live.

The company recently hired 3,000 more people to help curb objectionable material, and the documents obtained by The Guardian reveal that moderators are overwhelmed with requests to review material.

“These reviewers will also help us get better at removing things we don’t allow on Facebook, like hate speech and child exploitation,” Facebook founder Mark Zuckerberg wrote in a post about the hiring. “And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it – either because they’re about to harm themselves, or because they’re in danger from someone else.”

The company also employs algorithms to flag objectionable content.

Facebook also faces criticism when it does take down material deemed offensive.

Last fall, the company removed an iconic photo showing a naked Vietnamese girl fleeing a napalm attack during the Vietnam War. Facebook later reversed its decision and allowed the image to be posted.
