    Categories: Life

Facebook’s Manual Might Be Problematic, but We’re Happy They’re Looking at Content Seriously

By Manasi Nene

Image courtesy Facebook

Did you know that Facebook’s policies and rules include wacky guidelines for match-fixing and cannibalism, alongside the more predictable ones on violence, hate speech, pornography and terrorism? These rules were revealed through a recent Guardian investigation.

Facebook has constantly been in the news for policies that sometimes miss the mark – from taking down videos of breastfeeding and deleting a post that says men are trash, to not taking down the livestream of a boy committing suicide.

Maybe now we can be a little more aware of its stance on things – some of these rules are definitely a step forward. What constitutes revenge porn has been codified, threats of violence have a classification system – credible and generic/non-credible – and Facebook investigates up to 6.5 million accounts a week to determine whether they are fake, malicious accounts.

But some guidelines still have us scratching our heads. The line between credible and non-credible threats of violence is very oddly defined – “I’m going to kill you” and “f— off and die” are seen as non-credible threats, because they’re somewhat abstract. Yes, because a straight-up threat can always seem obscure. Videos of animal abuse and non-sexual child abuse will stay on the site with a warning before the video, and videos of violent deaths will not necessarily be taken down. It will allow videos of abortion (as long as there is no nudity) and self-harm, because doing otherwise goes against Facebook’s line on free speech. There are also other strange clauses in the rulebook that we can’t figure out. For example, works of art that depict nudity and sexual activity are allowed if they’re “handmade”, but not if they’re digital.

It’s a bit depressing how difficult it is to tease out the rationale behind these guidelines: “I’m going to kill you” is not a credible threat because it’s abstract, but the very specific “unless you stop bitching I’ll have to cut your tongue out” still works. What gives?

You get a pretty strong feeling that these guidelines come from a masculine understanding of violence and harassment. Clearly, they view security as the absence of one solid, actionable threat – which may well be the case for men online – but they fail to encompass how individual statements of violence and violent intent contribute to cultures of harassment and bullying, creating frameworks that condone and perpetuate different forms of violence against women. The absence of a physical threat may feel like security to men, but to women living in a misogynist world, these guidelines feel both misguided and frustratingly inadequate.

Still, all is not lost. Even if this manual is problematic, it’s heartening that Facebook is starting to take these matters so seriously. The free/hate speech argument isn’t likely to die down anytime soon, and the policies governing the two are going to become more and more important. According to Facebook, videos of violent deaths and animal abuse are allowed to stay up because they can start conversations among people – but when conversation descends into downright savagery, it’s time to start questioning our policies again.
