Can Facebook ever be kept safe without hurting staff?


In 2017, Facebook ever so slightly adjusted its mission statement. Out went a pledge to “make the world more open and connected”, and in its place came an intention to “give people the power to build community and bring the world closer together”.

You could view this as an admission that 'open' had failed. 'Open' means open to hate speech, child abuse, violence, sex and the kind of illegal acts Facebook would rather have nothing to do with. And yet the company now finds itself having to clean up such messes every hour of every day.

Or rather, it employs outsiders to do said dirty work. In The Cleaners, a documentary by Hans Block and Moritz Riesewieck, contractors from the Philippines candidly discuss the steady stream of sex, violence and hate speech they have to sift through every day.

Former Facebook moderator Chris Gray and filmmaker Moritz Riesewieck at Mozfest 2019 (Image credit: Connor Ballard-Bateman)

They have to make each decision in eight to 10 seconds, they say, and “don’t overthink” is a direct quote from the training materials, such as they are. “Don't doubt too much whether your decision is right or wrong, because otherwise you will overthink it, and then you won't be able to take a decision,” Riesewieck summarises to TechRadar at Mozilla’s Mozfest, where he and his co-director have just been on a panel discussing internet moderation.

If ever there were a company to stress test the idea that any problem can be solved with enough money, it’s Facebook. And yet, so far, the problem just continues to grow. In 2009, Facebook had just 12 (yes, that’s twelve) content moderators looking out for the welfare of 120 million users. There are now over two billion people on the platform and around 15,000 moderators. While that means the moderator-to-user ratio has improved from paltry to merely feeble – roughly one moderator per 10 million users then, versus one per 133,000 now – it’s worth reflecting that Facebook in 2019 is very different to what it was a decade ago, when the Like button was the latest innovation and Facebook Live was still years away.

"The worst of the worst of the internet's trash"

“Estimates say that there are about 100,000 professionals that work in this field,” says Clara Tsao, a Mozilla fellow and expert in countering online disinformation. They “deal with the worst of the worst of the internet’s trash,” she adds, noting that on 4chan they’re literally called 'janitors'.

Unlike real-world janitors, though, the internet’s cleaners aren’t always given the right equipment for the enormous task at hand. Facebook’s Filipino contingent would occasionally encounter exchanges in languages they didn’t speak, relying on Google Translate to follow the meaning. That inevitably takes a sledgehammer to nuance, before you even get to the cultural differences between countries separated by an eight-hour time difference.

Facebook moderators have to monitor vast amounts of content from around the world, and may be required to assess conversations in a language they don't speak (Image credit: Shutterstock)

Facebook moderators aren't only located in the Philippines. There are offices around the world, and it was in Dublin that Chris Gray found himself after a spell teaching in Asia. Now he’s the lead plaintiff representing moderators in High Court proceedings against Facebook. Over a nine-month stint at the company (in Ireland workers are typically on 11-month contracts, he says, but most leave early), Gray was dealing with 500-600 pieces of content a night, usually in the 6pm to 2am slot. It was only a year after he left that he was officially diagnosed with PTSD.

“It took me a year before I realised that this job had knocked me on my arse,” he says as part of the panel discussion. This delayed reaction, Riesewieck tells us, isn’t wholly uncommon. “In some cases they told us it's mostly their friends telling them that they changed,” he explains.

It took me a year before I realised that this job had knocked me on my arse

Chris Gray

In any case, many of Gray’s former colleagues are privately pleased that he has broken his NDA and is leading the charge towards legal action – even if they’re not prepared to say so publicly just yet. “People are just coming out of the woodwork and saying, ‘Oh, thank God, somebody has spoken out and said this,’” he tells TechRadar later.

To be clear, despite having personally been affected by the work, Gray feels that it’s misleading to assume it’s non-stop gore, child exploitation and sex. “To be honest, most of the work is tedious,” he says. “It’s just people reporting each other because they're having an argument and they want to use some process to get back at the other person.”

Tedious, but high pressure. In the Irish office, Gray had 30 seconds to pass verdict on each piece of content, whether it was a one-line insult or a 30-minute video. “If your auditor clicked in [on a video] two seconds later than you and he saw something different – he heard a different slur, or he saw something higher up the priority ladder – then bang, you've made a wrong decision.” Wrong decisions affect a moderator's quality score, and quality score affects their employment. Despite this, the target for the office was a nigh-on impossible 98% accuracy.

Superheroes

Finding people willing to talk about their moderation experience is tough, as Block and Riesewieck found when looking for subjects. NDAs are universal, and the work comes under a codename – at the time of filming it was 'Project Honey Badger'.

Despite this, Facebook – or rather the subcontractors that handle its moderation – hires quite openly, even if the adverts are often grossly misleading about what the job actually entails. “They use superheroes in costumes: ‘come be a superhero, clean up the internet’,” explains Gabi Ivens, another Mozilla fellow on the panel. “One advert in Germany for content moderators asked questions like ‘do you love social media and want to be up to date with what's happening in the world?’”

But despite the general tediousness of the day-to-day, there’s a surprising element to Block and Riesewieck’s documentary: many of their interview subjects took real pride in the role, seeing it as less of a job and more of a duty.

Filipino Facebook moderators told filmmakers Hans Block and Moritz Riesewieck they felt it was their ethical duty to clean up the internet (Image credit: Shutterstock)

“They told us they feel like superheroes of the internet – like policemen guarding the internet,” says Block. The directors credit this in part to the Philippines’ 90% Christian populace. “They told us they feel like Jesus freeing the world from it,” Block adds. This, in turn, might make people reluctant to walk away, seeing the work as an ethical duty rather than just a job.

But there are limits to this, especially as moderators aren’t making the final calls themselves. Here, the sacred text is Facebook’s labyrinthine set of rules and instructions: thousands of words accumulated over many years. In some cases, people are having to protect speech they think should be banned, or ban speech they think should be protected, something that Ivens sees as an obvious problem for wellbeing. “Keeping content online that you don’t think should be online is extremely damaging, even before you think about what people are seeing.”

The irony of treating the rules as sacred is that Facebook’s rules have never been an infallible set text: they’re the result of years of iterative changes, gradually responding to crises as they emerge, and trying to make the subjective more objective.

Keeping content online that you don’t think should be online is extremely damaging, even before you think about what people are seeing

Gabi Ivens

Remember the 'free the nipple' campaign? In short, Facebook's guidelines originally said that any photograph showing breasts should be banned as pornographic, which meant proud mothers couldn't share breastfeeding photos on the platform. Facebook gradually shifted its rules and accepted that context matters. In the same way, it’s had to accept that although there’s nothing illegal about people eating Tide Pods or spreading anti-vaccination conspiracy theories, if something becomes a public health epidemic then it has a duty to step up.

"Some platforms say certain content might not be unlawful, but is unacceptable,” explains Tsao. But “other people feel like the internet should have broader freedoms to say whatever you want.” For Facebook, this dichotomy produces absurd levels of granularity: “Now we've got some guidance on if you threaten to push somebody off a roof,” Gray says. “Pushing is not a violent action. The fact that you're on a roof is important, but then how high is the roof?” So much for that “don’t overthink” guidance.

This kind of inertia in moderation guidelines lets internet trolls thrive. You don’t have to look very hard to come up with examples of rabble-rousers who step right up to the line without ever quite overstepping it. Instead, they leave that to their followers – and sometimes, catastrophically, that spills over into the real world.

Morality doesn’t cross borders

Facebook’s global status makes the problem even more complex because morality isn’t shared across borders. “It's complicated because it surpasses the local policies of countries and borders right into a wild west,” Tsao says.

Gray gives the example of people’s sexuality: gay pride is very much a thing in most of the west, but less so elsewhere in the world. You might tag a friend as gay in a post, and they’re comfortable enough with their sexuality to share it. So in that instance, it feels reasonable not to take the post down, even if a curmudgeonly homophobe complains about it.

Morality isn't a global concept, which makes moderating international content a huge challenge (Image credit: Shutterstock)

“But then if you're in Nigeria you could get beaten or killed because somebody sees that post,” he explains. “That mistake could cost somebody their life. I mean, this is the reality of it: you are sometimes looking at life and death situations.”

Objective acts of violence should be more clear-cut, but they aren’t. Video of a child getting shot might seem like an obvious candidate for deletion, but what if it’s citizen journalism uncovering unreported war crimes? If Facebook takes that down, then isn’t it just the unwitting propaganda wing of the world’s worst despots?

This is the reality of it: you are sometimes looking at life and death situations

Chris Gray

This is complex, easily muddled territory, and it doesn’t help the workers being judged on their objective responses to subjective posts. “People are protesting and it’s appearing on my desk,” Gray says during the panel. “And I’ve got to make the call: is that baby dead? And then I’ve got to press the right button, and if I press the wrong button because my auditor thinks the baby’s not dead, then I’ve made a mistake and it goes towards my quality score and I get fired.

“So I’m lying awake in bed at night seeing that image again and trying to formulate an argument to keep my job.”

Can it be fixed?

It should be pretty obvious at this point that this isn’t entirely Facebook’s fault, even if the company hasn’t exactly helped itself along the way. But what can it do? Clearly, throwing people at the problem won’t work, and AI moderation isn't ready for prime time either. (There are legitimate doubts that it ever will be – for starters, you need humans to train the AI, which just moves the trauma one step back up the chain. “I think it'll be really hard to completely remove humans from the loop,” says Tsao.)

“Facebook don't have a clear strategy for this,” says Gray. “It's all reactive. Something happens, so they make a new rule and hire more people.” He believes a lack of leadership is the root of the problem. “You need to know where you’re going with this and what your strategy is, and they don’t. Everything stems from that.”

Psychology professor Roderick Orner says it's crucially important that nobody does this type of work alone, so responsibility doesn't lie entirely with an individual (Image credit: Shutterstock)

That, Tsao believes, is in part because the decision makers haven’t had to do it themselves. “I've interviewed a bunch of heads of trust and safety at companies, and one of them has always said: ‘if you're going to be in a management role in this professional field, you have to understand what it's like on the bottom’,” she says. “You have to understand the trauma, you have to understand what kind of support system is needed.”

Roderick Orner, a psychology professor from the University of Lincoln, offers his own perspective when we reach out to him. “There is a duty of care. This doesn’t in any way guarantee that there aren’t going to be casualties amongst people who view this kind of material, but the company must be seen to have done everything reasonable to reduce risks to staff.

Nobody should be doing this kind of work alone. And if it's been done by a group then the thing that’s really important is strong group cohesion

Roderick Orner

“First of all, nobody should be doing this kind of work alone. And if it's been done by a group then the thing that’s really important is strong group cohesion. It’s very important to arrange this in such a way that responsibility is not seen to be with the individual.”

Any company hiring for such “dangerous work”, Orner says, should have training so that employees can recognise the warning signs: “a general sense of unease, not being able to relax after work, maybe being unduly preoccupied with certain images. And to be particularly watchful of whether sleep is adversely affected: with an accumulation of poor sleep, everything else feels much worse.”

"What's on your mind?"

Whether Facebook is interested in such insights is another matter. “We don’t claim that all the fault is on the side of the companies – that’s not true,” says Block. “The fault, we consider at least, is they don’t make it transparent, they don’t open the discussion and they don’t accept that they alone can’t decide about all that.” Block and Riesewieck know that some Facebook employees have seen their film at a screening in San Francisco, and there was even talk of showing it at Facebook’s offices, only for follow-up emails to end up mysteriously unanswered.

Certainly the NDA treatment isn’t helping, though the sheer number of current and former employees bound by NDAs means their silencing effect will inevitably lessen – there’s a certain safety in numbers. Gray hasn’t had any word from Facebook over breaking his – at least not directly.

“I had a call a couple of weeks ago from a former colleague… and they said ‘Hey, I hear you’re being sued by Facebook’. No. Who told you I was being sued? ‘My team leader.’ Your team leader is trying to manipulate you into silence.”

I don't know if anything was ever done. You know, it just goes off into the void, it seems

Chris Gray

In other words, the carrot-to-stick balance feels as comically off as Facebook’s moderator-to-user ratio. Given that people want to make the internet a better place, perhaps Facebook could tap into that sense of meaning?

Even Gray remembers feeling positive, recalling a text message he sent in the early days. “I said ‘I have personally escalated 13 cases of child abuse to the rapid response team, and I feel really good’.” But it didn’t last. “I never heard back from those people. I don't know if anything was ever done. You know, it just goes off into the void, it seems.”

Could acknowledgement of making a difference boost morale? Maybe, but only if moderators really are having an impact, Gray fairly points out. “Some kid in Afghanistan is tied to the bed naked and he's being beaten. I escalate that because it's child sexual abuse, but what can anybody do?

“I’m just following the policy, the person next level up is just deleting it.”

Alan Martin

Alan Martin is a freelance writer in London. He has bylines in Wired, CNET, Gizmodo UK (RIP), ShortList, TechRadar, The Evening Standard, City Metric, Macworld, Pocket Gamer, Expert Reviews, Coach, The Inquirer (RIP), Rock Paper Shotgun, Tom's Guide, T3, PC Pro, IT Pro, Stuff, Wareable and Trusted Reviews, amongst others. He is no stranger to commercial work and has created content for brands such as Microsoft, OnePlus, Currys, Tesco, Merrell, Red Bull, ESET, LG and Timberland.