Content in moderation: is digital censorship messing with the arts?

Whatever your feelings about the mechanics or the outcome, the broader reasoning behind the app makes a certain amount of sense. Swearing is not allowed on radio and live TV before a certain time of night, clean versions of songs and albums are produced for radio and streaming play, and films are restricted by age ratings based on profanity as well as content.

The publication of Fifty Shades of Grey, which started life as Twilight fan fiction, prompted some to call for the age rating of books. But while some pop-lit titles, like the gross-out Wetlands, come with stickers saying "Not suitable for younger readers", there's no law against a teenager going to a bookshop and buying them. You can see why a parent might want an alternative.

But neither film nor music uses machine algorithms to decide which words should and shouldn't be removed or dialled down. Removing swear words from songs is still done manually: the mix engineer works to a list of words that radio stations will not play, then goes through each track and edits them out by hand. Sometimes swear words are simply cut, leaving just the backing music; sometimes they are replaced with funny sounds; and very occasionally they are replaced with an alternative word provided by the artist.

More often than not, swear words are replaced with their own audio reversed. That's why you hear a lot of "ish" sounds in hip-hop: it's "shit" played backwards. But there is always a human removing the words, and at least some contextual thought goes into how to replace them. What's more, the artist is always at least aware that the process is taking place.


In the age of music streaming, record labels provide streaming services with clean versions of many albums, just as they provide radio stations with radio edits of songs. On streaming services like Spotify, these clean edits are sometimes set as the default version of an album, which can be jarring when particularly profane songs become a hotchpotch of articles, determiners and quantifiers. Spotify is keen to point out that the explicit version is never removed - but you might have to go looking for it.

There's no automated process in film classification either, at least in the UK. The BBFC has stopped relying on a list of swear words to decide what is and isn't suitable for each age rating. It now takes the context of the language into account - a lighthearted "fuck" is rated lower than an aggressive or insulting one, for example. You can even have a "cunt" in a 15-rated film these days, as long as it is "justified by the context".


If you can't handle profanity, how can Clean Readers handle the sex and death in A Game of Thrones?

Every film is watched by at least one human being, with the most controversial movies watched by several. The BBFC's director and president take ultimate responsibility for the rating of every film. It's not for nothing that real people's signatures appear on the classification certificate you see before every movie screened in the UK.

The BBFC is by no means a perfect organisation with a perfect system, and it has had many complaints levelled at it in its time, but imagine if its process were taken over by machines working to algorithms. Cutting and censoring a film without a full and deep appreciation of its contextual meaning or nuances seems anti-democratic, almost dictatorial in its approach.

Of course, the BBFC and Clean Reader are very different things. Clean Reader only wants to offer you the chance to read a book without profanity - it's not taking out rape scenes, cutting violence or erasing suicide, as Thomas Bowdler and his sister did in The Family Shakespeare, published in 1807, very certainly without the Bard's posthumous permission. Nor is it providing a standard against which all books are rated, or saying "this book is now suitable for a 12-year-old to read". But in the digital age of instant access, there are parallels.


Which raises the question: can you really apply rigid machine logic to something with as many shades of grey as film or literature? By programming a machine to replace swear words, we're saying if x then replace with y - if "fuck" then replace with "darn" - as if there were a right and a wrong, a catch-all solution. But that doesn't allow for the hundreds of tiny decisions that go into every word of a novel.
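To make the point concrete, here's a minimal sketch of what that "if x then replace with y" logic looks like in practice. It's written in Python with an invented two-word substitution list - not Clean Reader's actual code or word list - and it illustrates the problem: the rule fires on the word itself, with no idea whether the usage is lighthearted, aggressive or doing careful narrative work.

```python
import re

# Illustrative substitutions only - not Clean Reader's real word list.
SUBSTITUTIONS = {
    "fuck": "darn",
    "shit": "crap",
}

def clean(text: str) -> str:
    """Swap each listed word for its 'clean' stand-in, blind to context."""
    def swap(match: re.Match) -> str:
        word = match.group(0)
        replacement = SUBSTITUTIONS[word.lower()]
        # Preserve a leading capital - and that's as contextual as it gets.
        return replacement.capitalize() if word[0].isupper() else replacement

    pattern = re.compile(r"\b(" + "|".join(SUBSTITUTIONS) + r")\b", re.IGNORECASE)
    return pattern.sub(swap, text)

print(clean("Frankly, my dear, I don't give a fuck."))
# -> "Frankly, my dear, I don't give a darn."
# The same rule applies whether the word is a joke, an insult or the whole point.
```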

"Each word in my books is carefully chosen," Melissa Harrison, author of Clay and At Hawthorn Time, told us. "Swear words perhaps more than most, because they are powerful. When Denny calls Jozef a 'cunt' in my first novel, Clay, it's supposed to be shocking. Moreover, the insult Denny chooses is very precisely determined by his class, age and background; there is no other word that a man like him would use in that circumstance. The word is exactly right."

Similarly, Joe Dunthorne, author of Submarine and Wild Abandon, feels protective over his words. "I take so much time and pride over my choice of profanity that it would feel like a betrayal [to have them changed]. It serves a narrative purpose as well. Specifically in Submarine, the choice of sex words - there's a very long sex scene and the whole idea was that it runs in real time so as you read the sex scene happens at the speed at which the actual sex is occurring. The idea is that as he progresses towards orgasm his choice of sex words become ever more florid and out there, so you can track the progress of his emotional state through the choice of profanity. And that is obviously a deliberate effect that would be lost if any of those words were tagged out in favour of something else."

That's not to say that a human touch is necessarily the answer. The authors we spoke to weren't any more open to having their books 'cleaned' by a human than by a machine. "By definition changing the word is making the wrong choice, so what kind of wrong choice doesn't really matter," Dunthorne said. "I don't think a human's going to have a much more nuanced take on it."


Somewhat misguidedly, Clean Reader's people responded to upset authors by likening literature to a salad. In the analogy, the salad is the book and the blue cheese is the profanity. "Is the chef offended when I don't eat the blue cheese? Perhaps. Do I care? Nope. I payed [sic] good money for the food and if I want to consume only part of it then I have that right. Everyone else at the table can consume their food however they want. Me removing the blue cheese from my salad doesn't impact anyone else at the table."

This blog post prompted the following reaction from Joanne Harris: "NOTE: A BOOK IS NOT A FUCKING SALAD"

If profanity in literature is such a big deal to readers, Harrison is happy for them to forgo her books. Dunthorne has a different solution: let authors who are willing provide the alternatives themselves.

The algorithmic approach comes with too little care, and a human moderator ctrl+F-ing their way through a book isn't going to do much better. "It's interesting to think about whether as a writer you might be given the option to provide alternative words," he suggested. "I.e. if you could basically present an alternative age-rating version of the text where you as the writer get to choose how you have it softened."
