Why the UK doesn't moderate UGC
An in-depth look at the legal problems with user-generated content
Although its findings may be hotly debated, much of what the Select Committee has to say about the way in which children are kept away from harmful content on the internet is sensible and well measured.
Using the common-sense findings of Tanya Byron as its starting point, the recommendations on a whole host of subjects make a lot of sense, but perhaps the most important declaration is that sites should not be penalised for actively moderating their content.
To understand why and how any of this is relevant to the UK internet industry, you have to consider the confusion that has reigned in the past over the way companies approach UGC such as forums and uploaded pictures and videos.
The way it is
The primary reason that many of the UK's major internet companies – Microsoft's MSN and our own Future Publishing's websites, for instance – have adopted their current policy on UGC is the EC E-Commerce Directive.
This directive deals with the responsibility of companies for content published on their websites of which they do not have 'actual knowledge'. As the Committee's report explains:
"Under regulation 17 of the Electronic Commerce (EC Directive) Regulations 2002 (which transpose the Directive into UK law), companies that transmit Internet content on behalf of others (such as a user's profile page on a social networking site) cannot be held liable for anything illegal about the content if they did not initiate the transmission, select the receiver, or select or modify the information contained in the transmission.
"Nor is a service which hosts Internet content liable for damages or for any criminal sanction as a result of that storage if they do not have "actual knowledge" of unlawful activity or information and if, on becoming aware of such activity, they act "expeditiously" to remove or to disable access to the information."
In other words, if you don't know about it then you can't be held responsible. This has led many companies to take the stance that if they actively moderate their UGC, they could feasibly be considered to have 'actual knowledge' of all content posted, which would make them legally responsible not just for the posting of unsuitable material but also for libellous comments.
This means that the majority of major companies have taken a passive 'report and take-down' approach to ensure that they can use regulation 17 as a defence.
Tanya Byron's response to this was to suggest that the approach "is a bit like saying that it is unfair to ask companies to survey their premises for asbestos in case they find some but fail to remove it safely", adding that "on this issue, companies should not hide behind the law."
It is a fair comment, but it does not salve the fears of the companies that take the passive stance. Under the current system, those that actively moderate could well be deemed to have actual knowledge through their moderation, so you can't blame those who choose to take the 'head in the sand' approach, at least until the law is clarified.
And the committee appears to appreciate this as well, sensibly going so far as to suggest that the government should 'seek amendment to the Directive if it is preventing ISPs and websites from exercising more rigorous controls over content'.
Public interest
The report says: "We do not believe that it is in the public interest for Internet service providers or networking sites to neglect screening content because of a fear that they will become liable under the terms of the EC E-Commerce Directive for material which is illegal but which is not identified.
"It would be perverse if the law were to make such sites more vulnerable for trying to offer protection to consumers. We recommend that Ofcom or the Government should set out their interpretation of when the E-Commerce Directive will place upon Internet service providers liability for content which they host or to which they enable access."