MPs have urged that sites actively moderate all user-generated uploads, a move that could threaten everything from forums to video providers like YouTube and photo upload sites.
The Culture, Media and Sport select committee believes a new governmental body, to be called the UK Council for Child Internet Safety, should be set up to monitor the internet and attempt to protect children from harmful content.
But it is the committee's comments on user-uploaded content, such as videos on YouTube, posts on forums and photographs on Flickr, that will create the most discussion, with an 'expectation' that these uploads be monitored proactively.
This would mean that every one of the hundreds of thousands of videos, pictures, comments and forum posts put up on UK sites each day would have to be checked before publication, rather than under the current passive system, in which only content that attracts complaints is reviewed.
The cost of such a system would run into millions of pounds; given the sheer volume of videos, pictures and text uploaded by users, many companies would likely opt to disable uploads in the UK rather than pay for the moderation.
The current stance of many of the UK's major sites is that, until the legal system has some kind of ruling in place for the control of user-generated content (UGC) on the internet, they will adopt a passive 'report and take-down' approach, which allows other users to easily flag illegal or upsetting posts for a moderator to review.
"Lack of consistency"
However, the committee found that within this self-regulation by sites there "appears to be a lack of consistency and transparency of practice, and the public needs the assurance that certain basic standards will be met."
The committee stopped short of recommending that internet policing be made mandatory, but the new body is expected to be set up by the end of the year, and the MPs involved add: "We would expect providers of all Internet services based upon user participation to move towards these standards without delay.
"Rather than leap to statutory regulation, we propose a tighter form of self-regulation, under which the industry would speedily establish a self-regulatory body to draw up agreed minimum standards based upon the recommendations of the UK Council for Child Internet Safety, monitor their effectiveness, publish performance statistics, and adjudicate on complaints," concludes the report.