It's a commendable stance, albeit one that needs to be backed up by actual law if it is to convince companies to switch to an active moderation system.

Sheer volume

However, as you have probably noted, there is another problem with active moderation: it is a costly and time-consuming process, and when your volume of UGC matches that of a site like YouTube or Flickr, the financial burden is potentially massive.

Indeed Google – the owners of YouTube – have expressed doubts that it would even be feasible to proactively moderate every post.

"We have strict rules on what's allowed, and a system that enables anyone who sees inappropriate content to report it to our 24/7 review team and have it dealt with promptly," a spokesman told the BBC.

"Given the volume of content uploaded on our site, we think this is by far the most effective way to make sure that the tiny minority of videos that break the rules come down quickly."

Not an excuse

The committee does not accept that volume is an excuse, saying: "We found the arguments put forward by Google/YouTube against their staff undertaking any kind of proactive screening to be unconvincing.

"To plead that the volume of traffic prevents screening of content is clearly not correct: indeed, major providers such as MySpace have not been deterred from reviewing material posted on their sites.

"Even if review of every bit of content is not practical, that is not an argument to undertake none at all."

So, essentially, the Committee are saying that companies should be seen to try to moderate their content even if they can't do it with 100 per cent effectiveness – which makes a lot of sense.

The WORLD wide web

Of course, regulating UK sites is one thing, but the laws don't apply to much of the rest of the internet – a truly global network.

However, Committee chair John Whittingdale MP told TechRadar that this shouldn't lead to a laissez faire attitude.

"Just because people can get around the rules doesn't mean that there should be no rules," said Whittingdale.

"We want the industry to self-regulate and produce its own list of standards that people comply to.

"The sites that are prepared to comply should want to advertise this to their users. It should be something that companies are proud of saying: 'we will keep your kids safe from harmful content'.