Google's YouTube engineers have developed a new system for detecting and tracking videos of illegal child abuse on the internet.

The company's executive chairman Eric Schmidt has detailed how Google is working to stop paedophiles from sharing images and videos of abuse online in a multi-pronged attack.

While the company argues that it isn't directly responsible for the abusive content that lives on the web, it has come under fire for being the conduit through which people come across the images, as well as for its products being used to host them.

Google already uses Microsoft's PhotoDNA picture-detection technology to give each known abuse image a unique digital fingerprint, so it knows if and when that image reoccurs on the web. An actual person is still required, however, to review each image, because "computers can't reliably distinguish between innocent pictures of kids at bathtime and genuine abuse."
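PhotoDNA itself is proprietary, but the broad idea - reduce an image to a compact perceptual fingerprint and compare fingerprints rather than raw bytes - can be sketched with a toy average hash. Everything below (the 8x8 grid, the function names, the match threshold) is an illustrative assumption, not PhotoDNA's actual algorithm:

```python
# Toy perceptual "fingerprint" - NOT PhotoDNA, just the same broad idea.
# An image is represented here as an 8x8 grid of grayscale values (0-255).

def average_hash(pixels):
    """Return a 64-bit fingerprint: each bit records whether that pixel
    is brighter than the image's average brightness."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_match(a, b, threshold=5):
    """Near-identical copies (resized, recompressed) keep similar
    fingerprints, so a small Hamming distance counts as a match."""
    return hamming_distance(a, b) <= threshold
```

The point of hashing the image's structure rather than its bytes is that a re-saved or lightly edited copy still produces a nearly identical fingerprint, which is what lets a known image be spotted wherever it resurfaces.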

Because video is just as big a problem, Google's YouTube engineers are working on a similar technology for moving pictures, called VideoID.

The tech is currently in testing and will be shared with other online companies over the course of next year. It will also be used to locate and track videos that infringe copyright.
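VideoID's design hasn't been published, so the sketch below is only an analogy: one plausible way to extend per-image fingerprints to video is to fingerprint sampled frames and compare the resulting sequences. The sampling step, bit threshold, and function names are all hypothetical:

```python
# Hypothetical frame-based video fingerprinting - an analogy, not
# VideoID's real (unpublished) design.

def frame_fingerprint(frame):
    """Toy fingerprint of one frame, given as a flat list of grayscale
    values: one bit per pixel, compared against the frame average."""
    avg = sum(frame) / len(frame)
    bits = 0
    for p in frame:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def video_fingerprint(frames, step=10):
    """Fingerprint every `step`-th frame, so the signature tolerates
    small edits and re-encoding."""
    return [frame_fingerprint(f) for f in frames[::step]]

def matches(sig_a, sig_b, max_bit_diff=5):
    """Two clips match if their sampled fingerprints line up within a
    small per-frame bit difference."""
    if len(sig_a) != len(sig_b):
        return False
    return all(bin(a ^ b).count("1") <= max_bit_diff
               for a, b in zip(sig_a, sig_b))
```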

No algorithm is perfect

Aside from these tactics, Google is cleaning up search itself, using a "finely-tuned" algorithm to detect and remove links to child sexual abuse material from its results.

Schmidt says that while "no algorithm is perfect", the tweaks have "cleaned up the results for over 100,000 queries that might be related to the sexual abuse of kids".

The next element is what Schmidt describes as deterrence: Google will display warnings when you search for one of over 13,000 queries to "make clear that child sexual abuse is illegal and offer advice on where to get help." This is something that Microsoft's Bing already does.
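Mechanically, such a deterrence warning amounts to checking each query against a list of flagged terms and showing an interstitial on a match. The list contents, function names, and warning text below are placeholders, not Google's actual implementation:

```python
# Illustrative sketch of a query-deterrence check; the flagged-term
# set and warning text are placeholders, not Google's real data.

FLAGGED_QUERIES = {"example flagged query", "another flagged query"}

WARNING = ("Child sexual abuse imagery is illegal. "
           "Help and reporting resources are available.")

def normalise(query):
    """Collapse case and whitespace so trivial variations still match."""
    return " ".join(query.lower().split())

def deterrence_warning(query):
    """Return the warning text if the query is on the flagged list,
    otherwise None."""
    if normalise(query) in FLAGGED_QUERIES:
        return WARNING
    return None
```

With roughly 13,000 flagged queries, a hashed set lookup like this stays effectively instant, which is why the check can run on every search without slowing results.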

The good news is that many companies are working together on this: as well as sharing their tech smarts, Google and Microsoft are building an industry-wide database of these detected images, allowing information to be passed quickly and easily to the authorities.

It's not a perfect plan. While it's unlikely that many paedophiles literally use Google to search for what they want, the measures should stop innocent web users from stumbling across these images - and locating images and videos when they do occur will lead to take-downs and, potentially, arrests.