Google has updated its YouTube Kids service, putting measures in place that should make it a safer place for children to browse.
In a blog post, it outlines the steps it's taking to safeguard children from potentially harmful content, by adding collections by trusted partners and 'parent approved' content - something that was rumored earlier this year.
It has also added guardrails to search, including the option to turn search off entirely, so that the only videos available are ones that have been pre-approved by parents.
“When we launched the YouTube Kids app three years ago, our goal was to give kids around the world a place to access videos that were enriching, engaging and allowed them to explore their endless interests,” states the blog.
“Since then, our team has continued to work to improve the app experience for kids and families around the world. One area of focus has been building new features that give parents even more control around the content available in the YouTube Kids app so they can make the right choice for their unique family and for each child within their family.”
YouTube Kids’ premise should have been simple: offer a safe viewing experience for children. But a mixture of nefarious uploaders looking to game the system with dodgy, copyright-infringing content and lapses in the algorithm meant that some videos available through the service should never have been seen by children.
When basics like this break down, there is a massive problem. YouTube is hoping its latest improvements will win back users burned by the service and salvage its integrity.
YouTube's post doesn't directly address the controversy over the app, but it does acknowledge that the service isn't foolproof when it comes to safeguarding children from the wrong type of video.
“While no system is perfect,” the post notes, “we continue to fine-tune, rigorously test and improve our filters for this more open version of our app.”
It also puts the onus back on parents, encouraging them to “block and flag videos for review that they don't think should be in the YouTube Kids app”. Given that 8 million videos were taken down in just three months on the full-fat service, YouTube needs all the help it can get to stem the flow of bad content.
The changes outlined by YouTube will start rolling out over the course of the year.