YouTube has been at the center of a great deal of controversy recently, with videos aimed at children said to include messages encouraging self-harm, and pedophile rings operating in the comments sections of supposedly ‘kid-friendly’ content.
Unsurprisingly, this has caused a great deal of public backlash, and many high-profile companies have pulled their advertisements from the platform, but today YouTube has promised a slew of changes that aim to remedy the problem.
In a post on YouTube’s Creator Blog, the video streaming service announced that it has begun removing the ability to comment on videos containing “young minors”, with “tens of millions” of videos already affected.
“We will continue to identify videos at risk over the next few months,” YouTube stated. “We will be broadening this action to suspend comments on [...] videos featuring older minors that could be at risk of attracting predatory behavior”.
While a select few creators will still be able to retain comments on their videos, they will have to moderate those comments rigorously for nefarious activity and “demonstrate a low risk of predatory behavior”.
Prevention is the best cure
On top of the manual work YouTube has done in removing “hundreds of millions of comments”, it has also developed a new classifier with a broader scope that automatically detects and removes twice as many policy-violating comments as was previously possible.
To cap off the post, YouTube has declared that it has “terminated certain channels that attempt to endanger children in any way” and will continue to do so as they are discovered, in part with the help of users flagging posts.