YouTube's quest to stamp out the more undesirable aspects of its platform has taken another major step forward today, with the Google-owned video site announcing an update to its harassment policy that applies stricter rules against online bullying and other toxic behavior.
"We will no longer allow content that maliciously insults someone based on protected attributes such as their race, gender expression, or sexual orientation," said YouTube's vice president Matt Halprin in a statement on the matter. "This applies to everyone, from private individuals, to YouTube creators, to public officials."
While YouTube has always been quick to remove personal attacks and threats of violence from the site's comments section, the platform has now pledged to take a stronger stance towards posts that could be considered borderline, such as "veiled or implied threats."
"This includes content simulating violence toward an individual or language suggesting physical violence may occur," said Halprin, further stating that "no individual should be subject to harassment that suggests violence."
The toxicity avenger
On top of this, YouTube will be dealing out harsh consequences for users who display a "pattern of repeated behavior across multiple videos or comments, even if any individual video doesn’t cross our policy line."
Starting today, "Channels that repeatedly brush up against our harassment policy will be suspended from YPP (YouTube Partner Program), eliminating their ability to make money on YouTube," explained Halprin. "We may also remove content from channels if they repeatedly harass someone."
The update to YouTube's harassment policy follows another policy change from earlier this year, in which the video platform announced bans on videos promoting superiority and discrimination.