Call of Duty's AI-powered anti-toxicity voice recognition has already detected 2 million accounts
Call of Duty has seen an 8% reduction in repeat offenders since the system was added
Call of Duty's anti-toxicity voice chat moderation system has flagged more than two million accounts, which are now being investigated.
Last year, Activision announced that it would be adding a new real-time voice moderation tool to its more recent Call of Duty games, one that would "enforce against toxic speech" by detecting things like "hate speech, discriminatory language, harassment, and more" from players.
A beta version of the AI-powered moderation system was added to Modern Warfare 2 and Warzone in August 2023, initially in North America and in English only, before it was later rolled out to Call of Duty: Modern Warfare 3, expanded globally (excluding Asia), and given Spanish and Portuguese language support.
Now, in a recent blog post, the company says its anti-toxicity system has detected more than two million accounts that have seen in-game enforcement for "disruptive voice chat, based on the Call of Duty Code of Conduct."
However, Activision says it found an "unfortunate trend": only one in five players report toxic behavior and speech. Still, in cases that go unreported, its voice moderation system can take action against offending players.
"Active reporting is still critical so that players can raise any negative situation they encounter," the company said.
"To encourage more reporting, we’ve rolled out messages that thank players for reporting, and in the future, we're looking to provide additional feedback to players when we act on their reports."
Activision explained that, after examining month-over-month data, Call of Duty has seen an 8% reduction in repeat offenders since the voice chat moderation system was added, and a 50% reduction in players "exposed to severe instances of disruptive voice" chat since Modern Warfare III launched.
Players who violate the Code of Conduct can expect consequences such as being globally muted from voice and text chat and/or having other social features restricted.
"Our tools will continue to evolve and expand over time, including the addition of new languages to our voice moderation system in future updates," Activision added.
"Call of Duty is dedicated to combating toxicity within our games and will empower our teams to deploy and evolve our moderation technology to fight disruptive behavior, whether it be via voice or text chat. We understand this is ongoing work, but we are committed to working with our community to make sure Call of Duty is fair and fun for all.
For more, here's our list of the best FPS games, as well as the best multiplayer PC games available to play in 2024.