Facebook is stepping up efforts to curb violent videos
Plans to hire 3,000 more people to review flagged videos and other content
Facebook has had a number of extremely violent videos posted on its service in recent weeks, and today CEO Mark Zuckerberg outlined changes to its review process aimed at making it faster and more effective to report, remove and respond to such videos.
"If we're going to build a safe community, we need to respond quickly," Zuckerberg wrote in a post. "We're working to make these videos easier to report so we can take the right action sooner – whether that's responding quickly when someone needs help or taking a post down."
The biggest step Facebook is taking is hiring 3,000 more workers for its worldwide operations team. They will join the 4,500 people already employed "to review the millions of reports we get every week, and improve the process for doing it quickly."
The reviewers will scrutinize not only flagged videos but also other content that violates Facebook's terms, such as hate speech and child exploitation. Part of the team's job is also to interface with local community groups and law enforcement to respond when someone may be about to harm themselves or is endangered by someone else.
Technology, too
Technology will also play a key role in Facebook's efforts to remove graphic or offensive content and intervene when appropriate.
"In addition to investing in more people, we're also building better tools to keep our community safe," Zuckerberg wrote.
"We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help," he continued. "As these become available they should help make our community safer."
One tool Facebook already uses is artificial intelligence, which can prevent flagged videos from being re-shared in their entirety. The aim is to let users share videos to spread awareness or speak out while stopping graphic or sensitive content from circulating in full.
Facebook faced backlash last month after a man in Cleveland posted a video of himself shooting and killing another man that was viewable on the service for more than two hours. The company said it did not receive a report about the video until an hour and 45 minutes after it was posted.
Following the incident, Facebook said "we know we need to do better" in reviewing and removing violent content.
Monitoring Live video has been especially challenging for Facebook because events unfold in real time. While the company wants to encourage users to go Live and share experiences with their friends and followers, some users have posted videos of themselves committing crimes and harming themselves or others.
How effective Facebook's new efforts will be remains to be seen. The company is taking steps to curb violent videos, but the scale is daunting: with nearly two billion users and millions of hours of video posted every day, even a 7,500-person team likely won't be able to review and remove flagged content as quickly as Facebook aspires to.
Still, Facebook is willing to try, and is ready to make more changes as appropriate. In response to Zuckerberg's post, Facebook COO Sheryl Sandberg commented: "Keeping people safe is our top priority. We won't stop until we get it right."