Facebook is nearly doubling the number of workers it employs to monitor Facebook Live video feeds, in an attempt to catch violent streams before they spread across the network.
The social network has had to grapple with several graphic videos being shared widely on its platform in recent months – including a spate of live-streamed suicides and rapes, and the real-time confession of Steve Stephens, who also posted a video of himself killing a man on the network.
Facebook CEO Mark Zuckerberg said in a Facebook post Wednesday that the social network is hiring 3,000 additional workers to its “community operations” team, which is in charge of fielding reports from users that flag inappropriate material on the site. The company currently has 4,500 workers on the team.
The new reviewers “will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” Zuckerberg said. He added that Facebook will keep working with “local community groups” – such as suicide prevention organizations – and law enforcement to offer assistance to people in the videos who may need help.
The network hopes to shorten the gap between the moment a user reports a violent or inappropriate video and the moment Facebook takes it down.
Zuckerberg has previously said little about these violent Facebook incidents; at the company’s annual conference, he expressed his sympathy for those affected by the crimes live-streamed on Facebook’s platform.
Facebook has faced heavy criticism for not having sufficient measures in place to vet, and react to, users who stream inappropriate content on the social network.
(c) 2017, The Washington Post · Hayley Tsukayama