Facebook announced it will hire 3,000 people to monitor live videos and other violent or harmful content, such as murder and suicide videos, in response to a number of high-profile incidents posted on the social media platform.
“Over the last few weeks, we've seen people hurting themselves and others on Facebook – either live or in video posted later,” CEO Mark Zuckerberg said in a Facebook post. “It's heartbreaking, and I've been reflecting on how we can do better for our community.”
Facebook has come under scrutiny, facing criticism for failing to deal with a spike in heinous crimes, such as murder and rape, being streamed live or uploaded to the social media network.
Zuckerberg said over the next year the company will add 3,000 new workers to the 4,500 people who currently review reported posts. The new staff will help Facebook “get better” at removing content like “hate speech and child exploitation.”
The company will also work with law enforcement and local community groups to help people who may harm themselves, or those who are in danger.
Zuckerberg pointed to a case last week where a person broadcasting on Facebook Live was considering suicide. “We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate,” said the Facebook boss.
“We're also building better tools to keep our community safe,” Zuckerberg continued.
“We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”
Facebook has also been criticized for censoring videos at the behest of law enforcement, as in the case of Korryn Gaines, who was fatally shot by Baltimore County police in an incident in which her five-year-old son was also wounded, and in the case of Philando Castile, whose girlfriend live-streamed the aftermath of his fatal shooting by police.