How Will Facebook Prevent Murders and Suicides? 3,000 More Jobs

After several murders, suicide attempts, and crimes broadcast live, the social network is cracking down.

Getty Images / Justin Sullivan

Since its launch in January 2016, Facebook Live has hosted millions of videos. Some of them have included death, murder, and crime. On Tuesday, nearly a year and a half later, and in the wake of another murder live-streamed on the platform, CEO Mark Zuckerberg announced that the site is hiring 3,000 more people to review reports of inappropriate or dangerous Live broadcasts and media.

Zuckerberg says that Facebook already has 4,500 community operations employees, but the high-profile live-streamed murders of Robert Goodwin, Sr. and an infant in Thailand have apparently forced the company to realize that its moderation capabilities were insufficient. The additional moderators will allow the company to process reports faster, which Zuckerberg says could help it keep the site free of the everyday presence of “hate speech and child exploitation.”

In the post, Zuckerberg references a recent case in which Facebook moderators alerted local authorities about a user who was allegedly considering suicide, and the user was able to get help.

“No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need,” Zuckerberg writes.

Zuckerberg addressed Goodwin’s death at the Facebook F8 Developers conference last week.

“We have a lot more to do here,” he said.

In addition to hiring 3,000 more employees, Zuckerberg said the company is also investing in “building better tools to keep our community safe” that will make the reporting system faster and more efficient in cases where law enforcement or other authorities need to get involved.

The new roles may resemble the “Community Operations Safety Analyst” position that Facebook has listed on its contractor jobs page. The listing asks prospective employees to do the following:

Learn and apply user content policies around sensitive site issues. Monitor reports to mitigate risks on the site. Perform analysis to provide feedback on user safety initiatives.

The listing is looking for:

Fast learner and strong communicator; ability to proactively identify opportunities to improve efficiency; ability to learn complicated and nuanced policies, while applying them consistently over time. Language expertise in Arabic, Portuguese or Spanish helpful, but not necessary.

A bachelor’s degree is “preferred but not required” for the job. We reached out to Facebook for more information on the hiring process for the 3,000 new employees and will update when we know more about what the job entails.