The statement follows Zuckerberg’s announcement on Wednesday that Facebook will hire 3,000 more people to monitor the content of Facebook Live videos. The new hires, a response to a slew of violent videos posted to the platform, will join a staff of about 4,500 people, Zuckerberg said. And while this will nearly double the community operations team, Facebook is also working on using A.I. to help flag posts that need to be reviewed.
“No matter how many people we have on the team we’re never going to be able to look at everything,” Zuckerberg said during the call. And with video and live video becoming more popular, monitoring all the posts becomes a huge challenge. “A.I. tools, over time, will be able to do a better job of flagging things for the set of people who are in the community ops team so that we can prioritize what we look at,” Zuckerberg said.
Based on his statements, the most likely scenario has an A.I. combing through posts in real time to flag dangerous activity for the human monitors. In the event of a problem, like the person contemplating suicide on Facebook Live last week whom Zuckerberg highlighted in the call, this would let the community operations team contact the local police to help individuals in trouble.
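That flag-and-prioritize workflow can be sketched in a few lines. This is purely illustrative: Facebook has not published its pipeline, and the `risk_score` classifier, the keyword weights, and the threshold below are all invented stand-ins for whatever models Facebook actually uses.

```python
import heapq
from dataclasses import dataclass, field

# Hypothetical keyword weights standing in for a real ML classifier.
RISK_KEYWORDS = {"suicide": 0.9, "weapon": 0.7, "fight": 0.5}

def risk_score(text: str) -> float:
    """Stand-in classifier: score a post 0.0-1.0 by its riskiest keyword."""
    words = text.lower().split()
    return max((RISK_KEYWORDS.get(w, 0.0) for w in words), default=0.0)

@dataclass(order=True)
class FlaggedPost:
    priority: float                      # negated score: highest risk pops first
    post_id: int = field(compare=False)
    text: str = field(compare=False)

def flag_posts(posts, threshold=0.4):
    """Scan posts; push anything above the threshold onto a review queue."""
    queue = []
    for post_id, text in posts:
        score = risk_score(text)
        if score >= threshold:
            heapq.heappush(queue, FlaggedPost(-score, post_id, text))
    return queue

posts = [
    (1, "lovely day at the beach"),
    (2, "livestream mentions suicide"),
    (3, "street fight broke out"),
]
queue = flag_posts(posts)
# A human reviewer pops the highest-risk item first (post 2 here).
top = heapq.heappop(queue)
```

The point of the min-heap with negated scores is exactly what Zuckerberg describes: the A.I. does not decide anything on its own, it only orders the work so human reviewers see the most dangerous posts first.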
However, this is a move that will take place over a span of years, Zuckerberg said. “Right now there are certain things that A.I. can do in terms of understanding text, understanding what’s in a photo, what’s in a video. That will get better over time. That will take a period of years though to really reach the quality level that we want.” In the meantime, increasing the size of the community operations team doubles Facebook’s capacity to respond to posts or videos that have been reported, and to get help for users in real time.