Facebook Chairman and CEO Mark Zuckerberg testifies at a House Financial Services Committee hearing in Washington, October 23, 2019.
Erin Scott | Reuters
Facebook CEO Mark Zuckerberg on Tuesday said the company’s ability to moderate content on its social networks has been impacted by Covid-19 limiting its use of human moderators.
As the coronavirus began to spread in the U.S. in mid-March, Facebook and its partners sent content moderation contractors home to keep them safe, Zuckerberg said. That decision left Facebook with only its full-time employees to handle human content review, he said.
“Our effectiveness has certainly been impacted by having less human review during Covid-19, and we do unfortunately expect to make more mistakes until we’re able to ramp everything back up,” he said.
Because of this limitation, Facebook decided to prioritize human moderators for initial reviews of the most severe content violations reported by its users. As a result, the company has relied less on human moderators to handle appeals involving other types of content. Zuckerberg said he expects the amount of appealed content to be much lower in the company’s August report.
That drop in content appeal reviews is already visible in Tuesday’s report. Content appeal reviews for January through March came in at 2.3 million pieces of content, down nearly 18% from appeal reviews between October and December 2019 and down nearly 26% from January through March 2019.
Despite this dip, Zuckerberg said Facebook will continue to issue its content moderation transparency reports.
“We’re gonna keep sharing our report even if our numbers dip in some places because I believe transparency in how we’re handling the safety of our community is as important as the reports that we make on our quarterly earnings,” he said.
Facebook is now in the process of bringing its content moderation contractors back online to help review content, and the majority of those reviewers can now work from home, said Guy Rosen, Facebook’s vice president of integrity.
“There’s obviously a difference in what that work is like, so we’re working hard to make sure that we’re prioritizing things the right way,” Rosen said.