Facebook has been talking up its third-party fact-checking partners as the company continues its war against fake news and misinformation campaigns.
But at least one partner isn’t actually reviewing much content.
FactCheck.org, one of several third-party reviewers contracted by Facebook, told the Wall Street Journal that it reviews fewer than one Facebook post per day, on average.
The company has been ramping up detection efforts in light of foreign interference in the 2016 presidential election, and ahead of the midterm elections in November. CEO Mark Zuckerberg testified before Congress that the company’s AI algorithms were getting better, but that human fact-checkers were essential to the “arms race” of content moderation.
This week, Facebook showed off its recently formed “War Room” dedicated to election security.
FactCheck.org told the Journal it assigned two full-time staffers from its eight-person team to work specifically with Facebook, but those staffers are seeing little work. Other third-party organizations reported similarly light workloads, the Journal said.
Facebook first announced the fact-checking partners in 2016 and at the time identified FactCheck.org, Snopes, Politifact and ABC News. The company now includes the Associated Press and The Weekly Standard Fact Check among its U.S.-based reviewers.
Facebook has promised that 20,000 human fact-checkers will be working on content moderation by the end of the year, but it says there are still limitations.
“Even where fact-checking organizations do exist, there aren’t enough to review all potentially false claims online. It can take hours or even days to review a single claim,” Tessa Lyons, a product manager in charge of News Feed, said in a June blog post.
Representatives for Facebook and FactCheck.org were not immediately available to comment.