
Facebook to crowdsource fact checking to users with ‘diverse viewpoints’


As Facebook continues to ramp up its fact-checking operation, the company is turning to a new group to help it spot fake news before it goes viral: its users.

The social network has been testing a new fact-checking program that uses part-time contractors, who don’t have professional fact-checking experience but who represent “diverse viewpoints,” to aid in its larger fact-checking effort.

The plan, which is for now a “pilot program,” is notably different from Facebook’s current fact-checking operation, which relies on outside organizations like the Associated Press or factcheck.org. Instead of established news organizations, this pilot program will “leverage the Facebook community” to research potential fake news before it’s routed to professional fact checkers.

The goal, according to Facebook, is to speed up the fact-checking process. 

“For example, if there is a post claiming that a celebrity has died and community reviewers don’t find any other sources reporting that news — or see a report that the same celebrity is performing later that day — they can flag that the claim isn’t corroborated,” the company explains. “Fact-checkers will then see this information as they review and rate the post.”

On the surface, the new fact-checking program makes sense. Facebook’s fact checkers have long complained that the company’s process is too cumbersome and moves too slowly. But the idea of crowdsourcing fact-checking also betrays what some say is Mark Zuckerberg’s fundamental misunderstanding of journalism.

Facebook’s CEO, who has famously said he doesn’t want his company to become the “arbiters of truth,” has long proposed some kind of crowdsourced approach to fact-checking. 

“It’s not about saying here’s one view; here’s the other side,” Zuckerberg told a group of editors and media executives last year, in remarks reported by The Atlantic. “You should decide where you want to be.”

He echoed a similar sentiment earlier this year, in a conversation with Harvard Law Professor Jonathan Zittrain. 

I think the real thing that we want to try to get to over time is more of a crowdsourced model where people, it’s not that people are trusting some sort, some basic set of experts who are accredited but are in some kind of lofty institution somewhere else. It’s like do you trust, yeah, like, if you get enough data points from within the community of people reasonably looking at something and assessing it over time, then the question is can you compound that together into something that is a strong enough signal that we can then use that?

When Zittrain pointed out that such an approach could lead to intentional manipulation, Zuckerberg acknowledged that “there are a lot of questions here, which is why I’m not sitting here and announcing a new program.”

Now, Facebook says it’s working with outside partners to vet all its reviewers in order to ensure “the pool of community reviewers represents the diversity of people on Facebook.” A contracting company called Appen handled the actual hiring, while polling company YouGov provided data showing that the pool of hires is “representative of the Facebook community in the US and reflects the diverse viewpoints — including political ideology — of Facebook users.”

Notably, Facebook doesn’t say whether its new contractors were actually able to determine what is and isn’t accurate during early testing of the program. Instead, the company says YouGov’s study of the program “found the judgments of corroborating claims by community reviewers were consistent with what most people using Facebook would conclude.”
