Facebook’s fight against fake news is actually working. Sort of.

Facebook’s promise to fight fake news is finally starting to work. Well, sort of. It depends on where you look. 

Almost two years after the company vowed to start taking its fake news problem seriously, some of those efforts are beginning to pay off, even if things aren’t moving nearly fast enough for some.

The social network has introduced a new series called “The Hunt for False News,” which includes specific examples of widely shared fake news on the platform. 

It’s partly a status update on the company’s efforts to fight misinformation and partly an effort at instilling a bit more media literacy in users (assuming they think to check Facebook’s official blog posts in the first place). The initial post provides three examples of fake news stories that have made the rounds on Facebook over the last several months:

  • A story titled “NASA will pay you $100,000 to stay in bed for 60 days!” (Spoiler: they won’t.) 

  • A video captioned “Man from Saudi spits in the face of the poor receptionist at a Hospital in London then attacks other staff.” (The video was old and originated in Kuwait.)

  • A photo that falsely identified a man as the attacker who stabbed a candidate in Brazil’s upcoming presidential election.

All of the stories were eventually debunked by Facebook’s third-party fact checkers and demoted in News Feed, but not before they spread widely. The fake NASA story, for one, still “racked up millions of views on Facebook” before it was debunked.

“We’re getting better at detecting and enforcing against false news, even as perpetrators’ tactics continue to evolve. And while we caught and reduced the distribution of many pieces of misinformation on Facebook this summer, there are still some we miss,” writes Facebook product manager Antonia Woodford.


On the whole, Woodford says that Facebook is getting better and better at stopping the spread of fake news. Elsewhere, academic studies have also suggested the company’s efforts are paying off. A September study found that websites peddling fake news have seen significant drops in Facebook engagement since 2016 — results Facebook has also touted as proof its fake news initiatives are working.

But even as progress is made, experts have pointed out that there are still serious issues with Facebook’s approach: There simply aren’t enough third-party fact checkers to keep up with the constant flood of misinformation, for one.

Consider this, from a story this week in The Wall Street Journal, which detailed the experiences of some of Facebook’s fact-check partners, including Factcheck.org (emphasis added):

Out of Factcheck’s full-time staff of eight people, two focus specifically on Facebook. On average, they debunk less than one Facebook post a day. Some of the other third-party groups reported similar volumes. None of the organizations said they had received special instructions from Facebook ahead of the midterms, or perceived a sense of heightened urgency.

Reading this, it’s not difficult to understand why it’s so hard for fact checkers to address false information before it’s widely distributed in Facebook’s News Feed. It’s always going to be faster to share something that’s inflammatory and wrong than it is to professionally debunk it. Which brings up another issue: How many people who see or share a fake news story also see its debunking, which can come days or even weeks later? 

Facebook has said that it notifies users and page administrators when a story they previously shared is debunked by a fact checker, but that hardly guarantees they’ll actually see the message (particularly in an era when there’s already an overwhelming number of spammy Facebook notifications). It also does nothing for people who saw the original post somewhere on Facebook but didn’t turn around and share it themselves.

These issues are even more pronounced in countries where false information is especially prevalent and Facebook is particularly influential. Earlier this month, The New York Times reported on the impossible task facing Facebook’s fact checkers in the Philippines.

There, fact checkers not only can’t keep up with the pace of false information but also regularly face death threats and other harassment, according to the report.

The same is true in Brazil, where fact checkers are using WhatsApp to try to counter rampant fake news ahead of the country’s elections. (These efforts aren’t going nearly far enough, according to many experts.)

Facebook, for its part, is aware that it has to keep doing more, even if it can’t wipe fake news out entirely. 

“Because it’s evolving, we’ll never be able to catch every instance of false news — though we can learn from the things we do miss. As a company, one of our biggest priorities is understanding the total volume of misinformation on Facebook and seeing that number trend downward,” product manager Tessa Lyons writes.

So while there is reason to be optimistic about Facebook’s efforts to get ahead of fake news, the problem is still far from solved.

