
After losing users’ trust, Facebook gives them ‘trustworthiness’ score


TRUST.

Image: Chip Somodevilla/Getty

Facebook knows it has a trust problem. 

In an effort to better manage how its users flag content they object to, the social media giant is planning to rate the trustworthiness of its users on an individual basis. However, the initiative comes at a time when, surveys show, Facebook has lost the confidence of the American public on numerous fronts, from its ability to protect users' private data to the accuracy of the information promoted on News Feed.

The Washington Post reports the company “has begun to assign its users a reputation score, predicting their trustworthiness on a scale from zero to 1.” The score, which was reportedly developed this year, is used for internal calculations by Facebook and does not appear to be displayed in any public capacity. 

Ostensibly, the scores are used to determine which users provide accurate feedback about articles shared on the site. Which, if you're a half-trillion-dollar company looking to outsource vital labor to your users, sounds like an OK idea.

However, much about what the so-called trustworthiness score actually is remains unknown. The Post writes that in addition to a lack of clarity on how the scores are calculated, we also don't know all the specific ways in which they are used. We don't even know whether everyone on Facebook has a trustworthiness rating, or whether that potentially ignominious ranking is reserved for a lucky few.

There are some things we do know, however. In March of this year, the Pew Research Center published survey data showing that “[while] a substantial share of Americans get news from Facebook and other social media sites, very few people express much trust in information on these sites.”

Specifically, continues Pew, “[only] 5% of online Americans say they have ‘a lot’ of trust in the information they get from social media sites.”

But the fun doesn’t stop there. Business Insider Intelligence’s 2018 Digital Trust survey found that 81 percent of those surveyed had “little to no confidence that Facebook will protect their data and privacy.”

Essentially, people trust Facebook neither as a company nor as a service. With Facebook now looking to surreptitiously quantify whether or not the people who spend their lives on its site are liars, it seems the distrust goes both ways.

Facebook, which has credibly been accused of misleading regulators, elected officials, and the press before, disputes the Washington Post’s reporting. When reached for comment, a Facebook spokesperson claimed this was all much ado about nothing. 

“The idea that we have a centralized ‘reputation’ score for people that use Facebook is just plain wrong and the headline in the Washington Post is misleading,” wrote the spokesperson. “What we’re actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system. The reason we do this is to make sure that our fight against misinformation is as effective as possible.”

Which, if accurate, sounds fine. If only there were some way to determine whether or not Facebook is trustworthy.


