Facebook tests News Feed controls that let people see less from groups and pages
Facebook announced Thursday that it’s running a test to give users a sliver more control over what they see on the platform.
The test will go live on Facebook’s app for English-speaking users. It adds three sub-menus to Facebook’s settings for managing what shows up in the News Feed: friends and family, groups and pages, and public figures. Users in the test can keep the ratio of those posts in their feed at “normal” or adjust it up or down, depending on their preferences.
Anyone in the test can do the same for topics, designating subjects they’re interested in or would rather not see. In a blog post, Facebook said the test will affect “a small percentage of people” around the world before expanding gradually over the next few weeks.
Facebook will also be expanding a tool that allows advertisers to exclude their content from certain topic domains, letting brands opt out of appearing next to “news and politics,” “social issues” and “crime and tragedy.” “When an advertiser selects one or more topics, their ad will not be delivered to people recently engaging with those topics in their News Feed,” the company wrote in a blog post.
Facebook’s algorithms are notorious for promoting inflammatory content and dangerous misinformation. Given that, Facebook — and its newly named parent company Meta — are under mounting regulatory pressure to clean up the platform and make its practices more transparent. As Congress mulls solutions that could give users more control over what they see and tear down some of the opacity around algorithmic content, Facebook is likely holding out hope that there’s still time left to self-regulate.
Last month before Congress, Facebook whistleblower Frances Haugen called attention to the ways that Facebook’s opaque algorithms can prove dangerous, particularly in countries beyond the company’s most scrutinized markets.
Even within the U.S. and Europe, the company’s decision to prioritize engagement in its News Feed ranking systems enabled divisive content and politically inflammatory posts to soar.
“One of the consequences of how Facebook is picking out that content today is that it’s optimizing for content that gets engagement, or reaction,” Haugen said on “60 Minutes” last month. “But its own research is showing that content that is hateful, that is divisive, that is polarizing — it’s easier to inspire people to anger than it is to other emotions.”