UK health minister leans on social media platforms to delete anti-vax content

Social media-fuelled anti-vaxxer propaganda is the latest online harm the U.K. government is targeting.

Speaking on BBC Radio 4’s Today program this morning, health secretary Matt Hancock said he will meet with representatives from social media platforms on Monday to pressure them into doing more to prevent false information about the safety of vaccinations from being amplified on their platforms.

“I’m seeing them on Monday to require that they do more to take down wrong — well lies essentially — that are promoted on social media about the impact of vaccination,” he said, when asked about a warning by a U.K. public health body about the risk of a public health emergency being caused by an increase in the number of British children who have not received the measles vaccination.

“Vaccination is safe; it’s very, very important for the public health, for everybody’s health and we’re going to tackle it.”

The head of NHS England also warned last month about anti-vaccination messages gaining traction on social media.

“We need to tackle this risk in people not vaccinating,” Hancock added. “One of the things I’m particularly worried about is the spread of anti-vaccination messages online. I’ve called in the social media companies like we had to for self-harming imagery a couple of months ago.”

Hancock, who served as the U.K.’s digital minister between 2016 and 2018 before taking over the health brief, held a similar meeting with the boss of Instagram earlier this year.

That followed a public outcry over suicide content spreading on Instagram, after a British schoolgirl was reported to have been encouraged to kill herself by viewing graphic content on the Facebook-owned platform.

Instagram subsequently announced a policy change, saying it would remove graphic images of self-harm and keep non-graphic self-harm images out of searches, relevant hashtags and the Explore tab.

But it remains to be seen whether platforms will be as immediately responsive to amped-up political pressure to scrub anti-vaccination content entirely, given the level of support anti-vaxxer messages can attract among social media users.

Earlier this year Facebook said it would downrank such content in the News Feed and hide it on Instagram in an effort to minimize the spread of vaccination misinformation.

It also said it would point users toward “authoritative” vaccine-related information — i.e. information that’s been corroborated by the health and scientific establishment.

But deleting such content entirely was not part of Facebook’s announced strategy.

We’ve reached out to Facebook for any response to Hancock’s comments.

In the longer term, social media platforms operating in the U.K. could face laws requiring them to remove content deemed to pose a risk to public health if ordered to do so by a dedicated regulator, as a result of a wide-ranging government plan to tackle a range of online harms.

Earlier this month the U.K. government set out a broad policy plan for regulating online harms.

The Online Harms White Paper proposes to put a mandatory duty of care on platforms to take reasonable steps to protect users from a range of harms — including those linked to the spread of disinformation.

It also proposes a dedicated, overarching regulator to oversee internet companies to ensure they meet their responsibilities.

The government is currently running a public consultation on the proposals, which ends July 1, after which it says it will set out any next actions as it works on developing draft legislation.
