Social media giants told to protect children from ‘disturbing content’



Social media platforms have been urged to take more responsibility for the “horrific amount of disturbing content” children are able to access online.

The Children’s Commissioner for England has called on YouTube, Snapchat, Pinterest and Facebook – the owner of Instagram – to back a statutory duty of care and a digital ombudsman to act as an independent arbiter between the platforms and their users.

In an open letter, Anne Longfield said the death of 14-year-old Molly Russell highlighted the amount of harmful material children are able to access online.

The teenager’s family later found she had viewed content on social media linked to anxiety, self-harm, depression and suicide before taking her own life in November 2017.

Molly Russell died in November 2017

Ms Longfield questioned whether the tech firms still had control over the content appearing online, considering the rate at which the platforms had grown.

She said: “If that is the case, then children should not be accessing your services at all, and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage.

“The recent tragic cases of young people who had accessed and drawn from sites that post deeply troubling content around suicide and self-harm, and who in the end took their own lives, should be a moment of reflection. I would appeal to you to accept there are problems and to commit to tackling them – or admit publicly that you are unable to.”

Ms Longfield demanded the social media giants provide information on the amount of self-harm related material available on their platforms, as well as data on the number of under-18s and under-13s using their services.

Anne Longfield reminded platforms that with ‘great power comes great responsibility’

The Children’s Commissioner also asked for details on what support was offered to those who searched for images of self-harm and what criteria were used to decide whether posts or accounts were removed.

Ms Longfield added: “With great power comes great responsibility, and it is your responsibility to support measures that give children the information and tools they need growing up in this digital world – or to admit that you cannot control what anyone sees on your platforms.”

Snapchat said it worked hard to “keep Snapchat a safe and supportive place for everyone”.

“From the outset we have sought to connect our community with content that is authoritative and credible and safeguard against harmful content and disinformation,” a spokesman added.

A spokesman for Instagram and Facebook said: “We have a huge responsibility to make sure young people are safe on our platforms and working together with the government, the Children’s Commissioner and other companies is the only way to make sure we get this right.”

He said the platforms were “taking measures aimed at preventing people from finding self-harm related content through search and hashtags”.

:: If you feel emotionally distressed or suicidal please call Samaritans for help on 116 123 or email [email protected] in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.
