Following suicides and lawsuits, Snapchat restricts apps building on its platform with new policies

After a bullied teen died by suicide, a grieving mother last year sued the platform where the abuse had taken place — Snapchat — for not doing enough to protect its younger users. Another lawsuit, related to another suicide, followed last month. In response to the former, Snap banned the anonymous messaging apps that had facilitated online bullying and vowed to revamp its policies to address what sort of Snapchat-connected experiences could be built using its developer tools. Today, the company announced the results of its policy review and the changes it’s making.

Effective immediately for new developers building on its Snap Kit platform, Snap is banning anonymous messaging apps and will require anyone building friend-finding apps to limit those apps to users 18 and up. Existing developers are being given 30 days to come into compliance with the new policies.

These changes are limited to third-party apps integrated with Snapchat and are not intended to address other child safety issues on Snap’s platform.

Snap says the policy update will impact a small subset of its community of over 1,500 developers. Only around 2% of developers will be impacted by the prohibition on anonymous messaging apps, while another 3% will be impacted by the new requirement to age-gate their apps. The company also noted that developers who remove anonymous messaging from their apps can have their apps re-reviewed and remain a Snap Kit partner.

Sendit, an app that greatly benefited from the earlier ban on the anonymous messaging apps YOLO and LMK, is among those that will need to make changes in order to continue working with Snapchat. In the months following those bans, Sendit gained millions more downloads from teens who still wanted a way to post anonymous Q&As.

The draw of anonymous social apps is unquestionable, especially for young people. But over the years, time and again, it's been proven that such apps cannot be used responsibly and can result in devastating consequences. From the early MySpace days to the teen suicides linked to Ask.fm to the unfortunately well-funded anonymous apps like Secret and Yik Yak (neither of which lasted), anonymity in the hands of young people has been tested and has consistently failed. Considering this history, it was arguably irresponsible to permit this sort of activity on Snapchat in the first place, given its core demographic of teens and young adults.

In addition to the anonymous messaging ban, Snap will also now limit friend-finding apps to adult users ages 18 and up.

Friend-finding apps are designed to connect users with strangers on Snapchat, can encourage people to share their personal information, and are a common avenue for child predators to reach younger, vulnerable Snapchat users. Often, the apps are used for dating or sexting rather than "friend-finding," and can be filled with porn bots. For years, law enforcement officials and child safety experts have warned about child predators on Snapchat and dubbed friend-finding apps "Tinder for teens."

Issues with these apps continue today. For example, an investigation published last month by The Times detailed the rampant sexual abuse and racism taking place on one of these apps, Yubo.

The anonymous messaging ban and restrictions on friend-finding apps are the only two major changes being made to Snap's policies today, but the company notes that developers' apps will still have to go through a review process in which developers must answer questions about their use cases and demo their proposed integrations. Snap also said it will conduct periodic reviews every six months to ensure the functionality of the apps hasn't changed in a way that would violate its policies. Any developer who intentionally seeks to deceive Snap will be removed from Snap Kit and the developer platform altogether, it added.

“As a platform that works with a wide range of developers, we want to foster an ecosystem that helps apps protect user safety, privacy and well-being while unlocking product innovation for developers and helping them grow their businesses,” a Snap spokesperson said in reference to the policy updates. “We believe we can do both, and will continue to regularly evaluate our policies, monitor app compliance, and work with developers to better protect the well-being of our community.”

Snap’s platform safety still needs work

While the changes impact third-party apps integrating with Snapchat, the company has yet to address child safety issues on its own platform through something like an age-gated experience for minors, similar to TikTok's, or through the launch of its promised parental controls, which Instagram and TikTok now have.

However, the company, whose app is rated 13+, has restricted the visibility and findability of minors' profiles, provides users with tools and reminders to maintain their friend lists, requires mutual friending before messaging (for users under 18), and provides links to safety resources, like mental health lines.

Despite those efforts, today's changes still leave a lot of child safety work to be done.

Platform safety is already top of mind for social media companies industry-wide as regulatory pressure heats up. For its part, Snap was hauled before Congress last fall to answer lawmakers' questions about various safety issues impacting minors and young adults using its app, including the prevalence of eating disorder content and adult-oriented fare that's inappropriate for Snapchat's younger teenage users but not blocked by an age gate.

Snap was also sued this January alongside Meta by another family that lost their child to suicide after she succumbed to pressure to send sexually explicit photos that were later leaked among her classmates. The complaint states that Snapchat’s lack of verification of the child’s age and its use of disappearing messages contributed to her death. In addition, the suit mentions how anonymous messaging played a role, though it doesn’t directly reference the use of third-party anonymous apps.

In the same month, Snap addressed other issues with its friend recommendation feature to make it harder for drug dealers to connect with teens on the app. The problem had been the subject of an NBC News investigation that connected Snapchat with the sale of fentanyl-laced pills that had killed teens and young adults in over a dozen states.

Prior to that, the company faced lawsuits over its "speed filter," which let users take photos showing how fast they were going. The filter contributed to numerous car accidents, injuries, and even deaths over the years. Snap initially disabled it at driving speeds, then removed it entirely in 2021. (Snap declined to comment on this matter because litigation is pending.)

Now that lawmakers are finally looking to rein in the Wild West days of Big Tech, where growth and engagement were consistently prioritized over user safety, Snap has been preparing to make changes. It hired its first-ever head of platform safety, Jacqueline Beauchere, in September.

Snap CEO Evan Spiegel in October also said the company was developing parental control tools. These tools — which would follow the launch of parental controls on TikTok and, just this week, Instagram — will allow parents to see who their teens are talking to on the app.

Snap hasn’t said if the tools will address parents’ other concerns — including a way for parents to disable the child’s access to sending or receiving disappearing messages, restrict friend requests or require approvals, block the child from sharing photos and other media, or hide the adult-oriented (and often clickbait-y) content that features prominently in the app’s Discover section.

“We want to help provide ways for parents and teens to partner together to ensure their safety and well-being online — similar to the ways parents help prepare their kids in real life,” a Snap spokesperson said of the parental controls. “We hope that these new tools will serve as a conversation starter between parents and their teens about how to be safe online.”

The company said its initial suite of parental controls is on track for a launch this year. The developer policy changes are live now.

If you or someone you know is struggling with depression or has had thoughts of harming themselves or taking their own life, The National Suicide Prevention Lifeline (1-800-273-8255) provides 24/7, free, confidential support for people in distress, as well as best practices for professionals and resources to aid in prevention and crisis situations.

Correction, 3/17/22 2:45 PM ET: Snapchat’s speed filter was initially disabled at car speeds, but not fully removed from the platform until 2021. 
