Instagram debuts new Family Center with additional safety tools for parents


Instagram launched its latest attempt to quell the concerns of parents and guardians Wednesday: its brand new Family Center, a one-stop shop with safety tools, parental supervision, and educational resources for those worried about how teens use the app.

The Family Center connects all of Instagram’s online safety resources in one spot. The site includes an education hub, which provides Instagram-specific safety explanations; conversation guides for guardians to discuss digital safety and wellness; and external resources from partner organizations like The Trevor Project and the American Foundation for Suicide Prevention.

Most interestingly, the Family Center lets parents closely monitor accounts from a single dashboard. The dashboard surfaces insights and usage data for profiles that have granted a guardian access; Instagram’s explanation of supervised profiles covers these settings in more detail. Guardians can see how long a supervised account is active on the app, monitor who follows and frequently interacts with the account holder, and get alerts when (and why) the teen reports another account or a post that appears in their feed; they will also be able to set time limits.

The new tools were originally announced in December and are hosted by Instagram’s parent company, Meta. For now, teens have to approve supervision within the settings on their own account. Instagram will introduce a way for parents to set up supervision outside an account’s settings in the future. Supervision is automatically removed from an account when the owner turns 18.

In a blog post about the newly added features, head of Instagram Adam Mosseri wrote that the announcement was “the first step in a longer-term journey to develop intuitive supervision tools, informed by experts, teens and parents.” The initiative was co-created by a safety advisory board, which includes representatives from online safety organizations around the world, as well as a collaborative group made up of teens, parents, and other youth safety advisors.

Mosseri said in a video posted to his Twitter account that Instagram’s Family Center would continue to grow and change as it’s used and as the company receives feedback from parents and teens. “We know parents are busy, and there’s a lot to do in day-to-day life, so we want to make sure these tools are as easy to use as possible,” Mosseri said.

The intent is to create collaborative, supportive relationships between parents and app users. “Encouraging informed parental engagement in their children’s digital presence is an important way to support young people’s wellness online,” wrote Dr. Michael Rich, founder and director of the Digital Wellness Lab at Boston Children’s Hospital, in a statement from Instagram. “Parents can support and monitor their children’s gradual increase in independence as they demonstrate responsible and safe use, with respect for others and for themselves.”

While that’s a noble goal, it’s also a limitation of the new features. They put a lot of power and responsibility in the hands of parents, who have to be prudent enough to hold continuous, active conversations about digital safety with their kids. The tools are also inherently preventative, rather than addressing harm already caused by the app’s use. And what about teens and kids who have no guardian supervision but are still at high risk of abuse and psychological harm online?

Last year, Instagram and Facebook came under fire for failing to protect young users from abuse and inappropriate content, even after discovering that the app’s usage led to negative mental health outcomes for teens. Concerns grew alongside a similar reckoning over the ever-growing TikTok, which continues to raise its own user-safety questions. In December 2021, Instagram representatives, including Mosseri, testified before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security about teen safety and industry regulations.

There’s also a question of consent. Will these tools be as effective if teens don’t feel comfortable sharing their experiences on the app with their parents, or if parents take over as account monitors without the consent of their children? Where do you draw the line between fostering independence and trust, versus keeping teens away from danger on an app like Instagram?

Instagram hopes that the safety tools found in the Family Center, especially educational resources about creating healthy digital boundaries and habits, can start users on that path toward safer, healthier usage. The company’s future plans include allowing parents and guardians to apply the Family Center tools across all Meta accounts, the addition of even more safety monitoring tools, and a rollout of the same safety features to Quest VR in the coming months.

It’ll be quite the test of trust between the app and its parent company, worried guardians, and teen users themselves.

If you want to talk to someone or are experiencing suicidal thoughts, Crisis Text Line provides free, confidential support 24/7. Text CRISIS to 741741 to be connected to a crisis counselor. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email [email protected]. You can also call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources.
