Snapchat is introducing its first set of parental controls today, after announcing in October last year that it was developing tools that would help parents better understand how their teens were using the social networking app. The update follows the launch of similar parental controls in other apps favored by teens, including Instagram, TikTok and YouTube.
To use the new feature, known as Family Center, parents or guardians must install the Snapchat app on their own device and link their account to their teen's through an opt-in invite process.
Once configured, parents can see which accounts the teen has had conversations with in the app over the past seven days, without being able to view the content of those messages. They can also review the teen's friends list and report possible abuse to Snap's Trust & Safety team for review. These are essentially the same features that londonbusinessblog.com reported were in development earlier this year.
Parents can access the new controls either through the app's profile settings or by searching for "family center" or related terms using the app's search function.
Snap notes that the feature is only available to parents and teens ages 13 to 18, as the app is not intended for use by younger users. The launch follows increased pressure on social networks to better protect their underage users from harm, both in the US and abroad. This has led major tech companies to introduce parental controls and other safety features to comply with EU laws and expected US regulations.
Other social networks have introduced more extensive parental controls than what was available at the launch of Snapchat's Family Center. For example, TikTok allows parents to set screen time limits, enable a more restricted mode for younger users, disable search, set accounts to private, restrict posts, and control who can see the teen's likes and who can comment on their posts, among other things. Instagram also offers parental controls, including support for parent-set time limits.
However, Snap argues that its app doesn't require as many parental controls because of how it was designed in the first place.
By default, teens must be mutual friends in order to communicate, which reduces the risk of them receiving unwanted messages from potential predators. Friend lists are private, and teens are not allowed to have public profiles. In addition, teen users only appear as "Suggested Friends" or in search results when they have mutual friends with the user searching, which further limits their exposure.
That said, parental concerns about Snapchat aren’t limited to the fear of unwanted contact between teens and potentially dangerous adults.
At its core, Snapchat's disappearing messages feature makes it easier for teens to engage in bullying, abuse, and other inappropriate behavior, such as sexting. As a result, Snap has been the subject of multiple lawsuits from grieving parents whose teens died by suicide. They claim Snap's platform helped facilitate online bullying, which led the company to review its policies and restrict access to its developer tools. It also shut down friend-finding apps that had encouraged users to share their personal information with strangers, a common way for child predators to reach younger, vulnerable Snapchat users.
Sexting has also been at the center of multiple lawsuits. Most recently, a teenage girl filed a class-action lawsuit against Snap, claiming its designers have done nothing to protect girls who use its service from sexual exploitation.
With the new Family Center, Snap is giving parents some insight into their teen's use of the app, but not enough to completely prevent abuse or exploitation, because the company also prioritizes teen privacy.
For parents, being able to view a teen's friends list doesn't necessarily help them understand whether those contacts are safe. Parents don't always know the names of all their teenagers' classmates and acquaintances, only their closest friends. Snap also doesn't allow parents to stop their teens from sending photos privately to friends, nor has it implemented a feature similar to Apple's iMessage technology that automatically steps in to alert parents when sexually explicit images are sent in chats. (It does, however, now use CSAI matching technology to remove known abuse material.)
The Family Center also offers parents no control over whether or how their teen can interact with the app's Spotlight feature, a TikTok-like feed of short videos. Parents can't control whether or not their teen's live location is shared on the in-app Snap Map, nor can they control who can add their teen as a friend.
The company's Discover section is also untouched by the new parental controls.
At a congressional hearing last year, Snap was asked to defend why some of the content in the Discover section was clearly adult-oriented, such as invitations to sexualized video games, articles about going to bars, articles about porn, and other items that don't square with the app's 13+ age rating. The new Family Center provides no control over this part of the app, which contains a significant amount of clickbait content.
We've found that this section consistently features intentionally shocking photos and medical images, similar to the cheap clickbait articles and ads found all over the web.
At the time of writing, a quick scroll through Discover revealed several articles intended to intimidate or alarm: at least three contained photos of giant spiders, one was about a parent who killed her children, one covered Japan's so-called suicide forest, and another was about people dying at theme parks. There was also a story about a teacher who was caught "cheating with" (the story's words) a 12-year-old student, a truly disgusting way to headline an account of child sexual abuse. And there were multiple pictures of rare medical conditions that should probably be left to a doctor, not shown to younger teens.
Snap says a future update will introduce “content controls” for parents and the ability for teens to notify their parents when they report an account or piece of content to Snap’s security team.
"While we closely monitor and control both our content and entertainment platforms, not allowing unmonitored content to reach large audiences on Snapchat, we know that every family has different views on what content is appropriate for their teens, and we want to give them the option to make those personal decisions," a Snap spokesperson said of the upcoming parental control features.
The company added that it would continue to add other controls after it received more feedback from parents and teens.