The latest safety product updates address the risks youth face when connecting with strangers, regulate access to content, and offer educational resources for parents.
Snap Inc. has unveiled a suite of new safeguards to further protect 13-to-17-year-old Snapchatters against potential online harms. The safeguards, which will begin rolling out in the coming weeks, are designed to protect youth from being contacted by people they may not know in real life, provide a more age-appropriate viewing experience on Snapchat’s content platforms, and enable Snap to more effectively remove accounts that market and promote age-inappropriate content, using a new strike system and new detection technologies. Together, these measures strengthen enforcement to guide youth towards safer experiences while limiting unwanted contact from suspicious accounts.
Real relationships that are safer to navigate
Moving forward, before youth can begin communicating with someone on Snapchat, the two must already be Snapchat friends or phone book contacts – the kind of connection typical of a close friend, family member, or other trusted person. It is now also harder for a user to show up as a suggested friend to users outside their friend network.
As part of the rollout, Snapchatters can expect to see the following upgrades to their user experience:
- In-app warnings: Youth will now receive a pop-up warning if they are contacted by someone with whom they share no mutual friends and who is not in their contacts. The message urges users to consider carefully whether they want to be in contact with this person, and not to connect unless it is someone they trust.
- Stronger friending protections: Snapchat already requires a 13-to-17-year-old to have several friends in common with another user before they can show up in Search results. Snap is now raising this bar, requiring more friends in common based on the number of friends a Snapchatter has. The aim is to further reduce the ability of strangers to connect with youth.
New strike system to crack down on accounts promoting age-inappropriate content
Across Snapchat’s two main content platforms – Stories and Spotlight – users can find public Stories published by vetted media organisations, verified creators, and Snapchatters. Snapchat applies additional content moderation on these public content platforms to prevent violating content from reaching a large audience. Snapchat recently launched a new Strike System to help remove accounts that market and promote age-inappropriate content. Under this system, any age-inappropriate content that is detected proactively or reported will be removed immediately, and accounts that repeatedly attempt to circumvent the rules will be banned.
“From the start, Snapchat was designed to be different, built as an antidote to traditional social platforms – prioritising our community’s safety, privacy and wellbeing, especially our younger audiences. A huge share of our audience comes from the GCC region, and we continue to work on creating a better online ecosystem that also offers safe avenues for young Snapchatters. In the coming months, we will build on these upgrades by launching new and improved resources for families to help encourage safer habits and practices for using Snapchat,” said Georg Wolfart, Head of Public Policy at Snap Inc.
Beyond these developments, Snap has released new in-app educational content that will be featured on Snapchat’s Stories platform and surfaced when Snapchatters search for relevant terms. Snap will also release an updated safety guide for parents at parents.snapchat.com and a new YouTube explainer series that walks through the suggested protections for youth.