Jaipur, September 2023.
Creating a safe and positive experience for Snapchatters is a top priority, and the platform is always trying to do more to help keep the growing community safe. With that goal in mind, today Snap Inc. is announcing new features to further protect 13-to-17-year-olds from potential online risks. These features, which will begin to roll out in the coming weeks, are designed to 1) protect teens from being contacted by people they may not know in real life; 2) provide a more age-appropriate viewing experience on the content platform; and 3) enable more effective removal of accounts that may be trying to market and promote age-inappropriate content through a new strike system and new detection technologies.
As a messaging platform for real friends, the goal is to help Snapchatters communicate with people that matter to them, and to ensure that the content they view on the app is informative, fun and age-appropriate.
In addition, Snap is releasing new resources for families, including an updated parents' guide at parents.snapchat.com that covers our protections for teens, our tools for parents, and a new YouTube explainer series.
Safer Contact
Most teens use Snapchat to chat with close friends using pictures and text, like text messaging or phone calls. When a teen becomes friends with someone on Snapchat, the platform wants to be confident it is someone they know in real life – such as a friend, family member or other trusted person. To help encourage this, it is already required for teens to be Snapchat friends or phone book contacts with another user before they can begin communicating.
Building on this, Snap is rolling out additional protections for 13-to-17-year-olds, designed to further limit unwanted interactions or potentially risky contact, including:
- In-App Warnings: Snap is launching a new feature that sends a pop-up warning to a teen if someone tries to add them as a friend when they don't share mutual contacts or the person isn't in their contacts. This message will urge the teen to carefully consider whether they want to be in contact with this person, and not to connect with them if it isn't someone they trust.
- Stronger Friending Protections: A 13-to-17-year-old is already required to have several friends in common with another user before they can show up in Search results. This bar is being raised to require a greater number of friends in common, based on the number of friends a Snapchatter has, with the goal of further reducing the ability for teens to connect with people they may not already be friends with.
Across Snapchat, illegal and harmful content such as sexual exploitation, pornography, violence, self-harm, misinformation, and much more is prohibited. To enforce these policies and take quick action to help protect the community, Snap has long used a zero-tolerance approach for users who try to commit severe harms. If accounts are found engaging in this activity, they are immediately banned, with measures to prevent the account holder from getting back on Snapchat, and the account holder may be reported to law enforcement.
New Strike System for Accounts Promoting Age-Inappropriate Content
While Snapchat is most commonly used for communicating with friends, it also offers two content platforms, Stories and Spotlight, where Snapchatters can find public Stories published by vetted media organizations, verified creators, and other Snapchatters. On these public content platforms, additional content moderation is applied to prevent violating content from reaching a large audience.
To help remove accounts that market and promote age-inappropriate content, we recently launched a new Strike System. Under this system, we immediately remove inappropriate content that we proactively detect or that gets reported. If an account repeatedly tries to circumvent these rules, it will be banned. You can learn more about the Strike System here [LINK].
Education About Common Online Risks
As Snapchat continues to bolster its defenses against these risks, the platform also wants to use its reach with young people to help them spot likely signs of this type of activity, and to know what to do if they encounter it. Starting today, the community will have access to new in-app content developed in partnership with Young Leaders for Active Citizenship (YLAC), an important local resource. This collaboration aims to create a more positive, responsible, and secure online environment for young people by addressing critical topics like mental health, responsible sharing, and online safety. This content will be featured on the Stories platform and surfaced to Snapchatters through relevant Search terms or keywords.
Uthara Ganesh, Head of Public Policy, South Asia, Snap Inc., added, "Snapchat is designed for having fun and communicating openly with your closest friends. At Snap, nothing is more important than the safety of our users, and we believe that design plays a powerful role in ensuring this. Our latest features are thoughtful in-app features designed to empower teens to make smarter choices and talk openly about staying safe online. We're committed to making sure Snapchat is a place where you can be creative and stay safe, and above all, the safety and well-being of our community in India, which includes over 200 million users, is our top priority."
Shefali Shah, Movie & Theatre Artist and Celebrity Mom, said, "As a mother and an actor, I understand the power and impact of our digital choices. It's crucial that we guide our teens to make thoughtful decisions while navigating the online world. Snapchat's commitment to safety by design is commendable, and its new safeguards empower us as parents to strike that balance between trust and responsible oversight. Having open conversations with our teens about the tools available to help keep them safe is critical."
Snapchat's commitment to making the platform safer is ongoing, and Snap will continue to build on these improvements in the coming months with additional protections for teen Snapchatters and support for parents.