Instagram and Facebook Safety Measures for Teens: Expanding Protection Against Unwanted Contact
In the digital age, social media platforms have become an integral part of our daily lives. They offer a space for communication, entertainment, and connection. However, with the increasing use of social media by younger generations, concerns regarding their safety and well-being have grown. In response to these concerns, Meta, the parent company of Facebook and Instagram, has announced new safety measures aimed at protecting teens from potentially unwanted contact.
In 2021, Meta introduced a safety feature on Instagram that prevented adults from messaging users under 18 who did not follow them. This year, the company is expanding that rule to further safeguard younger teens: by default, users under 16 (or under 18, depending on the country) will no longer be able to receive direct messages (DMs) from anyone they do not follow, even fellow teens.
This new default applies to both Instagram and Messenger. On Messenger, young users will only be able to receive messages from their Facebook friends or people in their phone contacts. Teens under parental supervision will need a guardian’s approval to change this setting. Enforcement depends on a user’s self-reported age and on Meta’s age-prediction technology.
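To make the logic of the new default concrete, here is a minimal, purely illustrative sketch of how such a message-permission check might be modeled. Every name, field, and threshold below is an assumption for illustration; Meta has not published its implementation, and real enforcement also involves its age-prediction systems.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    """Hypothetical user record; fields are assumptions, not Meta's schema."""
    user_id: str
    declared_age: int                   # self-reported; real systems also predict age
    minor_cutoff: int = 16              # 16 by default, 18 in some countries
    following: set = field(default_factory=set)  # accounts this user follows
    contacts: set = field(default_factory=set)   # Facebook friends / phone contacts
    supervised: bool = False            # under parental supervision
    allow_unknown_dms: bool = False     # the new default-off setting

def can_receive_dm(recipient: User, sender: User, platform: str) -> bool:
    """May `sender` message `recipient` on the given platform? (Sketch only.)"""
    # Users at or above the cutoff keep the platform's ordinary rules (simplified).
    if recipient.declared_age >= recipient.minor_cutoff:
        return True
    if platform == "messenger":
        # Young Messenger users: only friends or phone contacts may message them.
        return sender.user_id in recipient.contacts
    # Young Instagram users: only accounts they follow, unless the default changed.
    return sender.user_id in recipient.following or recipient.allow_unknown_dms

def change_dm_setting(user: User, allow: bool, guardian_approved: bool) -> bool:
    """Supervised teens need a guardian's approval to loosen the default."""
    if user.supervised and not guardian_approved:
        return False
    user.allow_unknown_dms = allow
    return True
```

In this sketch, messages from unknown senders are blocked for under-16 users by default, and `change_dm_setting` models the guardian-approval requirement for supervised teens.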
Meta’s primary objective is to ensure that teens have safe, age-appropriate experiences on its apps. Earlier this month, the company announced that it will start hiding content related to self-harm, graphic violence, eating disorders, and other harmful topics from teens on Instagram and Facebook. Users under 16 will not see such content in their feeds and stories, even if it is shared by accounts they follow. Meta also recently introduced a mindfulness feature that sends “nighttime nudges” to teens under 18, encouraging them to close the app and go to bed if they have been scrolling for more than 10 minutes.
These changes come in the wake of lawsuits and complaints over Meta’s handling of younger users and their data. An unsealed lawsuit filed by 33 states accuses the company of actively targeting children under 13 to use its apps and websites, and of continuing to harvest their data even after it became aware of their ages. A Wall Street Journal report also revealed that Instagram served “risqué footage of children as well as overtly sexual adult videos” to accounts that follow teenage influencers. In December 2023, the state of New Mexico sued Meta, alleging that Facebook and Instagram algorithms recommended sexual content to minors. A more recent Wall Street Journal report, citing employee estimates, found that 100,000 child users were harassed daily on Facebook and Instagram, underscoring the need for stricter measures on its platforms.
Meta’s new safety measures are a step in the right direction for the safety and well-being of its younger user base. However, no system is foolproof, and users must remain vigilant and report any inappropriate content or behavior they encounter. Parents and guardians also play a crucial role in monitoring their children’s online activities and educating them about safe online practices. By working together, we can create a safer and more enjoyable social media experience for all users, especially the younger generation.
In conclusion, Meta’s new safety measures for teens on Instagram and Facebook are a significant step towards protecting them from potentially unwanted contact and harmful content. These measures, along with continued vigilance and education, can help ensure that teens have safe, age-appropriate experiences on these platforms. It is crucial that we continue to prioritize the safety and well-being of our younger users in the digital age.