Facebook and Instagram are making it harder for strangers to DM teens
Meta continues to steadily roll out updates for younger users in an attempt to bolster their safety and privacy. On Thursday, the tech company announced some of its most restrictive measures yet—in theory. Teen users, by default, will no longer receive direct messages on Instagram from anyone who isn’t a follower or connection. “Connections,” according to Meta, are people users have “communicated with” in some way, such as by exchanging text messages, making voice or video calls, or accepting message requests. A similar update is also going into effect on Facebook, with messages only allowed from friends and “people they’re connected to through phone contacts, for example.”
Instagram previously restricted anyone over 18 from messaging younger accounts that did not already follow them back. The expanded rules will automatically apply to users worldwide under the age of 16 or 18, depending on their country’s laws, who now also cannot message other teens they are not connected to. Group chats involving teens are likewise limited to their friends or connections, and the same messaging restrictions apply between teens who don’t follow each other.
To disable the setting, teens will need permission from their parents through the social media platforms’ parental supervision tools. Until now, parents and guardians would receive notifications if teens changed their settings but couldn’t do anything about it. According to Meta, affected users will receive an in-app notification about the new changes.
“As with all our parental supervision tools, this new feature is intended to help facilitate offline conversations between parents and their teens, as they navigate their online lives together and decide what’s best for them and their family,” Meta wrote in today’s newsroom post.
It’s worth bearing in mind that these updates assume the parental supervision option is enabled, that users have accurately entered their “declared age” on Instagram or Facebook, and that Meta’s age-prediction technology is working as planned.
The direct message changes arrive following multiple recent updates tailored to Facebook, Instagram, and Messenger’s under-18 crowds. Last week, Meta announced a new “nighttime nudge” feature that will begin politely reminding teens at regular intervals after 10pm to put down their phones and turn in for the night. Earlier this month, the company also revealed plans to roll out automatically restrictive content settings aimed at curtailing young people’s exposure to potentially harmful subject matter, particularly posts and messages related to self-harm, eating disorders, and graphic violence. Unlike today’s new features, however, those content restrictions are mandatory and cannot be circumvented by any account under the age of 18.
Meta’s flurry of social media reforms comes as the company continues to face mounting pressure over its yearslong approach (or lack thereof) to protecting minors. Next week, CEO Mark Zuckerberg will be grilled—alongside the heads of X, Snap, Discord, and TikTok—at a Senate hearing on online child safety. Meanwhile, Meta faces a number of major lawsuits alleging the company ignored safety issues in favor of profiting from young users’ data.
Even so, Meta isn’t done with its policy changes. In today’s update, the company also announced forthcoming restrictions targeting “unwanted and potentially inappropriate” images and messages from young users’ connections and friends. More details on this policy shift will purportedly arrive “later this year.”