Instagram won’t let random adults message teens anymore

The popular app is finally taking the safety of its youngest users seriously.

Mother filming teenage daughters dancing at home. FG Trade/E+/Getty Images

Instagram is implementing new policies to keep its younger users safe, including making it more difficult for adults to message teenagers who don’t follow them. And if an adult account has been sending large numbers of message requests to people under 18, Instagram will show those users a “safety notice” with options to block, report, or restrict the adult.

Beyond messaging, Instagram announced that it’s also working to improve its understanding of its users’ real ages using AI and machine learning. The company officially requires users to be at least 13 years old to sign up for an Instagram account, but, until now, it has had no reliable way to tell whether a child is simply lying about their birth date.

Today’s policy update is by far the most comprehensive take on teen safety we’ve seen from Instagram in a very long time — perhaps ever. We can probably thank TikTok for that.

All about protecting the youths — Today’s policy update has the potential to genuinely impact user safety across Instagram.

The new restrictions on Direct Messages between adults and teens will be the most noticeable change from a user perspective. Adults attempting to contact teens who don’t follow them will be presented with a notification that DM’ing them isn’t an option. Teens will also start seeing safety notifications to encourage them to be cautious in conversations with adults they do follow — especially if that adult has been sending DMs to lots of teens.

Instagram will also now suggest that new users under the age of 18 make their accounts private by default. If a teen does choose to make their account public, they’ll later receive a notification highlighting “the benefits of a private account.”

It’s not mentioned much in today’s blog post, but Instagram says it’s also in the early stages of investing in new artificial intelligence and machine learning technology to better understand a user’s real age. Such tech, if successful, could be a game-changer for social media safety.

Thank you, TikTok — Kids love Instagram, and Instagram loves them, too. When it comes to protecting its youngest users, though, Instagram has until now mostly just shrugged its metaphorical shoulders. The app’s most recent updates — making its Shopping tab more prominent, adding short-form video — have been profit-focused.

Other social media apps, especially teen-favorite TikTok, have recently positioned themselves as leaders in privacy and safety, especially for younger users. Earlier this year, TikTok almost completely cut off its youngest users from the public, in the biggest child-focused privacy clampdown we’ve seen yet. TikTok’s proactive efforts against misinformation have been hugely successful, and last month the app even added body-inclusive resources and warnings.

The uptick in concern for teen safety across social media is long overdue. Let’s hope the trend continues.