In a fresh move to enhance digital safety, Meta has introduced new Instagram teen safety features aimed at protecting users aged 13 to 17.
These latest updates focus primarily on direct messages (DMs), making it easier for teens to spot suspicious accounts and stay safe online.
In an official blog post announcing the update, Meta said the new features are designed to give teen users better control over their inboxes.
Now, Instagram will display the year a message sender joined the platform, giving young users more context before engaging in conversations.
The company has also introduced clearer block and report options within DMs.

These tools let teens block and report potentially harmful accounts in a single step, which Meta says will help prevent scams and unwanted contact.
Meta clarified that these DM safety tools are currently exclusive to Instagram but hinted that similar protections may soon be added to Facebook Messenger. However, another feature – Safety and Location Notices – is being rolled out across both Instagram and Facebook.
Instagram classifies accounts belonging to users aged 13 to 17 as teen accounts.
These profiles already come with several privacy restrictions, such as limited visibility to unknown users and stricter default privacy settings.
By adding layers of visibility and moderation tools, Meta aims to create a safer online space for teens.
The platform has faced criticism in recent years for not doing enough to protect young users, and these features signal a stronger commitment to digital youth safety.