Instagram Becomes Safer for Indian Teens as Meta Rolls Out New DM and Block Features
Meta is stepping up its efforts to make Instagram a safer space for teenagers, especially in key markets like India. With young users spending more time online, the company has introduced a series of updates aimed at protecting them from scams, unwanted messages, and inappropriate content, starting with a stronger focus on Direct Messages (DMs).
Safety Tips Before Chatting
In a major change, Instagram will now show safety prompts before a teen begins a chat, even if both users follow each other. These pop-ups will remind the teen to carefully check the account profile of the person they're messaging and to avoid sharing personal details if something feels off. The goal is to create a moment of pause and encourage thoughtful interaction, helping teens recognize potential red flags early on.
See When an Account Was Created
To help users identify fake or suspicious accounts more easily, Instagram will now display the month and year an account was created at the top of the chat. This small but powerful addition lets teens spot freshly made or possibly fraudulent profiles, which scammers often use to target vulnerable users.
One-Tap Block and Report Option
Blocking and reporting bad behavior just got simpler. Instagram has combined both actions into a single "Block and Report" button within the chat. Previously, users had to take these steps separately, which may have discouraged them from acting. Now, if a teen experiences anything uncomfortable or threatening, they can cut off contact and flag the account in one tap.
Stronger Controls for Adult-Managed Child Accounts
Meta is also extending protections to adult-run accounts for children under 13, such as those managed by parents or talent agents. These accounts will now come with Instagram’s highest safety settings enabled by default, including:
- Stricter message controls that limit who can contact the account
- Hidden Words filters that block offensive or harmful comments
- Safety alerts displayed prominently at the top of the feed
Meta clarified that while adults can operate these accounts on behalf of children, if a child is found managing such an account themselves, the account will be promptly deleted.
A Big Step Toward Safer Social Media in India
India is one of Instagram’s largest and fastest-growing user bases, especially among teens and young creators. With these updates, Meta aims to create a more age-appropriate and secure space for young users. These changes not only protect teens from online risks but also give parents more confidence in allowing their children to use the platform.
By embedding safety into the core experience, Meta is signaling that online protection for minors is no longer optional; it's essential.