Meta Tightens Teen Safety Measures, Removes Over 600,000 Inappropriate Accounts
Meta has announced a new wave of teen safety initiatives on its platforms, alongside a major crackdown on accounts linked to inappropriate interactions with minors. The tech giant, which owns Instagram and Facebook, revealed that it has taken down more than 635,000 accounts involved in sexualised or predatory behaviour toward children.
Of the accounts removed, 135,000 had left explicit comments, while an additional 500,000 were flagged for inappropriate interactions with profiles believed to be run by or mimicking children under 13. Meta shared these details in a blog post on Wednesday, underscoring its renewed focus on teen safety amid growing public and legal pressure.
The company also unveiled new protective tools designed to give teenagers more control over their online interactions. These include safety notices when a teen receives a direct message and a simplified option to block and report a user in a single tap. According to Meta, teens have already used these features to block more than one million accounts and to report another million after receiving safety alerts.
The announcement comes as social media platforms face mounting scrutiny over their role in safeguarding young users. Lawmakers and child safety advocates have raised alarms about the mental health impacts of social media and the rising trend of online predators who coerce or manipulate teens into sharing explicit images – a practice that can spiral into blackmail or public exposure.
In a further step, Meta is expanding its use of artificial intelligence to help detect users who misrepresent their age. Instagram is officially restricted to users aged 13 and older, but the company has been trialling AI tools to flag and reclassify suspected underage users. Once flagged, these accounts are automatically placed under stricter controls – including private-by-default settings and limits on who can message them.
This push for tighter safeguards comes as Meta battles lawsuits from more than 40 U.S. states. The suits accuse the company of knowingly designing addictive features that contribute to youth mental health struggles and of failing to adequately protect children on its platforms.
Meta says it remains committed to evolving its tools to keep teens safer online, particularly as its platforms continue to attract millions of young users worldwide.