Roblox to Ban Messaging for Under-13s in New Child Safety Measures

Roblox, the popular online gaming platform, has announced significant updates to improve child safety, including a ban on under-13s sending direct messages without parental approval.

The new policy, set to roll out from Monday and fully implemented by March 2025, will block younger users from sending private messages by default. Parents or guardians can override this restriction by verifying their identity through a government-issued ID or credit card.

Additional parental controls will allow caregivers to monitor their child’s online friends, set playtime limits, and manage accounts directly. Public in-game conversations will remain open to all users, meaning younger players can still communicate in chats visible to other players.

Roblox, widely used by children aged eight to 12 in the UK according to Ofcom research, has faced calls to enhance safety measures on its platform. The company’s chief safety officer, Matt Kaufman, stated that safety remains a top priority, with thousands of employees dedicated to safeguarding features.

“As our platform has grown in scale, we have always recognised that our approach to safety must evolve with it,” Kaufman said, urging parents to ensure accurate age settings during account creation.

The National Society for the Prevention of Cruelty to Children (NSPCC) welcomed the changes, describing them as “a positive step in the right direction.” However, the charity highlighted the need for robust age verification systems to ensure effective protection.

New Maturity Guidelines Introduced

Roblox is also revising how it categorises game content. It will replace age-based recommendations with “content labels,” enabling parents to decide what their child can access based on maturity rather than age alone.

Content labels will range from “minimal,” which includes mild violence or fear, to “restricted,” which features more mature themes like strong violence or realistic depictions of blood.

By default, users under nine will only be able to access “minimal” or “mild” content, while users will need to verify that they are aged 17 or older to play “restricted” games.

These changes follow Roblox’s recent ban on under-13s from “social hangouts” and new requirements for developers to categorise their games by suitability for younger audiences.

Compliance with Online Safety Act

The updates come as platforms popular with children prepare to meet the UK’s forthcoming Online Safety Act regulations. Ofcom, the law’s enforcement body, has warned companies of penalties for failing to protect young users.

Roblox’s proactive measures signal a broader shift in the gaming industry toward prioritising child safety in an increasingly digital world.
