Meta Accused of Hiding Evidence of VR Risks to Children
Two former Meta researchers have accused the tech giant of covering up evidence of potential harm to children from its virtual reality (VR) platforms, sparking intense scrutiny during a U.S. Senate hearing.
Jason Sattizahn and Cayce Savage, who previously worked on youth safety research for Meta, alleged that the company pressured its teams to delete findings that highlighted risks such as sexual exploitation on its VR platforms. They also claimed that Meta discouraged internal studies that could expose how its products might endanger young users.
“Meta has chosen to ignore the problems they created and bury evidence of users’ negative experiences,” testified Sattizahn, who worked at the company from 2018 until earlier this year.
The allegations were first reported by The Washington Post, which said Meta’s legal team had influenced internal research to downplay potential risks. Meta, the parent company of Facebook, Instagram, and WhatsApp, has strongly denied the claims, calling them “nonsense” and arguing that the whistleblowers are relying on “selectively leaked internal documents” designed to create a “false narrative.”
According to Meta, its Reality Labs division has approved nearly 180 research studies in recent years on issues including youth well-being and safety.
However, Sattizahn dismissed that defense, calling it a “lie by avoidance” and insisting that internal research was “pruned and manipulated” to protect the company’s image.
During the hearing, Savage recounted uncovering disturbing activity on Roblox, a popular online game for children, while conducting her research. She alleged that organized groups were using the platform to pay children to engage in inappropriate activities using Robux, the game’s in-app currency, which can be converted into real money.
Savage said she warned Meta that Roblox should not be made available on its VR headsets due to safety concerns. Despite her warning, Roblox remains available in Meta’s VR app store.
Roblox has rejected the allegations, stating they were based on “ill-informed and outdated information.” The company emphasized that safety is a top priority and said it has a 24/7 moderation system to swiftly remove harmful content and report offenders to law enforcement.
Meta currently offers parental control tools for its VR devices, including its Horizon Worlds platform. However, Florida Senator Ashley Moody highlighted how difficult the tools are to use, saying that even she struggled to find the controls and had to ask her child for help.
This is not the first time Meta has faced accusations over child safety. In 2021, former employee Frances Haugen revealed internal research showing that Instagram negatively affected teenagers’ mental health, describing the platform as “toxic” for many young users. Meta CEO Mark Zuckerberg has denied prioritizing profits over safety but has repeatedly been questioned by lawmakers over harmful content reaching young audiences.
At a Senate hearing last year, Zuckerberg publicly apologized to families who believe their loved ones were harmed by Meta’s products, saying, “I’m sorry for everything you have all been through.”
The latest allegations come as regulators and lawmakers intensify calls for stricter oversight of tech companies, particularly when it comes to safeguarding children online.