Meta Covered Up Potential Child Harms, Whistleblowers Claim
Two former Meta safety researchers testified before a US Senate committee on Tuesday, alleging that the social media giant covered up potential harms to children stemming from its virtual reality (VR) products.
Jason Sattizahn and Cayce Savage, who once led research on the youth user experience for Meta's VR platforms, told senators that the company demanded that researchers erase evidence of sexual abuse risk on those products. They also alleged that Meta instructed in-house researchers to avoid work that could produce evidence of harm to children from its VR products.
"This is a classic case of corporate cover-up," Sattizahn said during the hearing. "Meta has chosen to ignore the problems they created and bury evidence of users' negative experiences."
The allegations come after The Washington Post reported on Monday that Meta lawyers intervened to shape internal research that could have flagged risks. Meta, the parent company of Facebook, Instagram, and WhatsApp, denies the allegations, calling them "nonsense" in a statement.
Background and Context
Meta's VR products, such as its Quest headsets (formerly branded Oculus), have been criticized for their potential impact on children's mental health and safety. The company has faced numerous lawsuits and regulatory scrutiny over its handling of user data and content moderation.
The whistleblowers' testimony highlights concerns about the intersection of technology and child safety. "As we increasingly rely on AI-powered technologies to shape our online experiences, it's essential that companies prioritize transparency and accountability," said Dr. Kathryn Montgomery, a leading expert on children's media policy.
Additional Perspectives
Dr. Jean Twenge, a psychologist who has studied the impact of social media on children, noted that Meta's actions are particularly concerning given the company's influence over online discourse. "Meta's VR products have the potential to expose children to explicit content and predators," she said. "It's unacceptable that the company would cover up these risks."
Current Status and Next Developments
The Senate committee is expected to continue its investigation into Meta's handling of child safety concerns. The company faces growing pressure from lawmakers, regulators, and advocacy groups to prioritize transparency and accountability.
As the debate over AI-powered technologies continues, experts emphasize the need for greater scrutiny and regulation. "We must ensure that companies like Meta are held accountable for their actions and prioritize the well-being of children," said Dr. Montgomery.
Meta's Response
In a statement, Meta spokesperson Andy Stone said: "The claims at the heart of this hearing are nonsense. We take child safety seriously and have implemented numerous measures to protect our users."
However, the whistleblowers' testimony raises questions about Meta's commitment to transparency and accountability. As the investigation continues, one thing is clear: the intersection of technology and child safety demands greater scrutiny and regulation.
Sources
Jason Sattizahn, former Meta safety researcher
Cayce Savage, former Meta safety researcher
The Washington Post
Dr. Kathryn Montgomery, leading expert on children's media policy
Dr. Jean Twenge, psychologist and expert on social media's impact on children
*Reporting by BBC.*