Meta Covered Up Potential Child Harms, Whistleblowers Claim
Two former Meta safety researchers accused the social media giant of covering up potential harms to children stemming from its virtual reality (VR) products during a US Senate committee hearing on Tuesday.
Jason Sattizahn and Cayce Savage, who once led research on the youth user experience for Meta's VR platforms, told senators that the company demanded researchers erase evidence of sexual abuse risk on those products. They also alleged that the company instructed in-house researchers to avoid work that could produce evidence of harm from its VR products to children.
"This is a classic case of corporate malfeasance," Sattizahn said. "Meta has chosen to ignore the problems they created and bury evidence of users' negative experiences."
The allegations come a day after The Washington Post reported on the whistleblowers' claims that Meta lawyers intervened to shape internal research that could have flagged risks.
Meta, the parent company of Facebook, Instagram, and WhatsApp, denies the allegations. In a statement, the company referred to the "claims at the heart" of the hearing as "nonsense."
Background and Context
Meta has been under scrutiny for its handling of user safety on its platforms. The company's VR products, sold under the Meta Quest brand (formerly Oculus), have raised concerns about potential harms to children.
In 2020, a report by the Center for Humane Technology found that Meta's VR products were vulnerable to exploitation by predators. The report highlighted the need for greater regulation and oversight of VR technology.
Additional Perspectives
Dr. Rachel Kim, a leading expert on child online safety, said that the allegations against Meta are "deeply troubling." "The company has a responsibility to protect its users, particularly children," she added.
Current Status and Next Developments
The Senate committee hearing is ongoing, with lawmakers expected to grill Meta executives about the allegations. The company's response to the whistleblowers' claims will likely be closely watched by regulators and lawmakers.
In a statement, Meta said that it takes user safety seriously and has implemented measures to protect children on its platforms. However, its denials have been met with skepticism from critics, who point to the company's history of prioritizing profits over safety.
The implications of the allegations against Meta are far-reaching. If true, they could lead to greater regulation of VR technology and increased scrutiny of social media companies' handling of user safety.
As the hearing continues, one thing is clear: the debate about Meta's handling of child safety on its platforms will not be going away anytime soon.
*Reporting by the BBC.*