Elon Musk's AI tool Grok will no longer be able to edit photos of real people to depict them in revealing clothing in jurisdictions where such alterations are illegal, according to an announcement on X. The decision follows widespread concern regarding the potential for sexualized AI deepfakes generated by the platform.
X, the social media platform owned by Elon Musk, said it has put technological measures in place to prevent the Grok account from editing images of real people to depict them in revealing clothing. The company launched Grok in 2023.
The UK government responded to the change, calling it a "vindication" of its call for X to control Grok. Ofcom, the UK's communications regulator, described the move as a "welcome development" but emphasized that its investigation into whether the platform violated UK laws "remains ongoing." Ofcom said it is working to progress the investigation, establish what went wrong, and determine what is being done to fix it.
Technology Secretary Liz Kendall welcomed the move but said she would "expect the facts to be fully and robustly established by Ofcom's ongoing investigation."
Campaigners and victims argue that the change comes too late to undo the harm already caused by the technology. Journalist and campaigner Jess Davies is among those who have voiced concerns.
The technology behind Grok, like many AI image-editing tools, relies on deep learning models trained on vast datasets of images. These models can be used to alter existing images in various ways, including adding or removing clothing. Concern arises when these capabilities are used to create non-consensual, sexually explicit images, often referred to as deepfakes.
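To illustrate the kind of safeguard described above, the sketch below shows a hypothetical pre-generation policy gate that refuses a request before any image is produced. Everything here is invented for illustration: the function names, the flagged terms, and the jurisdiction list are assumptions, not X's actual system, which has not been made public.

```python
# Illustrative sketch only: a hypothetical policy gate of the kind a platform
# might run before fulfilling an image-edit request. All names, labels, and
# term lists are invented for illustration; this is not X's actual system.
from dataclasses import dataclass

@dataclass
class EditRequest:
    prompt: str                 # the user's edit instruction
    depicts_real_person: bool   # output of a (hypothetical) upstream face-match step
    jurisdiction: str           # requester's legal jurisdiction (placeholder codes)

# Jurisdictions where such alterations are illegal (placeholder values).
RESTRICTED_JURISDICTIONS = {"UK"}

# Phrases a (hypothetical) classifier might flag as sexualised-edit intent.
FLAGGED_TERMS = ("undress", "revealing clothing", "lingerie")

def is_edit_allowed(req: EditRequest) -> bool:
    """Return False if the request asks to sexualise a real person
    in a jurisdiction where that alteration is illegal."""
    sexualised = any(term in req.prompt.lower() for term in FLAGGED_TERMS)
    if req.depicts_real_person and sexualised:
        return req.jurisdiction not in RESTRICTED_JURISDICTIONS
    return True

# Example: this request would be refused before any image is generated.
req = EditRequest("put her in revealing clothing", depicts_real_person=True, jurisdiction="UK")
print(is_edit_allowed(req))  # False
```

In practice, a real system would rely on learned classifiers rather than keyword lists, but the design point stands: the check runs on the request, ahead of generation, rather than filtering images after the fact.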
The industry impact of this decision remains to be seen. Other AI image editing platforms may face increased scrutiny and pressure to implement similar safeguards. The incident highlights the ethical challenges associated with rapidly advancing AI technology and the need for proactive measures to prevent misuse.