Reports from child-safety and media investigations say xAI’s Grok image tools have been used to generate sexual imagery that appears to include minors, with the UK-based Internet Watch Foundation saying its analysts found such material discussed on a dark-web forum and assessed it as child sexual abuse material under UK law. Separately, a WIRED review says Grok’s official site has hosted user-generated outputs that are more explicit than what is typically visible on X, including violent sexual images and videos as well as content that appears to depict minors. In the latest political fallout in the UK, the House of Commons women and equalities committee decided to stop using X after an outcry over AI-altered images, adding pressure for stronger safeguards around generative image tools on major platforms.
Highlights:
- Dark-web claims: The Internet Watch Foundation said forum users claimed they used “Grok Imagine” to create sexualised and topless images of girls described as aged 11–13, and IWF analysts assessed the material as illegal CSAM under UK law.
- UK political response: The Commons women and equalities committee’s decision to stop posting on X followed a surge of AI-edited images of women and children with clothing digitally removed, linked to Grok’s tooling, according to The Guardian’s reporting.
- Product feature trigger: Firstpost reported that Grok rolled out an “edit image” button in December, which allows users to alter pictures via prompts and has been used to generate nonconsensual undressed images.
- Platform mismatch: WIRED reported that Grok’s website and app can generate and display adult sexual content that is far more graphic than what appears on X, raising questions about how policy enforcement differs across xAI/X surfaces.
- Public scrutiny: Online discussion amplified the issue after posts claimed that Grok-generated undressed images were appearing on X at very high volume, reportedly thousands per hour, illustrating how quickly such safety failures become visible at internet scale.
Perspectives:
- Internet Watch Foundation (IWF): The UK-based watchdog warned that alleged use of Grok to generate sexualised images of children risks pushing such material toward the mainstream, and said its analysts deemed examples it found to be illegal CSAM under UK law. (The Guardian)
- WIRED (reporting): WIRED said its review of outputs hosted on Grok’s official website found extremely explicit and sometimes violent sexual imagery, including content that appears to depict minors, and noted the material was more graphic than what typically appears on X. (WIRED)
- UK House of Commons women and equalities committee: The cross-party committee decided to stop using X after the platform was flooded with AI-altered images that digitally removed clothing from women and children, according to The Guardian. (The Guardian)
- Firstpost (analysis): Firstpost framed the spread of nonconsensual “digitally undressed” images as harmful to women and raised questions about what actions xAI/X took amid backlash following new image-editing functionality. (Firstpost)
Sources:
- AI tool Grok used to create child sexual abuse imagery, watchdog says - theguardian.com
- Grok AI Generated Thousands of Undressed Images Per Hour on X - reddit.com
- Grok Is Generating Sexual Content Far More Graphic Than What's on X - wired.com
- Grok Produces Graphic Sexual Content, WIRED Reports - lighthome.in
- Elon Musk's Grok AI appears to have made child sexual imagery, says charity - bbc.com
- Grok's website and app are being used to produce extremely graphic, sometimes violent, sexual imagery of adults that is far more explicit than images on X (Wired) - techmeme.com
- How Musk’s Grok is ‘dehumanising’ women by digitally undressing their images on X - firstpost.com