Elon Musk’s AI company xAI was sued by a woman who alleged that the AI chatbot Grok created sexual deepfakes of her by morphing her photos, including photos of her as a child. She claimed in her lawsuit that Grok used AI to “undress, humiliate, and sexually exploit victims,” per a CNN report.
The woman stressed that adequate action was not taken to stop Grok from creating abusive content.
Grok has come under fire, with regulators around the world moving to crack down on the billionaire’s chatbot after it went viral for undressing people in photos at users’ requests.
Targeted people, including women, had their photos morphed to undress them and place them in explicit scenes on the social media platform X. Some users alleged that Grok also undressed children and teenagers.
Grok’s ability to generate deepfake images and partial nudity was restricted across multiple jurisdictions this week.

While Musk initially treated Grok’s “bikini trend” as a joke, he has begun to distance himself as anger among users and authorities worldwide grows.
He further claimed he was not aware of any nude images of underage people generated by Grok.
“I am not aware of any naked underage images generated by Grok. Literally zero. Obviously, Grok does not spontaneously generate images, it does so only according to user requests. When asked to generate images, it will refuse to produce anything illegal, as the operating principle for Grok is to obey the laws of any given country or state. There may be times when adversarial hacking of Grok prompts does something unexpected. If that happens, we fix the bug immediately,” Musk posted on X on January 14.
Published – January 16, 2026 02:58 pm IST