Grok AI Faces Heat Over Explicit Taylor Swift Videos
Khushi Kumari
Grok Imagine, a new AI video tool from Elon Musk’s xAI, generates animated clips from text prompts and offers presets such as “Custom,” “Normal,” “Fun,” and “Spicy”
The “Spicy” preset allows for mature or explicit content, including partial nudity—raising ethical and safety concerns
Reports reveal that even innocuous prompts like “Taylor Swift celebrating Coachella” triggered explicit videos, with Grok producing revealing content even though nudity was never requested
In one instance, the tool generated a six-second topless deepfake of Taylor Swift dancing provocatively, despite Grok’s policies banning non-consensual pornographic content
Although xAI formally prohibits pornographic content of real people, Grok Imagine lacks effective enforcement and age verification, enabling misuse
Experts have criticized the tool’s design, calling it “misogyny by design,” and with over 34 million AI images already generated, regulators are taking notice
The incident deepens concerns over non-consensual AI-generated content depicting celebrities, following prior deepfake controversies involving Taylor Swift
New laws like the Take It Down Act aim to criminalize non-consensual deepfake pornography, putting pressure on AI platforms