Shocking pretty much no one, a new report claims the explicit AI-generated images of Taylor Swift emerged from the internet’s most infamous cesspool: 4chan.
In findings shared with Rolling Stone, the research firm Graphika said it traced the images to a particular message board community on 4chan that essentially made a game out of devising prompts for AI image generators that would skirt safeguards and produce graphic images of famous people. (The tech publication 404 Media similarly traced the images back to 4chan in a story published last month.)
And though the fake images of Swift wound up garnering the most attention, researchers said she was neither the only target nor the one the message board's denizens went after most often.
“While viral pornographic pictures of Taylor Swift have brought mainstream attention to the issue of AI-generated non-consensual intimate images, she is far from the only victim,” Graphika senior analyst Cristina López G. said in a statement. “In the 4chan community where these images originated, she isn’t even the most frequently targeted public figure. This shows that anyone can be targeted in this way, from global celebrities to school children.”
On the message board, per The New York Times, users sought and shared “tips and tricks to find new ways to bypass filters.” Particularly realistic images were given votes of approval, while “misses” — like a prompt that led to an AI-generated image of a celebrity in a bathing suit, rather than nude — were bemoaned. (Similar “games” have been used for other purposes on 4chan, as well as on platforms like Telegram and WhatsApp, including crowdsourced attempts to find alleged voter fraud.)
López G. added: “These images originated from a community of people motivated by the ‘challenge’ of circumventing the safeguards of generative AI products, and new restrictions are seen as just another obstacle to ‘defeat.’ It’s important to understand the gamified nature of this malicious activity in order to prevent further abuse at the source.”
The fake, explicit images of Swift first popped up on 4chan back on Jan. 6, but it wasn’t until over a week later that they began to spread on more mainstream platforms like Telegram and X (formerly known as Twitter). The proliferation was so severe that X blocked searches for “Taylor Swift” for several days while it removed the images. Swift’s incensed fans, who helped get many of the fake images taken down through mass reporting, also demanded stronger protections, though, as Rolling Stone previously reported, those won’t come easily.
In the aftermath, SAG-AFTRA condemned the images as “upsetting, harmful, and deeply concerning” and said the “development and dissemination of fake images… without someone’s consent must be made illegal.” White House press secretary Karine Jean-Pierre even addressed the issue, calling the images “alarming,” demanding social media companies take steps to more strongly enforce content moderation policies, and calling on Congress to pass protective legislation.