Pornographic AI-generated images of pop sensation Taylor Swift trended on X on Thursday, outraging many of her fans.
The images depicted Ms. Swift in various sexualized poses wearing the colors and gear of the Kansas City Chiefs. Many also featured fictional characters.
The pictures were originally uploaded to Celeb Jihad, which features leaked celebrity nude photos and deepfake pornography, before they were posted on X.
Swifties raged online, with many implying the images were illegal.
“How is this not considered sexual assault? I cannot be the only one who is finding this weird and uncomfortable,” one user wrote on X. “We are talking about the body/face of a woman being used for something she probably would never allow/feel comfortable. How are there no regulations or laws preventing this?”
Generating deepfake pornography without the subject’s consent is illegal in Georgia, Hawaii, Minnesota, New York and Texas. However, without a clearly identified creator, prosecuting the user or users who generated the images of Ms. Swift could prove difficult.
The images posted on X pose a significant problem for owner Elon Musk, who is trying to convince advertisers and content creators that the site is safe.
X’s policy on deepfakes, synthetic media that swap one person’s likeness onto another’s, remains unclear. The platform does have a “synthetic and manipulated media” policy, but that policy could arguably shield the AI-generated images from removal, since they may fall under its “animations, illustrations and cartoons” category.
• Vaughn Cockayne can be reached at vcockayne@washingtontimes.com.