Taylor Swift Reportedly Considering Legal Action Over Viral Pornographic AI Deepfakes

'These fake AI-generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge.'

Taylor Swift is reportedly "furious" and considering legal action over pornographic AI deepfakes of her that have gone viral this week.


The highly realistic fake images depict the pop star in Kansas City Chiefs memorabilia and body paint.

According to a report from The Daily Mail, the images originated on Celeb Jihad, a website that hosts leaked and faked nude images of celebrities. From there, they spread across Facebook, X, Reddit, and other social media platforms.

After being alerted by The Daily Mail to what was happening, Reddit began removing some of the offending accounts.

A Meta spokesperson also told the paper that the images violate its terms of service and are being removed.

"This content violates our policies and we’re removing it from our platforms and taking action against accounts that posted it," the spokesperson said. "We’re continuing to monitor and if we identify any additional violating content we’ll remove it and take appropriate action."

An unnamed source "close to Swift" spoke to the outlet on Thursday, saying, "Whether or not legal action will be taken is being decided, but there is one thing that is clear: these fake AI-generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge."

"The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with," the source continued. "These images must be removed from everywhere they exist and should not be promoted by anyone."

The source added, "Taylor's circle of family and friends are furious, as are her fans obviously. They have the right to be, and every woman should be. The door needs to be shut on this. Legislation needs to be passed to prevent this and laws must be enacted."

Swift has not publicly commented on the images.

The Daily Mail noted, "Nonconsensual deepfake pornography is illegal in Texas, Minnesota, New York, Virginia, Hawaii, and Georgia. In Illinois and California, victims can sue the creators of the pornography in court for defamation."

The American Civil Liberties Union, the Electronic Frontier Foundation, and the Media Coalition have argued that laws limiting deepfakes may violate the First Amendment.