Graphic AI images of Taylor Swift are sweeping the internet, showing the singer in a series of explicit acts themed around the Kansas City Chiefs, in the latest example of the disturbing rise in deepfake porn.
DailyMail.com has seen the images in question but will not be publishing them.
They are hosted on Celeb Jihad, one of the many deepfake porn websites in existence that continue to outrun cybercrime investigators.
Swift has been a regular at Chiefs games over the last six months to support her boyfriend, star player Travis Kelce.
The images are the latest to be hosted by Celeb Jihad, which has previously been embroiled in a string of indecency scandals; in 2017, the website was sued by celebrities for posting explicit images that had been hacked from their phones and iCloud accounts.
The abhorrent sites hide in plain sight, seemingly cloaked by proxy IP addresses.
According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press in December, more than 143,000 new deepfake videos were posted online in 2023, more than in every previous year combined.
There are mounting calls for the website to be taken down and the owners criminally investigated.
Swift pictured leaving Nobu restaurant after dining with Brittany Mahomes, wife of Kansas City Chiefs quarterback Patrick Mahomes
Brittany Mahomes, Jason Kelce, and Taylor Swift react during the second half of the AFC Divisional Playoff game between the Kansas City Chiefs and the Buffalo Bills at Highmark Stadium
The images have prompted outrage from Taylor Swift fans across the world
On Thursday morning, X started suspending accounts that had reshared some of the images, but others quickly emerged in their place. There are also reposts of the images on Instagram, Reddit and 4chan.
Swift has yet to comment on the site or the spread of the images, but her loyal and distressed fans have waged war.
‘How is this not considered sexual assault? I cannot be the only one who is finding this weird and uncomfortable?
‘We are talking about the body/face of a woman being used for something she probably would never allow/feel comfortable. How are there no regulations or laws preventing this?’ one fan tweeted.
Nonconsensual deepfake pornography is illegal in Texas, Minnesota, New York, Virginia, Hawaii and Georgia. In Illinois and California, victims can sue the creators of the pornography in court for defamation.
‘I’m gonna need the entirety of the adult Swiftie community to log into Twitter, search the term ‘Taylor Swift AI,’ click the media tab, and report every single AI generated pornographic photo of Taylor that they can see because I’m f***ing done with this BS. Get it together Elon,’ one enraged Swift fan wrote.
The obscene images are themed around Swift’s fandom of the Kansas City Chiefs, which began after she started dating star player Travis Kelce
‘Man, this is so inappropriate,’ another wrote, while a third said: ‘Whoever is making those Taylor Swift AI pictures is going to hell.’
‘Whoever is making this garbage needs to be arrested. What I saw is just absolutely repulsive, and this kind of s**t should be illegal… we NEED to protect women from stuff like this,’ another person added.
Explicit AI-generated material, which overwhelmingly harms women and children, is booming online at an unprecedented rate.
Desperate for solutions, affected families are pushing lawmakers to implement robust safeguards for victims whose images are manipulated using new AI models, or the plethora of apps and websites that openly advertise their services.
Advocates and some legal experts are also calling for federal regulation that can provide uniform protections across the country and send a strong message to current and would-be perpetrators.
The problem of deepfakes isn’t new, but experts say it is getting worse as the technology needed to produce them becomes more available and easier to use.
Biden speaks before signing an executive order to regulate artificial intelligence (AI) in October 2023
Researchers have been sounding the alarm throughout 2023 on the explosion of AI-generated child sexual abuse material that uses depictions of real victims or virtual characters.
In June 2023, the FBI warned it was continuing to receive reports from victims, both minors and adults, whose photos or videos were used to create explicit content that was shared online.
In addition to the states with laws already on the books, other states are considering their own legislation, including New Jersey, where a bill is currently in the works to ban deepfake porn and impose penalties — either jail time, a fine or both — on those who spread it.
President Joe Biden signed an executive order in October that, among other things, called for barring the use of generative AI to produce child sexual abuse material or non-consensual ‘intimate imagery of real individuals.’
The order also directs the federal government to issue guidance on labeling and watermarking AI-generated content to help differentiate authentic content from material made by software.
Some argue for caution, including the American Civil Liberties Union, the Electronic Frontier Foundation and The Media Coalition, an organization that works on behalf of trade groups representing publishers, movie studios and others, saying that careful consideration is needed to avoid proposals that may run afoul of the First Amendment.
‘Some concerns about abusive deepfakes can be addressed under existing cyber harassment laws,’ said Joe Johnson, an attorney for the ACLU of New Jersey.
‘Whether federal or state, there must be substantial conversation and stakeholder input to ensure any bill is not overbroad and addresses the stated problem.’
Mani, whose teenage daughter was targeted with deepfake images at her New Jersey high school, said her daughter has created a website and set up a charity aiming to help AI victims. The two have also been in talks with state lawmakers pushing the New Jersey bill and are planning a trip to Washington to advocate for more protections.
‘Not every child, boy or girl, will have the support system to deal with this issue,’ Mani said. ‘And they might not see the light at the end of the tunnel.’