The Violation of Taylor Swift

The pornographic depictions of Taylor Swift in artificial intelligence (AI) photos that circulated on social media this week highlight not only a burgeoning technology issue, but also the centuries-old practice of simple misogyny.

At a time when Swift's high-flying career has seen her dominate the pop culture landscape, the singer has become a talking point over a set of humiliating, sexually explicit images that she's featured in—even though she posed for none of them.

The images were created using AI: pictures generated by artificial intelligence software from text prompts. This can be done without a person's consent, and the software to create such images is widely available.

Among the AI images of Swift that circulated were some that depicted her posing inappropriately while at a Kansas City Chiefs game, an unsavory nod to her romance with the NFL team's tight end, Travis Kelce.

[Photo-illustration by Newsweek/Getty: Pornographic AI images of pop star Taylor Swift have been circulated on social media this week.]

The offensive images originated on the AI celebrity porn website Celeb Jihad on January 15. They were subsequently shared on X, formerly Twitter, this week, clocking up millions of views and sparking spirited debate before the associated accounts were suspended.

Soon after the images were removed from X, it was reported that Swift was mulling legal action. Newsweek has reached out to a representative of Swift via email for comment.

While the legal system has not caught up with the emerging threat of such images flooding publicly accessible spaces, that could eventually change. On Tuesday, two lawmakers reintroduced a bill that would make the non-consensual sharing of digitally altered pornographic images a federal crime.

Representative Joseph Morelle, a Democrat from New York, first authored the "Preventing Deepfakes of Intimate Images Act" in May 2023, which he said at the time was created "to protect the right to privacy online amid a rise of artificial intelligence and digitally-manipulated content." He has since added Representative Tom Kean, a Republican from New Jersey, as a co-sponsor.

Sarah Klein, an attorney at California-based firm Manly, Stewart & Finaldi, is among those who back a lawsuit from Swift, telling Newsweek that what happened to the musician "should never happen to any woman. It is abuse, plain and simple. Taylor should definitely take legal action."

And while Swift may emerge victorious from a courthouse months from now, she almost certainly faces a decades-long battle to be seen as more than a sex object to her detractors. After all, countless high-profile women before her have been forced to endure such treatment, even if not via AI.

Humiliating High-Profile Women

Jeffrey R. Dudas, Ph.D., professor of political science at the University of Connecticut, told Newsweek that "this sort of sexualization is unfortunately common as a means of humiliating or otherwise attempting to discipline high-profile women. Consider, for example, how often [New York Representative] Alexandria Ocasio-Cortez has been a target of this exact thing—including by Republican members of Congress."

Swift, Dudas said, has fallen foul of people from realms far removed from her musical world. A faction of them, including women, have expressed their apparent distaste by portraying the star as less-than.

"I think it not a coincidence that this particular episode emerges at the exact time in which Taylor has been increasingly the target of right-wing ire," Dudas said, "both for her [quite mild] political advocacy and for her informal affiliation with the NFL, which really seems to trigger the rage of a particular type of person who envisions football as some sort of bastion of old-fashioned masculinity.

"In all cases, this sort of humiliation/disciplining is a response to a [perceived] sense of threat—threats that have less to do with the individual women themselves and more to do with what the women stand for, with female autonomy and agency foremost amongst those things."

Echoing the sentiment, Equality Now Digital Law & Rights Advisor Amanda Manyame told Newsweek: "Deepfake image-based sexual abuse mirrors the historic patterns of sexual exploitation and abuse experienced by women. Sexual exploitation and abuse in the physical and digital realms operate together in a continuum. They are a part of the same structure of violence rooted in gender-based inequality and systemic misogyny that perpetuates women's subordination in society."

In 2014, Jennifer Lawrence, Kate Upton and Cara Delevingne were among a host of female stars whose real nude images were taken from their hacked Apple iCloud accounts and leaked online for public consumption. Naysayers argued that the photos could not have been leaked if they hadn't been taken. Others, along with the stars themselves, pushed back, insisting that they should be afforded privacy like anyone else.

When Kamala Harris was vying for her role as vice president in 2020, her mere participation in a debate against then-incumbent Mike Pence saw a spike in people searching Google for her name alongside such terms as "nude," "bathing suit" and "bikini."

One study of 2008 vice presidential candidate Sarah Palin, published in the Journal of Experimental Social Psychology, found that objectifying the former Alaska governor by focusing on her looks made her appear less competent and less "fully human" in the eyes of its 133 participants. It also made participants less likely to vote for the McCain-Palin ticket in that year's presidential election.

Swift, unfortunately, joins a long line.

Time for Change

Evan Nierman, CEO of global PR firm Red Banyan, told Newsweek that this degrading episode could become an opportunity for Swift to set a precedent and turn the proverbial ship around.

The AI images, Nierman said, "may ironically prove to be a godsend, since they will probably be the catalyst to faster and more far-reaching policies to prevent and act against this type of content. Unfortunately, this vexing problem is going to exponentially expand in the years to come.

"As one of the most powerful people on the planet, Swift is uniquely positioned to bring this major AI-generated challenge to the top of public consciousness, exposing the dangerous and disgusting consequences of fake imagery being generated by machines.

"An entire new generation of problems and threats will be sparked by AI, and the fact that Swift has become embroiled in this controversy so early on will likely mean that the topic becomes an urgent focus of tech companies, policymakers, the media and the general public."

Equality Now's Manyame sees the current controversy as a dress rehearsal for what's to come, given a perfect storm of factors that include this year's presidential election. With misinformation already rife across the social media landscape, the misuse of famous people's images is projected to increase.

"Taylor's is not the first one and will definitely not be the last one," Manyame said.

Siwei Lyu, a computer science and engineering professor at the University at Buffalo, described the problem as AI technology "gone rogue."

He attributed the rise in sexualized AI images to three factors: newer AI models that do not require huge training datasets or long training times; social media platforms that supply this data and allow images to spread quickly; and AI tools that are easy to use for people with little programming knowledge.

"The problem is certainly of significant concern due to the defamation effect," Lyu told Newsweek.

"If these are used targeting individuals [known as revenge pornography] they can also cause tremendous psychological trauma to the victims [almost all victims of revenge pornography are female]. I think the solution lies in (1) technology for detecting and attributing such deepfake images; (2) laws that protect individuals from unauthorized use of their imagery or voice to make deepfake pornography."

Profound Effect

Emma Pickering, head of technology-facilitated abuse and economic empowerment at U.K. domestic abuse nonprofit Refuge, pointed out that there's also a human factor involved in such behavior that is often overlooked.

"Intimate image deepfakes have a profound and long-lasting impact on survivors," Pickering told Newsweek. "These types of fake images can harm a person's health and wellbeing by causing psychological trauma and feelings of humiliation, fear, embarrassment, and shame. The sharing of these intimate deepfakes can also damage a person's reputation, impact their relationships, and affect their employability.

"Survivors of non-consensual deepfakes, or other intimate image abuse, can fear their personal safety, as they experience an increase in on and offline harassment, stalking, and assaults."

Given the chorus of ominous warnings regarding the grip that AI is set to exert in the near future, Swift's situation has raised expectations of prompt legislative reform.

Senator Mark Warner, a Democrat from Virginia and chairman of the Senate Intelligence Committee, wrote in an X post on Thursday that "current law may insulate platforms and websites from exactly this sort of accountability. I want to pass Section 230 reform so we can hold tech firms accountable for allowing this disgusting content to proliferate."

Passed in 1996, Section 230 of the Communications Decency Act offers a degree of immunity to websites for any content uploaded by third parties—be it in the form of a social media post, classified ad or user review.

It gives sites acting in "good faith" the protection to remove any objectionable material, regardless of whether it is "constitutionally protected." The law is broad, saying that the rules apply to any "provider or user of an interactive computer service."

Section 230 does not offer total blanket protection, however, and exemptions are in place, meaning lawsuits are possible for criminal and intellectual property cases.

Backing calls for action, Manyame said there is "an urgent need for legal frameworks to confront and combat deepfake sexual abuse. Legal measures also need to recognize and address the continuum of harm experienced by women.

"Digital content can spread across multiple platforms and countries making it difficult to remove or track and protective laws in one country would not be enough to protect all victims across the world," she said. "National responses and mechanisms have to be supported by strong interconnected, international responses. The technology sector also needs to play a role by not promoting this content on their platforms and removing it, especially when it has been reported as offending content."

However, Nierman, author of The Cancel Culture Curse and Crisis Averted, warned that AI already has its feet firmly under the content-creation table.

"AI-generated images will upend life as we know it, killing the age-old notion that 'seeing is believing,' he said. "Expect an entirely new cottage area of law to come into existence as the world races to grapple with the implications of a new set of serious problems caused by artificial intelligence. There is no way to put the AI genie back in the bottle, so we must race to enact laws and technology-based policing tactics to catch up to this new reality."
