The apparently AI-generated images of Taylor Swift flooding the Internet highlight a major problem with creative AI

Taylor Swift has battled record labels for ownership of her music. She headlined the first billion-dollar tour and won Grammys.

Yet deepfake pornography undercut that power overnight, in the crudest, most degrading way possible for a woman.


The crude images, which Fast Company won’t link to, depict AI-generated sexual acts. They were viewed nearly 45 million times on one account before it was suspended roughly a day later. Swift’s representatives declined Fast Company’s request for comment.

Degrading women by turning them into sexual objects and producing explicit images of them is nothing new. The New York Evening Graphic sexualized women more than 100 years ago using composite photos of heads on torsos. But the combination of a worldwide megastar like Swift and the rapid, largely unconstrained spread of generative AI tools makes the images that went viral on social media overnight a new low, and a warning about the future of online sexual abuse if regulators and companies do nothing.


Many of Swift’s fans flooded social media search results with harmless images to blunt the damage, but the AI-generated images are now in the wild, a dispiriting reality for victims of revenge porn and deepfakes.

Carolina Are, a platform governance researcher at Northumbria University’s Centre for Digital Citizens, believes the episode raises several challenges. One is how society views women’s sexuality, especially the sharing of nonconsensual images, whether stolen or fabricated by AI. “It just causes powerlessness,” she says. “The inability to control your body in digital spaces. I fear this will become the norm.”

X, where the images were first shared, has suspended accounts that posted them, but people have reposted them on Reddit, Facebook, and Instagram. X did not respond to a request for comment on the resharing.

“What I’ve noticed in all these worrying, AI-generated content is that when it’s nonconsensual, when it’s not people who want their content created or are willing to share their body, it thrives,” Are adds. “But then consensual content is heavily moderated.”

Swift is not the first to be targeted with AI-generated images. (Several media outlets have reported on platforms that offer deepfake pornography; Fast Company is not linking to them.) A Spanish town was torn apart in October by a scandal involving deepfake images of more than 20 women and girls, the youngest of whom was 11. Deepfake pornography has also been used to extort victims.

Swift is probably the most prominent victim so far, and the one most likely to have the resources to take on tech platforms, which may compel some generative AI tools to change course.


Last year, a nonacademic study of more than 100,000 deepfake videos online found that 98% were pornographic and 99% featured women; 94% of the videos’ subjects worked in entertainment. A related survey of more than 1,500 American men found that three in four consumers of AI-generated deepfake porn felt no guilt about it. A third argued it wasn’t harmful because the imagery isn’t really the person depicted and, if kept for their own use, couldn’t affect anyone.

However, posting nonconsensual deepfaked sexual images is prohibited in New York and under the U.K.’s Online Safety Act, and those consumers should feel guilty. “It’s clear that an ongoing and systemic disregard for bodily sovereignty is fuelling the harm posed by deepfakes of this kind, which are absolutely sexual abuse and sexual harassment,” says Glitch founder and CEO Seyi Akiwowo.

Akiwowo believes tech companies must act to protect women and marginalized groups from deepfakes.
