It’s no big deal, until it happens to Taylor Swift.
Now that we have your attention: Houston, we have an AI-generated porn problem. An oozing societal sore many folks were seemingly happy to ignore until it ran afoul of America’s Sweetheart.
This week, deepfake images of the wholesome singer — digitally altered “photos” portraying her in a compromising sexual position — circulated on X, to Swifties’ disgust.
Like Voltron, fans immediately activated, flooding X with praise for the singer to elevate the good and bury the sham smut in the virtual basement.
Meanwhile, Swifties also wondered why there are no laws against someone creating fake porn using the likeness of a real person and distributing it for the whole world to behold.
It’s an excellent question. And one the government needs to address soon.
Coincidentally, the Preventing Deepfakes of Intimate Images Act was reintroduced to the House Judiciary Committee last week by US Reps. Joseph Morelle (D-NY) and Tom Kean (R-NJ).
The bill would make the nonconsensual sharing of digitally altered pornographic images a federal crime, with penalties like jail time, a fine or both. It would also allow victims to sue perpetrators in court.
On Thursday, Morelle tweeted: “The spread of AI-generated explicit images of Taylor Swift is appalling—and sadly, it’s happening to women everywhere, every day. It’s sexual exploitation, and I’m fighting to make it a federal crime with my legislation … “
As a culture, we’ve given so much oxygen to the narrative that ChatGPT will take your job — and not enough to the sickening fact that AI is coming for your image and reputation. Your sanity and mental health, too. There’s no limit to the chaos and utter destruction these pernicious, reality-bending artificial intelligence tools can produce. And we’re only seeing the tip of the iceberg.
As if there wasn’t enough real porn to go around.
According to the Daily Mail, the mysteriously run site Celeb Jihad — which Swift issued a warning to in 2011 for posting faux topless photos of her — was reportedly “the origin of” the latest bogus batch. Along with realistic deepfakes, the site has also gleefully published tons of leaked explicit photos of other celebrities.
Who runs it? No one seems to know. The site’s About page seemingly amounts to a spoof bio of someone called Durka Durka Mohammed. Maybe it’s a Dr. Evil-like figure ensconced in a hyper-secure compound. Or it’s a pot-bellied, nose-picking loser ruining lives from a wood-paneled basement.
But the nefarious site seems to dodge meaningful scrutiny with a disclaimer — a cop-out as lame as its practices.
“CelebJihad.com is a satirical website containing published rumors, speculation, assumptions, opinions, fiction as well as factual information,” it claims.
This 21st-century nightmare, however, isn’t just aimed at billionaire pop stars, Oscar winners and other famous names with a legal arsenal at their disposal.
It’s happening to vulnerable teens — girls and probably boys, too.
In October, tony Westfield High School in New Jersey was rocked by an AI-generated pornography scandal, with fake nudes of female students passed around by their male classmates. It was a modern-day, nuclear version of having a rumor about you scrawled in Sharpie on the bathroom wall.
Parents were rightfully outraged and police investigated. Dorota Mani, whose 14-year-old daughter was victimized, said she was “terrified by how this is going to surface and when. My daughter has a bright future and no one can guarantee this won’t impact her professionally, academically or socially.”
It made news for a few days. Then the headlines were quickly banished to our memories’ dustbin.
But forget at your own peril. It’s not stopping at Swift or Westfield High School. Or even at porn.
This dangerous force will only grow stronger and easier to wield.
X has started suspending accounts sharing the Swift photos. But when you plug one hole, more inevitably spring up.
We’re playing a demented game of Whac-A-Mole.
In October, Joe Biden signed an executive order regulating AI development. And while a few states have enacted laws against the practice, there’s currently no federal law on the books.
Earlier this month, Francesca Mani, a teen victim from the Westfield debacle, spoke out about the proposed Preventing Deepfakes of Intimate Images Act.
“No kid, teen or woman should ever have to experience what I went through. I felt sad and helpless,” she said at a press conference.
Mani is right.
If Swift — whose singular hold on the culture has brought this issue front and center — isn’t expected to just shake it off, then neither should our daughters, sisters and friends be.