Why are searches for the world’s most famous pop star blocked on X, and why has the issue caused an uproar reaching all the way to the White House?
Social media platform X has temporarily blocked Taylor Swift’s name from being searched after AI-generated deepfake pornography images of the singer went viral last week.
One fake picture of Swift posted on the platform was viewed 47 million times before the account was suspended.
The material was shared tens of thousands of times before X’s security team responded: “We have a zero-tolerance policy towards such content. Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”
Now, searching for Swift’s name and “AI Taylor Swift” on X will lead users to a page that reads: “Something went wrong. Try reloading.”
X’s head of business operations Joe Benarroch said in a statement per Variety: “This is a temporary action and done with an abundance of caution as we prioritize safety on this issue.”
Meanwhile, Meta said in a statement that it strongly condemns “the content that has appeared across different internet services” and has worked to remove it.
“We continue to monitor our platforms for this violating content and will take appropriate action as needed,” the company said.
Another day on the internet, but this is Swift we’re talking about, so things escalated quickly…
The singer’s “Swifties” mobilized, launching a counteroffensive on the platform formerly known as Twitter under the hashtag #ProtectTaylorSwift and flooding it with positive images of the pop star.
And then the ante was upped…
The White House got involved, calling the incident “alarming”.
White House press secretary Karine Jean-Pierre said via a statement: “We know that lax enforcement disproportionately impacts women and they also impact girls, sadly, who are the overwhelming targets.”
Jean-Pierre even suggested that legislation should be introduced to tackle the misuse of AI on social media.
If only this sort of fast reaction were applied to non-billionaires with similar complaints of deepfakes, abusive fake imagery, revenge porn and other forms of harmful online harassment…
Still, take progress where you can get it, and the incident seems to have shaken things up somewhat. As a reminder, there are currently no federal laws in the US against the sharing or creation of deepfake images – despite some moves at state level to tackle the issue.
The ills of generative AI
Federal lawmakers who’ve introduced bills to put more restrictions or criminalize deepfake porn have indicated that the incident shows why the US needs to implement better protections.
“For years, women have been victims of non-consensual deepfakes, so what happened to Taylor Swift is more common than most people realize,” said US Rep. Yvette D. Clarke, a Democrat from New York who has introduced legislation that would require creators to digitally watermark deepfake content.
“Generative-AI is helping create better deepfakes at a fraction of the cost,” Clarke said.
US Rep. Joe Morelle, another New York Democrat pushing a bill that would criminalize sharing deepfake porn online, said what happened to Swift was disturbing and has become increasingly pervasive across the internet.
“The images may be fake, but their impacts are very real,” Morelle said in a statement. “Deepfakes are happening every day to women everywhere in our increasingly digital world, and it’s time to put a stop to them.”
Deepfakes use artificial intelligence to fabricate images or video of a person, typically by manipulating their face or body.
Researchers have said the number of explicit deepfakes has grown in the past few years, as the technology used to produce such images has become more accessible and easier to use. In 2019, a report released by the AI firm DeepTrace Labs showed these images were overwhelmingly weaponized against women. Most of the victims, it said, were Hollywood actors and South Korean K-pop singers.
Brittany Spanos, a senior writer at Rolling Stone who teaches a course on Swift at New York University, said of the recent Swift incident: “This could be a huge deal if she really does pursue it to court.”
Spanos says the deepfake pornography issue aligns with others Swift has faced in the past, pointing to her 2017 lawsuit against a radio station DJ who allegedly groped her. Jurors awarded Swift $1 in damages, a sum her attorney, Douglas Baldridge, called “a single symbolic dollar, the value of which is immeasurable to all women in this situation” in the midst of the MeToo movement.
Additional sources • AP, Variety, Rolling Stone