Searches for “Taylor Swift” drew a blank on Twitter/X on Saturday, as the social network made results for the singer invisible. Pornographic AI-generated fakes of Swift spiked earlier this week, with the images flooding the service.
The service also doesn’t show results for “Taylor Swift Nude” or “Taylor Swift AI,” search terms widely used to circulate the images. But innocuous-sounding searches such as “Taylor Swift singer” still return results.
The move comes after the White House on Friday urged Congress to “take legislative action” on the controversy. Press secretary Karine Jean-Pierre told ABC News: “We are alarmed by the reports of the circulation of images that you just laid out — of false images, to be more exact… While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual, intimate imagery of real people.”
Meanwhile, U.S. Rep. Joe Morelle of New York is citing the fake Swift images to push for legislation that would make the nonconsensual sharing of digitally altered explicit images a federal crime.
X and other social media platforms attempted to remove the viral images of Swift on Wednesday, but new fake Swift-inspired images that differed from the originals began circulating in their place, which may have made enforcement even more difficult.
The original images depicted Swift, wearing red body paint, in a sexual scenario with a Kansas City Chiefs fan, mocking her romance with Chiefs tight end Travis Kelce. On Sunday, the Chiefs face the Baltimore Ravens in the all-important AFC Championship Game, which will determine which team advances to the Super Bowl.
So far, Twitter/X chief Elon Musk has remained uncharacteristically silent on the issue, instead posting on Saturday about things like the closure of a San Francisco toy store blamed on the city’s crime problem. Swift herself has not commented, though unconfirmed reports suggest she is considering legal action.
On Friday, SAG-AFTRA also released a statement about the images, writing that they are “upsetting, harmful, and deeply disturbing.”
“The development and dissemination of false images, especially those of a lewd nature, without someone’s consent must be made illegal,” the union continued. “As a society, we have it in our power to control these technologies, but we must act now before it is too late. SAG-AFTRA is committed to ensuring that abuses of this nature do not happen again. We continue to support Congressman Joe Morelle’s Preventing Deepfakes of Intimate Images Act to help Taylor and other women everywhere who are victims of this kind of theft of their privacy and autonomy. We support you.”
The issue of deepfakes is also a concern heading into a contentious presidential election year. Robocalls spoofing President Biden’s voice were recently deployed in an attempt to influence Tuesday’s New Hampshire primary.