X, formerly known as Twitter, was one of the first social platforms to adopt clear rules against AI-generated fakes. In 2020, it acknowledged the threat of misleading “synthetic media” and said it was “committed to getting this right.”
But under owner Elon Musk, X has become one of the most powerful and prominent distribution channels for non-consensual deepfake porn. The platform not only helps fake photos and videos spread largely unchecked, but can also reward deepfake spreaders who make money by exploiting manipulated pornography.
“Twitter is 4chan 2,” said Genevieve Oh, an analyst who studies deepfakes, referring to the loosely moderated message board known for hosting not only deepfake porn but also antisemitic memes and the glorification of mass shootings. “This emboldens future malicious actors who may use synthetic videos and images to collaborate in degrading more prominent women,” she said.
There is no federal law regulating deepfakes, but some states, including Georgia and Virginia, have banned non-consensual AI-generated pornography.
X’s rules prohibit such synthetic media. But enforcement has been limited because the company, at Musk’s direction, has laid off thousands of employees and dismantled its “trust and safety” team, which traditionally removed such images.
Musk has laughed off the need for content moderation. The day before Althoff’s video went viral, he shared a message from Grok, X’s chatbot, calling content moderation a “digital chastity belt” and a “steaming pile of horse manure” enforced only by “digital tyrants.”
“Let’s give content moderation a big middle finger and embrace the chaos of the internet!” the post read.
X did not respond to requests for comment.
X’s failure to stop deepfakes drew widespread attention last month after an AI-generated sex image of pop star Taylor Swift went viral on the platform, racking up tens of millions of views. Lacking enough moderators to contain it, the company took the unusual step of blocking searches for Swift’s name.
But Althoff’s case shows that the company is still struggling with the problem. One of the most popular posts directing viewers to the video remained online more than 30 hours later.
Another post, which promised to “send the full Bobbi Althoff leak to everyone who liked and commented,” had been online for about 20 hours; X deleted it only after The Washington Post asked about the misinformation. The post had been viewed more than 5 million times before it was removed.
Althoff, a content creator first known for her lighthearted TikTok videos about parenting and pregnancy, gained millions of followers on social media last year with her podcast, in which she awkwardly interviews celebrities such as Drake and Shaquille O’Neal.
Representatives for Althoff did not respond to requests for comment. On Wednesday, she posted a video on Instagram showing a screenshot of her own name on X’s trending list, with the comment, “100% not me, definitely AI generated.”
“I was like, ‘What is this?'” she said in the video. “I felt like it was some kind of mistake. … I didn’t know that people actually believed it was me.”
Her name appeared in more than 17,000 posts, according to a screenshot of X’s trending data. Trending topics were once filtered by a “curation team” to weed out offensive trends; under Musk, X fired that team as well.
X is the only mainstream social platform that allows pornography, which adds to the challenge for its remaining moderators, who must distinguish consensual explicit content from non-consensual fakes.
The company also rewards virality by paying out a portion of ad revenue to accounts with high viewership. Many of the accounts sharing the Althoff clips had blue check marks indicating they were eligible for those payments.
Many of the X users who shared Althoff’s video referred to it as a real “leaked” sex scene or offered to send the video to anyone who shared or interacted with their tweet, a tactic to boost engagement.
Deepfakes are created by using artificial intelligence to digitally superimpose someone’s face onto another body. For years they have been used to harass, embarrass, and degrade women and girls, including Hollywood actresses, online creators, members of Congress, and high school students whose social media photos have been artificially “undressed.”
Deepfake forums and platforms such as Telegram have become common places where the photos and videos are manufactured, and some users charge money to add a particular face to explicit scenes.
The creator of one fake Althoff video offered to sell a 20-minute version of it on a deepfake forum for $10 via PayPal, according to a listing seen by The Post. (Preview videos for this listing have been viewed 60,000 times in the past four months.)
To gain attention beyond message boards, some deepfake creators are moving their content to X, where they hope to sell more clips or reach a more mainstream audience. Some of the Swift and Althoff fakes were also posted on platforms such as Instagram and Reddit, but they only gained a fraction of the audience there and were quickly removed.
Musk has often cited “community notes” as an alternative to moderation on X. The feature allows volunteers to propose notes that appear on a particular tweet if they receive enough approval votes. But many of the posts sharing Althoff’s fake video carried no such notes, and some notes did not appear until hours after the video went viral. Notes also do not prevent anyone from viewing, sharing, or saving the clips.
By Wednesday afternoon, one post included a community note saying the videos were AI-generated and were being spread “with the intent of engagement baiting and generating revenue for Twitter.” The author of the original post, who had suggested that followers could find the “leaked” video in the tweet’s “hidden” replies, later commented, “If Bobbi Althoff sees this, I apologize.”
But the account did not delete the original post, and many X users shared it with their own followers unimpeded. After 24 hours, the original post had been viewed more than 20 million times and liked 29,000 times.
Will Oremus contributed to this report.