White House says AI-faked nude photos need their own legal regulation. The U.S. government has taken a stand after AI-faked nude photos of Taylor Swift appeared on X.
In a statement, White House Press Secretary Karine Jean-Pierre called the incidents “alarming” and reiterated the need for legal action.
She emphasized the role of social media platforms in enforcing their own rules, noting the disproportionate impact on women and girls, who are the primary targets of online harassment and abuse.
The press secretary said that such fake images are among the AI issues the Biden administration has prioritized and that they require appropriate legislation, though she did not go into detail about what that might look like.
X criticized for spreading fake AI nude pictures of Taylor Swift
Jean-Pierre’s comments were prompted by AI-generated explicit images of singer Taylor Swift on the social media platform X (formerly Twitter).
One post received more than 45 million views, 24,000 reposts, and hundreds of thousands of likes and bookmarks, according to The Verge, before the account of the verified user who shared the images was suspended for violating the platform’s policies.
The images continued to spread and were re-posted on other accounts. According to 404 Media’s research, the images may have originally been created in a group on Telegram. Many are still online.
Swift’s fanbase heavily criticized X for allowing many of the posts to remain visible for so long. Fans flooded the hashtags used to spread the images with real clips of Swift’s performances to cover up the fake images.
X responded that posting nonconsensual nude images is strictly prohibited on the platform, though the safety team’s statement did not directly address the Taylor Swift incident.
The distribution of deepfake pornography and AI-generated nude images of real people is a major challenge in general, not just for celebrities: In the Spanish town of Almendralejo, more than 20 girls reportedly received AI-generated nude images of themselves. The youngest victim was just 11 years old.
The bodies shown in such fake images are AI-generated, not the actual bodies of the people depicted. Nevertheless, the images can appear real, and they can cause psychological harm and lasting reputational damage.