Trolls have flooded X with graphic Taylor Swift AI fakes

X is full of crude Taylor Swift AI fakes made by trolls.

Over the last day, sexually explicit AI-generated images of Taylor Swift have been circulating on X (formerly Twitter). It's the latest example of how easily AI-generated fake pornography can spread — and how difficult it is to stop.

One of the most prominent examples on X racked up more than 45 million views, 24,000 reposts, and hundreds of thousands of likes and bookmarks before the verified user who shared the images had their account suspended for violating platform policy. The post remained live on the site for around 17 hours before it was removed.

Taylor Swift AI fakes made by trolls

As users began discussing the viral post, the images started spreading across other accounts. Many of them remain up, and a deluge of new graphic fakes has since appeared. In some regions, the phrase "Taylor Swift AI" became a trending topic, pushing the images in front of even more users.

404 Media reported that the images may have originated in a Telegram group where members share explicit AI-generated images of women, often created with Microsoft Designer. Members of the group reportedly joked about how the images of Swift had gone viral on X.

X's policies on synthetic and manipulated media and nonconsensual nudity both explicitly prohibit this kind of content from being hosted on the platform. Representatives for X, Swift, and the NFL have not responded to our requests for comment, though X did post a public statement nearly a day after the incident began — without mentioning the Swift images specifically.

Swift's fan base has criticized X for allowing many of the posts to stay live as long as they have. In response, fans have flooded the hashtags used to circulate the images with messages promoting real clips of Swift performing, in an effort to bury the explicit fakes.

The incident speaks to the very real challenge of stopping deepfake porn and AI-generated images of real people. Some AI image generators have restrictions that prevent the creation of nude, pornographic, or photorealistic images of celebrities, but many others have no such guardrails. The responsibility for stopping fake images from spreading often falls to social platforms — something that is difficult under the best of circumstances, and harder still for a company like X, which has hollowed out its moderation capabilities.

The company is currently under investigation by the EU over claims that it is being used to "disseminate illegal content and disinformation," and it is reportedly being questioned about its crisis protocols after misinformation about the Israel-Hamas war was found being promoted across the platform.

Update January 25th, 1:06PM ET: Added results from 404 Media.

Update January 26th, 5:31AM ET: Added X's general statement on posting nonconsensual nudity.
