Social media platform X has blocked searches for Taylor Swift after a series of X-rated deepfake images of the singer went viral over the weekend.

The sexually explicit AI-generated images of the star began circulating online over the weekend, with one post amassing more than 47 million views before the account sharing it was suspended.

Searching ‘Taylor Swift’ on the platform now returns an error message reading ‘Something went wrong. Try reloading’.

“This is a temporary action and done with an abundance of caution as we prioritize safety on this issue,” said Joe Benarroch, X’s head of business operations.

Swifties rallied behind the singer, launching the hashtag #ProtectTaylorSwift and urging fans to report the accounts and stop the images from circulating.


The White House Press Secretary called the images ‘alarming’ and said that social media companies are responsible for preventing the spread of such material.

The incident highlights a serious threat facing women in the public eye and has intensified calls for new legislation to prevent such images from spreading online.