Taylor Swift’s name has been temporarily blocked from search on X, formerly Twitter, in the wake of an ongoing AI deepfake controversy.
Searching for the 34-year-old singer’s name in the app returns the error message “Something went wrong. Try reloading.”
According to Billboard, however, Swift’s official X account is still accessible.
The block comes after explicit, AI-generated photos of the Grammy winner went viral on the social media platform earlier this week.
Though Swift herself hasn’t spoken out about the photos, a source told the Daily Mail that the pop star has been considering legal action against the perpetrators in order to prevent the same thing from happening to other women.
“The door needs to be shut on this,” they insisted.
Internet users also denounced the practice, and SAG-AFTRA and the White House each released statements of condemnation shortly after.
The actors’ union described the images as “upsetting,” “harmful,” and “deeply concerning.”
In a statement released Friday, Jan. 26, the union said, “The development and dissemination of fake images—especially those of a lewd nature—without someone’s consent must be made illegal.”
White House Press Secretary Karine Jean-Pierre weighed in as well: “We are alarmed by the reports of the… circulation of images that you just laid out—of false images to be more exact, and it is alarming.”