X blocks searches for Taylor Swift due to fake photos

Henry

The social media platform, X (formerly Twitter), has blocked searches for Taylor Swift after explicit photos of the American pop singer, generated with artificial intelligence (AI), began to circulate on the site.

The photos, described by CNN as “pornographic”, spread mainly on X from January 25 and were shared millions of times before being removed.

One post had nearly 47 million views before it was deleted.

Fans of Swift, known as “Swifties”, campaigned on X for the removal of the images using the hashtag #protectSwift.

Joe Benarroch, X’s head of business operations, said in a statement to the BBC that blocking searches for Swift was a “temporary action to prioritize safety”.

Searching for Swift on the site brings up a message saying, “Something went wrong. Try reloading.”

The photos also prompted X to issue a statement on Friday saying that posting non-consensual nudity on the platform is “strictly prohibited”.

The incident has even drawn the attention of the White House.

Karine Jean-Pierre, the White House press secretary, called the incident “alarming” and said that the harmful impact of AI is a focus of the Biden administration.

The US actors’ union SAG-AFTRA also issued a statement about the incident, saying that “the development and distribution of false images – especially those of a harmful nature – without someone’s permission must be made illegal”.

The union pointed out that technology of this nature must be better controlled “before it is too late”.

Swift herself has not yet commented publicly on the photos.

Sources: BBC, Billboard, CNN.