Taylor Swift Fans Are In An Uproar As Someone Created “Disgusting” NSFW A.I. Photos Of The Pop Star That Are Going Viral Online

KANSAS CITY, MISSOURI – SEPTEMBER 24: Taylor Swift is seen during a game between the Chicago Bears and the Kansas City Chiefs at GEHA Field at Arrowhead Stadium on September 24, 2023 in Kansas City, Missouri. (Photo by Jason Hanna/Getty Images)

Taylor Swift fans are up in arms online following the viral circulation of some truly disgusting AI-generated images of the pop star.

While it’s still unclear where the NSFW photos originated, they have been shared more than 100,000 times from a single account, and plenty of other accounts have posted them as well. X has taken steps, suspending one of the biggest culprits, though various images could still be found on the platform at the time of writing.

The deepfake photos show Swift in lewd sexual positions at Chiefs games and have generated massive backlash, with “PROTECT TAYLOR SWIFT” now trending on X as her supporters are attempting to bury the NSFW photos with positive content.

“Y’all see how mean and pathetic these people in making Taylor swift AI ?? PROTECT TAYLOR SWIFT,” one fan wrote.

“When i saw the taylor swift AI pictures, i couldn’t believe my eyes. Those AI pictures are disgusting,” another wrote.

“Taylor Swift AI is as disgusting as hell Please PROTECT TAYLOR SWIFT,” a user said.

“People sharing the ai pics are sick and disgusting. protect taylor swift at all costs,” one added.

“The situation with AI images of Taylor Swift is insane. Its disgusting,” wrote another.

Fans have also wondered why there’s no law protecting people from such acts.

“How is this not considered sexual assault??” a user queried. “We are talking about the body/face of a woman being used for something she probably would never allow/feel comfortable how are there no regulations laws preventing this.”

U.S. President Joe Biden did sign an executive order back in October to further regulate AI, which prohibits “generative AI from producing child sexual abuse material or producing non-consensual intimate imagery of real individuals.”

Nonconsensual deepfake NSFW imagery is illegal in multiple states, including Texas, Minnesota, New York, Hawaii, and Georgia, but policing the circulation of such photos is something else altogether.
