Taylor Swift is the victim of fake nudes created by Artificial Intelligence

Images and videos realistically manipulated with artificial intelligence are proving to be a major challenge as the technology becomes more popular

Fake images generated by artificial intelligence in which the singer Taylor Swift appears nude circulated on social media this Thursday (25), generating outrage among the singer’s fans, who pushed the tag “protect Taylor” into X’s trending topics.

Swift was yet another victim of fake nudes, or “deepnudes” — a variation of the term “deepfakes”, used to describe images or videos in which people’s voices, bodies or faces are realistically manipulated using artificial intelligence. The practice has proven to be an increasingly frequent challenge as the use of AI becomes more widespread.

The practice, which almost always targets women, is usually intended to extort or embarrass the victim.

On social media, fans were outraged by the practice. “No matter how famous someone is, we all deserve respect. Taylor Swift is a real person who deserves respect no matter what,” said one user.

“Those AI-generated photos of Taylor are sexual harassment and it’s disgusting that a man can do such things without repercussions. They see women as objects made for their fantasies and I’m fed up with it,” wrote another.

Cases in Brazil

Recently, actress Ísis Valverde went through the same thing. In late October, she turned to her lawyers after nude photos attributed to her were leaked.

The defense’s main hypothesis is that photos of the actress were taken from her Instagram and altered with image-editing software so that she appeared to be naked. According to the defense, a police report was filed with the Computer Crimes Police Station “to notify and hold accountable internet providers that share the fraudulent images”.

However, it is not only actresses and other famous people who go through this. Two cases registered in November 2023 show that high school students have also been victims of this type of image manipulation.

Students from Colégio Santo Agostinho, in Barra da Tijuca, Rio de Janeiro, and from Colégio Marista São Luís, in Recife, were victims of “deepnudes” created by other students.

In Rio de Janeiro, more than 20 victims were identified, including teenagers who study at the school and others who do not. The case is being investigated by the Civil Police’s Child and Adolescent Protection Department (DPCA).

In Recife, 18 students were victims of the practice. The case is being investigated by the Civil Police of Pernambuco through the Infractional Acts Police Station.

*With information from CNN Brasil
