Apps and websites that use artificial intelligence to undress women in photos are gaining popularity, according to researchers.
In September alone, 24 million people accessed this type of tool, according to social media analysis company Graphika.
Many of these fake nude services advertise on popular social networks, according to Graphika. Since the beginning of this year, for example, the number of links advertising stripping apps has increased by more than 2,400% on social networks, including X and Reddit, researchers said.
The services use artificial intelligence to alter an image so that the person in it appears nude. Many of the services work only on images of women.
These apps are part of a worrying trend of nonconsensual pornography made possible by advances in artificial intelligence, a type of media known as deepfake pornography.
Its proliferation raises serious legal and ethical concerns, as the images are often taken from social media and distributed without the victims' consent, control or knowledge.
One image posted on X advertising a fake nude app used language suggesting that customers could create nude images and then send them to the person whose likeness had been digitally altered, an incitement to harassment.
Meanwhile, one of the apps has paid for sponsored content on Google's YouTube and appears first in search results for the word "nudify."
A Google spokesperson said the company does not allow ads "that contain sexually explicit content," adding: "We have reviewed the ads in question and are removing those that violate our policies." Neither X nor Reddit responded to requests for comment.
Nonconsensual pornography of public figures has long been an internet problem, but privacy experts are increasingly concerned that advances in AI technology have made deepfake programs more accessible and effective.
“We’re seeing more and more of this being done by ordinary people with ordinary targets,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation. “You see it among high school students and people who are in college.”
Many victims never find out about the images, but even those who do may have difficulty getting authorities to investigate or finding resources to pursue legal action, Galperin said.
There is currently no federal law in the United States that prohibits the creation of deepfake pornography, although federal law does prohibit generating such images of minors.
In November, a North Carolina child psychiatrist was sentenced to 40 years in prison for using fake nude apps on photos of his patients, the first prosecution of its kind under the law banning the deepfake generation of child sexual abuse material.
TikTok has blocked the keyword "undress," a popular search term associated with these services; the app warns anyone searching for the term that it "may be associated with behavior or content that violates our guidelines."
A TikTok representative declined to provide further details. Meta has also begun blocking keywords associated with searches for fake nude apps; a Meta spokesperson declined to comment further.