Instagram promoted pedophile network, says newspaper – 07/06/2023 – Tech

Instagram has connected and promoted a network of profiles that disseminate and sell child sexual abuse content, according to a report by the American newspaper The Wall Street Journal, produced in partnership with researchers from Stanford University and the University of Massachusetts Amherst.

According to the newspaper, Meta’s social network hosted and promoted this type of content through its algorithms. Instagram’s similar-profile recommendation systems directed users to sellers of child sexual abuse content. In practice, the investigation found, the platform allowed users to build networks for selling illegal content.

In response, Meta acknowledged problems in its internal enforcement operations. The company said it works to prevent its algorithms from recommending that potentially predatory adults connect with one another or interact with one another’s content.

According to the company, an internal working group was created to address the issues raised. “The exploitation of children is a horrible crime. We are constantly investigating ways to actively combat this behavior,” Meta said.

The newspaper and the researchers found that users could search for explicit hashtags and be connected to profiles that used those terms to advertise the sale of child sexual abuse material. In addition to hashtags, these accounts used suggestive terms and posted “content menus”.

Some “menus” included prices for videos of children hurting themselves and images of minors performing sexual acts with animals, according to researchers at the Stanford Internet Observatory. For a certain price, the profiles said, the children would be available for “in-person meetings”.

Researchers created test profiles to see how the algorithms recommended pedophile content. Viewing just a single profile of this type was enough to trigger recommendations of other such accounts in a field called “suggested for you”.

The suggested profiles belonged to alleged sellers and buyers of sexual content involving minors, as well as accounts linked to off-platform content-trading sites. After following one of the social network’s suggestions, the test account began to be flooded with content that sexualizes children.

According to the big tech company, 27 pedophile networks have been taken down in the last two years. The platform told The Wall Street Journal that it has blocked thousands of hashtags that sexualize children and has tightened its systems to avoid recommending terms associated with sexual abuse to users.

In March, Pinterest faced a similar accusation. At the time, a report by the American broadcaster NBC News stated that the social network recommended profiles that sexualize minors and gather photos of such content.

Instagram is one of the most used platforms in the world, with more than 2 billion active users globally, according to Meta. In Brazil, there are more than 115 million users, and in the US the figure exceeds 150 million.

“Instagram is a gateway to places on the internet where there is more explicit child sexual abuse,” Brian Levine, director of the Rescue Lab at the University of Massachusetts, who researches online child victimization and creates forensic tools to combat it, told the newspaper.

Pedophile profiles created on Instagram go to great lengths to conceal their illegal activities, according to the researchers. They use acronyms and emojis that stand for terms such as “person attracted to minors” or “child pornography”, according to Levine. Many describe themselves as “lovers of the little things in life”.

For Alex Stamos, head of the Stanford Internet Observatory and chief security officer at Facebook (now Meta) until 2018, controlling even obvious abuse would require an ongoing effort. “The fact that a team of three academics with limited access found such a large network should set off alarm bells at Meta,” he told the paper.

Under Meta’s public guidelines, the company removes content that threatens or promotes violence or the sexual exploitation of minors. The guidelines state that, “where appropriate”, the company forwards this content to legal authorities.

“To protect victims and survivors, we also remove photographs or videos that depict incidents of sexual violence and images shared as revenge or without the permission of the people depicted,” Meta says in its rules.
