
Instagram alarm: the algorithm favors pedophilia

by admin

Just last February, Meta announced new anti-pedophile measures: new features on Instagram to make it harder for adults to interact with adolescents, a prompt for all young users to tighten their privacy settings, and the founding, with the support of Zuckerberg's company, of "Take It Down", a new platform from the National Center for Missing & Exploited Children (NCMEC) designed to help prevent intimate images of young people from being posted online in the future. But all this appears not to be enough.

Today, an investigation by the Wall Street Journal, conducted in collaboration with researchers at Stanford and at the University of Massachusetts Amherst, reveals that Instagram's "recommended content" systems allow pedophiles on the platform to get in touch with one another, encouraging the exchange and sale of illegal content. The result: "Large networks of accounts, which give the impression of being run by minors, openly promote the sale of child sexual abuse material," say the researchers of the Cyber Policy Center at the prestigious Silicon Valley university.

Breton summons Zuckerberg: meeting on June 23rd

On 23 June, EU Commissioner Thierry Breton will meet Meta Platforms CEO Mark Zuckerberg: on the agenda is a request for immediate action to combat online child pornography.


Hashtags open the door to the pedophile network despite moderation

What emerges from the investigation is that, despite moderation, Instagram allows users to search for certain hashtags that connect them with profiles openly dedicated to selling content depicting the sexual abuse of minors.

A simple search for keywords such as #pedwhore or #preteensex leads to accounts that use these terms to advertise content showing child sexual abuse. Often these profiles "claim to be run by the children themselves and use overtly sexual aliases," the article details. The accounts do not say outright that they are selling these images, but they present menus of options, including requests for specific sex acts. Stanford researchers also spotted offers for videos featuring bestiality and self-harm.


"For a fee, the children are available for in-person 'meetings'," the article continues. The report highlights the role played by the social network's algorithms: a test account created by the business newspaper was "flooded with content that sexualises children" after clicking on just a few of these recommendations.

Meta sets up an internal task force

Meta was contacted by the newspaper, since the promotion and sale of child sexual abuse material is a federal crime as well as a violation of the platform's guidelines, and it acknowledged that there are obstacles to moderating such content. Mark Zuckerberg's company informed the newspaper that it had set up an internal task force to address the issue. "The sexual exploitation of children is a terrible crime, and we are constantly working to prevent and block such behavior on our platforms," said a company spokesperson, adding that in the last two years Meta has dismantled a total of 27 networks of profiles engaged in pedophilia.

Thousands of hashtags used by pedophiles removed

Furthermore, the company announced that, after being contacted by the newspaper, it removed "thousands of hashtags" used by pedophiles to get in touch with sellers and other sex offenders. The researchers who took part in the investigation noticed that, after viewing just one of the profiles in the network, Instagram's algorithm immediately recommended new ones, including accounts involved in buying and selling illegal content. Using a variety of hashtags, the Stanford observatory found a total of 405 accounts dedicated to selling sexual content involving minors, some of whom are said to be as young as 12. The phenomenon may involve hundreds of thousands of profiles, as a Meta spokesperson also confirmed: in the month of January alone, a total of 490,000 accounts were removed for violating the guidelines on the safety of minors.


New measures for the protection of adolescents

Just last February, Meta proudly announced that it was a founding member of Take It Down, a new NCMEC platform that allows young people to regain control over their intimate images. People can go to TakeItDown.Ncmec.org and follow the prompts to submit a case that will proactively search for their intimate images on participating apps. Take It Down assigns a unique hash value, a numeric code, to an image or video privately and directly on the user's device. Once the hash is sent to NCMEC, companies such as Meta can use it to find any copies of the image, remove them, and prevent the content from being posted to their apps in the future.
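The key point of this design is that only the hash, never the image itself, leaves the user's device. A minimal sketch of the idea in Python follows; it is illustrative only, using a plain SHA-256 digest of the file bytes, whereas production systems of this kind typically use robust perceptual hashes (such as Meta's open-source PDQ) so that re-encoded or resized copies still match. All names and sample bytes here are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Compute a hash of the raw file bytes on the user's device.

    Only this digest is submitted to the clearinghouse; the image
    itself never leaves the device.
    """
    return hashlib.sha256(data).hexdigest()

# Hash list distributed to participating platforms (illustrative values).
known_hashes = {fingerprint(b"reported-image-bytes")}

def should_block(upload: bytes) -> bool:
    """Block an upload whose fingerprint matches a reported hash."""
    return fingerprint(upload) in known_hashes

print(should_block(b"reported-image-bytes"))   # exact copy of a reported file
print(should_block(b"some-unrelated-upload"))  # unrelated content
```

Note the limitation this sketch makes visible: a cryptographic hash only matches byte-identical copies, which is why real matching pipelines rely on perceptual hashing instead.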

"Meta," a company note also pointed out, "does not allow content or behavior that exploits young people, including posting intimate images or sextortion activity. We work to prevent this content as well as inappropriate interactions between young people and suspicious accounts attempting to take advantage of them. For example, we default teenagers to the most private settings on Facebook and Instagram, we work to prevent suspicious adults from connecting with teens on those apps, and we educate young people about the dangers of interacting with adults they don't know online. We've also made it easier for people to report potentially harmful content, particularly if it involves a child."

New policies on Instagram to prevent pedophilia

"On Instagram," Meta added in the note, "we recently introduced new features to make it even more difficult for adults to interact with teenagers. Now, these adults will no longer be able to see teen accounts when scrolling through the list of people who liked a post, or when looking at an account's followers or following list. If a suspicious adult follows a teen account, we'll send that teen a notification asking them to review and remove the new follower. We also ask teenagers to review and limit their privacy settings. When someone comments on a teen's post, tags or mentions them in another post, or includes their content in Reels Remixes or Guides, the teen will get a notification asking them to review their privacy settings, with the option to block that person from interacting with them."


WhatsApp: broadcast Channels make their debut

On the day the investigation was published, Meta announced significant news on the WhatsApp front: Channels make their debut, letting users receive important updates from people and organizations. The company says Channels will be a new one-way broadcast tool through which administrators can send messages, images, videos, stickers and polls.

To mark the launch of Channels, Meta also announced collaborations with the Singapore Heart Foundation and the Colombian fact-checker Colombia Check; Colombia and Singapore are in fact the first countries where Channels will be available, so that the experience can be fine-tuned, improved and optimized. Global launch partners include NGOs such as the International Rescue Committee and the WHO, as well as big names in sport such as FC Barcelona and Manchester City. In the coming months, Channels will also become available in other countries, giving anyone the opportunity to create one.

