
Investigation accuses: “Instagram does not stop misinformation on Covid and vaccines”. The reply: “Old study”


A FORTY-PAGE REPORT accuses Instagram: its algorithms promote disinformation about the Sars-CoV-2 pandemic and anti-vaccine content, an avalanche of posts potentially reaching tens of millions of users. It is signed by the Center for Countering Digital Hate (CCDH), a non-profit organization with offices in London and Washington DC, which opened a series of “investigative” accounts to better understand what the Facebook-owned photo and video app serves to users, especially newly registered ones, in two specific places: the Explore section, the starting point for discovering what is trending on the platform, and the suggested posts in the feed, introduced a few months earlier.

The result? These features of the app led by Adam Mosseri, Mark Zuckerberg’s right-hand man, encourage users to view misinformation and pull them into a kind of endless spiral: those who make the unfortunate misstep of interacting with certain content end up bombarded with more of the same, or worse. If, for example, a user follows anti-vaccine accounts, they will also be offered material from QAnon conspiracy theorists and anti-Semitic posts. If, on the contrary, they start from the most bizarre conspiracy theories, they end up in the clutches of anti-vax photos and videos or electoral disinformation. In short, a bottomless pit, and above all one with no exit.

The researchers probed the suggestions of the social network, used every month by over a billion people, setting up 15 different profiles and following different groups of accounts, ranging from those of health authorities to those campaigning against vaccines or denying the pandemic, with a different mix of follows for each profile. They logged into these study accounts every day, recording all the suggestions pushed by the algorithm (which the research, available here, has dubbed a “malgorithm”), along with their tone and content. They focused above all on the Explore section and placed likes at random to trigger the suggested-posts feature, which on brand-new profiles does not activate until you have interacted for a while with photos, videos, reels and other content. All this between September 14 and November 16 of last year, saving screenshots of the suggestions.
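The CCDH’s own tooling is not public, and the study relied on manual daily checks and screenshots. Purely as an illustration of the kind of bookkeeping such a design involves, here is a minimal Python sketch that tallies a hypothetical hand-recorded log of suggestions; the file name and column names are assumptions, not the CCDH’s actual data.

```python
# Illustrative only: tallying a hypothetical hand-recorded log of Instagram suggestions,
# in the spirit of the study design described above (15 profiles, daily checks,
# screenshots labelled by topic). Columns and file are assumed, not taken from CCDH.
import csv
from collections import Counter, defaultdict

def tally_suggestions(path="suggestion_log.csv"):
    """Count logged suggestions per topic, overall and per test profile."""
    overall = Counter()
    per_profile = defaultdict(Counter)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):            # expected columns: date, profile, section, topic
            topic = row["topic"].strip().lower()  # e.g. "covid", "vaccines", "elections", "none"
            overall[topic] += 1
            per_profile[row["profile"]][topic] += 1
    return overall, per_profile

if __name__ == "__main__":
    overall, per_profile = tally_suggestions()
    for topic, count in overall.most_common():
        print(f"{topic}: {count}")
```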

The research explains that Instagram recommended 104 posts containing misinformation to the 15 profiles during that period. More than half, unsurprisingly, dealt with Covid-19, a fifth with vaccines and a tenth with the then-upcoming US presidential elections. The CCDH’s accounts also received suggestions for posts supporting the QAnon conspiracy and, more generally, for photos and videos steeped in anti-Semitism. The only safe path, the one that generated no such recommendations, was that of the accounts following the official profiles of recognized international health authorities.

The accusations of the non-profit led by Imran Ahmed, which goes so far as to call for the suspension of the algorithm but also raises an essential question about the transparency of these tools, are very harsh: “Malgorithm, the latest CCDH report, shows how the Instagram algorithm actively promotes disinformation and extremist content to users – the report reads – they are encouraged to view this type of material and, once hooked, they are fed content that pushes other facets of that radicalized worldview […] This is a deliberate tactic. The companies that run social networks constantly try to maximize user engagement. Put simply, the more time users spend on Instagram, the more revenue grows. That is why, last August, Instagram added unsolicited content to the timeline to boost engagement. Once a user runs out of recent content from the accounts they follow, Instagram’s algorithms present new content as ‘an extension of your feed’. Machine learning algorithms identify users’ potential interests based on data and habits, then find highly engaging content of the same type and inject it into their feeds. Previous research shows that misinformation is shared more and gets more engagement than the truth on social media. Worse still, a high engagement rate increases the likelihood that even neutral observers will interact with the content. For Instagram and its algorithms, one click is a victory, no matter the content”. The removal work that the platform does carry out is described as “insufficient”.
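Instagram’s real ranking system is of course not public. The toy Python sketch below only illustrates the dynamic the CCDH quote describes: candidate posts ranked purely by predicted engagement and inferred interests, with no check on whether the content is accurate. Every name and score in it is invented.

```python
# Toy illustration of the engagement-first suggestion loop described in the CCDH quote.
# Nothing here reflects Instagram's actual system; posts and scores are made up.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topics: set                   # e.g. {"vaccines"}
    predicted_engagement: float   # stand-in for a learned engagement score

def extend_feed(candidates, user_interests, k=5):
    """Pick the k candidates most likely to keep the user scrolling:
    boost posts matching inferred interests, then rank by predicted engagement,
    without any check on whether the content is accurate."""
    def score(post):
        interest_boost = 1.5 if post.topics & user_interests else 1.0
        return post.predicted_engagement * interest_boost
    return sorted(candidates, key=score, reverse=True)[:k]

if __name__ == "__main__":
    user_interests = {"vaccines"}    # inferred from likes, as with the study's random-like trigger
    candidates = [
        Post("a", {"cooking"}, 0.40),
        Post("b", {"vaccines"}, 0.35),   # lower raw score, but it matches the inferred interest
        Post("c", {"elections"}, 0.50),
    ]
    for post in extend_feed(candidates, user_interests, k=2):
        print(post.post_id)
```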

The CCDH has also published an open letter to Zuckerberg, urging him to disable and correct the algorithm. How? For example, by excluding posts dealing with Covid and vaccines from the suggestion system, by maintaining a blacklist of accounts well known for spreading disinformation and limiting them as much as possible, and by capping the number of suggested posts served to each user. The possible interventions are in fact many. The organization also explains that Instagram could better protect its users from disinformation by denying the blue verification badge, the one that marks certified accounts, to accounts known to spread baseless news or theories (among those named in the report is Robert F. Kennedy Jr, third child of Bob Kennedy and a well-known denialist leader), by introducing warnings on content selected by the algorithm, and by “vaccinating” users with a new content strategy, for example showing accurate posts to those overexposed to extremist ones and giving ever more space to advertisements from official organizations to counterbalance content without foundation.
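As a rough illustration of the kind of filter the open letter asks for, the Python sketch below applies three of the proposed rules to a list of candidate suggestions. It is not Instagram’s API; every name in it (filter_suggestions, EXCLUDED_TOPICS, the placeholder blacklist entry) is hypothetical.

```python
# Illustrative sketch of mitigations proposed in the CCDH open letter:
# drop Covid/vaccine topics from suggestions, skip blacklisted accounts,
# and cap how many suggested posts any one user receives.
# All names and values are hypothetical, not part of any real system.
EXCLUDED_TOPICS = {"covid", "vaccines"}
BLACKLISTED_ACCOUNTS = {"known_disinfo_account"}   # placeholder entry
MAX_SUGGESTIONS_PER_USER = 10

def filter_suggestions(candidates):
    """candidates: list of dicts with 'author' and 'topics' keys."""
    kept = []
    for post in candidates:
        if post["author"] in BLACKLISTED_ACCOUNTS:
            continue                                   # rule 2: blacklist known spreaders
        if set(post["topics"]) & EXCLUDED_TOPICS:
            continue                                   # rule 1: no Covid/vaccine suggestions
        kept.append(post)
        if len(kept) >= MAX_SUGGESTIONS_PER_USER:
            break                                      # rule 3: cap suggestions per user
    return kept
```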

A Facebook spokesperson simply downplayed the findings, explaining that the research is five months old and based on “an extremely small sample” of just 104 posts.
