
“Facebook encourages hate speech for profit”: the latest whistleblower accusing the social network comes forward


The whistleblower who took a large number of internal Facebook documents and passed them to the Wall Street Journal between the end of 2020 and April 2021 now has a name and a surname. The leaks allowed the newspaper to publish a series of investigations and analyses that have cornered the giant, known as the “Facebook Files”: for example, the one on the effects of Instagram on the psychology of the teenagers who use it. Her name is Frances Haugen, she is a 37-year-old computer engineer, and she gave an interview to the program 60 Minutes on the US network CBS.



The story
In the course of the televised interview with Scott Pelley she went back on the attack, confirming everything that emerges from the documents handed over to the newspaper. Above all, one underlying reality: the group is so committed to optimizing its product, and the returns it can generate, that it has chosen to develop and deploy algorithms that amplify hate speech, that is, hatred and conflict between users. This is contrary to what the top managers in Menlo Park publicly repeat without pause: that they have a clear picture of the problem and are doing everything to contain its effects. On the contrary, explained Haugen, who joined the social network in 2019 and worked until the end of 2020 in the Civic Integrity team, later disbanded, before resigning in April of this year, she does not believe “they are willing to invest what actually needs to be invested to keep Facebook from continuing to be dangerous”.


The woman explained that she had worked for years at Google and Pinterest, but found at Facebook “a significantly worse environment”, precisely because of the group’s determination to put profits above everything else, including the well-being of users. Haugen’s team had the task of monitoring elections all over the world and of understanding, and intervening with appropriate solutions, how governments could use social media tools to spread false news or for purposes that undermined healthy democratic debate in their respective countries. The trouble is that the team had been given a task too burdensome to carry out in such a short time: just three months, as if it had been set up more out of duty than out of real conviction.

Not surprisingly, Haugen, who began passing confidential but not particularly protected documents, gathered for example from Workplace, the internal platform for employees, to the Wall Street Journal last December, also added that at the time there were teams in Facebook’s organization with extremely complex tasks and very few resources, at least in proportion to the challenges at hand, such as identifying and countering the exploitation of people. Why so few resources for such important goals, the former employee seems to ask.



Good for Facebook but not for others
“There was a conflict between what was good for the public and what was good for Facebook,” Haugen told Pelley, “and Facebook chose, over and over, to optimize for its own interests, that is, to make more money.” An internal document obtained by Haugen, for example, states that, although the company cyclically insists it plays an essential role in containing hate speech, “we estimate that we can only intervene on 3-5% of hate content and on 0.6% of content involving violence and incitement to violence, despite our tools being the best in the world in this area”.


According to Haugen, the root of most of Facebook’s problems lies in the new algorithms introduced starting in 2018. In her view they were designed to increase engagement, that is, the involvement and activity of users, and according to the group what produces the most engagement is what leverages fear and hatred among users. “It’s easier to inspire people to anger than to other emotions,” said Haugen, whose interview was anticipated on TV, on CNN, by none other than Nick Clegg, vice president of global affairs and former British deputy prime minister. At the time, by contrast, Mark Zuckerberg had presented the magic formula that would orchestrate and decide what we see in our feeds, and beyond, in a completely opposite light. And it could hardly have been otherwise, even if internally the company seems to know well that it cannot, and does not want to, do much more. The result is that “misinformation, toxicity and violent content are excessively prevalent among reshares,” as an internal note quoted by the Journal, and also leaked by Haugen, puts it in evaluating the effects of the change to the algorithm. We will hear what she has to say in more detail in the Senate, at a committee hearing on Tuesday 5 October.

The reaction
Meanwhile, the Californian giant has responded point by point to the accusations. On the general charge, for example, it explained that “the growth of people or advertisers using Facebook means nothing if our services are not used in ways that bring people closer together. This is why we are investing so much in safety, even though it impacts our bottom line. Protecting our community is more important than maximizing our profits. Saying we turn a blind eye ignores these investments, including the 40,000 people who work on safety and security at Facebook and our $13 billion investment since 2016”.


As for the claim that the algorithm modified in 2018, presented at the time as a return to the intimacy of one’s closest circles of contacts, has worsened the situation, the group explained that “the objective of the change to the Meaningful Social Interactions ranking [the main element that determines the visibility of content in the news feed, ed.] is expressed in its name: improving people’s experience by prioritizing posts that inspire interactions, especially conversations, between family and friends, which research shows are better for people’s well-being, and by deprioritizing public content. Other research also shows that polarization has been growing in the United States for decades, long before platforms like Facebook existed, and that it is decreasing in other countries where internet and Facebook use has increased. We have our role to play and will continue to make changes consistent with the goal of making people’s experience more meaningful, but blaming Facebook ignores the root causes of these problems and what the data says”.
