YouTube under accusation: it suggests hate videos and fake news that violate its own rules

by admin

Hate videos, extremism, disinformation, and conspiracy theories continue to circulate on YouTube, helped along by its algorithms: pointing the finger at Google's video platform are researchers from the rival Mozilla Foundation, in a new investigation involving over 38,000 people.

The data for the study, carried out between July 2020 and May 2021, were collected through the RegretsReporter browser extension, installed voluntarily by thousands of users. With this crowdsourcing approach, participants were given a tool to monitor and report inappropriate content on YouTube.


Thousands of violent and inappropriate videos
The information gathered made it possible to surface a mass of videos, from 91 different countries, that incite violence, spread hatred, racism, coronavirus alarmism and fake news, or even show children cartoons that are completely unsuitable for them.

It should be emphasized that non-English-language content is the most affected by this phenomenon.

Another important point is that in the vast majority of cases (71%), the videos that provoked users' negative reactions were those recommended by Big G's own artificial intelligence algorithms. Furthermore, most of the reported clips violate the community guidelines that YouTube itself defines to determine which content is allowed.

More than 200 of the reported videos have since been removed, but the takedowns came quite late: only after they had reached a total of about 160 million views.

Is the algorithm out of control?
In light of the research findings, Brandi Geurkink, author of the report together with Jesse McCrosky, can easily argue that the Google platform should admit its faults and acknowledge responsibility for a poorly designed algorithm, whose harmful consequences are to hurt and misinform people.

The failure of YouTube's AI systems, despite its best efforts, is even more evident when we consider that the recommendation algorithm drives over 70% of the content viewed on the platform.


Mozilla’s recommendations
It is not the first time Big G has had to respond to criticism on this issue. Accusations have rained down from various quarters, and the New York Times, the Washington Post and the Wall Street Journal have all taken aim at the YouTube algorithm. The Mozilla researchers, to whom the company headed by Susan Wojcicki has replied by defending its work, have also spoken out on several occasions, denouncing YouTube's response as largely ineffective, when not marked by inertia and opacity.

In their opinion, solving the problem would require greater transparency and more timely, regular information about the algorithms that recommend video content. Laws capable of regulating artificial intelligence would also help, imposing stricter rules on platforms and better protecting independent research, a guarantee of objectivity.
