
Mixed work groups, free from gender stereotypes

by admin

"A buggy company generates buggy software." This is how Guido Scorza, a lawyer specializing in new technologies and a member of the Italian Data Protection Authority, sums up the risk that artificial intelligence will reproduce and amplify prejudices and stereotypes, starting with gender ones. A danger he sees as especially high "in the algorithms applied to personnel selection, because it is likely that pre-screening, which is increasingly widespread, will favor male CVs for certain roles, discarding female ones a priori."
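A minimal sketch of how this can happen, using entirely hypothetical data and feature names (experience, is_male): a screening model trained on historically biased hiring decisions learns gender as a predictive signal, so two otherwise identical CVs receive different scores. In real systems gender usually enters through proxy features rather than an explicit column; it is made explicit here only for illustration.

```python
# Hypothetical illustration: a model trained on biased historical hiring
# decisions picks up gender as a feature and scores identical CVs differently.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
experience = rng.normal(5, 2, n)   # years of experience (synthetic)
is_male = rng.integers(0, 2, n)    # 1 = male, 0 = female (synthetic)
# Simulated historical decisions: equal experience, but men hired more often.
hired = (experience + 2.0 * is_male + rng.normal(0, 1, n)) > 5.5

model = LogisticRegression().fit(np.column_stack([experience, is_male]), hired)
print("weight on 'is_male':", model.coef_[0][1])  # clearly positive

# Two CVs identical except for gender get different screening scores:
print(model.predict_proba([[5.0, 1], [5.0, 0]])[:, 1])
```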

"Professional" biases also worry Emanuela Girardi, founder and president of Pop AI (Popular Artificial Intelligence) and head of Adra, the European Association on AI, Data and Robotics, which works with the EU Commission on the Horizon Europe 2021-2027 investment plan. "Generative artificial intelligence models, the most popular and most widely used by the general public," she explains, "are trained on the knowledge that exists on the Internet, and so they are full of potential discrimination. By reproducing that content in a mathematical-statistical way, they do nothing but amplify what is already present in our society. That is why, if I ask for an image of a doctor or a manager, it will be a middle-aged white man. If I ask for a nurse, she will be a woman, and probably younger."
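A toy sketch of the statistical mechanism Girardi describes, with a made-up miniature corpus: a generator that samples in proportion to corpus frequencies faithfully reproduces whatever gender-occupation skew the training data contains.

```python
# Toy illustration: sampling from corpus statistics reproduces corpus bias.
import random
from collections import Counter

# Hypothetical (occupation, depicted gender) pairs standing in for web data.
corpus = [
    ("doctor", "man"), ("doctor", "man"), ("doctor", "man"), ("doctor", "woman"),
    ("nurse", "woman"), ("nurse", "woman"), ("nurse", "woman"), ("nurse", "man"),
]

def generate(occupation, corpus, k=1000):
    """Sample a depicted gender in proportion to its frequency in the corpus."""
    counts = Counter(g for occ, g in corpus if occ == occupation)
    genders = list(counts)
    weights = [counts[g] for g in genders]
    return Counter(random.choices(genders, weights=weights, k=k))

print(generate("doctor", corpus))  # roughly 75% "man": the skew is reproduced
print(generate("nurse", corpus))   # roughly 75% "woman"
```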

The mechanism is treacherous: the pre-existing collective imagination colonizes the AI imagination, which in turn reinforces it. "We risk finding all the familiar gender-based discrimination in the content, including videos and photos," says Scorza. A paradoxical leap backwards, one that could sweep away the achievements of recent decades. But the consequences could be even more serious. Girardi points to the systems used in the medical field. An example? "We know that heart attack symptoms differ greatly between men and women. But most AI systems are trained on the male human body, with the result that if a woman arrives at a hospital where such a system is used for triage, her symptoms may not be recognized. Any system that makes decisions automatically in sensitive areas could increase inequalities."
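A minimal sketch of the triage failure mode, with an entirely invented two-feature symptom encoding (chest pain intensity, nausea/fatigue intensity): a classifier trained only on male presentations assigns low risk to a presentation pattern more common in women. Real triage systems are far more complex; this only shows how a skewed training set blinds a model.

```python
# Hypothetical illustration: a classifier trained only on male presentations
# fails to flag a symptom pattern more typical of female heart attacks.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Invented features: [chest_pain, nausea_fatigue] intensity on a 0-1 scale.
male_mi     = rng.normal([0.9, 0.2], 0.1, (500, 2))  # male heart attacks
male_benign = rng.normal([0.2, 0.2], 0.1, (500, 2))  # male benign cases
X = np.vstack([male_mi, male_benign])
y = np.array([1] * 500 + [0] * 500)

triage = LogisticRegression().fit(X, y)

# A presentation with little chest pain but strong nausea/fatigue:
female_mi = [[0.3, 0.9]]
print(triage.predict_proba(female_mi)[0][1])  # low probability: case missed
```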


Where does the short circuit originate? Scorza and Girardi agree: we need to look at the source, at the working groups that produce the algorithms. "In the United States they are predominantly made up of men, and this influences the choices made in product development," observes the expert. "It is a little better in Europe, especially in the Nordic countries, but even there the share of women almost never exceeds 20%." That share must grow. Promoting STEM degrees among female students, of course, but for Girardi "we also need more role models. In the nineties there were TV series about lawyers, then about doctors. Today we should make the women of tech the protagonists."

New imagery to cure the old, stereotyped one. "The media bear a great responsibility," Scorza stresses, "which includes offering a balanced narrative on algorithms, one that is not polarized between catastrophists and enthusiasts but informs accurately about advantages and dangers." This must go hand in hand with upstream work to identify potential biases early. "The European AI Act," says the jurist, "introduces an impact assessment for every algorithmic solution, to guarantee its sustainability in ethical terms." But there is also the path of downstream intervention. "It can be done with synthetic data, balancing the dataset if it is incomplete or if it produces bias," says Girardi. "A system that generates middle-aged white men 90% of the time can be corrected."
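A minimal sketch of the rebalancing idea Girardi mentions, on a hypothetical feature matrix: the under-represented group is padded with jittered synthetic copies until the two groups are the same size. This simple noise-based oversampling stands in for more refined synthetic-data methods such as SMOTE from the imbalanced-learn library.

```python
# Hypothetical illustration: rebalancing a 90/10 skewed training set by
# generating jittered synthetic rows for the under-represented group.
import numpy as np

rng = np.random.default_rng(2)
# 900 examples of group A vs 100 of group B (the 90% skew in the article);
# 8 invented numeric features per example.
features_a = rng.normal(0.0, 1.0, (900, 8))
features_b = rng.normal(0.5, 1.0, (100, 8))

def oversample(minority, target_n, rng, noise=0.05):
    """Create synthetic rows by resampling the minority group with noise."""
    idx = rng.integers(0, len(minority), target_n - len(minority))
    synthetic = minority[idx] + rng.normal(0, noise, (len(idx), minority.shape[1]))
    return np.vstack([minority, synthetic])

features_b_balanced = oversample(features_b, 900, rng)
print(len(features_a), len(features_b_balanced))  # 900 900: dataset rebalanced
```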
