
To discover the analysis of emotions, one did not have to wait for AI, a TV series was enough

by Andrea Monti

With a Pavlovian reflex, every time a piece of software manages to mimic human abilities, the alarm of the Frankenstein syndrome goes off: the fear that the “creature” will rebel against its “master”, and the invocation, needless to say, of interventions by legislators, guarantors (not least, of course, the one for the protection of personal data) and other more or less secular divinities belonging to pantheons of various origins.

The latest iteration of the Frankenstein syndrome is what has been dubbed “emotion analysis”, associated with the other great fetish-taboo of our times, facial recognition, all obviously brought about by a threatening “artificial intelligence”.

You don’t need to be a neuroscience researcher or a psychologist; attending one of the many “effective communication” or “negotiation management” courses is enough to know that there are disciplines, such as “Neuro-Linguistic Programming”, that are based on “reading” bodily signals or, to put it in terms that appeal to alarmists, on the analysis of biometric data that reveal character or behavioral traits. Techniques of this kind are also used in some forms of brief psychotherapy, in coaching and even in sports training.

Likewise, you don’t need to specialize in criminology; an episode of Lie to Me is enough to know that the analysis of microexpressions, also based on the collection of biometric and anthropometric data, is used, albeit with some skepticism, in the police activities of various countries, with the United States in the lead. Whether all this is science, parascience or pseudoscience is another matter. In fact, we should ask ourselves whether we “moderns” are really so different from those who, during the Antonine Plague, listened to the quackery of Alessandro di Abonutico.

There is nothing to be surprised or scandalized about, therefore, if the researchers who deal with these disciplines are taking the inevitable next step: automating the collection, analysis and recognition of the data of interest (emotions, in this specific case).


Be that as it may, it is beyond question that the analysis of microexpressions, together with facial recognition that also identifies ethnic and somatic traits, are necessary elements for the management of security, the prevention of crimes and the identification, possibly in real time, of suspects. So much so that even the EU, yes, “the one of the GDPR”, started an experiment called iBorderCtrl, strongly oriented toward the use of facial recognition technologies for security purposes, whose outcome, however, is not known.

There is little that is scandalous or “politically incorrect” in all this: those who call the emergency numbers or file a complaint try to provide the operator with everything that can help identify the threat or the alleged perpetrator, and therefore clothing, height, weight, eye color, skin “color”, gender and so on. Try, then, to imagine a call to 112 or a complaint of this tenor: “the suspect is more or less tall, with more or less light, more or less long hair, more or less Euro/Afro/Asian, dressed in trousers/tunic/skirt”, and then ask yourself what the Volanti or the Gazzelle (the police and Carabinieri patrols) that have to intervene, or the prosecutor who has to investigate, are supposed to do with it.

Now, it is clear that these considerations do not mean that mass surveillance should go unchecked, only that the controls should concern “how” and “who controls”, not “whether”.

The use of more or less “intelligent” automated tools in police investigations and public safety activities is not a future prospect but a present reality. The analyses of the enormous amounts of data acquired in international investigations such as those into Encrochat and Sky ECC clearly pose problems of this kind, which translate first of all into questions about respect for the rights of the defense. Who acquired the data, and how? With which systems were they processed? What guarantees are there that the results are reliable? Why, even though the Italian Supreme Court has timidly raised some doubts on this point, can this information not be disclosed to defense counsel?


It is also quite naïve to complain that personal information is collected and stored in public security information systems. The database of police records, the “permanent files” of the Carabinieri (which la Repubblica covered in 2000 and with which the personal data protection authority then dealt in 2001, essentially to no effect) and, before that, the “Central Political Record” (available on the website of the Ministry of Cultural Heritage) demonstrate that intelligence work needs analysis tools that make it possible to extract “meaning” from information that, on its own, is of no use. If it were possible to effectively interconnect even the databases of the justice system (general civil and criminal registers, pending charges, criminal records, police records, the SDI and so on) and extract “meaning” from that information, the capacity for the prevention, investigation and repression of crimes would increase significantly.

This, however, would be primarily a political and not a technical choice.

The risk inherent in automatically attributing meaning to these data is that operators lazily rely on the results provided by the software, or that we lack the ability to understand what those results mean. It is a more than concrete danger when, as mentioned, the amount of information fed into the system cannot be managed by a human being. On the other hand, we must ask ourselves how the rights of the suspect, that is, potentially of any of us, would be guaranteed in the face of investigations whose scrutiny requires costly and, no small matter, competent experts. Finally, we would have to question the degree of actual freedom of the judge in pronouncing a sentence that will inevitably be conditioned by his or her ignorance of the “technicalities” on the basis of which the evidence was formed.


Faced with these problems, neglected or treated with annoyance by the supporters of “predictive justice” at all costs, the alarm about microexpressions and emotion analysis becomes a frankly marginal issue.

As always when information technology is involved, everyone looks at the finger and does not even consider the hypothesis that there may be a moon, and which one it is. In this case, in fact, the automation of emotion analysis necessarily means that it is used “in the open” and therefore, even with the limitations described above, its use can be analyzed and contested. On the contrary, when such techniques are applied directly by the interrogator, be it a magistrate or a police officer, it is impossible, unless you know them, to realize what is happening. Therefore, even if interrogation reports must formally acknowledge that no methods or techniques were used that aim to influence the freedom of self-determination or to alter the ability to remember and evaluate facts, in practice it would be very hard to prove the opposite when the person asking the questions, hypothetically, uses such expedients.

But if this is the point, then the real problem is not the use of a program (or of the infamous “algorithm”), but the clear attribution of responsibility for a choice that affects people’s freedom. The hypnotic, machinist fascination that pushes people to talk about “AI rights” and “legal personhood” for robots is a hypocritical mantra repeated over and over to absolve those who have to make a decision from the consequences of the exercise of the power conferred on them. And that does not just apply to Carol “computer says no” Beer, the archetypal Little Britain character.
