For anyone with a small child and a virtual assistant at home, it is not uncommon to hear the child address the electronic device as if it were another member of the family.
Children are thirsty for answers, and smart speakers never tire of finding the right one. Sometimes, however, they can get it wrong, and even inadvertently endanger their very young interlocutors.
It happened with Alexa, Amazon’s virtual assistant, which recommended a very dangerous game to a 10-year-old child.
When the boy asked her for “a challenge” to try, Alexa responded, as often happens when she is asked about topics not handled by Amazon’s editorial team, by proposing a third-party result fished from the web. In these cases, Alexa specifies the source the information comes from.
The “challenge” proposed to the 10-year-old, who obviously cannot understand how Alexa handles information, and who until a few days earlier may have been used to hearing her reassuring voice explain where Santa Claus was, was this: plug a phone charger into an electrical outlet, but only halfway, and then touch the exposed part of the plug with a coin.
What Alexa suggested, in this case, is a dangerous “challenge” made viral by TikTok in 2020, known as the “penny challenge” and already condemned by parents, social media users and law enforcement agencies: it carries the risk not only of electrocution but also of starting a house fire.
Kristin Livdhal posted on Twitter the dangerous response her son received from Alexa. The first comment under her tweet is from Amazon Help: “Sorry for what happened.”
Regarding the incident, which went viral, Amazon said: “Customer trust is at the heart of what we do, and Alexa is built to deliver accurate, relevant, and helpful information. As soon as we were notified of this error, we quickly corrected it. We will continue to improve our systems so that these results no longer appear in the future.”