
Amazon’s bad idea of letting the dead talk

by admin

Imagine coming home and talking to your voice assistant. You say something to it, and it answers in the voice of your parent, partner or child. But they are dead. And yet they speak, or rather, to put it more precisely: the artificial intelligence inside the voice assistant has taken possession of their voice to make them say things they never said, things they perhaps would never have said, and things they will in any case never be able to say, because they are dead.

Can you imagine a more cynical, macabre and stupid use of a technology? Apparently Amazon cannot, because they presented this feature with great fanfare at their annual event on the future, held in California. It was their scientist, Rohit Prasad, who announced it, saying, for example, that since many people have lost relatives to the coronavirus, they will now be able not only to hear their voices again but to pretend they can still have a conversation with them. Or, he added, a grandchild will be able to hear a newly deceased grandmother finish the story she had left half-read. Among young people on social media there is a word for things like this: creepy. Horrifying.

How it works is quite simple, the Amazon scientist explained, and it rests on an application of artificial intelligence: it is enough to provide a recording of someone’s speech lasting just a few minutes (even the voice messages they sent us while they were alive), and Alexa, Amazon’s voice assistant, will adopt that voice from then on. What time is it, what’s the weather like, can you give me the couscous recipe, what’s the latest news? The dear departed answers you.


This raises a lot of questions: is it legal? Who owns the voice in those recordings after a person dies? And even assuming someone grants the rights to their voice before dying, doesn’t this mean a criminal could perfectly simulate someone else’s voice to commit a crime? More generally: exactly what serious problem do those who believe that in the near future we will want our voice assistants to speak with the voices of our deceased loved ones think they are solving?

They call it human-like empathy, the attempt to build machines with an empathy similar to that of human beings, but here it is humanity that seems to have fallen into a black hole. The only hope lies in what happened in the room when the scientist announced it: dead silence.
