Alexa can mimic the voices of the dead

Alexa's newest skill is perhaps its most disturbing yet: the ability to mimic any person's voice after listening to about a minute of sample audio. To demonstrate the feature, Rohit Prasad, senior vice president of the team behind Amazon's voice assistant, played a video in which Alexa reads a bedtime story to a child in the voice of his grandmother, who had died shortly before. The demo brings to the fore the age-old ethical question of how far technology should go in trying to keep those who are no longer alive with us, in some form.

At Amazon's re:MARS conference in Las Vegas, the company previewed a raft of upcoming Alexa features, and the one that grabbed the headlines is undoubtedly the ability to imitate voices. According to a company spokesperson quoted by Engadget, training requires only about a minute of audio from the person being replicated to produce convincing results, where previous approaches required hours and hours of listening. It is a very fast system, in other words, one that calibrates the text-to-speech output by modulating its tones to reproduce the most disparate timbres. "We live in a golden age of artificial intelligence," Prasad commented, "and while AI cannot eliminate the pain of a loss, it can certainly make the memories last." One can only agree with him, even if there are two essential points to underline.
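Amazon has not disclosed how the feature works under the hood, but this kind of few-shot voice cloning already exists in open-source tools. As a purely illustrative sketch, the snippet below uses the open-source Coqui TTS library and its YourTTS model, which can synthesize speech in a new voice from a short reference clip; the file names are hypothetical and the output is far cruder than what Amazon demonstrated.

```python
# Illustrative only: zero-shot voice cloning with the open-source Coqui TTS
# library (pip install TTS). Amazon has not published Alexa's actual method.
from TTS.api import TTS

# YourTTS is a multilingual model that can clone a voice from a reference clip.
tts = TTS(model_name="tts_models/multilingual/multi-dataset/your_tts")

# Synthesize new speech in the voice captured by a short sample recording.
# "grandmother_sample.wav" stands in for a hypothetical ~1-minute reference clip.
tts.tts_to_file(
    text="Once upon a time, in a land far away...",
    speaker_wav="grandmother_sample.wav",
    language="en",
    file_path="bedtime_story.wav",
)
```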

The first point concerns ethics: imitating a voice, any voice, amounts to an audio-level deepfake, reproducing the timbre of a person who may never have consented to the procedure. That applies to the living, whose voices could be cloned from any short video or audio clip pulled off the web, but also to the dead. As in the case of the widely reported virtual embrace between a mother and her deceased daughter, representing the departed could cause serious harm, especially during the grieving process.

The second point is that the feature could accelerate existing scams that already use voice-imitation technology for illicit ends, such as last year's case in which over 30 million euros was transferred from a UAE bank to criminals who had impersonated an executive by cloning his voice.