Just in case we didn’t scar you enough with our recent story about a baby who thinks his mom and Alexa are the same, Amazon has announced a new feature that is just as horrifying.

Miss the soft and sweet sound of your departed grandmother’s caring voice? Just have the audio of her voice transplanted into your virtual slave, Alexa!

…

I mean this in all seriousness: who the heck asked for this?

This might be the most horrifyingly dystopian idea I have heard of in a while.

And that’s saying a lot for this day and age.

Amazon’s Alexa might soon replicate the voice of family members – even if they’re dead.

The capability, unveiled at Amazon’s Re:Mars conference in Las Vegas, is in development and would allow the virtual assistant to mimic the voice of a specific person based on less than a minute of provided recording.

I can see this being fun or useful if you wanted to steal Morgan Freeman’s or James Earl Jones’s voice for Alexa.

Or it could be used for comedic purposes. Like if it were Gilbert Gottfried’s or Jason Alexander’s voice.

Or Ben Shapiro’s (that would actually be hilarious… set that sucker to 3x speed and let’s GOOOO).

But the purpose behind this is to make Alexa more personable and to get people to disclose more to the device that is hooked up to Jeff Bezos’s money-making machine.

Rohit Prasad, senior vice president and head scientist for Alexa, said at the event Wednesday that the goal behind the feature was to build greater trust in users’ interactions with Alexa by adding more “human attributes of empathy and affect.”

“These attributes have become even more important during the ongoing pandemic when so many of us have lost ones that we love,” Prasad said. “While AI can’t eliminate that pain of loss, it can definitely make their memories last.”

In a video played by Amazon at the event, a young child asks “Alexa, can Grandma finish reading me the Wizard of Oz?” Alexa then acknowledges the request, and switches to another voice mimicking the child’s grandmother. The voice assistant then continues to read the book in that same voice.

Alexa can’t ease the pain of loss, but it can make sure you can hear it imitate the voice of your lost loved ones in a creepy robotic way.

To create the feature, Prasad said the company had to learn how to make a “high-quality voice” from a shorter recording, as opposed to hours of recording in a studio. Amazon did not provide further details about the feature, which is bound to spark more privacy concerns and ethical questions about consent.

I’m sorry, but this is beyond creepy and I won’t allow my FBI/Amazon listening device to talk to me like a dead relative.

I’ll just talk to her like a normal generic robot.
