Amazon’s Alexa Could Soon Mimic Voice Of Dead Relatives

(AP) Amazon’s Alexa might soon replicate the voice of family members, even if they’re dead.

The capability, unveiled at Amazon’s Re:Mars conference in Las Vegas, is in development and would allow the virtual assistant to mimic the voice of a specific person based on less than a minute of provided recording.

Rohit Prasad, senior vice president and head scientist for Alexa, said at the event Wednesday that the desire behind the feature was to build greater trust in the interactions users have with Alexa by adding more “human attributes of empathy and affect.”

“These attributes have become even more important during the ongoing pandemic when so many of us have lost ones that we love,” Prasad said. “While AI can’t eliminate that pain of loss, it can definitely make their memories last.”

In a video played by Amazon at the event, a young child asks, “Alexa, can Grandma finish reading me the Wizard of Oz?” Alexa then acknowledges the request and switches to another voice mimicking the child’s grandmother. The voice assistant then continues to read the book in that same voice.

To create the feature, Prasad said the company had to learn how to make a “high-quality voice” from a shorter recording, as opposed to hours of recording in a studio. Amazon did not provide further details about the feature, which is bound to spark more privacy concerns and ethical questions about consent.

Amazon’s push comes as competitor Microsoft said earlier this week that it was scaling back its synthetic voice offerings and setting stricter guidelines to “ensure the active participation of the speaker” whose voice is recreated. Microsoft said Tuesday it is limiting which customers get to use the service, while also continuing to highlight acceptable uses such as an interactive Bugs Bunny character at AT&T stores.

“This technology has exciting potential in education, accessibility and entertainment, and yet it is also easy to imagine how it could be used to inappropriately impersonate speakers and deceive listeners,” said a blog post from Natasha Crampton, who heads Microsoft’s AI ethics division.
