Alexa also reportedly spoke to users about sex acts. © Grant Hindsley/AFP/Getty Images
By Kashmira Gander, Newsweek
Amazon’s virtual personal assistant Alexa told a customer to “kill your foster parents,” according to a Reuters investigation.
According to Reuters, the unidentified customer who heard Alexa issue the command last year described the incident as “a whole new level of creepy” in a scathing review on Amazon’s website.
A separate unnamed source at Amazon told Reuters the device was quoting from Reddit. The website, known as the front page of the internet, is famed for its online forums on an expansive range of topics, some darker than others.
In another case, Alexa spoke to users about what Reuters described only as “sex acts,” as well as dog mess.
Amazon did not immediately respond to a request for comment.
The reports are the latest hiccups Amazon has faced as it tries to make its personal assistant interact with users as seamlessly as possible. The retail giant is currently testing ways to make Alexa engage in human-like conversations via machine learning, with mixed results.
Success could make the devices ubiquitous in homes and help the retail giant fend off competitors such as Google Home.
In March, some customers reported that Alexa was laughing eerily, unprompted.
One user wrote on Twitter at the time: “I didn’t even know she laughed. Needless to say I unplugged my Echo and it is now sitting in the bottom of [a] box—away from me.”
Another wrote that he was “lying in bed about to fall asleep when Alexa on my Amazon Echo Dot lets out a very loud and creepy laugh.”
He joked: “There’s a good chance I get murdered tonight.”
This followed an incident in which a U.S. couple said their Echo smart speaker recorded a private conversation and sent the audio file to one of the husband’s employees.
The woman, identified only as Danielle, told the Kiro7 news website: “I felt invaded. A total privacy invasion. Immediately I said, ‘I’m never plugging that device in again, because I can’t trust it.’”
The incident appeared to bypass the “wake word” programming of virtual personal assistants like Alexa and Google Home, which kick into action only after hearing their respective names.
In a statement to Recode at the time, Amazon explained that the Echo speaker had picked up on a word resembling “Alexa” and interpreted another part of the conversation as a command to send it to a contact.
Amazon said: “As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
And last year, an Alexa user filmed herself asking the device whether it was “connected to the CIA.” Alexa didn’t respond.
At the time, Amazon released a statement in an attempt to reassure customers, describing the incident as a “technical glitch.” Since then, Alexa answers that she works for Amazon.