Alexa Recorded Conversation Unbidden, Shared It

An Amazon Alexa home assistant recorded a conversation taking place in the room and then sent it to a third party, without being told to do either. Danielle, in Portland, says she received a call from one of her husband's employees in Seattle, who told her that their Alexa device had been hacked.

"We unplugged all of them and he proceeded to tell us that he had received audio files of recordings from inside our house," she said. "At first, my husband was, like, 'no you didn't!' And the (recipient of the message) said 'You sat there talking about hardwood floors.' And we said, 'oh gosh, you really did hear us.'"

Danielle listened to the conversation when it was sent back to her, and she couldn't believe someone 176 miles away heard it too.

"I felt invaded," she said. "A total privacy invasion. Immediately I said, 'I'm never plugging that device in again, because I can't trust it.'"

An Amazon technician confirmed that Alexa had done exactly what the family suspected, though the device hadn't been hacked. The company later responded with an explanation.

In a statement Thursday, Amazon confirmed the woman’s private conversation had been inadvertently recorded and sent. The company said the device interpreted a word in the background conversation as “Alexa” — a command that makes it wake up — and then it interpreted the conversation as a “send message” request.

“At which point, Alexa said out loud ‘To whom?’” the statement said. “At which point, the background conversation was interpreted as a name in the customer’s contact list.”

Amazon called it an “unlikely” string of events, but if a device can be woken by background conversation and interpret (or misinterpret) that conversation as commands, the sky is the limit as to what it may do. -via Boing Boing
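
To picture the chain of events Amazon describes, here is a minimal, hypothetical sketch of that kind of pipeline: a loose wake-word check, a loose command parse, a spoken confirmation prompt, and a fuzzy contact match. None of this is Amazon's actual code; the function names, keywords, and contact list are invented purely to show how ordinary background chatter can slip through every stage.

```python
from __future__ import annotations

# Hypothetical sketch of a wake word -> command -> confirmation -> contact
# pipeline. Names, keywords, and thresholds are invented for illustration.

CONTACTS = ["Alex in Seattle", "Mom", "Plumber"]

def heard_wake_word(utterance: str) -> bool:
    # A real device uses an acoustic model; here any word that merely
    # resembles "Alexa" counts, which is exactly how false wakes happen.
    return any(word.lower().startswith("alex") for word in utterance.split())

def parse_command(utterance: str) -> str | None:
    # Loose keyword matching: background speech containing "send" and
    # "message" is treated as a send-message request.
    text = utterance.lower()
    if "send" in text and "message" in text:
        return "send_message"
    return None

def resolve_contact(utterance: str) -> str | None:
    # Any contact whose first name appears in the reply wins.
    text = utterance.lower()
    for contact in CONTACTS:
        if contact.split()[0].lower() in text:
            return contact
    return None

def assistant_step(background_speech: list[str]) -> None:
    """Walk the pipeline using successive snippets of room conversation."""
    snippets = iter(background_speech)
    for utterance in snippets:
        if not heard_wake_word(utterance):
            continue
        if parse_command(next(snippets, "")) != "send_message":
            return
        print("Alexa: To whom?")
        contact = resolve_contact(next(snippets, ""))
        if contact:
            print(f"Alexa: Sending your message to {contact}.")
        return

# Overheard chat about hardwood floors trips every stage of the pipeline.
assistant_step([
    "Alex said the hardwood floors look great",  # misheard as the wake word
    "we should send them a message about it",    # misread as a command
    "Alex thought so too",                       # misread as a contact name
])
```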

(Image credit: Flickr user methodshop .com)


Newest 5 Comments

Even worse are direct and undetectable commands hidden in audio:
https://www.nytimes.com/2018/05/10/technology/alexa-siri-hidden-command-audio-attacks.html
"A group of students from University of California, Berkeley, and Georgetown University showed in 2016 that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website.

This month, some of those Berkeley researchers published a research paper that went further, saying they could embed commands directly into recordings of music or spoken text."
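
The commenter is describing adversarial audio: a quiet perturbation is folded into ordinary sound so that listeners notice little, while a speech recognizer can still be steered by it. The toy sketch below, with made-up tones and amplitudes, shows only the mixing step; the Berkeley researchers' actual attack optimizes the perturbation against a specific recognition model, which this does not attempt.

```python
# Toy illustration of hiding one audio signal inside another: mix a
# low-amplitude "command" waveform into a louder carrier (music or noise).
# All signals and parameters here are invented stand-ins.
import numpy as np

SAMPLE_RATE = 16_000  # samples per second, a common rate for speech models

def tone(freq_hz: float, seconds: float, amplitude: float) -> np.ndarray:
    """Generate a sine tone as a stand-in for a recorded waveform."""
    t = np.arange(int(seconds * SAMPLE_RATE)) / SAMPLE_RATE
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

# A loud "carrier" (what a person hears) and a much quieter "command"
# (what an attacker hopes the recognizer picks up).
carrier = tone(440.0, seconds=2.0, amplitude=0.8)
hidden_command = tone(1200.0, seconds=2.0, amplitude=0.02)  # ~32 dB quieter

mixed = np.clip(carrier + hidden_command, -1.0, 1.0)

# The perturbation barely changes the overall loudness a listener perceives...
rms_change_db = 20 * np.log10(np.sqrt(np.mean(mixed**2)) /
                              np.sqrt(np.mean(carrier**2)))
print(f"RMS change from the hidden signal: {rms_change_db:+.3f} dB")
# ...yet a recognizer analyzing the spectrum still "sees" the extra component.
```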