Since the rise of Amazon’s Alexa, tech enthusiasts have speculated that the device secretly listens to private conversations. Tech experts often laughed off those claims, but it turns out the artificial intelligence device really does listen in on people’s personal exchanges.
A Portland, Ore., family discovered their Amazon Alexa had recorded a private conversation and sent it to a person in Seattle on the family’s contact list.
“My husband and I would joke and say, ‘I’d bet these devices are listening to what we’re saying,'” said Danielle, whose last name was withheld by KIRO-TV.
Two weeks ago, the family received a disturbing phone call from one of Danielle’s husband’s employees in Seattle.
“The person on the other line said, ‘Unplug your Alexa devices right now,'” Danielle said, according to KIRO-TV. “‘You’re being hacked.'” (RELATED: ACLU: Amazon Is Selling Facial Recognition System To Local Law Enforcement)
“We unplugged all of them, and he proceeded to tell us that he had received audio files of recordings from inside our house,” Danielle said. “At first, my husband was, like, ‘No you didn’t!’ And [her husband’s employee in Seattle] said ‘You sat there talking about hardwood floors.’ And we said, ‘Oh gosh, you really did hear us.'”
The family owns several Amazon devices — all of which use the AI-powered virtual assistant, Alexa. Alexa starts recording conversations when it hears a “wake word,” which is usually “Alexa,” but it can be customized.
“When your device detects the wake word, it streams audio to the Cloud in order to process your request and return a response,” an Amazon spokesman said, according to an April 14 report from The Daily Caller News Foundation.
The information that’s recorded is placed in a digital log, which the device’s owner can access and delete.
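The wake-word gating Amazon describes can be illustrated with a minimal sketch. This is purely hypothetical code, not Amazon’s implementation: the audio stream is modeled as a sequence of words, and the `gate_stream` function and `WAKE_WORD` constant are invented for illustration. The point is that nothing is meant to be forwarded until the wake word is heard.

```python
# Hypothetical sketch of wake-word gating. Real devices process raw audio
# on-device; here speech is modeled as a simple stream of words.
WAKE_WORD = "alexa"  # default wake word; the article notes it can be customized

def gate_stream(words, wake_word=WAKE_WORD):
    """Yield only the words spoken after the wake word is detected."""
    awake = False
    for w in words:
        if awake:
            yield w  # post-wake-word audio would be streamed to the cloud
        elif w.lower() == wake_word:
            awake = True  # wake word itself triggers streaming but isn't sent

heard = ["talking", "about", "floors", "alexa", "what", "time", "is", "it"]
print(list(gate_stream(heard)))  # → ['what', 'time', 'is', 'it']
```

The failure mode in this story follows directly: if background speech merely *sounds like* the wake word, the gate opens and everything after it is treated as a request.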
“Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future,” Amazon said to KIRO-TV.
The company apologized for the incident, Danielle said.
“I felt invaded,” Danielle told KIRO-TV. “A total privacy invasion. Immediately I said, ‘I’m never plugging that device in again, because I can’t trust it.'”
Amazon refused to refund her for the Alexa, Danielle added.
Despite Danielle’s claims, Amazon seemingly rationalized it all away:
“Echo woke up due to a word in background conversation sounding like ‘Alexa.’ Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right.’ As unlikely as this string of events is, we are evaluating options to make this case even less likely,” according to an Amazon spokesperson’s claim.
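The chain of coincidences Amazon describes can be sketched as a simple staged dialogue, where each stage matches background conversation against what the assistant expects next. This is an illustrative sketch only; the function name `message_flow` and the matching logic are assumptions, not Amazon’s actual dialogue system.

```python
# Hypothetical sketch of the four-stage failure chain the spokesperson described.
def message_flow(utterances, contacts):
    """Walk the send-message dialogue; return the contact a message would be
    sent to, or None if any stage fails to match."""
    stages = iter(utterances)
    if next(stages, "") != "alexa":              # 1. speech misheard as wake word
        return None
    if "send message" not in next(stages, ""):   # 2. speech misheard as a request
        return None
    name = next(stages, "")                      # 3. speech misheard as a contact name
    if name not in contacts:
        return None
    if "right" not in next(stages, ""):          # 4. speech misheard as confirmation
        return None
    return name

# Every stage has to coincidentally match for the recording to be sent.
background = ["alexa", "send message", "bob", "right"]
print(message_flow(background, {"bob", "alice"}))  # → bob
```

Seen this way, the incident required four consecutive misrecognitions, which is why Amazon characterized the sequence as extremely unlikely while still pledging to make it less likely.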
This story has been updated to reflect a claim from an Amazon spokesperson at 8:52PM EDT May 24, 2018.
Send tips to email@example.com
Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact firstname.lastname@example.org.