Amazon's Alexa recording a couple's conversation without consent is a pretty good reason to freak out
The voice-based assistant is facing the music for violating its users' personal space.
The future, it's said, is all about AI-powered voice assistants. Digital beings that, at the drop of a wake word, would help answer queries and make our lives easier. Though we're still quite far from this promised future, a glimpse of what lies ahead can be seen in Google Assistant and Amazon's Alexa.
With recent updates, both these voice-controlled AIs have gained functionalities that could soon make them central to our modern way of life, and in the process, bring to us, in 2018, the promised future. But are we ready for the leap? Probably not.
According to a report by KIRO 7, a Seattle-based TV channel, a Portland woman identified as Danielle has alleged that Amazon's AI assistant Alexa recorded a private conversation between her and her husband and then sent it to a contact without their knowledge. And all of this without the device being woken up by the magic word, "Alexa," which Amazon insists is absolutely essential for Alexa to initiate such a task.
Danielle, who had every room in her house hooked up to Alexa, was alerted to the leak by the recipient of the message, one of her husband's employees, who called her home to warn her that her smart speaker had been "hacked".
As the report goes on to add, Danielle listened to the recording when it was sent back to her, and her reaction was as expected. "I felt invaded," she said. "A total privacy invasion. Immediately I said, 'I'm never plugging that device in again, because I can't trust it.'"
The incident brings to life a prime fear that many of us have about smart speakers and their ever-listening voice-based assistants: a fear that these machines are spying on us, or worse, being used by a third party to keep an eye on us and record us in our most vulnerable and intimate moments.
How did it happen?
As per the report, upon learning of the recording, Danielle unplugged all her Alexa-based devices and called Amazon to investigate the matter. Though Amazon's engineers did not explain how exactly the breach took place, or whether it is a prevailing issue with Alexa-based smart speakers, they did confirm the incident.
"They said our engineers went through your logs, and they saw exactly what you told us, they saw exactly what you said happened, and we're sorry... He told us that the device just guessed what we were saying... He apologised like 15 times in a matter of 30 minutes and he said we really appreciate you bringing this to our attention, this is something we need to fix!"
In an official statement to The Verge, Amazon confirmed the incident but tried to play down the issue.
"Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right'. As unlikely as this string of events is, we are evaluating options to make this case even less likely."
Not the first time
As digital security firm Symantec notes in a white paper it published earlier this year, the always-listening nature of today's smart speakers means that everything you say in their vicinity is at risk of being sent to the backend servers of these tech giants.
As per policy, these smart speakers only listen to, record and send to their servers conversations held with them after they have been woken up. However, there have been cases where a glitch or a faulty device was found to record more than it should.
Case in point: a journalist who was given a Google Home Mini ahead of its general release date discovered that the device was making recordings even when he hadn't summoned the Google Assistant.
Much like today's case involving Amazon Alexa, Google, too, played down the issue. The tech giant called it a hardware problem, caused by the activation button on the device registering "phantom touches" and activating by itself. The issue was later fixed through a software update, but not before it exposed how great a privacy concern such devices pose to users.