Google Home, Amazon Alexa can be manipulated to run malicious software, researchers find
It wouldn’t be the first time security researchers have found a vulnerability in smart devices that lets hackers infiltrate home networks. Unless Google and Amazon address the newly detected techniques for abusing their personal assistants, hackers may be in for a treat.
Voice-squatting and voice-masquerading attacks are the latest hacking techniques devised by a group of Chinese and American researchers from Indiana University Bloomington, the University of Virginia and the Chinese Academy of Sciences to eavesdrop on the smart assistants Google Home and Amazon Echo.
“We have collected real-world conversations between users and our Alexa skills through the five skills we published for a month,” reads the paper. “The dataset includes 21,308 user commands made by 2,699 users with corresponding response from our skills. The dataset will be released once the amendment of IRB protocol is approved.”
The group tested a number of simple phrases that sound similar to the names of legitimate skills to check whether the assistants would route the request to the wrong one.
For example, the attack skills “intraMatic opener” and “rap game” sound like the target skills “Entrematic Opener” and “rat game.” The proof of concept confirms the assistants could not always tell voice commands apart, especially when the names sound very much alike. Hackers could use this technique to trick users into invoking malicious software without their knowledge.
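The confusion comes down to near-identical pronunciations. As a rough illustration only (the researchers' actual attack exploits phonetic similarity in speech recognition, not text matching), a plain string-similarity score already shows how close the paper's example skill names are; the threshold and helper below are hypothetical:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Case-insensitive string similarity in [0, 1] (crude proxy for sound-alikeness)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Attack/target skill-name pairs taken from the paper's examples.
pairs = [
    ("intraMatic opener", "Entrematic Opener"),
    ("rap game", "rat game"),
]

for attack, target in pairs:
    score = similarity(attack, target)
    # Both pairs score well above an illustrative 0.8 cutoff, hinting at
    # why an assistant might resolve the spoken name to the wrong skill.
    print(f"{attack!r} vs {target!r}: {score:.2f}")
```

Real phonetic confusion is harder to quantify than edit distance, which is precisely why the assistants' name-matching can be squatted.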
The team contacted Google and Amazon to present their research. Both companies acknowledged the findings and are working with the researchers to fix the security risks.
“Nice catch! I’ve filed a bug based on your report,” Google replied. “The panel will evaluate it at the next VRP (Vulnerability Reward Program) panel meeting and we’ll update you once we’ve got more information. All you need to do now is wait. If you don’t hear back from us in 2-3 weeks or have additional information about the vulnerability, let us know!”
“Thank you for reporting the security issues to us, we are able to reproduce them, and we are currently investigating fixes and mitigations,” said Amazon. “If you could share the mitigations you mention, that would be helpful in our evaluation. We’ll update you again when have further information to share. Thanks.”