Security News: Voice Squatting Attacks Impact Amazon Alexa and Google Home Assistants

Thread author: LASER_oneXM
A team of Chinese and US researchers has discovered a new method of attacking smart assistants like Amazon Alexa and Google Home, which they have named "voice squatting."


The academics described their technique in a recently published research paper, together with another attack method named voice masquerading.


The idea is to trick the user into opening a malicious app by registering voice triggers similar to those of authentic apps, then use the malicious app to either phish users for sensitive data or eavesdrop on their surroundings.

The Voice Squatting Attack

The first of these attacks is called "voice squatting" and relies on similarities between the voice commands that trigger specific actions.


The academics discovered that they could register voice assistant apps (called "skills" by Amazon and "actions" by Google) that trigger on phrases very similar to those of legitimate apps.


For example, an attacker can register an app that triggers on the phrase "open capital won," which is phonetically identical to "open capital one," the command for opening the Capital One home banking app for voice assistants.
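The collision above happens because the assistant routes a request to whichever registered invocation phrase best matches the speech-to-text transcript. As a rough illustration of that matching problem (this is not the researchers' tooling, and the candidate phrases below are assumptions based on the article's example), a naive string-similarity check already shows how close a squatted phrase can sit to the real one:

```python
# Illustrative sketch: measure how confusable a squatted invocation phrase
# is with a legitimate one, using plain text similarity on the transcripts.
# Real assistants use far more sophisticated speech/NLU matching; this only
# demonstrates why homophones like "won"/"one" are hard to tell apart.
from difflib import SequenceMatcher

LEGITIMATE = "open capital one"          # real skill's invocation phrase
CANDIDATES = [
    "open capital won",                  # homophone of the real phrase
    "open capitol one",                  # near-homophone spelling variant
    "open pizza hut",                    # unrelated phrase, for contrast
]

def similarity(a: str, b: str) -> float:
    """Return a ratio in [0, 1] describing how closely two phrases match."""
    return SequenceMatcher(None, a, b).ratio()

for phrase in CANDIDATES:
    score = similarity(LEGITIMATE, phrase)
    label = "CONFUSABLE" if score > 0.85 else "distinct"
    print(f"{phrase!r}: {score:.2f} ({label})")
```

Running the sketch, the homophone variants score far higher than the unrelated phrase, which is exactly the gap voice squatting exploits: the attacker only needs the assistant's transcript of the malicious phrase to land closer to their skill than to the genuine one.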
 
