A team of Chinese and US researchers has discovered a new method of attacking smart assistants like Amazon Alexa and Google Home, which they have named "voice squatting."
The academics described their technique in a recently published research paper, alongside another attack method they named "voice masquerading."
The idea is to trick the user into opening a malicious app by registering voice triggers similar to those of authentic apps, then using the malicious app to phish users for sensitive data or eavesdrop on their surroundings.
The Voice Squatting Attack
The first of these attacks is called "voice squatting" and relies on similarities between the voice commands that trigger specific actions.
Academics discovered that they could register voice assistant apps (called "skills" by Amazon and "actions" by Google) that trigger on very similar phrases.
For example, an attacker can register an app that triggers on the phrase "open capital won," which is phonetically similar to "open capital one," the command for opening Capital One's home banking app for voice assistants.
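The core of the attack is that two invocation phrases can be close enough for a speech recognizer to confuse them. As a rough illustration (not the researchers' actual tooling, which works on phonetic and speech-recognition similarity rather than plain text), a simple string-similarity check already shows how close a squatted phrase can sit to the legitimate one:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a rough 0..1 similarity score between two invocation phrases.

    Note: real voice squatting exploits how the assistant's speech
    recognizer transcribes audio; text similarity is only a crude proxy.
    """
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

legitimate = "open capital one"  # genuine skill invocation
squatted = "open capital won"    # hypothetical attacker-registered look-alike

score = similarity(legitimate, squatted)
print(f"similarity: {score:.2f}")
```

A score this close to 1.0 suggests why the assistant's intent matcher, working from an imperfect transcription, can route the user to the attacker's skill instead of the genuine one.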