- Aug 17, 2014
- The ecosystem of voice assistants is so sprawling and intrusive that user privacy hangs by a thread.
- Researchers found a multitude of files revealing sensitive user information stored on local devices.
- Companion apps on Android are another point of concern, as they request several dangerous permissions from the user.
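On Android, "dangerous" permissions guard sensitive user data and must be granted by the user at runtime; an app declares them in its manifest. As a hypothetical illustration (not an excerpt from the report), a voice-assistant companion app might declare permissions like these:

```xml
<!-- Hypothetical excerpt from a companion app's AndroidManifest.xml.
     All three permissions below are in Android's "dangerous"
     protection level, so the user must approve each at runtime. -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.READ_CONTACTS" />
```

Each granted permission widens the app's reach into the microphone, location history, or contact list, which is why the researchers flag these requests as a point of concern.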
A team of Greek researchers decided to dive deeper and discover what sensitive data could be hiding in the ecosystems of three popular voice assistants: Amazon’s Alexa, Google’s Assistant, and Microsoft’s Cortana. According to their detailed report, there are numerous reasons to be wary if you’re using one.
The team set up a testbed, outlined in the diagram below, to figure out what data is collected from the user, where it’s stored, how well it’s protected against unauthorized access, and what else can be deduced from it about the user. The accounts were created on the corresponding web interfaces. Then, the team used a smartphone, a laptop, and a Raspberry Pi to interact with the voice assistants so as to get a complete picture of what happens across each ecosystem. The commands covered everything from listening to music to adding calendar entries, and from opening private images stored locally on the device to creating to-do lists.