Popular Voice Assistant Companion Apps Are a Hazard for User Privacy

silversurfer

  • The voice assistant ecosystem is so sprawling and intrusive that user privacy hangs by a thread.
  • Researchers found a multitude of artifacts revealing sensitive user information stored on local devices.
  • Companion apps on Android are another point of concern, as they ask the user to grant several dangerous permissions (see the sketch after this list).
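To make that last bullet concrete, here is a minimal Python sketch that lists the permissions each companion app requests on a connected Android device and flags the ones in Android’s dangerous (runtime) class. The package names and the permission subset are assumptions for illustration, not taken from the researchers’ report; adjust them for your own device and region.

```python
#!/usr/bin/env python3
"""Sketch: list the permissions each companion app requests on a connected
Android device and flag the ones in Android's "dangerous" (runtime) class.
Requires adb and a connected test device or emulator."""

import subprocess

# Assumed package names for the three companion apps -- adjust if they
# differ on your device or in your region.
PACKAGES = {
    "Alexa": "com.amazon.dee.app",
    "Google": "com.google.android.googlequicksearchbox",
    "Cortana": "com.microsoft.cortana",
}

# A non-exhaustive subset of Android's dangerous (runtime) permissions.
DANGEROUS = {
    "android.permission.RECORD_AUDIO",
    "android.permission.CAMERA",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.ACCESS_COARSE_LOCATION",
    "android.permission.READ_CONTACTS",
    "android.permission.READ_CALENDAR",
    "android.permission.READ_EXTERNAL_STORAGE",
    "android.permission.READ_SMS",
}


def requested_permissions(package: str) -> set:
    """Parse `adb shell dumpsys package <pkg>` for the requested-permissions block."""
    out = subprocess.run(
        ["adb", "shell", "dumpsys", "package", package],
        capture_output=True, text=True, check=True,
    ).stdout
    perms, in_block = set(), False
    for line in out.splitlines():
        stripped = line.strip()
        if stripped == "requested permissions:":
            in_block = True
        elif in_block:
            if not stripped or stripped.endswith(":"):
                in_block = False  # the next section header ends the block
            else:
                perms.add(stripped.split(":")[0])
    return perms


if __name__ == "__main__":
    for name, pkg in PACKAGES.items():
        try:
            perms = requested_permissions(pkg)
        except (subprocess.CalledProcessError, FileNotFoundError):
            print(f"{name}: adb not available or package query failed")
            continue
        flagged = sorted(perms & DANGEROUS)
        print(f"{name} ({pkg}): {len(perms)} requested, {len(flagged)} dangerous")
        for p in flagged:
            print(f"  - {p}")
```

The dumpsys parsing is deliberately naive; an on-device PackageManager check would be more robust, but this keeps the sketch dependency-free.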
A team of Greek researchers decided to dig deeper into what sensitive data could be hiding in the ecosystems of three popular voice assistants: Amazon’s Alexa, Google’s Assistant, and Microsoft’s Cortana. According to their detailed report, there are numerous reasons to be wary if you’re using one.

The team set up the testbed outlined in a diagram in their report, aiming to figure out what data is received from the user, where it is stored, how well it is protected against unauthorized access, and what else can be deduced from it about the user. The accounts were created on the corresponding web interfaces. The team then used a smartphone, a laptop, and a Raspberry Pi to interact with the voice assistants, so as to get a complete picture of what happens across the ecosystem. The commands used covered everything from listening to music to adding items to the calendar, and from opening private images stored locally on the device to creating to-do lists.
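To get a feel for the "what is stored locally and how well is it protected" part of that question, the sketch below scans a pulled copy of a companion app’s data directory for strings that look like sensitive artifacts. It assumes you have already copied the directory from a rooted test device (for example with adb pull /data/data/<package> ./dump); the regexes are illustrative guesses, not the researchers’ actual tooling.

```python
#!/usr/bin/env python3
"""Sketch: scan a pulled copy of a companion app's data directory for strings
that look like sensitive user artifacts (e-mail addresses, GPS-style coordinate
pairs, timestamps). Purely illustrative."""

import re
import sys
from pathlib import Path

# Rough regexes for artifact types worth a second look.
PATTERNS = {
    "email address": re.compile(rb"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "coordinate pair": re.compile(rb"-?\d{1,3}\.\d{4,}\s*,\s*-?\d{1,3}\.\d{4,}"),
    "ISO timestamp": re.compile(rb"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}"),
}


def scan(root: Path) -> None:
    """Walk every file under root and report pattern matches."""
    for path in sorted(root.rglob("*")):
        if not path.is_file():
            continue
        data = path.read_bytes()
        for label, pattern in PATTERNS.items():
            hits = pattern.findall(data)
            if hits:
                sample = hits[0][:60].decode("utf-8", "replace")
                print(f"{path}: {len(hits)} {label} hit(s), e.g. {sample!r}")


if __name__ == "__main__":
    # Default to ./dump, e.g. after: adb pull /data/data/<package> ./dump
    scan(Path(sys.argv[1]) if len(sys.argv) > 1 else Path("./dump"))
```

In practice, SQLite databases and cached JSON responses are the most likely files to surface such strings, so hits there are a sensible starting point for manual inspection.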
