Spawn

Administrator
Verified
Staff member
[..] according to The Guardian’s whistleblower, we are learning that nothing has changed.
Revealing his identity for the first time, Thomas le Bonniec, a contractor employed to listen to and grade Siri recordings, has written to data protection regulators stating that Apple “keeps ignoring and violating fundamental rights and continues their massive collection of data.”

Describing his role in detail, he writes:

“I listened to hundreds of recordings every day, from various Apple devices (eg. iPhones, Apple Watches, or iPads). These recordings were often taken outside of any activation of Siri, eg in the absence of any actual intention from the user to activate it for a request. These processings were made without users being aware of it, and were gathered into datasets to correct the transcription of the recording made by the device.

"The recordings were not limited to the users of Apple devices, but also involved relatives, children, friends, colleagues, and whoever could be recorded by the device. The system recorded everything: names, addresses, messages, searches, arguments, background noises, films, and conversations. I heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs with no intention to activate Siri whatsoever."

While Le Bonniec rounds primarily on Apple, he is also highly critical of the lack of action taken against Apple and big tech companies in general, saying “I am extremely concerned that [they] are basically wiretapping entire populations”.

It is important to note that Apple’s so-called wiretapping is not an isolated case. Amazon, Google and Facebook have admitted to similar practices [..]
 

MacDefender

Level 11
Verified
FWIW, calling this wiretapping is pretty sensational. This is how all virtual assistants work, and Apple in particular makes the opt-in vs. opt-out decision part of setting up your phone. You can even turn off Siri altogether and still have an offline virtual assistant.
 

Kamer

Level 1
FWIW, calling this wiretapping is pretty sensational. This is how all virtual assistants work, and Apple in particular makes the opt-in vs. opt-out decision part of setting up your phone. You can even turn off Siri altogether and still have an offline virtual assistant.
I'm sure he means that it violates its own explicit design. People expect a certain phrase to activate the device, and not listen/record otherwise.
 

Stopspying

Level 10
FWIW, calling this wiretapping is pretty sensational. This is how all virtual assistants work, and Apple in particular makes the opt-in vs. opt-out decision part of setting up your phone. You can even turn off Siri altogether and still have an offline virtual assistant.
This may well be true, but there must be *illions of people out there with virtual assistants and smartphones who have no idea what the tech companies are taking from them in return for making them, besides their hard-earned. They often have no idea how to access settings beyond the most simple ones. I'm not aiming to demean any of these people: they exist, they use technology, often their choices are made due to peer-group pressure and ads, and they can't understand how to maintain any decent level of privacy. I count some of them as good friends.

One called me this morning about some phishing emails. She had guessed correctly what they were, but I had to repeat a number of times that she should in no way respond to the emails, or to any links provided in them. She is an intelligent person but not technologically inclined at all, other than to use phones and tablets at their more basic levels. I've tried on a number of occasions to explain the easiest ways to improve the security of her personal data on her iPhone, and I see her eyes glaze and her attention wander. I've had training in how to educate this sort of person to use tech in the past.

Big tech encouraged consumers to purchase a lot of 'toys' that they can't use safely, and IT security is a much bigger nightmare to maintain as there are so many more attack points. I'm not saying that everyone shouldn't have access to tech, but it's not in the interests of the data harvesters to encourage good security practices. If people had some basic IT training, the world could be a better place, with less malware and fewer of its perpetrators.

I look forward to finding out what sort of grovelling apology Apple makes for failing to keep to their promises from the last time they were caught out on this score; the same old mealy-mouthed tosh, probably. I'm not Apple bashing; big tech companies aren't much different from each other, but this story is about Apple.
 

MacDefender

Level 11
Verified
The tech companies just can't help themselves, I really did expect more from Apple. No doubt it's some sort of error :) and was not intentional if Apple reply.
I would be concerned it’s not an error. It seems to be standard practice to train and fine-tune voice assistants by having a human review interactions and re-categorize the request when the AI got it wrong.
The only control the average user has is opting in or out of this human review process, and Siri/Google/Alexa all give some sort of way to do so.

If you don’t opt out, they all claim some sort of attempt to anonymize your data, but that’s really not realistic. For example, if you say “call my mom” and it resolves the contact card for your mother, there goes your anonymity. Google claims it tries to alter your voice to keep it anonymous.
 

SeriousHoax

Level 30
Verified
Malware Tester
These recordings were often taken outside of any activation of Siri, eg in the absence of any actual intention from the user to activate it for a request. These processings were made without users being aware of it, and were gathered into datasets to correct the transcription of the recording made by the device.
This is the main worrying point. A few years ago I found the same in the voice recording log of my Google account, and since then I have yet to use any personal assistant, and I don't give apps permission to access the microphone, location, or camera unless it's required. At least the good thing was that I was able to delete all the logs from my Google account. Does Apple give you the same option?
 

Vitali Ortzi

Level 20
Verified
This is the main worrying point. A few years ago I found the same in the voice recording log of my Google account, and since then I have yet to use any personal assistant, and I don't give apps permission to access the microphone, location, or camera unless it's required. At least the good thing was that I was able to delete all the logs from my Google account. Does Apple give you the same option?
If you're worried about it, you'd be better off using something like GrapheneOS to have more control.
 

SeriousHoax

Level 30
Verified
Malware Tester
If you're worried about it, you'd be better off using something like GrapheneOS to have more control.
Not too worried. I use an Android phone, a Gmail account, Google Photos to store unlimited photos, and some other things. So I can't really degooglify (can that even be a word?) myself. I just do what I can to minimize some things. I wouldn't want anyone to listen to me all day long or track my location via GPS, so I keep those features turned off. That's it, really.
 

Vitali Ortzi

Level 20
Verified
Not too worried. I use an Android phone, a Gmail account, Google Photos to store unlimited photos, and some other things. So I can't really degooglify (can that even be a word?) myself. I just do what I can to minimize some things. I wouldn't want anyone to listen to me all day long or track my location via GPS, so I keep those features turned off. That's it, really.
Same.
At least use one profile for security and one for usability, and switch between them.
 

MacDefender

Level 11
Verified
This is the main worrying point. Few years ago I found the same on google voice recording log of my account and since then I am yet to use any personal assistant and don't give apps permission to access the microphone, location and camera unless it's required. At least the good thing was that I was able delete all the logs from my google account. Does apple give you the same option?


Yeah: Settings -> Siri & Search -> Siri & Dictation History.

Note that Siri's history in the cloud is never tied to your Apple account or your identity. Each of your devices has a unique identifier, and that anonymous identifier is what your data is tied to. This is in contrast with Google and Alexa, both of which by default tie your voice history to your Google/Amazon account.

It's definitely worth considering whether to allow "Hey Siri" triggers if you find your voice assistant accidentally activating. If you have a phone, that's not a super useful feature in my opinion and it leads to those kinds of inadvertent voice clips.
 

blackice

Level 28
Verified
Yeah: Settings -> Siri & Search -> Siri & Dictation History.

Note that Siri's history in the cloud is never tied to your Apple account or your identity. Each of your devices has a unique identifier, and that anonymous identifier is what your data is tied to. This is in contrast with Google and Alexa, both of which by default tie your voice history to your Google/Amazon account.

It's definitely worth considering whether to allow "Hey Siri" triggers if you find your voice assistant accidentally activating. If you have a phone, that's not a super useful feature in my opinion and it leads to those kinds of inadvertent voice clips.
If anything “Hey Siri” is mostly annoying.
 

Local Host

Level 22
Verified
This is not surprising. Unfortunately, a smartphone is not a device where you can remove or disable the microphone without harming basic functionality (how would you make calls otherwise?).

This is not exclusive to Apple either; the majority of companies worldwide do this if given the chance, which is why hardware switches (to turn OFF the microphones and cameras) are a must.

Another unfortunate fact is the lack of TVs without the Smart nonsense; it forced me to tear down TVs to remove the microphone functionality directly in the hardware.

@Stopspying You're confusing privacy with security.
 
Reactions: Stopspying

Vasudev

Level 30
Verified
I've also disabled iOS's Share Analytics with Apple and Devs! With the newest version of 3uTools you can delete those telemetry logs by clicking on Crash Analysis and deleting all logs. Otherwise, you need to use iTunes and sync to delete telemetry and other logs.
I've also disabled Siri and Hey Siri and replaced them with Google.
 
Reactions: Stopspying

MonSpyder9

Level 2
Could this be possible for Android phones as well? They do have their "Ok, Google..." feature, which might do the same thing. What's the proper way of dealing with this? Switching to LineageOS?
 
Reactions: Stopspying

MacDefender

Level 11
Verified
Could this be possible for Android phones as well? They do have their "Ok, Google..." feature, which might do the same thing. What's the proper way of dealing with this? Switching to LineageOS?

Yes, but if you don't change your Google settings, Google Assistant stores all voice clips tied to your account, and your voice queries actually become part of your personalized search and advertising history too. See: Choose what to share with your Google Assistant - Google Assistant Help

I agree with what @Stopspying said: what the various tech companies are doing with your voice recordings is confusing and buried beneath a lot of legalese and feel-good privacy-policy language. Unfortunately, I don't see a way around getting a basic understanding of how your assistant treats your privacy and which options to change to fine-tune that.
 