You cannot prevent the collection of your data:
You Can’t Opt Out Of Sharing Your Data, Even If You Didn’t Opt In
By Maggie Koerth-Baker
Joseph James DeAngelo, the suspected Golden State Killer, was identified as a suspect in the decades-old cases after police linked DNA found at crime scenes to the DNA that DeAngelo’s relatives had uploaded to genealogy websites.
Randy Pench / Sacramento Bee / TNS via Getty Images
The Golden State Killer, who terrorized Californians from Sacramento to Orange County over the course of a decade, committed his last known murder in 1986, the same year that DNA profiling was used in a criminal investigation for the first time. In that early case, officers convinced thousands of men to voluntarily turn over blood samples, building a genetic dragnet to search for a killer in their midst. The murderer was eventually identified by his attempts to avoid giving up his DNA. In contrast, suspected Golden State Killer Joseph James DeAngelo, who was apprehended just last week, was found through other people’s DNA — samples taken from the crime scenes were matched to the profiles his distant relatives had uploaded to a publicly accessible genealogy website.
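The matching step itself is conceptually simple. Below is a toy sketch of that kind of familial search, not the actual forensic method (real genealogy matching compares long shared DNA segments across hundreds of thousands of markers); every profile, marker and username in it is invented:

```python
# Toy illustration of familial DNA matching, not the actual forensic method.
# Real genealogy matching compares long identical-by-descent segments across
# hundreds of thousands of markers; this sketch just counts shared values.

# Hypothetical marker values recovered from a crime scene.
crime_scene = {"m1": "A", "m2": "G", "m3": "T", "m4": "C", "m5": "A"}

# Hypothetical profiles that other people voluntarily uploaded.
uploads = {
    "user_001": {"m1": "A", "m2": "G", "m3": "T", "m4": "C", "m5": "G"},
    "user_002": {"m1": "C", "m2": "A", "m3": "T", "m4": "G", "m5": "G"},
}

def similarity(a, b):
    """Fraction of common markers at which two profiles agree."""
    shared = set(a) & set(b)
    return sum(a[m] == b[m] for m in shared) / len(shared)

# High scorers become candidate relatives of the unknown suspect, even though
# they never consented to being part of any criminal investigation.
for user, profile in uploads.items():
    print(f"{user}: {similarity(crime_scene, profile):.2f}")
```

The asymmetry is the point: the people being scored never interacted with investigators at all.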
You can see the rise of a modern privacy conundrum in the 32 years between the first DNA case and DeAngelo’s arrest. Digital privacy experts say that the way DeAngelo was found has implications reaching far beyond genetics, and the risks of exposure apply to everyone — not just alleged serial killers. We’re used to thinking about privacy breaches as what happens when we give data about ourselves to a third party, and that data is then stolen from or abused by that third party. It’s bad, sure. But we could have prevented it if we’d only made better choices.
Increasingly, though, individuals need to worry about another kind of privacy violation. I think of it as a modern tweak on the tragedy of the commons — call it “privacy of the commons.” It’s what happens when one person’s voluntary disclosure of personal information exposes the personal information of others who had no say in the matter. Your choices didn’t cause the breach. Your choices can’t prevent it, either. Welcome to a world where you can’t opt out of sharing, even if you didn’t opt in.
Yonatan Zunger, a former Google privacy engineer, noted that we’ve known for a long time that one person’s personal information is never just their own to share. It’s the idea behind the old proverb, “Three may keep a secret if two of them are dead.” And as far back as the 1960s, said Jennifer Lynch, senior staff attorney for the Electronic Frontier Foundation, phone companies could help law enforcement collect a list of all the numbers one phone line called and how long the calls lasted. The phone records may help convict a guilty party, but they also draw police attention to the phone numbers, identities and habits of people who may have nothing to do with the crime being investigated.
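A minimal sketch of why that matters for bystanders (all numbers and records here are invented): given only the call log of one targeted line, an investigator already holds the numbers and calling habits of everyone on the other end of those calls.

```python
# Minimal sketch of what a pen-register-style call log exposes about third
# parties. All numbers and records here are invented for illustration.
from collections import defaultdict

# (called_number, duration_in_seconds) records collected for ONE targeted line.
target_call_log = [
    ("555-0101", 642),
    ("555-0199", 31),
    ("555-0101", 1205),
    ("555-0142", 88),
]

# Aggregating per number profiles people who never chose to share anything.
contacts = defaultdict(lambda: {"calls": 0, "seconds": 0})
for number, seconds in target_call_log:
    contacts[number]["calls"] += 1
    contacts[number]["seconds"] += seconds

for number, stats in contacts.items():
    # Call frequency and duration alone hint at each relationship.
    print(number, stats)
```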
But the digital economy has changed things, making the privacy of the commons easier to exploit and creating stronger incentives to do so.
“One of the fascinating things we’ve now walked ourselves into is that companies are valued by the market on the basis of how much user data they have,” said Daniel Kahn Gillmor, senior staff technologist with the ACLU’s Speech, Privacy and Technology Project. A company can run along, not making a cent, but if it has a large user base and reams of private information about those users, then it’s valuable — and can be sold for millions. Companies that collect more data, keep that data, and use it to make connections between users are worth more. Sears, Roebuck and Co. may have been able to infer when you bought a gift from their catalog for a friend who lived in another town, but Amazon has more reason (and more ability) to use that information to build a profile of your friend’s interests.
We all saw this in action in the recent Cambridge Analytica scandal. The privacy of the commons is how the 270,000 Facebook users who actually downloaded the “thisisyourdigitallife” app turned into as many as 87 million users whose data ended up in the hands of a political marketing firm. Much of the narrative surrounding that scandal has focused on what individuals should be doing to protect themselves. But that idea that privacy is all about your individual decisions is part of the problem, said Julie Cohen, a technology and law professor at Georgetown University. “There’s a lot of burden being put on individuals to have an understanding and mastery of something that’s so complex that it would be impossible for them to do what they need to do,” she said.
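The jump from 270,000 to 87 million is just the arithmetic of a friend graph: each person who opted in exposed everyone one hop away. A toy sketch with invented data:

```python
# Toy sketch of the friend-graph expansion behind the Cambridge Analytica
# numbers: a few opt-ins expose everyone one hop away. All data is invented.

friends = {
    "alice": {"bob", "carol", "dan"},
    "bob": {"alice", "erin"},
    "carol": {"alice", "frank", "grace"},
}

installers = {"alice", "bob"}  # the only people who actually opted in

# Everyone exposed = the installers plus all of their friends, consenting
# or not.
exposed = set(installers)
for user in installers:
    exposed |= friends.get(user, set())

print(f"{len(installers)} opt-ins exposed {len(exposed)} people: {sorted(exposed)}")
```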
Even if you do your searches from a specialized browser, tape over all your webcams and monitor your privacy settings without fail, your personal data has probably still been collected, stored and used in ways you didn’t intend — and don’t even know about.
Companies can even build a profile of a person from birth based entirely on data-sharing choices made by others, said Salome Viljoen, a lawyer and fellow with the Berkman Klein Center for Internet and Society at Harvard.
Imagine new parents signing up for a loyalty card at their local pharmacy and then filling all of their child’s prescriptions there. The information collected every time they scan that loyalty card adds up to something like a medical history, which could later be sold to data brokers or combined with data bought from brokers to paint a fuller picture of a person who never consented to any of this.
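In code, that inference is little more than a group-by on someone else’s purchases. A sketch with invented records: the child never agreed to anything, but the household’s loyalty-card scans accumulate into a rough medical history for them.

```python
# Sketch of how a parent's loyalty-card scans become a de facto medical
# history for a child who never consented. All records are invented.
from collections import defaultdict

# Each scan: (loyalty_card_id, who_the_prescription_is_for, medication, date)
scans = [
    ("card_42", "child", "amoxicillin", "2017-03-02"),
    ("card_42", "parent", "ibuprofen", "2017-04-11"),
    ("card_42", "child", "albuterol inhaler", "2017-09-20"),
    ("card_42", "child", "albuterol inhaler", "2018-01-14"),
]

# Grouping by patient assembles a timeline for each household member,
# built entirely from the cardholder's choices.
history = defaultdict(list)
for card, patient, drug, date in scans:
    history[patient].append((date, drug))

# Repeated inhaler refills suggest asthma: an inference a data broker could
# sell long before the child is old enough to make any privacy choice.
print(sorted(history["child"]))
```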
So does that mean that, in addition to locking down our own privacy choices, we need to police the choices of our friends and family? No, said Cohen, Gillmor and Viljoen. In fact, the privacy of the commons means that, in some cases, your data is collected in ways you cannot reasonably prevent, no matter how carefully you or anyone you know behaves.
Take, for instance, Equifax, the credit-rating company that lost control of the data of 143 million people last year. Those people weren’t necessarily Equifax customers. Instead, the company collected data from other companies the people chose to do business with, and much of that business was stuff people can’t get by without, like renting or owning a home. Or, alternatively, consider Facebook again. That company has admitted it tracks the online behavior of people who never intentionally engage with it at all, thanks to partnerships with other websites. (Like many sites, FiveThirtyEight has this kind of partnership with Facebook. Our pages talk to the social network in several ways, including through ads and comments, and because of the embedded “Like” button.) If hounding every person you’ve ever cared about into adopting encryption tools like PGP sounds like fun, you’ll love living in a van down by the river with no internet access.
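Mechanically, this works because every embedded ad, comment box or “Like” button is fetched from the third party’s own servers, and the browser’s request for it typically carries the address of the page being read (the Referer header) along with any cookie that third party has previously set. A simplified sketch, with invented identifiers, of what the widget host can reconstruct from its logs:

```python
# Simplified sketch of third-party tracking through embedded widgets. All
# cookie IDs and URLs below are invented for illustration.
from collections import defaultdict

# Each time a page embedding the widget is loaded, the widget host's server
# can log the visitor's cookie and the page that referred the request.
widget_requests = [
    ("cookie_9f3", "https://news.example/article-about-privacy"),
    ("cookie_9f3", "https://shop.example/running-shoes"),
    ("cookie_9f3", "https://forum.example/thread/health-question"),
    ("cookie_a01", "https://news.example/article-about-privacy"),
]

# One pass over the log yields a browsing history per visitor, including
# visitors who have no account with the widget's owner.
browsing = defaultdict(list)
for cookie, referer in widget_requests:
    browsing[cookie].append(referer)

for cookie, pages in browsing.items():
    print(cookie, pages)
```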
Instead, experts say these examples show that we need to think about online privacy less as a personal issue and more as a systemic one. Our digital commons is set up to encourage companies and governments to violate your privacy. If you live in a swamp and an alligator attacks you, do you blame yourself for being a slow swimmer? Or do you blame the swamp for forcing you to hang out with alligators?
There isn’t yet a clear answer for what the U.S. should do. Almost all of our privacy law and policy is framed around the idea of privacy as a personal choice, Cohen said. The result: very little regulation addressing what data can be collected, how it should be protected, or what can be done with it. In some ways, Gillmor said, online privacy is where the environmental movement was back in the 1950s, when lots of big, centralized choices were hurting individuals’ health, and individuals had little power to change that. “I don’t even know if we have had our ‘Silent Spring’ yet,” he said. “Maybe Cambridge Analytica will be our ‘Silent Spring.’”