Privacy and Girls Around Me
This is what can happen when you don’t understand the fine nuances of privacy policies: you could wind up a pop-up on Girls Around Me (“GAM”). GAM is a really creepy iPhone app that combines GPS readings from cell phones with check-in data from Foursquare and profile information from Facebook, overlaid on Google Maps, to give the user pop-up images of women physically located near him. The user can then, innocuously enough, go approach these women to flirt and ask them for dates.
Or he could use the same data to determine where their homes and workplaces are, follow and stalk them, or rape them.
And while this looks like an astonishing and malevolent invasion of privacy, it really isn’t; the app only uses data that the subjects themselves have “chosen” to broadcast and make public. So what is there to do about this, other than lecturing people to actually read privacy policies before deciding to share data about themselves?
If your answer to that question is “nothing,” then you’re probably a reasonably strong libertarian, or you’re like Scott McNealy, who famously quipped, “You have zero privacy anyway. Get over it.” (Pre-9/11, natch.) If your answer to that question is “write a law,” then you’re probably the opposite. Neither of those answers seems particularly satisfying, and I think the reason is that when we talk of “privacy” these days, we don’t mean what that word has meant in the past.
Consider another challenge facing businesses and consumers: how to be price-competitive and price-savvy in a world that blends brick-and-mortar shopping with online commerce. Linking up the diverse bits of information about a consumer, and connecting that to the consumer’s cell phone, creates a world of opportunity for retailers. Google, Bing, and who knows how many other companies gather information about your search and web surfing habits, and come to sometimes accurate and sometimes silly conclusions about the sorts of products, media, and services you would like to consume. Combine that information with, say, the data files your grocery store keeps about you and mentions of your product consumption on your Facebook page; coordinate it with the GPS in your phone; and if a retailer can assemble all that data fast enough, the way GAM does, it could in theory exploit that information to “guide” your consumer behavior while you’re in its store. As described in the linked article, this variant of GAM aimed to benefit retailers could be benign or at worst annoying. But it takes very little imagination to come up with scenarios in which this assemblage of information would be exploited in more unpleasant ways.
The libertarian in me objects, “So what if Macy’s figures out your credit rating when you walk in its door? It’s up to you to make good spending and good credit decisions.” Which is true enough. But the sorts of people who don’t make good spending and good credit decisions are, by definition, the sorts of people who are susceptible to attractive offers to make poor financial decisions. Enhanced information technology could be used to exploit this — to make varying credit offers to varying consumers and create financial products on the spot that would effectively double the price of goods sold in retail stores to the unsophisticated consumers likely to buy into the pitch. When enough of that accumulates, it starts to create a drag on the larger economy. So I’m not sure that the libertarian solution of caveat emptor is going to be adequate from a social policy standpoint.
The instinct is to point to privacy protection laws and urge people to opt out of data-sharing. Certainly we can do that. But the utility and advantages that these same companies offer are tremendous, and we don’t want to do without them. It’s handy having a Gmail account that you can access from anywhere in the world, across multiple devices. I’m willing to let Google gather information from my e-mail and blog reading habits to take ham-handed guesses about what kinds of advertisements would prompt me to click through in order to get that service, just as I’m willing to take “discounts” from my grocery store in exchange for letting it gather data about the kinds of products I buy, which in turn influences the coupons I’m given and the junk mail I get in my mailbox.
But there’s a limit to how far I’m happy about that sort of thing going. The issue is not that I’m super-protective of all this data about myself; there are things I’m willing to allow corporations to do with data about me. So the issue isn’t “privacy” in the sense of controlling who has access to information about me; it’s “privacy” in the sense of controlling what they do with that information. This is a shift away from the deontological sense of the word: the objectionable abuse of privacy is no longer the inherent loss of one’s seclusion but a utilitarian harm, the use of that information in a manner that violates moral or social norms, or that somehow injures the subject.
For instance, a single woman who is identified on GAM and approached by an attractive, nice guy who found her on the service might be pleased that a romantic opportunity came her way with seemingly no effort on her part. Or she might be irritated by it but not distinguish it from other approaches. But there’s no effective way for GAM to allow her to choose “Only broadcast my information to really cute straight single guys with credit ratings of not less than 700, no criminal convictions other than traffic and parking tickets, who are at least five foot eleven inches tall,” even if the authors of GAM were remotely interested in doing such a thing. And she would resent being approached by a creepy guy she would never date, and she would be deeply upset (litigiously so, I should think) if that same creepy guy stalked her or hurt her somehow. The chances of the cute guy she wants to attract approaching her through GAM seem infinitesimally small compared to the number of outcomes that seem bad.
I’ve actually read some privacy policies, like Google’s. In this sense I suspect I’m ahead of the power curve of 95% or more of data service consumers. I do not know what companies Google shares its data with; my impression is that Google shares its data with pretty much whoever is willing to pay to access it, along with a variety of governmental entities. One thing we could start with would be demanding, either as consumers or as citizens, to be told to whom our data is given, and to be able to opt out of data-sharing with companies with whom we do not choose to share our information.
Another thing to understand is that people sign up for services and forget about them. I might sign up for Foursquare or Latitude, thinking it would be handy for my wife and me to know where the other is. (Amusingly, Latitude sometimes places us more than a mile apart when we are actually riding in the same car, which usually results in jokes about infidelity.) A periodic reminder from these services that they are sharing data with other entities would help me control what information about myself is exploited.
Come to think of it, some ability to know what governmental agencies are gathering information about me would be nice, too. Unless I’m being investigated as a suspect in a crime or there is some other overriding national security reason, I already have the right to make the government tell me what information it has gathered about me — but so few people actually use their FOIA rights that it’s a negligible burden on the government as a whole. If the government had to tell me, from time to time, “We have files about you with the Internal Revenue Service, Social Security Administration, Selective Service System, and the Tennessee Valley Authority,” or whatever it might be, I could be more aware of who was doing things with information about me and could better control my relationship to the government. I might be prompted to take action to protect my privacy, or I might decide I don’t care. But it would be an easier decision to make and an easier decision to implement if I got the information to begin with, instead of being charged with rooting it all out like a detective.
As things stand now, the burden is on me to seek out all of this information. Maybe shifting that burden, requiring some disclosures and opt-ins, is the right balance between burdensome, tight legislative controls and laissez-faire libertarianism. Maybe if a woman were told that “we’re selling your GPS data to a Russian company that broadcasts that information to creepy stalker guys,” she would be better able to decide whether she wants to leave herself open to meeting a tall, handsome, tech-savvy stranger — or whether she prefers to avoid winding up dead in a ditch.