Tuesday, March 20, 2012

How's my driving?

I wanted to begin my stint as a contributor by trying to indicate some of the fundamental complexity of the issues that digitality raises.  Let's take the topic of privacy.  Everybody knows (and if you don't, read Richard Robeson's posts on this blog for a taste) that developments in surveillance technology pose significant threats to privacy, almost no matter how one conceives the term.  Of course, the very meaning of 'privacy' is notoriously difficult to pin down, even though what one means by it makes a big difference for whether one thinks it deserves protection, as this paper demonstrates.  And no one is quite sure what to do about regulating these devices, as the Supreme Court's recent decision holding that the police's warrantless attachment of a GPS tracking device to a suspect's car violated the Fourth Amendment illustrates.  The majority opinion there was very narrow; Justice Sotomayor's concurrence signals that the Court may be ready for a landmark decision or two in these areas.  And that there is serious doctrinal confusion in the Court's Fourth Amendment jurisprudence about technology is a point Justice Scalia made as far back as a 2001 case about thermal imaging devices.

The point I'd like to make is that the problems are both pervasive and go all the way down.  Consider webpages that want your personal information.  The standard narrative about privacy in these cases says that, when a site provides notice of its information-sharing practices and users consent to them, everything is ethically kosher.  Call this the notice-and-consent standard.  The idea is that users get to exercise their autonomy by freely choosing whether or not to share information, as evidenced by their consent.

So far, so good.  However, as this paper by Scott Peppet (a law professor at Colorado) argues, the very fact of control may itself be the problem.  Peppet observes that digital technology makes it very easy to signal verifiable information about oneself to others, as for example by letting Progressive Insurance monitor one's driving habits.  This information is often of economic value.  Progressive, for example, wants to know who the good drivers are, and wants a better way to find out than either by asking ("are you a good driver?" "uh, yeah") or by going through proxies like age and gender.  The device delivers that information at a level of detail formerly unobtainable, short of the company hiring somebody to follow you around.  Users are free to opt in to the monitoring or not.  So this is a perfect example of notice and consent.  But.  The best drivers have an automatic incentive to signal their best-ness to Progressive and secure their discount.  The next-best drivers want to group themselves with the best ones... and so it goes.  Pretty soon, even the worst drivers have to consent to monitoring, in order to avoid the even greater stigma (in the form of much higher rates) of refusing to be monitored ("what are you hiding?").  And so a very strong economic pressure makes the cost of retaining one's privacy prohibitive - even as (or even because) one has perfect control over the act of consent.
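The unraveling dynamic above can even be sketched in a few lines of code.  This is a toy model, not anything from Peppet's paper: I'm assuming drivers differ only in a known "risk" number, that the insurer prices the unmonitored pool at its average risk, and that any driver whose true risk beats the pool average will defect to monitoring.  All names and numbers are illustrative.

```python
# Toy model of signaling "unraveling": drivers may opt into monitoring
# (which reveals their true risk) or stay in an unmonitored pool priced
# at the pool's average risk.  Purely illustrative assumptions.

def unravel(risks):
    """Repeatedly let any driver whose true risk is below the
    unmonitored pool's average defect to monitoring."""
    pool = sorted(risks)       # unmonitored drivers, by true risk
    monitored = []
    while pool:
        avg = sum(pool) / len(pool)      # premium charged to the pool
        # drivers with below-average risk are better off being monitored
        leavers = [r for r in pool if r < avg]
        if not leavers:
            break
        monitored.extend(leavers)
        pool = [r for r in pool if r >= avg]
    return monitored, pool

# Five driver types, from safest (0.1) to riskiest (0.9):
monitored, pool = unravel([0.1, 0.3, 0.5, 0.7, 0.9])
print(monitored)   # [0.1, 0.3, 0.5, 0.7]
print(pool)        # [0.9]
```

On these assumptions, each round of defection raises the pool's average premium, which pushes the next-best tier out, until only the very worst drivers remain unmonitored - and conspicuously so.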

At the very least, this ought to be troubling - Peppet argues that it's generalizable to a wide range of contemporary privacy issues.  More fundamentally, it seems to me that it shows that digitality challenges our notions of ethical agency and personhood - autonomy may not be doing the work we need it to.  At a minimum, we need to think hard about what, precisely, an "autonomous" decision is.
