Coveillant bonds

Wired recently published an article entitled Why You Should Embrace Surveillance, Not Fight It, by Kevin Kelly (h/t Instapundit). It’s a well-written, thought-provoking article, though I find I disagree with its conclusions.

Kelly does make some very good points. His vision is one of “coveillance”, where surveillance is reciprocal. To Kelly’s credit, he presents this almost wistfully, with this observation:

We’re expanding the data sphere to sci-fi levels and there’s no stopping it. Too many of the benefits we covet derive from it. So our central choice now is whether this surveillance is a secret, one-way panopticon — or a mutual, transparent kind of “coveillance” that involves watching the watchers. The first option is hell, the second redeemable.

This is probably his strongest point: the freest societies that have ever existed have already embraced surveillance, by both the private and public sectors. The choice is not between no surveillance and mutual surveillance; that ship has sailed. Coveillance is the best we can hope for.

Kelly then provides historical context, but here the story becomes contradictory. From the article:

So far, at every juncture that offers a technological choice between privacy or sharing, we’ve tilted, on average, towards more sharing, more disclosure. We shouldn’t be surprised by this bias because transparency is truly ancient. For eons humans have lived in tribes and clans where every act was open and visible and there were no secrets. We evolved with constant co-monitoring.

So historically we are used to co-monitoring, and whenever faced with a choice we choose more disclosure? Then how did we arrive at our modern privacy fetish? Not only is this logically inconsistent, I don’t even think it jibes with experience…secrets are frequently kept from family, friends, and neighbors much more rigorously than from random outsiders. Think of it this way: weather being equal, would you rather sunbathe nude at a beach in the Caribbean among strangers, or in your own front yard?

Kelly cites the prevalence of oversharing on social media sites as evidence for his thesis. I certainly concede the point, but I’m not sure the evidence is as strong as he thinks. First, almost all social media sites give the illusion of community; you aren’t sharing with the world…you’re sharing with your friends. Second, many people who overshare simply aren’t aware how the individual data they publish can be aggregated. For example, if you post a picture of your gourmet dinner out every Friday night, your house is likely vacant next Friday; getting an address from your name is easy, especially if you tag your spouse, or name the restaurants, in your posts. Finally, most of the fuss about oversharing is not so much about people sharing too much important information; rather, it’s the deluge of trivia that’s the problem.

Kelly’s conclusion goes all mystical:

The self forged by previous centuries will no longer suffice. We are now remaking the self with technology. We’ve broadened our circle of empathy, from clan to race, race to species, and soon beyond that. We’ve extended our bodies and minds with tools and hardware. We are now expanding our self by inhabiting virtual spaces, linking up to billions of other minds, and trillions of other mechanical intelligences. We are wider than we were, and as we offload our memories to infinite machines, deeper in some ways.

I’m just a mathematician, so this sounds like the sort of crap I went to engineering school to avoid “learning” about. And no one who’s ever read an Internet comments board would argue that empathy is broadening in any way. Moreover, this just doesn’t have anything to do with coveillance, except to the extent that it’s a plea for us all to “just get along”.

Information asymmetry is a concept in economics where one party to a transaction has better information, and thus an advantage in negotiating a deal. Surveillance in its current form provides such an advantage that no entity with the capability will give it up voluntarily. If coveillance is possible, it will require a strong political effort to obtain. Why not spend that political effort to effectively limit or forbid surveillance instead? Because, if you put it that way, it’s clear that it’ll never happen.