The form of power that Big Data employs is not so much panoptic as it is pan-analytic.
At this very moment, the phone, tablet, or computer on which you are reading this article has already helped multiple databases worldwide register the fact that you are interested in government and corporate surveillance. These databases have probably also stored a record of the type of device you hold in your hands. As Edward Snowden pithily put it in a New York Times column earlier this year: “As you read this online, the United States government makes a note.” And as more recent reporting from the Times and ProPublica has disclosed, this is made possible not only by intense interest on the part of government agencies like the NSA but also by “extreme willingness” on the part of corporate data traffic managers like AT&T.
Looking beyond big telecom and the intelligence state, there are other factors at play in the new data regime: you and me. Huge amounts of data are being collected and crunched, marking where we have been, what we have said, and even pointing toward our future actions. With each step, tap, and click we leave little flecks of information behind. All of this is possible only because we are invested in maintaining the production and dissemination of that data. The data ceaselessly being produced about us and by us matters to us in part because we are constantly invited to live our lives through updates, comments, shares, and other manner of informational self-presentation. This invitation is extended to us not only by the cool new social media startups and digital device manufacturers who profit from all the data but also by colleagues, contacts, family, and above all the “friends” who implicitly promise to pay attention to the data we produce.
The sheer number of facts being collected is unprecedented if not in fact unfathomable. To make sense of the new political power that can be built out of all this data requires metaphors, and these metaphors themselves are not without political stakes. Metaphors help shape the meanings of the activities in which we are engaged, and they thereby help condition what possible actions we can conceive ourselves as undertaking. Metaphors thus have political stakes in that they define the forms of power that control us and the forms of resistance to power we can imagine.
Most metaphors for data’s power draw on the idea of visual surveillance by regarding data harvesting as an all-seeing gaze sweeping across the citizenry. We imagine ocular devices (or even real human eyes) perched atop giant watchtowers, as in the “panopticon,” Jeremy Bentham’s eighteenth-century design for an efficient prison, revived in 1975 by Michel Foucault in Discipline and Punish. (Snowden even referenced the panopticon by name in a 2013 interview with The Washington Post.) In a panopticon, inmates are arrayed around a central watchtower that makes them understand themselves to be permanently visible, whether anyone is actually watching or not. With this heightened awareness of their own visibility, prisoners quickly learn to discipline themselves to avoid additional punishment. They become their own guards. In Bentham’s words, the panopticon is “a mill for grinding rogues honest, and idle men industrious.” It was a machine that would make men visibly mechanical.
Some two hundred years later, with panopticism serving as a paradigm for social engineering across all kinds of institutions — military barracks, psychiatric wards, medical clinics, schoolrooms — we readily act as if we are always being watched. We vigilantly normalize ourselves into conformists whom nobody would care to take special note of. We encourage diligent obedience in friends, family, colleagues, and especially in our children. We teach them (because we were taught) to behave as if visual surveillance is inescapable. We do so regardless of how much actual watchers actually see, and regardless of whether some of our actions can be seen at all.
However, in the case of our new information politics, the metaphor of visuality may not be as plausible as it first appears. The surveillance mechanism of the panopticon relies upon total visibility — you see the tower and assume the guards can see you. But the mechanisms assembled on behalf of newfangled national security and consumer analytics seem to presuppose the opposite. They function through invisibility. The watchtower garishly announced itself; we need to see the security cameras for them to be effective. By contrast, the algorithm is invisible as it constructs its composites; it runs silently in the background, like all the circuitry, voltage, and machine code that quietly lets you into your computer without ever announcing itself.
The government and corporate sectors’ algorithms work with data that is constantly being harvested and analyzed without our awareness — not only because the harvesting is sometimes done in secret but also because we tend not to recognize the massive variety of mechanisms at play for turning our action, experience, and thought into data that categorizes, compartmentalizes, and calculates who we are.
An article by Dutch journalist Dimitri Tokmetzis last year showed just how far this data can go in assembling composite portraits of who we are. Google knows what you search for and can infer not only what kind of news you read on a Sunday morning and what kind of movies you prefer on a Friday night but also what kind of porn you would probably like to gawk at on Saturday night after all the bars have closed. Of course, Google won’t show you that, but they could quietly sell it to somebody who wants to show you. The NSA knows that you were interested enough in surveillance issues to click on this article and draws inferences about you based in part on who else has read it. Never mind the accuracy of these inferences — companies and governments act on them nonetheless. Even more unsettling is that we seem to be acting on them too. The present moment of our obsessive data production seems to be defined by a genre of social media in which we have come to recognize ourselves in our online “profiles.” And the next stage of this obsession may well become, as data anthropologist Natasha Dow Schüll suggests, a whole kit of wearable technologies that promise better living for the seemingly small price of a continuous self-tracking that runs silently in (and as) the background of our lives.
Bentham’s proposal for a total visibility machine may be less important to these emerging universes of data than his contributions to the morality of utilitarianism and its assumption that our welfare can be measured. We are today suffused with the sort of calculative thinking and quantitative comparison that utilitarianism urges. Such reasoning does not rely upon visibility to do its work. Rather, it depends on analytical algorithms, which in turn depend on a barrage of other algorithms that silently convert our behavior into a flood of data.
The form of power appropriate to this data flood is not so much panoptic as it is pan-analytic. We may still be docile disciplinary subjects who conceive of ourselves as constantly under the gaze of parents, teachers, and society at large, but we have also become subjects of our data, what I like to call “informational persons” who conceive of ourselves in terms of the status updates, check-ins, and other informational accoutrements we constantly assemble. An informational flood flows out of us without our awareness. Our phones and computers are constantly communicating even when we are not — and even when we are made aware of this, we are coaxed into not questioning it, because we are told that it has become obligatory to be online and to have an online presence.
This informational flotsam is made up of familiar standardized data through which we have come to define ourselves: school transcripts, health records, credit scores, property deeds, legal identities. Today, these entrenched types of data selfhood are being expanded to cover more and more of who we can be: how many steps we take each day, how much water we drink every hour, how many friends we have, what books and movies we browse, how many cute cat videos we like to watch at night. Though many of us actively showcase much of the information we produce, the algorithms work on this information in silence whether we showcase it or not. We never see the algorithms doing their work, even as they affect us. They scan and scoop and store, and eventually they are able to produce us in their ciphers, all unseen, buried away in black boxes silently composing symphonies of zeroes and ones.
Were we aware of the clang and clatter of our data being swept up and put together, we would be overwhelmed. But once we recovered from the sensory overload, we could develop ways of taking more care of our data, learning more about how so much of it is produced, and forming policies and practices that might have a fighting chance against the brave new worlds of informational ubiquity in which we are being enrolled. Every form of power has its vulnerabilities, and the specific weakness of what I call “infopower” is the data feed that supplies the algorithm: shut off the feed, and the algorithm starves.
Snowden hopes that the world “says no to surveillance.” Most advocates of privacy and critics of governmental and corporate tracking agree with him. But what if saying no to the watchtower does not yet amount to saying no to the algorithm? We have a sense of what is involved in saying no to surveillance. But who among us really knows how to say no to information ubiquity? And who among us would be audacious enough to stop churning out the data that increasingly defines our very selfhood?