Incalculable Loss

The algorithms that make up Big Data distribute complicity for death across the populations they surveil

Once upon a time, the virtual represented a domain of free play, a realm separate from the flesh, a “second life.” But the corporatization of digital architecture and the advent of Big Data have ended this digital dualism. Now, says former NSA director Michael Hayden, “we kill people based on metadata.” Now that digital activity is a basis for state violence, the “virtual” and the “flesh” are no longer separate zones. Through mass surveillance, the virtual makes the flesh vulnerable to death, and the flesh lends its reality to virtual calculations. This link between surveillance, data-mining and death has birthed a new form of necropower (Achille Mbembe’s term to describe the politics of deciding who lives and who dies): algorithmic necropower.

The data-mining procedures used by the NSA and other entities draw their predictive power from their use of incalculable algorithms, enabling them to replace causation with correlation. They rely on numbers, not theories. Using huge datasets culled from the surveillance of online activity, Big Data algorithms can discern or generate correlations that would not be apparent to human perception. They test rules against these correlations: How likely is it that someone who spends an hour on the slots will spend 20 minutes playing blackjack? How likely is it that a college-educated ethnic minority will read the New Inquiry if they also read the Wall Street Journal? The predictions that emerge from these correlations and rules are context-free, but, when interpreted by people, become the basis for advertising campaigns, urban planning, actionable security decisions, and many other aspects of modern society.
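
To make the mechanics concrete, here is a minimal sketch in Python, using invented session data. The activities, the records, and the rule "slots, therefore blackjack" are all hypothetical; the point is that the "likelihood" is a bare conditional frequency, with no causal theory anywhere in the computation.

```python
# Toy association-rule scoring (hypothetical data, not any vendor's system).
# A "rule" like "an hour on the slots -> 20 minutes of blackjack" is scored
# purely as a conditional frequency: correlation, never causation.

# Hypothetical session records: the activities each surveilled user engaged in.
sessions = [
    {"slots", "blackjack"},
    {"slots"},
    {"slots", "blackjack", "poker"},
    {"poker"},
    {"slots", "blackjack"},
]

def support(itemset, records):
    """Fraction of sessions containing every item in the itemset."""
    return sum(itemset <= s for s in records) / len(records)

def confidence(antecedent, consequent, records):
    """P(consequent | antecedent): how likely blackjack is, given slots."""
    return support(antecedent | consequent, records) / support(antecedent, records)

print(confidence({"slots"}, {"blackjack"}, sessions))  # 0.75 on this toy data
```

On this invented data, the rule has a confidence of 0.75: a number, not a theory.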

Long before Big Data yoked together calculation and incalculability, Derrida described justice as “a calculation within the incalculable.” Because every demand for justice is singular, there is nothing it can fully be compared with: It is incalculable because nothing else can enumerate its content. In this light, the algorithms used in Big Data appear as a kind of perverse instance of justice. The relationship between Big Data and the incalculable algorithms it hosts is reminiscent of the Derridean relationship between justice (a singular demand) and the law (the rule that singularity is held to). The law proceeds according to rules, but justice cannot be arrived at through the application of a rule: Justice is always singular, and always grounded on its own impossibility. It requires an incalculable calculation that we can perform only when at a loss, unable to speak or calculate. But even an incalculable calculation needs a rule, so we still need the law to guide us to justice, at the limits of what can be calculated.

Both mathematically and juridically speaking, powerful tools for modeling, predicting, and manipulating human reality have emerged from these limits. The history of computation is a prime example of the power of language to instrumentalize life. If language allows us to articulate possible worlds, then the ability to calculate possible worlds at the scale that algorithms allow changes the human relation to history and futurity. While we “grapple” with the incalculable, our algorithms can articulate worlds beyond our grasp, keeping track of the gray zones between finite states, the places our understanding can’t go.

Because of how our intelligence works, human logic has to start from the rule of law, from finite states and binaries, even if what we are seeking is the incalculable justice that exceeds the law. But algorithms, able to contain the incalculable, can start from the ground of incalculability. Despite the oppressive uses Big Data is put to, this capacity ironically seems to bring it closer to Derrida’s understanding of justice. In reality, managing incalculability becomes the basis for new automated modes of organization and control, in an alchemical process that “distills” history into a usable dataset. What remains of justice, when the incalculable can be made to count?

Algorithmic necropower—the computation of who should live and who should die—operates from the basis of incalculability to discern “non-obvious associations.” Still, according to the U.S. Inspector General, “association does not imply a direct causal connection.” Instead, it “uncovers, interprets and displays relationships between persons, places, and events.” Algorithms escape the laws of cause and effect and operate in a fluid state of exception, encompassing the financial sector, the military-security nexus, and the entertainment industry. Although algorithms seem to allow Big Data to bypass human judgment, in fact a huge amount of labor is required to map associations and interpret the output. The algorithm itself has to be written by a human, and even then it only spits out data; people still have to decide what the data means. Ordinary language and the “ordinary actions” of post-digital citizens act as a database for algorithmic necropower to manipulate reality and generate threats. Risk levels are rated based on activity patterns that seem anomalous in relation to the norms derived from data.
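
A minimal sketch of this norm-derived risk rating, again with invented data (the user names, connection counts, and threshold are all hypothetical): suspicion reduces to statistical distance from the mean of the surveilled population, with no causal evidence anywhere in the loop.

```python
import statistics

# Hypothetical activity metric: weekly encrypted connections per user.
# "Risk" is nothing but deviation from the norm the data itself derives.
weekly_encrypted_connections = {
    "user_a": 2, "user_b": 3, "user_c": 1, "user_d": 4, "user_e": 41,
}

counts = list(weekly_encrypted_connections.values())
mean = statistics.mean(counts)    # the derived norm: 10.2
stdev = statistics.stdev(counts)  # spread around the norm: ~17.3

for user, count in weekly_encrypted_connections.items():
    z = (count - mean) / stdev    # how anomalous this pattern looks
    if z > 1.5:                   # arbitrary threshold for "suspicion"
        print(f"{user}: flagged as anomalous (z = {z:.1f})")
```

Here “user_e” is flagged only for deviating from the other four; nothing in the flag explains what, if anything, the deviation means.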

The norms, the data relations, are what determine suspicion of terrorist activity, not causal evidence. The “March 2013 Watchlisting Guidance,” a leaked government guidebook for putting individuals on terrorist, no-fly, and selectee lists, says, “Although irrefutable evidence or concrete facts are not necessary, to be reasonable, suspicion should be as clear and as fully developed as circumstances permit.” The document also has loopholes for cases where officials can’t articulate reasonable suspicion: Family relations of known or suspected terrorists, individuals who may be “associates” of terrorists, or individuals with a “possible nexus” to terrorism may be watchlisted.

Reasonable suspicion is thus a computer-aided human judgment based not on causal evidence (fact), but on data correlations, on perceived norm deviation. Everyday behavior becomes a means for the state to detect threats, a measure of risk. This architecture distributes complicity in a new way.

In terms of necropower, Mbembe’s concern is “those figures of sovereignty whose central project is… the generalized instrumentalization of human existence and the material destruction of human bodies and populations.” Who are these figures of sovereignty, when data appears as sovereign? Who kills, in algorithmic necropower? The people who coded the algorithms? The generals, managers, CEOs, or shareholders who ordered them? The companies buying and selling the algorithms? The civilians whose surveilled daily lives constitute the bulk of the data the algorithms analyze? Our banal activities are the source from which algorithms automatically generate kill lists made up of nodes that deviate from the cluster of normal activity patterns. Algorithmic necropower defers the act of killing and disperses complicity.

For algorithmic necropower, history is over: the past is not a record of causal relations, but raw material for increasing predictive power. At every turn, with each monitored action, data are made calculable. As “regular users” of language and of the algorithms used to reproduce sociality on the Internet, we fashion ersatz individualities in surveilled spaces geared towards consumption. The primary value of this online activity, from the point of view of security operations, is that it provides the norm for the data set that algorithms probe to test and manipulate association rules. The activities of “making a self” and deciding what and how to consume are not normally considered “work,” but in the amorphous terrain of algorithmic war, stretching over various domains of modern life, affect itself becomes financialized, and biological life—bodies in physical spaces—becomes a surplus value where calculable and, where incalculable, a contagion.

For data-mining corporations, “life” can be categorized according to this distinction: mineable and unmineable activity. The former serves as the database to mount an attack on the latter, as evidenced by the NSA’s increased monitoring of users of the Tor anonymity network developed by the U.S. Naval Research Laboratory, as well as of visitors to the Linux Journal forum page, which the NSA deemed an “extremist forum.” Internet activity signaling that the user is conscious of her privacy sets off surveillance algorithms, which predict whether or not this kind of user is subversively aware of the role of her data in the architecture of war.

What halts the algorithmic state of exception? Derrida may provide an answer: the incalculable traces, the self-effacing cinders of language that point beyond language. Computation itself produces immense quantities of incalculable data that are effectively useless. These incalculable traces clog calculation: While incalculable algorithms are quanta of data that guide computation, incalculable data are simply qualia of data, groundless remainders taking up server space. They’re not useful for intelligence or profit. Take the simple example of a price tracker that scrapes an Amazon product listing to keep it updated in real time: polling the site for price data every, say, half-millisecond would generate so much traffic that Amazon would have to operate at a loss to maintain the servers necessary to absorb the data hemorrhage of the price-mining process.
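
The arithmetic behind that example, under its stated (and deliberately extreme) assumption of one price request per half-millisecond for a single listing:

```python
# Back-of-envelope traffic arithmetic for the price-mining example above
# (assumed figure: one scrape per half-millisecond, one product listing).
poll_interval_s = 0.0005                   # half a millisecond per request
requests_per_second = 1 / poll_interval_s  # 2,000 requests per second
requests_per_day = requests_per_second * 86_400

print(f"{requests_per_second:,.0f} requests/sec")  # 2,000 requests/sec
print(f"{requests_per_day:,.0f} requests/day")     # 172,800,000 requests/day
```

Roughly 173 million requests a day to track a single price: useless data, at real cost.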

History is composed both of things that can be known, and things that can’t—mineable (calculable) and unmineable (incalculable) events. The latter category comprises not only things that happened without trace, but also the traces of what never happened. Even incalculable algorithms cannot yet enter into this incalculability of experience, which lives as mourning, or hope, or other relations between past, present and future that escape enumeration. Algorithms enumerate quanta of data, but the qualia of data—subjective experiences of the world—cannot yet be captured by algorithms. Nevertheless, our surveilled responses to qualia, our online “self-making,” loop back into the quantizable (mineable) zone, as with Facebook’s attempts to monitor emotional states.

Yet there is always an incalculable remainder. In the face of the seeming alchemy of the computational process, forgotten or unlived histories proliferate. This remainder of computation is where justice is to be found, guided both by law and by the incalculable, qualitative data of subjective experience. These qualia are not useful to algorithmic computation and are intelligible only to human eyes. One potential sabotage of algorithmic necropower would be for users to actively produce incalculable data. Facial warping like that in the work of the artist Zach Blas represents an aesthetic gesture toward incalculability. But algorithms are faster than humans, if not more inventive. It would be more efficient, if no less realistic, to wait for the authors of the algorithms to undertake a program of sabotage themselves.

If algorithms make complicity incalculable, it is because those who make the algorithms count on avoiding complicity. The idea that the algorithm itself decides is part of the general ideological offensive surrounding its deployment. The politics and interests of its authors may be incalculable from the standpoint of the person or population caught up in the algorithm, but this is precisely what the algorithm is intended to calculate. The remainders, the incalculable, messy qualia of particular human politics and interests are equally its ground, and what it will inevitably proliferate.

Algorithms straddle a gray zone between the privatization of war and the financialization of civilian life, acting as the connective tissue that fuses them in order to subjugate “life to the power of death,” as Mbembe says. In algorithmic war, “here” and “there” collapse: Ordinary “civilian” activity is a determining source of state intelligence. Fused across finance, security, and entertainment, this relentless exercise of necropower might also be called necrocapitalism. Life itself becomes a surplus value. At the same time, the residue of history, of incalculable qualia, produced by attempts to read the future, becomes a contagion that Big Data is still struggling to manage. And the longer this goes on, the more the expansion of this incalculable contagion shows just how small Big Data really is.