Socialism and/or Barbarism
By Evan Calder Williams
Notes on a once & future nightmare.

Counterproductive

Of course they are.

If “thug” is the term one face of contemporary anti-blackness prefers to speak, its other mouth has settled on “counterproductive” as the word to say exactly what it means while maintaining plausible deniability. One doesn’t have to specify what would be more productive, just a negative definition: anything but this. Anything but CVS. So any time in the past week that commentators and elected officials, from Obama to Loretta Lynch to Baltimore City Council President Bernard Young, wanted to tread a certain rhetorical line with a but you can hear coming a page away, “counterproductive” got mobilized. In some cases, the term came first to mitigate the note of paternalistic pseudo-support to follow. (As in, you totally have a right to your negative affects, provided you express them in the manner we have decided appropriately positive.) Charles M. Blow’s NYT op-ed, for instance:

Those enterprises aren’t only criminal, they’re fruitless and counterproductive and rob one’s own neighborhood of needed services and facilities and unfairly punish the people who saw fit to follow a dream and an entrepreneurial spirit, and invest in themselves and those communities in the first place. 

[...wait for it...]

But people absolutely have a right to their feelings — including anger and frustration. Only the energies must be channeled into productive efforts aimed at delivering the changes desired. That is the hard work. That is where stamina is required. That is where the long game is played.

For others, like Obama, the term serves as a pivot between two other moves: acknowledging long-term “issues” (Obama: “This has been going on for a long time. This is not new. And we shouldn’t pretend that it’s new”) and actually just calling people “criminals and thugs” (again, Obama). Between these two, “counterproductive” gets trotted out as expected, meaning almost nothing, other than the implicit promise that a different mode of showing “frustration” – say, voting – would be more productive. And, of course, that everyone would have been happy to listen in the first place. To call the riots counterproductive, in this way, isn’t to say that they are useless. They are not merely unproductive. Instead, they confirm the worst judgments held about those in question. They set back Progress. As in: it is the fault of those who riot that nothing productive happens.

Baltimore, 1968

This whole maneuver is hardly new and seems to recur every time anything happens that remotely approaches a rebellion unafraid to lay hands on property or cops, especially when that rebellion is black. And so, in recent months, it became a name for how things kicked off wrong in Ferguson:

However, the rioting, violence and destruction that plagued the collective response to this decision are not simply unproductive, but counterproductive to the causes of justice and equality.

Or in more openly conservative screeds, like the revisionist mutterings of ex-Babylon 5 actors, a way to argue that the Panthers would have been alright if only they had just waited calmly for racial equality to unfold naturally:

To be fair, the Panthers were reacting to decades of the evils of state Jim Crow and deliberate federal noninterference. But by the time of their founding, the Civil Rights Act had been passed and events were clearly moving in the direction of racial equality and justice. Such violent pretensions were not only unnecessary but counterproductive.

Or even from actual historians of Caribbean slave revolt: “Slave rebellions were counter productive to the anti-slavery cause.”

But the relevant question isn’t productive as opposed to counterproductive, and never has been. The question is: productive of what?

America only accepted those who would come to be called black Americans because they were productive: because they were slaves who produced the material wealth on which American and Western power was built. Because they were not tolerated but kidnapped, sold, and killed, all to produce a population that was never just addendum, bonus, or small crutch which could have been foregone. Between 1500 and 1820, African slaves made up around 80 percent of all Atlantic passage westwards, the majority of lives on whom the rest rested and the necessary machinery to complete the Atlantic trade triangle. The nascent American state made this unmistakably clear by legally codifying slaves as actual property, as something to be mobilized for economic gain and which had no place outside those sites of production. They were to be hunted down and dragged back on escape: it was the legal obligation of citizens to return productive property to where it could be put to work again. There, they were policed and terrorized not as exception but as routine, as maintenance and training:

Scarcely a day passed while I was on the plantation, in which some of the slaves were not whipped; I do not mean that they were struck a few blows merely, but had a set flogging.

For those distanced from the actual fact of this, nominally well-meaning British reformers used those treated purely as wealth-producing property – as capital, in the most literal and corporal form – as mere rhetorical measures. Richard Oastler, justifiably lamenting the atrocious treatment of child workers in Yorkshire’s factories, writes himself into a corner:

Thousands of our fellow creatures and fellow subjects[…] are this very moment existing in a state of slavery, more horrid than are the victims of that hellish system “colonial slavery.” […] Poor infants! ye are indeed sacrificed at the shrine of avarice, without even the solace of the negro slave. [...] Ye live in the boasted land of freedom, and feel and mourn that ye are slaves, and slaves without the only comfort that the negro has. He knows it is his sordid, mercenary master’s interest that he should live, be strong and healthy. 

The solace and comfort, indeed.

Others, taking up the case of their own political struggle, wove arguments from the allegedly unbreachable moral border that slavery trespassed. The Huddersfield Political Union, in 1833:

No individual, or any number of individuals, has a right (nor ever had, nor ever will) to exercise a power over another, or any number of individuals, which if exercised over themselves, they would consider, and call unjust… social right either does or ought to emanate from the natural one.

Yet, in those same mills, they quite literally wove the cotton picked by those to whom this schema did not apply: who that very same year came to be nominally recognized as individuals, rather than property, in British law with the Slavery Abolition Act, but who were forced to continue to provide that cotton for another 27 years, with no boycott in sight. It would have been, after all, counterproductive to interfere when industry had such a good thing going.

Over the course of the nineteenth century, that productivity came slowly into crisis for all the Atlantic powers, though not with any accompanying quantitative drop in the numbers of those forced into such use. Far from it: 1860 saw the largest population of slaves in the Western Hemisphere ever, over 6 million, just under 4 million in the US alone. It came into crisis in part because of slave insurrections, the Haitian revolution, fugitive slaves, and international abolition movements, and in part because an ascendant economic order was proving capable of explosive accumulation and diffuse social control beyond what the management of rebellious populations could muster.

Baltimore, 1968

The consequence, though, was that as America lost its slaves in name but not in body, it no longer had a juridical matrix by which to codify them as subhuman and to pardon its own atrocities. So America decided on a risky gambit, a long process of massive inversion. It would stop recognizing black people in terms of their productivity, and their productivity alone, and begin to assert the very opposite: that they were not just temporarily but genetically, physiologically, and morally destined to be unproductive. In particular instances over the next century and a half, shades of the old productivity continued: as inmates in prisons, as debtors to predatory lenders, as new demographics to pitch commodities to. Those are exceptions, though. On the whole, America ceased to understand black Americans as “productive,” and it has never stopped blaming them for this.

That sort of full-scale historical inversion is no easy task, especially for a white supremacy that wants to assert, despite what it had asserted for centuries prior, how blackness and productivity aren’t just casually unrelated. They are entirely unable to coexist. (At least, it suggests, once the chance is lost to realign them by force, through discipline of whip and cudgel.) What to do? Who to call? Obvious as a terrible procedural, America comes to marshal the resources of – what else? – the police, but not just in their literal deployment. The police are mobilized especially in that broader sense of the word that emerged as a particular operation of Enlightenment thought, one which had been refining itself since the seventeenth century. It designated a complex and self-authorizing apparatus centered on a) the control and “reclamation” of terrain seen as untamed and uncivilizable (such as the establishment of British estates in Scotland, along with the Scottish Commissioners of Police, in 1714) and b) monitoring, measuring, and enforcing the “health” of a society – as when Adam Smith defined the role of the police (as one amongst four categories of jurisprudence) as tasked with “Cleanliness and security,” meaning both “the proper method of carrying dirt from the streets” and “the execution of justice, in so far as it regards regulations for preventing crimes or the method of keeping a city guard.”

Both senses of police, when backed by the general occupying force of real cops, would be mobilized in full, not just in black neighborhoods but also in a realm of discourse that pseudo-scientifically colonized, measured, and qualified the bodies of black people to find the hidden vault of that unproductivity – and perhaps, its proponents dreamt, of its solution. No project summarizes this more than the absurd work of Frederick Ludwig Hoffman, a statistician who worked for Prudential and who, with their support, wrote the infamous Race Traits and Tendencies of the American Negro (1896), which, in Beatrix Hoffman’s words, “appealed to an American insurance industry that sought to identify poor risks for life insurance.” It worked to identify those who, no longer being property for which owners must care, were no longer a good return on investment, those whose existence was not productive even in death. The outlines of Hoffman’s project, an incredibly dubious work that cherry-picks and fudges figures in order to confirm its extant judgment, can be seen in his first work on the topic four years earlier, “Vital Statistics of the Negro,” where on the basis of his rat’s nest of data, he concludes:

 Thus we reach the conclusion that the colored race is showing every sign of an undermined constitution, a diseased manhood and womanhood; in short, all the indications of a race on the road to extinction.

The excitement, and wishful thinking, can hardly be missed: as they were no longer productive, might black Americans have the good grace to quietly fade into extinction, to become the “vanishing race” that their genetic coding promised them to be?

In case any doubts were possible as to the stakes and angle of this project, in The Condemnation of Blackness: Race, Crime, and the Making of Modern Urban America, Khalil Gibran Muhammad usefully juxtaposes Hoffman’s conclusion with his study of white suicide only a year later where he lays the blame squarely on “external” factors:

According to an expert Hoffman cited, these individuals were victims not of “their own vices,” but “of the state of society into which the individual is thrown.” Hoffman agreed, insisting that the “total amount of misery and vice prevailing in a given community” was a manifestation of something fundamentally wrong in society.

He even goes so far as to suggest that these instances of suicide should encourage us to  fundamentally alter the social values of capitalism itself:

It is the diseased notion of modern life – almost equal to being a religious conviction – that material advancement and prosperity are the end, the aim, and general purpose of human life… It is the struggle of the masses against the classes.

Faced with the incoherence of his racial model, the actuary veers insurrectionary. In short, for structural violence that affects whites: structural overhaul, if not reformist revolution. For blacks: don’t let the door hit you on the way out of existence…

To the disappointment of white America, that “undermined constitution” was never actually the case, and racial autophagy wasn’t on the menu. So the second move on offer must be amplified, unceasingly towards our present: to claim that blackness is not just unproductive, in terms of contributing to civil society, but actually counterproductive. That it actually hinders the health and security of the otherwise stable social body. That it is a plague. That it is criminal: not when taken in aggregate, in a certain situation, but within each instance, within diseased minds themselves.

This is not hyperbole.  In the September 1967 issue of the Journal of the American Medical Association, no fringe publication but the central organ of American institutional biomedicine, one can find an editorial authored by doctors William Sweet, Vernon Mark, and Frank Ervin.  The authors argue that,

The urgent needs of underprivileged urban centers for jobs, education and better housing should not be minimized, but to believe that these factors are solely responsible for the present urban riots is to overlook some of the newer medical evidence about the personal aspects of violent behavior… The lesson on urban rioting is that, besides the need to study the social fabric that creates a riot atmosphere, we need intensive research and critical studies of the individuals committing the violence. The goal of such studies would be to pinpoint, diagnose, and treat these people with low violence thresholds before they contribute to further tragedies. 

In short, riots aren’t counterproductive because they do not achieve their goals. They are counterproductive because they are an expression of those who are already-counterproductive, those “individuals committing the violence,” those ever-ready to riot.

So little has changed of this. White America sees black Americans as something that has to be begrudgingly tolerated. Killed when militant, mocked when on state assistance, envied when successful. It sees them as those who only rarely escape their predestination as criminal, whose attempt to survive does not contribute beyond itself, at least not within the named boundaries of civil society.

Baltimore, 1968

As for the initial question, then: productive of what? At least one answer is clear enough. Blacks were constructed in white American imaginary and law alike as productive but sub-human. Then as technically human but generally and genetically unproductive (as if unable to even perpetuate themselves beyond some ingrained racial debility). And then as openly counterproductive, a threat to American perpetuity, a scourge going nowhere fast that can, at best, be managed, quelled, and neutralized. In this regard, to ask that a riot be not a riot but productive – that it be work, that it be daily – is to ask that those who riot contribute to what politics has long meant in the US: a reproduction of what already is, an open denial of history.

Obama and his ilk got one thing right, at least and for once. The riots really are counterproductive: they stand both inseparable from this history of production and in direct refusal of it. To demand, against this, that they be productive would be to insist that those involved participate in a schematic with literally no place for them, one which can’t manage to offer more than 50% employment in Freddie Gray’s neighborhood. Who could want to be productive, given all this?

In love and memory

In love and memory of you, Chris, who were a friend and a comrade, you who literally taught me the meaning of that word. How it came from the Spanish camarada, the French camarade. From roommate, yes, but also how this was literal, from the real sharing of a space together, bound to the bonds formed by men who didn’t have security or wealth or private homes and so bedded down in the same small and rented rooms, who were friends and strangers, who fucked or didn’t. And this was at the heart of your brilliant queer history, starting not with a clearly defined erotics or identity but with that messy terrain of friendship and intimacy and class, inseparable from the spaces of capital and the attempts to make them our own. We organized and danced and argued together, and this was all part of our friendship, its care and work of trying to survive in the world we want to see upended.

When I saw you last, I had moved across the country and was back in California too briefly, and you drove me to the airport. Before I flew, we talked for hours over Indian by a highway entrance in San Jose, and I was struck anew by your impossible warmth, how it energized me and still does, and when we seize the bank and turn it into a dancehall, we’ll give it your name, Chris, our utopian, who this world couldn’t allow.

Any person who sings the praises of war is, in our opinion, a blithering idiot (Christmas Day, 1915)

O’ROURKE (London.) – Thanks, comrade. We are more proud of the comradeship of toilers like yourself than you can well imagine. It is such loyalty as yours that keeps us hopeful of our class and country.

CÚ CHULAINN (Dundalk.) – No! We do not believe that war is glorious, inspiring, or regenerating. We believe it to be hateful, damnable, and damning. And the present war upon Germany we believe to be a hell-inspired outrage. Any person, whether English, German, or Irish, who sings the praises of war is, in our opinion, a blithering idiot. But when a nation has been robbed it should strike back to recover her lost property. Ireland has been robbed of her freedom, and to recover it should strike swiftly and relentlessly, and in such a fashion as will put the fear of God in the hearts of all who connived at the robbery or its continuance. But do not let us have any more maudlin trash about the ‘glories of war’, or the ‘regenerative influence of war’, or the ‘sacred mission of the soldier’, or the ‘fertilising of all earth with the heroic blood of her children’, etc, etc. We are sick of it, the world is sick of it. And when combined with the cant about ‘patience’, and ‘waiting’, and the ‘folly of rashness’, and the ‘wisdom of caution’, and all the other phrases that are to be heard from the Irish eulogists of war we confess it gives us a feeling like sea-sickness – nausea.

No, friend! War is hell, but if freedom is on the farther side shall even hell be allowed to daunt us.

A brief review of Ngozi Onwurah’s Welcome II the Terrordome


In a just world, a virus of tremendous scope and tenacity would ravage all the archives and vaults and shelves, all its servers and drives, its dens and libraries. It would draw no distinction between digital or analogue, bootleg or licensed. The CDC would be baffled: it appears to be crystalline in structure, yet its rate of replication is unprecedented… Pundits would lose their shit on air, terrified that the virus might mistake them for the already-recorded and snake their throat mid-speech. Amazon would go on full lockdown: nothing in, nothing out, its long-rumored drones circling on updrafts, training red dots on anything that moved. Other rumors would abound, like how a conspirators’ union of ex-Blockbuster execs and the remnants of local videostores were behind it all. Still, it would creep through plastic and code alike, invading mancaves and Netflix queues, no matter the firewalls or plastic sheeting or shotguns. We would be held in thrall, in disarray. But despite the fears of those who lie awake and hear it rifling through code and celluloid, its endgame would not be to spread to the human body. Its symptoms would be simple: whenever it encountered a reel or MP4 or DVD that contained The Help or The Butler or The Intouchables or The Legend of Bagger Vance or Get Hard or Hitch, it would consume all the data that makes them up and leave in its place Welcome II the Terrordome.

But if this was a just world, Welcome II the Terrordome would not exist. It would not be entirely necessary, which is what it is. Maybe that’s the case for any of those rare things that actually deserve to be called political film: they need to be seen, as often and by as many people as possible, but they exist precisely because the order of the world is posed in full against that possibility, its hackles up and Bagger Vances ever-ready for immediate deployment. They could only be about this world, unmistakably so, but it’s this world alone that they exist to ruin.

Glass Hands (Violent Motion, 2)


Who shall say that man does see or hear? He is such a hive and swarm of parasites that it is doubtful whether his body is not more theirs than his, and whether he is anything but another kind of ant-heap after all. May not man himself become a sort of parasite upon the machines? An affectionate machine-tickling aphid?

- Samuel Butler, Erewhon, 1872

Until five years ago, I only had a couple of intimate and tactile relationships with glassy surfaces. The first began early, a symptom of growing up where winter is long but inconstant, with temperatures that climb and drop hard.  Ice always appeared less a thing than a pause, a literal freeze-frame, because in Maine you don’t get to step outside of the process. You don’t go through winter piecemeal, always in a full circuit. First, there’s the anticipation of snow, which can be huffed, a smell of dry in the throat’s back, like smoke but from burning metal, not wood.   Then comes snow, sleet, tapering-off rain, slush, night’s black ice, partial melt, brown lumps of not-snow-but-not-ice, and then do it all over again, all on a long GIFish loop until you wake up and it’s late April and outside, all the permafrost dog shit has come to light and nose.

But when that pause happens fast enough, or with a slight interval of rain before night, you get the rare marvel of totally clear ice. We’d go skating, down the Royal River or out in the Cumberland marshes, where you could TIE Fighter swoop between the cattails and ash stands. I’d catch cracks and end up mouthing the milky pond, cutting my hands. Like an improbable crush or money, ice’s surface always split this way, pitched between pleasure and damage, never more dangerous than when you don’t realize you’re already navigating it, when it’s black on back roads.

When my grandmother’s second husband died, she moved from Florida to live near us. I became newly, sharply aware of just how fragile we can be when crossing ice, especially when our bodies have already begun to bend like birds. All the same, I’d carry slick loafers in my backpack to school and put them on for the downhill walk home, to spend it slipping as much as possible, acting out a semi-skilled pratfall teetering on the edge of facial reconstruction.   The gestures we develop to negotiate ice barely manage this tension, always swerving between sublime grace and dorkly flailing. The cool kids didn’t wear jackets when it was cold – James H. never wore a parka, was structurally incapable of sleeves – but ice can still blow anyone’s cover.


Man Ray’s “Dust Breeding,” 1920 photo of the back of Duchamp’s The Bride Stripped Bare by Her Bachelors, Even (The Large Glass)

Given that I don’t wear glasses, my other memories of focus on, and care for, the smooth and unyielding are all about jobs. One was selling wine, where you spend so much of the day touching and holding glass. Stacking boxes of it, proffering $200 bottles of it to the rich like snake-oil.  Lifting and arranging it, realizing that when vintners want their wine to appear to justify their prices – to demand what they fetch – they make the bottles heavier, as if value was something to be hefted and luxury sold by the pound.

 


For such slick surfaces, the bottles gathered endless dust, or at least the dust appeared so visible because every speck violates an unspoken premise of wine fetishism. The wine can appear old, but only if it is wildly expensive, starts with Chateau, and conjures cellar visions. If not, it must look as if it’s just on a layover, a minor lag before rushing off the shelves. This image of the agreed-upon guarantees a consensus of taste, which is what yuppies need as a backdrop to take flight from it. It lets them make shopping into a virtuosic labor of discovery, Indiana Jones on a Brunello kick, so misanthropically excited to prove to you that unlike the other schmucks quaffing whatever, they alone have memorized the precise numerical grade Robert Parker gave to a 2007 St. Supéry Napa Valley Estate “Elu”.

So all day I touched glass, but the glass was supposed to be invisible, only a problem – only visible – when too slick and itching to fall. It only worked when it could be ignored.


The other job with this surface experience was washing dishes at restaurants where I’ve cooked. It was the opposite of wine’s glass. Washing makes you fixate on trying, as fast as possible, to get a surface back to looking like it was never used in the first place. (Or just close enough that your boss won’t make you do it over.) This requires, and therefore develops, a strange hyper-sensitivity of fingers and palms, even as the heat, water, and bleach wreak havoc on hands, because you have to gauge without really looking: how much is still crusted on, how much texture there is to what is supposed to have none. It is hellishly boring work, barring the radio and talking and sometimes flirting, because going through the motions means learning a set of gestures repeated over and over again but which never get to be fully automatic, the rhythm broken every time burnt salmon skin just won’t let go.

Until five years ago, these were the only times I had significant, affectively-thick tactile experiences with glass or the glassy. But five years ago, I got my first phone with a touchscreen. Now, like most people I know, I touch, rub, tap, worry, flick, and stroke glass at least once an hour, almost every hour that I am awake, almost every day of the year. My days, and whatever intimacy they include, are inseparable from the feel of something that shivers, as if touched with ice, yet always touches the same way back. I fell in love from afar, finger-skating fast filth on a Foxconned slab.


Fingers found in a Google Books scan of an 1848 fiction collection from Maine

I have zero interest in either bemoaning or celebrating this, because it makes no sense to me to think it in terms of good or bad. Still, what is certain is that this transformation of experience – of a juncture of surfaces, signs, sight, and touch, a juncture crystallized around the touchscreen – is without precedent in human history, in terms of just how fast it has rewritten modes of gesture, reading, and seeing for just how many people.

There are, of course, other equally large shifts in humans’ technical-social experience. The use of buttons, for instance, in the typewriter brought with it a widespread experience of language as mechanically input, one previously available only via the typesetter or the telegraph and hence still requiring an “expert’s” mediation. The radio’s simultaneity, both from source and site of listening and between those sites, between the car, bar, home, and farm. The mechanization of war. Cinema’s animation of the still and reproduction of motion. The “whipping machine.” Mass-produced commodity markets. All online everything.


Harun Farocki, Eye/Machine II, 2002

Still, it seems right to assert that almost no other machine – not the phone itself but the set of gestures, textures, embedded memories and technical knowledge, metals and plastics, supply chains, work, and social forms that stitch together at the moment where it and I use each other – has so rapidly become inextricable from the everyday. The other drifts into use (and into the familiarity of being casually touched and unremarked upon) took decades, if not centuries, even if their effects were as dramatic in the long run. Indeed, the only others that happened close to this scale and velocity, across borders and populations, are all bound to war, to the sudden awareness of being part of a new machine – one named trench, carpet bombing, drone, gas attack, napalm, heat signature, or counter-insurgency – that will literally kill you if you don’t rapidly come to terms with how it functions. They remake the landscape and turn all maps 3-D, all neighborhoods into theaters of operation. And no matter how atrociously normalized that becomes, it can never approach the sleepy familiarity of a thumb that vaguely flicks a feed.

My interest doesn’t lie in the general anthropological overhaul bound up with the digital, for which the phone has come to serve as one particularly visible index. I wouldn’t know where to begin with that. Instead, I’ve become fixated on the sensation of a screen, on how it shifts from a sensational element in space – one that delimits the distance covered by projection or around which we gather – to a sensual object in itself. On how only a few years ago, I did not carry around a warm slab of receptive ice in my pocket but now  don’t need to look down to trace patterns into and via it. Even more simply, in the way that so many of us spend so much time petting slick and smeary surfaces, carrying windows in our ass pockets.


HARDING

But with this whiplash recalibration of sight and hand, to find precedents for how it feels to not only touch but also to hold and carry screens, to care for them to the point that I don’t think about them, I have to go back to what seems far from screens and their images, back to dishes, wine, and ice. Growing up with computers made a visual-practical relation to screens second nature, sure, but I learned to manually navigate Androids through winter and wages. Or through reading about Spinoza polishing lenses and realizing that even if there’s only one substance, there’s no reason that all of a body moves at the same speed. Or the way my friend polishes her glasses with a grey cloth when she’s thinking or tired or both. The way that Tonya Harding fell and then wept, in spite of those gold blades. The way that Anna Kavan writes winter in Ice, running up against the limits of description and so just writing white and ice and frozen over and over again. How the security glass of bank windows requires blows of different speeds and type. Malaparte’s horses trapped in a lake’s prison of ice. Hundertwasser’s loopy, and totally correct, fantasy of cultivating mold on high-modernist glass houses. Dying in the ice worlds of Mario. Drawing dogs in condensation. Wiping dry erase boards with bare hands.

And, to some degree, through watching and reading American sci-fi. Though not as much as might be expected, because the touchscreen has an odd history in the decades before it became ubiquitous. Especially in speculative fiction fixated on the digital, ice and glass made famous appearances.


Neuromancer

In William Gibson’s first novels, ice (or “ICE,” Intrusion Countermeasures Electronics) named the defense systems within cyberspace that had to be broken and evaded like frozen architecture, with the “worrying impression of solid fluidity” yet subject to becoming “the shards of a broken mirror.”

Hackers, 1995

Hackers, on the other hand, took that “endless neon city-scape” and gave it a Tron-ish literalism, depicting information as physical construction and vice versa, mainframes as glass towers between which hackers can zip and maneuver.

But in Hackers, as in the Gibson novels (and other strains of cyberpunk), the ice and glass remained either virtual or a metaphor-made-architecture. When it comes to the apparatus for navigating info cities or server farms, it is a console, something unwieldy and manual that the “jockey” must plug into in order to pilot cyberspace like a glider (Gibson). Or it’s just a laptop you spray-painted with camo because you’re so bad-ass you don’t need to see what the keys say (Hackers and almost me, before I realized it was a terrible idea).

In the films I was watching in the ’90s and ’00s, ones actively concerned with feeling prescient, that split remained operative.

Lawnmower Man, 1992

In Lawnmower Man (1992), the space is navigated from within the mind’s eye, as the climax of experimental drug treatment and with slightly grander dreams – becoming pure energy in a mainframe, controlling the world, etc – than expanded emojis. If anything, the film is especially intrigued by the way that these experiences are never purely digital. The god-to-be is not defeated by Wing Chun with trenchcoats or a proto-Matrix peeking behind the veil – that is, not by struggles inside VR. Instead, they just blow up the building where the mainframe is housed.


 Swordfish, 2001

And when excitement and budgets didn’t get blown on CG to depict what the Internet looks like in cross-section, they went instead for the virtuosity of managing multiple screens at once, via a generic hacker’s keyboard flutter, as in Swordfish.


As for starships, they held to a vaguely pilot vibe. For all the advanced tech in Independence Day, the alien craft is piloted by something held – a tremendous metal joystick – and analog enough to be patched into by Goldblum’s Dell. On TV shows, like Battlestar Galactica and Stargate SG-1, the screen-heavy spaces of the “bridge”/combat control room/etc continued to contain what one would expect:


 Battlestar Galactica


 Stargate SG-1

namely, keyboards and monitors as distinct things. One of the prime reasons seems to be that like the Kinetoscope, the touchscreen is an interface built for one, never particularly efficient where one might want to show someone else what’s on the screen. (“As you can see, Commander, the enemy’s ships are approaching from…” “Can you move your enormous hand? Jesus.”)

There are, of course, exceptions. Star Trek: The Next Generation, for instance, got in on the touchscreen game early. That’s a rarity, though.

Total Recall, 1990

Total Recall, for instance, manages to envision something not just of Wii Tennis…


 but also, and more saliently, TSA screening lines. Yet in that screening scene, we notice that the keyboards of the future (just the desk itself) on which the agents clatter are not themselves the screen. That sits next to it, a jutting black and green monitor.

Total Recall, 2012

By the time of the remake, 22 years later, phone and tablet lessons had been learned, and the film has become obsessed with this other possibility. Its screens are now a) indistinct from architecture and b) ready/eager to be touched, using the palm as one’s biometric identification so that Colin Farrell can have Very Urgent Chats, a hand laid on the surface like the Plexiglass wall of a prison visiting room.

All in all, most of what I watched was uninterested in the feeling of screens. It was far more hyped about skipping over the messy fingerprint stage and going straight to the vaguely holographic, as if aware that there’s something endlessly throwback and manual about drunk-smearing burrito fingers across Grindr. The banal fact that touch leaves traces and that screens need cleaning just doesn’t seem very future. (I’m still holding out for a director’s cut of Iron Man where Tony Stark struggles for an hour to place a screen protector on his faceplate without trapping any dust under the plastic, but I’m not holding my breath.)

Minority Report, 2002

 And so, in Minority Report, not only does Tom Cruise not touch the screen. (The germs, he cries, the germs.)  He also wears special gloves, data prophylactics that let him keep his distance. The screen becomes the wall, yet the wall ceases to be something we might rub up against: set in space, yes, but as immaterial as possible.


The Iron Man films go even further with this, arraying OS windows in the holographic air, letting you crumple up represented paper without even needing glass in the room.

That gets pushed to its extreme in what was the most stylistically innovative show on television, CSI: Miami, for which the entire precinct is a Constructivist upchuck by way of Lisa Frank, where the built and the screened look actually identical. I imagine Horatio walking into walls all day, mistaking them for browser windows and pastel lens flares. So while the entire fantasy of the show is of unlimited forensics, where no image is too poor to not be “cleaned up” into magically high resolution, no trace too minuscule to not bind it back to a body, the screens themselves just hover, neon and pure, as free of smudge and splatter as the agents’ white jeggings.


CSI: Cyber

CSI: Miami ended its gloriously protracted decade in 2012. CSI: Cyber has just started this year, and it’s hard to say how long it will stick around. But even in its first episodes, one can see the mark of a different shift: the real existing ubiquity of touch screens means that for a show that’s nominally secular and set in the present, the world has to be full of screens, especially of the glass and touchable variety. The way these are depicted, though, is markedly split into a few variations, from which the full-bore holographic of Miami is largely absent, barring a showy digital autopsy scene that takes place in what looks like a motion capture studio that they didn’t bother to green screen.


First, those that cannot be touched whatsoever, as they exist only for the viewer, not for the character, in House of Cards-esque floating panels, as in the episode where they moralize massively against Uber (under the name “Zogo”).


Second, in huge situation-room panels at the center of their office, where they can do military-grade FaceTime but, crucially, still control everything by the keyboards and mice at their desks.


Third, with so many hands, gesturing, tapping, pointing, and getting in each other’s way. Sometimes these hands are part of shots that try and approximate what it feels like to be absorbed in a screen, like the camera rotating around a stationary Arquette, staring at her tablet while snippets of text and voice flutter about. Other times, it’s just Van Der Beek – or a hand model – literally pointing our attention while explaining what a phishing scheme is. (Again, the show may not be long for this world.)

The same dynamic is also at work with more future-oriented shows and films. It’s not until the real existing daily use of touchscreens by millions of people that sci-fi grudgingly hoists them into view. And when it does, it reserves them for restricted and obvious uses.


1. Phones of the near-future (same as regular, just a bit more translucent, like in Robot and Frank).


Prometheus, 2012

2. Proxy for the medico-erotic.


Oblivion, 2013

3. Domestic/military logistics (the cup of tea resting on the touchscreen control panel that monitors both a blighted earth and a glass-bottomed swimming pool). Indeed, it’s that moment of screen-as-table that comes closest to the fantasy most of us really have about touchscreens. Not to have them vanish into space, not to become ever thinner until they wish themselves into immateriality, not to have them lilt bright around our heads, but to eat off them, to watch Hannibal through a vanishing mosaic of Hot Pockets and give the holographic Ghost of Commodity Future a very sloppy middle finger.

Both that urge – the slob screen – and the way in which most speculative films and TV tried to ignore it until it became unavoidably quotidian strike me as telling, hinting toward an awkward proximity. For all the fantasies of the holographic and the talk of the immaterial, the fact remains that the digital stays rooted in the very physical, from the mining of rare earths to human mechanical turks, from the multiple hands of the compositor to the experience of touching glass. No matter what we do on these things that become inseparable extensions of the body and head – a phantom limb for every American! – they are nevertheless things that we hold and carry, leave at the bar and drop into the toilet. A dual, fractured physicality. Something worn and steady, always there and Jack-of-all-trades, slowly polished like sea stones in the hand over years and chirping beside the bed. An alarm, a flashlight, a last-resort vibrator. But also something tenuous and flighty, always in danger of being bobbled, juggled, and, finally, shattered.


They Live By Night, Nicholas Ray, 1948

Prior to touchscreen phones, there were probably only two significant interactions you could have with commodities through glass. You could go window shopping, entering a circuit of fantasy and denied fulfillment, becoming also an extension of the store itself, a flesh-and-blood ad’s image of rapt yearning. Or you could go looting. Those are the options: find some potential pleasure in being blocked by the glass or stop being blocked by it.

It’s in those terms that the opening of Joseph Lewis’ Gun Crazy (1950) is so smart. There, the lens of the camera and the “lens” of the shop window double up without telling us, as we’ve taken our position behind the looking glass before the film makes this clear with a reverse shot from the boy’s perspective. So the camera is already a security camera, opening up the possibility of not just the triangulation which psychoanalytical takes on cinema love so much – the apparatus, the gaze of the boy, the promise of the gun – but, much more importantly, the line of invisible demarcation that’s supposed to keep the gun on “our” side, the poor safely on the other. As is often the case, at least in actual history, it takes a rock to make it obvious where things stand, a rock to make clear that surveillance doesn’t stop when you turn your back on it.


 Screen blood

With touchscreens, a new operation is possible: one can activate commodities, be they in-game powerups or Amazonian toilet paper, by pushing on another commodity layered with alkali-aluminosilicate glass, by pressing your face to the window. Still, despite the unbreakable and unscratchable promises of Gorilla Glass (a descendant of what Corning first developed as “Project Muscle” in the early ’60s), phones have a strange existence: they get broken, way beyond repair, and they keep getting used. Vic pointed this out to me: what else can be so busted, in terms of the promises on which it was sold (billions of pixels, the sheen of interactive glass), yet still be handled and used every day? Sure, keys fall off computers and we just hit the button below. Headphones die in one ear, and we rig them with twist-ties to get them to still play provided they are held in exactly the right position. Dissident mufflers get roped back into place.

 


Shards (in an ad for repairing a screen built too fragile from the start)

Of course, the reasons not to replace screens are plentiful and obvious – mostly, that these things are so expensive to start, and few people can afford to put even more money toward them, even if they come to feel like a necessary element of how one navigates every day. Nevertheless, I’ve watched people embed shards and splinters of glass in their fingers, tiny motes of blood coming to the surface, because they keep swiping across a shattered surface that holds the cuts in place, bleeding because they had to reduce their pet’s stress level in Kawaii Pet Megu. When I smashed the hell out of my phone’s screen, bobbling it down a flight of cement stairs as it slid out of my pocket wrong, a screen protector kept the shards in position, laying a permanent spider web at the edge of everything I saw. But sometimes when I spoke, I’d find tiny crystals of broken glass dusting and cutting my ears, like a disaster had been whispering there.

 


1928

Near the beginning of Alexander Kluge and Oskar Negt’s gargantuan, sprawling, and genuinely brilliant History and Obstinacy, an immediate concern is the difference between types of grasping. First, between the “crude grasp” (Rohgriff) of primitive accumulation – “Its particular grasp annihilates what is actually supposed to be accumulated,” as if Hulk tried to seize communal oil fields – and a finer one that “resembles a legislative machine,” self-regulating and perpetuating, not destroying what it snatches (p. 85). But they also mean “grasp” in a more literal sense:

“Of all the characteristics responsible for unifying the muscles and nerves, the brain, as well as the skin associatively with one another – in other words, for the human body’s feedback systems, its so-called rear view [Rücksicht] – the ability to distinguish between when to use power grips and precision grips is the most significant evolutionary achievement. It is the foundation of our ability to maneuver ourselves, an ability that is most easily disrupted by external forces. These forces are also capable of disturbing our self-regulation. Self-regulation is the outcome of a dialectic between power grips and precision grips.” (p. 89)

With touchscreens, we grapple with a new language of gestures imposed wholly by external forces, gestures that signify only insofar as they are registered in code. But perhaps the strangest of these isn’t one bound up with the software, neither the erotic indifference of the left swipe nor the fact that, in a rather generous act of anti-planning, the swipe-to-text function of my current phone obstinately avoids predictively spelling TOMORROW, offering instead TIMEPIECES, TINTYPE, TIPTOES, and TIMELESSNESS. No, the gesture that seems so alien to me, alien in just how natural it has become, is the maneuver of the sliding grip, a delicate oscillation between power and precision. Because despite being held by hands, iPhones and their competitors were never designed to be held. They are designed to be naked screens. Screens without hardware, ghost screens, guillotine-blade screens. They are only allowed to be held grudgingly, as a last resort, and we slowly fool ourselves into imagining that these sharp panes of ice feel good in the hand, let alone stable. But how do we hold something that isn’t supposed to be held, whose entire front is intended to be as smooth as possible? What would it be to grip a screen that’s only as thick as its image?

And so aside from learning how to type and swipe, we learn also that particular move of gripping glass while letting it slide. In bed, trying not to wake someone sleeping on our chest, we grope far out in the dark until we just barely feel that familiar smooth chunk, marked by feeling like nothing much at all. We slide it toward us with a minor flick of the middle finger, then pry it up between the thumb and forefinger, drawing the hand toward us with a gentle wrist’s whip, so that the phone is both held and turns. It rotates over a breakable fall, pinioned without screws by finger’s oils, and comes to rest in the palm. Then we look and see that it is still night and that a bot named “silkfeather_92” liked a photo we took and that in the dialectic between power grips and precision grips, there are no winners.

 


 

2015

That grip, a variety of which is also used when we pinch-draw the phone from a pocket and throw-slide it into the hand, is not just a grip. It is also a gesture, which to me means that it doesn’t signify or supplement anything. Rather, it stands uneasily between language and action, speaking of the limits of the former to make itself heard and the refusal of the latter to stop trying. It is an intermediary that makes it possible to see mediation, the machines at work behind the experience of communication, the fingerprints marring the surface.

If you asked me to say to you a list of all the words I know, I’d probably have the experience of knowing that there are words I’ve read, written, and said but that I simply cannot call to mind for the list, that I don’t remember that I know. In order to explain this to you, though, I would have to use words, and, in so doing, likely stumble onto some of the words I’d forgotten.

Perhaps that’s how it is with our hands of glass. To draw a balance sheet of what we’ve lost and gained, what forms of touch have been displaced or enhanced or scrapped because we can’t keep our hands off ice screens, we could only do so through moving our hands, passing them over things, tilting slabs, seeing what happens, what registers, what remembers. But being gestures, they can’t say or do anything other than depict the contours of the systems within which they might communicate, amongst ourselves and between our devices. From what I’ve read, it seems that those who design and manufacture these things, those who mine rare earths and seep thorium into water supplies, those who drive workers to suicide – that is to say, we, we who are inseparable from these things – keep yearning for tougher and tougher glass, be it Gorilla or sapphire. The endpoint of that dream can only be a glass so hard that it could only be marked as ever having been touched by more of that glass itself. If that’s the case, at least I’ll finally know how to write TOMORROW on my phone.