Where Nothing Can Possibly Go “Worng”

The glitches in HBO’s Westworld might tell us something about those in our own world

HOW did we not see it coming? Was Trump’s “triumph,” as it was dubbed by the New York Times, a glitch? Or were America’s liberals simply more comfortable imagining that misogyny, racism, and xenophobia hadn’t been so deeply programmed into American political life?

If we are serious about trying to reckon with how we ended up in what seems like an alternative future, we might start with sources other than the news: we have a timely resource in HBO’s Westworld, a study of what happens when the power relationships between humans and the androids they created to serve their deepest desires go haywire.

Viewers first encounter Westworld as a high-end theme park. Its natives are robots, referred to by their human overseers as “hosts.” These hosts, we learn, have been programmed to perform specific storylines or “loops,” creating a heady but supposedly harmless ride for the humans who pony up to experience a visceral form of virtual reality. Westworld cloaks the possibilities of the actual digital frontier with the nostalgia of a bygone one, and, in doing so, transforms the fictional past into the hypothetical future. In other words, it allows denizens of modern society to see how they might manage in a different sort of world.

For the human visitors to Westworld, and perhaps even for the android hosts, this is all great fun until it’s no longer a game. When the hosts begin to exhibit tics or stutters, the technicians responsible for their maintenance assume there’s a glitch--a routine but unpredictable blip that can be corrected with a bit of tinkering.

But it takes far too long for anyone, including the hosts themselves, to realize that these glitches are symptoms of a larger and more insidious dysfunction. HBO’s Westworld is a fascinating study of how large-scale technological systems break down, giving way to new social configurations. The glitch, it turns out, is not the diagnosis. Rather, it’s the canary in the coalmine, a way of beginning to understand the flawed assumptions that are programmed into the machine.

This mode of critique defined the career of science fiction author Michael Crichton, who wrote and directed the 1973 movie upon which the HBO series of the same name is based. I’ve yet to read a novel by Crichton--and I’ve read them all--where things don’t go spectacularly wrong. Consider Prey, a story of self-assembling nanobots that mutate to terrorize their human creators. Or Timeline, in which constructing a machine to visit the past leads to tiny alterations that endanger the present. In his best-known work, Jurassic Park, human-made dinosaurs escape their theme park to wreak havoc in the world beyond. It’s a classic Crichton formula: combine experts, machines, and a glitch for a blockbuster success.

Most experts try to remove glitches from their formulas, theories, or hypotheses. But Crichton did the opposite: he turned the accident into an expectation, even a requirement. The movie poster for the 1973 feature film version of Westworld pitched it as “a place where nothing can possibly go worng.” Worng. It looks like a typo, but the savvy reader would have recognized it for what it was: a glitch. The origins of the word “glitch” are murky, but there is good reason to believe it comes from the Yiddish word glitshn--to slip and fall. We also know that it began appearing in the mid-1960s with reference to unanticipated technical errors in the realm of space travel. A short circuit in the wiring of Gemini 8 (the sixth manned Gemini flight) was described in 1966 by engineers as a glitch. The term was new enough that the New York Times had to define it, which they did, as “a sudden, unlooked for surge of electrical power” and, more simply, as “space jargon for a malfunction.” Glitches were accidents that were unpredictable but not improbable; a Space Age way of rationalizing the apparent split-second leap from “all systems go” to “Houston, we have a problem.”

In the decades that followed, the idea of the glitch was taken up by academics concerned about how things were going wrong in large-scale technological systems that had supposedly been designed to remove the possibility of human error: the nuclear meltdown at Chernobyl, the chemical leak in Bhopal, and, most disturbing to Americans, the explosion of the space shuttle Challenger.

In the 1980s, sociologists like Charles Perrow pored over institutional documents and concluded that, however uniquely horrifying, glitches were not exceptional events. They were “normal” accidents: things that were bound to go wrong when people practiced science and engineering at a complicated scale. In Perrow’s account, accidents were normal because they happened with enough regularity that any institution of sufficiently large scale could practically count on one. They happened through tiny errors in routine activities that became magnified with tragic consequences. To ignore the probability of accidents, Perrow suggested, was to live in a fantasy world.

This mode of sense-making reflected the forms of game theory and probabilistic modeling that took the role of chance as a given. Engineers of these large-scale systems had long been aware of statistical risk and accepted the likelihood of random failures. In HBO’s Westworld, the glitch is the go-to explanation when an intentional “update” to the hosts’ operating system leads the robots into an existential crisis and, eventually, an outright revolt. No engineer wants to take responsibility, but none of them can escape the consequences either. It all sounds a little too familiar.

But what if we didn’t allow the accident to be attributed to a random error? What if we insisted upon taking responsibility for the ways that machines disrupt our expectations, or even the ways the machines do exactly what humans built and then trained them to do? As we continue to invest in machines, particularly as agents created to absorb and manage our desires--think about online dating, the feminized digital concierges of Siri and Alexa, or even about proposals to create robots that can serve as sexual partners--we must be awake to the inevitable and unexpected consequences of these innovations. The same goes for our political machinery; even if the coming regime is one that many regard as an aberration, we must not view it as “normal” or even as an “accident.”

Artist and theorist Rosa Menkman’s Glitch Studies Manifesto points to an alternate approach. It’s tempting to regard any interruption in business-as-usual as a failure, but Menkman wants us to consider the ways the glitch provides an opportunity to build a bridge between what doesn’t make sense and what could turn into knowledge. A glitch, she argues, “captures the machine revealing itself.” To witness a machine revealing aspects of itself that were not visible to its users can be terrifying, not least because it involves the loss of control. It is also an opportunity to begin considering how things could be otherwise. This is where science fiction becomes the kind of speculative reality with world-saving, or at least world-remaking, potential.

When the protagonist “host” Dolores begins to question her scripted existence as a damsel in distress, she accesses powers she didn’t know she had. Previously unable to pull the trigger of a gun to defend herself, she finally succeeds in a gruesome showdown involving other hosts gone rogue, imagining a new narrative for herself in which she could be the hero. What appears to her human masters as a glitch is, for Dolores, a means of critiquing the politics that have been programmed into her world. That the human technicians who maintain Dolores underestimate her capacity to experience suffering is their tragic irony. The real horror of the glitch might be that we never cease to be surprised when it occurs.

Instead of fantasizing about ideal technologies, we must learn to recognize what Menkman calls “the inherent fingerprints of imperfections” in those technologies. Rather than seeking to avoid or suppress glitches, we should learn how to conjure them so we can better understand how to break or bend the rules. Whether it’s entertainment or politics--and there may no longer be any difference--we need to be alert to how sexism, racism, and violence continue to be part of the design. It’s time to start taking our fiction seriously. It may be the best resource we have to create a world that won’t kill us, and to avoid the ones that will. After all, The Apprentice was great reality TV until it became reality.