Playing Outside


If video games want cultural legitimacy, designers will have to concede it's not all about fun

Video games are bone-weary with growing pains, though the signs are that the medium has arrived: The multibillion-dollar game industry draws curious investors and heavy hitters. Franchises like Call of Duty and World of Warcraft have become cultural institutions and tradeable names, and on the back of their success the CEO of leading game publisher Activision received an 800 percent pay raise, to nearly $65 million, last year.

Interactive entertainment hasn’t just been succeeding financially; it has begun attracting a new cultural legitimacy as well. The soundtrack to Journey, one of 2012’s most celebrated games, got a Grammy nomination, and the game itself crushed the annual awards cycle, an impressive feat for an indie game based on Joseph Campbell’s monomyth and the sentimental concept of unvoiced collaboration with strangers. Just last week, the TriBeCa film festival hosted a panel on Beyond: Two Souls, a game heavy with the vaunted promise of mature storytelling — and featuring the voice and facial-mapped performance of actress Ellen Page.

These are the kind of radar blips that attract attention from the “proper” art and technology world, those who might have previously consigned gaming to the realm of the incomprehensible plaything, mere power fantasy for males of a certain age. Gamification is mostly marketing-friendly snake oil as far as anyone who actually makes real games is concerned, but the idea that game-design concepts can impact and influence productivity and social change has turned a lot of heads.

There’s a growing vibe in the games industry that while the wider world still might not “get” video games, it might be willing to entertain a re-evaluation of its prejudices. This is, after all, the era of the smartphone, and games are the most popular category on the App Store, eclipsing even books.

The growing outsider interest in the medium is partially a result of patient insistence on visibility by charismatic innovators like Will Wright (of Sims fame) and game critics and journalists, and partially the result of the games industry’s dogged pursuit of legitimacy.

Outcasts and weirdos in the late 1970s and early 1980s founded what would become gaming culture. Rogue programmers and witty, countercultural Steve Jobs types built awkward, secretive text adventures and later led the early rush to colonize the Internet with ways for people to role-play and explore fantasy realms together. The modern games industry we see today — the one made to answer for mass shootings and to serve in-jokes to Family Guy and South Park ­— emerged more slowly, the result of a certain lack of self-esteem that maybe isn’t surprising for a medium born of nerds.

The industry of the 1990s viewed itself as an earnest second cousin to film and generally consented to help hardware makers mobilize their young male audience with promises of immersive interactivity and “in your face” special effects. Since Doom and Counter-Strike began colonizing student computer labs all across the nation, the muscle-bound first-person shooter has been the dominant paradigm, though far from the sole one.

Japanese role-playing games, comedy-sketch adventure stories, and sprawling, stat-heavy fantasy campaigns became safe zones for the uncool. You could be a social outcast or a geek, but as long as your parents could afford a PC or a Nintendo 64, you could have access to a world where every variable, every event could be controlled, where every conflict had a predictable win condition, if only you were skillful enough.

The downside of developing a lexicon that only you and your friends value and understand is that it’s by definition inscrutable to everyone else. Advertisements for hyper-realistic war landscapes and parades of unrealistic, objectified female bodies intimidate or alienate most people. And all gamers have some story of the time they tried to show a nongamer friend some rich, transporting universe with which they’d fallen in love, and despite bravely hefting a controller — these days they include twin sticks, a directional pad, four buttons and four rear triggers — the friend struggled to make the character stop walking determinedly into a wall.

But passionate game fans are willfully blind to the communication gap between the games industry and everyone else. “Are games art?” is a question raised so often in social-media-enabled gaming fandom that it’s almost a joke. When the late, great film critic Roger Ebert declared years ago that video games “can never be art,” he must have had no idea of the nerd war he was about to launch. He eventually recanted that position in 2010 — not because some revolutionary work of interactive entertainment had changed his mind, one imagines, but because he’d simply become exhausted by Internet trolls and declared a truce.

Gamers are a force to be reckoned with online, uniting with absurd fervor to defend their medium. They see the mainstream world’s dubiousness about the value of their safe space as further rejection, more teasing from the jocks. They’re so attached to those old high school dynamics that they have a hard time seeing that interactive entertainment finally has the attention of digital artists and experience designers, and a real shot at broader recognition.

Sexism is such a hot topic in the games industry these days because new voices are virtually banging down the industry’s doors to be recognized, included, and heard. The geek treehouse is terrified at the idea of change. The obsessively earnest Internet comments and tweets about how games absolutely are an expressive art form that deserves as much respect as anything else are paired with claims about how feminism and “censorship” are going to ruin everything for them, naturally.

In most ways, gaming culture is much like any other insular, ideologically driven group faced with the fact that its little world needs to start meaning more things to more people. One finds the same political problems among passionate leftists — white men who feel especially sorry for themselves in ways that countermand their expressed desire for respect and change.

But these attitudes may no longer be financially viable. The main arm of the commercial games space — colloquially called triple-A because of its $100 million budgets, hundreds-strong studio teams, and obsession with the ideal of visceral realism — is contracting despite the increases in executive pay. A significant portion of gaming’s founding fan base has quietly turned into grown-ups and parents, more hesitant than they might have once been to put war simulations and high-resolution breast physics in front of their colleagues and kids. As game play shifts to more participatory online multiplayer, muddling in the trenches with a lot of slur-slinging, phobic Internet trolls is an ever-less attractive proposition.

With decreasing time budgets, shorter — and less expensive — art-house games and smartphone-market “distractionware” become a more appealing proposition. Much of gaming’s historical audience would rather integrate gaming into their adult lives than cling to a militantly geeky platform.

But the game industry, laboring under a dated marketing vision that still dreams of the 17-to-25-year-old gadget geek with the endless wallet, hasn’t grown up at the same rate. Risk-taking and creative innovation are receding amid a destructive feedback loop in which appealing to a niche audience becomes ever more critical even as that audience’s contribution to the bottom line shrinks. As a result, the games currently lining store shelves are increasingly impossible to distinguish from one another. Game companies bet on becoming the single most attractive player in the same homogenous field rather than branching out to create something new and risking expensive failure.

***

When games writer Jason Schreier recently wrote a Kotaku post expressing annoyance and fatigue with yet another set of unrealistic anime breasts in fantasy-action game Dragon’s Crown, he drew a wave of ire — and even a homophobic gibe from the game’s Japanese artist. Shouldn’t an artist be allowed to express himself, myopic fans demanded to know regarding a top-heavy cartoon sorceress. And aren’t games just for fun, anyway?

The idea that at the end of the day, games are obligated to serve the purpose of “fun” above all others has been the main wrench in the works of the gaming industry’s machinations for legitimacy. Why should games be mature, cope with social issues, reflect society, or demonstrate the genuine artistic vision of a grown-up creator? At the end of the day, they’re just for fun, say gamers when they’ve run out of defenses against the mainstream industry’s embarrassing, stagnant homogeneity.

Alarmingly, professional game creators often contribute to that echo chamber too. Traditional game production is a ruthless wringer of a career, demanding 80-hour weeks and making widows of wives (creators are still predominantly male). A developer who’s led a product that’s managed to achieve commercial success is virtually canonized for his sacrifice. Veteran game developers are masters of creating “fun,” and understandably lead the charge against the idea that games can or should be anything else. But if genuine legitimacy for games lies in the idea that they can be creative expression, tools of global communication and teaching — that’s the evolutionary purpose of play, after all — fun decreases in relevance. Culture-changing entertainment is rarely described as “fun.”

Games’ 21st-century culture war might simply have ended in attrition: The form became an economic heavy hitter, colonized every modern technology platform, and spawned a fervent and accessible independent games scene that’s as relevant to the medium as festival flicks are to cinema. But when all the opportunities were finally at hand, the industry’s slavish commitment to economics, best practices, and the grail of “fun” kept wider legitimacy at arm’s length.

That future doesn’t need to repeat itself. A tiny but reverberating kernel of gamemakers, tired of waiting to be embraced by veteran developers busy planning how to make an even more realistic assault weapon, has embraced new technology that allows them to create independently.

Development tools used to form a significant barrier to entry. To make a game once involved a highly specialized skill set, access to programming or digital art and design ­education, and high-end technology for both practice and creation. But game tools have been quietly skewing closer to Web development tools, becoming easier to use. Tools with pricey licenses are getting dislodged by tools with lower barriers to entry, like Unity, GameMaker, RPGMaker, the visual novel generator Ren’Py, and a flexible, free interactive text tool called Twine.

This mass democratization of tools has meant that women, genderqueers, minorities, outsider artists, and a broad, brand-new wave of individual creators who haven’t necessarily grown up on the insular vocabulary of “video gaming” have been drawn to experimenting with games, digital play, and interactive entertainment. The popular wisdom in game design has long been to flatter players with the idea that they are the ultimate storytellers and that the job of the game designer is simply to provide a framework for the players to create their own experiences. That approach is not inherently invalid, but it feels huge that after decades of rationalizing and defending nerd fantasies as high art, personal expression and authorship at last have the opportunity to become a major trend in games.

One nominee in the Excellence in Narrative category of the 2013 Independent Games Festival (the industry’s Sundance, in a sense) was Dys4ia, a game by Oakland-based designer and author Anna Anthropy, about the intimate struggle she endured undergoing hormone therapy for gender transition. Anthropy has long been one of indie games’ most beloved and contentious Angry Young Women, developing indie games that deal with themes of identity, kink, relationships, and personal storytelling. Her manifesto, Rise of the Videogame Zinesters, urged aspiring artists of all kinds to get their hands on the broadening menu of accessible game-creation tools and be heard in interactive entertainment.

Dys4ia didn’t win at the IGF, which traditionally celebrates young white guys experimenting with the language of design through well-liked but increasingly familiar twee, retro aesthetics. (Disclosure: the festival is run by the same organization that owns one of my longtime employers, industry news and features site Gamasutra.) But this year the festival’s grand prize winner was another harbinger of a new, deeper way of viewing what game experiences can mean. Richard Hofmeier, another independent game designer, made Cart Life, a grueling simulation of the daily life of someone relying on a sidewalk cart’s income to survive. Its bleak grayscale art and the ruthlessness of even its smallest rituals, like showering, buying food, and remembering to pick your child up from school, represented a shift in the “life sim” genre, highlighting the humble heroism in simply facing the world every day without privilege rather than the power fantasies with which games are usually associated.

Hofmeier took a further step once he received the award: He turned over his booth on the well-trafficked show pavilion to his friend, critic, writer, and text-game creator Porpentine so she could showcase her game Howling Dogs, a fascinating, brilliant text game about confinement, depression, and escapism.

Since then, the individual games movement has exploded, attracting curious creators and new experimenters in droves. It’s also attracted its share of detractors, veteran game designers who look at the narrative-driven personal-storytelling games as “cool, but ‘not games.’” They may see a betrayal of their sanctified best practices of systems design, player agency, and reaction driven by conditions. Their resistance has begun to seem as political as it is professional, a desire to close a door to underrepresented voices just as they’ve begun to step through it.

It’s within this growing personal games movement that we can see the genuine potential of games as art and communication. The geek community has been validated for long enough by economic growth and a product industry that panders to them even at their own expense. Let them have that niche if they’re so desperate to choke it into irrelevance.

Cart Life, Howling Dogs, Dys4ia, and their contemporaries are touching and occasionally brutal, antithetical to games’ holy doctrine of fun. But they are important. The high-end triple-A gaming business will hopefully start to learn from the brave experiments of ­indie artists with everything to say and nothing to lose, folding some of the movement’s most resonant lessons into its experience design. That’s the road to a healthy culture and a genuinely mature, artistically legitimate games industry. But for now, individual outsider games are gently bringing game creation back to its spiritual origin — this time, in ways that could include everyone.

That nongaming friend who views even the most innovative among console games as a foreign language can pick up Dys4ia, Howling Dogs, or any of their contemporaries and understand its purpose, receive its intended message, be touched by empathy. Changed — maybe enlightened about a lived experience that isn’t necessarily their own. They won’t just be walking into walls.