Marginal Utility
By Rob Horning
A blog about consumerism, technology and ideology.

A Man Alone


Rod McKuen died a few days ago. Because I have spent a lot of time in thrift stores, I feel like I know him well, since that’s where lots of his poetry books (Listen to the Warm, Lonesome Cities, etc.) have ended up, alongside the works of kindred spirits Walter and Margaret Keane. His albums, which sometimes feature his singing but generally consist of him reciting his poetry over light-orchestral music, can be found there too. I like “The Flower People”: “I like people with flowers. Because they are trying.”

Artists like McKuen and the Keanes, who achieved unprecedented levels of success with the mass-market audience in the 1960s while being derided by critics for peddling “sentimental” maudlin kitsch, fascinate me — probably a hangover from graduate school, when I spent a lot of time studying the 18th century vogue for “sensibility” novels, which were similarly saturated with ostentatious tears. McKuen has a lot in common with the 18th century “man of feeling” epitomized by the narrator of Sterne’s A Sentimental Journey, who travels around seeing suffering and “having feelings,” which prove his humanity and allow readers to experience their own humanity vicariously. McKuen let his audience accomplish something similar with his tales of urban love and loneliness and his wistful recollections of weather and whatnot.

Still, I wonder why the market for McKuen and the Keanes re-emerged just then, in the 1960s. What made reified desolation a sudden hot commodity? Did it have to do with changes in available media, or the general air of postwar prosperity? And what’s the relation between their success and their reputation? Why is the kind of critical contempt they received reserved for artists who commercialize sadness and feelings of loneliness and vulnerability? What did audiences want from their work, such that critics could seize upon it to mock it and make themselves and their readers feel superior to it all?

The New York Times obit of McKuen concludes with a quote from him in which he claims that success turned critical opinion against him:

“I only know this,” Mr. McKuen told The Chronicle in 2002. “Before the books were successful, whether it was Newsweek or Time or The Saturday Evening Post, the reviews were always raves.”

I wonder if McKuen thought that only his popularity kept him from earning the respect that, say, Leonard Cohen or Jacques Brel (whose songs McKuen translated into English-language hits) or maybe even Wordsworth and Whitman tend to get. But such counterfactuals seem beside the point, not only because critical opinion is fickle and ever-changing but because it is impossible to separate the “quality” of a work from the conditions surrounding its reception.

Participating in the phenomenon of McKuen’s popularity (or conspicuously refusing) became essentially what his work was about, beyond the melancholy remembrances about lost lovers and cities at dusk. You were either on board and willing to conform, willing to let McKuen be the way you defused potent and inescapable fears about decay, sadness, anonymity, and fading love along with millions of others and thereby mastered those feelings, put them in a safe place to be admired, or you were not on board, unwilling to conform, unwilling to admit those feelings could be collectively tamed but instead must be personal demons you never stop fighting alone, far more alone than any McKuen poem could ever testify to.

One might be tempted to champion McKuen as a populist who rendered the ordinary person’s feelings and aspirations in easily digested metaphors while the culture snobs sneered. But if you listen to a lot of his music or read through his books, you might end up with the sense that his point of view has more to do with snobs than ordinary people: He seems to travel from one glamorous seaside city to another, indulging in late-night bouts of boozy nostalgic melancholy with little fear of economic want, issuing patronizing advice about how to feel to readers or listeners or discarded lovers or total strangers, luxuriating in emotions as if they were badges of privilege rather than the afflictions he would otherwise have you believe. (Listen to “Earthquake,” for instance.) McKuen sounds like a humble-bragger whose medium is misery; his sadness makes him more important and individuated than less sensitive or self-regarding souls.

I wonder if, when McKuen was popular, critics felt threatened not by his work’s “sentimentality” but by its familiarity, which they then labeled “vulgarity” to try to expunge it from their own sensibility. I know that is how I feel when I listen to his music. It sounds smug to me because I’ve felt those smug feelings and romanticized them privately (lacking the courage or the chutzpah to try to cash in on them). I can’t hear his poems as straightforwardly earnest, as perhaps the millions of people who bought in could. I implicate myself in these works instead, in every self-satisfied line of self-deprecation and self-pity. I recognize someone who wants to feel different from everyone else but still wants them all to feel sorry for him.

McKuen went more or less underground of his own volition in the early 1980s, which perhaps could be seen as a kind of admission of guilt. His obituaries describe him in his reclusion as severely depressed, holed up in his California home with half a million records and CDs. It’s an emblematic tableau that stands as a warning. You can retreat to the mountain with your carefully curated collection of records that prompt you to have all those important feelings that you can’t bear to experience through or with other people, but that’s not going to let you understand what all those people felt when they bought a Rod McKuen record in the 1960s and maybe even played it once or twice.

Simple and Plain


Today is Elvis Presley’s birthday. He would have been 80. Most people accept that he died in 1977, at the age of 42, which means I am older now than he ever was, a fact I have a hard time wrapping my head around.

I’m currently reading Careless Love, the second volume of Peter Guralnick’s biography of Elvis, and it is bringing me down. It’s about how fame was a collective punishment we administered to Elvis, which he would not survive. Fame allowed him to coast along when he should have been stretching himself; like a gifted child praised too much too soon, he became incapable of coping with challenges. Fame allowed his manager, Colonel Parker, to construe Elvis’s talent as a cash machine. Parker encouraged in Elvis a zero-sum attitude toward his art, so that he demanded as much money as he could get for output as superficial as they could make it, as if the shallowness implied savings, a better bargain from the forces who commercialized him. Fame transformed Elvis into a kind of CEO who inhabited his own body as if it were a factory, a capital stock, on which an enormous and ever-mutating staff relied for their livelihood. As a consequence, fame isolated him completely. His friends, no matter how much they loved and respected him, remained a paid entourage whom he could never completely believe actually loved him for real. “He constructed a shell to hide his aloneness, and it hardened on his back,” Guralnick writes in the introduction. “I know of no sadder story.”

I first got into Elvis after stopping at Graceland, his home in Memphis, during my first road trip across the U.S., in 1990. I knew very little about him, just what you sort of absorbed by osmosis from the culture. Elvis impersonators were probably more salient than Elvis himself at that point. My grandmother, I remember, had some of his later records: Moody Blue; Aloha From Hawaii via Satellite. I wanted to stop at Graceland because I thought it would be campy fun; I wanted to re-create the scene in Spinal Tap when they experience “too much fucking perspective” at Elvis’s graveside.

But Graceland was surprisingly somber, straddling the line between pathos and bathos, never letting me take comfort in either territory. It didn’t seem right to laugh when confronted with the meagerness of the vision of someone who could have had anything but chose Naugahyde, thick shag rugs, and rooms equipped with dueling TV sets. And it was genuinely humbling to recognize the desperation in it all, the dawning sense that Elvis had nowhere to turn for fulfillment and had none of the excuses we have (lack of time and resources, lack of talent) to avoid confronting inescapable dissatisfaction head on.

In one of the stores in the plaza of gift shops across the street from Graceland, I bought a TCB baseball hat and a cassette of Elvis’s first RCA album, the one whose design the Clash mimicked for London Calling.


Every time it was my turn to drive, I put the tape on; listening to “Blue Moon” while driving through the vacuous darkness of Oklahoma was the first time I took Elvis seriously as a performer, the first time I heard something other than my received ideas about him. Then, like a lot of music snobs, I got into the Sun Sessions and the other 1950s stuff and declared the rest of his career irrelevant, without really knowing anything about it. In recent years, I have overcorrected for that and listened mainly to “fat Elvis” — the music he made after the 1968 Comeback. I’m amazed by moments like this, a 1970 performance of “Make the World Go Away.” Wearing a ludicrous white high-collar jumpsuit with a mauve crypto-karate belt around his waist, he mumbles a bit, tells a lame joke about Roy Acuff that nobody gets, saunters over to the side of the stage to drink a glass of water while the band starts the saccharine melody, then out of nowhere hits you with the first lines, his voice blasting out, drawing from a reserve of power that quickly dissipates. Then he skulks around the stage, visibly antsy, as if trying to evade the obvious relevance of the song’s lyrics to his sad, overburdened life.

I never paid any attention to 1960s Elvis, but now, reading through Guralnick’s dreary, repetitive accounts of Elvis’s month-to-month life in the 1960s, when he flew back and forth mainly between Memphis, Los Angeles, and Las Vegas as he accommodated a relentless film-production schedule — he made 27 movies from 1960 to 1969 — I am filled with an urgent desire to somehow redeem this lost era of his career, to study it and find the obscured genius in it, to rescue it through some clever and counterintuitive readings of his films or the dubious songs he recorded for them. I just don’t want to believe that Elvis wasted the decade; I don’t want to accept that talent can indeed be squandered, and I want to believe that instead it finds perverse ways to express itself even in the grimmest of circumstances. But this was an era when he was cutting material like “No Room to Rhumba in a Sports Car” (Fun in Acapulco), “Yoga Is as Yoga Does” (Easy Come, Easy Go), “Do the Clam” (Girl Happy), “Queenie Wahine’s Papaya” (Paradise, Hawaiian Style), and “Song of the Shrimp” (Girls! Girls! Girls!). I’m not sure it’s all that helpful to pursue a subversive reading of Clambake. What there is to see in Elvis’s movies is doggedly on the surface; as Guralnick makes clear, these films were made by design to defy the possibility of finding depth in them.

At best, a case can be made for appreciating Elvis’s sheer professionalism in this era, his refusal to sneer publicly at material far beneath him. Sure, he was on loads of pills, and the epic-scale malignant narcissism of his offscreen behavior was establishing the template for all the coddled superstars to come. But he wasn’t a phony. If he was cynical, it was a hypercynicism that consisted of an unflaggingly dedicated passion for going through the motions. Guralnick describes Elvis in some of these films as being little more than movable scenery, a cardboard cutout, but he is a committed cardboard cutout. A bright empty shell with a desultory name and job description (usually race-car driver) attached, Elvis wanders through an endless series of unconvincing backdrops, reflecting back to us the cannibalizing effects of fame, inviting us to try to eat the wrapper of the candy we already consumed.

Tim Burton’s “Big Eyes”

Tim Burton’s Big Eyes makes a strong case that Walter Keane was a first-order marketing genius and that his wife Margaret, whose paintings he appropriated and promoted as if they were his own, used his marketing talents up until the moment she could safely dispense with them. Given that Margaret Keane apparently cooperated with the making of Big Eyes (she painted Burton’s then-wife Lisa Marie in 2000, and I think she appears at the end of the film alongside Amy Adams, who plays her), this seems sort of surprising. On the surface, the movie tells the story of her artistic reputation being rightly restored, but that surface is easily punctured with a moment’s consideration of the various counternarratives woven into the script. Then we are dealing with a film about a visionary who turned his wife’s hackneyed outsider art into one of the most popular emblems of an era and who has since been neglected and forgotten, despite inventing art-market meta-strategies that have since become ubiquitous. The movie seems to persecute Walter because the filmmakers believed it was the only way they could get us to pay enough attention to him to redeem him.

I went in to see Big Eyes expecting a cross between Burton’s earlier Ed Wood and Camille Claudel, the biopic about the sculptor whose career was overshadowed by her romantic relationship with Rodin, whom she accused of stealing her ideas. That is, I thought it would be about how female artists have struggled for adequate recognition, only played out in the register of kitsch pop art. I figured Burton would try to capture something of whatever zany, intense passion drove Margaret Keane to make her “big eye” paintings, much as he had captured Ed Wood’s intensity in the earlier film. We would see a case made for the legitimacy of Margaret’s work, which is now often seen as campy refuse, maudlin junk you might buy as a joke at a thrift store, at the same level as Love Is… or Rod McKuen poetry books.

But Burton doesn’t make much of an effort to vindicate Margaret on the level of her art. No explanation is suggested for why she paints or why audiences connected to her work. Rather than giving the impression that no explanation is necessary, that its quality speaks for itself, this omission has the effect of emphasizing the film’s suggestion that the significance of her painting rests with the innovative job Walter performed in getting people to pay attention to it, operating outside the parameters of the established art world. Meanwhile, Margaret’s genius remains elusive, as unseeable as it was when Walter effaced it. Margaret is a bit of a nonentity in the film, locked in a studio smoking cigarettes and grinding out paintings at her husband’s command, much as if she were one of Warhol’s Factory minions, while Walter is shown as a dynamic, irresistible figure who comes up with all the ideas for getting her work to make its mark on the world. In fact, in the script, Burton likens Walter to Warhol multiple times, and the movie even opens with a Warhol quote (from this 1965 Life article) in which he praises Walter Keane: “I think what Keane has done is just terrific. It has to be good. If it were bad, so many people wouldn’t like it.”

Since this quote comes before we see anything of the story, I took it as Burton’s attempt to use a name-brand artist’s imprimatur to validate Margaret’s work in advance for movie audiences who possibly wouldn’t read any irony in Warhol’s statement — Burton could laugh at his audiences and show his contempt for their expectations by rotely fulfilling them, as he had with Mars Attacks! and the Planet of the Apes remake. But (as usual) I was being too cynical. Afterward, I started to think Burton was in earnest in choosing this quote, and that Big Eyes is instead subverting the expectations liberal audiences might have of it being a stock feminist redemption story. It mocks those audiences, mocks the indulgence involved in using depictions of the past to let ourselves believe we have now somehow transcended the bad old attitudes of sexism. The somewhat smug and self-congratulatory view that “Nowadays we would accept Margaret Keane as a real artist and see through Walter Keane’s tricks” is complicated by the fact that Margaret’s art is kitsch and that Walter’s tricks come not at the expense of art but are instead the sorts of things that nowadays chiefly constitute it.

Margaret is depicted as the victim of Walter’s exploitation, but that view is too simplistic for the film that ostensibly conveys it. It makes Margaret passive, intrinsically helpless, easily manipulated. So simultaneously, Big Eyes gives a convincing portrait not of Margaret’s agency, as you might expect, but of Walter as a passionate, misunderstood genius, a Warhol-level artist working within commercialism as a genre, doing art marketing as art itself with the flimsiest of raw materials and executing a conceptual performance piece about identity, appropriation, cliches, and myths about creativity’s sources that spanned a decade. When the script has Walter claim that he invented Pop Art and out-Warholed Warhol with his aggressive marketing strategies, we can read it “straight” within Margaret’s redemption story as a sign of Walter’s rampant egomania. But the film actually makes a solid case for that being plausible, stressing how Keane was able to bring art into the supermarket before Warhol brought the supermarket into art.

Similarly, when Margaret discovers that the Parisian street scenes Walter claimed were his own while wooing her were actually painted by someone else and shipped to him from France, she is shocked, and we are seemingly supposed to share in this shock and feel appalled. But it makes as much sense to want to applaud his audacity and ingenuity, his apparent ability to assemble and assume the identity of an artist without possessing any traditional craft skills at all. He’s sort of the ur-postinternet artist.

All of Big Eyes is shot as if the material has been viewed naively through the child-like big eyes of one of Margaret’s subjects, a perspective from which Walter’s acts just seem selfish and insane. But Burton is careful to allow viewers to regard the action from a more sophisticated perspective, which reads between the lines of what is shown and looks beyond the emotional valences of the surface redemption story being told. Margaret’s character always acknowledges Walter’s marketing acumen in the midst of detailing his misdeeds, and she never explains why she helped Walter perpetrate his fraud, other than to say, “He dominated me.” From what Burton shows and has Margaret say, this domination is less a matter of intimidation than charm. As awful as his behavior might have been in reality, Walter is little more than a cartoon villain in the film’s melodramatic domestic scenes; the misdeeds Burton depicts are Walter’s getting drunk and belligerently accusing one of Margaret’s friends of snobbery for rejecting representational art, and his flicking matches at Margaret and her daughter when he is disappointed about her work’s reception.

Of course, Walter’s primary crime is making Margaret keep her talent a secret (an open secret, apparently) — “from her own daughter!” even. He capitalizes on a sexist culture to take credit for Margaret’s ability, and then uses the specter of that sexist culture to control her, while more fully enjoying the fruits of what her ability brought them — the fame, the recognition, the celebrity hobnobbing, and so on. But Big Eyes also makes a point of undermining that perspective to a degree, making it clear that Margaret (in part because of that same sexist culture) never would have had the gumption to make a career out of painting without Walter’s support, and certainly she wouldn’t have been able to follow through with all the self-promotion necessary to sustain an art career and allow it to thrive. We are told that she didn’t want the spotlight; at the same time we are supposed to see her being denied the spotlight as part of her victimization. Walter helped create the conditions in which Margaret could paint as much as he exploited the inequities of those conditions. And Margaret triumphed in ways that go far beyond the limited accomplishment of earnest “self-expression.”

During the trial scene, which is supposed to be Margaret’s ultimate vindication, one instead gets a sense through her testimony, and Walter’s outlandish performance as his own lawyer, that he will stop at nothing to put across his vision of the world and himself, despite not having any talent with traditional materials of representation. Doesn’t that make him the greater artist, the film seems to suggest, that he can use other people as his medium? All Margaret can apparently do is the parlor trick of making a big-eye painting in an hour in the courtroom. Whereas Walter could get Life magazine to interview him and tell his story, he could contrive an elaborate backstory for how he suddenly came to paint waifs and kittens, and he could get his wife to willingly make all of his work for him and let him sign it as his own.

It is hard to walk away from Big Eyes without wondering just how much Margaret and Walter collaborated on the character of “Keane,” the artist who made compelling kitsch, and it’s hard not to feel sorry for him when, before the ending credits, we are shown a picture of the real Walter Keane, with text explaining how he died penniless while continuing to insist on his own artistic genius. I wondered if, in working with Burton, Margaret wasn’t still covertly collaborating with Walter, muddying the waters around their life’s work and letting some ambiguity flourish there. This impression, more than anything depicted explicitly in the film, gave me the strongest sense of Margaret’s character, beyond cliches of resiliency and self-actualization.


Selfies without the self

Taking selfies is routinely derided as narcissistic, a procedure of solipsistic self-regard in which one obsesses over one’s own image. But selfies are not solipsistic; they are only selfies if they circulate. The term selfie not only labels an image’s content (though this usage is slipping, as when TD Bank invites me to “take a check selfie” to deposit it); it also describes a distribution process. Selfie is shorthand not just for pictures you take of yourself but for one’s “self in social media” – one’s self commoditized to suit the logistics of networks.

As art critic Brian Droitcour writes:

Producing a reflection of your image in Instagram always involves an awareness of the presence of others, the knowledge that your selfie is flaking and refracting in their phones. Labeling this reflection #selfie tacitly recognizes the horizontal proliferation of reflections, the dissolution of personhood in the network. The real narcissists are the ones who never take selfies. They imagine their self as autonomous, hermetic—too precious to be shared.

If selfies are not narcissistic, they are sometimes perceived to be the opposite: too performatively strategic. Posting selfies is often seen as part of an effort to build social capital, an effort to deploy the self in a social network to gain attention, reputation, influence, and so on. It instrumentalizes self-representation; selfies are a way to explicitly conflate ourselves with objects to be manipulated. If Droitcour is right, selfies, when they enter circulation, aren’t a matter of self-expression (as their defenders sometimes claim) but of self-surrender. This could be a precursor to moving past the political limits of individualism, yet selfies nonetheless exemplify an instrumental attitude toward the self that may block intersubjectivity. You can “flake or refract” me, selfies seem to say, but only at the level of these images. This locks the terms of interpersonal engagement at the level of image exchange.

Selfies may be mistaken for autonomous self-expression: an assertive, short-circuiting gesture that recuperates the communication/surveillance platforms that otherwise contain the self. But selfies don’t tap a suppressed inner essence; they develop the “self” as an artisanal product line. What they express depends less on what they depict than on how well they circulate, what uses they are put to within networks.

The selfie commemorates the moment when external social control — the neoliberal command to develop a self as a kind of capital stock and serially reproduce oneself in self-advertisements — is internalized as crypto-defiance: I’m not going to consume their images, I’m going to make one of my own, take control of how I’m seen!

With selfies we can think we are asserting an agency that escapes control, though this is control’s exact contemporary mechanism: producing ourselves as an object for the network, performing the obligatory work of identity construction in a captured, preformatted space. Selfies, then, primarily signal the availability of the self to the network.

The practice of selfie-making doesn’t eradicate the infrastructure of commercially exploitable identity that is embedded in the media tools for “expressing” it. The selfie doesn’t invent a language of identity; it marks a voluntary entry into established codes, reinforcing their validity even if a particular selfie tries to subvert them.

Alexander Galloway claims that the economic mobilization of self-production that selfies epitomize has prompted a new “politics of disappearance”:

The operative political question today, thus, in the shadow of digital markets is … the exodus question: first posed as “what are we going to do without them?” and later posed in a more sophisticated sense as “what are we going to do without ourselves?”

Maybe selfies are a step in the direction of answering that. The selfie is sometimes condemned for its inauthenticity, but in its explicit constructedness, the selfie may herald the emergence of a postauthentic self: an overtly manufactured self that is confirmed and rendered coherent in an audience’s reactions, always changing with each image, as opposed to a static “real self.”

In other words, selfies assault the notion of autonomous, persistent, transcendent identity. Intentionally or not, the willingness to take them and share them demonstrates that you don’t believe in the “authentic” self inside but instead in the desire to be remade anew in any given moment. Selfie taking recognizes that the notion of the “self” always implies another’s point of view on it, a perspective that generates it. The act of taking a selfie simulates and evokes that outside point of view. It makes our self real to us, something we can experience and consume, at the cost of pretending to be someone else as we look.

Selfies break us out of the cage of static identity, but the platforms they are posted to shove us back in, associating and attempting to integrate all the data they generate. The platforms affirm that I’m a discrete self, one baseball card in their pack, with my statistics always printed on the back.

Social Media Is Not Self-Expression


1. Subjectivation is not a flowering of autonomy and freedom; it’s the end product of procedures that train an individual in compliance and docility. One accepts structuring codes in exchange for an internal psychic coherence. Becoming yourself is not a growth process but a surrender of possibilities that we learn to regard as egregious, unbecoming. “Being yourself” is inherently limiting. It is liberatory only in the sense of freeing one temporarily from existential doubts. (Not a small thing!) So the social order is protected not by preventing “self-expression” and identity formation but by encouraging it as a way of forcing people to limit and discipline themselves — to take responsibility for building and cleaning their own cage. Thus, the dissemination of social-media platforms becomes a flexible tool for social control. The more that individuals express themselves through these codified, networked, formatted means to construct a “personal brand” identity, the more they self-assimilate, adopting the incentive structures of the capitalist social order as their own. (The machinations of Big Data make this more obvious. The more data you supply, the more the algorithms can determine your reality.) Expunge the seriality built into these platforms; embrace a more radical form of difference.

2. In an essay about PJ Harvey’s 4-Track Demos, Michael Barthel writes:

while she was able to hole up in a seaside restaurant and produce a masterpiece, I need constant feedback and encouragement in order not to end up curled in some dark corner of my house, eating potato chips and refreshing my Tumblr feed in the hope that someone will have “liked” my Photoshopped picture of Kanye West in a balloon chair.

He’s being a bit facetious, but this is basically what I’m trying to get at above: the difference between an inner-directed process of discovery and a kind of outer-directed pseudo-creativity that in its pursuit of attention gets overwhelmed by desperation. I’m trading in a very dubious kind of dichotomizing here, I know — artists make a lot of great work for no greater purpose than attention-seeking, and the idea that anything is truly “inner-directed” may be an ideological illusion, given how we all develop interiority in relation to a social world that precedes us and enables us to survive. But what I am trying to emphasize here is how production in social media is often sold to users of these platforms as self-expressive creativity, as self-discovery, as an elaboration of the self even, when it is really a narrowing of the self to the reductive, defensive aim of getting recognition, reassurance of one’s own existence, that one belongs. That kind of “creativity” may crowd out the more antisocial kind that may entail reclusion, social disappearance, indifference to reputation and social capital, to being someone in particular in a network. Self-invention in social media that is perpetually in search of “feedback” is really just the production of communication, which gives value not to the self but to the network that gets to carry more data (and store it, and sell it).

Actual “self-invention” — if we are measuring it in range of expressivity — appears more like self-dissolution. We’re born into social life and shaped by it; self-discovery may thus entail a destruction of social bonds, not a sounding of them.

Barthel lauds the “demos, experiments, collaborative public works, jokes, notes, reading lists, sketches, appreciations, outbursts of pique” that are “absolutely vital to continuing the business of creation.” But the degree to which these are all affixed to a personal brand when serially broadcast on social media depletes their vitality. If PJ Harvey had released the demos as she made them to a Myspace page, would there ever have been a finished Rid of Me? Would the end product merely have been PJ Harvey, the fecund musician?

Social media structure creative effort (e.g., Barthel’s list above) ideologically as “self-creating,” but that effort often ends up anxiety-inducing, exposing the self’s ad hoc incompleteness while structuring the demand for a fawning audience to complete us, to validate every effort, as a natural expectation. Validation is nice, but as a goal for creative effort, it is somewhat limited. The quest for validation must inevitably restrict itself to the tools of attracting attention: the blunt instruments of novelty and prurience (“Kanye West in a balloon chair”). The self one tries to express tends to be new, exciting, confessional, sexy, etc., because it plays as an advertisement. Identity is a series of ads for a product that doesn’t exist.

The process can’t quell anxiety; this kind of self-expression can only intensify it, focus it onto a few social-media posts that await judgment, narrow it to the latest instances of sharing. Social media’s quantifying metrics aggravate the problem, making expression into a series of discrete items to be counted and ranked. They serve as the infrastructure for a feedback loop that orients expression toward the anxiety of what the numbers will be and accelerates it, as we try to better those numbers and thereby demonstrate that the self-monitoring is teaching us something about how to become more “relevant.”

The alternative would seem to be a sort of deep focus in isolation, in which one accepts the incompleteness that comes from being apart from an audience, that comes from not seeking final judgment on what one is doing and letting it remain ambiguous, open-ended, of the present moment and not assimilated to an archive of identity. To put that tritely: The best way to be yourself is to not be anybody in particular but to just be.

3. So is the solution to get off the Internet? If social media structure social behavior this way, just don’t use them, right? Problem solved. Paul Miller’s 2013 account at The Verge of his year without Internet use suggests it’s not so simple. Miller went searching for “meaning” offline, fearing that Internet use was reducing his attention span and preoccupying him with trivia. It turns out that, after the momentary shock of having his habits disrupted, Miller fell back into the same feelings of ambient discontent, only spiked with a more intense loneliness. It’s hard to escape the idea of a “connected world” all around you, and there is no denying that being online metes out “connectedness” in measured, addictive doses. But those doses contain real sociality, and they are reshaping society collectively. Whether or not you use social media personally, your social being is affected by that reshaping. You don’t get to leave all of society’s preoccupations behind.

Facebook is possibly more in the foreground for those who don’t use it than for those who have accepted it as social infrastructure. You have to expend more effort not knowing a meme than letting it pass through you. Social relations are not one-way; you can’t dictate their terms on the basis of personal preference. As Miller puts it, describing his too-broad, too-pointed defiance of the social norms around him, “I fell out of sync with the flow of life.” Pretending you can avoid these social aspects of life because they are supposedly external, artificial, inauthentic, and unreal is to have a very impoverished idea of reality, of authenticity, of unique selfhood.

The inescapable reciprocity of social relations comes into much sharper relief when you stop using social media, which thrive on the basis of the control over reciprocity they try to provide. They give a crypto-dashboard to social life, making it seem like a personal consumption experience, but that is always an illusion, always scattered by the anxiety of waiting, watching for responses, and by the whiplash alternation between omnipotence and vulnerability.

Miller’s fable ends up offering the lesson that the digital and the physical are actually interpenetrated, and that all the personal problems he recognizes in himself aren’t a matter of technologically mediated social reality but are basically his fault. This seems too neat a moral for this story. Nothing is better for protecting the status quo than convincing people that their problems are their own and are entirely their personal responsibility. This is basically how neoliberalism works: “personal responsibility” is elevated over the possibility of collective action, a reiteration of the requirement to “express oneself” as an isolated self, free of social determination, free for “whatever.”

What is odd is that the connectivity of the internet exacerbates that sort of neoliberal ideology rather than mitigating it. Connectivity atomizes rather than collectivizes. But that is because most people’s experience of the internet is mediated by capitalist entities, or rather, for the sake of simplicity, by capitalism itself. You can go offline, but that doesn’t remove you from the alienating properties of life in capitalist society. So the same “personal problems” the Internet supposedly made you experience still exist for you if you go offline, because you are still in a capitalist society. Capitalist imperatives are still shaping your subjectivity, structuring your time and your experience of curiosity, leisure, work, life. The internet is not the problem; capitalism is the problem.

Social media offer a single profile for our singular identity, but our consciousness comprises multiple forms of identity simultaneously: We are at once a unique bundle of sense impressions and memories, and a social individual imbued with a collectively constructed sense of value and possibility. Things like Facebook give the impression that these different, contestable and often contradictory identities (and their different contexts) can be conveniently flattened out, with users suddenly having more control and autonomy in their piloting through everyday life. That is not only what for-profit companies like Facebook want, but it is also what will feel natural to subjects already accustomed to capitalist values of convenience, capitalist imperatives for efficiency, and so on.

So Miller is right to note that “the internet isn’t an individual pursuit, it’s something we do with each other. The internet is where people are.” That’s part of why simply abandoning it won’t enhance our sense of freedom or selfhood. But because we “do” the internet with each other as capitalist subjects, we use it to intensify the social relations familiar from capitalism, with all the asymmetries and exploitation that come with them. We “do” it as isolated nodes, letting social-media services further suppress our sense of collectivity and possibility. The work of being online doesn’t simply fatten profits for Facebook; it also reproduces the conditions that make Facebook necessary. As Lazzarato puts it, immaterial “labour produces not only commodities, but first and foremost the capital relationship.”

4. Exodus won’t yield freedom. The problem is not that the online self is “inauthentic” and the offline self is real; it’s that the self derived from the data processing of our digital traces doesn’t correspond with our active efforts to shape an offline/online hybrid identity for our genuine social ties. What seems necessary instead is a way to augment our sense of “transindividuality,” in which social being doesn’t come at the expense of individuality. This might be a way out of the trap of capitalist subjectivity, and of the compulsive need to keep serially producing, in a condition of anxiety, so as to seem to manifest and discover the self as some transcendent thing at once unfettered by and validated through social mediation. Instead of using social media to master the social component of our own identity, we must use them to better balance the multitudes within.