Marginal Utility
By Rob Horning
A blog about consumerism, capitalism and ideology.

Vinyl re-enchantment


The Economist’s website has an article about Record Store Day, a marketing stunt during which a bunch of vinyl-only releases and reissues are choreographed in hopes of driving music buyers to support some brick-and-mortar businesses. Every label wants to piggy-back on the hype of every other release, leading to an overwhelming hodge-podge of material record stores are supposed to carry to be full participants in the event. This creates problems for the stores that the event is supposed to help, saddling them with stock whose appeal to non-hardcore record collectors may already be questionable.

I have never understood the point of Record Store Day, in part because I have not traditionally been sentimental about record stores. I tend to associate them with judgmental clerks and aggressive taste peacocking and stereos playing the most confrontational music the workers could get away with to keep the store clear of unwanted browsers. Independent record stores often seemed more like clubhouses, and I was never confident enough in my tastes to believe I could truly belong. The record store was a place where “nerds” could be vengeful bullies; now that we have a whole culture that is like that, record stores feel a bit redundant in that respect.

But the more salient reason Record Store Day repels me is that it runs counter to what I do find appealing about shopping in small record stores: the fact that I can’t predict what they will actually have in stock. Record Store Day supplies you with a prefab shopping list and an easy way to cross off every item on it — just come early (or have a friend who works at the store; it’s still a private club, a market in which it matters who you know). If I wanted to shop in a market that I knew in advance would have what I want, I would go to Amazon, or to Spotify. But I started to buy vinyl again not for the records so much as for the intermittent rewards. Going to a used record store not knowing what I will find allows me to go in not knowing for sure even what I want — and this expands my capacity for desiring things. It re-enchants consumption for me, for better or worse. I have a list in my head of records I hope to come across some day, but since I can download all this music to actually listen to it, I am more invested in the quest itself than in its completion. It keeps me flipping through crates, looking for a lottery-like payout.

Record stores are a bit like thrift stores in that they produce a sense of rarity and serendipity, a shopping experience that can trump whatever it is one ends up buying. When I go to record stores, I want to enter into a fantasy of one-of-a-kind finds, of consumption sweetened by its contrived precariousness. I want to pretend I’m in a world where you have to earn your consumer pleasures, and where the bond between pleasure and ownership is still tight. (This is a depressing realization.) It’s not uncommon for me to be so excited about coming across a record I love that I’ll buy it again, “forgetting” that I already have a vinyl copy.

I want there to be some sort of excuse for taking pleasure in the sheer act of buying records. I want to pretend record buying is not just another species of luxury indulgence, so I come up with specious theories about why it is somehow righteous — ethical, even. (Record Store Day smacks of this sort of moral posturing.) In The People’s Platform Astra Taylor makes a strong case for “sustainable cultural consumption” supporting the creative ecosystem of artists by paying for their work — but buying used records hardly qualifies for that. Instead, I am susceptible to fetishizing records as occulted objects, making claims about how they capture the way recorded music is “really” supposed to sound. (The Economist article mentions a museum exhibition in Oakland devoted partly to this premise that listening to vinyl is more “intentional,” whereas this post does some work to demystify that illusion, the mistake of confusing nostalgia for a medium with measurable sonic superiority.) I am prone to assigning my records an aura, marveling at their unique patina, the skips and scratches and pops that bequeath me a genuinely unique listening experience. No one else out there gets to enjoy those bona fide skips on the copy of Moby Grape’s Wow that I just bought at the WFMU record fair. Those are now for me alone.

My craving for these purely idiosyncratic consumption experiences has something to do with wanting to enjoy something unsharable, something that can’t go viral, as though that might authenticate it in the solipsistic counter-reality I try to create for myself. The curator of the Oakland exhibition remarks that albums, as material objects, place the emphasis on music appreciation’s “social aspect,” but I think that’s backward. I want to use records to prove that I am “better” than those social pleasures of validation that are now so readily sought online. I want records so I can try to remind myself that I can get autonomous joy from a private world of things.

This is essentially sociologist Colin Campbell’s theory of modern consumerism, which is summarized clearly in this paper. The problem with “autonomous hedonism” pursued in the individual imagination is that “the more proficient one becomes at creatively imagining emotions and sensations, the more likely it is that ‘real’ consumption fails to deliver a comparable intensity of pleasure.”

I find that I yearn for pleasure in pure ownership because I don’t have time for use value. I don’t have any time to play the records I already own. In fact, I resumed buying vinyl a few years ago, before I even had a working turntable. I started to assemble a collection for the sake of the act of collecting, because I was overwhelmed by music online but still wanted to maintain a strong affective bond to it. Keeping buying and enjoying linked in my mind was the only way I could think to do it. My record collection sits in my living room as a testament to the failure of my imagination, the deficiencies in my aesthetic capability. I wonder how many times I will need to play them for penance, until my listening becomes authentic at last.

“Surveillant anxiety” and exceptional conformity


“What does the lived reality of Big Data feel like?” Kate Crawford asks in “The Anxieties of Big Data.” Part of her answer is “surveillant anxiety,”  a double-sided concept meant to capture the mounting fears of both the watchers and the watched, and the way in which they fuel each other. The agencies conducting surveillance collect so much data that “the sheer volume can overwhelm the critical signals in a fog of possible correlations.” The more they know, the more they fear they can’t understand what it indicates. As a consequence they try to collect more data and refine the correlations their algorithms churn up, attenuate the information with theory and expertise from an increasing range of social science disciplines, and make big data smaller, more granular, even as the volume increases. But this only defines more unknown pieces, undertheorized relations between seemingly correlated data points. So the fear intensifies.

At the same time, those surveilled recognize that they are being watched more intently, increasing their fear of what such scrutiny can turn up about them, what sort of synthetic “facts” about them the raw data can be dressed up to suggest. Crawford argues that this leads to a “populace that wishes nothing more than to shed its own subjectivity.” People want to be able to disappear, to blend in, to avoid having subjectivity if all that means is having one ascribed to you. As the need for social camouflage becomes more urgent, it also becomes — in the classic recuperative manner of capitalism — more aestheticized, stylized. It becomes, as Crawford points out, “normcore.” The anxiety about surveillance, as it is recognized and commercialized, turns into an anxiety about status, about fashionability — about maintaining one’s cool:

the rapid rise of the term normcore is an indication of how the cultural idea of disappearing has become cool at the very historical moment when it has become almost impossible because of big data and widespread surveillance.

The twist that normcore has taken, from being an art term capturing the yearning for desubjectivation over capitalist subjectivity to being a term for a new distinctive fashion trend, suggests how quickly the desire to escape unwarranted institutional notice becomes a desire to be seen in the right way, to control how one is noticed. Left behind is the ability to imagine that social surveillance is not a given; instead it feels even more total, more thoroughly ubiquitous than it probably is in practice. Concepts like normcore help convey the plausibility of a panoptic society, even as the institutions ostensibly operating it are getting overwhelmed by data they can’t process. The concept does the disciplinary work that the hermeneutically challenged agencies can’t.

In other words, normcore is an “apparatus” that controls individual subjects just as much as the state spying agencies. In “What Is an Apparatus?” Giorgio Agamben argues that “what defines the apparatuses that we have to deal with in the current phase of capitalism is that they no longer act as much through the production of a subject, as through the processes of what can be called desubjectification.” In exchange for permitting oneself to be defined by the structuring apparatuses of society, one comes away with a subject that is a nonsubject, one that experiences a distinction in being undistinguished, that revels in exceptional conformity.

Agamben also points out how generic subjectivity leads to a more extensive and pervasive need for surveillance and discipline:

It is only an apparent paradox that the harmless citizen of postindustrial democracies … who readily does everything that he is asked to do, inasmuch as he leaves his everyday gestures and his health, his amusements and his occupations, his diet and his desires, to be commanded and controlled in the smallest detail by apparatuses, is also considered by power — perhaps precisely because of this — as a potential terrorist.

The paradox is that the more docile and compliant one is, the more one seems like a suspect in the eyes of authority. This is the governing logic of the big data era, as it has been of every aspiring totalitarian regime. To continue to justify its increasing intrusiveness and expansiveness, our society’s data-collecting capacity requires the view that everyone will ultimately be found guilty.

When everyone is presumed guilty, does that encourage them to be guilty in fact? Since the state already suspects them, will they begin to act suspiciously? Will the additional information the state continually needs to know about us begin to seem inherently subversive to ourselves? Agamben is skeptical about the subversive potential of this “elusive element”:

The more apparatuses pervade and disseminate their power in every field of life, the more government will find itself faced with an elusive element, which seems to escape its grasp the more it docilely submits to it. This is neither to say that this element constitutes a revolutionary subject in its own right, nor that it can halt or even threaten the governmental machine.

To threaten the “governmental machine” — or to reframe the question in Crawford’s terms, to “find a radical potential in the surveillant anxieties of the big-data era” — Agamben argues that we need to “profane the apparatuses.” If I knew what he meant by that exactly, I would tell you, but it seems to have something to do with reversing the processes of separation, of making “sacred,” and therefore unusable, the resources we might otherwise share in common. I am trying to think this through with respect to symbolic resources — signs, signifiers, the things that are temporarily consecrated by fashion (in the service of capitalism’s need for enchanted goods) and then voided (in the service of capitalism’s need for accelerating circulation of goods). With individuated social media, our appropriation of signifiers becomes a process of value creation, and thus a process of re-consecration, of separation, of denying what is common in a common resource, language. Phenomena like normcore seem to yearn to arrest this process and describe a way to be in the world without having to create value. But that is just nostalgia in a time when value is extracted from virtually every move we make. To profane apparatuses, we have to cease to be one ourselves.



Me Meme

With social media, the compelling opportunities for self-expression outstrip the supply of things we have to confidently say about ourselves. The demand for self-expression overwhelms what we might dredge up from “inside.” So the “self” being expressed has to be posited elsewhere: We start to borrow from the network, from imagined future selves, from the media in which we can now constitute ourselves.

This doesn’t guarantee “personal growth” however. The more we express ourselves for self-definition, the more we limit the self to what we have the means to circulate. The sort of self we can imagine ourselves to be becomes contingent on the available media.

Social media offer a wealth of new resources (images, links, likes, screen grabs, image macros, serial selfies, emoji, etc.) to continue to bait us into “becoming ourselves.” These seem to let us express the self without the limits of language, or the fixed centrality of the speaking subject. The self expressed through a suite of social-media accounts seems to pour into networked space from all over the place, spilling over the edges of unitary profiles and giving users the freedom to renege on a precomposed identity in favor of ongoing glimpses at the process of self-composition.

The push toward a self-concept of perpetual becoming would seem to thwart subjectivation as social control. But the more we construct identity through the means of social media, the more we self-assimilate to the incentive built into them: to turn all experience into more and more strategic expression.

How damaging is this? Users generally behave as though social media are safe spaces to reveal the self as an imperfect work in progress, but these media typically compile a permanent archive of our “becoming,” which negates its provisional nature. As much as the self feels uncontainable, it is contained.

It becomes the user’s problem to manage the tension between the static profile and the real-time experience of being a self in dynamic social-media feeds. How do users balance the freewheeling pursuit of immediacy, attention and salience within the network with the threat that it will all go on their permanent record?

The tension between the archive and the feed puts pressure on “authenticity,” which lingers on as a compass for the self. If one believes a unique “true self” exists, then the archive is always inauthenticating it, exposing contradictions, disproving its spontaneity, and undermining its supposed originality.

Active social media use urges a de facto rejection of the “true self” that users may not be ready to accept ideologically. It calls for a self that is reconstituted (for others and for oneself) the same way a recommendation page or a news feed is reconstituted in real time.

This pressure fractures authenticity into two divergent sorts of “self”: a self anchored in a “negative theology,” and a postauthentic self. These work simultaneously as complements.

The possibility that everything we do on networks is surveilled has generated an urge to remove the “true self” from charted territory.

In an essay for Rhizome about Google Glass, artist Molly Crabapple argued that “the things we once called souls are not legible to algorithms […] The network alters you in ways that make you more legible to the network. But maybe there are some things it still can’t get.”

As we make ourselves, or are made, more legible to the network, we may become more aware of what isn’t translatable to its formats. Deeper engagement with the network could provide a more thorough awareness of our “soul.”

In “Returning the Dream,” Adam Phillips quotes psychologist D.W. Winnicott: “At the center of each person is an incommunicado element, and this is sacred and most worthy of preservation.” Phillips interprets this as a “negative theology of the self” and argues that in Winnicott, “the aim of intimacy is to sponsor the solitary unknowability of the True Self.”

If that’s so, then social media would seem to destroy intimacy, replacing it with sharing and copious self-expression. But there is no reason to equate what is shared in social media with what is “authentic” about the self. Social media can support Winnicott’s negative theology of self by allowing us to express precisely what is inessential about ourselves, establishing the negative space where the ineffable self can reside.

Sharing is thereby a process of shedding, a way to purge what we think we know about ourselves but which is inherently wrong because expressible. Our “true self” is then constituted in the silences. The network, in this sense, actually alters us to have a soul, preparing the ground for it. And intimacy, then, is going over someone’s social media offerings and reassuring them that none of it seems much like them at all. The death of intimacy is when someone looks at your Facebook and thinks you’ve got yourself pegged.

But if our “deepest self” is inexpressible by definition, it is nonsensical to make “expressing the true self” into a life goal. Personal or artistic expression cannot then be about “sincerity” or “authenticity.” Yet few would accept that what they express through social media is inauthentic or false. What is going on there?

What’s going on there is the postauthentic self. The uselessly inexpressible “true self” would doom us to an unbridgeable isolation despite social media’s ubiquitous connectivity. It requires the postauthentic self as functional complement.

The postauthentic self is premised on the possibility that one’s traces — the ephemera of everyday life that are being ejected from the “negative theology” self — can be processed into a makeshift identity on social media platforms, through feedback and algorithmic processing. This identity can be shared and consumed not only by others but by oneself. You can enjoy yourself as a novelty item rather than confront yourself as a problem to solve, or a set of inborn constraints. The “What Would I Say on Facebook” meme was a small example of this, of how self-expression can be supplanted by self-consumption.

Postauthenticity (you might call it “normcore”) rationalizes the terror of ubiquitous surveillance and makes accelerated consumption benevolent. Doing more online doesn’t cause more anxiety about whether what you’re doing is “right”; instead it fuels the algorithms’ generation of a new version of yourself to consume. The burden of uniqueness is on the algorithm, not us.

Being on social media deepens awareness of what can’t be shared, protecting the exclusionary purity of the inexpressible within us, beyond public appropriation. Meanwhile metrics and algorithms stand in as a quasi-populist proxy for the postauthentic self’s approving other in the circuit of self-production.

From brand to meme

Postauthenticity rejects specific consumerist signifiers of the self (their increased circulation in social media makes their meaning unreliable anyway) in favor of the “engagement” metrics that track content, which become the newly reliable basis for the self. With the self grounded in metrics rather than specifics, one positions oneself in the social-media environment less as a personal brand than as a meme. One adopts a “viral self,” anchored in continual demonstrations of its reach, based on ingenious appropriation and aggregation of existing content, not in its fidelity to a static inner truth or set of tastes. It is defined by its ability to circulate, not by the content of what it circulates.

To make the self-as-meme register, it’s imperative to associate oneself with dynamic, emotionally resonant things that circulate virally. When you like a brand, for instance, nothing really happens. Since the viral self is a self constructed to exist in feeds, it requires material that is not inert. By sharing viral content, the self is in play. We don’t circulate memes so much as the memes circulate us.

Virality serves as an equally tautological token of authenticity for the viral self, just as the spontaneous yet inexpressible self is for the negative-theology self. It indicates the presence of “real” feeling.

Vicarious identification with metrics

Content recedes to a mere alibi for engaging emotionally with the circulation data, leading to vicarious identification with how information travels rather than with what that information encodes.

We shift from consumerist pleasures of fantasizing about how owning certain branded goods would make us into a certain kind of person and secure us a certain sort of affirmation to fantasizing about triumphant moments of social quantification, about getting likes and retweets, having lots of Tumblr activity, etc.

Virality becomes the horizon beneath which occurrences no longer figure socially, no longer count for anchoring identity or asserting a self.

Without viral content, you are in danger of becoming a blank.

A look at the most notorious supplier of viral content, Upworthy, may shed some further light on how to adopt virality as a technique of the self and engineer it as an end in itself.

No content (or self) is inherently viral; likewise, no content is inherently nonviral. Optimization techniques can always be applied. Upworthy’s putative goal is to get its preferred content past filters and into massive circulation, but their tricks end up trumping the content, so the techniques to circulate the material are more revealing than the “things that matter” that they are about. Circulation itself becomes the content.

Most notorious of these techniques is the much-parodied curiosity-gap headline: “You Won’t Believe etc.” But as employees explain in a recent New York magazine profile of the company, the specific formula is not the key and could easily change. What matters is which approaches get results, as revealed in A/B tests of competing headlines. Hence the “real version” of any story’s packaging derives from the circulation process, with rejected versions doing nothing to compromise the winning one. Upworthy’s success in social media feeds suggests we can approach identity in the same way.

Just as genuineness has proved irrelevant to content’s potential virality (stories are sometimes debunked after going viral), it is also irrelevant to the viral self, whose “authenticity” is an after-effect of having marshaled an audience. “Realness” is a matter of attention metrics rather than the fidelity of the content to some core “truth.”

Typical Upworthy material does not privilege uniqueness or complex emotional responses. Such subtleties belong to the negative-theology-of-self world of being so unique, you cannot be expressed comprehensibly. Rather, Upworthy emphasizes the use of formulas because they are comforting and uncool, and repel cynics who are more invested in exclusion than inclusion. They say they try to “reach people where they’re at”: “You don’t want to be that guy in your Facebook feed going, ‘These ReTHUGlicans out there …’ ”

That may be good advice in itself, but more significant, I think, is the assumption about who “you” are: a guy in a Facebook feed. Upworthy content strives to let you become that guy and not get filtered out yourself from the site of viral selfhood.

Upworthy’s intensive use of formulas illustrates how virality becomes a formal genre, recognizable independent of its circulation data. This helps permit one to experience virality as a feeling and identify with it in the face of inevitably disappointing actual metrics.

By encoding audience enthusiasm at the level of form, viral content permits vicarious participation not only in the viral story — whose apparent popularity helps encourage an indulgent suspension of disbelief — but in the social itself, conceived as a flow of information in social networks. The function of a story in social relations is shifted: it is not retold in time and dependent on co-presence, but instead linked to in space and “told” — that is, the identity quotient harvested — through circulating in networks.

If virality is the model, then social-media participation ultimately has no “content.” It is just a matter of exchanging and tallying gestures of sharing. As virality supplants authenticity, the emotion that viral content provokes, a feeling of spreading connection, threatens to become the root of “authentic” feeling. Having other sorts of feelings becomes pointless if you can’t be seen having them. We will want to feel only what will spread.

For instance, because I know my reactions while reading can be performed on Twitter, I am sure to have a reaction — to method-act my response and see how it goes over. That is an added incentive for emotionality that I don’t get from perusing People in a grocery-store line.

The ubiquity of virality makes it seem as though one can fit in only by spreading oneself indiscriminately. Social media sustain a measurement system that makes “more attention” seem always appropriate and anything less insufficient. If your appropriated content is not circulating ever more widely, then you are disappearing. This can feel like total exclusion: You are adding nothing to the social bottom line. You are not inspiring anybody. But it is also a confirmation of the other sort of “authentic” self that must disappear to actually exist.

Artistic autonomy and subsumption

Manufactured Paradise

I’m participating in a seminar at the ACLA conference in New York this weekend, called “Culture and Real Subsumption.” Here are some hastily typed thoughts inspired thus far by it. I am always hazy on what constitutes real subsumption as opposed to formal subsumption (I tried to wrangle with it a little bit here; the terms stem from Marx’s “Results of the Immediate Production Process”); as I understand it, formal subsumption is when pre- or noncapitalist production processes are modified to accommodate the divide between capital and labor, whereas real subsumption is when the production process depends on capital from the get-go and is inconceivable without it, without its scale, its sort of factory organization and division of labor and staging of expropriated cooperation between workers and so on.

I’ve been interested in this terminology as a way to talk about “the real subsumption of subjectivity” as Jason Read puts it — what I take as a kind of consciousness imposed by capitalism that subjects have difficulty thinking outside of. The production of this subjectivity by the individual, subsumed as it is by capitalism, naturally is a profit center for capitalists who manage in various ways to own the means of this production. (Usually I’m thinking of these capitalists as being media companies or social-media platforms.) Identity production is subsumed in that we can’t contrive ways of doing it convincingly for ourselves without it also being a form of exploitable labor for someone else, or in the best-case scenario, for ourselves as a mechanism of accumulating “human capital.” We make our identity with an intrinsic entrepreneurial awareness of the sorts of social relations we are articulating and the sorts of status games we are playing and valorizing. Any self-knowledge outside of that matrix seems not to count, not to be our “real” validated self, or conversely, our real self remains precisely what we can’t articulate, because once we articulate it, even to ourselves, it becomes semiotic capital of one form or another. So “identity” is either a personal brand or something uselessly ineffable.

This pertains to the discussions in the seminar, which have circled around the idea of artistic autonomy. In what sense, if any, can artists or artworks be autonomous? Or what structures the experience of art such that making it gives one a feeling of autonomy? What does it even matter if an artwork is autonomous? Doesn’t that just mean that the culture has succeeded in depoliticizing it? Is autonomy a form of resistance? A form of exodus? A mode of “cruel optimism”? Isn’t the experience of autonomy itself subsumed under capital? The assumption of risk that makes autonomy register — the feeling that what one is doing is not predetermined, its outcome is not preordained, it is a matter of free choice — depends currently on the neoliberal organization of society. It thrusts risk at us and invites us to self-manage, to develop and pursue our own projects (provided the profits they generate accrue to capital).

Generally, the artist’s autonomy is autonomy in relation to the market: artists are “free” to act as they please if they are working without commercial constraints. But of course, that freedom usually is contingent on economic independence that must be conferred on the artist through some means. (Or maybe artistic autonomy strictly consists of being able to tolerate hunger.) It may be more that the market structures the possibility of autonomy as a relative freedom from its determinations (not an absolute transcendence of it, and of the need to “earn a living” via a labor market). Artistic autonomy, it seems to me, is a consolation prize for a certain kind of risk-heavy labor that artists and other “creative free agent” types take on — this is something Boltanski and Chiapello’s New Spirit of Capitalism touches on, I think. The “artistic critique” of capitalism, that it leads to drudgery, is resolved by making workers enjoy more creativity within their work, which tends to be a matter of obliterating the work/leisure divide. If you can no longer identify an autonomous sphere of leisure for yourself, you may as well assume that it’s because working itself has become so much “fun.” Social media implement that blurring of work and leisure extremely well, as I argue in this paper.

Under neoliberalism, the ability to enjoy or make art can seem like a consolation prize for entrepreneurial subjectivity, the best modality of that sort of subjectivity rather than a respite from it. Enjoying and making art, in some ways, become more and more the same experience of curation in internet-based art — the value of an individual work becomes hard to differentiate from the value of being able to circulate it meaningfully and make its value augment itself through greater exposure. The “prosumer” mentality comes to govern aesthetics and autonomy, as autonomy is experienced in the ersatz freedom to consume what you want, and to consume it well, and to make what you want of yourself through those appropriative gestures. (Appropriative art being a kind of production that is necessarily marked by tasteful and clever consumption.)

But the perspective that somehow people can be artists outside of capitalism, or prior to their experience of capitalism, is wrong. It’s not that artists are born artists, then capitalism corrupts them. It’s that capitalism sets up a situation where people with certain means can experience themselves as artists and try to move away from more determined-seeming modes of subjectivity within capitalism. The “artists” have the wherewithal and the habitus to try to distance themselves from wage drudgery and meaningless work and declare themselves autonomous — but within capitalism. It’s a measure of capitalism’s continued success and expansion that more and more people feel confident in describing themselves as creative, as artists. The neoliberalist turn hinges precisely on this, that more and more people can imagine themselves artists — in part because ordinary consumption has become a mode of personal expression, in part because capital has placed various forms of audience-building media at nearly every nonimpoverished individual’s disposal, in part because every scrap of one’s life gets turned to account as reputation, as human capital. We get an audience for our creative autonomy in action, a scenario which depends on (is subsumed by) the apparatus of communicative capitalism. If we are being “creative” without an audience, it no longer registers as an expression of autonomy; social media have crowded out the space in which an individual could be content to create without spectators. Now that is simply a failure of nerve, not independence; it has become too easy to circulate one’s gestures of creativity for anyone to rest easy in obscurity.

The paper I submitted for the seminar is a version of this post about machine gambling as an analogue for social media. Machine gambling is a visible manifestation of the real subsumption of autonomy — it’s the pathological management of risk and choice by gamblers, which generates a steady stream of profit for the machine owners. The machines are engineered to usher users into the “zone” — a flow state that feels like a fusion of thought and action, of pure autonomy. But in practice it is an instigated compulsion that plays on the brain’s reward system; it’s the very opposite of autonomy at the same time as the purest expression of it. Users become literally fused with the machine, inhabiting a machinic subjectivity, as programmable as a robot through the gaming machines’ sensory overloading and payout schedules.

So the “machine zone” is an expression of the real subsumption of autonomy, autonomy deliberately engineered as an addictive commercial product that turns consumers simultaneously into workers for capital. (Maybe the most chilling thing in Natasha Dow Schüll’s book about machine gambling is her description of “continuous gaming productivity,” the industry’s term for getting users to insert money into machines at a steady, predictable rate.) For users, this experience of autonomy may be “reparative” or restorative insofar as it relieves them of the burden and precarity of entrepreneurial subjectivity by overloading them with it. Machine gamblers indulge rational choice to the point where their subjectivity dissolves and they escape into automaticity. They become a “machine person” free of the neoliberal demands for managing risk and turning all behavior to account, through a kind of perverse totalization of that demand within the gambling arena. The addict’s warped experience of autonomy is confirmed by the affective experience of the zone. Playing video poker is a radical exodus from the neoliberal self.

Social media function similarly, if not more deviously. They too invite entering a zone of mindless, machine-driven checking of social-media accounts, looking for their intermittent rewards (likes, comments, retweets, acknowledgments, etc.). They allow for an emptying of the self through sharing it — one can expel the contents of the self into the circulatory machinery of online social networks, where its fate becomes the network’s problem, not yours. But since social media work as a site of explicit self-production as well as a covert means of self-abnegation, the alibi of using social media to build human capital is more effective. It better disguises the compulsion as productive, efficient. It masks the exploitation without seeming obviously irrational, the way sitting at a poker machine for 40 consecutive hours might strike some people. Checking Twitter every 30 seconds somehow seems less irrational, within the bounds of normal behavior. All the while, the pseudo-autonomy of social media use is binding subjects to communicative capitalism’s pleasures and insecurities, rationalizing its demands for constant work without monetary compensation. Social media work as a mode of self-exploitation carried out in the name of personal expression.

The compulsion of using social media for relief from neoliberalism may feel like artistic autonomy, one’s discovery that by using Facebook, one is secretly a performance artist. But it remains also a neoliberal practice of microentrepreneurship. (It’s hard to see how art can be divorced from such entrepreneurship in one’s own reputation or reified creativity.)

Social media use is arguably a masochistic practice that dissolves the self while simultaneously building it out as data/capital for media companies and marketers. (I spell out the masochism part here.) This empties the self phenomenologically, leaving a blankness that engages with the various interfaces. But this process feeds data into the networks’ algorithms, which can then restore the self to the social media user as a processed good — a substantiated identity that is objective, a reflection of achieved reputation, achieved human capital. Once again, this resolves some of the pressure of neoliberal subjectivity while sustaining it as an essential form. The self is reported back to us as a jackpot of algorithmically synthesized personal “truths” — and these payoffs keep us somewhat mindlessly engaged with social media. The urgency of producing the self as capital switches into the consumer experience of passively enjoying the produced self as a pleasurable product, then switches back into an insecure search for confirmation through the production of more data in the same form — more updates, more tweets, etc., to produce the desired feedback of a constituted identity. I will post these notes, and sit back awaiting confirmation of my reality.




Beyond Avant-Garde


It’s not so strange to compare avant-garde artists to social media users. They both produce a lot of content that few people bother to look at. And some commentators might be inclined to regard early adopters of social-media apps as avant-garde consumers, seizing on new possibilities for gratification, evasion, and status distinction. (Be like André Breton and install Secret on your iPhone 5S!)

In “The Weak Universalism” Boris Groys offers a somewhat counterintuitive definition of “avant-garde” — one that is the opposite of “making it new.” Since novelty is the status quo of consumer culture, Groys claims, the avant-garde seeks to advance beyond it; it must challenge and change the disposition of perpetual change.

This is a bit self-defeating. Groys argues that avant-garde artists, to evade this, aspire to make work that is “weak,” in the sense of not being contingent or timely. Instead avant-garde artists try to reduce art to its transcendent, essential core. Because avant-garde work, in Groys’s view, is committed to timeless purity, it can garner none of the popularity of mass art, which is rooted in the novel. And it is seen as undemocratic, even though it tries to bring art back to first principles. Shouldn’t art that can be dismissed as something a child could make be regarded as ultra-democratic? Groys writes:

Avant-garde art today remains unpopular by default, even when exhibited in major museums … the avant-garde is rejected—or, rather, overlooked—by wider, democratic audiences precisely for being a democratic art; the avant-garde is not popular because it is democratic. And if the avant-garde were popular, it would be non-democratic.

Groys wants to argue that what is popular is actually elitist, since not all things achieve the same degree of fame. And since avant-garde art is unpopular, it allows ordinary people to see themselves as artists as well, making their own basic, unpopular work.

Indeed, the avant-garde opens a way for an average person to understand himself or herself as an artist—to enter the field of art as a producer of weak, poor, only partially visible images. But an average person is by definition not popular—only stars, celebrities, and exceptional and famous personalities can be popular. Popular art is made for a population consisting of spectators. Avant-garde art is made for a population consisting of artists.

Mass taste is then secretly elitist taste, because you are rooting against the underdogs by liking it. And there is nothing democratic, either, in broadly shared taste for something that is popular. Fascist leaders are popular too.

I am interested in this as it relates to vicarious participation (as opposed to “genuine” participatory art) and the current vogue for virality. It may be that we ordinary “nonartists” are not envious of the avant-garde but trapped within it, and we look to vicarious participation in popular art and virality to escape this curse.

If Groys is right, avant-garde art is at once universal (reduced to its elemental gestures) and only relevant to small local communities; if it were popular, it would become part of a historical zeitgeist and be doomed to datedness. This seems to me analogous to a certain fantasy about the purity of local music scenes and making DIY bedroom/garage music as opposed to the supposed fleeting insubstantiality of enjoying hyped superstars and corporate pop. Think of a million amateur bands playing the same elemental garage music in a million basements, and that is Groys’s avant-garde. Such music is not in any way original and doesn’t aspire to be; it instead reiterates the timeless gesture of wanting to make music.

Insular collectives of artists or writers or even just friends on social media provide another analogue. They all consume each other’s work as peers and have narrow enough horizons to ignore the ways in which what they are all doing might be considered derivative. Artists and audiences are one and the same in such circles; making and consuming are simultaneous, and hierarchies among participants are suppressed.

But at the same time, those fabled anonymous garage bands are inspired not merely by the impulse to make music but by the vicarious desire to become like the popular musicians they admire. Amateur garage bands wanted to be like the Beatles or, later, like the Ramones. They wanted vicarious participation in the notoriety of their idols. In their emulation of “popular art” they remain spectators, despite the way in which they contribute to, in Groys’s sense, rendering that art “avant-garde” — they clumsily make it simple, generic, crudely timeless through inept imitation.

So it may be that popular, zeitgeisty mass art is necessary as a sort of timeless inspiration for the trickled-down creative impulse that yields basic, transcendent gestures of art making. If it didn’t exist, we would have to collectively create it through a spontaneous coordination of attention to make what someone is doing appear to be the model for garnering social recognition, to make emulating it worth attempting. The spectator may need an impetus to become an artist capable of consuming/creating avant-garde art as Groys defines it. Groys suggests that “participatory practice” — starting your own garage band, or your own mosh pit, at least — “means that one can become a spectator only when one has already become an artist.”

But one might go further and say we are born artists and find making art boring, childish, regressive, pointless, and we long for exposure to the kind of work that will turn us into spectators. This in turn would make it worthwhile to use our inborn art ability. If we’re always already artists, then what vicariousness and virality offer is a chance to transcend that for something bigger — participation not in the banal routines of self-expression but in something genuinely larger than ourselves, historical.

Social media, as Groys suggests, makes users “avant-garde” to the degree that they try to use it to be “creative” in the sense of expressing themselves in the most generic of ways. Groys argues:

This repetitive and at the same time futile gesture [of making reductive avant-garde art] opens a space that seems to me to be one of the most mysterious spaces of our contemporary democracy—social networks like Facebook, MySpace, YouTube, Second Life, and Twitter, which offer global populations the opportunity to post their photos, videos, and texts in a way that cannot be distinguished from any other conceptualist or post-conceptualist artwork. In a sense, then, this is a space that was initially opened by the radical, neo-avant-garde, conceptual art of the 1960–1970s. Without the artistic reductions effectuated by these artists, the emergence of the aesthetics of these social networks would be impossible, and they could not be opened to a mass democratic public to the same degree.

That seems preposterous. I’ve been willing to argue that self-construction on social media is a kind of democratized performance art, but to claim that Facebook users couldn’t do what they do if it weren’t for actual late 20th century conceptual artists seems absurd. Only some infinitesimal percentage of social-media users would have had any exposure to such work, and even this small group may not have found the encounter particularly emboldening. And Facebook’s engineers weren’t exactly sitting down with Lucy Lippard’s Six Years before coding new features for the site. The aesthetics of Facebook, Instagram, etc., have much more to do with what is built into the interfaces to generate circulation and interaction; these enticements don’t seem to be necessarily influenced by conceptual art. If anything, it’s more that both conceptual art and interface design owe something to cybernetics, network analysis, and postwar computer science.

What I think Groys is talking about is the democratization of the expressive gesture that social media affords in its generic, preformatted fashion. It’s similar to Julian Stallabrass’s list of the routine subjects of amateur photography, circa 1996: “Landscapes, holiday destinations, loved ones and pets, fragments of the urban scene and natural wonders all come to participate in a continuum in which all objects are known and (when things go well) all respond kindly to the photographer’s subjectivity.” The social media user is like the sun at the center of this universe of benevolent, displayable things. Like an avant-garde reductivist, the social-media sharer breaks down photography to its basic impulse of archiving and possessing. Every instance of social-media sharing is thus a potential repetition of Groys’s avant-garde “weak gesture.” It’s a reiteration of Malevich’s black square — “an even more radical reduction of the image to a pure relationship between image and frame,” Groys explains, “between contemplated object and field of contemplation, between one and zero … We cannot escape the black square—whatever image we see is simultaneously the black square.”

Groys offers the dubious analysis that on social media too many people are sharing too many things and no one could possibly consume it all. (False: That’s what algorithms are for!) This makes it unseen or even unseeable art, akin to Warhol’s Empire. The point isn’t to watch it; the point is simply that it exists as a limit. It is, in a sense, antiviral. It sits there, inert.

But the content of sharing on social media is not always “weak” in Groys’s sense — it is not all timeless mundanity (photos of domesticity, pictures of meals, etc.). Much of it is an attempt to seem timely, to mark one’s participation in successive waves of hype. It’s writing about very specific things, like, say, House of Cards (to the intense irritation of those who aren’t watching). Often social-media sharing is an attempt to belong to one’s time and show how one is willing to change with it, not transcend it or retreat from it into a narcissistic bubble. We may know that everything that becomes popular is just a trend — that it is only “ready to disappear,” as Groys says — but that can make it more, not less, urgent and exciting to participate in it.

This is what pursuing virality as a feeling is about. Groys is right that on social media “the facticity of seeing and reading” a particular piece of content “becomes irrelevant,” but that is because its circulation and metastasis are being so carefully tracked. Virality is an aesthetics for ubiquitous surveillance. It takes being seen for granted and moves beyond that to momentum, circulation. In some ways, the concept of “curation” is too static for this era. One wants to put something out there that develops momentum, that has an unpredictable life span, that offers a vicarious gateway to the unbounded vitality of collective culture, which our solitary interfaces and devices tend to curtail phenomenologically. We try to get stuff (whatever stuff, it doesn’t matter) to go viral in order to participate in that shared social enthusiasm that surges and dissipates.

The popular, then, is akin to the pre-individual, in Simondon’s sense — a cultural matrix out of which our individuality emerges, its precondition. The avant-garde is the denial of that origin, embracing the mundane inevitability of individuation as some unique personal triumph.

Groys argues that once upon a time we were “expected to compete for public attention.” That seems backward. Once we took for granted a certain recognition of our place and worked to transcend that, to dissolve into something anonymous, urban, genuinely “mass” — the level at which dreams of cosmopolitanism and universal legibility are conceivable. People wanted to escape local attention and vicariously enjoy fame — indulge in the fantasy of ubiquity without surrendering identity the way genuinely famous people must, usually to their psychic destruction.

We want vicarious participation in the popular because it feels less lonely than reclaiming one’s inherent potentiality as a solitary, transcendent avant-garde artist. If everyone can be an artist, no one needs to be congratulated or recognized for being one. Instead, one needs to be recognized for the rarer skill of appreciation, of being able to sympathize with others and unite with them in feeling. Eternity is very lonely.