Marginal Utility
By Rob Horning
A blog about consumerism, technology and ideology.

Collector’s Item


I am in the process of moving, which entails packing up my record collection, and confronting some awkward questions about why I even have one. The collection is not about the music: I don’t own a single record that I don’t also have in digital form on an array of hard drives and triple-redundant backups. And though I am as prone as anyone to fetishize the “warm” sound of real vinyl, I’m also self-aware enough to be skeptical of my own ears. Plus, down that road lie things like obsessive fretting about which plants the records were manufactured at and the need to get “hot pressings” to hear how the recording should “really” sound. For me, MP3s are basically fine.

Beyond that, the collection’s bulk makes it incredibly inconvenient, though therein may lie its actual appeal. The inconvenience enchants the act of listening, enchants my labor in assembling the collection. Inconvenience triggers nostalgia, particularly since “progress” tends to be understood in terms of efficiency. The cumbersome nature of putting on a record and then flipping it over conjures all sorts of other lost experiences — dialing rotary phones, looking things up in books, etc. Listening then becomes a journey into a romanticized, half-remembered past from which tedium, frustration, and disappointment have been edited out. And if the record skips, I can always play the song on my phone.

But is that nostalgia enough to justify all the moving boxes? All the packing and unpacking? The collection has become a physical manifestation of sunk costs; it makes me feel like I have come too far to stop now.

Walter Benjamin’s “Unpacking My Library” is sort of the canonical account of collector-hood, but I am put off a bit by his talk of “real libraries” and being a “genuine collector.” Much of what he says about collecting books is echoed by Jean Baudrillard in The System of Objects, only Baudrillard makes it all pejorative. Collectors “invariably have something impoverished and inhuman about them,” he writes. They “never … get beyond a certain poverty and infantilism.” (Benjamin, by contrast, rhapsodizes that “to a true collector the acquisition of an old book is its rebirth. This is the childlike element which in a collector mingles with the element of old age.”) Benjamin celebrates the “harmonious whole” of a collection; Baudrillard sees this harmony as pathological escapism.

No matter how open a collection is, it will always harbor an irreducible element of non-relationship to the world. Because he feels alienated and abolished by a social discourse whose rules escape him, the collector strives to reconstitute a discourse that is transparent to him, a discourse whose signifiers he controls and whose referent par excellence is himself.

If this holds for gestures of digital appropriation, it may shed light on Pinterest and Tumblr usage. Although it may seem a bit counterintuitive to view social media this way, posting images and linking them to your profile can be seen as an effort to transcend social judgment, flying in the face of the metrics that want to make it inescapable. On your own Tumblr, you get to be a taste tyrant; each new post supports the fantasy that you can dictate the rules of style for yourself by fiat, beyond the encroachment of cultural-capital anxieties. The mere process of adding another image (rather than the judicious choice of some specific content) can be the means by which you push aside the fear that your choices may be governed by a social logic beyond your control. Any specific item, evaluated on its own, immediately calls forth social standards of evaluation, but the process of accumulation itself is beyond judgment — particularly under capitalism, where only accumulation can be for its own sake.

The metrics, from this point of view, are there to prompt you to try to negate them with additional content. Alternatively, one could treat low numbers as proof that one has successfully checked out of the numbers game, regarding objective nonpopularity as a sign of one’s indifference to it.

For Baudrillard, collecting is a way to sustain desire in the face of inevitable death, a way to escape time. “What man gets from objects is not a guarantee of life after death but the possibility, from the present moment onwards, of continually experiencing the unfolding of his existence in a controlled, cyclical mode, symbolically transcending a real existence the irreversibility of whose progression he is powerless to affect.” Collecting allows collectors to turn lost time into cyclical time, subordinating serial acquisition to spatial ordering. And because the collection signifies the collector, it lets collectors “recite themselves, as it were, outside time.”

Baudrillard claims that “what you really collect is always yourself.” Thus he, like Benjamin, argues that possessing objects stands in opposition to actually using them. Any collection tends toward forbidding actual use: think of the comic-book collector whose items are rated and sealed in plastic, or the record collector who is afraid to ruin the vinyl by playing it. One collects objects to purge them of their usefulness, subordinating that use value to the curatorial logic the collector applies, so that any collected object signifies only the collector.

But once you start signifying yourself with what you collect, you are consigned to always be collecting:

An object no longer specified by its function is defined by the subject, but in the passionate abstractness of possession all objects are equivalent. And just one object no longer suffices: the fulfillment of the project of possession always means a succession or even a complete series of objects. This is why owning absolutely any object is always so satisfying and so disappointing at the same time: a whole series lies behind any single object, and makes it into a source of anxiety.

I don’t like to admit to myself that I collect records to stabilize my sense of my own identity, and I try to resist the seductive idea that my taste is autonomous, that it makes me unique. Despite how real that feels — no one else has this same weird collection of records as me! — I try to counter that tendency, resocialize my understanding of my taste. I want to demystify my own sense of individuality, collect my way out of the impulse to keep collecting. I want to exempt myself from the problem of being authentic, being unique — a losing proposition, self-hypocritizing. I want to defy individuality only because it seems like a nonconformist thing to do.

Boris Groys, in this passage from On the New, suggests that what’s worse than striving for authenticity is regarding yourself as inherently authentic. 

In many respects, contemporary man is a victim of the theory of original difference. He has been poisoned by the suggestion that, in the absence of all effort, he is already unique, different from all other men at a certain extra-cultural, authentic level of life. That is why he feels a certain frustration attendant upon the inevitable realization of his actual, insurmountable cultural banality.

To escape one’s cultural banality, Groys suggests, one must “work professionally in the cultural field.” Collecting things is a way to pretend to that status, especially if one approaches it not as a connoisseur but as a speculator in cool. I find that when I go to record stores, I get caught up in such games of aesthetic arbitrage. When I go record shopping I tend to only look in bargain bins. These are the records that have been deemed uncollectible, beneath serious notice. Will Straw, in “Exhausted Commodities: The Material Culture of Music,” argues that this built-up sediment of unwanted culture demystifies collectibles in general:

In the ways in which they accumulate, and in the fact that they sit there, unsold, these commodities contradict the definition of the commodity as a signifier of social desire.

The bargain-bin records are, in Groys’s terminology, the “profane,” the cultural material that is the opposite of art, the opposite of what is accepted in the official archive of relevant, memorable, interpretable culture. They are socio-cultural refuse.

To me, these records represent a cultural opportunity to buy low, a chance for me to assert myself in a territory revealed by the receding tide of fashion. By finding “good” records among the refuse, I get to assert a taste I know is highly idiosyncratic (In buying these Linda Ronstadt records, I am choosing something the contemporary market has rejected), and I wager on my own social influence (I will redeem these rejected Linda Ronstadt albums, and when they come back in style, I will have been there all along and can imagine I played some small role in revitalizing them.) And even if what I buy never becomes popular again, I can console myself with the proof of my uniqueness. (Until I remember how banal it is.)

Only in the bargain bins can I shop comfortably, knowing that I am not coattail-riding on someone else’s cultural capital, not following someone else’s fashion. Instead I can pretend both that I am exercising my sovereign judgment, indifferent to the whole game of taste, and that I am fully invested in the game, taking a savvy position within it, letting my taste be wholly guided by tactical positioning. When necessary, I can tell myself I have no taste at all — only timely, economically incentivized moves within fashion cycles. I can’t be held responsible for “really” liking anything! I am safely opaque.

Something similar happens, perhaps, in seeking virality in social media. The more apparent it is that something was posted “just for likes,” the less it says about one’s “true self.” It’s just strategic, and everyone knows and accepts it as such. The more self-promotion you do, the less it seems you are talking about yourself. You’re just talking in the dialect of accumulation, reading from a shared script for the entrepreneurial self.

Curatorial gestures are likewise an amalgam of strategy and self-expression, with one perpetually permitting disavowal of the other. As with the bargain-bin records, if something I reblog on Tumblr gets reblogged a lot later, I can feel partly responsible and enjoy that success; if it doesn’t, I can congratulate myself for my distinctive taste. The pleasure I take in these things in themselves? That’s the most malleable component in the system, so it gets adjusted accordingly, to accommodate the other pressures.

For Groys, such salvage missions are the essence of cultural innovation, the hallmark of the artist’s function since the time of Duchamp’s ready-mades. Art, he argues, stems not from the creative unconscious or from the technical ability to represent objective beauty or truth but from redrawing the boundary between art and not-art. It comes from understanding “cultural-economic logic” and fashion cycles, and having the social wherewithal to affect them. Craft is more or less discarded, and art becomes indistinguishable from curation, collecting. Once the ubiquity of reproduction (mechanical and now digital) makes technical skill superfluous, a kind of mystified ornament, the only significant artistic medium is the cultural archive itself, and the ability to shift things in and out of it.

But there is nothing particularly special about being archived anymore. Digitization has made the cultural archive itself seem massive and amorphous, limitless, even as it becomes easier to search and exhume things from. Rediscovery, revalorization, devalorization, forgetting: all of it happens more quickly, and with lower stakes, since we all know everything is being saved in the cloud anyway. My puny record collection stands against that limitless digital archive; it’s my private attempt to raise the stakes again, even if only at the level of personal fantasy. I will be lugging that fantasy down three flights of stairs, unless I am willing to entrust it to the movers.

Still, it is hard to imagine that anything will ultimately be left out of the millions of petabytes of data being collected and stored. We’re frequently reminded that our little contributions are important enough to register in it — every time an algorithm tries to predict something about us, we know we are in there. We are already unique ID numbers in these databases; we are all inadvertently de facto “professionals in the cultural field.” That makes it all pretty banal from the human point of view. But big data sees the eternal value in all our curating and collecting, and it will save us all.


Permanent Recorder

It used to be easy to mock reality TV for having nothing to do with actual reality — the scenarios were contrived and pre-mediated, the performances were semi-scripted, the performers were hyper-self-conscious. These shows were more a negation of reality than a representation of it; part of their appeal seemed to be in how they helped clarify for viewers the genuine “reality” of their own behavior, in contrast with the freak shows they were seeing on the screen. To be real with people, these shows seemed to suggest, just don’t act like you are on television.

But now we are all on television all the time. The once inverted anti-reality of reality TV has turned out to be prefigurative. In a recent essay for the New York Times, Colson Whitehead seizes on the reality TV conceit of a “loser edit” — how a show’s editors pare down and frame the footage of certain participants to make their incipient failure seem deserved — and expands it into a metaphor for our lives under ubiquitous surveillance.

The footage of your loser edit is out there as well, waiting … From all the cameras on all the street corners, entryways and strangers’ cellphones, building the digital dossier of your days. Maybe we can’t clearly make out your face in every shot, but everyone knows it’s you. We know you like to slump. Our entire lives as B-roll, shot and stored away to be recut and reviewed at a moment’s notice when the plot changes: the divorce, the layoff, the lawsuit. Any time the producers decide to raise the stakes.

Whitehead concludes that the important thing is that everyone gets an edit inside their own head, which suggests that the imposition of a reality TV frame on our lives has been clarifying. “If we’re going down, let us at least be a protagonist, have a story line, not be just one of those miserable players in the background. A cameo’s stand-in. The loser edit, with all its savage cuts, is confirmation that you exist.” Reality TV models for us what it is like to be a character in our own life story, and it gives us a new metaphor for how to accomplish this — we don’t need to be a bildungsroman author but instead a savvy cutting-room editor. Accept that your life is footage, and you might even get good at making a winner’s edit for yourself.

You could draw a similar conclusion from Facebook’s Timeline, and the year-in-review videos the company has taken to making of one’s raw profile data. These aren’t intrusive re-scriptings of our experience but instructional videos on how to be a coherent person for algorithms — which, since these algorithms increasingly dictate what others see of you, is more or less how you “really” are in your social networks. Facebook makes the winner’s edit of everybody, because everyone supposedly wins by being on Facebook. Everyone gets to be connected and the center of the universe simultaneously. So why not bequeath to it final-cut rights for your life’s edit?

Tech consultant Alistair Croll, in a post at O’Reilly Radar, is somewhat less complacent about our surrendering our editing rights. He makes the case that since everyone henceforth will be born into consolidated blanket surveillance, they will be nurtured by a symbiotic relationship with their own data timeline. “An agent with true AI will become a sort of alter ego; something that grows and evolves with you … When the machines get intelligent, some of us may not even notice, because they’ll be us and we’ll be them.”

In other words, our cyborg existence will entail our fusion not with some Borg-like hive mind that submerges us into a collective, but with a machine powered by our own personal data that represents itself as already part of ourselves. The algorithms will be learning how to edit our lives for us from the very start, and we may not recognize this editing as stemming from an outside entity. The alien algorithms ease themselves into control over us by working with our uniquely personal data, which will feel inalienable because it is so specifically about us, though the very fact of its collection indicates that it belongs to someone else. Our memories will be recorded by outside entities so thoroughly that we will intuitively accept those entities as a part of us, as an extension of the inside of our heads. Believing that something that is not us could have such a complete archive of our experiences may prove to be too unacceptable, too dissonant, too terrifying.

Croll argues that this kind of data-driven social control, with algorithms dictating the shape and scope of our lives for us, will be “the moral issue of the next decade: nobody should know more about you than you do.” That sounds plausible enough, if you take it to mean (as Croll clearly does) that no one should use against you data that you don’t know has been collected about you. (Molly Knefel discusses a similar concern here, in an essay about how kids will be confronted by their permanent records, which reminds me of the “right to be forgotten” campaign.) But it runs counter to the cyborg idea — it assumes we will be able to draw a clear line between ourselves and the algorithms. If we can’t distinguish between these, it will be nonsensical to worry about which has access to more data about ourselves. It will be impossible to say whether you or the algorithms “knew” some piece of information about you first, particularly when the algorithms will be synthesizing data about us and then teaching it to us.

In that light, the standard that “no one should know more about you than you do” starts to seem clearly absurd. Outside entities are producing knowledge about us all the time in ways we can’t control. Other people are always producing knowledge about me, from their perspective and for their own purposes, that I can never access. They will always know “more about me” than I do by virtue of their having a point of view on the world that I can’t calculate and replicate.

Because we find it hard to assign a point of view to a machine, we perhaps think they can’t know more about us or have a perspective that isn’t fully controllable by someone, if not us. Croll is essentially arguing that we should have control over what knowledge a company’s machines produce about us. That assumes that their programmers can fully control their algorithms, which seems to be less the case the more sophisticated they become — the fact that the algorithms turn out results that no one can explain may be the defining point at which data becomes Big Data, as Mike Pepi explains here. And if the machines are just proxies for the people who program them, Croll’s “moral issue” still boils down to a fantasy of extreme atomization — the demand that my identity be entirely independent of other people, with no contingencies whatsoever.

The ability to impose your own self-concept on others is a matter of power; you can demand it, say, as a matter of customer service. This doesn’t change what those serving you know and think about you, but it allows you to suspend disbelief about it. Algorithms that serve us don’t allow for such suspension of disbelief, because they anticipate what service we might expect and put what they know about us into direct action. Algorithms can’t have opinions about us that they keep to themselves. They can’t help but reveal at all times that they “know more about us” — that is, they know us differently from how we know ourselves.

Rather than worry about controlling who can produce information about us, it may be more important to worry about the conflation of data with self-knowledge. The radical empiricism epitomized by the Quantified Self movement is becoming more and more mainstream as tracking devices that attempt to codify us as data become more prevalent — and threaten to become mandatory for various social benefits like health insurance. Self-tracking suggests that consciousness is a useless guide to knowing the self, generating meaningless opinions about what is happening to the self while interfering with the body’s proper responses to its biofeedback. It’s only so much subjectivity. Consciousness should subordinate itself to the data, be guided more automatically by it.  And you need control of this data to control what you will think of yourself in response to it, and to control the “truth” about yourself.

Reducing self-knowledge to matters of data possession and retention like that seems to be the natural bias of a property-oriented society; since consciousness can’t be represented as a substance that someone can have more or less of, it doesn’t count. But self-knowledge may not be a matter of having the most thorough archive of your deeds and the intentions behind them. It is not a quantity of memories, an amount of data. The self is not a terrain to which you are entitled to own the most detailed map. Self-knowledge is not a matter of reading your own permanent record. It is not an edit of our life’s footage.

A quantified basis for “self-knowledge” is bound up with the incentives for using social media and submitting to increased surveillance of various forms. If we accept that self-knowledge is akin to a permanent record, we will tolerate or even embrace Facebook’s keeping that record for us. Maybe we won’t even mind that we can’t actually delete anything from their servers.

As our would-be permanent recorders, social media sites are central to both data collection (they incite us to supply data as well as help organize what is collected across platforms into a single profile) and the use of data to implement social control (they serve algorithmically derived content and marketing while slotting us into ad hoc niches, and they encircle us in a panoptic space that conditions our behavior with the threat of observation). But for them to maintain their central place, we may have to be convinced to accept the algorithmic control they implement as a deeper form of self-knowledge.

But what if we use social media not for self-knowledge but for self-destruction? What if we use social media to complicate the idea that we could ever “know ourselves”? What if we use social media to make ourselves into something unknowable? Maybe we record the footage of our lives to define therein what the essence of our self isn’t. To the degree that identity is a prison, self-knowledge makes the cell’s walls. But self-knowledge could instead be an awareness of how to move beyond those walls.

Not everyone has the opportunity to cast identity aside any more than they have the ability to unilaterally assert self-knowledge as a form of control. We fall into the trap of trying to assert some sort of objectively “better” or more “accurate” identity that reflects our “true self,” which is only so much more data that can be used to control us and remold the identity that is assigned to us socially. The most luxurious and privileged condition may be one in which you get to experience yourself as endlessly surprising — a condition in which you hardly know yourself at all but have complete confidence that others know and respect you as they should.

Authentic Sharing


“Sharing economy,” of course, is a gratingly inappropriate term to describe a business approach that entails precisely the opposite, that renders the social field an arena for microentrepreneurship and nothing else. Yet the vestiges of “sharing” rhetoric cling to such companies as Airbnb and a host of smaller startups that purport to build “trust” and “community” among strangers by getting them to be more efficient and render effective customer service to one another. What more could you ask of a friend?

By bringing a commercial ethos to bear on exchanges that were once outside the market, the civilizing process that is often attributed to the “bourgeois virtues” of capitalism — with successful economic exchange building the only form of social trust necessary — gets to spread itself over all possible human relationships. The only real community is a marketplace in which everyone has a fair shot to compete.

The freedom of anonymous commercial exchange amid a “community” of well-connected but essentially atomized strangers well-disciplined by the market to behave conventionally and sycophantically is not the sort of community the sharing companies tend to crow about in their advertising. The rhetoric of the sharing economy’s trade group, Peers, is instead saturated with testimonials of communal uplift and ethical invigoration. In an essay about the cult-like methods of sharing-economy indoctrination, Mike Bulajewski cites many, many examples of the companies’ blather about community and the ornamental techniques they encourage among users to sustain the illusion. (Fist-bump your driver! Neato!) He notes that “what’s crucial to realize is that proponents of ‘sharing’ are reinventing our understanding of economic relations between individuals so that they no longer imply individualism, greed or self-interest” — i.e., the bourgeois virtues, which make for atomized “metropolitan” people whose freedom (such as it is) is protected in the form of anonymity and equal treatment in the marketplace. “Instead,” Bulajewski writes, “we’re led to believe that commerce conducted on their platforms is ultimately about generosity, helpfulness, community-building, and love.”

Is this rhetoric fooling anyone? Marketing professors Giana M. Eckhardt and Fleura Bardhi suggest that it is bad for their business. In an article for the Harvard Business Review they recount their research that found that consumers don’t care about “building community” through using services like Airbnb and Lyft; they actually just want cheaper services and less hassle. They want consumerist “freedom,” not ethical entanglements. The platforms are popular because they actually diminish social interaction while letting users take advantage of small-time service providers who are often in precarious conditions and have little bargaining leverage. You “trust” the sharing-platform brand while you exploit the random person offering a ride or an apartment (or whatever) without having to negotiate with them face to face.

When “sharing” is market-mediated — when a company is an intermediary between consumers who don’t know each other — it is no longer sharing at all. Rather, consumers are paying to access someone else’s goods or services for a particular period of time. It is an economic exchange, and consumers are after utilitarian, rather than social, value.

That seems almost self-evident. The sharing-economy companies are not a way to temper capitalism (and its tendency to generate selfish individualists); they just allow it to function more expediently. The sharing economy degrades “social value,” defined here as the interpersonal interactions that aren’t governed by market incentives and economistic rationality, in favor of expanding the “utilitarian value” of consumption efficiency, more stuff consumed by more individuals (generating more profit). Utilitarian value is impeded by the need to deal with other humans, who can be unpredictable or have irrational demands.

Eckhardt and Bardhi propose “access economy” as an alternative term to sharing economy. One might presume “access” refers to the way consumers can pay brokering companies for access to new pools of labor and rental opportunities. Think “shakedown economy” or “bribe economy.” Middlemen like Uber who (like an organized-crime racket) achieve scale and can aggressively bypass the law can put themselves in a prime position to collect tolls from people seeking necessary services and the workers who hope to provide them.

But Eckhardt and Bardhi want to use the term to differentiate renting from owning. People are content to buy access to goods rather than to acquire them as property. Viewing the sharing economy from that angle, though, you can almost see why some are beguiled by its communitarian rhetoric. The sharing economy’s labor practices are abhorrent, but we might overlook all that if we think instead of how it liberates us from being overinvested in the meaning of our stuff. Leaving behind consumerist identity presumably could open the space for identity based in “community” (though it would be more accurate to say an identity based on caste, and what services you render).

Renting is very bad for marketers (it’s not “best practices,” the marketing professors note), because people don’t invest any of their identity into brands they merely rent. They don’t commit to them, don’t risk their self-concept on them. So what marketers want consumers to want is ownership, which puts their identity in play in a more high-stakes way and gives advertisers something to sink their teeth into. Whether or not consumers actually want to own so many things is a different question. Marketers must insist that they know what consumers want (that’s their rationale for their job); the benefits consumers supposedly reap according to marketers are actually just the ideological tenets of marketing.

This helps bring into focus what a true sharing economy — one that discouraged ownership while imposing reciprocal human interaction — might accomplish. Marketers approve of “brand communities” that let isolated people “share identity building practices with like-minded others,” but little else. That is, in such communities they can “share” without sharing. They can “share” by buying products for themselves.

But with more widely distributed rental opportunities, identity anchored in what one owns can potentially be disrupted. As Eckhardt and Bardhi write:

When consumers are able to access a wide variety of brands at any given moment, like driving a BMW one day and a Toyota Prius the next day, they don’t necessarily feel that one brand is more “them” than another, and they do not connect to the brands in the same closely-binding, identity building fashion. They would rather sample a variety of identities which they can discard when they want.

If not for the burden of ownership, then, consumers would conceivably try on and discard the identities implied by products without much thought or sense of risk. They would forgo the “brand community” for a more fluid sense of identity. Perhaps they would anchor their identity in something other than products while enjoying the chance to play around with personae, by borrowing and not owning the signifying resonances of products.

Perhaps that alternate anchor for the self could be precisely that sort of “social value”: human interaction that exceeds the predictable, programmable exchanges dictated by the market and its rational, predictable incentives. This is the sort of interaction that people call “authentic.” (Or we could do away with anchors for the self altogether and go postauthentic — have identity only in the process of “discarding” it.)

Companies like Lyft and Airbnb do nothing to facilitate that sort of interaction; indeed they thrive by doing the opposite. (Authenticity marketing, incidentally, does the same thing; it precludes the possibility of authenticity by co-opting it.) They subsume more types of interaction and exchange to market structures, which they then mask by handling all the money for the parties involved. This affords users the chance to pretend to themselves that the exchange has stemmed from some “meaningful” rather than debased and inauthentic commercial connection, all while keeping a safe distance from the other party.

Sharing companies use their advertising to build a sort of anti-brand-community brand community. Both sharing companies and brand communities mediate social relations and make them seem less risky. Actual community is full of friction and unresolvable competing agendas; sharing apps’ main function is to eradicate friction and render all parties’ agendas uniform: let’s make a deal. They are popular because they do what brand communities do: they allow people to extract value from strangers without the hassle of having to deal with them as more than amiable robots.

When sharing companies celebrate the idea of community, they mean brand community. And if they appropriate rhetoric about breaking down the attachment to owning goods as a means of signifying identity and inclusion, it’s certainly not because they care about abolishing personal property, or pride in it. It’s because they are trying to sell their brand as an alternative to the bother of actually having to come up with a real alternative to product-based personal identity. They just let us substitute apps and platforms in for the role material goods played. They cater to the same customer desire of being able to access “community” as a consumer good.

The perhaps ineluctable problem is that belonging to communities is hard. It is inefficient. It does not scale. It doesn’t respond predictably to incentives. It takes more work the more you feel you belong. It requires material sacrifice and compromise. It requires a faith in other people that exceeds their commercial reliability. It entails caring about people for no reason, with no promise of gain. In short, being a part of community is a total hassle but totally mandatory (like aging and dying), so that makes us susceptible to deceptive promises that claim to make it easy or avoidable, that claim to uniquely exempt us. That is the ruse of the “sharing economy” — the illusion it creates that everyone is willing to share with you, and all you have to do is download an app.

Meanwhile, the sharing economy’s vision of everyone entrepreneurializing every aspect of their lives promotes an identity grounded in the work one can manage to win for oneself, in the scheming and self-promoting posture of someone always begging for a job. If its vision of the economy comes true, no one would have the luxury to do little sharing-economy tasks on the side but would instead have to do them to survive. And there would be no safety net, because there would be no political solidarity to generate it, and many of its functions would have been offloaded to sharing-economy platforms. The result would be less a community of equals exchanging favors than a Hobbesian war of all against all, with the sharing-company Leviathans furnishing the battlefield and washing their hands of the casualties.

A Man Alone


Rod McKuen died a few days ago. Because I have spent a lot of time in thrift stores, I feel like I know him well, since that’s where lots of his poetry books (Listen to the Warm, Lonesome Cities, etc.) have ended up, alongside the works of kindred spirits Walter and Margaret Keane. His albums, on which he sometimes sings but generally just recites his poetry over light-orchestral music, can be found there too. I like “The Flower People”: “I like people with flowers. Because they are trying.”

Artists like McKuen and the Keanes, who achieved unprecedented levels of success with the mass-market audience in the 1960s while being derided by critics for peddling “sentimental” maudlin kitsch, fascinate me — probably a hangover from graduate school, when I spent a lot of time studying the 18th-century vogue for “sensibility” novels, which were similarly saturated with ostentatious tears. McKuen has a lot in common with the 18th-century “man of feeling” epitomized by the narrator of Sterne’s A Sentimental Journey, who travels around seeing suffering and “having feelings,” which prove his humanity and allow readers to experience their own humanity vicariously. McKuen let his audience accomplish something similar with his tales of urban love and loneliness and his wistful recollections of weather and whatnot.

Still, I wonder why the market for McKuen and the Keanes re-emerged just then, in the 1960s. What made reified desolation a sudden hot commodity? Did it have to do with changes in available media, or the general air of postwar prosperity? And what’s the relation between their success and their reputation? Why is the kind of critical contempt they received reserved for artists who commercialize sadness and feelings of loneliness and vulnerability? What did audiences want from their work, such that critics could seize upon it to mock it and make themselves and their readers feel superior to it all?

The New York Times obit of McKuen concludes with a quote from him in which he claims that success turned critical opinion against him:

“I only know this,” Mr. McKuen told The Chronicle in 2002. “Before the books were successful, whether it was Newsweek or Time or The Saturday Evening Post, the reviews were always raves.”

I wonder if McKuen thought that only his popularity kept him from earning the respect that, say, Leonard Cohen or Jacques Brel (whose songs McKuen translated into English-language hits) or maybe even Wordsworth and Whitman tend to get. But such counterfactuals seem beside the point, not only because critical opinion is fickle and ever-changing but because it is impossible to separate the “quality” of a work from the conditions surrounding its reception.

Participating in the phenomenon of McKuen’s popularity (or conspicuously refusing) became essentially what his work was about, beyond the melancholy remembrances about lost lovers and cities at dusk. You were either on board and willing to conform, willing to let McKuen be the way you defused potent and inescapable fears about decay, sadness, anonymity, and fading love along with millions of others and thereby mastered those feelings, put them in a safe place to be admired, or you were not on board, unwilling to conform, unwilling to admit those feelings could be collectively tamed but instead must be personal demons you never stop fighting alone, far more alone than any McKuen poem could ever testify to.

One might be tempted to champion McKuen as a populist who rendered the ordinary person’s feelings and aspirations in easily digested metaphors while the culture snobs sneered. But if you listen to a lot of his music or read through his books, you might end up with the sense that his point of view has more to do with snobs than ordinary people: He seems to travel a lot from one glamorous seaside city to another, indulging in late-night bouts of boozy nostalgic melancholy with little fear of economic want, issuing patronizing advice about how to feel to readers or listeners or discarded lovers or total strangers, luxuriating in emotions as if they were badges of privilege rather than the afflictions he would otherwise have you believe. (Listen to “Earthquake,” for instance.) McKuen sounds like a humble-bragger whose medium is misery; his sadness makes him more important and individuated than less sensitive or self-regarding souls.

I wonder if, when McKuen was popular, critics felt threatened not by his work’s “sentimentality” but by its familiarity, which they then labeled “vulgarity” to try to expunge it from their own sensibility. I know that is how I feel when I listen to his music. It sounds smug to me because I’ve felt those smug feelings and romanticized them privately (lacking the courage or the chutzpah to try to cash in on them). I can’t hear his poems as straightforwardly earnest, like perhaps the millions of people who bought in could. I implicate myself in these works instead, in every self-satisfied line of self-deprecation and self-pity. I recognize someone who wants to feel different from everyone else but still wants them all to feel sorry for him.

McKuen went more or less underground of his own volition in the early 1980s, which perhaps could be seen as a kind of admission of guilt. His obituaries describe him in his reclusion as severely depressed, holed up in his California home with half a million records and CDs. It’s an emblematic tableau that stands as a warning. You can retreat to the mountain with your carefully curated collection of records that prompt you to have all those important feelings that you can’t bear to experience through or with other people, but that’s not going to let you understand what all those people felt when they bought a Rod McKuen record in the 1960s and maybe even played it once or twice.

Simple and Plain


Today is Elvis Presley’s birthday. He would have been 80. Most people accept that he died in 1977, at the age of 42, which means I am older now than he ever was, a fact I have a hard time wrapping my head around.

I’m currently reading Careless Love, the second volume of Peter Guralnick’s biography of Elvis, and it is bringing me down. It’s about how fame was a collective punishment we administered to Elvis, which he would not survive. Fame allowed him to coast along when he should have been stretching himself; like a gifted child praised too much too soon, he became incapable of coping with challenges. Fame allowed his manager, Colonel Parker, to construe Elvis’s talent as a cash machine. Parker encouraged in Elvis a zero-sum attitude toward his art, so that he demanded as much money as he could get for output as superficial as they could make it, as if the shallowness implied savings, a better bargain from the forces who commercialized him. Fame transformed Elvis into a kind of CEO who inhabited his own body as if it were a factory, a capital stock, on which an enormous and ever-mutating staff relied for their livelihood. As a consequence, fame isolated him completely. His friends, no matter how much they loved and respected him, remained a paid entourage whom he could never completely believe actually loved him for real. “He constructed a shell to hide his aloneness, and it hardened on his back,” Guralnick writes in the introduction. “I know of no sadder story.”

I first got into Elvis after stopping at Graceland, his home in Memphis, during my first road trip across the U.S., in 1990. I knew very little about him, just what you sort of absorbed by osmosis from the culture. Elvis impersonators were probably more salient than Elvis himself at that point. My grandmother, I remember, had some of his later records: Moody Blue; Aloha From Hawaii via Satellite. I wanted to stop at Graceland because I thought it would be campy fun; I wanted to re-create the scene in Spinal Tap when they experience “too much fucking perspective” at Elvis’s graveside.

But Graceland was surprisingly somber, straddling the line between pathos and bathos, never letting me take comfort in either territory. It didn’t seem right to laugh when confronted with the meagerness of the vision of someone who could have had anything but chose Naugahyde, thick shag rugs, and rooms equipped with dueling TV sets. And it was genuinely humbling to recognize the desperation in it all, the dawning sense that Elvis had nowhere to turn for fulfillment and had none of the excuses we have (lack of time and resources, lack of talent) to avoid confronting inescapable dissatisfaction head on.

In one of the stores in the plaza of gift shops across the street from Graceland, I bought a TCB baseball hat and a cassette of Elvis’s first RCA album, the one whose design the Clash mimicked for London Calling.


Every time it was my turn to drive, I put the tape on; listening to “Blue Moon” while driving through the vacuous darkness of Oklahoma was the first time I took Elvis seriously as a performer, the first time I heard something other than my received ideas about him. Then, like a lot of music snobs, I got into the Sun Sessions and the other 1950s stuff and declared the rest of his career irrelevant, without really knowing anything about it. In recent years, I have overcorrected for that and listened mainly to “fat Elvis” — the music he made after the 1968 Comeback. I’m amazed by moments like this, a 1970 performance of “Make the World Go Away.” Wearing a ludicrous white high-collar jumpsuit with a mauve crypto-karate belt around his waist, he mumbles a bit, tells a lame joke about Roy Acuff that nobody gets, saunters over to the side of the stage to drink a glass of water while the band starts the saccharine melody, then out of nowhere hits you with the first lines, his voice blasting out, drawing from a reserve of power that quickly dissipates. Then he skulks around the stage, visibly antsy, as if trying to evade the obvious relevance of the song’s lyrics to his sad, overburdened life.

I never paid any attention to 1960s Elvis, but now, reading through Guralnick’s dreary, repetitive accounts of Elvis’s month-to-month life in the 1960s, when he flew back and forth mainly between Memphis, Los Angeles, and Las Vegas to accommodate a relentless film-production schedule — he made 27 movies from 1960 to 1969 — I feel an urgent desire to somehow redeem this lost era of his career, to study it and find the obscured genius in it, to rescue it through some clever and counterintuitive readings of his films or the dubious songs he recorded for them. I just don’t want to believe that Elvis wasted the decade; I don’t want to accept that talent can indeed be squandered; I want to believe it finds perverse ways to express itself even in the grimmest of circumstances. But this was an era when he was cutting material like “No Room to Rhumba in a Sports Car” (Fun in Acapulco), “Yoga Is as Yoga Does” (Easy Come, Easy Go), “Do the Clam” (Girl Happy), “Queenie Wahine’s Papaya” (Paradise, Hawaiian Style), and “Song of the Shrimp” (Girls! Girls! Girls!). I’m not sure it’s all that helpful to pursue a subversive reading of Clambake. What there is to see in Elvis’s movies is doggedly on the surface; as Guralnick makes clear, these films were made by design to defy the possibility of finding depth in them.

At best, a case can be made for appreciating Elvis’s sheer professionalism in this era, his refusal to sneer publicly at material far beneath him. Sure, he was on loads of pills, and the epic-scale malignant narcissism of his offscreen behavior was establishing the template for all the coddled superstars to come. But he wasn’t a phony. If he was cynical, it was a hypercynicism that consisted of an unflaggingly dedicated passion for going through the motions. Guralnick describes Elvis in some of these films as being little more than movable scenery, a cardboard cutout, but he is a committed cardboard cutout. A bright empty shell with a desultory name and job description (usually race-car driver) attached, Elvis wanders through an endless series of unconvincing backdrops, reflecting back to us the cannibalizing effects of fame, inviting us to try to eat the wrapper of the candy we already consumed.