Adapted version of a talk given at the University of Southern Maine, November 16, 2012. Image by imp kerr. What does it mean to be "microfamous"? Is that term even worth using? Against the notion that the internet delivers on Andy Warhol's cliché of everyone being famous for 15 minutes, or that microfame is a matter of overeager, heedless self-branding teenagers, I argue that "microfame" is a structure of feeling for coping with mandatory requirements to construct identity online.
THE “BALD BRITNEY” CONUNDRUM
On February 16, 2007, having just left a rehab center in Antigua, Britney Spears went to the San Fernando Valley in southern California and gave herself a well-publicized haircut. According to the New York Post, Spears “drove around aimlessly for about half an hour, and then pulled into Esther’s Hair Salon in Tarzana, Calif., at about 8:30 p.m. The Grammy-winning performer sat in her car for about 10 minutes, crying, before jumping out — still bleary-eyed from the tears — and heading into the cut-rate hair salon.” There, she shaved her head while the salon’s hairdresser stood by watching.
In the gossip press at the time, Britney's bald head was regarded as a cryptic yet unmistakable cry for help. According to Us Weekly, when a paparazzo asked her why she was shaving her head, she replied, "Because of you." At the tattoo parlor she went to next, Spears explained herself by saying, "I don't want anyone touching me. I'm tired of everybody touching me."
But if her problem was too much unwanted attention, then why do something that is sure to attract even more attention? However she may have understood her head-shaving privately — as an attempt, maybe, to reassert control over how she is objectified in the media — it played publicly as further proof of her having lost control.
Whenever a celebrity flips out from overexposure, it’s generally served up for schadenfreude. You could pick any number of incidents, but this one seems especially representative to me. The reassuring lesson we are supposed to enjoy here is that too much fame will cause a person to disintegrate, and Britney’s meltdown somehow dignifies our relative obscurity. Unlike Britney, of course, we don’t need constant attention; we don’t depend on the media to reflect back to us the meaning of what we do. With our concern for her mental health as an alibi, we can enjoy the spectacle of her losing her mind from too much social recognition — recognition being in somewhat short supply for the rest of us. While Spears is forced to perpetually produce and consume her own notoriety to have any chance at understanding herself, we can achieve uncomplicated self-knowledge and preserve our true selves in private. She’s a freak; we can live comfortably in the shadows.
Moreover, Spears serves as a sexual scapegoat. It’s typically women who are hounded into this sort of public meltdown, as a punishment for the apparent voodoo of their excessive attractiveness. Such sensationalized breakdowns serve as warnings to all women about the risks they run if they abuse their presumed powers of bewitchment. Britney’s head shaving can be assimilated to the ongoing cultural morality play that constantly reminds us how lasciviousness is always the fault of the object that provokes it, not the person who experiences it.
Britney’s conundrum, however, is less an exception than an extreme depiction of what social media and surveillance bring everyone. And our response to this has been equally contradictory. Though the pressure falls unevenly on people according to gender and other factors, we are all now under similar forms of surveillance, which technological innovation seems geared toward intensifying.
In a 2009 paper, "Unwilling Avatars: Idealism and Discrimination in Cyberspace," law professor Mary Anne Franks cites example after example of people who have been interpolated onto the Internet to be defamed, threatened, sexualized, and harassed by often anonymous persecutors. Debunking the weirdly persistent idea that the Internet offers a paradise where one can leave behind offline stigmas, Franks describes a "world populated by an increasing number of unwilling avatars, reduced to their physical characteristics, caricaturized, ventriloquized and under attack." Online, any of us are vulnerable to this sort of aggression, to having what we've shared in networks wrenched out of context and used against us to fatefully define our identity.
The reality of unwilling avatars is especially clear in the way women’s visibility in particular is policed online — from the existence of Reddit creepshot forums and revenge porn, to the abusive comment threads on posts written by women degrading their work on the basis of their looks or dismissing their topics as evidence of female narcissism, to the slut shaming and relentless harassment of girls like Amanda Todd, a Canadian teenager who recently committed suicide after posting a grim YouTube video recounting her persecution.
No one would call bald Britney an unwilling avatar, but much of the scrutiny and ridicule she faces derives from the same routinized sexism. Shaming her as an out-of-control celebrity deranged by fame constructs the illusion that sexism is fomented by inappropriate attention seeking and can be cured by more palliative doses of pre-emptive modesty. We are exempt from persecution as long as we eschew the spotlight. Policing female visibility generally serves the same function for men, reassuring them that they are still afforded the unquestioned privilege of being judged on terms that they control. Men can pretend that they are not doomed to ever become bald Britneys themselves.
Given the social-media environment, however, this is increasingly untenable, even as a canard. There are as many spotlights as there are people online, which has leveled the playing field somewhat in terms of unwanted scrutiny and made the rise in surveillance increasingly ambivalent. It's not just power watching you, forcing you to control yourself; surveillance can also create retaliatory forms of resistance. (Think doxxing, etc.) While visibility clearly reinforces existing inequities — reinforcing normative sexism, for instance — it can also plausibly strip privilege from those accustomed to being entitled to it.
FORMS OF SURVEILLANCE
No one will dispute that technology in recent years has changed the nature and scope of surveillance, and our lived experience of surveillance has adapted rapidly but unevenly. Social media has completely changed the stakes and the perpetrators of surveillance.
With the development of mobile technology and social media, there has been an evolution in forms of surveillance. Before, we had the classic panopticon, in which many are observed by the few. If you invert this, you have sousveillance, the few observed by the many. When the Internet was lauded as a tool for transparency, sousveillance was posited as a weapon against power, à la WikiLeaks. But sousveillance is also a model of traditional fame — the masses gossiping about a handful of stars, while the rigid distinction between the two categories is maintained. We consume publicity but are not subject to it.
Full-fledged ubiquity of social media is not panoptic, as it’s sometimes mislabeled. Instead it brings about lateral surveillance or “participatory surveillance,” the many observing the many. It enacts a sort of “horizontal control,” inducing us to spy on one another to regulate one another’s behavior and generate marketing data. As law professor Eben Moglen declared, “every time you tag anything or respond to anything or link to anything, you’re informing on your friends.”
The ideological enthusiasm for "participation" disguises the emptying out of privacy, and the inescapable scrutiny and social documentation usher in "self-surveillance" — a grimmer way of describing online self-fashioning or identity construction. In using social media, we become fatally aware of how we can sell ourselves and thus intensify self-marketing practices. We put ourselves forward as a brand in order to register in these commercially oriented, quantification-driven systems. As use of these sites becomes more pervasive and normative, we start to seem to have no choice but to self-brand because it is the only way to take the measure of ourselves.
That's the prerequisite for the condition of microfame. Social media researcher Alice Marwick has described how social media has changed celebrity from "something a person is to something a person does," and that it "exists on a continuum rather than as a singular quality." She usefully defines microcelebrity "as a mind-set and set of practices in which one's online contacts are constructed as an audience or fan base, popularity is maintained through ongoing fan management, and self-presentation is carefully assembled to be consumed by others."
What I find provocative about that definition is how inclusive it is: anyone who takes social media's affordances at face value may end up a microcelebrity, whether they have five followers or 5,000. But while this definition captures the active, self-branding side of celebrity, I think it underemphasizes the bald-Britney side of it a bit. In my view, the threat of surveillance coincides with the opportunity of self-branding.
MICROFAME AS STRUCTURE OF FEELING
The threat and the opportunity converge and imbricate themselves to constitute microfame as a structure of feeling, which Raymond Williams defines as capturing "specifically affective elements of consciousness and relationships: not feeling against thought, but thought as felt and feeling as thought: practical consciousness of a present kind, in a living and interrelating continuity." That's pretty nebulous as a definition, but I take it to mean that we still experience our everyday life as ruled by matter-of-fact habits even as the nature of everyday life is being overhauled. As a structure of feeling, microfame gives us an internalized way to process impulses, experiences, and emotions with regard to having an avatar, a mandatory social-media self. But it's precisely because the metaphor doesn't work explicitly in our consciousness that it serves these functions. It makes enough sense to not require careful consideration as we are in the midst of everyday decision making and interpretation. It allows the contradictions that come with surveillance, exposure, and the search for social recognition to stand unresolved, despite the ongoing, low-level dissonance they produce.
Microfame is a matter of juggling our residual fantasies of inhabiting some pure, uncalculated, “authentic” identity with the conflicting potential audiences for our constant, inevitable self-performance. It captures the fantasies of omnipotent control and terrors of ultimate abandonment in the midst of all the universal love and harmony you can see flowing down your Facebook newsfeed.
THE THREAT OF INVISIBILITY
Well, okay. So if visibility is so potentially traumatic, and we are the spies inflicting the trauma on one another, why don’t we just try harder to collectively opt out? What’s wrong with radical exodus from social media as a solution?
Aside from the many ways in which social media serve as an agreed-upon repository for public reputability, the problem is a two-pronged threat of invisibility:
(1) participation in social media seems economically necessary under a neoliberal organization of work within a so-called attention economy.
Self-surveillance yields self-branding, a packaging of the self to accommodate precarious economic conditions (one must be flexible, one must put the total personality to work in productivity, and so on).
(2) social media seems experientially necessary to secure a sense of social belonging and ontological security: a stable sense of self that is "relevant," an identity with "integrity" and credibility on the burgeoning reputation market administered by Facebook and other online identity repositories.
Participatory surveillance produces a “participatory subject” that knows itself only through audience reactions. Self-worth becomes subject to the law of network effects.
I’m using microfame to try to capture that double threat: with the fame part capturing both the risk and reward of our efforts to manage our visibility, which in many ways is already beyond managing, and the micro part evoking how the affect is parsed out in small doses, in microaffirmations — likes, retweets, reblogs, mentions, and so on. The checking rituals that cement together the disparate moments of the day for more and more people.
Social media can make the feeling of belonging seem like an alienated accomplishment measurable in discrete amounts of individualized attention. In a social environment that’s increasingly congested by competing and unceasing claims for recognition, we must clamor for the attention we do want. Microfame means feeling unduly neglected, not feeling slightly famous.
We must continually reconcile the self we create as a transmittable product with the vulnerable, variable consciousness we inhabit from moment to moment. Despite social media’s scorecards, belonging is also a matter of fleeting, spontaneous empathy, moments of presence in which we’re not just watching and tracking others but experiencing an underlying mutuality. The tension between the immeasurable intimacy of such moments and the precisely metered popularity of the self as a branded identity can become unbearable. It plays out as a perpetually unfolding crisis of insecure exposure that we crave and refuse simultaneously.
USES AND PROBLEMS OF MICROFAME METAPHOR
The fame part of microfame invites some misconceptions worth clearing away.
(1) No one sets out to become microfamous. It's something to which we are all susceptible; it's something that happens to us, to how we think of ourselves and what seem like reasonable goals. Our sense of what we can expect for ourselves shifts without our having to deliberately stoke our ambition.
(2) Microfame starts early. People now become unwilling avatars as children. Novelist David Zweig confessed in a recent New York Times essay that "relentless documentation" is "making our children increasingly self-aware." Children inherit not only the traditional circumscribers of identity — social class, race, gender, geography — but a whole set of images, unremembered memories.
(3) Micro evokes something small and insubstantial, but it actually marks a hunger, an excess of dissatisfaction that is generated but can't be resolved by social media.
(4) It rationalizes "overeager personal branding" and the idea of an attention economy. The attention economy is unthinkable without microfame as a general condition. I want to look at those two ideas more closely.
SELF-BRANDING AND SELF-SURVEILLANCE
The imperative of self-exploitation strikes everyone unevenly. It's conditioned by gender and race and class and social capital and innumerable other things that would be impossible to totalize. The sort of attention we can seek, let alone attract, is not an entirely autonomous choice. What is available, how far we can go, is highly conditioned by where we are already situated. Rewards and punishments are just as variable, despite the deceptive uniformity of identity templates on social-media platforms. Social media systematically efface the distinctions between forms of attention, positing attention as uniformly positive and thus universally desirable. They do a poor job of allowing us to calibrate our exposure, which is always theoretically infinite despite whatever temporary barrier privacy settings erect. Their entire logic militates against it.
Social media’s seemingly objective measures of individual reputation and influence are part of the ex-post justification process. The service Klout is merely the most egregious of these; it measures how much influence we “deserve” by being active online. Also, having our thoughts, opinions, friends, and relations turned into marketing data serves to justify surveillance.
That sort of attention constitutes us as a particular kind of sharing subject, confirming that we are “being ourselves” when we produce data, validating the primacy of documents over immediate lived experience.
Political theorist Jodi Dean derives this from the demands of neoliberalism for flexible, self-starting subjects willing to convert all of life into capital:
Neoliberal ideology does not produce its subjects by interpellating them into symbolically anchored identities (structured according to conventions of gender, race, work, and national citizenship). Instead, it enjoins subjects to develop our creative potential and cultivate our individuality. Communicative capitalism's circuits of entertainment and consumption supply the ever new experiences and accessories we use to perform this self-fashioning — I must be fit! I must be stylish! I must realize my dreams. I must because I can — everyone wins. If I don't, not only am I a loser but I am not a person at all. I am not part of everyone. (Democracy and Other Neoliberal Fantasies, 66)
Once this sort of documentation takes hold, life becomes a pretense for recording, and social being becomes alienated as "communicative capitalism." Lives are lived merely to be confessed and monetized on social media, which confer significance on otherwise meaningless-seeming events. Getting likes on a photo of a meal is more "significant" than eating it.
Nathan Jurgenson calls this "the Facebook eye": we experience the "present as always a future past" as we process experience in terms of how we can rebroadcast it in social media, our brains "always looking for moments where the ephemeral blur of lived experience might best be translated into a Facebook post; one that will draw the most comments and 'likes.'"
COOL/HIPSTERISM/THE YOUNG-GIRL
But what guides that search? One way of articulating it is as the pursuit of “cool”: we want to share things online that make us seem like we know what is going on, like we can keep up and “curate” effectively, that we get current memes and can add our own twist on them.
The degree to which one accepts cool as a legitimate positive value, as something that enhances life, is also the degree to which one has bought into communicative capitalism. Pursuing cool is the pursuit of self-alienation as self-realization. You end up ruining all the cool things you want to share in social media to a degree because you can’t share them without obscuring them with self-importance. Look at me! predominates over Look at this!
In Preliminary Materials for a Theory of the Young-Girl, the French ultra-left theory collective Tiqqun describes this sort of self-promotion through cool as being “reformatted by the Spectacle,” marking “the moment when each person is called upon to relate to themselves as value.” Tiqqun’s figure for such a person is the “young-girl”: the “model citizen as redefined by consumer society since World War I.”
Among the traits Tiqqun assigns to the Young-Girl are several that are also characteristics of microfame: the self-consciousness of self-branding, the paranoia of always being watched, the pre-emptive self-objectification. The figure stands in for all the ways in which our sense of self is rendered more insecure in all-embracing social media.
In a lot of ways, it makes more sense to me if you replace the somewhat inflammatory “Young-Girl” with marginally less inflammatory “hipster,” which describes the same figure. The effective hipster signifies a plenitude of cool that it cannot actually possess but that nonetheless inspires a kind of hopelessness in observers, regulating all who come into contact with it by inspiring feelings of inadequacy, disappointment, envy, boredom. The hipster is the bellwether for the sort of subject that understands itself only through surveillance, through the assumption that its every desire is being judged, and that desire is pointless unless it can be displayed and surveyed. Social technology has made this sort of anxiety commonplace.
THE ATTENTION ECONOMY AS FRAMEWORK
All these hipsterized efforts at cool creation feed into what's sometimes figured as "the attention economy." Like microfame, this metaphor is problematic — but I think it makes explicit some ideological assumptions that might not otherwise crystallize. "The attention economy" defines social media strictly as a field of opportunity to exploit, downplaying the degree to which it is also a dangerous plane of exposure.
The attention economy is part of the conceptual framework that helps hold together the different components of microfame that pull in opposite directions:
(1) the self as composed of chunks of information, not embodied experience. (self-surveillance yielding an alienated personal brand)
(2) the requirement to produce the self as information, i.e., the self as small media company producing audiences (a participatory subject conducting lateral surveillance)
(3) the requirement to have information about the self priced in a marketplace of cool
(4) the conception of attention as an abstraction, masking its qualitative differences, presenting it chiefly as a product, a reward, as utility plain and simple (“All attention is good attention.”)
(5) attention as the form of self-worth and social recognition
In the classic formulation of the attention economy, from Herbert Simon in 1971, a scarcity of attention is produced by a surfeit of information.
In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.
This leads to our needing to economize our attention. In a sense, TMI produces attention as something worth measuring.
Clearly we have more ready access to information, and we’re dumping and circulating information in online networks like never before. But the problem Simon correctly identifies should be differentiated from the capitalist solution to that problem, which is to regard attention as a capital stock, or as a product subject to supply and demand curves.
Neoclassical economics translates Simon's insight into the notion that information has become cheap. At the same time, social media has induced us to see social life and our identity as information. How elastic is our demand for information, even when it is "social"? By the attention economy's lights, we can no longer afford to flatter ourselves that our curiosity is unlimited, even about our favorite subjects or our closest friends. At the same time, economizing attention confronts us with the possibility that there might not be enough microaffirmation to go around. Overwhelmed by information yet encouraged by that very fact to regard ourselves as nothing but information, we become alive to the danger of being diluted to insignificance.
Thinking about attention as capital alienates our attention span; we are supposed to spend attention rationally, as an investment or to win the exchange, rather than as a consequence of finding ourselves absorbed. If attention is cheap, curiosity has become expensive.
Similarly, if we take attention to be a product, the attention economy becomes no different from the traditional media business, in which attention is a matter of brokering audience shares. In this model, information serves merely as attention’s alibi. Information is the medium that permits attention to be measured. Social media’s flood of information serves not to inform but to artificially inflate the value of attention.
This conception takes the “Facebook Eye” one step further; life isn’t merely lived to be documented, but life is documented merely to be quantified.
The attention-economy metaphor reflects our desire to master the fact of being for sale. It offers the conceptual illusion that by measuring how much we are scrutinized, we assert some control over it, and that surveillance is a personal-growth opportunity.
But the solace of abstract quantity can only take us so far. Though we know we are connected to hundreds of friends on Facebook and thousands of people on Twitter, and we know for sure that we are more concretely connected and discretely known than was previously imaginable, this seems only to raise the salience of missed connections, the sense of threatened intimacy, the imminent possibility of abandonment for all those other avatars out there competing for attention. The pressure to communicate raises the disturbing possibility that we might run out of raw material with which to continue the project of maintaining a recognizable self. This pressure is embedded in social media’s algorithms, like Facebook’s EdgeRank, which determines whose posts show up where.
The threat of social disappearance looms: You might have to act like bald Britney or else vanish from the social world altogether. You can be well-documented and a nonentity at the same time.
The attention-driven subjectivity has insecurity built into it, and it’s exacerbated by the practices that are meant to ameliorate it. In this it is a natural evolution out of consumerism, which works the same way, inviting us to chase an authentic idea of ourselves through an endless series of purchases. We’re sold the promise of an essential self to chase along with the goods; we get to blind ourselves to how contingent and open-ended identity is. We get to forget for a moment that we are strangers to ourselves.
But when we broadcast our identity to discover it, the results can only feel inauthentic, no matter how authentic the process may have felt. Online, “becoming oneself” has turned into a crappy job — a compulsory low-paying, low-skill job. Interiority has become a factory; social media the showroom floor. Social media grant coherence and a convenient place to try to make a self, but in exchange we get invested in a notion of the “integrity” of that self (à la Mark Zuckerberg’s infamous claim that “Having two identities for yourself is an example of a lack of integrity”) that we can’t live up to. So we are driven by the online self’s apparent inadequacy, pursuing some purer form of identity that simultaneously renders us more vulnerable to defacement.
Living in an attention economy means dealing with not only a scarcity of time to consume information (and people as information) but also a scarcity of empathy. Attention deficits become double-sided; we don’t have enough to focus on what’s important, and we don’t receive enough to feel solid. Intimate communication becomes inefficient as its token abundance makes it less effective. All of it fails to convince; it all raises more questions of trust rather than answers. The more we put in, the more we think we are stabilizing identity, but really that additional information makes identity more vulnerable to subsequent attack. It all becomes grist for future doxxing.
While social media constructs attention as abstract and positive, the reality of unwilling avatars and unsatisfying selves reminds us that not all attention is equal. The quality of attention re-enters the picture in obscured, anxiety-inducing ways.
I've already mentioned how celebrities are scapegoated for so readily accepting alienated identity. We treat them as canny entrepreneurs of exhibitionism to preserve the illusion that, for the rest of us, there's an authentic identity that capital can't yet touch.
But social media has changed that, and such scapegoating merely disguises our complicity in these networks. If we embrace the idea of a personal brand, we come to seem culpable for the problems visibility causes. “Microfame” has sometimes been used to refer to people who are too eager to compete in the attention economy instead of using social media with appropriate moderation. Those people can then be accused of betraying social media’s promise instead of merely manifesting its logic.
We can disavow and pathologize their confessional practices and reassure ourselves that such people are narcissists; they have been seduced into false values. In 2011, social-media researcher danah boyd described teenagers’ pursuit of microcelebrity in these terms, as “celebritization”:
Celebrity becomes a correlate to a perfect life — money, designer clothes, and adulthood. What being a “celebrity” means is discarded; fame is an end to itself with the assumption that fame equals all things awesome despite all the copious examples to the contrary. So teens only hold on to the positive aspects, hoping for the benefits of becoming famous and ignoring the consequences.
boyd was responding to a 2011 Rolling Stone profile by Sabrina Rubin Erdely of Kiki Kannibal, a teenager who used MySpace to achieve notoriety but, as boyd puts it, "lacked the resources to handle the onslaught." Unlike, say, Britney Spears, she doesn't have the compensatory wealth and protection that come from having agents, handlers, lackeys, and the institutional support of massive media corporations. Erdely described Kiki Kannibal as "a girl with 12,000 Twitter followers whose actual life is empty of real relationships. She's trapped in suburban isolation; outside the bubble of her family, her most meaningful interactions are electronic. In real life, she's lost."
The implication is that Kiki Kannibal’s warped pursuit of celebrity stifled her ability to foster intimacy with the appropriate, local people. Instead of learning how to make “real” friendships, she learned only how to market herself online. boyd suggested that girls get trapped in such dynamics because “fame is a toxic substance,” noting that “when the attention is good, it’s really good and it feels really good. And when the attention fades, people can feel lonely and anxious, desperate for more, even if it’s negative attention.” The fantasy of fame is toxic because it translates threats into thrills.
One could pathologize Amanda Todd in a similar way. In her video she explains the genesis of her harassment in terms of the seductiveness of the approval of strangers: “In 7th grade I would go with friends on webcam to meet and talk to new people. Then got called stunning, beautiful, perfect, etc … Then wanted me to flash … So I did.” By the logic of “toxic fame” critique, one might conclude, If only she and her friends could have contented themselves with each other’s approval instead of yearning for “new people,” then she wouldn’t have become so vulnerable.
These cases are represented as tragic outliers, but they seem paradigmatic to me. To pin the desire for "too much" attention on the aberrant psychology of celebrity-addicted teenagers foolishly trying to become like Britney Spears downplays the way social media and the attention economy render affirmations from friends less genuine and less valuable than those from rarer or more novel contacts. (In this, it resembles consumerism, with its emphasis on novelty as value.) All social-media users are getting habituated to receiving social recognition in the form of online notifications, digitized receipts of acknowledgment from further and further afield.
Most social media use is admittedly fairly mundane. In a recent paper, “The Public Domain: Social Surveillance in Everyday Life,” Marwick points out that “sharing information … is often motivated by trust and intimacy. Studies show that electronic communication is primarily used to reinforce pre-existing relationships, especially by young people.” The volume of captured communication required to prove trust and intimacy, however, creates an equal and opposite opportunity to violate that trust, producing more communication to address the drama. This becomes part of the backdrop of life’s everyday hassles, the at times tedious work of weaving strong bonds. It may be that the avalanche of microaffirmations from friends prompts us to immediately take them for granted and seek some nonfriend to supply “genuine” unbiased recognition that’s not rote.
To reach such audiences one might resort to exhibitionism, probably the hallmark microfame practice. One can turn to performing a seductive or confessional form of self to reach the purest sort of adoring audience beyond ordinary friendship. So off to the side of the habitual and frustrating contact among close friends, yet accessible at the edges of the same networks, are seemingly objective voices whose interest and attention are novel and thus much more potent. So it blossoms into Tumblr ask boxes or personal YouTube channels or posting selfies or keeping freeform chaos self-harm blogs or whatever, but really it extends through any online social-media behavior motivated by the contradictory feeling of playing offense and defense simultaneously — inscrutable tweets, ambiguous trolling, approach-avoidance flirting, lurking, ad hominem pleading. One seeks intensity at the limits of connectivity.
This intensity-seeking resembles a sort of magical thinking by which ambiguous, threatening audiences are automatically converted into appreciative “fans,” to use Marwick’s word. But that seems to falsely imply that microfame exhibitionism stems from delusions of grandeur. Valences drawn from celebrity culture and discourse may help suppress the threat these fringe audiences represent, but that doesn’t mean the attention seeker is unusually needy and grandiose.
These edge-of-network experiences are available online to everyone, and social media makes them easier and easier to enact. You don’t have to be particularly extraordinary to reach the outer limits; that’s the democracy of microfame.
Social media becomes a field for taking calculated risks, blending threat and opportunity, for pursuing what gamblers have always called “action,” an intensity over ventured stakes that makes the present moment seem like the only thing that matters in the world. Action can’t happen without stakes to gather the affect to you. Gamblers’ wagers are monetary and wait for the roll of the dice to decide the outcome; the anxious and microfamous risk their reputation for an immediate response. This is why it matters that this behavior can be traced to you. The intensity of microfame stems from defying the threat of invisibility, from turning unwilling avatars inside-out. If the appeal were merely a matter of confessing for confessing’s sake, you could go on Reddit with a disposable user ID and disgorge yourself.
Erving Goffman argues that we pursue action to access otherwise inaccessible dimensions of character and prove our poise and “composure” — the ability to act natural. In the age of hipsterism, we might call it a display of “authenticity.” Choosing “action” makes us believe we can assert control over the way our lives are contingent and at the mercy of fate. We seem to choose the momentous occasions for ourselves rather than become subject to them. Through our composure in the risky performance of self, we prove that the identity we are constructing is also natural, who we really are. It has gravity; we are not simply deletable, ephemeral.
Most people aren’t courageous enough to seek action outright, Goffman claims, so they pursue vicarious substitutes in entertainment (they consume the extreme risky behavior of heroes in books, movies, TV) or in packaged thrills, like amusement-park rides. Social media, especially highly structured sites like Facebook, offer opportunities for self-definition that shade from vicarious to fully interactive. Like the action at casinos where you get to pretend at high-rollerdom, social-media action is “at once vicarious and real.”
Those moments of extending ourselves, crossing boundaries, are extreme in the way that celebrity can seem extreme — transcending the ordinary limits of a life bounded by time and space to touch the infinite possibilities of other lives intersecting yours. The architecture of social-media platforms encourages us to substitute friends’ and strangers’ attention freely for one another and reckon with the consequences of the imperfect substitution later. But it’s one thing to have, say, a witty retort about a presidential debate retweeted, and quite another to have a picture of yourself show up on a jailbait subreddit. Any communication that once established connection, trust, intimacy, can rapidly be redeployed as exhibitionistic and shaming. The once objective-seeming “fans” become terrifyingly irrational abusers. Microfame posits an eventual social universe of stalkers and sycophants and not much else.
Being microfamous doesn’t make you any less of a nobody. It doesn’t resolve the authentic-self problem; it exacerbates it. It’s a coping strategy that offers ersatz solutions to problems it stabilizes. It gives shape to how to become a self, but the self it shapes is barely tolerable, a kind of self in constant crisis, always oversharing and scrambling.
In this respect, microfame is a manifestation of what Lauren Berlant calls cruel optimism: it makes us hopeful about the incompatible pressures on the self without doing anything to end the crisis. It’s an attempt to make the best of being stuck in surveillance, stuck with inauthenticity.
Overcommitment to the project of self-branding may be a way of making it seem sustainable. The voluntary vulnerabilities of “seeking action” seem to afford psychic protection. It inverts acute self-consciousness into an asset instead of the liability it can be in seeking “spontaneous authenticity.” The self-monitoring of social media can serve as a mode of escape from an overwhelmed self. Berlant writes:
To create forms for managing the post-traumatic drives requires an acute visceral and intellectual sensorium that monitors at all times, judging and distinguishing, yet gathering up sensations generously. Monitoring is more important than the mastery of knowing … that patrolling activity, which enables self-deferral as well. But monitoring in itself assures no authenticity: it just keeps the subject close to the scene, the enigmatic representation.
This seems to me a description of microfame’s radical vulnerability, its compulsive self-revealing seriality. Monitoring forestalls identity, defers it even as it anchors the self with more details. The expression of each additional detail is liberating in the moment; that moment becomes a moment of freedom from the past history of the self — a brief window that remains open until the new detail is assimilated to the general narrative, processed by its audiences or conclusively ignored.
Embracing microfame may be a way of moving beyond individuality through an intensification of it; it becomes a giving of oneself over to what a public will make of you, looking for relief in that annihilation. It can seem that only by revealing yourself, exposing yourself, humiliating yourself in public do you get to move past the fear of it — a pre-emptive self-bullying to short-circuit the universal predation that internet culture intensifies.
According to sociologist Anders Albrechtslund, “Visibility becomes a tool of power that can be used to rebel against the shame associated with not being private about certain things. Thus, exhibitionism is liberating, because it represents a refusal to be humble.” That seems a little overstated.
But visibility and attention, anxiety and self-exposure, shame and countershame are all bound up in a tight cycle that yields more product to flood the channels of communicative capitalism without ever really endangering them. We bear the burden of the contradictions they generate. But the logic can be carried further. If we are convinced that pre-emptive self-exposure is mandatory, then any apparent freedom from microfame could be treated everywhere as a form of unearned privilege that should be attacked and exposed, belittled and mocked, until everyone behaves as though they are microfamous — until everyone is, in Tiqqun’s term, “Young-Girlified.”
In a way, this would realize Zuckerberg’s dream of compulsory “integrity.” Reflexive sharing admits you as a good-faith member of society; covert online identities or any other subterfuges to preserve privacy are the mark of furtive sociopathy.
To the extent that microfame secures you a loyal group of well-connected devotees ready to mobilize surveillance, rumor, and gossip on social media on your behalf, it works to protect you. In the meantime, it makes privacy truly the prerogative of the ultra-powerful, as posses of microfamous vigilantes online out the more weakly defended. Meanwhile, the lords of privacy will remain above the fray, profiting from the way our crises drive communication and commerce, until the human equivalent of colony collapse disorder sweeps all our contentious, warring hives away.