Free to Choose A or B

Image by Langdon Graves

There has already been a lot written about the Facebook mood-manipulation study (here are three I found particularly useful; Tarleton Gillespie has a more extensive link collection here), and hopefully the outrage sparked by it will mark a turning point in users' attitudes toward social-media platforms. People are angry about lots of different aspects of this study, but the main thing seems to be that Facebook distorts what users see for its own ends, as if users can't be trusted to have their own emotional responses to what their putative friends post. That Facebook seemed to have been caught by surprise by the anger some have expressed — that people were not pleased to discover that their social lives are being treated as a petri dish by Facebook so that it can make its product more profitable — shows how thoroughly companies like Facebook see their users' emotional reactions as their work product. How you feel using Facebook is, in the view of the company's engineers, something they made, something that has little to do with your unique emotional sensitivities or perspective. From Facebook's point of view, you are susceptible to coding, just like its interface. Getting you to be a more profitable user for the company is only a matter of affective optimization, a matter of tweaking your programming to get you to pay more attention, spend more time on the site, share more, etc.

But it turns out Facebook's users don't see themselves as compliant, passive consumers of Facebook's emotional servicing; instead they had bought into the rhetoric that Facebook was a tool for communicating with their friends and family and structuring their social lives. When Facebook manipulates what users see — as it has done increasingly since the advent of its Newsfeed — the tool becomes more and more useless for communication and more of a curated entertainment product, engineered to sap your attention and extract formatted reactions that Facebook can use to better sell audiences to advertisers. It may be that people like this product, the same way people like the local news or Transformers movies. Consumers expect those products to manipulate them emotionally. But that wasn't part of the tacit contract in agreeing to use Facebook. If Facebook basically aspires to be as emotionally manipulative as The Fault in Our Stars, its product is much harder to sell as a means of personal expression and social connection. Facebook connects you to a zeitgeist it manufactures, not to the particular, uneven, unpredictable emotional landscape made up by your unique combination of friends. Gillespie explains this well:

social media, and Facebook most of all, truly violates a century-old distinction we know very well, between what were two, distinct kinds of information services. On the one hand, we had “trusted interpersonal information conduits” — the telephone companies, the post office. Users gave them information aimed for others and the service was entrusted to deliver that information. We expected them not to curate or even monitor that content, in fact we made it illegal to do otherwise...

On the other hand, we had “media content producers” — radio, film, magazines, newspapers, television, video games — where the entertainment they made for us felt like the commodity we paid for (sometimes with money, sometimes with our attention to ads), and it was designed to be as gripping as possible. We knew that producers made careful selections based on appealing to us as audiences, and deliberately played on our emotions as part of their design. We were not surprised that a sitcom was designed to be funny, even that the network might conduct focus group research to decide which ending was funnier (A/B testing?). But we would be surprised, outraged, to find out that the post office delivered only some of the letters addressed to us, in order to give us the most emotionally engaging mail experience.

Facebook takes our friends' efforts to communicate with us and turns them into an entertainment product meant to make Facebook money.

Facebook's excuse for filtering our feed is that users can't handle the unfiltered flow of all their friends' updates. Essentially, we took social media and massified it, and then we needed Facebook to rescue us, to restore the order we have always counted on editors, film and TV producers, A&R professionals, and the like to provide for us. Our aggregate behavior, from the point of view of a massive network like Facebook's, suggests we want to consume a distilled average of our friends' promiscuous sharing; that's because from a data-analysis perspective, we have no particularity or specificity — we are just a set of relations, of likely matches and correspondences to some set of the billion other users. The medium massifies our tastes.

Facebook has incentive to make us feel like consumers of its service because that may distract us from the way in which our contributions to the network constitute unwaged labor. Choice is work, though we live in an ideological miasma that represents it as ever and always a form of freedom. In The New Way of the World, Dardot and Laval identify this as the quintessence of neoliberal subjectivity: "life is exclusively depicted as the result of individual choices," and the more choices we make, the more control we supposedly have over our lives. But those choices are structured not only by social contexts that exceed individual management but by entities like Facebook that come to be seen as part of the unchangeable infrastructure of contemporary life. "Neoliberal strategy consisted, and still consists, in constantly and systematically guiding the conduct of individuals as if they were always and everywhere engaged in relations of transaction and competition in a market," Dardot and Laval write. Facebook has fashioned itself into a compelling implementation of that strategy. Its black-box algorithms induce and naturalize competition among users for each other's attention, and its atomizing interface nullifies the notion of shared experience, collective subjectivity. The mood-manipulation study is a clear demonstration, as Cameron Tonkinwise noted on Twitter, that "There's my Internet and then yours. There's no 'The Internet.'" Everyone using Facebook sees a window on reality customized for them, meant for maximal manipulation.

Not only does Facebook impose interpersonal competition under the rubric of sharing, it also imposes choice as continual A/B testing — which might be seen as the opposite of rational choice but is, from the point of view of capital, its perfection. Without even intending it, you express a preference that has already been translated into useful market data to benefit a company, which is, of course, the true meaning of "rational": profitable. You assume the risks involved in the choice without realizing it. Did Facebook's peppering your feed with too much happiness make you incredibly depressed? Who cares? Facebook got the information it sought from your response within the site.

A/B testing, the method used in the mood-manipulation study, is a matter of slotting consumers into control and test groups without telling them and varying some key variable to see whether the variation instigates sales or prompts some other profitable behavior. It is a way of harvesting users' preferences as uncompensated market research. A/B testing enacts an obligation to choose by essentially choosing for you and tracking how you respond to your forced choice. It lays bare the phoniness of the rhetoric of consumer empowerment through customization — in the end, companies like Facebook treat choice not as an expression of autonomy but as a product input that can be voluntary or forced, and the meaning of choice is not your pleasure but the company's profit. If your preferences about Facebook's interface compromise its profitability, you will be forced to make different choices and reap what "autonomy" you can from those.
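
To make the mechanics concrete, here is a minimal sketch, in Python, of how this kind of silent bucketing typically works. Everything in it (the salt string, the assign_bucket helper, the positivity_weight parameter) is a hypothetical illustration of the general technique, not anything from Facebook's systems or from the study itself.

```python
# Hypothetical sketch of A/B bucketing; illustrative only, not Facebook's code.
import hashlib

EXPERIMENT_SALT = "mood-experiment"  # hypothetical per-experiment identifier

def assign_bucket(user_id: str, n_buckets: int = 2) -> int:
    """Deterministically slot a user into a group without telling them.

    Hashing user_id with a per-experiment salt gives a stable,
    effectively random assignment: the same user always lands in
    the same group for the duration of the experiment.
    """
    digest = hashlib.sha256(f"{EXPERIMENT_SALT}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % n_buckets

def feed_parameters(user_id: str) -> dict:
    """Vary one key variable per group; the user never sees which arm they're in."""
    if assign_bucket(user_id) == 0:
        return {"group": "control", "positivity_weight": 1.0}   # feed left as-is
    return {"group": "treatment", "positivity_weight": 0.5}     # positive posts demoted

if __name__ == "__main__":
    for uid in ["alice", "bob", "carol"]:
        print(uid, feed_parameters(uid))
```

The "market research" then amounts to aggregating each group's subsequent behavior (posting frequency, word choice, time on site) and shipping whichever variant proves more profitable.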

That would seem to run against the neoliberal strategy of using subjects' consciousness of "free" choice to control them. But as Dardot and Laval point out, "the expansion of evaluative technology as a disciplinary mode rests on the fact that the more individual calculators are supposed to be free to choose, the more they must be monitored and evaluated to obviate their fundamental opportunism and compel them to identify their interests with the organizations employing them." Hopefully the revelation of the mood-manipulation study will remind everyone that Facebook employs its users in the guise of catering to them.