Facebook as experiment

When I was in high school, I worked for a while at a market-research firm. My job was to cold-call people out of the phone book and ask them questions about cat litter, cigarettes, laundry detergents, and other products, depending on whatever clients the firm had at the time. This was before caller ID was widespread, so most people would answer. (Imagine that, talking to a stranger on the phone! It seems almost rash to take a call from an unknown number now -- the cell phone makes me feel like someone has singled me out for persecution.)  While some members of the households I would rudely intrude upon would be surly, most people were surprisingly polite. Sometimes we would have to cajole people with promises of how important their opinions were and how they had an opportunity to shape the products that make up their world, but many didn't need much priming. Some were terribly disappointed when they were disqualified from further questioning (no sense asking nonsmokers what they think of the branding on generic-cigarette packs) and would sometimes ask to change their answers so that they could continue with the survey.

Market research thereby reminds us that the consumer arena is about making yourself heard, making yourself known, and feeling like you really matter -- it's certainly not about the humdrum acquisition of use values. No one wants to feel the exclusion of the disqualified consumer. Moreover, the market itself is presented as the place to seek that sort of affirmation about the significance of your opinion. And it reinforces the consumerist notion that opinions are the essence of you; as far as society (i.e. the aggregate of individual choices mediated by markets) is concerned, we are the sum of our opinions, not our experiences or actions.

There is obviously an element of this eagerness to have one's opinion tallied in social-media usage, and companies have been mining sites for market-research data for a while now. I wonder how many people change their answers, so to speak, in search of some of that attention online, and whether a certain amount of attention along those lines is necessary to feel like one belongs to a consumer society. The feeling of inclusion, after all, is not something we achieve once and for all but something we continually crave and seek. Social media offers addictive reminders of it. The microaffirmations offered by Twitter and Facebook hinge on this need. What drives us to compulsively check is not merely the quest for novelty but also the comfort of the flattering illusion that this new information was offered to us in hopes that we would respond -- that our opinion matters and we belong on our terms, under conditions in which we always get to have our say. Dozens of messages scrolling past me on Twitter, all begging me to retweet them: I feel so powerful!

As the engine of micro-moments of pseudo-belonging, social media encourages ephemeral participation that reflects not deeply held concerns but rather an eagerness to say anything while the microphone is on and someone might be listening. But increasingly this behavior is being taken not as an index of users' impulsiveness but as an indication of how they behave socially in general, as if the medium did not condition what they do and selectively record only certain aspects of it. Social-media communication is being regarded as indistinguishable from the other forms of communication that hold together social forms in real space, despite the way that real spaces tend to have nonnegotiable components that condition our reactions. Real spaces are now defined by the mandatory sorts of communication they require, communication that we can't escape or time-shift or control the terms of. This is now what makes "real life" real: that we have to compromise and put up with other people's bullshit without being able to just scroll past them on a screen.

Yet social researchers are dazzled by the data logged by social media sites and are apparently eager to use that information as an uncomplicated proxy for social choices in general. At this point, enough people have made Facebook use a part of their everyday life that social researchers are treating Facebook as an empirical model of society itself. This threatens to yield policy conclusions that take people's regular Facebook use for granted, embedding Facebook use as "normal" and social-media rejection as pathological. It reinforces the idea that you need to be using Facebook to participate meaningfully in society, to matter statistically. This paper at First Monday by Andre Oboler, Kristopher Welsh, and Lito Cruz argues that Facebook needs to be understood as a "computational social-science tool" that is being operated not by scientists in the name of human betterment (I know, as if scientists are ever really working for just that) but by a for-profit company explicitly dedicated to "moving fast and breaking things." So if Facebook moves fast and tears apart the social fabric as we have come to count on it? Oops. If we have fashioned an identity and an ethos based on certain expectations of privacy and mutual respect for boundaries that no longer pertain? Oh, well. That's not Facebook's problem.

The paper's authors believe we should make it Facebook's problem. The company's mission statement "highlights the need for external constraints so society is not left bearing the cost of mistakes by social media innovators." In other words, states should develop regulations (which they concede will be basically impossible to enforce internationally) to stop Facebook from treating its users as guinea pigs, as this somewhat breathless Technology Review article by Tom Simonite describes. "Facebook has collected the most extensive data set ever assembled on human social behavior," Simonite enthuses, and they've hired a "Data Science Team" to figure out how best to exploit it: "Bell Labs for the social-networking age." If they keep working diligently on discerning "the basic patterns and motivations of human behavior" from status updates and other shared data, then "Facebook might eventually be able to guess what people want or don't want even before they realize it." You might think you are using social media, the same way a rat in the maze "uses" the scientists to get cheese, but that sensation is just an epiphenomenal by-product of the media using you -- just like with ebook readers, which monitor how you read in order to make reading a better profit center for publishers.

Even more exciting, the Data Science Team is discovering that it could alter the structure of the social network to change social behavior outside it, making Facebook not a simulacrum of society but its operating system: "By learning more about how small changes on Facebook can alter users' behavior outside the site, the company eventually 'could allow others to make use of Facebook in the same way,' says Marlow," the Facebook Data Science Team leader.

Apparently, that doesn't sound like a patently terrible idea to some people. I guess it sounds like joyous opportunity, wondrous innovation, to precisely the sort of technocrats who become corporate social scientists and who would never dream of being mistaken for the ordinary people/customers controlled by corporations in the same way. (They probably don't look at their paychecks too closely.)

But the authors of the First Monday paper are alive to the problems of predictive social research performed by private for-profit entities. "The danger today is that computational social science is being used opaquely and near ubiquitously, without recognition or regard for the past debate on ethical social science experimentation," the authors write, pointing out that Facebook intends to experiment on its user base (and license other companies to do the same) in search of new profit streams. They also remind us that "the very existence of social media can also promote government’s agenda":

While social media might enable activism, computational social science favours the state or at least those with power. Computational social science tools combined with social media data can be used to reconstruct the movements of activists, to locate dissidents, and to map their networks. Governments and their security services have a strong interest in this activity.

People hesitate to agitate for any sort of change when they are aware that their every movement is being recorded and that incentives are everywhere for others to turn them in, in ways that are decreasingly metaphoric as social media becomes ubiquitous.

Calling Facebook a big social experiment implies that the network opens up new possibilities for the service's users to change society, when in fact those opportunities rest with the site's administrators and the minions who can access the whole of the data trove. Rather than crack society open to experiment, it helps preserve existing social relations and makes them more supple and instrumental for those already in power. This really shouldn't be news to anyone: Facebook is not an experiment; it's a jail.