"Organic stories"

Image by Sophie Kern

Facebook recently changed its News Feed algorithm, which determines what content users see out of everything shared by the users they have elected to follow. Traditionally, Facebook has veiled its algorithm in secrecy, even going so far as to disavow "EdgeRank," an earlier name for it. But this time (as ReadWrite notes) Facebook has decided to accompany the round of algorithm tweaks with a propaganda post about the company's benevolent intentions. It's as nauseating as you'd expect, full of cheerful insistence about how hard the company is working to give users what they want, which is its doublespeak for making users think they want what Facebook wants for them.

The goal of News Feed is to deliver the right content to the right people at the right time so they don’t miss the stories that are important to them. Ideally, we want News Feed to show all the posts people want to see in the order they want to read them ... With so many stories, there is a good chance people would miss something they wanted to see if we displayed a continuous, unranked stream of information. Our ranking isn’t perfect, but in our tests, when we stop ranking and instead show posts in chronological order, the number of stories people read and the likes and comments they make decrease.

That sounds noble enough, but you can see what really matters to Facebook — that you feed it likes and comments and so on. It's going to rig your feed to extort the most interaction (a.k.a. free immaterial labor) from you. It also wants to control what it programs for your delectation so that it can sell your attention span to whoever is willing to bid for it. The News Feed algorithm is what enables the sponsored-stories racket, in which you have to pay to ensure that the material you share is actually displayed to your friends.
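
To make the logic of the quoted passage concrete, here is a minimal sketch of what an engagement-weighted feed ranker might look like. Everything in it is hypothetical (the Story fields, the weights, the decay factor), since Facebook publishes none of its actual signals; the point is only that such a ranker maximizes measurable interaction, not satisfied wants.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical weights, invented for illustration; Facebook's real
# signals and values are not public.
W_LIKE, W_COMMENT, W_SHARE = 1.0, 4.0, 8.0
DECAY_PER_HOUR = 0.9  # older stories count for less

@dataclass
class Story:
    author: str
    posted_at: datetime
    likes: int
    comments: int
    shares: int

def engagement_score(story: Story, now: datetime) -> float:
    """Score a story by past interaction, discounted by age."""
    interactions = (W_LIKE * story.likes
                    + W_COMMENT * story.comments
                    + W_SHARE * story.shares)
    age_hours = (now - story.posted_at).total_seconds() / 3600
    return interactions * (DECAY_PER_HOUR ** age_hours)

def ranked_feed(stories: list[Story], now: datetime) -> list[Story]:
    # The "continuous, unranked stream" Facebook contrasts itself with
    # is just reverse-chronological order; this ranked feed instead
    # optimizes for the one thing Facebook can measure: interaction.
    return sorted(stories, key=lambda s: engagement_score(s, now), reverse=True)
```

Note that nothing in a ranker like this can represent a post a user would have wanted but never interacted with; whatever scores low simply disappears from view.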

The claim that Facebook wants to give you what you want also makes no logical sense, given that Facebook decides what you think is important only on the basis of what you have already seen and liked on Facebook, ignoring the possibility that you could want something it hasn't already cajoled your friends into sharing. Instead it takes the even more dispiriting position that most of what is posted on its site is spam, and you should be grateful that Facebook is willing to protect you from the morons you have chosen to friend and their interminable updates about "checking in to a restaurant." If Facebook didn't impose its hierarchy on what you see, you might get all confused and upset! It's important that things be hidden from you, so you can see what's important.

But how can Facebook seriously claim that it "does a better job of showing people the stories they want to see" when there's no point of comparison? You can't evaluate whether people got to see "what they wanted to see" without knowing how much they would have wanted to see what's been withheld. Facebook must just figure you can't possibly want anything you haven't seen yet.

An increase in likes and shares and comments is not indicative of a better News Feed consumption experience, and Facebook is either being stupid or disingenuous when it suggests that it is. Facebook makes the category error of using interaction with the site (the only metric it cares about for its core data-retailing and marketing businesses) as a proxy for a user's engagement with the content their friends have shared — what Facebook chooses to call, in a very telling euphemism, "organic stories."

It's not clear what the inorganic stories are, whether they are "sponsored stories" or merely user-generated content that has been decontextualized by the News Feed algorithm from the time and occasion in which it was originally shared. Organic stories seem to be ones that have not yet been processed by Facebook into soylent green that's suitable for shoveling out into user News Feeds.

This predication of what you see on what you've already seen and approved guarantees a closed epistemological loop in which platform-dictated patterns can be reinforced and intensified — a phenomenon exemplified by such blight as Zynga's social games, which choreograph meaningless engagement for engagement's sake.

Because it measures what we want in terms of how much we interact with Facebook, Facebook will never give priority to or assume we like anything that is meant to encourage us to do anything outside of Facebook's universe. The only thing we really "want," from the algorithm's point of view, is to use Facebook as much as possible. Facebook wants us to forget that it can potentially provide useful information that directs us out toward the world and instead it demands we regard it as immersive entertainment, sucking us in. As Alexis Madrigal detailed here, Facebook aspires to be like a slot machine, engineered to pull you into the K-hole "machine zone."

It might be more apt to say that Facebook wants to make you into a machine. To say that "when a user likes something, that tells News Feed that they want to see more of it" is to impose an incredibly simplistic, Pavlovian psychological model of pleasure and desire on users. It presumes that of course anything you find pleasurable can be understood in quantitative terms and automatically indicates you want more of it. It's almost unfair to neoclassical economics to argue that the algorithm mirrors its limited behavioral psychology, because it sounds as though News Feed can't even assimilate the idea of diminishing marginal utility.
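
To see how crude that model is, consider a toy sketch (all names and numbers hypothetical, not drawn from Facebook) in which every like adds a fixed increment to a category's score, so predicted desire never saturates:

```python
from collections import defaultdict

# A toy version of the update rule the quote implies: each like strictly
# increases a category's predicted appeal. The category names and the
# linear scoring are invented for illustration.
affinity: dict[str, float] = defaultdict(float)

def register_like(category: str) -> None:
    # "When a user likes something, that tells News Feed that they want
    # to see more of it": affinity only ever goes up.
    affinity[category] += 1.0

def predicted_interest(category: str) -> float:
    # Linear in past likes: the tenth baby photo is presumed exactly as
    # wanted as the first. A model that admitted diminishing marginal
    # utility would saturate, e.g. math.log(1 + affinity[category]).
    return affinity[category]

for _ in range(10):
    register_like("baby photos")

print(predicted_interest("baby photos"))  # 10.0, and still climbing
```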

Facebook certainly can't fathom — or can't tolerate, rather — that you might enjoy something without leaving a trace of your enjoyment; that some pleasure in social media comes from lurking; that sometimes we are so intensely engaged with "stories" that it would seem glib to "like" them and inadequate to leave a comment. Sometimes we want to see content that we are ashamed of, but we still want to see it, even if we won't interact or "engage" with it. Sometimes we hate-like. Sometimes etiquette compels our interaction with users whose content we don't want to be more prominent.

In short, a user's interaction with content on Facebook is irreducibly complex, and Facebook's attempt to filter it for you is an attempt to eradicate the complexity and make you an emotional simpleton. Then you become a much more attractive and easily sighted target for the marketers Facebook wants to sell you to.

When you like something, Facebook assumes you are liking a generic category, not a particular thing, as though there could be nothing uniquely likeable about any given "story." Only if all stories are generic can liking one story warrant getting served "more of it." To Facebook, any shared item is just one more unit in a heaping pile of stuff that can be sorted into a few broad categories based entirely on metadata.

One gets the sense that Facebook regards its paternalistic algorithm as valuable intellectual property akin to Google's ever-evolving PageRank algorithm for generating search results. It's not hard to imagine Facebook's engineers congratulating themselves for coming up with the "PageRank of people," or maybe the "social PageRank." Facebook likes nothing better than treating its users as tractable chunks of information amenable to processing. The company has trouble conceiving of its users as people who should be afforded the respect of being able to make their own decisions, whether about privacy or about the information that they want to see or anything else. Instead, Facebook wants to cow users into accepting that the algorithm always knows better than they do. And once they accept that, they will be trapped in the habit of making explicit their feelings about everything to make sure they are registered and factored in, so the benevolent algorithm can serve them.

Facebook is like a television that monitors how much you are laughing and changes the channel if it decides you aren't laughing hard enough. It hopes to ingrain in users the idea that if your response to something isn't recordable, it doesn't exist, because for Facebook, that is true. Your pleasure is its product, what it wants to sell to marketers, so if you don't evince it, you are a worthless user wasting Facebook's server space. In the world according to Facebook, emotional interiority doesn't exist. Introspection doesn't exist, and neither does ambivalence. There is only ostentatious enthusiasm or null dormancy.

Yet rather than blame social-media companies for demanding elaborately performative emotional reactions from their users and skewing those users' experience to secure them, worried social commentators will invoke generational narcissism and insouciance about privacy to explain why people broadcast their mundane reactions to everything they encounter. I'm sorry, I mean their organic stories.