Feedbags

You don't control an algorithm by feeding more information to it; you teach it to control you better.

Facebook is updating the way it implements its News Feed algorithm, which controls what users see when they open up Facebook and just begin scrolling. In apparent response to criticism that the algorithm is an opaque black box that displays content suiting Facebook's interests more than the users', Facebook will now allow users to supply information to directly tweak the algorithm on their end, instead of having to try to blindly guide the algorithm through behavior (from within and outside Facebook) that Facebook may or may not capture and process. Rather than liking a bunch of posts by a person in hopes of seeing that person's posts more regularly, you can now tell Facebook whose posts you always want to see, as if you were following them on Twitter and had them on a list. You can also bounce people from your feed without unfriending them, a move akin to "muting" in TweetDeck.

In the company's announcement of these changes, Facebook product manager Jacob Frantz explains the company's motivation: "We know that ultimately you’re the only one who truly knows what is most meaningful to you and that is why we want to give you more ways to control what you see." I love the use of the word "ultimately" in that statement. We'll try to implement a bunch of technological mediations to shape what meaning even means to you, but ultimately we accept that the final decision rests with you. It took us a while to get there, but we ultimately recognize, with some reluctance, that we unfortunately can't tell you what to think.

But, ultimately, I think Frantz wants to make Facebook sound considerate. The company is letting users have more agency. It is deferring to them. The changes seem like a reasonable response to criticism of the algorithm's opacity: Look, here are some more levers to adjust that algorithm.

But this is not really a new development; there were already innumerable levers at our disposal to alter Facebook's algorithm and its interpretation of us. Like this post, view that profile, visit this third-party site while logged into Facebook, etc. We didn't know what the exact effects of these would be, and we still don't know what effects the new "controls" will have on our News Feeds. You don't control an algorithm by feeding more information to it; you teach it to control you better.

Facebook has always deferred to users because that deference allows it to gain more information that can be presumed more accurate than what it can merely infer. And it has never wanted to tell us what to find meaningful; it wants only to inscribe Facebook as the best place in which to discover our sense of meaning. The control Facebook's algorithms impose is not what to think but where to think it.

At best, these new options allow users to enjoy more of a feeling of responsibility for what the News Feed serves them, but the apparatus still functions in the same way. Users generate data for Facebook's algorithm, which the company uses to categorize them. This categorization then not only determines what Facebook users see but also constitutes a profile that allows the users' attention to be sold to advertisers and other third parties.

The only difference is that you are feeding data directly into an algorithm whose inner workings remain obscure. The algorithm still serves as an intermediary alibi that permits Facebook to present whatever information it wants to present to users under the pretense that the user has somehow obliquely insisted on it. The News Feed takes control away from users while conveying a sense of the users' ultimate responsibility: The users get to see themselves as discriminating (in the positive sense of the term, though the pejorative sense is also at play) while making none of the effort.

Behind this is a presumption about what Facebook users want: an automatic stream of content that keeps them looking at and engaging with Facebook rather than doing something else that takes them outside it. Much like cable television narrows our exercise of choice amid the field of possible information to flipping channels, Facebook narrows it to the single action of "scrolling down" through the programming it has seen fit to algorithmically supply.

The underlying assumption is that people don't want to have to choose among different ways of choosing to be informed — that is, among different ways to seek, evaluate, and assimilate information. They don't want to have to be; they just want to scroll. Scrolling is perfect in that it satisfies a user's need for action and their need for boredom, as a spur for further action. It sustains desire in an ideological cultural climate that tells us over and over again that "desiring" (particularly in the form of the money or attention we have to spend) is what makes us desirable, interesting; that our desire is what makes us powerful, not the choices we ultimately end up making on account of it.

So the problem is not that Facebook users have insufficient control over the algorithm that displays content; it's that users are willing to use Facebook as their primary gateway to the world, a kind of television with the minor improvement that the local news always includes reports on people you know. This leads to seeing the world only as so much content that Facebook can sort and prioritize and reify and sell. Facebook becomes (much like television before it) the medium that confers reality on experience. Until the News Feed algorithm has processed something, revealing its overall significance in our social graph, it doesn't properly exist. The moment the algorithm assimilates it is the moment when something actually happens.

Recently, Mark Zuckerberg said that he'd "bet there is" such a thing as "a fundamental mathematical law underlying human social relationships that governs the balance of who and what we all care about." Of course, this is precisely what Facebook's News Feed algorithm aspires to articulate. And naturally Facebook will represent this "law" as a discovery, a rendering of a pre-existing truth, a fact of nature. But that will only conceal the real accomplishment: having imposed a model that reshapes what is perceived as real about social relations in its image, in terms of what it can manage. The "fundamental law" depends on everyone using Facebook, and thus all that law will be able to express is how people persist in using it. For Facebook, the ultimate meaning of your life is that you want to use Facebook while you're living it. Let your profile be your epitaph.