I recently finished reading The Boy Kings, Katherine Losse's account of what it was like to work at Facebook from 2005 to 2010. It's not a tell-all, burn-all-bridges exposé by any means, but it is fairly critical of Facebook's hubris and its personality-warping effects on users. A year ago I wrote an essay that argued that Facebook was a training ground for becoming a neoliberal subject, and many of Losse's observations seem to confirm it. Connectivity and flexibility for their own sake were valorized at Facebook, even as the site's architecture was designed to capture and digitize more of users' behavior. A hacker ethos was glorified while it was also domesticated; rules were for the little people (or the ordinary Facebook users) while the worthy devised "social hacks" to get ahead through heedless defiance and supplicating sweet talk after the fact. Losse writes:

Engineers were tacitly encouraged to break rules while the rest of the company had to follow them, unless they had some trick of their own. The people in the company who could get around this paradox were the ones who could social it (the short term for social engineering or hacking one's way around something using social means) by breaking the right rules and, above all, remaining popular, and in doing so riding all the inherent corporate contradictions as far as they would take them.

This and other passages reminded me a bit of Shoshana Zuboff's In the Age of the Smart Machine (I wrote about that book here), which explores how computerization made executive privilege even more ineffable and rarefied, as everyone else's work was deskilled and made subject to quantification. That view makes Facebook's goal to digitize everything in the everyday life of its users even more sinister, as it suggests it is an attempt to deskill everyone with respect to the sort of "social engineering" Losse refers to as the source of hacker power.

I was sort of shocked to learn that Facebook's engineers were actually also enthusiastic Facebook users, especially given how callously Facebook tends to treat its users. Losse chalks this up to Facebook's engineers having a technological mind-set that is indifferent to all things nonquantifiable, all things not manipulatable as data. But could they possibly be blind to the way Facebook use makes them subject to the same sort of control protocols, the same surveillance and subjugation, depriving them of the space to work social hacks and instead leaving them to have to grind out social capital (just like all the other rubes on Facebook) through ceaseless uncompensated work on the social network? It seems to me that Facebook makes your connections overt and obvious, whereas the connections and the charisma that matter for socialing advantages for yourself are not. When captured by Facebook, that kind of privilege, which often operates in secret and can only be inferred, gets dispersed. Trying to use Facebook connections to get ahead is not a hack; that is just playing the game as it has always been played — getting the right recommendations, sucking up, etc., etc. The real power is in getting people to do you unacknowledged, untraceable favors, and being able to grant them.

What Facebook makes inescapable, Losse notes, is the transformation of all captured information into cultural capital, into currency in a status game. "Instead of making a technology of understanding," Losse writes, "we seemed sometimes to be making a technology of the opposite: pure, dehumanizing objectification. We were optimizing ways to judge and use and dispose of people, without having to consider their feelings, or that they had feelings at all." Feelings are for the social hackers; for Facebook's user base, feelings don't count — they've been recast as likes and been dispensed with.

Basically, Losse restates the idea that social media mainly prompts not "openness" but judgment, poseurdom, defensiveness, and resentment. It serves to guarantee that information is used to articulate hierarchies rather than dismantle them. The best we can hope for is the coexistence online of multiple hierarchies, some of which we might be able to dominate. ("Dominate," incidentally, turns out to be a favorite word of Mark Zuckerberg's.)

In social media the point of information is always status, all the time. Nothing is for its own sake. This means, as Losse frequently claims, that using Facebook robs you of a sense of spontaneity and compromises your sense of personal authenticity. Everything becomes self-conscious, somewhat desperately strategic — and the more intense the social surveillance through Facebook and other social media is, the more this is the case. Without any space outside it in which to develop identity autonomously, we have no space to try out tentative beliefs (as philosopher Tom Sorell argues here); instead much of what we try is immediately fed into algorithms and social-judgment mechanisms and has gravity and persistence.

Losse points out how Facebook's News Feed and like buttons led to a company culture in which personality traits and experiences needed a "proof of concept" through appropriate metrics before being embraced as cool. As social media has become more omnipresent, trying out identity without "proof of concept" has become more risky. It begins to make sense to commit to no particular identity in advance and live from within a sort of beta-testing self to see what sort of self the network tells you that you should embrace. Since News Feed relies on algorithms to narrativize our life experiences in terms of what has proved popular with our "friends," why not let it tell our life story rather than trying to devise one in advance, before the fact?

What Facebook use may be doing is acclimating users to this post hoc self. Users seize as their identity only what they are told is acceptable after the fact — an algorithmically recommended personality. In other words, social media redefines spontaneity as orthodoxy much like it redefines serendipity as automatically generated recommendations. Spontaneity becomes surprise at what algorithms and tracking tell us we should own as the basis of our identity. Oh! That's who I am!

Toward the end of the book, Losse wonders if Facebook's engineers are generally so out of touch with themselves that they developed the site expressly with this intention, to make having a core personality somewhat superfluous:

For all their rabid data consumption, there was a lot the engineers didn't know. That was partly why Mark made Facebook, and why the boys of the valley were so busy turning our lives into data, as if by doing so, their algorithms could tell them something that their eyes and hearts couldn't. As Thrax [one of Facebook's engineers] announced triumphantly at his desk one day, "I just wrote an algorithm to tell me who I am closest to!" He went on to show a set of scores that, according to his algorithm's calculations, revealed how close he was to all his Facebook friends.

Some version of that algorithm is likely powering Facebook's mechanism for choreographing users' News Feed, deciding for them what sorts of information they should want about the people they are connected to.

Losse laments several times that Facebook undermines our ability to be "loved for who we really are" by inviting universal judgment of everything we "share." But a clearer way of putting that may be that Facebook is making "who you really are" a moribund concept. Facebook, in fact, solves the need to be loved for who we really are by dispensing with it. The site lets users garner the far more secure experience of being loved for becoming the algorithmically desired object. If you view algorithms not as rankings but as instructions — not as judging the a priori self but as positing a self to be, one that can't be judged — perhaps you will be set free to become who you really are rather than express some version of it contrived in advance.

Facebook's ultimate goal may be to destroy the idea that you can have a self before using Facebook. Now that's domination.