None of This Is Written By Me

When I write something, it tends to disappear from my brain; once I've blogged an opinion, I often forget that it's my opinion, and when people sometimes refer to something I've vigorously claimed, it can seem strange to recall that I once felt that strongly about it. "My thinking has evolved on this question" is a dodge, but there's also something to it. Sometimes I realize how I said it wrong, and had to say it to come to that realization. Sometimes I struggle to recognize things I wrote years ago, and if I start something and don't finish it, it can feel like trying to raise a corpse from the grave armed with a kite and a key and a bolt of lightning. And sometimes I read something I wrote, in the past, and I'm struck by how clever this writer is; if only I could write something like that!

I am a great starter of things, and my follow-through sucks. I start five essays for every one I actually finish. There's probably a reason I didn't finish these things. It might have been a good reason. Probably it was just that I got distracted.

Below, I have run together a whole stack of things I've written and didn't finish--more or less culled at random--and I comprehensively disavow all of it. Some of it is recent-ish, some of it is old. Some of it is great. Some of it is garbage. The person who wrote these words has moved on; he's a relative of mine, and I like how he thinks, but I'm not him, and I couldn't finish what he started. I won't try; I haven't even re-read most of this. But information wants to be free, and so do unfinished draft blog posts. Now they are.

* * *

Drones make this chain of command transparent, in fact; less accurate bombers could more easily make “fog of war” their alibi, pretending they hadn’t meant to hit the things they hadn’t aimed at. But drones are just much more accurate, much less vulnerable, and much cheaper than conventional bombers, right? Since they only do what a cruise missile does, in fact, in what sense are UAVs anything new? What is new is not the ability to kill people from far away with no personal risk to the operator; what’s new is our country’s preference to do so, a lot, as a generalized policy.

Being upset at “drones,” then, can make us forget that it’s a specific president—his name is Obama—who has sent those drones out there to kill people, and that he takes personal responsibility for signing off on what happens. “Drones don’t kill people,” we might say, if we were the National Rifle Association; “President Obama kills people.” We might insist, using the same impeccable logic of personal responsibility, that the availability of a cheap and easy technology does not absolve its user of the deaths they use it to create.

It would be easy to call William Otter a sadistic sociopath, since his idea of fun mostly seems to be cruel jokes at the expense of the weak or vulnerable, often with real physical injury involved. It would be just as easy to call him a bigot, since he makes women and ethnic minorities his particular targets often enough; but if you’ve participated in mob violence against a “house of worship for Africans” (as he has) and you begin that story by saying “Having always a propensity for fun, an opportunity presented itself to give loose to, and gratify it” (as he does), we’re a little past the point where those kinds of names are really very useful.

Still, it’s worth noting that he’s in it for the fun, and that he doesn’t show any particular or personal antipathy towards the black people or Irish that he targets (and he targets quite a few white men, too). His “sprees” are about power, about the fact that he can make people hurt or confused, that he can laugh, and that he can win. The best spree is one where only he knows what happened, where he not only has the last laugh, but where he also has the whole story (and can tell people about it afterwards, in a tavern or in his book). In this sense, calling him a sociopath is exactly wrong: he cares, deeply cares, what people think about him. That’s the whole point of all of it. He wrote a book-length autobiography consisting almost totally of pointlessly cruel and violent tricks and pranks because he thinks it’ll make him popular.

Most pertinently, public universities are, today, effectively defined by the permanent crisis of crippling public funding cuts, the gradual but continuous removal of the funding source that created and defines them. This is an existential crisis. A public university without public funding is fundamentally untenable, and can only survive by shifting itself towards other sources of funding—student tuition, partnership with industry, and philanthropic and alumni donations—but changing its funding changes its fundamental logic: a public university is accountable to the public that funds it, but a “public” university without public funding will be accountable to its customers (students), its clients (industry), or its shareholders (alumni and donors).

Occupy is dead, as you know. I mean, look out your window. Are there camps? Are there marchers? Obviously not. The anarchists and utopians and wayward children have tired of their little game and have gone home to their iPads. They’ve had their fun, but it couldn’t last, couldn’t survive the problems which more sober analysts were pointing out from the very beginning. If only they had listened! In retrospect it was so clear. They should have built bridges with the Tea Party. They should have articulated a clearer message, a strict organizational hierarchy. They should have banned anything but symbolic politics; civil disobedience scares people away, after all. They needed to focus on elections and political process. Mitt Romney! They should have kicked out the homeless. They shouldn’t have put up tents in the first place. They should have been smart, and done something completely different from the very beginning. They shouldn’t have existed, and now they don’t.

The obituary for Occupy is a wish fulfillment fantasy, the expression of a desire which was present from the beginning — for Occupy to be other than Occupy — but which could only be articulated, as such, once the camps were gone. When the camps were there, when there were people in the streets to be seen, talked to, thought about, ignored, or joined, a different kind of “realism” was unavoidable, for those whose dearest wish is to be realistic, the journalists, pundits, and commentators who find that voting for Obama is always the only realistic option for positive change in the country. When Occupy existed, when it was there, this “realism” was their Achilles heel: one could not argue that a movement did not exist when it actually did exist. You had to wait until they had been destroyed before you could go back to denying their possibility.

After all, one could not argue that there was no alternative to capitalism when, obviously and undeniably, an alternative to capitalism was being presented in the streets, for good or for ill. When the camps were there, they were a material fact that had to be contended with. In this sense, they weren’t simply a protest against the status quo; they were a revolutionary crisis in the status quo, materially present as such.

Much abused though the words “revolution” and “crisis” are, I mean something quite simple: the presence of an active “no” that would have to be actively suppressed. I don’t mean that Occupy was ever in a position to actually overthrow the established order, but counterfactuals are also not our friend here; we know what did happen, and that’s the important thing. But one needn’t have achieved a revolution to be worthy of the adjective. Once there were many thousands of people in occupations, all across the country all saying “no” simultaneously — and working on formulating something to which they could say “yes” — a crisis existed, in the most literally etymological sense of the word. We were at a turning point, a contradiction, from which only one of two possible outcomes could emerge. Either the occupiers would overturn the status quo or they would be suppressed into nonexistence. The fact that it was the latter doesn’t make that choice any less necessary, or the moment any less revolutionary in the necessity of the choice. If they hadn’t been suppressed into nonexistence, they would have remained. One or the other.

To come to terms with how it was the latter — how police were used to destroy every last Occupy camp in the country — is important, and I hope we’ve all learned something from the experience, both about what is true and what is becoming even more true. Early reports that there was coordination between the federal government and local police were roundly debunked by realistic journalists, for whom there was nothing to see here, it can’t happen here. But in retrospect, there obviously was substantial federal, state, and local coordination. I know a lot more than I used to know about how theoretically public space is, in practice, heavily regulated in terms of the desirable quality of people who can use it. I know more about the police than I did, and I’m not the only one. If there is one thing the Occupy sequence of events may have truly accomplished, it’s that a vast swath of Americans have seen both what the police already are and what they threaten to increasingly become: violent, arbitrary, racist, unpredictable, and political.

In a world where so few people get what they deserve—by any standard short of “the ways of God are mysterious”—why do we talk so much about what we deserve? I am bothered by it. This is not to say that I have a studied critique, exactly, or a fully articulated thesis that I will defend to the death; I am trying to understand something that I don’t fully understand.

“We,” by the way, just means “me” and those for whom particular things that are true about me are also true. That’s a group that includes some people, but also does not include other people.
But the ease with which we say things like “Everyone deserves a job” or “He got what he deserves” seems strange, when I consider what else I believe, or, rather, what else I don’t: I am an atheist-by-default who believes that capitalism is a really terrible way to organize every part of human existence, and I also think “America” is a vicious and excruciating narrative with which to name, understand, and enact ourselves. Basically, I can’t think that God’s invisible hand orders life according to His divine plan, I don’t think The Market is at all rational or humane, and I don’t believe that dividing the world into “us” and “them” gives any kind of order to the problem of how to be a person in the world, or at least not any order I particularly like. And let’s not forget: none of us are getting out of this alive. At a certain point, none of this matters, and that not-mattering is, in fact, the only thing that is certain.

These are all things that I believe, but you know what? None of it matters much for what I do. I try to Accomplish, not because I believe that accomplishments will be rewarded, but because it seems like the right horse to place my bet on, because it seems like a better bet than playing the lottery, metaphorically or literally. Or perhaps it’s because I’m afraid to do anything else. I try to be nice to people, in general, not because I believe all the stuff I internalized as a young Christian but, rather, because I internalized it; that’s the story I like to tell myself about how I behave. And what I believe about the United States—or what I say about it—just couldn’t be more irrelevant to the fact that the things that I am, do, and think have everything to do with how I became the person I am, a process which takes as its adjective “American.”

If Ben Ali had not fled Tunisia when he did — if he had instead made the decision which other authoritarian Middle East and North African rulers have made, and doubled down on harsh repression and violently suppressed popular resistance — would the Egyptian revolution have even gotten off the ground? The first “Day of Rage” in Egypt was January 25th; would the world-historic crowds that flooded into Tahrir Square have had the faith, the fury, and the sense of inevitability if Ben Ali’s flight from Tunisia on January 14th hadn’t given them the Tunisian example to emulate? Mass protests in Yemen and Syria, likewise, only began after the Tunisians had unseated their despot, and in Libya and Bahrain after the Egyptians had unseated theirs. What if Ben Ali had held tight? What if Mubarak had refused to relent (or never been forced to decide)? Would the “Arab Spring” have even happened? We might, even now, be speaking about the Tunisian revolution in the same way as we talk about the stillborn “Green Revolution” in Iran, a moment of inspiring popular uprising followed by depressing authoritarian violence, and a continuation of the status quo.

I don’t mean to deny the fact that the underlying causes of the “Arab Spring” were deep and broad; as many have observed, x and y and z, and the sclerotic, aging regimes of Mubarak and others were surprisingly inept in combating… But the inevitability of Ben Ali or Mubarak’s downfall is only the flipside of the impossibility of unseating them, an argument which filters out the chance and contingency which

Why would a person risk his life for a very small amount of money? Why would he mine platinum for $500 a month? Why would he charge a line of police officers with

A confrontation like this one can teach us a lot about the point where violence and contracts merge and become indistinguishable. They’re supposed to be separate. The state is meant to have a monopoly on violence, to employ violence only in regulating the Social Contract itself and in protecting you from all forms of violence: if you break that Social Contract—which is to say, if you commit a crime—the state has not just the right but the duty and responsibility to use violence to bring you to heel, something it does for the safety of everyone else.

The idea of the “contract,” then, becomes a liberal fetish object, containing within itself the code that defines where and when and how violence can be used. If you make a contract, and you obey it, you will be free from violence; if you break the contract, you will be violated.

If you cannot be paid, you must strike.

If you cannot strike, you must disperse.

If you cannot disperse, you must be shot.

If you are shot, you will die.

In retrospect, it’s possible to say that if Barack Obama didn’t exist, we would have had to invent him. Which is not to say that we didn’t—of course we did!—but it’s also worth remembering the existence of that ambitious kid who was stuck with the middle name “Hussein,” of all the fucking things; the guy who went by “Barry,” after all, actually does exist, but he does so in a way that’s distinct from the vast corporate enterprise that goes by the name “President Obama,” the massive ideological and partisan project that takes this person as its face. After George W. Bush, the USA’s brand was in the toilet, and that’s no way to run a hegemony.

After all, could there be anything more irrelevant than Barack Obama’s birth certificate? Even the birthers couldn’t possibly care less about what some piece of paper says. Could there be anything less necessary than the question of what Obama “really” thinks, or who he “really” is (or what he would do if he could)? The idea that the king is being misled by his counselors is as old as the desire to believe that the status quo isn’t fucked up and bullshit. Finally, and to raise the stakes a bit, could it possibly matter that Barack Hussein Obama didn’t actually tweet “No Spoilers please”? Of course not. Of course not.

Before I introduce the readers, I just want to say a few words about Professor Kofi Awoonor. Few African writers—or writers of any kind—receive the kinds of eulogies and attention that Awoonor did, on the occasion of his passing. The siege of the Westgate mall in Nairobi was already in worldwide headlines when it was announced that he had been one of the victims, though it took the press about a day to figure out who he was and what to say about him. After that, the death of a great African writer quickly became part of the story. And most of the American obituaries and news reports frame the event of his death in those terms. They emphasize the senselessness of his death, the tragic stupidity of it, since he wasn’t a target, but just happened to be in the wrong place at the wrong time. And a big part of the loss was that his passing seemed so meaningless, so random.

It’s worth noting, then, that one of Kofi Awoonor’s most consistent themes—across all his writing—was exactly that, the random senselessness of death. His mother was a traditional funeral singer, and in the early 1970’s, he published this book of translations of oral poetry from his own ethnic tradition, the Ewe people of Ghana, and the funeral dirges are arresting, unsettling, perhaps all the more so for being translated into English. If there’s a beauty to them, it’s a rough, jagged, alienating beauty; it’s a poetry filled with confusion, terror, and unwilling incomprehension.

For example, this is from a dirge he recorded his aunt singing:

No one knows the day
A man dies; the death that killed Denu
No one recognizes it.
Now the hour has struck; it is time
Let us be ready for the warrior
If anyone has a task, let him perform it now.
Death has refused to appoint a day.
Men’s children perish in life.

They are not hymns of comfort, or the reassurance that death is just a part of life; there isn’t a passage from lamentation to consolation or acceptance. The untimely death is not an anomaly; in these dirges, all death is untimely.

In fact, when the singer tells us to “be ready for the warrior,” Awoonor notes that the Ewe word used means “a predatory slave-raider,” and in this way, death is also the middle passage, as terrifying and incomprehensible as the slave trade itself, which tore so many millions of Africans out of their homes and away from the world in which they lived. This would be an increasingly important theme in his work, when he lived in New York and Texas, and especially when he served as Ghana’s ambassador to Cuba and Brazil. One might crudely say that if he used the middle passage as a way to think about death, the African diaspora became a way of imagining re-birth and futurity.

Especially through translation; he spoke Spanish and Portuguese, and

At the same time, in another poem, called “The Journey beyond,” he appeals to “Kutsiami the benevolent boatman”:

when I come to the river shore
please ferry me across
I do not have tied in my cloth the
price of your stewardship.

“Kutsiami” means “death-translator,” and

When someone like Kofi Awoonor passes on, especially in the way that he did, it is hard not to resort to the cliche that his death was “senseless.” He was not targeted in particular; he just happened to be in the wrong place at the wrong time. It seems so wrong that he could die in that way, so stupid and unnecessary; it was very hard to believe that it had happened. He spent a lifetime making sense of death.

If you want to understand why American football is what it is, you could do worse than look at the turn-of-the-century moment when Rugby, Soccer, and (American) Football parted ways. Football is violent, today, because it was created as a means of managing, utilizing, and preserving violence. Football was never supposed to become chess

We don’t let minors drink alcohol, and for good reason: if they did, they might hurt themselves. And then, having allowed them to drink, it would be our fault, as a society. Should attractive women be allowed to drink? If we don’t ask this question

Take up the white man’s burden. There is one thing white men can do better than anyone else: take abuse. Unrealized potential, of course

The Cylons are not in the least scary; what’s scary is that Space Daddy seems to have been emasculated, that our parents are breaking up. The scene where Adama’s children—Kara and ??—reject this interloper who wants to split their family

The show’s strength is

But the show forgives the Cylons, much too quickly. The Cylons aren’t just evil; the Cylons are genocidal monsters. The show will eventually work out a kind of rationalization for the Cylons doing what they did, but only post facto: in the present tense of most of the show, our characters have just escaped from the kind of disaster so biblically tremendous and evil that it needs a word like Nakba or Shoah to describe it. The Cylons are worse than Nazis, in the sense that the writers have worked hard to make them

At the same time, the Cylons are complex, sympathetic, even pathetic, in both the normal sense and in the sense of evoking pathos. They are human. The realization that Boomer is actually a Cylon is presented in terms that can only produce empathy: she is tortured, afraid, unwilling. She is a Cylon, but the fact that she is a Cylon is as unwelcome to her as it is to us.

When I re-watched the miniseries, I was overwhelmed with the horror of it.

But by season two, by the time you get to the Pegasus sequence, the show is starting to lose its grip on that horror. So the show starts to create new storylines to remind us—the episode where Tyrol builds the blackbird, for example, and the sociopathic murderousness of the Pegasus—but each of these efforts backfires.

Naming the blackbird “Laura” only reminds us that she’s dying in the present, a death we will care so much more about than all the billions who perished in the Colonies.

The show relies heavily on the trope of “if X, then we really are no different than the Cylons,” but the flipside of that is at least as true: the Cylons are no different than us.

My dog has a rather simple philosophy of consumption: smell everything, eat anything, and if it’s bad, barf it up. When her stomach starts to trouble her—when she’s eaten something too gross even for her—she eats enough grass to induce vomiting, and the problem is solved.

This is, I think, the basic post-Enlightenment Protestant free speech ethic that the West uses to think about things like information consumption. To be simplistic, Catholicism came to be identified with the suppression of speech, or vice versa, and the secularizing trend of the Protestant Reformation (which seems like a contradiction, but isn’t) made “read now, ask questions later” into a virtue. The Papists are tyrants who want to keep a monopoly on Truth, they said, But I’ve got some literature that I’d love for you to take a look at. Also: the invention of the printing press, mass readership, public culture, etc.

If this is simplistic, it’s not untrue. The idea that it’s necessary to read a book before you decide whether it’s good is an ethical principle that The West holds very dear. How can you criticize a book without having read it? “Don’t judge a book by its cover” is more than just a cliche. It’s a moral absolute.

We judge books by their covers all the time, though.

The best argument against taxpayer-funded public higher education is that it is not designed to benefit the public at large but, instead, to enrich the already wealthy at the expense of the tax-paying public. This is a much stronger argument than most academics like to admit, and our refusal to grapple with it dooms us to lose the debate. Academics, by and large, are not very interested in the one thing that could conceivably save them.

I am tired of hearing professors deliver lectures on how to make the case for the humanities, for example, or write op-eds on the necessity to better explain why we are necessary. The problem is not rhetorical; if eloquence could save us, then it would have. These books have been written; they just haven’t been what we needed, and we’re the only ones who read them anyway. Moreover, for every theory-laden elitist who alienates the hoi polloi by using jargon and French, there are fifty academics who do not remotely match such caricatures of the ivory tower intellectual. To imagine that our problem really is the fact that we are beyond saving—that we really are nothing but a guild of alien beings with contempt for the poor—just demonstrates how comprehensively we have swallowed the propaganda of the conservative backlash against us: no one has a lower opinion of “academics” as a category than academics themselves.

At the same time, it’s a form of self-flattery to imagine that we can make the case for ourselves through rhetoric. This is what we are good at, we think, and the people will listen if we can just find the words, if we can only make them see. And this deep incoherence speaks volumes about how it is that we think about who we are; alongside the internalized anti-intellectualism that academics use to rail against academia, we all like to pretend that none of that applies to us. Only they have ideology; I’m different and see things as they are. Sure, the postmodern Marxist Lacanian theory-monkeys over there might be everything the conservatives say they are, but I can speak with the restrained down-to-earth eloquence of the public intellectual; I understand what’s at stake and how to address it.

In one sense, I obviously share this aspiration, and as an aspiration, it’s a good one. But such an approach often presumes that we are basically right, and that the problem is that “the people” simply don’t understand. We just need to find the right way to talk to them, we reassure ourselves; if we can only find the right words, they’ll understand. On this, I disagree. I think “the people” understand something about our institutions that we don’t want to think about, and it’s this: the taxpayers who fund our public universities are seeing very little return on those dollars. They may or may not know what the primary economic function of a public university actually is, of course. But when they vote against funding those universities—or when they look on apathetically as public funding is cut, year after year—it’s because they recognize, quite rightly, that they are not losing all that much by seeing them go. They weren’t getting very much for their tax-dollars.

The reason is simple: universities work to the benefit of capitalists, first and foremost, and only secondarily does that value trickle down to everyone else. This has long been the case, but it’s becoming much more so, in part because educating students is no longer as central to capitalism’s needs; it wasn’t that long ago that businesses lobbied the states to build more universities and educate more citizens. They needed an educated labor force, and only government could provide that.

Back when “modernization” meant moving towards a goal of mass popular consumption, one of the most popular ways to think about economic development was W. W. Rostow’s five stages model, which describes the difference between traditional and modern as, essentially, an airplane taking off:

The specific details are probably less important than the basic structure of thought, in which a traditional economy is stuck on the ground until it can build enough momentum to slip the surly bonds and climb towards “maturity.” Rostow even figured out when each national economy was supposed to have taken off; in Britain, it was the industrial revolution, in the US, it was the market revolution of the 1840’s and 1850’s, and in China and India, it was the 1950’s or thereabouts.

Subtitling his book “A Non-Communist Manifesto,” Walt Whitman Rostow was trying to theorize exactly how it was that communism wasn’t responsible for the Soviet Union’s incredible rates of growth, and how only the particular kinds of economic policies adopted in the capitalist West could lead us into the future. Only western-style economic development could get the Third World off the ground. And what that would mean, it turned out, was a mass consumption society: unlike the Soviet second world—with its bread lines, shortages, and black market—the West would bring traditional societies into a future of plenty, of satiated desire for commodities.

Today, by contrast, it is universally acknowledged that “consumption” is the problem. Without the need to prove to the world that capitalism was the only road to prosperity—indeed, without the sense that there is another way—consumers have become a rather questionable kind of person, the source of all our woes: leftists decry consumerism and greed, while the right now divides the nation into virtuous “makers” and rapacious “takers,” the good white people who work for the love of working, or something, versus the women and racial minorities who “want stuff” and “feel they are entitled to things.” That’s not a false equivalence, because the right-wing narrative is racist horseshit, and the financial system actually is based on greed. But beyond left or right, we find something else, a deep ideological discomfort that forms the foundation of austerity thinking: bad people get stuff while good people go virtuously without.

This is the only way it makes sense when Governor Jerry Brown resorts to the Old Testament and Zen Buddhism—two tastes that always go well together—to explain why the best that California can now hope for is a “sustainable glide model” for the future. We are still in a flight metaphor, but it’s no longer directed towards high levels of consumption. Instead, our engine is dead, we’re losing altitude, and if we don’t cut enough dead weight soon enough, we’ll crash. We need to tighten our belts, trim the fat, and bring it in for a landing.

The backdrop, of course, is that California’s state budget process is broken: to give you the short version of the long story, the state’s tax base has been crippled ever since the Tax Revolt of the 1970’s resulted in Proposition 13, which lowered property taxes and requires a supermajority in the legislature to raise any kind of tax at all. The state has almost never been able to raise the kind of revenue that it had always relied on having, but not only because of Proposition 13 itself (though California’s schools do steadily deteriorate as the amount of revenue that can be taken from property taxes is kept artificially low). The problem is a whole way of thinking about revenues that came out of Orange County in that era, in which raising taxes became as politically impossible as it was legally constrained. This was new. Even when Ronald Reagan was governor of California, after all, he had raised taxes—one of the first things he did, after explicitly campaigning on fiscal responsibility—but the 1980’s and 1990’s would be defined by a fiscal logic in which taxes could only ever go in one direction: down.

And so they did! During the good economic times of the 1990’s and early 2000’s, a series of tax cuts eroded what was left of the state’s ability to fund its schools, programs, and infrastructure, and the first thing Arnold Schwarzenegger did on taking office was repeal the state’s Vehicle License Fee (VLF). That tender mercy spared every car owner $150 per year—though it was more about saving millions for corporations with vehicle fleets, especially rental car companies—but the amount of revenue lost to the state would have paid for almost all of the draconian cuts to vital public services that we’ve seen ever since. As Robert Cruickshank summed up the situation four years ago:

“In the flush years of the 1990s, California enacted tax cuts that now cost us some $12 billion a year. Spending is rising, but that's what happens when you have both a growing and aging population, and that increase in spending is on core programs - health care and education - not on frivolities. Half of this tax cost - $6.1 billion - is due to the VLF alone. We are going to destroy public education just so drivers can save $150 a year.”

Two days ago, I woke up to a California with a Democratic super-majority in both houses of the state legislature and a Democratic governor. What’s more, that Democratic governor has a mandate to raise taxes to fund education, since the state’s voters passed Proposition 30, comfortably, if not overwhelmingly. And so, if you are of the opinion that the problem with California’s budget is that it cannot raise taxes, then you might look at this situation and conclude that the Democratic party is now in a position to do what it wants to fix the situation. This is more or less what Jerry Brown was asked at a press conference. His response?

Brown said he will be guided by a biblical reference to seven years of plenty being followed by seven years of famine, and to the need in better times to save crops for less abundant years.

"We need the prudence of Joseph going forward over the next seven years, and I intend to make sure that that's the story that we look to for our guidance," Brown said. Brown is likely to face pressure from liberal allies -- if not Democratic lawmakers -- to increase spending. He suggested today that he will resist such efforts.

"We're cutting, and now we have more revenue, but that revenue will be used prudently and judiciously," he said. "And I don't underestimate the struggle over the next couple years to keep on a very calm, clear and sustainable glide-path."

The speed with which Jerry Brown has pivoted towards “years of plenty” tells us something fascinating: a week ago, anyone with a brain would have said that California is in the middle of a revenue crisis, that we are cutting vital parts of the social infrastructure—that people will actually, literally die from cuts to health care—because there isn’t enough money coming in. We aren’t trimming fat; we are losing muscle mass, eating our own vital organs, because we can’t get enough food. But now that the state is in a position to raise taxes, say, by removing some of the tax breaks given to big businesses in the last two decades, it’s suddenly very important that we store up food for future famines.

By the second sentence of “The Adventures of Louisa Baker” (1814, renamed “The Female Marine” in 1815), we realize that we’re on familiar, even clichéd, territory. We are in the world of the sentimental seduction novel:

“At the age of sixteen, by the deceptive arts of one whom I could not think capable of a base action, I was shamefully robbed of that which is rightly esteemed of inestimable value to my sex; of that, which, although it did not enrich the monster, made me poor indeed!”

Perhaps we can tell how familiar (even clichéd) this genre convention had become by 1815 by how quickly Louisa puts it in the rear-view mirror; this is page-one stuff, to be gotten through as quickly as possible. The author of Louisa Baker’s adventures is simply not that interested in either virtue or its rewards: this is not an event in the plot, but simple exposition, the thing we have to get out of the way before the real novel can begin.

By contrast, Samuel Richardson lingered for hundreds (even thousands) of pages on the trials and tribulations of Pamela and Clarissa as they sought to protect (or failed to protect) their treasure, thereby hyper-fetishizing the virtue that would be (or should be) rewarded.

Pamela protects her virginity from Mr. B until he decides to marry her; Clarissa fails to protect hers from Lovelace—and must redeem herself in tragic death—but at least has the satisfaction of convincing Lovelace to join her in expiration and expatiation.
These novels drag on, and on, and on (and on and on), because suspense is key to pumping up the value of the withheld treasure: the more it is withheld, the greater our narrative desire for it becomes. But when the author of The Female Marine disposes of the matter on page one, we have no choice but to locate the suspense—and narrative desire—somewhere else. And while elements of the old narrative linger on, the author of this book is interested in a different question. It is this: having lost her womanly virtue—what she calls “the only gem that could render me respectable in the eyes of the world”—could a woman become respectable by becoming a man? And the answer is: kind of!

The Female Marine is not a very good book, by conventional standards, but, then, I suppose it depends on what conventional standard you decide to test it by. Published in three sequential pamphlets, it’s not exactly a novel by modern standards (and those are the only standards for novels there are). It’s definitely fictional—most likely written by a hack writer in the employ of a Boston printer named Nathaniel Coverly—even if it’s presented as fact (and fictionalizes a real female marine). But “it” isn’t even a single book: the first pamphlet was released in 1814, and hewed fairly closely to the sentimental novel convention (albeit with its one major innovation). The sequel pamphlet almost completely jettisons these tropes (wholly abandoning itself to early republican cross-dressing fantasy), while the third pamphlet attempts to reform itself by marrying its protagonist off.

Basically, it went down this way: in the wake of the War of 1812, Nathaniel Coverly decided to cash in on post-war patriotism by selling an updated version of the story of Deborah Sampson Gannett, a young lady from Massachusetts who not only impersonated a man successfully enough to enlist as a soldier during the Revolutionary War, but served with enough honor that she actually managed (after a long campaign) to receive her discharge pay and a military pension. Gannett’s memoir, The Female Review, is not exactly a page-turner (nor was it even written by her), but it’s a fascinating document in its own curious way; a combination of Joan of Arc patriotism and a desire for adventure sends Gannett on an endeavor that she won’t quite recommend to her readers for emulation (and probably can’t), but which is also quite far from being something she can regret. After all, she fought for her country! And then settled down and raised a family, too. So, you know, no harm done.

Men writing for women pretending to be men; so Shakespearean!

Louisa Baker is not, however, Joan of Arc, neither a patriot virgin nor a “maid of heaven.” She falls on the first page, in the conventional way, and into the usual (and quasi-biblical) trap. The problem, as always, is this: how can her innocence protect her from the sin of which she is innocent? As with Eve, the point is exactly that it cannot; she cannot know to defend herself against a sin of which she is innocent, cannot anticipate the very thing of which she has no knowledge, until it is too late. And so, the “solemn declarations” of her “false friend” sway her mind; since she has no conception of anything but love—and her failing was that she loved too much—she is defenseless against deception, and thus is lost.

So… what now? This is the delicately posed problem of page two. What happens to an innocent after she has lost her innocence? What does she do? Where does she go?

See, for example, Hannah Webster Foster’s The Coquette, or William Brown’s The Power of Sympathy; in both cases, death is the narrative answer to out-of-wedlock pregnancy.
In the annals of the sentimental literature of the day, there are a few options: there’s death, there’s prostitution (and then death), and there’s, um… no, actually, that’s pretty much it.

But instead of committing suicide, Louisa goes to Boston—the “Big City”—on the theory that she can have the baby there and then somehow return, having thereby spared her faultless parents the embarrassment of a pregnant daughter. It isn’t clear what she plans to do with the kid (if she has a plan), and in the event, the problem will solve itself: “happily for it and its wretched mother [the baby] did not survive its birth but a few minutes.” This is the sort of facile plotting that makes it not a great novel; throughout, the hand of the author will be quite heavy in guiding the narrative conveniently to wherever it’s supposed to go. But put that aside; the point is that the loss of virtue is the entrée to the book’s narrative, the thing that forms the foundation of its narrative possibility. It is not, in other words, the end of the world; in the new world, in fact, one might observe that it’s the beginning of the story. There is the possibility of redemption.

And she finds it.

For example, the kind of utilitarianism which understands the world through a matrix of pros and cons is incapable of explaining why the Irish shouldn’t butcher their babies for pleasure and profit; the cutting edge of Swift’s satire, after all, is that it makes sense in its own terms. The pros outweigh the cons. If you concede that economic rationality is the way to evaluate whether a thing is worth doing or not, then it’s clear that cannibalism is an effective and sensible way of solving the problem of starving babies and their parents: if they’re going to die anyway, why not use them to feed the survivors, who will be more numerous as a result?

The reason why not, of course, is that we don’t eat human beings. We just don’t. Not ever, nuh uh, never ever, not EVER. Never.

Of course, sometimes people do eat other people, but that’s not the point. My point is that one doesn’t compare the pros and cons of cannibalism to other ways of alleviating poverty. Market logic doesn’t outweigh the fact that people eating people is something we NEVER do. Never.

Now, you might be saying that mass surveillance is not “as bad” as cannibalism. But even there, you’ve given the game away the moment you start to establish terms of possible equivalence; if cannibalism is X bad, then maybe mass surveillance is X/100 bad. Which is to say, once you’ve established harm as a commodity, you’ve made it possible, even necessary, to buy it off. The same is true of Irish cannibalism. If a million people are going to starve to death, but institutionalizing mass baby butchery will allow half of that number to live, how can you argue against it? Are you saying 500,000 Irish people need to die because you can’t make the hard choices?

Of course, even this is a dodge. The “hard choices” are quite easy: make the weak suffer so the strong can thrive, always the path of least resistance. The “hard choice” is never raising taxes on the rich; the “hard choice” is always cutting social spending on the poor, the non-white. There is no easier choice for a politician than pandering to the base. But even this example serves my point: the need to get re-elected trumps the need to balance the budget, or whatever austerity clusterfuckery my imaginary politician is engaging in. Economic rationality gets trumped by a different kind of consideration.