Stale Blog
January 2004-July 2004
July 28, 2004
Two Essays on Massively-Multiplayer Games
1) The first essay I post here, as a PDF, is a paper I presented at a conference in Bristol in the summer of 2001 (See Ben Talbot's report on that conference for more). I should have circulated it widely then, but I didn't: first because I had hoped it would be part of an anthology I was trying to assemble, which eventually collapsed because the other contributors had other obligations or couldn't commit after all, and second because I sent it in to a fairly well-known cultural studies journal. I sat on it for a while after getting peer comments. Some of those comments were useful and valid enough, and others--as several colleagues and friends predicted--amounted to a kind of disciplinary gatekeeping from new media and CMC researchers who didn't think I had paid my dues in that field. (One of the reviewers went through a long song-and-dance about how cultural studies always should address questions of production, circulation and audience, and that my piece was too focused on only one aspect of the audience's consumption of the game text. Grandmother, here's how you suck eggs.) So I finally decided to just get the thing out of my closet, for whatever it's worth. Though I updated it a bit in early 2002, it's now badly out of date in a number of ways, most crucially because it only takes passing note of the work of Edward Castronova, who has pretty much changed everything known on this topic. (As has Julian Dibbell.) It's also fairly out of date in its summary of the state of affairs in Everquest, Asheron's Call and Ultima Online, and in the wider world of MMOGs. (There's been some good material at Waterthread lately on the current state of things, not to mention the usual fine content at Terra Nova.) But here it is: Rubicite Breastplate, Priced to Move Cheap: How Virtual Economies Become Real Simulations.
2) The second essay is a shorter think-piece about a different way to design massively-multiplayer online games, about how to move more decisively towards making virtual worlds. It's called The Narrative-Nudge Model for Massively-Multiplayer Games.
There's a third essay that's a sort of missing link between the two: an almost-completed, fairly lengthy review essay on the (infinitely many) problems of Star Wars: Galaxies, and how studying SWG made me change my mind completely about the conclusions in the paper I presented at Bristol. That will be appearing (I hope) in an online journal in the not-too-distant future.
As I Would Not Be A Slave, So I Would Not Be A Master
I have rarely paid much attention to the party conventions, but this year is different in every respect. I've been finding the gosh-wow stupidity of the television journalists about the presence of bloggers unintentionally hilarious--listening to Jeff Greenfield on CNN explain the exotic idea of a "link" as if he were trying to explain superstring theory, followed by some reporter minion of his practically wetting himself over the intricacies of some strange new-fangled thing called the "Web", was especially rich.
The more interesting thing to me was something that came out in Gore's mercifully brief speech and reverberated occasionally throughout the night, or what I saw of it. Even before 9/11, one of the things that really bothered me about Bush and his administration was their sick arrogance, their lack of respect for the thinness of their electoral margin and what that should have told them about their mandate or lack thereof. It bothered me before I even knew that it did, or why it did. It bothered me early and angers me now because that arrogance has dragged American society to a seriously dangerous juncture in its history.
I'm not talking about my usual opinion on Iraq and the response to 9/11. If you read this blog, you've heard that all before. That's reason enough to vote against Bush.
However, there's something deeper and wilder here, a fire that will more than burn the hands of kids playing with matches--and there's been a lot of playing with matches since November 2000.
The New York Times has lately been assuring us that ordinary Americans are not bitterly divided on partisan grounds, and in one sense, I believe it. Yes, I know that there are a great many issues on which there exists some degree of consensus, and probably many issues beyond that where there might be disagreement between Americans, but of a mild and unexciting sort. In another sense, I think the Times' polls are full of crap. Among the Americans who actually vote, who are attuned to political issues, there's a high-strung sense of tension and anxiety that I've never experienced in my lifetime. Maybe 1968 would compare: I was more concerned with my tricycle at that point, so I can't say in a meaningfully experiential way.
I used to say, around October 2000 or so, "Ok, so what if Bush wins? It won't be that bad. He'll do some things I don't like, but he'll be fairly constrained both by the size of his victory [which we could all see would be small if it came to pass] and by a prudential need to appease the political center." I was seeing Bush as his father's son, and his presidency as the mirror image of Bush the Senior's presidency.
This was staggeringly wrong. The second Bush presidency has been unprecedented in its ideological extremism and arrogance. I think that reflects very badly on Bush and his associates. If we want to talk about Bush's lies, let's start with his promise to govern with all Americans in mind, as a uniter and not a divider. That's his biggest lie of all, one that can't be qualified as an accidental error based on faulty intelligence or a modest distortion. There's no way to argue that Bush has governed with the intent to unite, to overcome partisan division. Al Gore called Bush on this lie last night, and rightfully so.
This is about more than Bush. One of the reasons I chide people on the left for not seeking dialogue and consensus, one of the reasons I am constantly looking for the presence of reason and the possibility of connection across the political spectrum, is that if we get ourselves into a situation where 51% of the voting population or a narrow majority of electoral votes is imposing a total and coordinated social and political agenda on the almost-as-large minority who has a radically different and equally coordinated social and political vision, we're staring at the threshold of a very scary future, regardless of who the 51% are or what they stand for.
In this respect, we have to see past George Bush and his poor leadership for a moment, and see the people who strongly stand behind him. It is they who really matter, their choice which will shape the next four years. It is to them that I make my most desperate, plaintive appeals, my eleventh-hour plea not to pull the trigger. To choose Bush is to choose to impose the starkest, most extreme formulation of the agenda that Bush has come to exemplify on a population of Americans to whom that agenda is repellent. To choose Bush is to choose Tocqueville's tyranny of the majority (or even, judging from the popular vote in 2000, tyranny of the almost-majority). To choose Bush now--not in 2000, when he plausibly could have been many things--is to aspire to tyranny, to ruling your neighbors and countrymen. That some on the left have had or even achieved similar aspirations from time to time doesn't change things: it's wrong whenever it is the driving vision of political engagement, whoever holds it.
I know that there are socially and culturally conservative Americans, many of them Christians, who already feel that they live in a Babylonian captivity, that they are already at the mercy of a secular culture. But the vigor of evangelical Christian culture in the past decade--the profusion of Christian books, movies, television shows, and so on--demonstrates to me that a secular, consumerist America is one where even nonsecular or dissenting Americans are free to make their own way, form their own communities, choose their own culture. A culturally conservative crusade led from the White House is not the same thing, not a mere flipping of the coin, a karmic reversal. An evangelical Christian can refuse to consume pornography, but if pornography is outlawed, then anyone who wishes to view it is a criminal. Feeling the need to avert one's eyes and being subject to criminal penalty are very different things. It's the difference between freedom and unfreedom, between the Bill of Rights and a series of wrongs.
If Kerry is elected, and imposes a kind of extremist political vision root and branch upon the Americans who oppose him that is comparable to what Bush has done (I don't see how he could, given that Congress is likely to be Republican in any event), then we'll know that there is no possible consensus for us all, that a kind of final struggle has been joined in which every American will end as either tyrant or slave. I choose to believe and hope and trust that we're not there yet. I choose to believe that we can have leaders who will not push us to that brink, and that we can have voters who also forbear to do so. If Bush is chosen, it may signal that there's no way out. I yet believe we can find the place where ordinary American decencies live, where most of us can go along to get along, where "don't tread on me" and "the City on the Hill" belong in the same neighborhood, are part of the same love of country, are equally part of the American Dream.
July 26, 2004
Brief update on the garden situation. The tomato thieves are deer. I caught them at it this morning, one walking around with a big tomato right between its teeth, and to my surprise, they're getting in not by jumping over, but by squeezing through a very thin gap in the fence at one point. So I've tied that gap off with wire and we'll see if that makes a difference.
July 26, 2004
The Limits to Generalism
I spent three days
at the 3rd International Conference
on Autonomous Agents and Multi-Agent Systems in New York last week.
I was a little disappointed, in some ways. I had hoped the meeting would be a bit more interdisciplinary, despite its strong connections to the Association for Computing Machinery. It was pretty much computer scientists all the way down. But that's where multi-agent and autonomous agent systems live intellectually. One should not be surprised that the sun is in the sky during the daytime.
The consequence for me was that I understood very little of what I saw and heard. Every once in a while, the light broke through the clouds, generally in papers that were very explicitly devoted to using multi-agent systems for social simulation, those more concerned with the conceptual design and application of their simulations and less concerned with the formalisms, protocols or algorithms underlying them. I was able to grasp one presentation on the simulation of social insects and pheromones (due to the intensely well-travelled nature of the example) and even to see that the presentation offered relatively little that was new on that topic. I really liked one presentation that proposed a formalism for generating irrational agents, or at least for nesting normal Bayesian game-theoretic rationalities one step away from the functioning of a multi-agent system. It seemed very innovative and intelligible, particularly given that I was struck by how utterly reliant the whole field has become on rational optimizing designs for agents. I was also struck by the extent to which the demand for application to commercial needs drove the vast majority of presentations.
At most other points, however, not only did I not understand anything, I barely understood what I'd have to understand in order to understand a presentation.
I repeatedly extoll the virtues of generalism, but it cannot do everything. The sinking feeling I had throughout the conference was knowing that to even get to the point where I grasped the substantive difference between different algorithms or formalisms proposed by many of the researchers at this conference, where I could meaningfully evaluate which were innovative and important, and which were less attractive, would take me years of basic study: study in mathematics, study in computer science, study in economics, areas where I've never been particularly gifted or competent at any point in my life. To get from understanding to actually doing or teaching would be years more from there, if ever.
The reverse movement often seems easier, from the sciences to the social sciences or humanities, and in truth, it is. There's an important asymmetry that I think is a big part of the social purpose of the humanities, that intellectual work in that domain returns, or should return, broadly comprehensible and communicative insights rather than highly technical ones, and thus, that the barriers to entry are lower.
The ease of that move is deceptive, however. It's the kind of thing that leads someone like Jared Diamond or other sociobiologically inclined thinkers, especially evolutionary psychologists, to what I call "ethnographic tourism". Operating out of a framework that requires the assumption of universalisms in order to make cogent hypotheses about human history and behavior, scholars coming along that path often quickly scoop up the studies and accounts which support the foundational assertion of a universal and ignore those which do not, or casually dismiss them as "biased" or "culturalist", regardless of the methodology those studies employ. That's what leads to their peculiar preference for the work of Napoleon Chagnon on the Yanomamö, for example. Bogus or wild-eyed controversies about immunizations and manipulation aside, there's at least reason from an utterly mainstream, meticulous, scrupulous and disinterested perspective to view some of his methodologies as debatable and to take seriously the work of other scholars who have made very different findings. There's a selectivity principle at work in ethnographic tourism that wouldn't be tolerated if it weren't scientists cherry-picking the material they like from anthropological scholarship and ignoring contradictory work.
That is not atypical of what can happen when scientists pressing towards generalism think they understand disciplines outside the natural sciences. Similarly, it's become easy to mock and ignore scholarship in the humanities for being too theoretical, fashionable, incoherent, and so on, which it very often is. Alan Sokal's hoax hit a real target, but if you want to think and write about problems like the nature of existence and knowledge, or about why and what a cultural work means to its audiences, sometimes you really are going to have to go into deep waters that require a complex conceptual framework. Some scientists tend to forget that on a series of crucial issues, skeptics in the humanities were closer to the truth for decades than scientists, most notably in the early debate between philosophers of mind, neuroscientists and computer scientists working on artificial intelligence about how easy it would be to create AI.
That debate is an important reminder, however, of what a kind of disciplined drift towards generalism can bring. The intensely fertile contemporary practice of cognitive science draws from all those areas and more besides. It almost seems to me that a good generalist ought to combine an overall curiosity and fluency in the generality of knowledge with a structured search of the possibility space of the intellectual neighborhoods which are just far enough away from their specializations to return novel possibilities and angles of attack but just close enough that those neighborhoods are potentially accessible with a reasonable amount of scholarly labor. To think about generalism in this way is to realize that different generalists are not going to end up in the same place. Their mutual engagements or conversations will have to happen in places of accidental overlap, because the concentric circles of one's own generalist competency differ, originating as they do out of different initial specializations.
Proximity to your own discipline and specialization can also be deceptive. I'm planning another version of my Primary Text Workshop course for academic year 05-06. I'd like it to involve the students in doing the preparatory work that would be required for making a virtual reality environment based on a historical community--the speech trees, the knowledge of clothing and other material culture, the architectural and geographical knowledge, the understanding of everyday life rhythms, and so on. I'd prefer it be about a city whose history I know very well--Johannesburg or Cape Town spring to mind--but access to primary materials will obviously be limited. On the other hand, late colonial Philadelphia seems an apt choice, but I find myself simply overwhelmed by the literature I'd have to read in between now and then in order to achieve a basic comfort level. It's not enough to have read Alan Taylor, Timothy Breen, Gordon Wood and so on about the colonial and revolutionary era--I'd need to go far deeper historiographically than that, and at that point, you begin to wonder whether it isn't just smarter to hand the class off to a colleague who already specializes in that era.
I've been thinking about how to calculate the wider bounds of generalism beyond the discipline. In my case, for example, some of the ideas associated with complex systems, emergence, autonomous agents and multi-agent systems and so on are close enough conceptually that I can make use of them and contribute insights to colleagues working in those areas, but they're just far enough away that I should not ever expect to do original work directly in computing applications myself. Sociobiology might be close enough that I could reasonably expect to offer some critical insights into its methods, but not close enough that I could expect to do my own original research into population genetics. Theoretical physics would be far enough away in every respect that I might not ever reasonably expect to understand it, let alone do it, given that much of it cannot even be translated from its mathematical conception into broadly communicative prose. At that point, you have to have enough faith in the entire system of knowledge production to just say, "I trust you to do what you do, and to do it how you do it"--and if it becomes imperative to do more, as it does in the case of tenure review, you just have to outsource the job of deciding whether another scholar's work is original or skilled to someone else, to have the humility to know where the final outer bound of a generalist intellect lies.
What Gus Here is Sayin'
Well, criticizing Michael Moore definitely seems to get a rise out of some people, judging from this Crooked Timber thread in which John Holbo springboards from some negative comments I made about Fahrenheit 9/11.
There are criticisms I feel free to disregard--the cry that attacking Moore is breaking ranks or failing to play for the home team. The political rap across the knuckles, the call for left solidarity, is one of the surest signs of intellectual weakness that I know of, and a major reason I have no interest any longer in whether I'm considered to be on the left or not. Equally dismissible is the reflexive, gut assumption that anyone who fails to genuflect to Moore must be a defender of the war on Iraq. Hardly, as anyone who reads this weblog knows very well.
A number of commentators protest what they see in my original comments or in John's argument as an equivalence between Moore and the Bush Administration, or between Moore and the most grotesque liars and rabid animals of the polemical right like Ann Coulter or Michael Savage. I agree there's an asymmetry. In the first instance, because the people who lead the country and the people who comment on that leadership are simply very different in the consequences of their views. There's no question that the intellectual dishonesty and closed-mindedness of the Bush Administration's key war planners is vastly worse, and of vastly more concern, than anything Michael Moore has to offer. And I don't see anything in Fahrenheit, for all that I dislike it, that compares to someone like Ann Coulter wishing that Timothy McVeigh had blown up the New York Times building. There are differences of proportion in either comparison, and Moore is hardly job one or even job one hundred on a very long and filthy list.
But what some CT commentators seem to me to be saying is this: Politics is a dirty, hard business, and we have to play dirty to win. They're saying, "don't come in here with your effete intellectualism, your Marquis-of-Queensberry rules, your naïve pomposity. Moore works, he's down with the people, he's telling it like the American people need to hear it."
This is precisely what I took up in my Cliopatria essay: is Moore effective, and effective at what? So I don't disagree with the CT commentators who say that you have to play politics to win, and that if Moore is effective, that's a countervailing virtue that outweighs any pedantry one might unload at him. What I think the CT commentators are actually revealing, however, is why the American left is on a persistent losing streak in the tough game of political struggle (not to mention a nasty little streak of intellectualized anti-intellectualism that is another classic kind of left-wing panic button).
They assume that fairness and intellectual discipline are somehow antithetical to the crafting of effective political argument and rhetoric, and they assume rather than demonstrate that Fahrenheit is positively influencing the constituencies whose mobilization against the Iraq War and the Bush Administration is useful or needed at this point.
Fairness and open-mindedness is a pretty crucial part of my own political and intellectual voice. That's first because I assume that it is a positive good, an ethical position, and to adopt an ethical mode of acting in the world is itself a political strategy. It is a commitment to the dispensation that one hopes to build. I assume, very deeply and I hope not unreasonably, that there would be enormous social good that would come to pass if the American public sphere were everywhere authentically marked by fairness, open-mindedness, and mutually agreed-upon standards for rational argument and use of meaningful evidence.
This, the critics would be right to say, is an insufficient reason to criticize anyone failing to reach that standard. By itself, it is a luxurious high-mindedness. However, fairness also works as politics in the operational sense. An operatic, performative commitment to decency, an over-the-top acknowledging of the legitimacy of potentially legitimate arguments, an attempt to reduce cheap shots, a showy restraint in saying only that which can be said based on strong evidence: these all function as powerful tools in political struggle within the American public sphere.
Who brought Joe McCarthy down in the end? Not somebody playing dirty, down in the same gutter with McCarthy, but someone who waited for their moment and caught McCarthy in a decency trap, who revealed the man's fundamental unfairness and viciousness in part by being scrupulously decent themselves. How did Archibald Cox defeat Richard Nixon? By walking the straight and narrow. Being decent and fair and meticulous isn't intellectual wankery: it's hardball.
It's especially important in the context of the metapolitics of weblogs as a subdomain of the public sphere. Crooked Timber's contributors regularly take other webloggers to task for the inconsistency of present arguments with past positions, or for their contradictory use of evidentiary standards. That kind of critique only has political influence, i.e., the capacity to alter the way that others think and act, inasmuch as it is a performative, demonstrated constraint on those who offer it. This is what I understand John Holbo to be talking about most centrally in his own comments. If you hold someone else accountable to standards that you do not maintain when you're talking in the public sphere about someone on your "home team", you've shot your wad, you've blown your credibility, you've lost political capital.
That's the league that Michael Moore is in: the public sphere, weblog and otherwise. Within that league, there are or ought to be rules. Playing by the rules earns you political capital--and if you have political capital, and spend it wisely, you're effective in influencing other players in the public sphere, even sometimes those who may pretend not to care about those rules. If you have none, you never get the chance.
All this might be, as some CT commentators suggest, purely academic or at least confined to a sparsely inhabited region of the public sphere where the air is thin if Fahrenheit were a boffo smash with those American audiences who have yet to commit to the struggle against the Bush Administration. Some CT commentators assume this rather than demonstrate it, presumably on the basis of the movie's impressive ticket sales to date. But by that same standard, one would have to assume that The Passion of The Christ converted huge numbers of previously secular Americans to Christianity. Ticket sales, even in the land of Mammon, can tell a thousand different sociological stories, and it takes more than that to know what a particular film, book or weblog is doing out there in the world. There's nothing harder than studying an audience's mindset. But at the least, we already know enough about where Fahrenheit is doing well to suspect that it is largely preaching to the converted.
My own intuition--just as thin evidentiarily as that provided by the usual working-class-heroes cheerleader squad--is that Moore's particular confabulation of conspiracy theory, left-wing writ, smarminess, and powerfully affecting and moving scenes of suppressed truths is only sporadically persuasive for those American constituencies which are potentially moveable in their views on the war or on George Bush, and may at times be actively counterproductive. Much of what irritates me about Fahrenheit is that it is often self-indulgent, unnecessary, superfluous, appealing mostly to the very intellectuals who then turn around and tell me that appealing to intellectuals is effete and ineffective. Though it might be aesthetically less satisfying and entertaining, something much more conventionally melodramatic or Ken-Burns-respectable might be more powerful by far, crucially because of a "performance of fairness". The curious thing that moves through at least some defenses of Fahrenheit is an assumption that Ma and Pa Kettle aren't gonna come out and see a documentary unless it has plenty of bread-and-circus pleasures, lots of yuks, unless it goes down smooth and easy. To me, that defense isn't just vaguely condescending, I would also suggest it's wrong. I think you could sell $100 million in tickets for a de-Mooreified Fahrenheit that had all of the heat, all the anger, all the revelation, but without all of the bullshit.
Some reply further at this point in the argument that the effectiveness of Fahrenheit is not measured in whether it changes any hearts and minds, but in mobilizing and energizing the left for the struggle ahead. First of all, come on: how much angrier and more mobilized can people on the American left possibly get without having an aneurysm? YEAH! YEAH! I'M SO ANGRY! GRRRR! GONNA TAKE BACK MY COUNTRY!! GRRR!!
More to the point, I can't think of anything less effective politically. Guess what happens to a boxer who gets wildly pissed off and starts taking huge swings at his opponent? He ends up tired and leaves himself wide open for jab after jab. Maybe he gets Buster-Douglas lucky once in a great while, but most of the time he ends up on the canvas.
July 15, 2004
The Swarthmore Tomatofield War
Along with travelling, I've been gardening. My faculty rental comes with access to a very nice, large garden plot that is some ways away from the house, on the verge of a large wooded area that descends to Crum Creek.
My first year of having a vegetable garden, in 2002, was the best. I had a fabulous yield of tomatoes, peppers, zucchini (way too much zucchini), pumpkins, tomatillos, sunflowers and herbs. The only thing that got absolutely annihilated was my corn, which some animal stripped bare just as the ears appeared.
My second year I gave up on corn and added string beans. These grew fabulously well and were hugely tasty. But this time my sunflowers were absolutely destroyed before they could even germinate--something systematically dug up all the seeds for two plantings (this happened again this year).
The 2003 tomatoes were also subject to heavy assault by unknown vermin. The zucchini died of some kind of rot that covered the leaves with a grey mold and then turned the stalks to mush. The herbs did really well, though, except for rosemary, which just doesn't seem to grow out here.
This year, I'm doing ok. I planted 18 tomato plants because the whole point of this garden, really, is to get me the tomatoes I can't buy anywhere, and get me lots of them. But something has assaulted them again--they're disappearing just as the first streak of red appears. So I've taken to picking them when they're yellow and letting them ripen inside. The string beans sucked this time, but I think that's mostly the seeds I planted--not as good as the heirlooms I planted the first year. The zucchini is rotting again. The pumpkins died quickly for some reason. The herbs are terrific as always. The tomatillos are growing, though, like last year, they have been slow to flower. Carrots, to my surprise, are flourishing--I tried the previous two summers and couldn't get any to germinate. Peppers are doing well (poblanos, jalapenos, serranos, Thai bird peppers). Herbs are self-sustainingly great now. Eggplants are limping along--I've never had much success with those, either.
It's the tomatoes that are on my mind all the time now. They are what I want and crave. I managed to get one off and it ripened and I just put a bit of salt on it and devoured it. Nothing like it in the stores, not even the fancy-schmancy ones.
Here's what I do to protect the garden: a 5-foot wire fence that I set into a one-foot-deep trench to prevent digging under. Bobcat and coyote urine in the corners of the garden and soaked into cotton tags hanging from the tomato cages. An egg white-capsaicin-vinegar repellent mix sprayed on the tomatoes themselves. And they're still disappearing. This year, they're disappearing outright--I'm not finding the half-eaten corpses I've found in the previous two years. And they're disappearing well before they ripen.
So I'm working on hypotheses about what's doing it.
1) Chipmunks and squirrels. I've seen both of them raiding the tomatoes in the past; chipmunks were clearly the guilty parties last year. But they usually leave half-chewed tomato corpses and they usually only want semi-ripe ones.
2) Woodchuck. I don't think the local ones can get inside my garden as it stands, and I've never caught them going for tomatoes anyway. I suspect them instead for other assaults in past years, including the Great Corn Massacre of summer 2002.
3) Rabbit. So why aren't they eating the carrots, which are almost ready to be picked? Maybe they don't know what they are. How are they reaching tomatoes that are two feet off the ground? But the young ones can unquestionably get in through the fence--I've spotted babies and juveniles in the garden before.
4) Deer. I thought I'd really made it so they couldn't jump over, but there's plenty of tales of deer jumping 5 feet, so maybe. No tracks, though. Do they eat tomatoes? Not sure, but I've seen other things that make me think they've been in the garden (plants that look trampled).
5) Raccoons. We got 'em, they're clever, and they could easily carry away tomatoes if they can get in. But damned if I know how they might climb the fence--I wouldn't think it would support their weight, and there's nothing dug under it anywhere. They might be pushing in past my improvised gate but I doubt it--it always is tight whenever I come out to the garden myself, with no signs of disturbance.
6) People. I'm afraid this is my current working hypothesis. The garden is a long ways from the house and people can come and go in it without being observed from any house at all. No footprints, though, even with the recent rain. I have had thefts from the garden before, though--several ripe pumpkins disappeared in September 2002 just as they were ready to pick, for example.
I don't think there is much left I can do to keep varmints of all kinds out, though. Maybe lock the gate to test the "people" hypothesis, though that seems extreme. I tried stringing chicken wire around the top of the fence to discourage deer, but it ended up looking like a vegetable gulag rather than a garden. I used to put chicken wire around the tomato area, but the chipmunks just laughed at that. I kind of wish I had an old hound, a rocking chair and a shotgun--I'd sit out there for a few nights and see what's what. Except that it's illegal and I don't think the college would be too wild about me blasting away at various critters on their property, not to mention my neighbors. I suppose I could put traps in the garden and see what gets caught, but that's like catching a few raindrops and thinking you're going to get a sunny day.
I keep thinking about that bit in Robert Lawson's Rabbit Hill where the kindly (and evidently very rich) Folks put out a crapload of vegetables and such every single night in order to keep all the animals out of their own garden. "There is enough for all," they said. I'm guessing that this is not the case--that I could plant 50 tomato plants and still watch them get stripped by the Mystery Vermin. So it's war--if only I could figure out what I'm fighting and how to fight it.
How I Spent My Summer Vacation (So Far)
Spent a good while
visiting family in Southern California in June and July, which was a lot of
fun.
I had a chance to visit the gallery my brother runs in Los Angeles' Chinatown. It's called Oulous Repair Shop, and I really like what he's done so far. The web page for the gallery doesn't actually list the address, which is 945 Chung King Road.
Speaking of which, he's trying to put together material for a show sometime this fall on fringe technological designs. He's been writing a few scientists to see if they receive and keep letters or inquiries from fringe inventors or technologists, both to try and get names of people to contact and to see if he can gather together any sketches or visual material that were included in such inquiries. If you've got any ideas or sources of possible material, contact him at xing@oulous.com.
I also spent a bit of time at my mother's store, Mixt, which is in the Riviera Village shopping district in Redondo Beach, 1722 South Catalina. It's a great place--she's got a nice mix of little doo-dads and very interesting high-end craftwork.
Los Angeles as a whole still puzzles me. I like it (and California generally) a lot better than I did when I was a surly teenager. In fact, some of Southern California's best features are tailor-made for the middle-aged: good food, good booze, great weather, easy living. It's a tough place to live if you don't have money--the housing market there now staggers me, after two decades on the East Coast.
One of the interesting things about LA to me now is that the ceaseless reworking of its built landscape seems to have slowed somewhat. I remember a period from about 1980 to 1995 or so when I would visit and find that the retail and residential landscape had shifted once again within a very short time frame. We'd go to places where there had never been houses and lo! Giant developments sprawling as far as the eye could see, people moving in who were facing daily commutes of two hours in each direction. You'd go back to a mall or neighborhood with stores you liked and they were all gone. There are areas which are still very much in flux, but a lot of things seem to me since 1995 or so to have been much more static across the core of the LA Basin. Maybe I'm wrong--it's hard to know when you only visit twice a year or so. My brothers have often pointed out that there's much more visible, physical history to California's built landscape than most people, including locals, think.
It also seems to me that high-end food retail nationwide has caught up somewhat with California--it's much easier now around us to find very good produce and meats, and quintessentially California businesses like Trader Joe's are now national (though I think Trader Joe's is going to be very hard-pressed to maintain anything close to its traditional quality/price ratio at its present rate of expansion).
But even with overdevelopment, pollution, crowding, traffic and the like, I'm pretty hard-pressed to think of anywhere on the East Coast that has the attractive mix of weather and landscape that a number of California cities do, including Los Angeles. I just can't work up enthusiasm for East Coast beaches or East Coast mountains in comparison. My Dad, who was born in California, always used to say whenever the Rose Bowl was on, showing people playing football on a sunny day to the rest of a miserably cold nation, "Well, here comes another 50,000 assholes."
21st Century College: An Outline
I've been messing around with some ideas about a fantasy college, about what kind of institution of higher education I might build given $500 million and total autocratic power. This is what I've come up with. The sketch I lay out deals with three interwoven issues: first, the overspecialization of the academy; second, the insularity of academic life; and third, the increasing over-management of academic communities and the heedless expansion of the "full-service" approach to higher education.
The result will doubtless not be particularly palatable to many, maybe most--hell, in a few ways I'm not even sure that I would want to teach there myself. But on the other hand, I think it is sometimes useful to imagine systematic alternatives in order to understand how--or whether--what we already have might need to be changed.
July 14, 2004
On Third Partyism
A modest rebooting of this blog now that I'm back, on the recently discussed subject of whether it is a kind of infantilism to support political reforms in the U.S. to allow third parties to compete fairly at the polls.
The simple answer:
it depends, but yes, often third-partyism is an infantilism, one I have myself
been guilty of in the past. The basic problem with the most devout third partyists
is that they either have woefully unrealistic models of the likely prospects
of their own preferred third party or they lack any sense of a comprehensive
alternative idea of political competition, and argue for third party competition
as a purely ad hoc response to some particular dissatisfaction with the Republicans
and the Democrats.
Greens or Libertarians, for example, probably would poll only marginally better in most cases than they do now after a breakup of the two-party duopoly. They might have regional strongholds that they're denied now, and be able to send a few representatives to Congress or to state legislatures, but in Presidential races or even state-wide ones, I don't see them being competitive for the foreseeable future.
This is even more pressingly true for the progressive wing of the Democratic
Party. Third-partyism here, especially in its Naderite form, really is a kind
of head-in-sand wish-fulfillment scenario, a belief that progressive or left
politics, once freed of its captivity to the Democrats, could be a powerful
electoral force in general.
I would agree that
a Democratic candidate who ran with some conviction and a strong sprinkling
of populism might well be a roaring success with independents for the same reason
that John McCain is, but that is a question of character: what is liked about
such a politician is their honesty, their authenticity, not their ideology.
Stripped of a compensatory attraction due to the character of a candidate, a strongly left politics in most areas of the country would be a major electoral failure, and a simple third party with a progressive character that was permitted to compete fairly within the present system would go nowhere, both on its own terms and in terms of its influence on the Democratic Party, which would probably move even further to the right in order to compete for the larger pool of independent voters rather than the small pool of hard-core progressive votes. As a voting base, progressives simply don't compare in either fervor, geographic rootedness or numbers with the religious right, and can't hope to accomplish what the religious right has in terms of seizing control over the Republican Party.
In conventional terms, the only third party that might benefit from a relaxation of the standard barriers to competition would be something like the Reform Party, a kind of independent-bloc soft libertarian party that could give a home to backers of McCain, Schwarzenegger and other Republicans who don't fit with the Bible Belt social conservatives who have seized control of the Republican Party, while also drawing off some suburban Democrats and possibly working-class "Reagan Democrats" as well. If that's what's on offer, no, that's not an infantilism, it's a reasonable if unlikely third-party ambition--a parallel to the 19th Century formation of the Republican Party, a response to new social constituencies who found themselves effectively without any political party corresponding to their interests and outlook. The same may be true now for a variety of Americans, or it may not be true, but the most unrepresented American constituencies in this sense are not urban voters or rural ones, but instead the "swing" constituencies who are perpetually wooed by both parties but the bedrock voting base of neither. This is the only conventional third-party movement that I can see making any real political headway at this moment in American history.
Such a third party could also only succeed by first pursuing electoral success at the state level and in Congressional races and by making reform of winner-take-all politics its first and primary agenda. The more comprehensive and specific its alternative political platform was, the less headway it would make, as the most appealing parts of its agenda would be cherry-picked by the other two parties and electoral reform left quietly by the wayside. In a sense, this party would have to enter the political system by agreeing to back the agenda of either of the other parties in exchange for systematic reform of the current electoral system to create a level playing field, in a very conscious process of horse trading. That mission accomplished, the new party could then begin to flesh out an independent political platform of some kind. This is quite evidently not the strategy being pursued by Ralph Nader, now or in 2000, nor is it the strategy that any of the third-party Presidential candidates of previous years have pursued.
Third-partyism also makes some degree of sense if it's articulated as a comprehensive program of political change designed to move American politics to a more parliamentary mode, with many parties that have narrow ideological or political agendas and serve highly particular constituencies. This is a comprehensive change, rather than the normal argument for one or two third parties through minor tweaking of winner-take-all voting or reforming ballot-qualification requirements. This is not what most third-partyists in the United States seem to be arguing for, possibly because most people recognize that this particular reform is far from self-evidently desirable.
I used to think that a greater degree of ideological sharpness in our political system would be a good thing, but that seems far less desirable to me now: I don't want to have to choose between two exaggeratedly single-view philosophies. A parliamentary politics with many, many parties seems an even more unsatisfactory halfway house between republicanism and direct democracy than our present system. I'd rather vote for a representative who strikes me as rational and fair-minded, even one who takes positions different from my own, than have to choose from a diversely sectarian menu and so divide my own political beliefs into fragments.
The Great Escape
Unlike Chun the Unavoidable (assuming he's not joking), I haven't given up blogging. I've just been away helping my mother-in-law to move out of her home and also trying to focus exclusively on several articles I owe to various publications. I'm going to be travelling some soon so blogging will have to wait a bit longer, but I have a number of substantial entries and materials being worked up that should be appearing here in mid-July, including long think-pieces on MMOGs, a blueprint for a new kind of college, and some more "Readings and Rereadings".
May 20, 2004
Busy couple of weeks here with grading, Honors exams, and some family matters, so blogging has been and will be lighter than usual for a bit.
Preparing a Place in the Museum of Failure
Norman Geras argues
strongly that as a supporter of the war in Iraq, he bears no responsibility
at all for Abu Ghraib.
I agree that those who supported the war with a rigorously reasoned case do not have to feel personally responsible for Abu Ghraib. I think it is appropriate to hold war supporters directly responsible for Abu Ghraib if (and only if) they fail to regard systemic abuse there and at other American military prisons as a grave concern by the very same criteria by which we held Hussein's misrule a concern.
Abu Ghraib does have serious consequences for at least some of the arguments in favor of the war, and I don't think one can dodge those consequences. It's possible but highly unlikely that this is merely seven bad apples doing bad things--even if that were so, this is where the basic point about oversight comes in. A failure to have effective oversight is a guarantee of "bad apples" having impunity to do what they do. The furtive, paranoid unilateralism of the current Administration, its stonewalling of entities like the Red Cross, its apparent lack of interest in due diligence practices within its own institutional frameworks, made Abu Ghraib inevitable.
Beyond that, however, the evidence is considerable that this abuse was not merely an accident of mismanagement, but a deliberate policy, deeply integrated into the Administration's entire approach to the "war on terror". Supporters of the war do need to regard that as a serious issue for their case, because the war cannot be supported as an abstraction. It can only be supported as a concretized, real-world project, and if it is done badly in the real world, it eventually will do (and I think already has done) as much damage as the things it set out to fight. If you support the war as part of a battle against illiberalism, then illiberal conduct by your own "side" in the war has to mean something to you, have inescapable implications for your struggle. You can't just shrug off the creation of a gulag in Guantanamo where people have no rights, or evidence of a consistent policy of humiliation and abuse.
To understand this as a conflict that is resolvable strictly through military means or through the imposition of formalist structures is to my mind to absolutely and completely misunderstand the nature of the larger conflict against terrorism. To extend the military trope, it's the equivalent of fighting the wrong battle with the wrong weapons in the wrong place--and in military history, that's how you lose a war even when you may have superior resources and force at your disposal.
Those who do misunderstand it this way almost all share two things. One, a belief in the universal and inescapable obligations of modern liberalism. It's no accident that some Marxists, some liberals and many neoconservatives have found the war attractive, because they all derive tremendous intellectual strength from universalist frameworks. This I find laudable and important, and I recognize many supporters of the war who take this approach as intellectual cousins. (Those who do not share this commonality, like those parochialists and chauvinists on the American right who have endorsed brutality at Abu Ghraib, I recognize no connection with.)
But these supporters
on both left and right share another attribute which I do not share: a belief
that liberalism comes from above, that it can be imposed by power, that it emanates
from the structure of the state and is guaranteed by securing a working monopoly
on the means of violence. Equally, these thinkers share a belief that illiberalism
and oppression emanate from the top, have their source in malformed states and
ruling elites who have illegitimately seized control of the state in spite of
the natural and rational desire of most people for liberal democratic norms.
In essence, many of them--some from the left, some from the right--are statists.
This is what the shorthand of "Wilsonian" is all about: a grab-bag
aggregate that usefully links ideologically diverse arguments through their
common understanding of the nature of political change and the sources of illiberalism
in the world.
Fundamentally, this is a clash between different models of change-by-design in the world, of how one does praxis. Even when I was more strongly influenced by Marxism, I was always drawn to the Gramscian vision of politics, to the notion of a "war of position", because that seemed to me much closer to how meaningful, productive, generative change in the world actually comes about, in the messiness of everyday life, in the small and incremental transformation of consciousness. I do not believe, and have never believed, in revolutionary change, in the proposition that a sudden, sharp disjuncture between the flawed present and the shining future can be produced by a seismic transformation of social structure directed by the state, by political vanguards or other major social institutions that possess strong governmentality.
Real revolutions
happen in history, and they are genuinely disjunctive, deeply and abruptly transformative.
The ones that are productive largely happen by accident. They happen because
smaller social transformations have been building towards a point of criticality,
towards a sudden phase change. They do not happen by design or intention. Real
revolutions can be guaranteed by changes at the top, by the creation of laws
and rights and constitutions, but they don't come from those things.
False revolutions
happen in history, and they are much less disjunctive than their supporters
pretend. These are the classic political revolutions, the ones that try to force
history into a new mold by totalizing design, from above. They can do almost
nothing generatively useful at the level of real social change: they can only
destroy and terrorize. They cannot create. The only good example we have in
modernity is the American revolution, and it is notable that its most fundamentally
radical achievement was to specify constraints on its own transformative capacities.
Its moderation was the essence of its radicalism, and the source of its long-term
fecundity.
Power has a thermodynamic character: good things can happen when more energy is added to an existing system, but only if those bringing power to bear have modest ambitions and tremendous respect for serendipity and unintended consequences, for the organic evolution of events. The more ambitious the design, the more totalistic the ambitions, the more fatal and destructive the consequences are likely to be. A human world fully imbued with the humanistic values of the Enlightenment is a world we all should desire, and we should regard the world harshly where it falls short of that. But this is where we have to have faith in the desirability of those values, and play the game steadily towards victory.
It is the velvet revolutions of the 1990s that we should cast our covetous eyes at. The fall of the Berlin Wall and the defeat of apartheid are the real triumphs of our age. No invasions or interventions have a share in those victories, but the resolute moral and political will of many states, civil institutions and individuals--backed where necessary by military power--can claim a great share of the credit. I don't deny that on occasion, positive revolutionary-style change does come from above, but this is a rare circumstance, and all the historical stars have to be in alignment for it to happen. That was not the case with the war in Iraq.
The Iraq War's structural failure is that it is closely allied to the false revolutionary project, to statism, to the belief that social practice usually can be highly responsive to and conforming to the will of strong power, if only that power articulates its will clearly. This is the failed conceit at the bottom of the well, and where Iraq differs from Afghanistan. Afghanistan I support because its primary logic was self-defense, and its secondary logic put forward a sensible, consistently reasoned proposition that failed states represent a clear and imminent danger to the security of liberal democratic nations. The national security logic of Iraq, in contrast, was weak before the war and has gotten dramatically weaker since.
Alongside this deep philosophical shortcoming, the failure at Abu Ghraib is indeed a sideshow. It is the deeper failure that the reasoned supporters of the war need to hold themselves accountable for. The Iraq War will take its place eventually as an exhibit in a museum alongside Cabrini-Green, state-run collective farming, Brasilia, the Great Leap Forward, Italian fascism, and other attempts to totalistically remake the substance of social practice from above.
Welcome to Paragon City
I'm supposed to write an assessment of Star Wars: Galaxies and I've been putting it off because I feel I need to go in and play the game again just to challenge my established prejudices. The conventional wisdom is that a massively-multiplayer online game needs a year to be judged. But I'm dreading it: I follow news about the game and it seems to me that there may just be things about it that can't be fixed.
SWG left a bad taste in my mouth about MMOGs. All that expertise, all that prior experience, all that money, and a franchise that you'd think was a can't-miss proposition, and the result was a worse-than-average experience in a genre of games that is already very unsatisfactory.
As a consequence, I have been looking at every other MMOG coming down the pike with equal presumptive hostility. In particular, I was sure that City of Heroes, an MMOG with a superhero theme, would be a disaster. When the committed cynics at Waterthread started saying nice things about the late beta, I began to wonder.
Now I've been playing it for a couple of weeks, mostly on the Protector server, with a martial artist character named "Faust", and I have to admit it: I was wrong.
City of Heroes still has the basic problems of all MMOGs, but as far as the genre goes, it is one of the best. It's actually fun to play, and even more amazingly, fun to play as a casual player--I can drop in for 30 minutes and still find something pleasurable to do. Even the feature that I was certain would suck, which was building your character around archetypes that made more sense in terms of MMOG conventions than the comic book narratives the game borrows from, works pretty well without seriously violating the sense that one is a superhero in a universe of superheroes. Basically, it's one of the few MMOGs that has kept a clear head about fun being the number one objective.
Maybe the most astonishing thing about the game is just that the first day of play went without major technical glitches, and that so far, there are very few disastrous bugs or technical problems. The major issue at the moment is that one type of mission doesn't work correctly, but it's easy to avoid those missions. There's a lesson here that's crucial. The only other game of this kind to launch well was Dark Age of Camelot. It shares with City of Heroes a basic simplicity and cleanness of design. It's clear: don't try to do too much by your launch, and keep your design as minimalist as you can. I'm also hugely impressed by the communication from the developers: they tend to be very forthright, very out in front of problems.
Many small features in City of Heroes are well-implemented. For example, I really like that when I get missions from my contacts, after a certain point, I can just call them remotely to tell them the mission is completed--I don't have to run all over creation to tell them. There are a few classic MMOG issues that are in some ways worse in City of Heroes than any other game: what people call "kill stealing" is for some reason uniquely aggravated in the evolving culture of its gameplay. The game also has a treadmill just like any other MMOG, and I still maintain that's unnecessary, that designers are not thinking properly about how to scale challenges over time, and insist on making "hard" mean "time-consuming". And finally, as is de rigueur for MMOGs, there are some really dumb and unoriginal names and designs for characters out there. I've seen huge numbers of Wolverine and Punisher clones. On the other hand, I haven't seen a single Legolas yet.
There are also some things I'll be looking for the designers to do in the months to come that will help the game be more evocative of comic books. For one, I'm getting very tired of fighting cookie-cutter enemies: there should be colorfully individual supervillains at every level of experience. That's the essence of the genre, and it's sadly missing from the lower-level gameplay and even from the mid-game. In fact, how about every character created getting an archenemy, a supervillain who pops up from time to time to attack your character?
There are other elements of superhero narratives that need implementation in some way eventually. Secret identities and all that comes with them are completely absent. The mission storylines are pretty decent--I saved a mechanic and his family from some robots and now civilians remember that I did so--but there need to be more plot types, more content that evokes classic superhero tales. There need to be major public events--say each city zone being attacked by giant robots, with everyone pitching in to repel the menace.
I'm still going to play SWG later this month to be a responsible critic, but when I want to have fun, I'm going to be battling evil in Paragon City as the mysterious and inscrutable Faust.
May 12, 2004
Email woes: if you've sent me email in the past week, it's been sitting in the spool waiting for a very bad technical problem with my email to be ironed out. Problem solved, but be patient--it's going to take me a while to work through 300+ emails.
In Nothing We Trust
"Free us from oversight," said the Bush Administration on September 12, 2001, "because you can trust in our professionalism and our ethical constraint. We're the good guys. We won't do anything bad."
President Bush more or less repeats this mantra today in response to the escalating scandal of American prisons in Iraq, Afghanistan and Guantanamo: that it was just a few bad apples, that we're a democracy and this shows how great democracy is, that it can expose a few isolated misdeeds. Trust us. The world's collective jaw drops. Does he really believe that? If so, he's even more isolated and naïve than anyone has suspected. If not, then he and his inner circle are just as calculatingly grotesque as the most spectacular conspiracy theorists have portrayed them as being.
Look at what those photographs show. What anybody with an ounce of common sense knows is that the scenes being staged in them were not dreamed up by a bunch of reservists. It's got the stench of military intelligence all over it. I'm sure we'll hear in the courts-martial to come that no direct orders were given to have any of this happen, or that it was only private contractors. How stupid do they think we are? You can see it easily: an intelligence chief says to the grunts, "Hey, we need some information out of these guys. See if you can figure out a way." A few minutes later he says, "Hey, I heard from a buddy that Muslim men really freak out about nudity." No orders were given, sure.
John McCain's fury at Rumsfeld during the hearings was clearly about this issue.
We all know how it works, and we all know that what happened in the prisons
goes right to the top. Not in the abstract, "I take responsibility"
sense (though what does that mean? In what respect is Rumsfeld or Bush doing
anything when they say that?) but in the quite concrete sense that permission
to torture and humiliate Iraqis was sanctioned by the highest reaches of the
hierarchy.
A few months back,
Mark Bowden published a
spectacularly foolish article on interrogation and torture in the Atlantic
Monthly in which he mistook a kind of abstract ethical bullshit-session thinking about torture for the actual institutional practice of it. I agree
that there is a kind of thought experiment on torture and coercion that we have
to undertake in an open-minded manner. If you knew with 100% certainty that
a suspect in your custody knew where a nuclear weapon was hidden in a major
American city, and that if you didn't find out its location within 24 hours, it would be detonated, I think most of us would say, "Whatever it takes to find out, do it."
That is a fiction, a thought experiment. Bowden's defense of torture, in response to many angry letters observing that it is very rare for interrogators to actually know who is guilty or who possesses information that justifies coercion, was basically, "Well, I'm only justifying this in the case of the real professionals, who know what they're doing, and won't ever misuse coercion or torture for anything less than a vitally necessary end."
Welcome to yet
another fiction or abstraction, and a remarkably stupid one at that. When is
this ever the case? What real people in the world have the necessary professionalism
and the necessary factual knowledge of the specific information held by a prisoner?
In practical terms, none. As Adam Ashforth has argued in his work on commissions
of inquiry in South Africa, states use coercion or torture largely to demonstrate
that they can. It's a performance of power--and that's mainly what US soldiers have been doing in prisons, torturing and humiliating captives just to demonstrate that they can do so. Bowden says, "Trust them." The whole point is that you can't and you mustn't, regardless of how clear-headed or fair-minded the aspirant torturer might be.
The domestic and
international terrain on these issues intertwines. Many critics of the Bush
Administration charge it with an assault on the U.S. Constitution. Sometimes
these charges get hung up on the details of particular cases, or on antipathy
towards particular individuals like John Ashcroft. The charge is accurate, but
what we have seen in the last month is that it's not just or primarily about a set of specific attacks on civil liberties. The Bush Administration is attacking the core philosophy of the Constitution, at every moment and in every way that they say, "Trust us."
Amid the wreckage
of American legitimacy, nothing stands out more than Theodore Olson and other
lawyers from the US Solicitor General's office standing before the Supreme
Court of the United States arguing that in war, the federal government can do
anything that it judges to be a prudential necessity for winning that war, that
no constraints apply and that no explicit powers, Constitutional or statutory,
need be granted to the federal government to do that which it sees as needful.
That the executive branch and the military need no oversight or review by any
other branch of the government.
To hear the official
legal voice of the United States government making that argument is the most
shameful thing I have heard in my life. The pictures from Iraq are nothing next
to it. Olson's argument was the equivalent of watching him drop trousers
and take a crap on the Constitution. The central genius of the Constitution
is that it constrains the government, that it says that government has no powers
save those granted to it by the Constitution. It thoroughly rejects the claim
that government must be free to expand its powers expediently.
That is the living,
beating heart of the United States: that as a people and a nation, we are suspicious
of power. That we recognize that we must never just trust in power, whether
it is interrogators or the President. This has nothing to do with whether the
people who hold power are good or bad people. Good people, people you like and
trust, can misuse power. In fact, thinking probabilistically, it is a certainty
that they will. I can trust an individual as an individual, but that is very
different from writing individuals in my government a blank check.
Abu Ghraib is about more than the Iraq War, and more than Donald Rumsfeld. It is the purest revelation of the consequences of the Administration's contempt for the core values of American democracy, a contempt that they are spreading insidiously throughout the government of the United States. We have a precious few months to remove that cancer, to uproot that tree root and branch. If we fail in November--and make no mistake, it will be we, it will be the majority of Americans who make the wrong choice, who fail--then I think historians are likely to write that this was the beginning of the end of the American democratic experiment, the moment where the mob handed the reins to Augustus and so guaranteed that one day they would burn, too, under the serenade of Nero's violin.
Primal Scream
"Stop with the hindsight," says one writer. "Be patient," says another.
Oh, no, let's not stop with the hindsight. Not when so many remain so profoundly, dangerously, incomprehensibly unable to acknowledge that the hindsight shows many people of good faith and reasonable mien predicting what has come to pass in Iraq. Let's not be patient: after all, the people counseling patience now showed a remarkable lack of it before the war.
One of my great pleasures in life, I am ashamed to say, is saying "I told you so" when I give prudential advice and it is ignored. In the greatest "I told you so" of my life, I gain no pleasure at all in saying it. It makes me dizzy with sickness to say it, incandescent with rage to say it. It sticks in my throat like vomit. It makes me want to punch some abstract somebody in the mouth. It makes me want to scrawl profane insults in this space and abandon all hope of reasonable conversation.
That's because the people who did what they did, said what they said, on Iraq, the people who ignored or belittled counsel to the contrary, didn't just screw themselves. They screwed me and my family and my people and my nation and the world. They screwed a very big pooch and they mostly don't even have the courage to admit it. They pissed away assets and destroyed tools of diplomacy and persuasion that will take a generation to reacquire at precisely the moment that we need them most.
Noah Millman, for one example, is a very smart person who says many useful and valid things, but I find it impossible to understand how he can give George Bush the credit for being right on big principles like the principled need to defend liberty, while conceding that Bush appears unable to understand the complicated constraints of real life. The principled defense of liberty is nothing if it cannot be enunciated within the terms of social reality. It's just an empty slogan, and worse, one that makes no distinctions between political actors. Does Millman really think John Kerry--whom he sees as inadequate to the task of leadership--is a principled critic of liberty? Just about everyone besides Robert Mugabe, Kim Jong-Il, ANSWER and Doctor Doom believes in the principled defense of liberty. George Bush gets no credit for being right in this respect, and deserves to be soundly rejected for being so, so wrong where it really counts, in the muck and mire of real life. That's the only principled defense that counts: the one whose principles can be meaningfully reconciled with human truths. A policy that insists on living in a squatter's tent in Plato's Cave is a non-policy.
There is a struggle
against terror, injustice, illiberalism. It is real. It will be with us all
our lives. We must fight it as best we can. The people who backed the war in
Iraq, especially the people who backed it uncritically, unskeptically, ideologically,
who still refuse to be skeptical, who refuse to exact a political price for
it, who refuse to learn the lessons it has taught, sabotaged that struggle.
Some of them like to accuse their critics of giving aid and comfort to the enemy.
Right back at you, then. You bungled, and you don't even have the grace
or authentic commitment to your alleged aims to confess your error.
After 9/11, I wrote
about my disenchantment with one very particular and relatively small segment
of the American left and its dead-end attachment to a particular and valorized
vision of sovereignty and national self-determination, seeing those as the
only moral aims of international politics. I criticized the need to see the
United States as a uniquely demonic actor in world affairs. I still hold to
that criticism, and I still think it addresses a real tendency. I'm sure I'll say it again in the future. I do regret saying it as much or as prominently
as I did. That was about my own journey, my own arc of intellectual travel from
my origins, not about a national need to smack down a powerful ideology. The
subject of my criticisms was not especially powerful or widespread in general,
and is even less so now.
I regret it because
I and others like me helped the blindly naive Wilsonian proponents of the Iraq
War to caricature their critics as Chomskyites all. The Bush Administration
had its fixation on WMD; Andrew Sullivan, James Lileks, Michael Totten and a
supporting cast of thousands had a fixation with the "loony left."
That allowed them to conduct echo-chamber debates with straw men, in which the
proponents of the war were defenders of liberty and democracy and opponents
were in favor of oppression, torture and autocracy.
Small wonder that they won that debate--but constructing it as such allowed them to miss the very substantial arguments by other critics, who said, "The war on Iraq cannot accomplish what you would like it to accomplish in producing a democratic and liberal state in Iraq, no matter how noble your aims are. The war on Iraq will not enhance the war on terror; in fact, it will severely damage it. The war on Iraq cannot be justified on humanitarian grounds without arbitrarily and inaccurately defining Hussein's Iraq as a worse situation than many comparable others--and an arbitrary humanitarian claim damages the entire edifice of humanitarian concern."
There were plenty of people making arguments like these--perhaps even within the Administration--and they were shouted down or completely ignored before the war and even early in the occupation. From these arguments, most of what has come to pass was predicted. Not because of mismanagement--though there has been that, in spades. Not because of the misdeeds of individuals--though there has been that a-plenty, both within the Beltway and on the ground in Iraq. Not because the Bush Administration lacked a free hand to do what it wanted--it has had that, more than any US government in memory. But because of deep, irreparable flaws in the entire enterprise.
A war on Iraq where
the build-up was handled much more intelligently and gradually, with much more
attention to building international consensus steadily. An Administration not
addicted to strident purity tests and not irremediably hostile to both internal
and external dissent. An argument for the war that took pains to build bridges
rather than burn them, and that accepted gracefully constraints on its own claims
and objectives. An occupation that was methodically planned and clear about
the challenges ahead. These are the preconditions for even imagining the ghost
of a hope that the war could succeed in its humanitarian purposes. In their
evident absence from the first moment, the war could not overcome its handicaps.
Liberalism and
democracy do not come from formalisms slapped down on top of a social landscape:
they come from the small covenants of everyday life, and rise from those towards
formalisms which guarantee and extend their benefits rigorously and predictably.
Constitutions, laws, procedures: these are important. But they cannot be unpacked
from a box alongside a shipment of MREs and dispensed by soldiers. They do not
make a liberal society by themselves.
To be midwives
to a liberal and democratic society, occupiers have to blend in to that society,
to become a part of it, to work from below, to gain a rich anthropological sense
of its workings and everyday logics. To do that, occupiers must become vulnerable
to insurgents and terrorists; they must hesitate to use violence. The two imperatives
pull in opposite directions, as they must. Smart management can ameliorate or cope with that tension for a while, and there have been success stories of individual American commanders who effectively straddled for a while. But the whole enterprise has not, could not, and DAMN IT, some of us knew that it couldn't.
So now the oscillations
grow more extreme. To fight insurgents, one must sabotage liberty, become not
just occupiers but oppressors. To promote liberty, one must be vulnerable to
insurgents, and even risk losing the struggle outright to them. You can have
the rule of law--but if you do, you can't have prisoners kept forever as "enemy combatants" or handed over to military intelligence for reasons of expediency. The law must bind the king as well as the commoner or it is worth nothing, teaches no lessons about how a liberal society works. Yes, the enemies of liberty will use that freedom against you. That's where the real costs of it come in. That's where you have to sacrifice lives and burn dollars and be vulnerable to attack. That's where you take your risks.
That this administration,
and most of the proponents of the war, would be risk-averse in this way was
predictable, inevitable, and not altogether ridiculous. It is hard to explain
to military commanders why their troops cannot defend themselves behind barbed
wire and walls. It is hard to explain to soldiers why they have to do jobs they're largely untrained to do--to administer, to anthropologically investigate
and understand another society, to bow to the cultural norms and sensibilities
of others, to advocate and practice democracy. To be risk-averse about liberty
is to lose the war, as we are losing it. Not just the war in Iraq, but the broader
war on terror. You can achieve liberalism only with liberalism.
Hindsight is 20/20, but some of us had 20/20 foresight. You could have it, too--it would just take joining us in the difficult messiness of social and historical reality.
May 3, 2004
No Longer a Bird in a Gilded Cage
I am sorry to see
that Erin O'Connor
is leaving academia.
Some
see a pattern in recent departures from academia announced on blogs, but
the pattern, if it exists, is mostly that scholars who have been trapped in
the labyrinth of part-time teaching or work at the periphery of the academy
have decided to let go.
O'Connor is different. It's very rare to see a tenured academic in the humanities voluntarily leave a post, particularly one at a good institution, and to choose to do so for ethical and philosophical reasons. I've only known a handful of similar cases, and they've mostly involved people seeking some form of personal fulfillment or emotional transition that they think is unavailable in their academic lives.
O'Connor seems
to have a little of that in mind as well, but her explicit reasoning is more
that she views the contemporary academic humanities as unreformable and corrupt.
I have a lot of respect for her enormous courage in choosing to leave. Not only
is it hard to turn your back on the gilded cage of lifetime job security, it
is hard to leave behind that part of your own self-image that is founded on
being a scholar in a university environment.
I share at least
many of, if not all of, O'Connor's misgivings about the American academy in its present form. Academia, especially in the humanities, often seems to me narrow-minded, parochial, resistant to the forms of critical thought that it allegedly celebrates, and possessed of a badly attenuated sense of its communicative and social responsibilities to the publics which sustain it. In many research
universities, teaching remains privately, sometimes even openly, scorned. There,
scholars are sometimes rewarded for adherence to self-confirming orthodoxies
of specialization and mandarin-like assertions of bureaucratized privilege.
And what one exceptionally
dissatisfied respondent at Crooked Timber said is all too close to the truth:
some academics are tremendously pampered and intensely unprincipled, in ways
only truly visible to insiders behind the sheltered walls of academic confidentiality.
However, I'm not leaving, or even contemplating leaving, and perhaps that would make me one of the noisome defenders of academia that O'Connor criticizes.
I am not leaving because I'm happy. I enjoy my teaching, am satisfied with my scholarship, and generally am quite pleased with my institution and my local colleagues here. I like many of the people I know in my discipline and my fields of specialization. I learn many new things every week, read widely, live the life of the mind, make good use of the freedom of tenure.
Swarthmore does
a pretty damn good job in many ways. I think I do a good job, too. I am proud
of it and proud to be a part of it all.
It is easier to be happy when the basics of my situation are so comfortable. I am paid well, I have tremendous autonomy in my teaching and scholarship, I have many compensations and benefits. I have bright and interesting students about whose future I care deeply. I have many colleagues whose company and ideas enlighten me. I have lifetime job security. My institution is in good financial shape, prudentially managed, led wisely.
What's not
to like?
What is not to like, note O'Connor, the Invisible Adjunct and many others, is that my situation is unusual in the totality of academia.
I think some of
that is the difference between a liberal arts undergraduate college and a large
research university: it is the latter kind of institution that I think is the
locus of most of the problems afflicting academia at present. There is also
one aspect of this that I do not take to be particular to academia, but instead
is true of all institutions, that some jobs are better than other jobs, some
institutions are run better than other institutions. It is better to work for
a top law firm than to work for a miserable firm of ambulance-chasers. It is
better to work for Google than it is to work for Enron.
What is a bit different,
however, is that academics mostly cannot pursue market-rational strategies that
respond to those differences intelligently and predictably, and the distribution
of talent in faculties cannot meaningfully be said to meritocratically map against
the good jobs and bad jobs. I do not imagine that I am here because I am so
much better than many of the people in jobs where they teach 5/5 loads, have alienated students, get no sabbaticals, have poor benefits and low wages, and face indifferent or even hostile administrations. I think I am good at what I do,
but so are many of the people who seek jobs in academia, and who ends up where
is a much more capricious thing in the end than in many other fields of work.
And once you're established enough wherever you land, if you're tenured, you're
there as long as you want to remain--or trapped if you want to move elsewhere.
The conditions
of labor at the more selective institutions feed on themselves in good ways:
with regular sabbaticals, strong students, and institutional resources you can
improve both as a scholar and a teacher. With heavy loads and no support, you're hard-pressed just to stay afloat. If the end states of a tenured faculty member at the University of Chicago and one at the State University of East Nowheresville are different, that often has a lot to do with conditions of employment along the way.
It is hard to know
what the solution to all this disparity is. I am not into sackcloth-and-ashes
myself, so I'm not going to punish myself for my good fortune by leaving
or donating half my salary to adjuncts. If I were, teaching at a good college
would only be the beginning of the good fortunes for which I must apologize,
and a relatively trivial one at that in comparison to being a white male American
who grew up in suburban California in material comfort with supportive and loving
parents. I do not see any magic way to make every academic institution wealthy
overnight, nor would I want to eliminate the weaker or more impoverished institutions--the
diversity and number of colleges and universities in the United States seems
one of our national strengths even in comparison to Western Europe.
Instead, I think
that the smaller, simpler solutions which many academic bloggers have described
are the real beginning of meaningful reform.
Graduate institutions
should dramatically cut their intake of doctoral students. Yes, that would simply
move the principle of relatively arbitrary distinctions of merit to an earlier
moment in academic careers, but that's the whole point, to keep people
from devoting seven years of their lives to a system that frequently does not
pay off that investment of labor.
Graduate pedagogy
needs to shift its emphases dramatically to meaningfully prepare candidates
for the actual jobs they ought to be doing as professors, to getting doctoral
students into the classroom earlier and more effectively, to learning how to
communicate with multiple publics, to thinking more widely about disciplines
and research. At the same time, doctoral study also needs to reconnect with
and nurture the passions many of us brought to our academic careers at the outset--passions
often nurtured in bright undergraduates by strong liberal arts institutions
like Swarthmore. The excessive professionalization and specialization of academic
work is killing its overall effectiveness and productivity. The possible purposes
of graduate training need to be opened up, not merely as a compensatory gesture
to disappointed academic job seekers, but as a deep and meaningful reform of
the day-to-day labor of professors presently teaching graduate classes. The
passive-aggressive combination of complacency, conformism and defensiveness
that often afflicts academic culture needs to give way to something bolder,
more infused with joy and creation, more pluralistic and varied in its orthodoxies
and arguments.
Tenure as an institution
needs to be rethought. If not actively abandoned, it should at least not be
automatically, reflexively defended as inviolate, because it presently serves
very few of the purposes which are often attributed to it. The use of adjunct
teaching in its present form at many institutions should simply be outright
abolished. Non-tenure track faculty should be hired on 1-year or 3-year contracts
at a salary comparable to a tenure-track assistant professor with benefits to
teach a normal load of courses, never on a per course basis, save in those cases
where short-term emergencies arise (such as serious illness or other unplanned
short-term unavailability of a tenure-track faculty member).
There's more that I could suggest, but I think many of these reforms would squarely confront the problems that have driven Erin O'Connor to leave academia. How we get to them is really the crux of the matter. I think the role of insiders who love academia but want to see it realize its potential--and possibly stave off the threat of a collapse--is essential. But so too, perhaps, are people who walk away. Here's to the guts to walk away from a sure thing.
April 21, 2004
Cry Me a River
The Chronicle of Higher Education has an article this week about single academics and their problems, the extent to which many of them feel like outsiders in the culture of academia. (Online version now available to nonsubscribers.)
Feeling like a
social outsider is one thing, and always worth discussing empathetically, as
a human concern for one's fellow humans. Particularly in small, rural colleges,
faculty social life is the main source of community, and if that community coheres
around marriages, life can be very difficult for a single person, whether or
not that single person is seeking a partner themselves. A goodly portion of
the Chronicle's article is taken up with these kinds of issues, and I sympathize and welcome any thoughts about ways that individuals and communities can help address these feelings, to show a solicitous concern for the problems
of others, and strengthen human ties with an appreciation of differing situations.
The notion, given
much airing in the article, that feeling like a social outsider is something
for which one ought to be formally and structurally compensated, that all such
feelings represent forms of injustice or inequity, is silly. Some of the single
faculty quoted in the Chronicle article cry out for parity in benefits,
arguing that if faculty with children receive tuition discounts for their children
or health care for families, single childless faculty should receive some equal
benefit. If I never have a cavity, I'll never make full use of my dental
benefits: should I receive a comparable benefit to someone who gets a new filling
every five months? No, because I have the same benefit if I develop the same
condition. Same for the single faculty: the marriage and child benefits are
there for them too if at some point in their life cycle they apply to them.
As one administrator says in the article, "Fair doesn't necessarily mean equal." I paid taxes toward educating other people's kids long before
I had a kid, and I welcomed doing so--because in paying those taxes, I was underwriting
the labor of social reproduction, which as a member of society, I benefit from
when it is done well and suffer from when it is done poorly.
In some ways, the article documents just how perniciously the trope of "minority status" and its associative moral landscape has spread to every single discussion of how communities are constituted. To talk of single people as an "underrepresented minority" in academia, as Alice Bach of Case Western Reserve University does in the article, makes no sense. Underrepresented in the sense that academia sociologically is not a perfect mirror of American society as a whole? Well, yes, of course. But Bach seems, like some of her aggrieved single compatriots,
to be saying that this lack of mimetic resemblance places a moral burden on
the faculty of each particular academic institution to fix the problem, that
the mere fact of a difference constitutes a moral failure. By that standard,
every academic institution needs to designate a proper proportion of faculty
to be paid below the poverty line, to be left-handed, to suffer the proper proportion
of death and injury at the proper ages, to be polyamorous, to be Goths, to be
Mennonites, to be hired with only a high school diploma and so on. If someone
can demonstrate that at the time of training or hiring, single faculty are specifically
identified and discriminated against and therefore that their underrepresentation
is the consequence of discriminatory behavior, then that person has a legitimate
point.
Otherwise, in the
absence of that evidence (and I think such evidence will never be forthcoming),
the aggrieved singles in the article are talking about the culture of academia,
which simply is, in the same way that academia is intensely bourgeois.
To argue that academia ought not to be bourgeois or dominated by married folk
is something that one can legitimately do--but not from a social justice standpoint, only from an argument about aesthetics and cultural preference, or from the standpoint that bourgeois society per se or marriage per se are corrupted social institutions that we collectively need to destroy or reject. That's fine, go ahead and make that argument if you like. Laura Kipnis has. Don't cloak it in complaints about underrepresentation or stigma or minority status. Those ideological or cultural claims are not arguments about discrimination and egalitarianism--they're a different kind of argument.
It gets especially silly when one of the complaints of single academics described in the article is that they're not married--that the solitary nature of academic work is too stifling when you're not with a partner or children, or that household tasks are more time-consuming because there's no one to divide the labor with. At that point my head is spinning: so single faculty are discriminated against, but one of the remedies for discrimination would be to get a partner and kids? That it is an injustice that they're not married and with kids? The comparable benefit to health insurance for families or maternity leave would be what, a college subsidy of a cleaning service or landscaping business for single faculty to simulate having a partner who can do household chores? How about we give single women a subsidy for a male-run cleaning service that only does 25% of the chores after promising to do 50%, and also subsidize a service that will come into the houses of single faculty and throw toys all over the floor and triple the laundry load on a regular basis.
The person who really drove me nuts in the article was Benita Blessing, a historian at the University of Ohio. Colleagues who have children or spouses, she says, are free to leave boring faculty meetings while she can't just say that she wants to go home and watch reruns of Buffy the Vampire Slayer. I really, really do try to see things the way other people see them, but this particular statement stopped me in my tracks. There are a million genuine and feigned ways that she could slip out of meetings if she likes: I feel no guilt for her lack of creativity. Then she complains that her department doesn't have parties for people getting tenure or promotions, only bridal and baby showers. Could it just be that this is her department? The whole article is so shot through with freakish anecdotal reasoning from alleged academics who one would think should know better. Somebody throw Benita Blessing a party already, though I'm guessing that she's going to complain even if they do. Envy combined with a discourse of entitlement rarely respects restraints.
How Not to Tell a Story
Hellboy
is a really enjoyable film. Matrix: Revolutions is not. I saw both about
a week ago. The contrast was a reminder that you can talk plainly about the
technical skill of telling a story.
At a basic level,
the problem with the storytelling in Matrix: Revolutions is that it rejects
both of the forks in the road that the mediocre Reloaded laid down, both
possible conclusions.
The first is to
play an entertainingly intricate and escalating series of tricky games with the nature of reality, to return to the basic dilemma of The Matrix and ask, "What is real?" The storyteller, in that scenario, has to have
a final answer in mind, but to allow his characters to be confused about the
answer, to have the action of the plot be a labyrinth, an ascending series of
false answers and red herrings, a fan dance tease. You can even end that story
with a little wink of doubt after the supposedly final answer is revealed, but
you do need to have a real and satisfying climax. This is the more intellectualized
story--it demands a high level of clever playfulness to work. It also would require taking the story back into the Matrix itself for most of the film--as
Gary Farber notes, the odd thing about Revolutions is that almost
none of it takes place in the Matrix.
One possible strategy
for this kind of tricky, layered plot: suppose we find out in Revolutions
that the whole humans-in-vats being energy sources thing is just as absurd as
it sounds, that it's just a higher-order simulation designed to deceive
the remaining humans, that what Morpheus and pals are doing is actually just
what the machines want them to do? What if the machines are really trying to
liberate humanity from the Matrix, and it turns out to be humans who put themselves
in it? What if the Architect is the good guy and the Oracle the bad guy? And
so on. In the right hands, this kind of escalation of doubt and confusion can
work beautifully--but it takes a storyteller who has thought it all out
in advance, who has an exquisite sense of how to use reversal and surprise as
a way to structure storytelling. It also takes a storyteller who is both playful
and willing to make some rules for his game and stick to them.
The only other
way to go is to play completely fair with anyone who has followed the story
to that point and reveal everything. Make the movie Matrix: Revelations,
not Revolutions. Solve all outstanding questions, lay out the secrets, explain
it all. Make those secrets basic, simple, and dealt with quickly through exposition.
That also is not what Revolutions did--instead it dropped some more murky, oblique characters into the mix, went on some time-wasting excursions to see old characters whose pointless, plot-arbitrary nature was confirmed (the appallingly annoying Merovingian and his squeeze), offered some incoherently faux-profound dialogue about the plot's events, blew a shitload of things up hoping nobody would notice how hollow the rest of the film was, and then threw in two incomprehensible conclusions (Neo's defeat of Smith and the final scene with the Oracle, the Architect and Sati). Along the way there were isolated cases of really excruciating badness--Trinity's death scene was so protracted and excessive that I found myself screaming at the television, "Die already! Die DIE DIE!" I'm sure there are Matrix fanboys out there who can explain all this, but a dedicated fanboy can claim to see a pattern in a random piling of trash in a garbage dump, too.
I got it right
in my comments on Reloaded: the Wachowskis want too badly to come
off like philosophers, but they think philosophy is about incomprehensible slogans,
meaningfully enigmatic glances and Ray-Bans. In Revolutions, there's no hiding the naked emperor: they clearly don't have the faintest idea
what their story is actually all about, and so they perform the cinematic equivalent
of alternating between mumbling and shouting. I can see how they could have
played fair and explained it all. For example, make it clear that the machines
created self-aware software upon which they are now dependent, and make the
Matrix a literal "Third Way" in the human-machine conflict--make it hardware vs. software vs. meatware, and make the software dictate a peace on both humans and machines. Maybe the fanboys will claim that's what was going on anyway, but that takes much more generosity than I'm prepared
to show.
So. Hellboy.
Hellboy gets it right because it tells a story honestly. As one of my
students noted, when you see an opening quote about the Seven Elder Gods of
Chaos, you know that you're deep in the heart of Pulpville. The storytellers
know that too, and they satisfyingly plunk their butts down right where they
belong and stay there consistently throughout the film. The story moves along
smoothly (well, there's a slow bit in the beginning, maybe), the plot is
transparent to its viewers and to its own genre conceits, and everything is
played more or less fair. If the movie were a little more dour or took itself
seriously enough, one might ask questions like, "Why does an agency of paranormal law enforcers seem to know so little about the paranormal?" (then again, just look at the 9/11 Commission to find out how law enforcement agents can not know a lot about what they're supposed to know about) or "Isn't it wise when you're dealing with a quasi-immortal villain to not assume he's dead?" You don't ask these questions seriously
because the story is robustly built and has an assuredness to it at all times.
It knows what it is.
These are great examples for a straightforward discussion of the technical craft of storytelling. What's important about that discussion is that it can very rapidly scale up into much more critically complex conversations about genre, audience reception and audience formation, the history of representation, the indeterminate meanings of cinema as a form and much more besides--but it also shows that we need not (and in fact often do not) lose sight of a technically-focused ground floor of cultural criticism in moving towards more difficult questions.
The Raines Must Fall
Having finally made my way through Howell Raines's full postmortem of his tenure at the New York Times (it's only a bit shorter than Neal Stephenson's The Confusion) I can't say that I feel any great sympathy for him. Even when he's making points I agree with about the Times, he comes off fairly badly, writing a weird jambalaya of self-pity, arrogance and gracelessness.
He means to convince
anyone reading that he was done in by a cabal of no-talent hacks protecting
their jobs--and I walked away convinced that there were in fact entrenched
no-talent hacks at the Times when Raines was brought in (this is hardly
news)but he mostly ends up providing convincing confirmation that the
sniping criticism of Raines during his tenure was valid. This is not a guy you'd
want being the executive of anything, though he might make a good second-banana
bad cop for a person with real leadership skills.
Raines takes credit,
and deserves credit, for shaking up the Times' utterly arteriosclerotic
coverage of culture. In the mid-1990s, it was stunningly irrelevant and stultifyingly
boring, both in the daily and Sunday paper. (Even the snotty high culture coverage
was often so late, as Raines observes, that the Post, of all papers,
was doing a more timely job on the same beat.) Raines helped the paper to figure
out that you don't send a man to do a boy's job, and got reviewers
like Elvis Mitchell to write about culture they both understood and enjoyed.
However, the Sunday Arts & Leisure is still a snooze: the revolution is
only partially complete.
In fact, the Sunday
edition is in general still pretty boring. Raines seems to think he accomplished
a lot in this respect, but I don't see it. The Book Review is mostly predictable,
the Week in Review flounders uselessly most of the time, and this was the same
under Raines as it was before and since. The Sunday magazine has a better track
record for interestingly controversial articles, but I thought that
was one of the stronger parts of the Sunday edition before Raines arrived.
One small change
that I have loved about the Times that I think came in during Raines's tenure, though he doesn't mention it (I think) in the Atlantic piece, is the back end of the Saturday Metro section, with its really intriguing pieces
on current debates and ideas in academic and intellectual circles.
Raines doesn't talk that much about columnists, and that's not surprising: the Times
went from bad to worse during his tenure, keeping some of the same boring old
pissants and adding some new boring younger pissants. Even the people I agree
with are boring me. And it becomes clear as one reads along that many of his
strongest internal critics were on the news staff, where the Times was
in pretty good shape, especially in international coverage, before Raines started
his tenure, and from the perspective of many external critics, actually got
worse during his time. When I compare the international coverage under Lelyveld
to the Times in the 1980s, it's like night and day. The ideological
hacks mostly disappeared, and lightweights like Christopher Wren were mostly
swept out, replaced by much more interesting and energetic writers. The Africa
coverage went from being something that persistently annoyed me to being something
I learned from and found usefully distinctive from anything else in the mass
media.
The domestic coverage
has been uneven for a decade, but to be honest, all I ask of the Times
in that regard is that it be solid, detailed, and fairly comprehensive, because
that's its purpose in my household, to serve as the paper of record. When
I want something more, I go read The Economist, the features on the front
of the Wall Street Journal, the Washington Post for inside-the-Beltway
stories, and the Internet.
That's the first and major thing Raines doesn't seem to get. He represents himself as coming in all gung-ho to expand the paper's subscriber base, widen its appeal, reach new audiences, freshen up the Grey Lady. The core readership probably doesn't want or need the paper to do that in most of its news coverage. I'm very glad to have the Times be more interesting in the places where it was just mind-bogglingly dull and snobbish, but when it comes to news, I demand that it first be respectable and meticulous. This is something that Raines didn't and still doesn't seem to think is important: his vision, by his own account, was all about making every single part of the paper equally provocative and edgy. That's not your market niche, man.
Raines ventriloquizes for both actual readers and potential readers and says, "This is what the public wanted, but my do-nothing, hack-job, fusty staff wouldn't let me." This reader says, "No, that's not what I want when it comes to news and the Times." The Times is not required to be boring, but neither does it require a front-to-back overhaul. I don't require the Times to get there first, and I don't require it to get there sexily. I just require the paper to get it right and get it all. If Raines had spent more time worrying about shoring up standards of craftsmanship and meticulousness in reporting and less time worrying about sexing up the writing, he might not have had the Jayson Blair problem.
Raines reports
breathlessly on the internal culture of the Times as if he learned for
the first time as executive editor how office politics works, and as if the
Times is an unprecedented hive of professional cripples. Shouldn't
an executive editor be a seasoned old hand when it comes to issues like unproductive
senior writers, recalcitrant underlings, or peer networks that rally to support
each other? On banal questions of group dynamics, Raines acts like a sixty-year-old spinster being shocked by a first encounter with the birds and the bees.
Aside from that, it's really hard to sympathize with Raines as he comes off in this piece simply because he sounds like an unlikeable prick. That's a pretty bad sign when you write an exculpatory account of your own behavior and you still manage to come off like an asshole. It's like watching a television commercial for food where they can't even manage to make what they're selling look appetizing. That generally means that the food in question is authentically nasty. Same thing here. He manages to get in some truly graceless shots at his former colleagues, some of them by name, others only indirectly. It's one thing for a crusader unmistakably brought down by reactionary forces to shout defiance at his enemies, and another thing to pass the buck as aggressively as Raines does in the wake of a mistake that he at least had titular responsibility for. It makes me wonder if any leader in American public life will ever have the grace to just assume responsibility for failure on his watch and manfully go down with the ship.
At the very least, Raines could say a lot more about his shortcomings: there are very few unqualified or straightforward confessions in the article, and the half-hearted apologies take up a very small amount of the total space in the essay.
Readings and Rereadings #3
Lauren Slater, Opening Skinner's Box
Footnotes on Death in Iraq
I know this is a common fact about war, that many combatant deaths come only indirectly from military conflict, but has anyone noticed how many deaths of coalition forces in Iraq are non-combat deaths? I got interested when looking at the roster of the dead on CNN's web site and started a quick and imprecise count. It looked to me like almost a third of the deaths were attributed to either "non-combat gunshot wounds or injuries" or various kinds of vehicle accidents, with a few cases of death from illness or unspecified medical conditions. A few of the vehicle accidents probably had to do with the pressures of near-combat situations, but a lot of them were due to things like embankments crumbling or boats capsizing or just plain old traffic collisions. I thought for a minute that maybe some of the non-combat gunshot or injury deaths (about a third of the total non-combat deaths) were suicides, but the recorded causes of death as given at CNN actually explicitly note suicides as such. I assume most of these deaths are actually training accidents or due to equipment malfunction.
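For what it's worth, the kind of quick and imprecise tally described above is easy to reproduce in a few lines. The sketch below is mine, not anything from CNN; the cause-of-death strings and the marker list are placeholders, so the numbers it prints mean nothing in themselves.

```python
from collections import Counter

# Placeholder entries only -- not the actual CNN roster.
causes = [
    "hostile fire", "non-combat gunshot wound", "vehicle accident",
    "hostile fire", "illness", "hostile fire", "vehicle accident",
    "hostile fire", "non-combat injury", "hostile fire",
]

# Rough, imprecise markers for what counts as a non-combat death.
NON_COMBAT_MARKERS = ("non-combat", "vehicle accident", "illness", "drowning")

def is_non_combat(cause):
    return any(marker in cause for marker in NON_COMBAT_MARKERS)

tally = Counter("non-combat" if is_non_combat(c) else "combat" for c in causes)
print(tally, "non-combat share: {:.0%}".format(tally["non-combat"] / len(causes)))
```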
Today's 3-Year
Oldisms
Emma: Today is
Monday!
Us: No, it's
Tuesday.
Emma: No, today
is Monday.
Us: You have your
swim lesson on Tuesdays, and you just got back from your swim lesson. Ergo,
it is Tuesday.
Emma: There was a fairy who changed my swimming lesson to Monday this week. And she also gives people candy.
---------------------------------------------
Me: Emma, should I shave off my beard? [A frequently asked question.]
Emma: No!
Me: Why not?
Emma: Because you would look more like a boyfriend than a daddy.
Emergence and
the Metahistory of Efficiency
I'm going to gingerly venture in this space for the first time into waters that I've been heavily exploring for two years with other faculty and through very active reading, namely, complex-systems theory, complexity theory, nonlinear dynamics, emergent systems, self-organizing systems and network theory.
I am a very serious
novice still in these matters, and very much the bumbler in the deeper scientific
territory that these topics draw from. (Twice now I've hesitantly tried
in public to talk about my non-mathematical understanding of the travelling
salesman problem and why an emergent-systems strategy for finding good answers
in non-polynomial time is useful, and I suspect that I could begin a career
in stand-up comedy in Departments of Mathematics all around the nation with
this routine.)
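Since I can't do the mathematics justice, here is instead a minimal sketch of my own (not anything from those talks) of what an emergent-systems strategy for the travelling salesman problem can look like: an ant-colony-style heuristic in which many cheap stochastic tours plus a shared pheromone trail converge on good, though not provably optimal, answers. The instance, the parameters, and every name below are illustrative assumptions.

```python
import math
import random

random.seed(1)

# Hypothetical instance: random points in the unit square.
N = 12
cities = [(random.random(), random.random()) for _ in range(N)]

def dist(a, b):
    return math.hypot(cities[a][0] - cities[b][0], cities[a][1] - cities[b][1])

def tour_length(tour):
    return sum(dist(tour[i], tour[(i + 1) % N]) for i in range(N))

# Shared pheromone trails: the only "memory" the colony has.
pheromone = [[1.0] * N for _ in range(N)]

def build_tour():
    """One cheap, stochastic ant builds a tour, biased by pheromone and inverse distance."""
    start = random.randrange(N)
    tour, unvisited = [start], set(range(N)) - {start}
    while unvisited:
        here = tour[-1]
        options = list(unvisited)
        weights = [pheromone[here][c] / (dist(here, c) + 1e-9) for c in options]
        nxt = random.choices(options, weights=weights)[0]
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

best, best_len = None, float("inf")
for generation in range(50):
    tours = [build_tour() for _ in range(20)]   # an overabundance of expendable agents
    for row in pheromone:                       # evaporation keeps old habits from freezing in place
        for j in range(N):
            row[j] *= 0.9
    for t in tours:
        length = tour_length(t)
        if length < best_len:
            best, best_len = t, length
        for i in range(N):                      # shorter tours deposit more pheromone on their edges
            a, b = t[i], t[(i + 1) % N]
            pheromone[a][b] += 1.0 / length
            pheromone[b][a] += 1.0 / length

print("best tour found:", best, "length:", round(best_len, 3))
```

No single ant is doing anything clever; the good answer emerges from the feedback between many disposable tours.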
I do have some
ideas for useful applications of these ideas to the craft of history--Alex Pang wasn't the only one burning the midnight oil with an NSF application
recently.
More generally,
I think there is one major insight I've gotten about many of the systems that get cited as either simulated or real-world examples of emergence. The working groups I'm in have been thinking a lot about the question of why so many emergent systems seem to be surprising in their results,
why the structures or complexities they produce seem difficult to anticipate
from their initial conditions. Some complex-systems gurus like Stephen Wolfram
have very strong ontological claims to make about the intrinsic unpredictability
of such systems, but these are questions that I am not competent to evaluate
(nor am I much interested in). I tend to think that the sense of surprise is more
perceptual, one part determined by the visual systems of human beings and one
part determined by an intellectual metahistory that runs so deep into the infrastructure
of our daily lives that we find it difficult to confront.
The visual issue is easier to recognize, and relatively well considered in A-Life research. It's why I think some simulations of emergence like the famous flocking models are so readily useful for artists and animators, or why we're weirdly fascinated by something like Conway's Game of Life when we see it for the first time. Emergent systems surprise us because they have a palpable organicism about them--they move in patterns that seem life-like to us, but in contexts where we do not expect life. There's a deep human algorithm here for recognizing life that involves a combination of random movement and structural coherence, which is just what emergence does best, connecting simple initial conditions, randomness and the creation of structure. Purely random movements don't look lifelike to us; top-down constructions of structure appear to us to have human controllers, to be puppeted. So we are surprised by emergence because we are surprised by the moment-to-moment actions of living organisms.
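For concreteness, here is a minimal sketch (my own, not from any of the simulations named above) of the Game of Life just mentioned: two local rules applied to a random seed are enough to produce the structured, organic-looking motion being described. Grid size and seeding density are arbitrary assumptions.

```python
import random

random.seed(0)
W, H = 40, 20   # arbitrary grid size
grid = {(x, y) for x in range(W) for y in range(H) if random.random() < 0.3}

def neighbors(cells, x, y):
    # Count live neighbors on a wrapped (toroidal) grid.
    return sum(((x + dx) % W, (y + dy) % H) in cells
               for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0))

def step(cells):
    new = set()
    for x in range(W):
        for y in range(H):
            n = neighbors(cells, x, y)
            # The entire rule set: survive with 2-3 neighbors, be born with exactly 3.
            if ((x, y) in cells and n in (2, 3)) or ((x, y) not in cells and n == 3):
                new.add((x, y))
    return new

for _ in range(30):
    grid = step(grid)
    print("\n".join("".join("#" if (x, y) in grid else "." for x in range(W))
                    for y in range(H)), end="\n\n")
```

Nothing in those two rules mentions gliders or oscillators, yet watching the generations scroll by is exactly the kind of uncanny, organic-seeming experience described above.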
When I look at
ants, I know in general what they will do next, but I don't know what exactly
any given ant will do in any given moment. This, by the way, is why most online
virtual worlds still fail to achieve immersive organicism: play enough, explore
enough, and you know not only what the general behavior of creatures in the
environment is, but precisely what they will do from moment to moment.
What I think is
deeper and harder to chase out is that we do not expect the real-world complex
systems and behaviors we actually know about to be possible through emergence,
in the absence of an architect, blueprint or controller. Some of this expectation
has rightfully been attributed by Steven Johnson and others to a particular set of presumptions about hierarchy, the so-called "queen ant" hypothesis.
But I also think it is because there is an expectation deeply rooted in most
modernist traditions that highly productive or useful systems achieve their
productivity through some kind of optimality, some tight fit between purpose
and result, in short, through efficiency.
My colleague Mark Kuperberg has perceptively observed that Adam Smith has to be seen as an early prophet of emergence--what could be a better example than his bottom-up view of the distributed actions of individuals leading to a structural imperative, the "invisible hand"--but as digested through the discipline of economics, Smith's view was increasingly and to my mind unnecessarily parsed in terms of models requiring those agents to be tightly optimizing.
That's what's
so interesting about both simulated and real-world examples of emergence: they
create their useful results, their general systemic productivity, through excess,
not efficiency. They're not optimal, not at all, at least not in their
actual workings. The optimality or efficiency, if such there is, comes in the
relatively small amount of labor needed to set such systems in motion. Designing
a system where there is a seamless fit between purpose, action and result is
profoundly difficult and vastly more time-consuming than setting an overabundance
of cheap, expendable agents loose on a problem. They may reach a desired end-state
more slowly, less precisely, and more expensively in terms of overall energy
expenditure than a tight system that does only that which it needs to do, but
that excess doesn't matter. They're more robust to changing conditions
if less adapted to the specificities of any given condition.
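To make that excess-over-efficiency point concrete, here is a toy sketch of my own (not anything from the working groups): a single agent following a path precomputed for conditions as originally specified fails when the world shifts, while an overabundance of cheap, individually unreliable random walkers still stumbles onto the answer. Every name and parameter is an illustrative assumption.

```python
import random

SIZE = 30
planned_target = (25, 25)   # where the target was when the "efficient" plan was drawn up
actual_target = (10, 25)    # where it actually is now that conditions have changed

def precomputed_path(start, goal):
    """The tightly optimized plan: a minimal axis-aligned path to the goal as specified."""
    x, y = start
    path = []
    while (x, y) != goal:
        if x != goal[0]:
            x += 1 if goal[0] > x else -1
        else:
            y += 1 if goal[1] > y else -1
        path.append((x, y))
    return path

# The efficient single agent walks its precomputed path and never deviates.
efficient_found = actual_target in precomputed_path((0, 0), planned_target)

def random_walker(steps=3000):
    """A cheap, wasteful agent: it just wanders until it happens to hit the target."""
    x, y = 0, 0
    for _ in range(steps):
        if (x, y) == actual_target:
            return True
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x = min(max(x + dx, 0), SIZE - 1)
        y = min(max(y + dy, 0), SIZE - 1)
    return False

# An overabundance of expendable walkers, each individually unreliable.
swarm_found = any(random_walker() for _ in range(50))

print("optimized single agent found the moved target:", efficient_found)  # False: the plan fit too tightly
print("swarm of cheap random walkers found it:", swarm_found)             # almost always True
```

The swarm burns vastly more total steps than the planned path ever would, but that waste is exactly what buys its robustness.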
We go looking for
efficiencies and thriftiness in productive systems partly because of a deep
underlying moral presumption that thrift and conservation are good things in
a world that we imagine to be characterized by scarcity--a presumption that Joyce Appleby has noted lies very deeply embedded in Enlightenment thought, even in the work of Adam Smith. And we do so because of a presumption that productivity and design, fecundity and cunning invention, are necessarily linked--a presumption that I am guessing is one part modernist trope and one part deep cognitive structure. We are disinclined to believe it possible that waste and excess can be the progenitors of utility and possibility. Georges Bataille's answer to Marx may be, as
Michael Taussig has suggested, far more important than we guess. Marx (and many
non-Marxists) assume that surplus must be explained, that it is non-natural,
that it is only possible with hierarchy, with intentionality, with design. It
may be instead that excess is the key, vastly easier to achieve, and often the
natural or unintended consequence of feedback in both human and natural systems.
The metahistory
that I think I see lurking in the foundations here is a tricky one, and a lot
of effort will be required to bring it to light. We will have to unlearn assumptions
about scarcity. At the scale of living things, making more copies of living
things may be thermodynamically incredibly cheap. At the scale of post-Fordist
mass production, making more material wealth may be much cheaper than we tend
to assume. We will have to root out our presumptions about efficiency and optimality
and recognize that many real-world systems whose results we depend upon, from
the immune system to the brain to capitalist economics, depend upon inefficient
effectiveness (productive non-optimality, wasteful utility).
I also think exploring this metahistory of our unconscious assumptions might help us contain emergence and complex-systems theory to a subset of relevant examples. Some of the people working in this field are too inclined to start sticking the label "emergent" on anything and everything. You could actually come up with a prediction about the limited class of systems that can potentially be emergent or self-organizing (and I'm sure that some of the sophisticated thinkers in this field have done just that): they would have to be systems where many, many agents or discrete components can be made exceptionally cheaply and where simple rules or procedures for those component elements not only produce a desired end-state but also intrinsically terminate or contain their actions within the terms of that result, and probably some other criteria that might be identified by unthinking our prevailing assumptions about efficiency and design--say constraints on the space, environment or topology within which inefficiently effective systems might be possible.
April 6th, 2004
Don't Play It Again, Sam
I never dreamed
when I started my current book project in 1997 that writing about the British
system of indirect rule in colonial Africa as a central issue would turn out
to be relevant not just to understanding how African societies like Zimbabwe
got to where they are today, but also to understanding how current events in
another part of the globe are unfolding minute by minute.
But here we are.
The United States is trying to get an approximately colonial system of indirect
rule up and running in Iraq after June 30th, one more limited in its conception
and at least notionally shorter in its projected lifespan than the early 20th
Century British equivalent, but one nevertheless. It certainly makes me feel
like I had better finish my project as soon as I can before I feel compelled
to once again rethink what I'm writing in light of the latest developments.
I'm very hesitant
about casual comparisons and analogies, like most historians, but this general
resemblance seems to me to be unmistakable. This resemblance also clarifies
for me why I do not view Iraq and Vietnam as strongly analogous. Cheap rhetoric
aside, Vietnam was not an imperial war, and US power in South Vietnam was not
a form of colonial rule. The configuration of political authority, the nature
of the military conflict, the rhetorical framing of the struggle, the developmental
timeframe of the war: they were all quite different. The Cold War was its own
distinctive moment in the history of the 20th Century. So too is today, but
it is closer to the colonial past than any other moment since the 1960s.
The fighting in
the past week has been unnerving in its intensity, and seems today as if it
will get worse with news of a major ambush of US Marines in Ramadi. The question
is, does the analogy to British indirect rule help us understand what is happening
now and what may happen in the future? I think yes, and the news is not very
good.
Many defenders
of the current war say that the critics have too short a time frame for assessing
success, and they may have a point. British rule in Africa (and elsewhere) was
pockmarked with short-lived uprisings and revolts which seemed briefly significant
at the time, but which never really threatened British colonial authority fundamentally
until the 1940s and the simultaneous challenge of major labor strikes, mass
nationalist protest and the Mau Mau rebellion in Kenya, at a time when British
economic and military strength was at relatively low ebb and the nature of international
politics and morality had fundamentally shifted against empires.
So can the US simply
endure these small rebellions similarly, by taking the long view? Well, probably
not, and here are some reasons why.
First reason:
the British could simply ignore African resistance at times: lack of mass media
and a global public sphere meant that many uprisings or acts of unrest were
known only to local authorities until considerably later, and left more or less
alone to burn out without fear of public reaction.
Second reason:
colonial rebels lacked access to communicative methods for mobilizing and coordinating
unrest over larger areas, which is not true in Iraq today.
Third reason:
overt racism among British authorities and British society meant that they regarded
Africans in so dehumanizing a manner that they usually did not have to worry
officially or in public debate about what Africans thought, felt
or desired, a rhetorical option no longer as open to the United States government
today, though there's certainly some hint of this now and again. (Official
white supremacist practice contradicted by implicit liberalism under British
rule has transited to official, explicit liberalism contradicted by implicit
racial or cultural categorization of colonial subjects).
Fourth reason:
the British were restrained by some humanitarian considerations in exerting
their power, and when those constraints were egregiously overstepped, as in
Amritsar in India, there were consequences--but still, and particularly
in the early history of colonial rule, it was possible for British forces to
retaliate very openly with enormous force and with relatively little regard
for due process or rights against even suspected rebels or dissidents. The US
probably can't do the same thing, both because the world has changed since
1890 and because massive retaliation against suspected sources of rebelliousness
carries the risk of further inflammation of resistance among previously neutral
civilians.
Fifth reason:
Iraqi society is much more plugged into regional and global networks that can
reinforce and amplify resistance to US occupation in comparison to most African
societies in the early 20th Century.
Sixth reason:
British indirect rule, for all its rhetoric of the "civilizing mission,"
was ultimately much more modest in its ambitions in most cases than American
rule in Iraq is today. The bar for declaring "success" was much lower
then.
Seventh reason:
British indirect rule existed in an international system dominated by European
states that normalized imperial authority in general and racial hierarchy in
specific. American indirect rule in Iraq exists in a world that is largely mobilized
against imperial ambitions, often insincerely or instrumentally, but mobilized
nevertheless.
Eighth reason:
The direct relation of American popular opinion and elections to the continuance
of an imperial policy is structurally very different from what obtained in
Britain from 1880 to the 1930s. What was sustainable then politically is not
sustainable now without an even more seismic shift in American culture and society.
Ninth reason, and perhaps the most concretely important: British military power in relation to non-Western societies in 1890 was the most technologically asymmetrical that the world has ever seen: there was an unbridgeable distance in terms of firepower, logistical capability, and much else besides. Britain rarely exerted this power directly after the initial era of conquest in Africa and elsewhere, but when it did, there was simply no question of armed resistance succeeding. This is no longer the case today. American military power is still massively asymmetrical to the military power of armed insurgents in Iraq, but in ways that are of no use in exerting indirect rule authority over Iraq--you cannot assert indirect rule through bombing campaigns, artillery assaults, or nuclear deterrents. You can only do it with ground troops--and here automatic weapons and homemade explosives in the hands of insurgents coupled with the ability to vanish into the general population are enough to bring Iraqi combatants up to the point that they can exert meaningful force against American authorities. You can leave aside all the other comparisons, but I think this alone is a devastating difference between the world of 2004 and 1890. Now "they" do have the Maxim Gun, more or less.
When I was a graduate
student, I once queried the local free weekly about writing a column of course
reviews drawn from universities and colleges in the region--I thought I'd
contact the instructor, get a hold of a syllabus, slip into the back of the
room for two or three lectures in a large course or listen in on two or three
discussions, and then write a review. I'm glad they declined the offer
(ignored, actually) given that I couldn't possibly have written such a
column and remained a viable graduate student. Moreover--and I didn't
know this at the time--very few professors would allow such a thing, and
in some institutions, they'd probably be prohibited from giving a stranger
permission to sit in on two or three course sessions.
Now there's
a way, sort of, to accomplish something rather like this, and that's to
take advantage of online syllabi and find really great examples of courses out
there worthy of praising. A great syllabus isn't necessarily a great course,
but it is likely to be, and a great syllabus is in its own right something useful,
interesting and compelling. Syllabus design is one of the subtle but central
arts of academic life. A good online syllabus is the antithesis of an online
education: it doesn't pretend to be teaching or instruction, just to be
a form of publication, a sort of annotated bibliography. Syllabi are good to
think.
I'll start locally. My Bryn Mawr colleagues Anne Dalke and Paul Grobstein are teaching a course this semester called The Story of Evolution and the Evolution of Stories. This course is a great model for cross-disciplinary--not interdisciplinary--teaching in a liberal arts institution. Sadly, we don't have much that's comparable at Swarthmore. Our cross-disciplinary co-teaching tends much more to the dour and respectable, to rendering service to sequential curricula or established interdisciplinary programs of study. This course, in contrast, is the kind of course that many different kinds of students from different disciplines could come to and gain a new perspective on their work without necessarily having that sense of having climbed a necessary rung of a preset ladder. I've long thought that most institutions of our type should have a few courses every year that are about a discovered conversation between two colleagues--not to be taught again and again, perhaps to be taught only once, and as exploratory for the faculty as the students. As I look over the idea of the course, I can see a lot of places where I would have a different point of entry into the conversation. I thought immediately of a paper by Peter Bogh Andersen, "Genres as Self-Organizing Systems", that I encountered via William Tozier, or Gary Taylor's book Cultural Selection. This strikes me as a sign of excellence in a course of this kind, that many different people would be able to look at it, immediately "get" the basic concept behind it, and think about some other texts or materials to bring to the table.
Miscellanea
1) I see Ralph Nader has been popping up in the media expressing bewilderment at the vehemence that his candidacy raises among its critics. Reacting in particular to the similarity of the many letters he received from friends and admirers begging him not to run, he says, "It's a virus," claiming there could be no other explanation for the resemblance among the appeals. No, Ralph. When everyone disagrees with you in the same terms, it's not a virus. It just means everyone sees the same thing. Generally, the only people who conclude in the face of that kind of mass concord that they themselves must be right and everyone else must be wrong are narcissists, paranoids or the Son of God. Take your pick.
2) Will Baude of Crescat Sententia draws attention to the all-important campaign to get George Lucas, another isolated destructive narcissist of the first order (he'd make a great running mate for Nader) to include the original version of the Star Wars trilogy on this fall's DVD set. I don't much care about most of the original scenes, and in fact, most of the CGI additions were kind of nice, if sometimes rather busy. But I do care--enormously--about Han Solo shooting first in the cantina. I don't know if any other director has ever so revoltingly and pointlessly mutilated his own work (voluntarily!), but please, George, give us the option to undo your foolishness.
3) 3-year-olds are a hoot! Every day is some fascinating new angle from the peanut gallery on something I'd never thought of that way, or some recombinant view of something that makes a kind of weird, interesting sense. This week's special, while watching the gazebo scene early in The Sound of Music:
(Rolf shows up to meet Liesl in the gazebo.)
Emma: Who is he?
Us: He's the Nazi Boy.
Emma: What is his name?
Us: Rolf.
Emma: Who named him?
Us: His parents. We think.
Emma: I didn't know that Nazi Boys had parents.
Piling On Intelligent Design
Everywhere I click
in the last few weeks, folk are talking about Intelligent Design theories and
working themselves into a mighty froth over the theories and the tactics of
those who advance them.
Rather than joining
the pile-on right away--though as you'll see, I'll get around
to it eventually--I thought it might be worth taking a deep breath beforehand,
partially because it doesn't seem to me absolutely intrinsically impossible
that one could find evidence of intelligent design in the universe. I suppose
that's why I now class myself as an agnostic rather than an atheist. I
see no reason at all to think that such a designer exists, but I'm an open-minded
guy.
So perhaps the
first reaction one should have to intelligent design theories is to specify
in advance what real, meaningful evidence could reasonably occasion a scientifically-sound
hypothesis in favor of an intelligent designer. There are lots of personalized
ways ID could be confirmed. Dying and finding oneself in an afterlife where
a Supreme Being personally affirmed that he was in fact the designer of the
universe would be one such source of evidence. A bit hard to repeat the experiment,
though. Revelatory personal contact with God would be confirmation for a single
person (though there would always be the possibility that you were suffering
from mental illness) but that also can't be shared or repeated.
What external,
repeated evidence could there be? What would ID predict? God or his agents could
appear physically in the world and cause events to happen for which there could
be no other explanation save divine power, where we widely agreed that we had
witnessed such events, or had repeatable confirmation via video or other recording
devices that such events had happened. God could put a genetic watermark into
the DNA of all living things that spelled out "Organism by God" in
English. Equally unmistakable signs--and we're not talking a Rorschach-blot
picture of a weeping Jesus on a tree stump here--would be enough. We could
probably list them predictively with some reasonable precision.
What would not
suffice, as many have noted, is a demonstration that our current theories cannot
explain some aspect of observable reality. That proves nothing about an intelligent
designer. And as many have also noted, even if one conceded most of the ID arguments
of this kind, they would tell you nothing about the identity of the intelligent
designer--it could just as easily be Yog-Sothoth or a mad scientist from
Dimension X as it could be God.
The thing that
is puzzling in a way is why most Christians would bother with intelligent design.
Modern Christianity, even high rationalist Catholicism, acknowledges the special
role of faith. Who is intelligent design intended for? A Christian who needs
such an exotic, Rube-Goldberg crutch to achieve faith is very possibly a person
whose belief in God is already on the verge of collapsing, unless they're
a strict "blind watchmaker" deist. And yet, if this were the point
of ID, I personally would have no real problem with it. Carl Sagan, Richard
Dawkins and various atheists who have made a point out of confronting and pursuing
religious people have typically misunderstood three things: first, the social,
cultural and psychological generativity and productivity of religious beliefs;
second, the conditional rationality of many of them (e.g., they're reasonable
readings or interpretations of certain events or phenomena but only in the absence
of additional information); third, the degree to which it is completely rational
(indeed, scientific) to be skeptical about the authority of science and scientists,
especially when that authority is marshalled behind the making of public policy.
If any individual
needs ID to bolster his personal faith, that's fine with me. If believers
want to share ID among themselves, then that too is fine, but considered purely
as a matter of intellectual and social history, that says something interesting
and perhaps even comforting about the ascendancy of scientific reason as the
dominant spirit of our age, that the Christian faithful would feel the need to translate
their faith into the terms and norms of pseudo-science in order to legitimate
it among themselves.
This is not what
the struggle over intelligent design is about, however. Its proponents do not
use it as a private foundation for their faith, or a shared liturgy. They want
it to stand equally with evolution within the architecture of public reason.
This is where the opponents of ID draw the line, and rightfully so, because
what it reveals is that ID is a highly intentional mindfuck of the first order.
It's not intended for the faithful, and it's not based on credible
evidence. It's intended for those who do not believe in God. It is a tool
of subversion intended to produce conversion. It is a Trojan Horse attempt to
speak in the language of the non-Christian Other while actively trying to obfuscate
and sabotage that language. It is dishonest. This is why Brian
Leiter and many others are perfectly right to react with such intensity
to ID, because it is often a quite conscious attempt to pollute and despoil
the utility and value of scientific thought and illicitly limit its domains
within the social and intellectual life of the nation.
Christians have as much right as anyone to persuasively address their fellow citizens on behalf of their own cultural and social projects. However, participating in public reason in a democratic society obligates us all to honesty, to placing all our cards on the table. I will gladly hear a Christian try to persuade me to their faith, even if they talk in terms of arguments for intelligent design, as long as it is a two-way conversation, transparent to the public sphere. Trying to convert me by monkey-wrenching the general productivity of my everyday working epistemologies is a different matter, however. I react to that about as well as I would react to someone slashing the tires on my car.
Readings and Rereadings #2
Holly Elizabeth Hanson, Landed Obligation: The Practice of Power in Buganda
March 28, 2004
Category Error
Sasha
Issenberg's typically entertaining, elegant critique of David Brooks
(I get to call it typical because I graded his confident, intelligent and meticulous
writing quite often when he was a student here) is already getting some justified
and mostly
positive attention from webloggers.
One of the best
features of Issenberg's article is his coverage of Brooks's reaction
to the piece. Issenberg finds that most of Brooks's characterizations of
red and blue America, or anything else, are based on stereotypes rather than
reportage, inference rather than data, cliches rather than research. Brooks
protests that this is a pedantic objection, that Issenberg doesn't get
the joke, or is being too literal. (And tosses in to boot a condescending jab
about whether this is how Issenberg wants to start his career.)
I think Brooks
would have a point were he another kind of writer, or inclined to claim a different
kind of authority for his work. If Bobos hadn't been sold and framed
as a work of popular sociology, but instead merely as witty, personal social
observation in the style of Will Rogers or Roy Blount Jr.--basically as a
folklorist of the professional classes--then Brooks would be entirely justified
in putting on his best Foghorn Leghorn voice and saying, "It's a joke,
son."
But as Issenberg
observes, that's not the weight class Brooks tries to fight in. He wants
to be the latest in a line of popular sociologists diagnosing the American condition.
Issenberg perhaps overstresses the untarnished respectability of that lineage:
there are a few quacks, cranks and lightweights scattered in there. Is Brooks
the latest such lightweight? I think Issenberg makes a good case that he is.
This is not to
say that there isn't some truth in what Brooks has to say, but the odd
thing is that the truthfulness of his writing has to do less with how we now
live than how we think about how we live (and more, how others live). It's
not that you can't buy a $20 meal in Franklin County, but that the professional
classes of Blue America think that you can't. Red and Blue America works
not just because it's backed by some sociological data (not collected by
Brooks) but because once named, we all recognized its stereotypes and their
correspondence to mostly private, mostly interior kinds of social discourses
in contemporary American life--a point Issenberg makes astutely in his article.
When professional middle-class urbanites talk amongst themselves about gun control--which
they mostly favor--they often lard up their conversations with references
to gun racks on pickup trucks and other visions of the rural Other, and it works
in the other direction too.
If Brooks can ask Issenberg if this is how he wants to start a career (seems a pretty good start to me), then I think Issenberg and others are justified in asking Brooks if this is how he wants to sustain one, by feeding us back our stereotypes and acting as if he has accomplished something simply because many of us nod in recognition. If Brooks would like to move beyond that to showing us something we already--and often incorrectly--think we know about ourselves and our fellow Americans, he'll probably have to get serious about those trips to look for $20 dinners. Or he can hang up the sociologist's hat and settle into the role of observer and witticist--but even there, we often most treasure and remember the observers who do something more than hold up a mirror to our confirmed prejudices.
Middle-Earth Online: A Prediction
I was just joining in a group-whine in one discussion forum about the failure of massively-multiplayer persistent-world computer games, and we were commenting in particular on how freakishly bad the initial experience of gameplay is in most of them.
MMOGs, almost ALL
of them, go out of their way, almost by design, to make the initial experience
of a player as boring and horrible as possible.
Which doesn't fit the ur-narrative of the "level up" heroic fantasy, if you think about it. In the ur-narrative, the protagonist begins his or her heroic career usually in the middle of a contented or at least static life (Frodo Baggins, Luke Skywalker), but the hero's journey doesn't start with ten hours of killing household pests. It starts with a bang: with tension and high stakes, with ringwraiths and stormtroopers. If heroic fantasy were written to match an MMOG, nobody would ever get to Chapter Two.
So I thought about that a bit more. Since there is going to be an MMOG based on Tolkien's Middle Earth, I wondered a bit what the novel Lord of the Rings would look like if it were based on a Middle-Earth-themed MMOG. Here's what I came up with:
Continue reading "Middle-Earth Online: A Prediction"
March 23, 2004
You Don't Know What You've Got Till It's Gone
Invisible Adjunct is closing her blog and giving up her search for academic employment.
There are two things I'm sure of. The first thing is that this is pretty solid evidence that academia collectively misses the boat sometimes when it comes to hiring the best and the brightest, or uses profoundly self-wounding criteria to filter between the elect and the discarded. I think anyone who read Invisible Adjunct's site could see unambiguous evidence that this was a person who had a productive, committed and deeply insightful understanding of what academic life is and what it could be, the kind of understanding I take to be intrinsically connected to effective teaching. There was also ample evidence of a fine scholarly mind on display, combining passion for her subjects of expertise with precision of knowledge.
The second thing is the collective value of a good weblog. Invisible Adjunct's site is what made me excited about doing this for myself, and connected me to people who shared a moderate, proportionate, and reasonable critical perspective on academia--something that is hard to find anywhere, in the world of weblogs or anywhere else. I don't think there is anything else that even comes close to serving that function, and it is clear that it was possible not just because of the domain name or the topics, but because of the rich table of information and useful provocation that the host so regularly set and the tone of moderated principle she struck day after day.
March 23, 2004
Waiting for Menchu
Not for the first
time, a reported campus hate crime has
turned out to be a hoax, this time at Claremont. A part-time instructor
reported that her car had been vandalized and hate slogans scrawled on it, sparking
campus-wide efforts to confront racism at Claremont. It now appears likely that
the instructor did it herself.
"Ah-hah!" say many
critics of
academic life and campus identity politics. This just proves that hate crimes
on campus are exaggerated and the culture of victimization has run rampant.
"Nothing to see here, move along," say those
who remain deeply concerned about questions of racism and discrimination within
higher education.
This exchange reminds
me in many ways of the debate over the fabrications of Rigoberta Menchu. For
many of the combatants, that affair became a battle first and only briefly about
Menchu herself and Guatemala, and more potently about the ulterior motives of
her defenders or her critics. For the critics, it was evidence of the conscious,
premeditated and instrumental lies of the academic left; for the defenders,
it was evidence of the lurking malevolence of a conspiratorial right and the
need to maintain solidarity in the face of this threat.
There were more
than a few people who also threaded the needle in between in some manner, most
prominently David Stoll, who revealed Menchu's prevarications. What struck
me most powerfully was that Menchu's real story, had it been written in
her autobiography, would still have been interesting and valid and important
and reasonable testimony to the struggles of Guatemalans under military rule.
The question for me was, "Why did she, with assistance from interlocutors,
refashion herself into the most abject and maximally oppressed subject that
she could?" The answer to that question, the fault of that untruth, lies
not so much in Menchu but in her intended audience.
Here I think the
academic left, that portion of it most invested in identity politics (which
is not the whole or necessarily even the majority of the academic left), takes
it on the chin. Menchu is what some of them most wanted, a speaking subaltern.
You build a syllabus
and go looking: is there any text, any material, that will let you say, "This
is what illiterate peasant women in this place think. This is what ordinary
migrant laborers in 1940s South Africa thought. This is what serfs in medieval
Central Europe thought. This is what slaves in classical Greece thought."
You know those people existed and presume they had thoughts, feelings, sentiments.
You want those thoughts written in teachable, usable, knowable form.
You want what people
in my field call "the African voice." If you don't have it in
the syllabus, in your talk, in your paper, in your book, somebody's going
to get up in the audience and say, "Where is the authentic African voice?"
and mutter dire imprecations when you say, "I don't have it. I can't
find it. It doesn't exist." You may quote or mention or study an African,
or many, but if they're middle-class, or Westernized, or literate,
or working for the colonial state, somebody's going to tell you that's
not enough. The light of old anthropological quests for the pure untouched native
is going to shine through the tissue paper of more contemporary theory. You
may move into more troubled waters if you say, as you ought, "I don't
need it and there isn't any such thing. There's just Africans and
Europeans and anybody else: everything that anyone has ever written or had written
down about them is grist for my mill. A thousand voices, no Voice."
Some people wanted
Rigoberta Menchu. They wanted La Maxima Autentica, the most subalterny subaltern
ever. They bought her book, taught her book, willed her into being. She fit.
I don't blame Menchu for giving an audience its desire, and I don't
really blame the audience for that desire either. It's not the highly conscious,
totally instrumental, connivingly ideological scheme that some on the right
made it out to be. It's a needy hypothesis gone deep into the intellectual
preconscious, a torment over knowledge unknowable. Somewhere there probably
is a peasant woman who lived Menchu's fictional life, more or less. We
don't have her book, her words, and probably if we did or could, they'd
be more morally complex, more empirically ambivalent, more reflecting the lived
contours of an actuality (suffering included) than the searingly unambiguous
j'accuse that some visions of the world require.
This is all similar
to when someone fabricates a hate crime on a campus, or exaggerates the modestly
offensive stupidity of a drunken student into the raving malevolence of Bull
Connor. There is an overdetermination here, an always-already knowledge, a neediness.
Of course some people are going to fabricate such crimes, following the logic
of a moral panic, a deep prior narrative, a chronicle of a deed foretold. Everyone
knows such crimes exist--and of course (this is important) they
do. But they are presumed to exist more than they exist, they are needed
to exist more than they exist, because our received narratives of racial and
sexual injustice tell us that institutional and cultural racism is the iceberg
below the sea, an iceberg signaled by the visible tip of extraordinary hate
crimes. Crime has an intentionality that is tangible, a concretization: from
it we infer the concrete intentionality of what is hidden from view.
So campuses mobilize
at every blackface, at every act of minor vandalism, at every hostile word or
mysterious epithet. The sign is given!
But no one knows
how to deal with subtle, pervasive forms of discrimination, and that's
partly because the discourses we have available to us about fighting discrimination
hold that it is equally bad regardless of its form or nature, that the harm
suffered by being misrepresented, slighted, overlooked, denigrated, condescended
to is one part of a seamless and unitary phenomenon that includes being lynched
and put in the back of the bus. And they are connected. They are part of a connected
history, but they are not the same. History contains continuity and rupture
both.
The gleeful critics
of campus politics roll their eyes at this equivalence and take it as evidence
of the triviality of the academic left. I agree with conservative critics that
it's a mistake to stress the continuities between the brutalities of Jim
Crow and the subtleties of unconscious stereotype and subtle exclusion in present
practice, but this is not to say that the latter is non-harmful, or just something
to shrug off. One thing I learned by being a white man living in a black country
is that it is an incredible psychic drain day after day to know that you are
marked as a stranger, as socially different, by mere fact of your physiognomy.
It exacts a real toll on you, and every subtle thing that people do to remind
you of it, without any malice, digs the psychic claws deeper and deeper.
This innocent wounding,
this cumulative stigma, is the core of the problem. Many look for, expect or
anticipate hate crimes on campus as the visible signs of a pervasive malevolence,
an illegitimate system of holding power, as an indication of a willfulness and
agency that is the illegitimate and contestable cause of the sense of alienation
and unease that some students, some faculty, some people have within white-majority
campuses. Those crimes come less often than predicted, and when they come, they
mostly don't seem to be the acts of Simon Legree's spiritual descendants,
deliberate acts rich in the intentionality of power, but accidents and oversights,
misspeech and crudity. Some see in these incidents the secret of racial conspiracy
revealed, rather like Eddie Murphy's brilliant sketch on Saturday Night
Live where, disguised as a white man, his character finds that white people
give each other free money and privilege once racial minorities are out of sight.
They overdeterminedly read a synecdoche, a single moment that contains a hidden
whole. And when the right number and type of crimes do not come, some make them
come, certain that even if the incident is false, the deeper truth is not.
Rigoberta Menchu's real story is still interesting and powerful: a woman with some education, some status, some resources, some agency, in confrontation with a state and a social order, witness to terror and suffering. Its ambiguities are what could teach us, not its stridency. If we want to confront racial alienation on campuses, we will equally have to embrace its ambiguities, its subtleties, and recognize that it cannot be easily marched against, confronted, protested, forbidden by statute or code, expelled. It is in us, it is us, and the world has changed in the time we have all come into being and found ourselves where we do. It is not dogs and firehoses now, but small words and the pain of a thousand pinpricks. Until that is fully understood, there will be occasions where stressed, needy people tired of waiting for Godot try to summon into being the spirit whose ubiquity they have too busily prophesied.
March 23, 2004
Via Pandagon, evidence that whatever was funny or smart about Dennis Miller has evaporated into dust and blown away. And I regret it, because I do think he was both funny and smart once upon a time. This judgement has nothing to do with ideology. I am perfectly prepared to credit and like some of the transformations in his own politics he's talked about in the media, presuming they're for real and not just somebody trying to make a career move; some of what he talks about resonates with me. But this is as shameful a meltdown as anything Dan Rather or anyone else has ever had on live media. Miller likes to talk as if he's got cojones: well, anybody with real balls would get up the night after pulling this kind of stuff and apologize unreservedly to his rapidly shrinking audience and to his guest.
Been playing the full version of Unreal Tournament 2004 the last few nights for about an hour or so each night (more than that and I feel like my eyeballs are bleeding). It's really good, at least the Onslaught mini-game, which is clearly influenced by Halo. What's nice is that though I haven't played an FPS for two years or so, I'm actually not a complete noob at it--I'm doing pretty well. It seems to me that multiplayer games like this only have a short "golden age", though, before cheats and hacks become widespread and cheeseball tactics take hold. Onslaught is pretty well designed to prevent some of the cheesiest tactics, like tank rushes, but I can already see a few stunts that could spoil the fun if lots of people start to pull them.
Speaking of Unreal Tournament, the Penny Arcade guys have come up with one of the funniest and most spot-on summaries of the online world in general with this cartoon.
March 22, 2004
Beyond the Five-Paragraph Essay
When I hand back
analytic essays, I try to leave room to do a collective post-mortem and talk
about common problems or challenges that appeared in a number of essays. I think
it helps a lot to know that the comment you got is a comment that other people
got, and also to know how some people dealt more successfully with the same
issue. All anonymous, of course, and following my
Paula-like nature, nothing especially brutal in terms of the actual grades
dispensed.
I usually base my comments on some scrawled meta-notes I keep as I work through each batch of essays. Sometimes there are unique problems that arise in relation to a particular essay question, which is sometimes a consequence of my having given enough rope for certain students to hang themselves in the phrasing of the question. Often there are problems I've seen before and commented upon.
Read the Rest of "Beyond the Five-Paragraph Essay"
March 16, 2004
Readings and Rereadings #1
I've been meaning to use this blog to compel myself to tackle the backlog of 50 or so books of various kinds that are sitting on my "to be read" shelves. So here I go. What I plan to do in this part of the blog is put short or long reactions to some or all of a book--not to do formal "book reviews". Some of these may amount to no more than a paragraph. If I can stick to it, I hope to tackle 2-3 books a week this way most weeks.
So here goes with number one: Noam Chomsky, Hegemony or Survival.
March 16, 2004
Anybody else like Tripping the Rift on the Sci-Fi channel? It's sort of like "Quark" meets "South Park". Obscene, a bit stupid at times, tries too hard, but still funny.
I feel a fragging coming on: the demo for Unreal Tournament 2004 is kind of fun, especially the "Onslaught" game. Been a while since I've done this kind of thing: I may actually get the full edition.
This is a mirror of a really fascinating archive of photos from the region around Chernobyl.
March 15, 2004
Terrorist Tipping Points
Following up more prosaically on my thoughts about the hard men, the atrocity of March 11th makes me think again about what moves below the surface in the conflict with terrorism.
Somebody put those bombs on those trains in Spain, and yet that same somebody doesn't wish to stand forward and be clearly identified, or tie these acts to some concrete goal or demand. So someone someplace has a model of causality in their head, that to do this thing without any clear public explanation will still somehow produce a result they deem desirable. But what? A general climate of fear? An unbalanced response by governments? A sick morale-booster for terrorists embattled elsewhere? A victory for the Spanish opposition? Or nothing more than a nihilistic desire to act somehow, with no real conceptual belief about what action will accomplish? Particularly if it turns out to be ETA that was responsible for March 11th (something that is appearing increasingly unlikely), that last is about the only plausible interpretation.
What March 11th really demonstrated, however, is that any time a small group of people decides to do something like this in the United States or Western Europe, they probably can. Given the degree to which Americans have portrayed al-Qaeda as boundlessly blood-thirsty and completely unprincipled, the question of the day is thus not "What will they do next?" but "Why haven't they done more?" The answers, I think, are uncomfortable.
First, the strength of US reaction to 9/11, particularly in Afghanistan, when we were still focused on al-Qaeda and international terrorism rather than the Administration's unhealthy obsession with Saddam Hussein, communicated something very valuable and important, that major terrorist attacks would have major consequences. Osama bin Laden and his lieutenants may have reckoned that 9/11 would result in the lobbing of a few more cruise missiles at deserted camps and innocuous factories in Sudan. Having seen that this was incorrect, having suffered severe damage to their movement's fortunes, they and others may hesitate to act again against civilians within the domestic borders of the United States for fear of even graver consequences. On the other hand, this is where March 11th is a sobering reminder--because it may demonstrate that a terrorist movement which has nothing left to lose has no more fear of consequences. The worst atrocities might come paradoxically when a terrorist organization is closest to being defeated.
Second, for all of my anger at aspects of the Bush Administration's homeland security initiatives, I still have to concede that many of the precautions taken and the investigative work completed have made it more difficult for existing terrorist cells in the United States to act. It is easy to be cynical about all the orange alerts, not the least because the Administration has been so willing to use security concerns to bolster its own narrow partisan fortunes (not something a genuine War President ought to do), but even Administration critics have to concede the very real possibility that the alerts and accompanying measures have prevented one or more attacks.
But that still leaves us with one additional consideration, which is the possibility that existing terrorist cells capable of acting have chosen not to act. This is what is so difficult to calculate. Everyone is looking at the political results in Spain and asking, "Is that what the terrorists wanted? Will that reward them?" Precisely because we have to treat terrorists as people with their own agency, making their own moral and political choices, we have to consider the possibility that they might refrain from attacking for any number of reasons, even including, impossible as it seems, their own contradictory and hellishly incoherent form of moral scruples.
This is a critical issue. Even in the best case scenario, we have to assume that there are still people at large in the United States and Western Europe who could stage terrorist attacks. Anybody who devotes even a small amount of time to thinking of plausible targets knows that not only is there a huge surplus of such targets, there must always be so in democratic societies. The train attacks in Spain could easily have happened on Amtrak: in the past ten months, I've sat on Amtrak trains where people in my car have left a backpack on a seat and gone to the bathroom or club car, or so I've assumed. If they were leaving a bomb instead, how could any of us tell? Trains only scratch the surface: a hundred ghastly scenarios spring to mind. Without any effort, I can think of ten things that a handful of suicide bombers could do in the US or Western Europe that would have devastating psychological and possibly even economic consequences at the national and international level.
If there are terrorist cells in the US and Western Europe capable of acting, and they have not acted, we can perhaps console ourselves that Afghanistan taught them to fear the consequences. We can also imagine perhaps that they are intimidated by security precautions, unimaginative in their choice of targets, or incompetent in their logistics. More than that, this all raises the question: what do they want, how do they imagine they will get it, and how does that dictate their actions? For all that it is soothingly simple to imagine them to be mindless killers who would commit any atrocity, we nevertheless face the complicated fact that they likely could have already committed atrocities beyond those already inflicted. What internal calculus tips a small group of men over to the commission of horror? There is no invasion force that can keep that tripwire permanently still: there is nothing to invade. The worst dilemma, however, is that we do not know and perhaps cannot know what the terms of that calculus are, whether it moves to action because of rigidity and repression or in the absence of it, whether it seeks anything concrete in terms of results or reactions. If it only seeks pure destruction and the maximization of pain, then I don't really understand why there have not been more attacks already. There must be more to it than that.
March 10, 2004
Triumph of the Will, or in the name of my father
Because one of the major themes of the book I'm writing now is the nature of human agency in historical processes, I've been thinking a lot about whether some individuals are able to act in the world through drawing on unpredictable determination or mysterious inner strength, through a ferocious desire to make things happen. Through will.
Will gives me a thrill. If there's anything in President Bush's defense of his post-9/11 strategy that resonates in me, it is the invocation of will, of a steely determination to stay the course.
I know I'm weak and frightened. I've always been. When I am traveling or working in southern Africa, I preemptively flinch at even the slightest hint of tension. In my first stay in Zimbabwe in 1990, when a policeman politely but quite earnestly commented that he would have to shoot me if I didn't stop walking while the president's motorcade went past and then meaningfully swiveled his gun towards me, I waited frozen and then returned to my apartment instead of proceeding on to the archives. I crawled inside like a rabbit frightened by predators, emerging only with the next day.
I don't mean to overstate. I have willingly gotten into strange and sometimes threatening situations every time I have spent time in Africa. Not with fearless bravado, rather with a kind of sweet and stupid cheerfulness, a determination not to listen to the warning bells going off in the back of my head. I listen to my anthropologist friends who programmatically seek out opportunities to attend unnerving religious rituals and tense, near-riotous political situations and I wonder wistfully why I'm so scared and they're so brave.
I know that if it came to it, I'd piss my pants in a minute. Big Brother wouldn't need a cage full of rats on my face in Room 101 to get me to betray my deepest commitments.
I found that out when I traveled with my father in South Africa. When we were confronted with a rather trivial example of a shakedown by a corrupt official in a game park, I was ready to unload my rands on the man in a country minute, just because he had a knife and a walkie-talkie (and, I imagined, a bunch of tsotsi pals waiting down the trail to ambush us). But Dad just stared him down, and the guy caved.
Yet here I am willing, perpetually willing, to talk about what we ought to do in a world where people want to kill us, want to kill me. What good am I?
There's more than one flavor of will in the world, though, and all of them can make things happen that would not otherwise happen.
There's a pure will to violence and survival that's a highly masculinized combination of sadomasochism and swagger. We mostly see it in our fictions, in Rocky films or in the umpteen thousandth time that Wolverine staggers through a comic book stoically bearing the pain of a hundred knife thrusts to his abdomen, but it really exists. The trick is not minding that it hurts. Mostly in the real world this amounts to nothing: lacking mutant powers or cinematic magic, the man of a thousand wounds usually staggers towards death, perhaps performing some small miracle of salvation or destruction on the way. Sometimes it is more, a person who shrugs off pain and fear to stagger through to some better day.
This kind of will is related to but not identical to the soldier's will, the will to fight when necessary or ordered, the will to act remorselessly if need be, to defend what is yours and take what you must. My father had some of that. When a crazy man with a gun killed people at another branch of his law firm, Dad wished he'd been there, believing that he could have stayed calm under fire and stopped the man before anyone died. Dad used to tell me how the Marines taught him to kill or disable someone by striking their windpipe hard. I don't think any of this was bravado, or something he was proud of. They were quiet facts, stated calmly, based on a belief that if it came to it, he could do what was needed without pause or regret. I believed him.
The soldier's will is not the will of the hard man. The hard man is the man who haunts our nightmares. The hard man is the man who disproves the easy, lazy adage that violence never solves anything or causes anything meaningful to happen. The hard man can drive history like a whipmaster drives a horse, frothing, eyes-rolling, galloping heedlessly ahead. The hard man dreams not of the world he desires: his will is fire, and burns down thoughts of better days. The hard man only knows what he does not want and cannot accept, and his determination to strike out against the object of his fury is mighty. The hard man bombs pubs and buildings and planes; he cuts ears off defeated rivals, hands off innocent children, heads off journalists.
When we think of will, the hard man is the one we both fear and yet sometimes secretly desire. He laughs contemptuously at the doubts that afflict us, sure that he floats above us like an iron balloon, unyielding and untouched. We forget too easily why fascism authentically, legitimately attracted many before 1939: not just the purity of its conception of nation, not just its focus on essence, but also the hardness and clarity of its commitment to transformation, its baptismal yearnings.
The hard man's close cousin is the fierce dreamer, the obdurate idealist, the person who looks at today and can only see the ways in which it is not some ideal tomorrow. I may be too quick to accuse some of utopianism--that will require some reflection--but I do not think I am wrong to fear the utopian's will and regard with suspicion anything redolent of it.
None of these are the will to do the right thing even if all the world says otherwise. To do the right thing, but not quickly, not eagerly, not with braying certainty. The will to do the right thing comes from men and women bound by honor, directed by wisdom, burdened by a mournful understanding of their duty. Atticus Finch does not rush ahead, beating his chest and howling a war cry. Will Kane seeks allies and the support of his community, even when he wearily understands that he is all alone. There is no eagerness in him. The lonesome righteous can make horrible mistakes, auto-imprisoning himself in obligations, like Captain Vere in Billy Budd. He or she can end up staring with melancholy regret at his dirty hands. This is the kind of will I most admire, the kind of courage which stealthily rises to lift the whole world on its shoulders and reluctantly hurl it into a new orbit. Against the hard man, we raise the quiet man as his opposite.
Dad may have had the resolve of a soldier, but he also had this kind of determination as well. He would have stayed the course even if he was the last person left to hold the rudder. There was a rawness to his integrity: it was like sandpaper, flaying the sensitive nerve-endings of some around him. It was uncompromising both when it ought to have been and sometimes perhaps when it would have been better to bend rather than break. Nor was he tested as sorely as some have been: he never had to risk his own career, his livelihood, his future the way that some whistleblowers have. I think he would have, though, if it had ever come to it.
This is the will I long for now, and it's not what we're getting. Oh, they'd like to have us think so, but the lonesome righteous doesn't scorn allies, doesn't rush to the last stand at the OK Corral. He does his best to avoid the fatal breach in the social order. He doesn't talk tough and swagger.
I'd trust in Atticus Finch, not Napoleon. I'd trust in Omar Bradley, not George Patton. I won't trust the hard men or the men who playact at being hard men, those who theatrically declare they will be stopped by nothing. I won't listen to the men who shake their heads sadly at our inability to travel all the way to atrocity, who tell us we must act by any means necessary. But neither will I trust those who lack the will to justice, the will to fight if they must, the will to defend, those who snidely declare in advance that they will blow with the least wind and worry more about their own personal purity than the larger obligations of our times. I may be weak and frightened, but I'm not having any of that. I'll trust in the people who both love and defend; I'll trust in the will of the fierce and quiet. I'll listen for the distant echoes of my father's footsteps.
Battle of the Moms
Lots of recent online (and, I suspect, offline) discussion about Caitlin Flanagan's article in the Atlantic Monthly that criticizes working women and praises stay-at-home mothers.
At least some of the bad juju circulating in those discussions (and Flanagan's piece) concerns settling old scores within feminism. There are many who have never forgiven the feminists of the 1970s for the evident disdain they demonstrated towards middle-class women who remained in the home. With good reason: women who felt liberated from domesticity tended to falsely assume that all women should want the same. Just as a matter of politics, that mistake was costly, alienating many women who might have been sympathetic to a more loosely conceptualized feminism. The women's movement has never really recovered from that blunder, losing the sympathy both of middle-class women who have chosen domesticity and working-class women for whom the workplace is not liberation but brutal necessity.
Taking a step back, it's interesting that the conversation continues to pit two sets of women against each other, each vying for the title of best mother, each notably defensive about their own choices and lives while projecting the charge of defensiveness onto their opponents.
It's a struggle that's hard to imagine between men about fatherhood, for a lot of reasons. For one, there's a wider plurality of archetypes of the good father out there: men can get kudos for being domestic and attentive or being strong and above the fray, for being breadwinners or slackers. It's also clear that men don't fight about fatherhood because they don't feel defined by it: the battle over manhood is sited somewhere else. Women, on the other hand, can't escape motherhood even when they're not mothers: they are held accountable to it by society, and hold each other to it as well.
There are brush fires that burn in the struggle over parenting--say, for example, the question of whether or not to ferberize kids. (We tried it, and it didn't really work for us, both in terms of the emotional impact it had on us and our daughter, and in terms of results.) Then there's the wildfire of staying at home versus day care versus nannies. In either case, the small or the large, everyone involved would gain a lot of perspective by reading Ann Hulbert's Raising America, a history of advice aimed at American parents by various experts. One thing I take away from Hulbert's book is a confidence that kids are resilient, that the parental choices we treat as momentous have far less import than we might guess. Another thing I take away is a wisdom about how remarkably stable the long-term terms of contestation over parenting (permissive vs. strict, involved vs. distant) have been within the American middle-class, and how much those contests are about middle-class manners and self-presentation rather than a disinterested evaluation of the development of children.
One thing in Flanagan's piece and the reaction to it where I feel a bit distant from almost everyone in the debate has to do with Flanagan's charge that middle-class feminists are exploiting and thus betraying other women by using them as domestics and nannies. In a way, it's a silly point, because it's awfully hard to contain to domesticity. What's the difference between a once-a-month cleaning service and all the other kinds of service jobs that the middle-class makes use of? If the charge of exploitation attaches generically to domestic work (not to specific low-wage conditions of employment), then it attaches to all service-industry labor and Flanagan's critique is suddenly a lot less about child-raising and much more a back-door socialism.
But I feel differently about it also because I've spent a substantial amount of time living in southern Africa. During my first fieldwork in Zimbabwe, I was intensely phobic about domestic service, and felt, as Flanagan does, that it was exploitation. I'd read Maids and Madams, I knew that domestic workers in southern Africa were exploited. So I was determined to wash all my own clothes and clean my own apartment (there were no laundromats in Harare, even in the good old days of the late 1980s and early 1990s).
The family who lived in the small home behind my apartment building had a different opinion about domestic service, since they provided it for everyone else in the building. From their perspective, I was a selfish prick. I could pay to have my clothes cleaned, but here I was occupying a unit in the building and refusing to employ them. They weren't at all happy about it, and once I became aware of that, I really didn't know what to do. I went on washing my underwear in the bathtub but grew more and more puzzled about my reluctance to do what virtually everyone around me regarded as the right thing, including local leftists I knew whose commitment to fighting racial segregation and colonialism had been deep and abiding for the entirety of their lives.
I began to realize that it really wasn't about exploitation for me--that was just a superficial thing, a cheap ideology, a slogan, and not at all consistent with my casual willingness to take advantage of other people's affordable labor in other spheres of my life. What it boiled down to was that I was intensely uncomfortable about having strangers inside my domestic space. Not racially phobic, but generically, universally so. I didn't want any people seeing my dirty clothes, my books, my things, my way of life, if they weren't very close friends or family. I still feel that way, actually. For a very long time, I blocked my wife from hiring a once-a-month comprehensive cleaning service for this same reason, even though we were finding it increasingly impossible to handle that kind of cleaning with a toddler around. I just didn't want them seeing the normal material conditions of my life. (I still don't allow them in my home office). I was eventually convinced--and view that service like any other comfort in my life provided by human labor, made possible because I earn more than the people whose labor I purchase. I do it because I can. If I don't like it, that's for different reasons entirely.
I wonder a little if the stay-at-home-moms argument doesn't come from some of the same attempts to assert privacy, to cocoon some of our lives away from the world, to close the circle of family and shield ourselves from the world. I have some of that same attitude myself--but I'd like to avoid draping myself in laurel leaves and anointing myself Ace Exploitation-Fighter for having what is ultimately less a principle and more a phobia.
February 24, 2004
Purge the PIRGs
The discussion of Ralph Nader online has produced an interesting eddy in its wake, namely an equally passionate attack on Public Interest Research Groups (PIRGs), which Nader played a role in founding.
I don't actually map my feelings about PIRGs onto Nader, though their mutual connection doesn't help me get warm and fuzzy about either of them. In many ways, I object more fundamentally to PIRGs. They're a scam.
Like Jane Galt, I first reached that conclusion as a canvasser for a PIRG one summer in the early 1980s. I only lasted about three weeks before the toxic combination of viciously exploitative labor practices and a recognition of the organization's total lack of concern for political commitment completely alienated me. If you pounded the pavement all evening but came in just shy of quota, you didn't get paid at all for your work. The people running the canvassing operation had zero interest in the issues or the ideas: in manner and function they were little different from the boss of a sweatshop factory floor. Keep the money rolling in and send it along to the state organization: that was the sole priority. The spiel we were told to memorize was a frankly deceptive presentation of the organization and its activities. PIRGs have a long habit of parasitically attaching themselves to legislation and claiming credit for it--and only if they deem it something fuzzy and blandly liberal enough that it is likely to raise more money or garner good publicity. There's no coherent agenda beyond that, and never has been.
My antipathy deepened when a PIRG came rolling into town at Wesleyan University, when I was an undergraduate, seeking an automatic fee attached to the tuition bill. The whole presentation was slimy in both content and style. First, they dangled an internship in front of student officers, and then they shifted abruptly to left-baiting and bullying when anyone (a class of people that most definitely included me at that point) asked why on earth a PIRG should get an automated chunk of money every year when no other student group had the privilege--a chunk of money which would be completely non-transparently spent, moreover. As a magnanimous gesture, they finally offered a system where you could come and get a refund of your PIRG money if you were willing to show up at a basement office during a one-day window once every academic year and ask for it. This is all standard for PIRGs then and now: they push for mandatory fees, and accept an opt-out as a fall-back.
It's not just that PIRGs are sleazy in their fund-raising and opportunism. Reading Jane Galt's essay, I wonder a bit whether they haven't played a subtle but important role over two decades in disillusioning young liberals and leftists and driving them rightward as a result.
Based on my own experience and the experience of people close to me, I'd say that liberal nonprofits in general are usually not what they seem on the outside, or at least, rarely apply their outward convictions to internal matters. They often have unfair, exploitative or even discriminatory labor practices. They're often intensely hierarchical, non-democratic and non-transparent in their internal organization. But PIRGs are in a class of their own. At least with something like the ACLU or Amnesty International, whatever their internal cultures are like, they stand for something consistent politically and socially. PIRGs don't even have that.
February 23, 2004
The Old Man and the Flame
The inner flamer. It's such a temptation to let it loose. I feel like Black Bolt of the Inhumans: if I but speak, half the city could be destroyed.
In my salad days, I could crack off a mighty flame. Ah! In the Usenet days of alt.society.generation-x, when the resident objectivist could drive me to the dark side of the Force with a single post. Or rec.arts.startrek.current, when all it took to set me off was the resident feminist Voyager fan praising Captain Janeway and telling all the critics that they were misogynists for hating the show. Many Shubs and Zuuls knew what it was to be roasted in the depths of the Sloar that day, I can tell you.
These days, there's only one moment where I feel completely and gloriously justified in letting Mr. Hyde off his leash, and that's in conversations dealing with Ralph Nader and his defenders. Not at Nader himself, really, because it's obvious what his problem is. It's the people who still defend him and proudly announce that they voted for him in 2000 and they'll do it again who drive me out of my tree. They're a minuscule number of people overall, and not really that influential--but I suppose they could be just influential enough, which is very bad. As I said over at Chun the Unavoidable's, the incoherent mish-mash of justifications for voting Nader, as well as the complete shamelessness of those offering them, just brings out the worst in me.
I sometimes wonder why I can't flame more often, or when exactly it was that I developed a helpless compulsion to fairness. Maybe there's something to the notion that the older you get and the more comfortable and attached to responsibilities you become, the higher the cost of acting up, and the more you become a kept creature of the system. Maybe I've just become The Man.
Maybe. I'd like to think it's something more, that it is about taking the ethical responsibilities of my profession seriously--something that I feel the usual Punch-and-Judy responses of both right and left inside and outside of academia don't do, no matter how strenuously they claim to. More pressingly, it's about efficacy, about how you make your convictions meaningful and powerful in the world.
The flamer really has only a few roads to efficacy and influence. There's one in which he or she forces everyone to accept his or her view through command over institutional power (in which case the flame itself is no more than a calling card for other kinds of compulsion). There's another in which achieving results in the world doesn't matter, in which only the unsullied narcissistic purity of expression is the issue. I grant that the latter view sometimes produces beautiful prose--a brilliantly written flame, curse or diatribe is a pleasure to read. So thank god for the occasional narcissist, but only if they also happen to be brilliantly bilious stylists.
I suppose sometimes the flamer might hope to change the behavior or views of others through shame, and that's the one time I still think it's worth it to let the beast out (as I do against Nader voters): when only outrage and defiance have a hope of breaking through a wall of denial and stupidity. That's my only defense in that case: Nader voters appear so unpersuadable by any other means--in fact, so proud of their near-total invulnerability to any persuasion--that there's nothing left besides flinging feces at them. There are others on the American political landscape similarly cursed with mule-headedness, but I don't flame them, either because I don't understand or know them well enough to be sure of their unpersuadability (whereas I feel like I understand Nader voters very well) or because, frankly, they're powerful enough numerically or culturally that it's important to keep trying to push the boulder up the hill no matter how Sisyphean the task.
That's one other thing a flame can do: when your cause is lost and hopeless and yet you are still certain that you are right, a flame can be the last thing you do before defeat, a refusal to go gentle into that good night. In that case, a flame is an admission of fatal weakness and should be read as such. Perhaps that's me and Nader voters: I know nothing can stop them, so why the hell not scream at them, just to get my own catharsis.
Finally, the flamer can be a blackmailer who demands he or she be given what he or she wants or he or she will lay waste to any possibility of a reasonable exchange between equals. That's the Ann Coulter approach to the public sphere: I am become Flamer, Destroyer of Worlds.
Being addicted to mediation and fairness, to the exploration of complexity, is actually pretty exhausting. You get a lot of shit from everyone in all directions, and very little thanks for it. Some days I'd rather be an anarchic free spirit, rather go back to dancing in private glee after dropping the bomb on the weakest link, the most suspect argument, the most clueless fool, rather go back to being the hairy-eyebrowed bomb-thrower hurling discord from the back of the room. This other guy who usually comes out to play here and elsewhere in my professional life, well, he's not the guy I imagined I'd become. He's dour and perpetually disappointed in his own weaknesses and even more in those of other people. In one virtual community I have participated in, a departing member who took the time to spew venom on his way out said that I was a person who wanted to be superior to other people and liked by them because of it. I remember that because there's something to it. I suppose it's actually confirmation of its accuracy that I don't think it's all that terrible a thing to be. I also admit that a commitment to reasonable persuasiveness and unvarnished if polite truth-telling can often be a quite satisfyingly contrarian, dissenting, provocative thing in its own right.
Still, flaming seems a more glorious, free thing to be and do. It would be liberating to stop bothering to instruct, cajole, plead, work with, mediate and persuade, to worry about nothing but one's own blazing righteousness and care little for the consequences or the results. That's rather like voting for Nader. Which reminds me of why I really stopped flaming: because I saw again and again that when even a few people flame, the whole discursive house burns down.
On How to be a Tool
I just saw a call for a March 3rd rally against the Comcast-Disney merger led by PennPIRG, Media Tank, Prometheus Radio Project, the Communication Workers of America, and Jobs with Justice.
Joining this rally is about as good an example of being a tool as I can think of. Media monopolization is a real issue, but rushing to the barricades to defend Disney from Comcast is about the worst possible way to deal with the overall problem. Disney executives ought to come outside and give everyone at the rally $10.00 coupons to the Disney Store in thanks. The fact that PennPIRG is apparently the key organizer just reinforces my low opinion of the opportunistic and amateurish nature of PIRGs in general.
It's actually hard to know whom to sympathize with in the Comcast-Disney struggle. I've had a low opinion of Comcast's services for a while. Their technical management of their high-speed Internet service after Excite@Home went belly-up was horrible. The hysterically overwrought, manipulative drumbeat of attacks against satellite dishes on Comcast channels is a pretty good argument against media monopolization. Their current level of service in their "On Demand" offerings is beyond lame. It's no wonder they want to acquire Disney to provision content, because the content that they generate themselves is usually one bare step above the kinds of public-access channels that have recently released mental patients who've gone off their meds hosting talk shows. If Comcast succeeds, expect a whole lot of problems of integration between the two operations: the learning curve will be by far the steepest on the Comcast side.
On the other hand, if Disney shareholders can't see that Michael Eisner and his inner circle of sycophants are dragging the company down, they aren't paying attention and deserve to lose value on their investment as a result. Any parent with young children can see it: the company has foolishly surrendered the long-term stability of the high ground in children's media by relentlessly cannibalizing its own properties, spewing a tide of made-for-video junk that permanently degrades the value of its most lucrative properties. There are so many misfires coming out of Disney lately that it's a wonder there have been any successes like Pirates of the Caribbean at all. It used to be that you could see a real difference between Disney and the weak storytelling and cheaper animation of non-Disney kidvid, as in the work of Don Bluth. Now Disney has voluntarily sunk to the bottom in pursuit of a few quick bucks. Tack on to that Eisner's evident inability to attract, recognize and retain talent, almost certainly because of his own authoritarian management style, and you have a company that is heading for a serious crisis. If I owned a lot of stock in Disney, I'd sure want to give Eisner the boot, and if it took Comcast to do it, I might well cheer them on.
It probably isn't going to be a story that ends happily ever after for anyone, least of all the consumers--but in a world where there's a lot to protest (including media monopolization), being a puppet for Michael Eisner strikes me as a low priority.
Quicksilver and Foucault
I am finally almost done with Neal Stephenson's Quicksilver (just in time for the sequel!). Stephenson reminded me of why I find early modern Europe so interesting, but also of why the work of Michel Foucault was so appealing to me and to many other historians when we first encountered it.
It is easy to label postmodernism as a single agglomerated villain and attribute to it every bad thing in the past thirty years. It gets blamed (sometimes in one breath by the same person) for dragging intellectuals into total irrelevance and for accomplishing a devastatingly comprehensive subversion of Western civilization. In academic arguments, a generalized postmodernism often functions as an all-purpose boogeyman in declensionist narratives, the singular explanation for why the young turks aren't as good as the old farts. (Though that may be shifting: the genuinely ardent postmodernists are themselves becoming the old farts, and will presumably shortly be blaming something else for falling standards.)
This general posture allows people to get away with some appalling know-nothingism at times. When reading E.O. Wilson's Consilience, I was excited at first by his ambitions to achieve the "unification of knowledge," to re-create the practice of the Enlightenment, when science and philosophy, interpretation and empiricism, were joined together. Then I began to realize that Wilson meant "unification" roughly the same way that Hitler meant to unify the Sudetenland with Germany. Nowhere was this more evident than in his treatment of Foucault. Wilson basically admits that he just read a bit of Foucault's work, haphazardly, and thought, "Come on, get over it, things aren't so bad."
I say all this as someone who does often talk about an agglomerated postmodernism rather loosely, and who certainly views it quite critically. I reject almost all of the deeper ontological claims of most postmodernists and poststructuralists, and I find the epistemologies that many of them propose crippling, useless or pernicious. And yes, I think that a lot of them are bad writers, though let's leave that perennial favorite alone for once. But I still recognize the ontological challenge that postmodernism, broadly defined, offers as a very serious, substantial and rigorous one. Nor do I just brush off the epistemological challenges that postmodernists have laid out: they're real and they're important. (Though yes, at some point, I think it's perfectly fair to say, "Yeah, I get it, I get it," and move on to other things. You're not required to read and read and read.)
The thing I regret most about casual rejectionism of a loosely conceptualized postmodernism (or any body of theory) is that it seems to deny that it is possible to read a single work and extract some insight or inspiration from it that is not really what the author's full theory or argument is meant to lead you to. It's rather like one of the professors I encountered in graduate school, who would circle words or terms he didn't like and ominously ask, "Do you want to be tarred with that brush?" It's a theory of citation as contagion.
Taken in totality, I think Foucault is doing his damnedest to avoid being pinned down to any particular vision of praxis or anything that might be summarized as a "theory," in a way that can be terribly coy and frustrating. Inasmuch as he can be said to have an overall philosophy, I find it despairingly futilitarian and barren, and I accept very little of the overall vision. Taken instead as a body of inconsistent or contradictory suggestions, insights, and gestures, his work is fairly fertile for historians.
If nothing else, he opened up a whole range of new subjects for historical investigation from entirely new angles: institutions like prisons or medicine and their practices, forms of personhood and subjectivity, and sexuality. It's interesting that the historical work which Foucault inspired often ended up documenting that he was wrong on the actual details and often even the overall arguments, but even then, you can clearly see how generative his choices of subjects were.
What Foucault does most for me comes from his attempt to write genealogies instead of histories, his attempt to escape forcing the past into the role of necessary precursor to the present, to break the iron chain and let the past be itself. That's what brings me back to Stephenson's Quicksilver and early modern Europe in general.
The temptation is so powerful to understand early modern Europe as the root of what we are now, and everything within it as the embryonic present, all its organs already there, waiting to grow and be born. But what I find so dizzying and seductive about the period is also its intimate unfamiliarity, its comfortable strangeness. I don't feel as epistemologically and morally burdened by alterity as I do when I'm dealing with precolonial African societies, where there's so much groundwork seemingly required to gain the same sense of interior perspective, but on the other hand, I always feel that around every corner in early modern European societies the familiar makes itself strange right before my eyes. The genesis of the present but also the possibilities of other histories; the world we have inherited but also all its doppelgangers and ghosts.
That's what I feel Foucault's idea of genealogies helped me to explore and understand, and what I think Stephenson manages to deliver in Quicksilver. The thrill of natural philosophy unbinding the world, so much a part of the more whiggish history of science, is there, but so is its distance. The Royal Society are ur-geeks and proto-technophiles, and yet they're also aliens. Jack Shaftoe is the libertarian dream, the free man cutting loose of the constricted world around him--but he's also the passive, drifting inhabitant of a commercial and social landscape unlike anything we know today, to whom events happen, recapitulating the narrative structure of the picaresque. Reading Quicksilver is like wearing bifocals: you can switch in and out of being able to locate yourself within its episteme. I'm not entirely sure it's a good modern novel, really, nor is it good history--but it is a good genealogy as well as a genealogical simulation of the narrative roots of the novel form.
This isn't a pleasure limited to representations of the early modern world: Jeff Vandermeer's edited anthology of pseudo-Victorian/Edwardian medical prose, The Thackery T. Lambshead Pocket Guide to Eccentric and Discredited Diseases, delivers some of the same satisfactions through simulation (rather as Philadelphia's Mütter Museum does by simply showing you the medical artifacts and exhibitionary vision of the same era). But simulations or explorations of the Victorian usually feel much more like recursions of the present than trips to a fever-dream alternative universe. Quicksilver, like Foucault, travels farther and tries harder to give us a way of representing the early modern European world that doesn't just make it into a toddler version of our own times.
And Now For Something Completely Different
Well, not quite--I see my colleague Prue Schran has joined the conversation about Swarthmore and speech. Actually, I quite agree with a lot of her observations--they're relevant to what I was writing about in "A Pox on Both Houses", as well as some older posts at Easily Distracted about the conservatives-in-academia question. But attitudes and formal speech policy are absolutely not the same thing, and if attitudes rather than policy are the issue, the lever that will move them really is subtle, sympathetic moral and intellectual suasion, or at least that's my feeling. Feeling restricted or ostracized by the pervasive attitudes or unspoken orthodoxies of colleagues is very different than being formally restrained by a quasi-legal code--though of course the existence of the former phenomenon is why it is hard to trust to any procedures outlined in formal policy.
There's also the more arcane issue of how to untangle policies on harassment and speech. I think FIRE is overly sanguine about how easy it is to disaggregate them, either legally or conceptually. Also, O'Connor offers some extra tricky arguments on top of that about the alleged legal invulnerability of academic institutions to federal employment law (is that really true? Where's the Volokh Conspiracy when you need it?) and the legal freedom of colleges and universities to restrict speech if they want, unless they otherwise claim that they're not restricting speech, in which case O'Connor sees them as open to legal claims of fraud. At that point my head spins a bit: if colleges have published speech codes or harassment policies which O'Connor and FIRE say clearly and unrestrainedly restrict freedom of speech, and O'Connor acknowledges that colleges and universities are legally free to do so, then by their reading, wouldn't a charge of fraud be legally untenable? Where's the fraud if you have a published speech code that restricts speech and you're legally free to do it? Unless, of course, the kind of thing I've been suggesting is true: that there is a reading of many college policies as also trying, authentically, to promise academic freedom, and that it is the authenticity of that intent which makes its contradiction by speech codes potentially fraudulent.
Maybe this is an indication that the only solid ground for challenging speech codes is a moral or ethical one--that we shouldn't have codes because they're wrong to have, because liberty is the bedrock value of academic life--and that we should leave the legal issues aside. That's certainly one of FIRE and O'Connor's most salient consistent observations: that whatever their merits or problems on paper, faculty- or administration-authored speech codes are basically a case of amateurs meddling in the construction of bodies of pseudo-law, hoping to direct the power of a quasi-state entity (their institution) to regulate local behavior.
Anyway, on to more diverting things. A couple days ago, my toddler and I found a great new thing at Noggin's website, called ScribbleVision. Basically, you color in a bunch of things and then ScribbleVision animates your colorings in a series of scenes featuring the hand-puppet characters Oobi and Grampu. It's one of those things that will very rapidly have the adults taking the mouse away from the kids. I was especially proud of my scene of Sauron's Lidless Eye dawning over Oobi's house, with a demonic rooster crowing in the background. Let's say that my impression of Oobi and Grampu's animated actions and expressions changed somewhat against that backdrop.
February 17, 2004
The Argument Clinic (Apologies to Monty Python)
There is a real difference between my reading and Erin O'Connor's reading of Swarthmore's policies on speech, one which may reflect some very deep differences in the ways we approach the interpretation of texts and much else as a whole.
There are also stylistic differences: I'm long-winded, obsessed with nuance and ambiguity, and uninterested in calling people to the barricades even when there is an evidently urgent need to get them there. O'Connor is trying to mobilize people, and to do so with as much immediacy and intensity as she can. On the whole, I think we agree about a lot of the problems facing academia, and in particular, about the dangers to speech rights in academia today. O'Connor's way of framing these issues is certainly much more powerful in getting people to acknowledge and confront those dangers. But I still worry about collateral damage on the way. Sometimes, I think complexity really is important, not just as an aesthetic preference but as the heart and soul of an issue. Perhaps on speech rights, what is more important is the root principle of the matter, and assertions of complexity are an unhelpful distraction. I would rather build bridges and mediate between opposing sides, playing for small positional gains. O'Connor would rather burn bridges and achieve victory in our time. You make the call, dear reader. There are reasons to prefer either approach, and reasons to think that in either case, we are kids with hammers who think everything in the world looks like a nail.
O'Connor raises some real potential problems with Swarthmore's policies, most of which we share broadly with all colleges, and indeed, with all institutional entities that have sexual harassment or anti-discrimination policies.
Here are three probing questions, all of them fairly cogent, that I take from O'Connor's second post on this subject:
1) How do we resolve contradictions in policies where one part says one thing and another part says another? Doesn't Swarthmore's sexual harassment policy cancel out, or make actively irrelevant, any statement anywhere else about protecting speech?
2) Isn't trusting in grievance procedures dangerous, given that they tend to violate due process concerns? Is there any reason to think that Swarthmore's procedures are any more protective of due process than most colleges'? Hasn't that already been a slippery slope elsewhere? Isn't Burke concerned about that?
3) What about this little section on discriminatory harassment? Doesn't that cancel out the general harassment policy? Can we talk about how to read those two in relation to one another?
I have a straightforward answer to the first question, which is that as I read it and understand it, our policy on non-harassing speech takes precedence over everything else; it is the largest and most expansive principle we assert on the issue of speech. Harassment (sexual, general, discriminatory) is only a situational, contextual exception to the general principle, and only becomes meaningful when it can be proven to exist according to a defined set of precise criteria. In this sense, harassment under Swarthmore's policy functions rather as defamation or incitement to violence does in relation to the First Amendment. The First Amendment is the bedrock principle; defamation or incitement are special cases which restrict speech only in relation to a judicial finding, and only within narrowly constrained and defined bounds. They exert no prior restraint: you cannot in advance define particular acts of speech, particular words, particular phrases as defamation or incitement. It's all about context. If you take Swarthmore's policies on harassment to completely cancel out or obviate the existence of a comprehensive protection of speech in our policy, as O'Connor does, then you are basically setting yourself up as a free speech absolutist in general, and arguing that any circumstantial restriction on speech annihilates a foundational protection for speech--that the existence of libel laws definitionally and intrinsically cancels out the First Amendment. You can make that case, and some do. I think it's incorrect. I'm not clear whether this is O'Connor's general position on speech rights.
I might also note that to take this position is to argue that Swarthmore (or any other college) can never actually articulate a policy sanctioning harassment that makes any reference to speech acts. I'd actually be curious to see whether O'Connor thinks it is notionally possible for a university to reserve the right to expel a student who personally harasses another student on a repeated basis but commits no direct violence against them. If one student followed another student around campus saying, "Faggot. Faggot. Faggot." continuously for a week, are there any legitimate grounds for saying, "Listen, that's a problem that goes beyond moral persuasion directed at the harasser"? If so, is there any way to construct a policy that legitimizes administrative action without making reference to speech? We went out of our way, at any rate, to avoid defining that speech as a class of speech like "hate speech" which would be definable without reference to context. In fact, it doesn't really matter what one community member says to another if there's a finding of general harassment here: the content of the speech is irrelevant. And if the content is irrelevant, I really think it's not a restriction on speech.
The exceptions are the sexual harassment and discriminatory harassment policies, and here I can only reiterate that I believe--I hope--our general protection of speech is firmly understood to be the bedrock principle that takes precedence over those policies.
On the second question, of whether the sexual harassment policy is a ticking time bomb or slippery slope, in particular because it is adjudicated through a grievance procedure which has no due process protections as they're commonly understood: well, that's a real point. It's my big problem with most such policies on college campuses, and the major place where they are typically and mischievously misused. O'Connor is right to say that I essentially trust my colleagues and my institution and trust that nothing will go wrong, but it's also fair to suggest that this is a flawed approach. I agree that we share in common with most academic institutions a serious problem that could well obliterate the best intentions of our policies. I would also underscore, as I did in my first post on this subject, that I regard "hostile environment" standards as intrinsically dangerous. (Though I suppose here too I wonder whether O'Connor thinks there is anything that would constitute sexual harassment besides quid pro quo, and how you could identify it in a policy without reference to speech acts.)
On the other hand, I think O'Connor simply shrugs off the question of legal exposure and liability--and easy as it would be for me to do so, I have enough empathy for those who have a legal responsibility to safeguard the resources and wealth of this institution to recognize that you can't have a revolution in one country on these issues. Barring a serious statutory reform of harassment law in general, it is insane for any single institution to voluntarily expose itself to serious liability by failing to conform to existing legal standards, whatever the weakness of those standards.
On the third question, I have to confess that I'm busily inquiring about just where the policy statement on discriminatory harassment came from. I remember the debate on the general harassment policy and the question of "hate speech," and how we came to the policy we have. I remember the same for the sexual harassment policy. But I'm honestly puzzled about this smaller statement and where it came from, particularly because it seems more pressingly contradictory to the statement on general harassment and speech rights.
I'd sum up by saying, however, that I really think O'Connor simply doesn't give Swarthmore enough credit for drafting a policy which is actually quite different from the campus norm, and which was actually intended to repudiate the idea of a speech code, with its prior restraint on defined classes of speech acts. I don't see the policy as a "trojan horse" with sinister conspirators inside, much less see myself as one of the Greeks waiting to pillage. As I read our existing policy, students here could hold all the affirmative action bake sales they like without any fear of sanction or judicial action by the college against them (though not without fear of being criticized for doing so). O'Connor chooses to portray me as a person who conveniently "pretends" otherwise. No, I just think it's more complicated than she lets on, that there is as much reason for optimism as there is for criticism, and that the devil--at least in this case--is in the details.
Thanks for Playing
Well, at least this time, Erin O'Connor has it really wrong.
Swarthmore has no speech code. The community specifically rejected having a speech code when we considered the issue. We specifically insisted that speech which might be regarded by some as offensive is non-adjudicable, and underscored that the college administration can bring no sanction against individuals for what they say, regardless of how offensive it might seem to others.
There is only one way that speech can be an adjudicable issue at Swarthmore, and that is if it is part of a repeated, persistent attempt by one individual to personally harass another individual. The standards for this are very precisely enunciated in our policy on general harassment. You cannot generically harass a social group or identity. There is no one-time instance of general harassment--a single statement cannot be taken by one individual to represent an act of general harassment by another individual directed at them: it must be persistent and repeated.
Our sexual harassment policy, from which O'Connor draws almost all of her quotes, was adopted at a different point from our general speech and harassment policy, and I agree that it has a few emphases which differ from the overall policy, in that it states that it is possible for a one-time action to represent a "hostile environment" against which someone might have a grievance. Three things are worth noting about this policy, however. First, the general speech policy supersedes it, as I understand things; i.e., the specific protections granted free speech are the most important policy dictates we have on this subject, and the sexual harassment policy does not contradict or contravene those protections. Second, the sexual harassment policy contains an important qualifier which O'Connor notably fails to cite: "The perception of conduct or expression as harassing does not necessarily constitute sexual harassment," and it goes on to state that every complaint must be carefully examined on its own merits. No statement or idea or expression is categorically identified, outside of the context of a specific complaint, as prohibited or punishable. A grievant is directed to ask a perceived harasser to stop, and if they do not do so, is given the option to pursue a grievance procedure--but there is no a priori finding that any given expression creates a hostile environment. Third, I would note that aspects of this policy take the form that they do in order to achieve compliance with existing federal law on sexual harassment: if there is an issue here, it is an issue whose locus is far beyond this campus.
This is not to say that I'm entirely comfortable with the content of this specific policy: I found it overly broad in several respects when the faculty voted on it, and I'm especially concerned about the ways a "hostile environment" standard can be and has been misused on college campuses--but it is specifically the hostile environment standard which federal law has legitimated. To expressly repudiate it in college policy is an invitation to a devastating liability claim against the college at some future date, because it would place the college at odds with a clear body of legal precedent. (When institutions or employers lose such lawsuits, it is often precisely on the grounds that they were made aware of a hostile environment and did nothing to correct it. Were we to state outright that we reject the idea that a hostile environment can actually exist, we'd be wide open to such a finding.)
Still, I have to stress again that the impression O'Connor gives about even this aspect of the sexual harassment policy is downright wrong, quite apart from her mischaracterization of it as an overall policy governing speech. A Swarthmore student or member of the faculty expressly cannot be punished merely for saying something that has the characteristics described in the sexual harassment policy--which is what O'Connor implies. There is nothing adjudicable unless there is a grievance from a specific grievant, and that grievance must meet the specific test of being harassment with specifically sexual intent. John Ashcroft couldn't file a grievance against Arthur Schlesinger under Swarthmore policy unless he thought Schlesinger was making a quid-pro-quo demand for sexual favors from Ashcroft or making Swarthmore a hostile working environment in a sexually demeaning way. (Since neither of them works here, the hostile environment standard wouldn't apply in any event.)
Let me quote from the Swarthmore College policy statement on uncivil or demeaning non-harassing speech, since O'Connor didn't see fit to share this with her readers (although speechcodes.org does reprint the policy in full):
"As a member of Swarthmore College, one's moral responsibilities extend beyond formally sanctionable conduct. All of us, therefore, have a responsibility not to indulge in gratuitous offensive expression just because it may not be subject to official sanctions. Anonymous offensive expression is generally inexcusable, but the risk of harm in making adjudicable all forms of offensive expression would not only outweigh the benefits of official proscription, it would also seriously endanger academic freedom."
"Even when individuals (or groups) admit authorship, however, they act irresponsibly if they are unwilling to engage in a defense of their views, especially with those targeted. Perpetrators of alleged non-adjudicable but uncivil expression should engage the objects of their attacks through discussion and, possibly, mediation. If they do not, however, no disciplinary action will be taken, though College officials or anyone else may publicly decry the content and manner of such expression."
"It needs
stressing again that the College will in no way formally discourage any argument,
provided it does not include threats of violence, though what is said may be
deplorable and very possibly more diatribe that argument.
That's not a speech code. It's the antithesis of a speech code. It's a specific protection extended to speech, and a specific forbidding of judicial and quasi-judicial forms of sanction against speech by the administration or the community.
A Pox on Both Houses, or Conservatives in Academia (again)
It's Punch and Judy Show time, with academic blogs trading knocks over the question of whether conservatives are discriminated against in academia. Let me once again go over some of the important complexities that seem to me to be absent from most of the discussion.
1. Different disciplines and units, at different scales of institutions, are fairly non-comparable when we're talking about existing distributions of political or social views. The humanities are not the same as the business school or the law school.
2. No one is ever asked bluntly in the humanities what their political affiliation is at the time of hiring. The discussion of the politics of a candidate in history or anthropology has never, in my own experience, involved any speculation about political affiliation. If there is a conversation about politics, it is likely to be about much more arcane, disciplinary arguments, about what specialization or methodology a person uses. I've occasionally heard someone pronounce this or that methodology or form of scholarship reactionary, but that's a highly mobile epithet and can be applied to almost anything, including ideas and forms of practice that are intensely leftist on the general map of American political life. To ask whether someone was a Democrat or even a leftist or liberal (or conservative) in a discussion of hiring would be like confessing that you're the village idiot--it would seem a hopelessly unsophisticated way of thinking.
3. That's not to say that someone who was identifiably a conservative or libertarian wouldn't be in for some rough sailing in some academic disciplines, both at hiring and afterward. I was and remain surprised at how reluctant many people participating in this discussion are to just say, "Yes, in some disciplines, an identifiable conservative may be treated very poorly." Most importantly, in most of the humanities there's a default assumption that everyone around the table can more or less be classed as a liberal, and a certain stunned incredulity when someone departs from that assumption. It is very, very hard to have certain conversations, or to advocate particular views that are held more widely in the public sphere, in some departments or disciplines. I find that as I take increasingly contrarian positions on some of these kinds of issues, it is harder and harder to find a context where I can profitably converse about them with colleagues.
4. On the other hand, collegiality is a powerful cultural force in many colleges and universities, and its stultifying or comforting effects (take your pick) often have nothing to do with politics in any sense. A conservative or libertarian who is a mensch about his or her views and research may well be admired, even beloved, by liberal or left colleagues, and fondly regarded as valuable because of those views. On the other hand, someone like Daniel Pipes who is running around picking broad-brush fights with everyone whom he perceives as a bad academic, usually based on a paper-thin reading of their syllabi or even just the titles of their research, is going to be loathed, but as much for his behavior as for his political views. A liberal or leftist who plays Stalinist Truth Squad in the same way is going to be equally loathed and avoided. I've seen departments where everyone treats a particular person as a politicized pariah even though the political views of that person are exactly the same as the general distribution in the department, and it's entirely about strident, personally confrontational, abrasive, self-aggrandizing behavior. Now it may be that conservatives, having been sneered at, are more inclined, almost out of necessity, to go on the offensive, and create a feedback loop in the process. But the mode of action is more important than the views.
5. Along the same lines, ostensible political views and intellectual temperament may not map well onto each other. Temperamentally, most academics are highly conservative in the (Edmund) Burkean sense: they tend to oppose any change to their own institutions and they tend to argue strongly in favor of the maintenance of core traditions and practices. Many of the critiques of academic life circulating in the blogosphere now have less to do with the party affiliation of academics and more to do with this temperamental leaning, and the behaviors or attitudes which are justifiably seen as troubling would be no different if the party affiliations or political views of academics were changed, barring major changes to the nature of the institution. Magically turn everyone in the humanities into Republicans tomorrow, and they'd still exhibit all the behaviors that everyone is complaining about. Indeed, some of the conservative critics of academia seem to me to be actively campaigning for just this option.
6. Why do conservatives care about the humanities at all? The answer might be that for both the cultural right and the cultural left, the humanities, or more broadly mass culture, are an important alibi for explaining their failure to win the culture wars of the past twenty years outright. Rather than asking, "What about our views is just not appealing, and may never be appealing, to the majority of Americans?", they would prefer to assume that those views would be appealing if not for some partisan interference in the natural course of events. For the cultural right, higher education in general and the academic humanities in particular are the boogeyman of choice, to which I can only suggest that they're vastly, gigantically overstating the possible influence of those institutions. I think the same thing is thought about mass culture in the other direction. In both cases, there's a systematic effort to avoid thinking the unthinkable thought: that maybe, just maybe, the majority of people have thought about your view of things and they just don't like it, for good and considered reasons.
7. If conservatives aren't going into academia, they're opting out well before the point at which they could be discriminated against. That means that conservatives should not casually ignore the possibility that there are market-rational reasons that conservatives don't go into many fields (especially since those reasons might well be a compliment to them).
Now add some new points about the latest wave of discussion on this issue:
8. Quick reads of syllabi and specializations are very lousy ways to decide what someone's partisan politics or even general political philosophy might be, for a lot of reasons.
9. Being intolerant towards your students is different than being intolerant in hiring decisions. A student reporting intolerant asides or behavior in the classroom by a teacher is not evidence of systemic discrimination in hiring or training practices. This is a different kind of problem, a pedagogical flaw that may include behavior that is not especially or notably political, but that is simply about the failure to run a classroom which generates multiple possible outcomes and nurtures critical thought. Pedagogies which narrowly reproduce ossified orthodoxies are a common problem in academic life, and will remain a problem regardless of the party affiliation of academics.
10. Your party registration is not much of a guide to the way you actually act on your political views in an institutional environment. I have known people who are intensely active in a political party but whose affiliation you'd never guess from their scholarship or pedagogy. I've known people who couldn't care less about formal politics who talk nothing but ideology in the classroom. In general, everyone in this discussion is failing to leave room for the professionalism of academics, which is often the most powerful determinant of their behavior.
11. The entire class of people with postgraduate degrees skews significantly Democratic in registration: it's worth asking how much academic departments differ from this general proportion. Granted, when you hit 100%, as with Duke's Department of History, you're obviously different from the general population of people with Ph.D.s, but I wonder how much so. (Extra bonus point: can anybody guess my political affiliation? Hint: Swarthmore's History Department is not 100% registered Democrat.)
12. Kieran Healy rightfully observes that conservatives talking about this issue are making an interesting exception to the general tendency among conservatives to assume that results in the market are probably based on some real distribution of qualifications rather than on bias or discrimination. It might be fair to assert in response that academic hiring is a closed or non-market system, and that this is precisely what is unfair about it. But if so, it requires that one demonstrate that there is a class of potential, qualified individuals who are being discriminated against at the time of hiring, or that these individuals are being discriminatorily weeded out at the time of initial acceptance for training. If not, then the argument that conservatives are being discriminated against in academic hiring practices is exactly comparable in its logic and evidence to the logic of most affirmative action programs and many other antidiscrimination initiatives: that there is a subtle systemic bias producing unequal results, one that prevents a normal sociological distribution of candidates in particular jobs. It behooves conservatives who want to claim this either to explain concretely why this argument applies only to conservatives in academia, or to repudiate the standard conservative argument against affirmative action and other public-policy programs designed to deal with subtle bias effects.
13. On the other hand, most of the people mocking or disagreeing with the claim that conservatives are treated poorly in academia seem to me to be equally at odds with many standard representations of bias effects that are widely accepted by liberals or leftists: namely, that bias is often subtle, discursive, and institutionally pervasive, and that hostile environments can exist where no single action or statement, or any concrete form of discrimination, can be easily pointed to as a smoking gun. Most of those claiming a bias against conservatives in academia are pointing to exactly these kinds of hostile-environment incidents and moments, and seeing them as causing the same kinds of psychological and inhibitory harms that this type of discrimination is said to cause in other contexts. I accept that people edging away from you in an elevator is a type of bias effect that is harmful--an often-cited instance of the kind of subtly pervasive discrimination that African-Americans may suffer from in mostly-white institutions. I've never experienced it myself because I'm white, and had I not read of it in the personal, anecdotal accounts of many African-Americans, I truthfully would never have noticed it. The same applies here. I don't understand why it is so hard to accept that self-identified conservative undergraduates, graduate students, and faculty report experiencing many similar forms of pervasive, subtle bias. What I'm seeing from many of those who dismiss these claims is a collective eye-rolling, a sort of "big deal, so your professor sneered at you, get over it." And yet few of those doing that eye-rolling would say the same to a student of color or a woman reporting similar experiences. The grounds on which many critics are doubting that such bias exists would have to, in all honesty, extend to all anecdotal, experiential or narrative claims of bias. The only way to salvage such claims would be if they could be profitably correlated with quantifiable evidence of discrimination--but in this case, we have some evidence to that effect. The only other way to salvage this point is to say, "It's wrong to be biased against people because of their race, gender or sexual orientation, but not because of their politics." A few seem willing to say just that: I can only say I think that's a big, fat mistake on a great many fronts.
Short notes
1. Regarding my earlier woes with my home PC, to my amazement, PestPatrol's tech support gave me a fairly simple command line to run through one PestPatrol utility to fix the aftermath of cleansing SahAgent off our home PC, and it worked, restoring all Internet functionality without any further difficulties. That is just about the first time ever that a tech support person has been able to give me straightforward advice that had straightforwardly good results. I've been reading up more about Winsock2 Layered Service Provider spyware like SahAgent and if anything I'm more appalled than I was before. Is there any question in anyone's mind that this sort of thing should be illegal? I don't see any difference between it and a virus in terms of destructive intent.
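(For the technically curious, and purely as an illustrative aside of my own: spyware like SahAgent works by registering itself as a Layered Service Provider in the Winsock catalog, so that every network call the browser or email client makes gets routed through it first--which is also why yanking it out carelessly can break the whole provider chain and kill your connectivity, as I found out. Below is a minimal sketch of how one might list the catalog entries on a Windows box of this vintage, using the documented WSCEnumProtocols call, just to see what has wedged itself into the stack. This is a hypothetical snippet of mine, not anything PestPatrol itself does.)
    /* Illustrative sketch: enumerate the Winsock 2 protocol catalog to see which
       providers (including any spyware LSPs) have registered themselves.
       Windows only; link against ws2_32.lib. This lists entries, nothing more. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <winsock2.h>
    #include <ws2spi.h>

    int main(void)
    {
        DWORD size = 0;
        int err = 0;

        /* First call with a NULL buffer just to learn the required buffer size. */
        WSCEnumProtocols(NULL, NULL, &size, &err);

        WSAPROTOCOL_INFOW *entries = (WSAPROTOCOL_INFOW *)malloc(size);
        if (entries == NULL)
            return 1;

        int count = WSCEnumProtocols(NULL, entries, &size, &err);
        if (count == SOCKET_ERROR) {
            free(entries);
            return 1;
        }

        for (int i = 0; i < count; i++) {
            /* A ChainLen of 0 marks a layered provider (an LSP); a value greater
               than 1 marks a chain that routes traffic through layered providers. */
            wprintf(L"%ls (catalog id %lu, chain length %d)\n",
                    entries[i].szProtocol,
                    (unsigned long)entries[i].dwCatalogEntryId,
                    entries[i].ProtocolChain.ChainLen);
        }

        free(entries);
        return 0;
    }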
2. I'm fairly embroiled in an interesting discussion of what makes for a good childhood over at Crooked Timber--very high quality conversation, particularly the comments from Russell Arben Fox. Short summary of my arguments: I don't like "prosocial" children's programming, which I've hammered at before in Saturday Morning Fever. Not even Veggie Tales. And I let my daughter play "Morrowind" and "Neverwinter Nights" with me on my PC, monster-slaying and all. (When the PC is working.) Anyone who fears for her future goodness, take heart: she won't let me steal anything with my thief character and consistently tells me that I shouldn't kill monsters. Unless they're mean ones.
3. Responding to Laura over at Apt. 11D: I do most of the cooking, clean the dishes about 25% of the time, do all the stereotypically manly jobs like assembling toys and furniture or lifting heavy objects, dress and diaper if I'm closest (and did most of the diapering and baby care from months 3-12) and many other sundry acts of parenting. I also read the bedtime story. I am sorry to admit that I aggressively evade doing the laundry as for some reason I pathologically hate doing it. I would say I definitely don't pull 50% of the domestic weight, so yeah, I kind of suck. But I also think I'm more one of those slacker boys Laura is talking about who has cut back at work to spend time with family rather than the Type A achievement-chaser, which maybe I once was to a greater degree. Which is, I'm beginning to sense, a more complicated choice in its professional consequences and ego-impact than it first appears. No wonder men (and Type-A superwomen) get all angsty and weird at middle-age.
4. Quite a trollish thread over at Kuro5hin on blogging. My short response to the troll who kicked it off would be that yes, of course most personal webpages of any kind are banal. That's hardly a new thing, nor a result of Movable Type. I remember very well one of the reasons Justin Hall's links.net, whose tenth anniversary makes me feel very old and decrepit, got such a readership at the outset: it wasn't just nekkid pictures and stories about sex and drugs that drew people, but also that almost every other "home page" out there was a bunch of links to other links and nothing more, while Justin was putting up new and interesting material almost every day. Content then and now is king, and can come from anywhere, whether a blog or Atlantic Monthly Online. Blogs that originate content are more interesting to me, and more what I aspire to for myself, than blogs that do nothing more than link to content elsewhere. But even in collective banality, there are interesting things to see and think about. Even at their worst, the Web in general and blogs in specific represent an interesting sociological aggregate, a way to track the massed preoccupations of various constituencies and the movements of various memes.
February 3, 2004
From Hell's Heart I Stab At Thee
Well, somehow my wife took an accidental wrong turn while web-surfing on my home PC, I think because she misspelled a common domain name for a children's media site. I came home to find something called SahAgent squatting malevolently in the computer; it seems related to Gator. Busily infesting our PC, it kept popping up advertising windows on the desktop every few minutes while keeping a log of all our web-surfing and downloading or unzipping more and more executables of various kinds that wanted access to the Internet. I rushed to get an application I'd used once before to search for spyware, called PestPatrol. (I know, you're all screaming, "Use AdAware instead, dummy!" "Be really careful removing SahAgent, idiot!" I am today a bit more knowledgeable than I was on Friday.) PestPatrol quickly recognized and then supposedly straightforwardly cleaned the system of tons of SahAgent-associated crap (also lots of things related to a driver called WildTangent that I think I may have foolishly allowed on the machine when visiting Shockwave's site and playing games there).
Unfortunately, that was also the end of our home Internet functionality altogether: browser, email, everything has gone bye-bye. PestPatrol's tech support has some ideas that I'll try tonight, but I have the bad feeling I'm going to end up reinstalling Windows from scratch. Bye-bye two days of my life if so. I know, I know, all the techier-than-thou out there are rolling their eyes and saying, "Use Linux, asshole" or "That's your fault for using Internet Explorer, fool." Blaming the victim, guys.
I find myself gobsmacked at the very nature of the experience (like so many others before me). If I happened to dial a wrong number on my telephone, and the mere act of doing so more or less destroyed my telephone, I suspect there would be real legal consequences for whoever was keeping the telephone-destroying answering machine out there. There are some strange differences in both law and practice in the case of computers and the Internet that seem inexplicable to me.
With SahAgent or Gator or what have you, somehow, somewhere, somebody real is making real money by hijacking other people's computers and sending them advertisements whether they want them or not, downloading software involuntarily onto their machines and the like, and yet that person or persons is basically legally untouchable. Somebody somewhere is paying to squat on domains that are misspellings, just waiting for an accidental visitor so they can seize control of that visitor's computer. Whoever these people are, they've cost me time and money. They're going to end up costing Microsoft money as well, because I've been weighing whether having a PC in order to play games and get access to a relatively wide range of software is worth the hassle, and this has pretty well decided it--it's probably not worth it, and our next machine may be something else, while my gaming shifts to consoles. (PC games are dying, anyway.)
Grrr.
January 26, 2004
Evil
A colleague of mine once suggested to me that everything about my stance on 9/11, including my Gitlin-esque criticism of the academic left, was fair enough except for one thing: I used the word "evil" to talk about both the attack and the larger ideologies that motivated it.
I know why she has misgivings. President Bush munches through the word like a kid lost in a candy store, with appalling casualness, and he's hardly alone. Bloggers left and right label things evil with abandon. Don't like something? Critical of a practice or an idea or a person? Must be eeevil.
Still, I won't give up the word. Evil exists, and refusing to see it as such when it presents itself is a dangerous kind of myopia. If we gave up all the words and ideas that are overused or misused, we'd be mute.
I was thinking a lot about evil over the weekend when I read Peter Landesman's cover story in the New York Times Magazine on forced prostitution and sexual slavery around the world. It clarified a lot for me about what I think evil is and is not, though at the high price of being one of the three or four most disturbing things I have read in the past year. Some of the details from the article would seem unreal if they appeared in a novel by an unusually lurid and imaginatively depraved author. I can't make myself repeat here some of the material about the forms of compulsion and brutality used on child prostitutes that the article describes, even though some of the images will be hard for me to ever forget. I almost would say do yourself a favor and don't read the article: some of the details are that painful, that scarring.
The men and women described in the article as involved in sex trafficking are committing evil. There's nothing in between them and the pain they cause, no excuses or alibis, no veils. No possibility of misunderstanding the relation between action and consequence. There isn't even the defense of ideological loyalty or cultural self-defense that torturers and killers in places like South Africa have feebly offered for their intimate crimes. There's no enemy to fight. Just a child or woman being raped, abused, starved, mutilated for personal gain. I suppose some of the people involved might say that in the midst of enormous poverty, all choices are bad, or that when there is slavery, one either enslaves or is enslaved. Both excuses are transparent bullshit here. The people profiting in this case are looking their victims in the eye and committing their crimes in plain sight, and for nothing more than their own gain.
Conversely, the article helped convince me even more of something I've already concluded: that in the end it's terribly difficult to label more abstract actions taken by leaders or authorities as evil, no matter how horrible their consequences. People in power are insulated in so many ways, to the point where it is both plausible and often quite true when they say that they did not or could not see how their actions translate into distant, intimate suffering for other people. This is not to say that you cannot use the word in these situations: the top bosses of the organizations described by Landesman's article must know very well what is going on, even if they don't know some of the hellish particulars. Most of the worst authoritarians and tyrants of recent history got blood on their own hands at some point, but even if they didn't, few of them can plead that they didn't know what was going on.
It's just that you have to leave room for the actual complexity of how things happen in the world, and for a real and meaningful distance, with ethical meaning, between what power does, what plans are made, what policies dictate, and what happens between individuals, in the intimacy of everyday life. Most human suffering isn't something that one person does willfully, with foresight and understanding of the consequences, to another. Neither does it come immaculately and spontaneously from the air or the ground; the suffering that comes from and resides within society is distributed in its origins and its infliction. The things we do in our lives, whether we're high officials or ordinary people, have consequences, and sometimes very bad ones, for which we ought to be held responsible and hold ourselves responsible, sometimes in very serious ways. But evil is a term I'd reserve in this context for exceptional circumstances where the connection between particular actions and the serious suffering of particular individuals is clear, is known to the actor, and is known in advance by him or her to be morally indefensible.
Landesman's article also made me realize that when I think of evil, it is not something outside the human frame of reference. Evil is not a word for the things we do not understand, and if we try to forbid the investigation of evil on the grounds that it is strange, mysterious, alien to us, we shouldn't be using the word. If Mohammed Atta cannot be understood, if what he did makes no sense whatsoever to a self-proclaimed normal person, it's not evil. Jeffrey Dahmer was not, at least to me, evil, because I have no emotional window at all into his crimes. I don't understand any of his desires or his actions. He was unquestionably horribly damaged and terribly dangerous and the world would have been better if he'd died before he ever hurt anyone, but I can't see him as evil.
This is what Inga Clendinnen asks us to consider about the Holocaust: if we declare that we can't understand someone like Heinrich Himmler from the inside out, as another human being who did things that we plausibly could imagine ourselves doing, we don't know any of the things that we need to know about Himmler or the Holocaust itself.
This is why I think that not only the people who enslave others and sell their sexual services are evil, but the johns as well. It's one thing, in the context of the legal sex industry in the United States, for a consumer to make a default assumption of contractual consent by a performer when viewing a porn tape or a stripper. It's another thing if we're talking about an adult sodomizing a ten-year-old prostitute in some hovel in a small American city.
I can make that
judgement because I have an interest in profit, and understand very well what
I would and would not do for it, and why I would or would not. I can make that
judgement because I get erections and feel sexual desire and understand very
well what I would do and not do to satisfy that desire.
It is not that I accept Catharine MacKinnon's view of male desire as always violent and violating, whether it's arousal when watching a supermodel on television or purchasing the services of a ten-year-old slave. It's precisely because I see an absolute distinction that MacKinnon does not see between desire and evil. I see acceptable male desire, with whatever complex ethical issues it does or does not raise, desire involving the consent of others, with whatever complexities come with the concept of consent. And I see desire that is absolutely evil, that violates and hurts and steals. There is a difference there that the conventional MacKinnonite argument obliterates and banalizes.
I see that distinction because I have greed and I have desire and there is much that I choose not to do in response to those feelings. Not because I am already economically comfortable and not because I am already monogamously satisfied in my romantic and sexual life. This is not an Olympian judgement on mere mortals. Anyone can and might choose to inflict unnecessary and intimately human suffering on others. Anyone can commit evil, and anyone can choose not to. It doesn't matter who you are or where you come from: there is no excuse for staring another person in the face and torturing them, mutilating their minds and raping their bodies, stealing their childhoods and their humanity. No accident of character, no fact of culture, no arrangement of circumstance: no excuse.
Wishing I Was Simon, Knowing That I'm Paula
American Idol and Survivor are really the monarchs of the reality show universe, and of them, I find Idol the most consistently fascinating. It's like a full-body physical performed on the American Dream, and what it finds, I think, is that the Dream has split into two very different forms out there.
There's the meritocratic dream that Simon, Randy and Paula vigilantly guard. In it, you can get ahead if you're good enough, talented enough, if you've got the right stuff. Then there's the moral outrage and personal pain of the people turned back by the sentinels. I understand most of the rejects to be saying, "Success is my birthright; everyone should be a success." There's a sort of weirdly egalitarian howl of rage coming from the disappointed contestants. "How dare Simon turn me away? He has no right. Who chose him? Americans all have the right to be American Idols! Everyone deserves to have their desire, that's the American Dream!"
This is the only way I can understand the anger and hurt that most of the losers in the initial auditions display, because otherwise the troubling alternative possibility is that they honestly, deeply, insanely believe that they really are good singers who could be major successes in commercial music. Ok, yes, a few of the folks we've seen appear to be authentically deluded, but I can't believe that all of them are. There's something deeper going on here, a clash between two Dreams.
Seen as Idol shows it to us, the meritocratic Dream looks like the only healthy one. As Simon Cowell reminds us, what do we think would happen if the losers on Idol actually tried to sell their wares in the entertainment world, if we turned on the radio and heard song after song from the miserable ranks of Houston's Idol hopefuls? We'd all turn the radio off. It's one thing to watch the untalented receive their just blowoff from the mercilessly funny Simon Cowell, and another thing to be subjected to the untalented when we expect to hear something good.
But like most people watching, I keep mulling over my comfort level with Simon Cowell. He's an enjoyable, witty spectacle in his own right, but there's also a kind of excitement in watching someone be so ruthlessly honest, because you realize that you rarely see it, and at least in my case, almost never do it in that way or for those reasons. I think Cowell is being perfectly straight and totally authentic when he says that he sees what he's doing as a kind of public service to an American society besotted by an unwholesomely egalitarian narcissism. But in this season, he's clearly beginning to wonder when the lesson is going to sink in. The pupil has received many strokes of the best from the master's switch, but he keeps coming back and asking, "Please, sir, may I have some more?"
It all reminds me of some of the practical dilemmas that every teacher faces when grading. Now Cowell and his companions are dealing with an unmistakable and unbridgeable gulf between wretchedness and excellence, though I sometimes wonder if the early rounds of Idol don't omit some duller auditions from ambivalently mediocre people in between. In ten years at Swarthmore, I think I have only once graded a paper that was unmistakably awful at the level of sheer badness that Cowell is stomping on, and I pretty well stomped on it myself, though with kinder, gentler rhetoric. Most of the time my lowest grades (D or F) reflect failure to complete assignments or similar problems.
But there is this nagging issue about what one does with an ordinary essay, a bland, decent, no-foul essay, an essay that would be good enough in most professional and real-world contexts. This being Swarthmore, a highly selective institution, a bland essay here is a really good essay in the wider universe of analytic writing by undergraduates in America. (As our students are fond of saying, "Anywhere else it would be an A.") Why should I grade that paper harshly? And yet, such an essay, reasonably common, often stacks up unfavorably against a smaller number of papers that are remarkable. It's a puzzle. A tightly meritocratic vision would argue for making the spread between the ordinary and extraordinary as wide and unmistakable as possible. A more egalitarian vision would say to minimize the distance when the ordinary is good enough: why deal unnecessary rebuke to a student who has done nothing wrong, who has made a good faith effort, and whom one can confidently certify as a capable person?
In practical terms, this comes down to whether a bland, ordinary essay gets a "B" or a "C", at least in my classes. I almost always slant towards giving the good enough work a good grade--a "B", in this case--and having the difference between the best and the good and the solid be relatively small (though I like to tightly reserve an "A" for the distinctively excellent). I don't mind saying that something's bland or descriptive or generic in comments, but somehow I do mind coupling that to a strongly negative grade. I do mind hurting people for what seems a subtle or small distinction.
I watch Simon Cowell and I sometimes wonder if maybe that's a mistake, wonder if it's a bad idea to be a Paula. A very select few of the people that Simon dished up abuse towards didn't seem unspeakably bad, and even he observed that a few of them might have careers as singers in bars or local theater or Broadway or weddings. Isn't that another kind of kindness, to tell people that they're dreaming the wrong dream? Certainly it wouldn't be kind or right, if you knew one of the truly wretched, to tell them they're great singers or marvelous performers, no matter how much you loved them or enjoyed their company. Anybody who has to grade the work of students is running errands for meritocracy, in the end, and it ill-serves us to self-delude too much with gentle words about the dignity and self-worth of all people in all things that they set their minds and hearts to accomplish. But maybe Paula's the best of both worlds: the meritocracy guarded, the pain dulled with soothing words.
January 21, 2004
I Also Froth
Dorothea Salo at
Caveat Lector has some
great "frothing-at-the-mouth" thoughts on journal publication.
There is nothing that stuns me more than the relative immobility or inertia of most academics on this subject. On libraries and digital publication in general, it might be right to blame the tools and be suspicious of the rush to digitize, but on this particular question, it's absolutely insane to oppose a major transformation of how we produce journals.
Let me go over the relevant facts to doubly emphasize Dorothea's observations.
1. Academics receive no monetary compensation for writing and publishing journal articles. Their only compensation is in the form of reputation capital, achieved when other academics read, circulate, cite and teach their article.
2. Academics generally receive no compensation for doing the work of peer review for journals (in contrast to peer reviewing books, which gets you a small honorarium in money or a larger one in books from the publisher).
3. The only costs of journal publication that require a publisher to handle are the actual printing, the costs of distribution, and possibly the cost of advertising or promoting a new journal. There are no payments to authors and no payments to peer reviewers to consider.
4. All significant costs of journal publication can be eliminated by transferring the journal to electronic form. Journals published in electronic form have minimal costs associated with them, the only major cost being the maintenance of the servers on which the journal resides. An electronic journal syndicated to many sites would distribute even this cost significantly. Maintaining a searchable archive of back issues might add some additional cost.
5. Academics who give away their intellectual labor for free to publishers of journals are employees of institutions that are then forced to pay considerable sums to reacquire the fruits of their employees' labor by buying the journals back from the publishers. To add insult to injury, the labor time necessary to write the journal articles is often supported by an additional outlay in the form of sabbatical and research support.
6. Electronic distribution of journal articles is a far more affordable and equitable way to circulate knowledge globally. The cost of access for an academic institution in the developing world is minimal: a single good Internet connection and some computers could buy access to every academic journal in the world if they were all available online. Subscribing to the same number of print journals costs incalculably more. If print journals are lost, destroyed or stolen, they are gone; if Internet access is lost for a time, the journals are available again once it is restored.
7. Journal articles are generally short. Reading them on a computer monitor, or printing them out for reading, is at least tolerable.
8. There are now readily usable, easy technical standards for the distribution of short digital publications that require no or minimal licensing: both HTML and PDF will do just fine, and articles written in standard word processors can be quickly converted to either format.
Stacked up against this, I can see only one argument that meaningfully favors the current universe of print journals, and that's the relatively tangible permanence and inalterability of print. But if electronic journals maintained by the same volunteer networks of academics, with publishers cut entirely out of the loop, were maintained with steady protocols for backups and regular transfers to new storage media, and if published articles were locked against later changes and carefully date-authenticated to prevent Stalinist-type alterations of the intellectual record down the road, this concern would be a non-issue.
I can see every reason in the world to oppose any precipitous move towards digitizing everything, and every reason to hold onto print culture overall and books in particular. However, it's a crime to continue publishing academic journals in print form.
It's staggeringly stupid in economic terms, and functionally unnecessary. Frankly, every academic institution in the world ought to make it a primary condition that any research done by faculty that is published in journal form must be published electronically, because any other form of publication imposes a double cost on a university or college. It pays coming and going.
If the compensation of publishing in journals or doing peer review is reputation capital, then academics are incredibly ill-served by relying on publishers who restrict the availability and circulation of journal publications. If you publish a journal article, you want it assigned in classes, you want it available for viewing by anyone and everyone at every hour of every day on any computer, you want it to be searchable. You don't want somebody to have to pay directly or indirectly through a library to get your article. The only restraint on circulation you want is that anyone using your work should have to acknowledge it, but there is no difference on this issue between electronic and print publication.
It's also a socially unjust form of academic publication. My many colleagues who express copious concern about justice for the developing world or questions of global equity in academia ought to line up aggressively behind the digital publication and distribution of journals.
Everything about academic journals is easily migrated into digital form. There is no reason in the world not to do it, and do it right now.
Good Job, Iowa
Well, whaddya know. Maybe I won't have to vote for Dean in November, which would suit me fine, and good riddance to Gephardt. I still think there's no question that even some conservatives ought to prefer Dean to Bush, but maybe we can get a Democrat more to my liking and theirs up there as a choice. Edwards is smart, capable, and interesting: what a novel thing if he were to get the nomination. Kerry wouldn't be horrible, but I think he'd represent another iteration of a proven failure in political strategy, and that's to nominate a guy who is no more than a superior technocrat and manager, who can govern well, but without any kind of vision of his own save a justified belief in his own superior managerial skills. Kerry is a caretaker. That would certainly be better than Bush, but it's probably hard to get many voters to see that. Edwards, I think, actually seems to have a deeper, more vision-driven sense of what's interesting and important and why, and probably the deeper ability to connect with the general electorate as a result. Hell, he might even be able to win. What a concept.
Burn the Catalog
I was doing a bit
of last-minute refurbishment of my Honors seminar syllabus last week, trying
to see if there were new books or articles on particular topics or themes that
I might have overlooked. I had also reorganized the syllabus somewhat and had
one week that was a conceptual oddball of sorts, organized around a somewhat
diffuse view of the causes of colonialism in Africa that is starting to be a
major part of my current manuscript, and I was hunting for older materials that
I might stitch together to explain my perspective.
Using our library's catalogue, Tripod, I was both impressed at how generally strong our collection is for a small liberal-arts college (shared with Bryn Mawr and Haverford) and frustrated at just how useless a typical electronic catalogue has become. The information technology revolution has become something akin to the tearing down of a dam: the waters are free, pouring across the landscape, but if you want to use them to irrigate some crops or even just to take a drink of water, you have to leap headlong into the floodwaters and be swept away by them.
Our librarians are eager to teach information literacy and research skills, but it's hard to get the students to respond. Part of that is that to learn those skills from the librarians involves giving up time to listen, and part of it is that most of our students can sort of muddle through at 2am using online materials available through Tripod, especially full-text resources. There are interesting hierarchies of use starting to emerge as a consequence: on some papers, you don't see students necessarily choosing the best work or data for their project, but preferring instead by default the resources that are available in full-text form.
I don't really blame them. This is not just about availability, but about the near-impossibility of teaching undergraduates the kinds of search heuristics that will reliably produce useful material on most research subjects. The main reason that I don't think students learn from our librarians is that they're not learning from their professors how to search, either, and in some cases, because the professors themselves don't really know how to navigate the brave new world of catalogs and databases. I used to be a snot-nosed punk and think that was about Luddism and sloth, but I'm realizing that the fault lies less in ourselves and more in our tools. I think I know a lot about the tools and how to use them, but I'm finding it harder and harder to communicate effectively with my students about how to reproduce the search techniques that work for me.
Electronic catalogs, wherever you go in the academic world, have become a horrible crazy-quilt assemblage of incompatible interfaces and vendor-constrained listings. Working through Tripod's article and specialized subject indices, in a relatively small collection, you still have to navigate at least five completely different interfaces for searching. Historical epochs of data collection and cataloguing lie indigestibly atop one another. The Library of Congress subject headings, which long ago slid into uselessness, now actively misrepresent existing nodes and clusters of knowledge in many academic fields. Or sometimes the LC headings are so insanely specific that they are inhabited, and may forever be inhabited, by one or two monographs, using subject headings that could never be found intuitively by a researcher, but only by someone who already knew about the one or two monographs anyway. At their outer reaches, the categories sometimes become positively Borgesian, as if they're part of the planned expansion of human knowledge to some infinite point of universality.
To get a catalog to associate materials that I know are associated in scholarly practice, I often have to execute exotic combinations of keywords and authors. Disciplines don't guide me to those clusters of scholarship, subject headings don't guide me to them, and even the keywords that most obviously ought to guide me indiscriminately lump those clusters in with works that have almost no relationship to them.
I can only readily track new or interesting publications in fields whose everyday sociology, as glimpsed in conferences and workshops and introductions to books and listservs and bibliographies, is well known to me. If I want to find out what interesting books have been written by Africanists in the last year, the most compact way is to go to the African Studies Association meetings and prowl the book fair with a notepad. Otherwise I have to recall which friends or known associates of mine are working on books and search their names; search many subjects at a high level of specificity (basically most of the major ethnonyms and all of the countries of Africa, one by one); search for new titles in ongoing series (if a catalog allows me to search that field); search particular publishers who often put out Africanist works (or get their catalogs); do some highly date-sensitive searches in combination with subjects or keywords; and maybe search a few carefully chosen combinations of especially perfect keywords.
Moving out of those known sociologies into areas I don't know as well or at all, I have to tack back and forth into the information wind with keywords, publication dates, the few known canonical signposts, and reading titles like tea leaves for hints to content. Occasionally I get lucky and there's a good description of the book or article, or even a full-text version I can scan quickly, and that helps a lot, though the bodies of the full-text versions are themselves not often searchable from the catalog level. Or I go to the bibliography of the newest relevant book I can find and look for new things there. Academics with graduate students have an army of foot-soldiers who regularly hunt down what's new and au courant, which can help a lot.
On the other hand, there's Amazon.com. I'm hardly the first to note that Amazon as a catalog or research tool is easier to use and significantly more productive than conventional academic library catalogs. I can see the table of contents of books most of the time, and a range of associated materials--and now even parts of the book itself are searchable. More significantly by far, I can follow the actual patterns of use and association among readers through the "People who ordered this book also ordered..." links.
There are weaknesses, of course, in using Amazon as a research tool. It's just for books that are currently in print: no articles, no research materials, no dissertations, not that many obscure monographs. The subject headings are mostly as useless on Amazon as the LC headings are in other catalogs. Keyword searching is just as messy and inconsistent in the results it produces. The patterns of reader association can become dangerously inbred: it's still up to the searcher to make the intuitive leap from one circular cluster of associated materials to the next. But I still find myself using Amazon when I'm trying to find out what's new in certain fields: it acquaints me with the hidden structures of readership, it uncloaks the invisible college.
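For the curious, here is a minimal sketch, in Python, of the kind of co-occurrence counting that could power a "people who ordered this also ordered" feature. It is purely illustrative and makes no claim about how Amazon actually computes its associations; the order data and titles below are invented for the example.

from collections import defaultdict
from itertools import combinations

def build_associations(orders):
    # orders: an iterable of sets of titles, one set per customer order
    co_counts = defaultdict(lambda: defaultdict(int))
    for order in orders:
        for a, b in combinations(sorted(order), 2):
            co_counts[a][b] += 1
            co_counts[b][a] += 1
    return co_counts

def also_ordered(co_counts, title, top_n=5):
    # rank the titles most often ordered alongside the given title
    neighbors = co_counts.get(title, {})
    return sorted(neighbors, key=neighbors.get, reverse=True)[:top_n]

# invented example data
orders = [
    {"Seeing Like a State", "Imagined Communities"},
    {"Seeing Like a State", "Imagined Communities", "The Wealth of Nations"},
    {"Imagined Communities", "The Wealth of Nations"},
]
print(also_ordered(build_associations(orders), "Imagined Communities"))

The point of the sketch is simply that the raw material for this kind of association is nothing more than ordinary circulation records, which libraries have in abundance and mostly leave untouched.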
I'm to the point where I think we'd be better off to just utterly erase our existing academic catalogs and forget about backwards-compatibility, lock all the vendors and librarians and scholars together in a room, and make them hammer out electronic research tools that are Amazon-plus: Amazon without the intent to sell books but with the intent of guiding users of all kinds to the books and articles and materials that they ought to find, a catalog that is a partner rather than an obstacle in the making and tracking of knowledge.
Flush Away
Goodness knows I'm a strong critic of graduate pedagogy in the humanities and social sciences, and a strong believer that doctoral study is in desperate need of reform. I know only too well how many stories of shabby behavior towards graduate students and folks on the job market are out there, waiting to be heard. Any academic who has a touch of honesty and a willingness to listen has heard real, unmistakeable horror stories of interviews that went badly wrong (I myself haven't forgotten the two guys who interviewed me while stretched out on their respective beds, yawning, with shoes off), or tales of misconduct and needless cruelty by professors towards doctoral students.
In a way, I think the horror stories potentially overlook the real issue, which is how far short graduate school in the humanities and social sciences falls from its institutional and intellectual potential, how narrow it has become in its circumscription of the boundaries of academic professionalism, something commentators at Invisible Adjunct, as well as Mt Hollywood and other sites have been discussing avidly of late. It's true, as many have noted, that to get through graduate school isn't necessarily or even frequently a sign of superior intelligence or ability, but just a kind of dogged ability to suffer abuse and a relentless and usually unjustified optimism about one's employment prospects. I agree that any faculty who pat themselves on the back after hearing new data about the absurdly high attrition rates in graduate programs and say that it's just survival of the fittest, a flushing of the system, need to think again. The data on attrition is an appalling indictment of serious flaws in the entire system of academic training.
All of this being said, Erin O'Connor has put up an email from a reader that suggests that we could go overboard here. I don't know if O'Connor views her correspondent sympathetically--she notes that when satire is like life, it gets hard to satirize, but it's not clear who it is that she sees as the butt of the joke.
The butt is clear to me. Her correspondent is an asshole, and whoever he or she is, I'm kind of glad their academic career ended at the 1998 AHA. This is a person who walked into an interview, told the interviewers that they were unprofessional in scheduling interviews too closely together, snottily rebuffed one of the interviewers who had insufficient appreciation for the candidate's publication record, regarded the substance of the interview as "insipid banter", and then went on to remind the interviewers that they were unprofessional before leaving.
Ok. Let's go over this one a little, shall we? If ever there was a poster child for the ugliness of academic entitlement, and evidence that one of the worst things we do to doctoral students is convince them that they're owed a job because of their brilliance or whatever, it's this person.
If you're being interviewed for a job, you're a supplicant. That's not just how it is, but really, how it ought to be, at least until the One World Government creates a giant job-matching master computer that places us all in jobs based on our Myers-Briggs Test. The people in that room don't owe you jack besides a fair hearing for 45 minutes. You have to convince them that you're not just a good scholar and good teacher, but somebody they'd like as a colleague. Do I want somebody as a colleague who is going to act like a prissy psycho every time there's a slight error in scheduling, who gets their hair constantly ruffled because of their exquisitely tuned sense of what constitutes professionalism? No way. It might be unfair to use collegiality as a major evaluative tool later on, at tenure time, but it definitely isn't unfair when you're trying to decide which of 15 people you don't know yet might be someone you want to have as a colleague. Whoever this person is, he or she basically announced in the first fifteen seconds, "I'm an arrogant jerk, and you made the mistake of deciding I was one of the 15 people you wanted to talk to this year. Let's make each other uncomfortable for the next 45 minutes, shall we, and then you can move on to people that you might actually want to work with".
There is another class of horror story that maybe we don't hear about as much as we try to think about the reform of academia, and that's about bad students and bad job seekers. I think it's right and proper that we don't dwell on those stories, partly because they're an easily distorted or exaggerated security blanket for professors who want to resist reform, and partly because they're not a systematic or structural problem, just an idiosyncratic or individual one. But let's not forget that there are plenty of sins to go around here, and that not every unpleasant experience lies wholly or even largely on the doorstep of working academics. The aggressive pursuit of reform shouldn't be the affirmation of every single axe-grinder out there.
January 14, 2004
Eyes Wide Shut: Africanists and the Moral Problematics of Postcolonial Societies
This is an article I wrote for the African Studies Quarterly, part of a special collection of essays reflecting on an earlier article by Gavin Kitching about why he stopped being an Africanist. The rest of the essays are also very interesting, and definitely worth a read.
January 13, 2004
Nazgul the Baby Doctor
I have read a lot and taught about Barbie several times in the last ten years. But my last material, physical encounter with the doll itself was probably around seventh grade, when the kids on my block used to hold big confabs of all our various dolls and action figures and we all did the venerable "GI Joe in the shower with Barbie" thing.
We got a pretty nifty dollhouse for Emma this Christmas, and we figured that enjoyable as it might be to have Saruman, the Lord Humongous, Dr. Zaius and Tomar Re from my action figure collection hangin' at the house, Emma might appreciate a couple of dolls of her own. So we swallowed and surrendered to the inevitable and got a Baby Doctor Barbie for her. Well, first, this didn't work so well because Barbie's out of scale with the house, about three inches too tall (Saruman et al are almost right; they're maybe half an inch short).
What really struck me, however, was just what a crappy doll Barbie actually is. She's got minimal articulation, she's stiff and horribly inanimate, she doesn't stand up on her own no matter what you do, and in order for this particular Barbie to grip her neonatal medical instruments, they have to be jammed through a hole in her hand like a stigmata. If you actually wanted to play out a narrative of Barbie treating the two little babies that come with her, you'd almost be just as well off with a rag doll or a popsicle stick figure in terms of the resemblance between what you're imagining and what you're holding.
Contrast that with action figures, almost any action figures, not just the especially cool ones I tend to collect. Good articulation, vivid expressions, great accessories that the character can readily grip and use. Now it so happens that those accessories are usually used to maim, kill and destroy, but what of it? It's a lot easier to imagine my Nazgul figure as a baby doctor (if I take away his sword) than Barbie. At least he can stand and hold things, and strike a wide variety of poses while clutching a baby.
Barbie, in contrast, really is only one thing: a platform for clothes and an object to be looked at. At this point, feminist cultural studies scholars are saying, "Well, duh," and wondering just how much of Barbie scholarship I've actually read. As always, it's one thing to read it and another thing to experience it. I don't doubt that Barbie, like all culture, can be poached by its consumers, and made to be and do things that aren't suggested by its material nature. All the more so because of Barbie's cultural ubiquity.
At the same time, there just isn't any way to dodge what Swarthmore students have been telling me for a decade in my History of Consumption class: boys' toys are still vastly better-made, more varied, more complex, more interesting, than girls' toys.
Warbloggers Circle Wagons; Dog Bites Man
It didn't take Nostradamus to predict that in the wake of reports about Paul O'Neill, the war's defenders would observe, accurately, that any post-Gulf War administration would have had such conversations about Hussein, and, inaccurately, that there was nothing of interest in O'Neill's statements. The critics (of whom I'm one) naturally noted that this confirms suspicions that Bush exploited September 11th to accomplish a pre-existing policy agenda (Hussein's removal and a general rejection of multilateralism) rather than concentrating on the most effective prosecution of the war on terror.
It's a tedious debate, partly because positions on both sides have hardened into total inflexibility. For me, given that I completely reject the kinds of anti-war arguments that follow Chomsky's or Soros's lead or anyone similar, this immobility is especially frustrating. I am potentially persuadable about the war, and have been ever since September 11th. I needed and still need to be convinced that this war in particular was urgently necessary enough to justify the poor management of the build-up to war. Most of the evidence that it was has now been disavowed by the Administration itself.
The "Hussein was a tyrant, and needed to be overthrown because of his tyranny" argument is just about all that's left for the pro-war advocates. They're right in this respect: it is great that his regime has been overthrown and he is in custody. But this remains a horribly dishonest argument for the war itself. One might ask how I can say, "A great thing to have done, but it was wrong to do it." If I told you that I had a vitally important meeting to get to that started in 30 minutes, and it was an hour away by car, and you drove me at 120 miles an hour to get there on time, I'd be grateful to have gotten there once I was there, but I wouldn't want you to ever do that again--nor would I have wanted it the first time.
The argument is dishonest both in that it's not how the war was sold to a democratic society, even by most pro-war pundits and bloggers, and in that none of those who offer it actually mean it to be a continuing rationale for policy in general. Don't try to tell me that I'm somehow supportive of Hussein's tyranny for criticizing the war, because I can serve up a steaming load of the same whup-ass back on your plate, as I've observed here before. If this is the real reason for war, then you have to support another fifteen such wars right now or be accused yourself of the same support for tyranny elsewhere. Wilsonian idealism at this level is an all-or-nothing thing. Once you agree that we must bow to what is pragmatically possible, that cost-to-benefit matters, the war in Iraq is open to criticism.
The biggest intellectual sin, among many, in Chomsky is that there is no information or data which would lead him to rethink his arguments, nothing which can falsify his case. If it were revealed tomorrow that Saddam Hussein was three weeks away from planting nuclear weapons in the fifty biggest cities on Earth or had made a deal with space aliens to sell humanity into slavery, Chomsky's case wouldn't alter one iota. It would still be the fault of the United States and the war still wouldn't have been justified.
I think most of the defenders of the war have backed themselves into the same predicament: there is nothing that could ever falsify their case, nothing that would make them re-think, no event or information that would require a careful reconsideration of their arguments. Not Paul O'Neill, nor an occupation that has from the outset gone the way that the critics of the war predicted it would, nor the absence of deployable WMD, nothing.
The reasonable thing to do is to be predictive. What information would change your mind, whatever your feelings are? What developments would alter your assessment? Say it now and say it explicitly.
For me, it's simple. The discovery of actually-existing WMD that was readily deployable; revelation of substantial, sustained connections between Hussein and al-Qaeda or similar groups; evidence that Hussein was independently preparing to order or support terrorist attacks on the United States, Western Europe or other nations; evidence of imminent plans for territorial aggression against Iraq's neighbors. All or any of that would be sufficient for me to concede a reasonable case for the immediate war as it was conducted, even with its enormous costs and risks. The other thing that would change my thinking is a change in what I understand to be the cost-benefit ratio. If next year the United States withdrew in an orderly fashion, a meaningfully democratic government was elected in Iraq, and neighboring regimes also made serious moves to democratize and liberalize their societies, I'd be glad to confess my error.
For the pro-war advocates, is there anything that would change your mind? If Dick Cheney gave a national speech where he said, "Yeah, it's all about Halliburton," would you feel differently? Is there any series of events that would change your assessment? If the occupation was still going five years from now and 10,000 Americans had died in the conflict in the interim, would you feel differently? If there's nothing that fits the bill, then seriously, stop blogging about it. Stop writing about it. There's no point: this is faith, not reason, and there's no need to bore the rest of us as you bear witness.
January 13, 2004
Dear Rush Limbaugh:
if you have an iota of self-respect, you'll make an explicit, humiliating apology to Donovan McNabb. I don't follow football that closely, but I usually watch the playoffs. McNabb may not be the greatest quarterback of all time, but he's clearly one of the best quarterbacks playing right now. I'm not just talking about the 4th-and-26 miracle last Sunday, which was freakish, but about most of the rest of his play, especially the scramble and pass into the end zone and his general cool-headed performance in a game when his offensive line actually failed him a number of times.
As a veteran viewer of Mystery Science Theater 3000, I'm always on the lookout for great new bad movies. My brother Kevin thinks he's found one, the kind that can become a cult experience on par with Plan Nine From Outer Space. It's called The Room. Watching the trailer for the film, I think Kevin might well be right.
January 9, 2004
Uncharacteristically brief notes:
1) John and Belle's daughter Zoe reminds me a lot of my own daughter, and I particularly found John's entry about toddler storytelling interesting. Emma managed to creep me out a bit the other day with a weird Book-of-Revelations tale that began with the lines "He is the passenger dog that leads the way. He holds out the signs." The Passenger Dog kept popping up throughout the story, along with the Walking Road. I also enjoy Emma's use of reasoned argument, which actually tends to be pretty sensible and rigorous and tends to expose both arbitrary "cultural" rules (like questioning why Mommy sleeps with Daddy given our assertion that big boys and girls need to learn how to go to sleep by themselves) and hidden constraints on what is possible. Lately she's been really jonesing for a pet and is becoming increasingly wily in her arguments. Here's the latest round:
Emma: "I miss our old cat". [Cat died when Emma wasn't even quite a year old; I don't think she actually remembers the cat, but she knows we had one and has seen pictures.]
Melissa: "Yes, she was a good cat."
Emma: "We should get another cat now."
Melissa: "Honey, cats give Daddy asthma, and he's been much better since the cat died."
Emma: "I could help Daddy with his asma."
Melissa: "It makes it hard for him to breathe."
Emma: "I could help Daddy breathe."
Melissa: "I don't think so, honey."
Emma: "We could keep the cat away from Daddy."
Melissa: "Where?"
Emma: "In my office." [she gets to use the old computer in my home office, and is rapidly colonizing the whole room]
Melissa: "That's Daddy's office, honey."
Emma: "We could get him a new office."
2) I really appreciate a lot of what Erin O'Connor has to say but like Chun the Unavoidable, I found the anonymous author of an MLA "exposé" creepy rather than sympathetic, and her latest martyr to academic oppression, while unquestionably someone who has been treated unjustly, is at least a more complex case than she lets on, given that he is actually advocating constraints on the academic freedom of his colleagues at Cumberland College by promoting much tighter religious requirements in hiring and curricular design. I don't see why the complexities can't be up front from the outset.
3) I love the idea of more money for space exploration but I don't get what's wrong with sticking to robots and probes for the moment. Building lunar bases when there is nothing to keep us on the Moon except national pride or a Mars-or-bust project seems to be more cart-before-horse stuff, just like the shuttle and the ISS. I'm skeptical about the "look at all the cool stuff that got invented because of Apollo" argument in that it gets applied selectively--it's certainly an equally good argument for massive military expenditures, for example.
4) Some editing of my blogroll, which is really just mobile bookmarks, reminders for me of what to read. I'm especially happy to add Russell Arben Fox, whose blog I really enjoy.
Leading Horses to Water
Lately I've been wrestling with two complicated experiences, things I wanted to write about in this space, but felt inhibited about exploring fully. So my language here will be a bit oblique. What is spurring me to write is, first, a disappointing experience with a grant I was hoping to get, and second, ambivalent feelings I have about a government-funded educational project that I have an institutional connection to.
What I'm concerned about is the nature of incentive in academic life. Many of academia's critics see here the greatest specific damage that the tenure system inflicts on higher education: the lack of a risk-reward calculus post-tenure gives few scholars an incentive to excel, and punishes few for the failure to do so. This isn't quite the way I want to come at the problem. For one, I think it involves a highly questionable assumption that such incentives meaningfully and typically exist in business or other institutions, and are the main engine of creativity and innovation. But I recognize the importance of the general issue.
How can any given
institution articulate a set of preferences or desires for particular kinds
of scholarship, teaching or activity from its faculty, and do you need rewards
or incentives to get otherwise independently-minded faculty to respond to those
desires? What makes academics change course to pursue a particular kind of reward?
There's an obvious bookend to this question, and that's whether there are any meaningful sticks at all to go alongside the carrots, but that's a matter for another essay, and more directly involves some of the problems with tenure.
Here's my list of meaningful institutional incentives that I can think of, ranked roughly in the order of their importance to the average tenured faculty member (obtaining tenure being an almost separate class of initial incentive):
1. Secure, reliable, long-term commitment to funding sabbatical leaves for an individual faculty member
2. Significant permanent salary increases
3. Structurally permanent course releases
4. Short-term course releases
5. Short-term programmatic stipends
6. Permanent relief from some or all service and administrative work unconnected to the faculty member's personal interests or objectives
7. Significant autonomous control over a dedicated, personally customized institutional unit or resource (major research project, institute, department, etcetera)
8. Personalized endorsement or warm acknowledgement of a faculty member's research or pedagogical projects by top administrators
9. A strong degree of privileged access to administrative decision-making for an individual faculty member
10. Generalized administrative endorsement of an overall position or faction to which a faculty member subscribes within the institution
11. Committed, differentiated administrative and collegial non-interference in an individual faculty member's perception of his or her own pedagogical, scholarly and administrative domains
12. Generalized support for faculty as a whole in grant-seeking efforts
There are some more general "atmospheric" incentives I could describe, but faculty are often not aware of them as incentives until the atmosphere or culture shifts negatively for some reason. There is also a class of incentive that can be powerfully motivating but that mostly lies outside of any single institution, most crucially seeking reputation capital through scholarly production within a specialized field or discipline, publishing textbooks or other works for profit, and seeking credibility and circulation within the wider public sphere. Institutions can decide to help faculty seek those rewards with sabbaticals, salary replacement and so on, but not much more than that.
Not all of these incentives are equally motivating to all faculty. When I recently helped put together a faculty seminar here, I was struck by how the offer of a course release was a powerful disincentive for some of the faculty who were attractive potential participants. Nor can all of these incentives be found at all institutions. Swarthmore, for example, does not give a significant raise at promotion to associate or full professor (the title is largely honorary), and we give relatively minimal merit-based raises. Most smaller undergraduate institutions avoid giving structurally permanent course releases or structured relief from administrative or service responsibilities, unlike large research universities, where such incentives are often part of the package used to woo especially desirable senior scholars. The intangibles are sometimes the strongest rewards for certain people: I know that getting some sense that Swarthmore officially shares my belief that a generalist and interdisciplinary approach to faculty development is more in line with our institutional values than continued narrow specialization would be a more desirable reward, and more motivating to me, than more money.
In any event, most
of these incentives are generally designated rewards for individual productivity
and achievement, and often work in concert with disciplinary or public rewards
for scholarly achievement or general reputation. They are rarely used to reform
pedagogy or academic administration, to encourage academics to study one subject
more or another subject less, to achieve any more targeted institutional or
social goal.
For those more targeted goals, the only incentives that matter are internal and external grants and funding that are differentially targeted at particular kinds of research or institutional reform, or some kind of highly focused top-down advocacy and support from the top reaches of academic administration in a particular institution.
If an external agency or internal administration actually wants to change academia in some particular fashion, even if it is merely to encourage the study of some new discipline, or to endorse one approach over another, it has to think very carefully about how to proceed. These kinds of incentives, properly designed, really can have a major impact, but it is very easy for them to be diverted, co-opted or ignored if they're not actively looked after by the people who set them in motion, if they're not very precisely defined and targeted at the outset, or if they're paired alongside an incentive that pulls in another, contradictory direction.
In the work I've done with several foundations giving academic grants, I've been impressed with the extent to which they have been very clear about the specific kinds of things they wanted to reward, which went well beyond simple excellence. (Say, for example, wanting to diversify the pool of institutions receiving support for their doctoral students, and to widen the range of disciplines being rewarded.) To stay on target takes not just precise standards but also constant monitoring and intervention by the grant giver. If you want to reward some particular kind of behavior or choice among academics, you have to build in some kind of evaluative weighting when making your choices, and aggressively shepherd decisions so that the weighting is always taken into account.
In the case of my own disappointing experience, I applied with a collaborator from the sciences for a grant that I understood was supposed to reward collaborations across disciplines, but essentially each of us was evaluated separately, as if we were two unrelated people who each had to excel above the competition in isolation from the other. If you don't build in a weighted reward for applying as interdisciplinary collaborators, so that all non-collaborative projects are at a structurally inflexible disadvantage no matter how excellent they might be, you're actually discouraging collaborative applicants. It makes it twice as hard to be selected: your project has to pass muster twice, and is subject to twice the possible political and institutional vagaries in that judgement. It's hard to reflect on a grant you didn't get without descending into sour grapes--you always have to take seriously the possibility that your proposal was flawed, and you also have to know, if you've been a part of making decisions about grants, that the decision often comes down to very small, fractional differences between generally excellent applicants. But I definitely walked away from this one feeling that the case revealed a problem with how some grant-givers actually look after the goals they have set.
On the other hand, you can look after those goals too aggressively or inflexibly. In the case of the federal project I'm involved with, it's clear what the grant-givers would like to encourage, but they're so rigid in their approach (because of statutory limitations and a mass of their own home-grown bureaucratic regulations) that most of the things they'd like to see come to pass will not happen, either because they entail considerable extra make-work for participating faculty or because they are non-adaptive to particular institutional cultures. So there's a disincentive to respond to this intended incentive, making it stillborn. (One suspects this is fairly common with a lot of federal grants.)
I think it is very possible for foundations, political groups, philanthropists and governments to shift academic institutions this way or that, to encourage or discourage particular kinds of research or teaching, and to reasonably hope that there will be visible general social benefits from these initiatives. This can be done indirectly, through various incentive structures, rather than through crude statutory restrictions on public universities or other blunt instruments. (Or, in the case of a reformist college administration, I think it's easier to gently push things in a new direction than to pursue pogroms and the like in the neo-Stalinist approach favored by John Silber.)
Any institution thinking along these lines is best advised to be precise, be clear and stay heavily involved at every stage of the process of change, because academic inertia is a powerful, pervasive force, and tends to quietly and without malice subvert well-meaning hands-off forms of benevolence to its own ends, to reinforce the status quo.
January 6, 2004
Happy New Year to all.
Kind of a tough end to the holiday season for me. Our post-Christmas travels took us to meet family in Las Vegas, which was interesting, but I couldn't help but feel nervous about terrorism (and I'm still feeling an enormous sense of dread). More burdensome has been the horrible week-long viral syndrome that our group picked up at the end and dispersed with family members back to their various homes: my wife and brother both report that it is "the worst they have ever felt in their lives", which is saying a lot, and my 3-year old, lacking the perspective to make the same claim, nevertheless seems to have felt the same way. I am so far unscathed but I feel under the sentence of doom--and I'm doing my civic best to hide away from anyone so as to avoid unleashing it on my neighbors and colleagues.
January 6, 2004
The Real Third Party, or Why Some Conservatives Ought to Vote for Howard Dean
Howard Dean is not my favorite candidate (of the available field, I like John Edwards best), but it's screamingly apparent to me that a Dean Administration would be hugely preferable to a second Bush Administration. I'm not talking modest improvements here, but the difference between the continuation of the best traditions of American democracy and the continued magnification of serious internal and external threats to those traditions under Bush. More importantly, I think that there are quite a few conservatives who ought to feel the same way if they're at all true to their convictions, and I am baffled about why they do not.
Let me be clear here: I am not a registered Democrat, nor would I vote slavishly for any of the party's candidates. If by some bizarre twist of fate Kucinich, Braun or Sharpton were nominated, I'd rather cover myself with honey and lie on a hill of fire ants than walk into a booth and pull the lever for any of them. If I were compelled to vote in that circumstance, I'd pull the lever for Bush and then scrub myself with steel wool for a week afterwards. I'd feel only slightly less viscerally repelled if Gephardt were the nominee, but I'd probably still stay away from the voting booth in that scenario. If it were Joe Lieberman, well, I'd vote for Bush, just because better the Bush you already have than the Bush you don't: Lieberman is like Bush and Ashcroft rolled into one person.
Dean is a different kettle of fish. His actual record in governance is moderate, and for all that Karl Rove's little team of operatives (and some of Dean's rivals) would like to tag him as a wild-eyed ultra-liberal, most of his actual positions are reasonably mainstream, and at times more conservative than those of the bulk of his competitors (on gun control, for example). The most liberal thing about him is his unbending opposition to the war in Iraq, which is only the first thing that should make him a preferable choice for some conservatives. The main thing that makes him preferable is that Dean is not George Bush, and that his past record and stated positions, especially given the likelihood of a Republican Congress, make him inevitably less harmful to certain kinds of conservatism than George Bush has been and will be.
Let's start with the libertarian branch of conservatism, in either of its chief manifestations: the bedrock defense of civil liberties and individual freedom to act, or the core belief that the government which governs least governs best. For anyone whose conservatism primarily originates from these convictions, George Bush is unambiguously the most dangerous American President since Franklin Roosevelt.
It should be enough to note that the initially alarming interest that the Ashcroft Justice Department took in neo-censorship prior to 9/11 has turned out to be less of a threat to civil liberties than one might have supposed at the time, but only because the Administration has been too busy pursuing a much more breathtaking assault on the Bill of Rights. Any conservative who comes from a libertarian perspective ought to be openly terrified by the Patriot Act and its various bastard policy offspring, and most of all by the Administration's stated intent to ignore constitutional protections and rights for American citizens (as well as non-Americans: the obligations of liberalism are universal, or so we're often told) and even to deny the validity of judicial review of its actions. Let me go over this again: this is an Administration which has asserted that it can deny constitutional rights to American citizens based on its private, classified and secret determination of whether someone is an "enemy combatant", and has asserted that the courts have no right to review such a determination. This is also an Administration which has asserted that critics of its policies are aiding or comforting its enemies, and so for the first time in a long time, it doesn't require a paranoid to be nervous about the short slippery slope between secret, unreviewable determinations that someone is an "enemy combatant" and a disdain for all opposition and criticism. For the first time since the Nixon Administration clashed with the Supreme Court over executive privilege, I think it's fair to honestly wonder whether a second Bush Administration would actually bow to a ruling by the Court that it cannot arbitrarily deem Americans seized on American territory to be enemy combatants. That's assuming that Bush doesn't have an opportunity to pack the Court first. What's the comparable threat from Howard Dean? What, he might say something blandly positive about a schoolmarmishly oppressive hate-speech code on a college campus?
Suppose that's less important to you as a libertarian-leaning conservative than the feeling that the federal government should be smaller and less intrusively involved in local and state affairs. Again, the Bush Administration is in this respect vastly worse than the Clinton Administration or any other post-World War II presidency save perhaps Lyndon Johnson's, and not merely on national security grounds. The Administration's assertions of federal power over a huge range of issues, many of them not at all related to national security, have been sweeping and precedent-setting. It's back to the days of unfunded or underfunded mandates from Washington roughly and heedlessly overriding local prerogatives and standards. Purely from a checks-and-balances standpoint, a Dean Administration would have to be preferable: even if Dean wished to be as pervasive in his use of federal power (and the evidence from Vermont is that he won't), he's going to be checked by both Congress and the courts.
Let's suppose your conservatism is instead about good fiscal policy and a healthy respect for free-market capitalism. I grant you that some of the Democratic candidates are anathema to a conservative of this kind, particularly Richard Gephardt (not to mention the no-hope fringers like Kucinich). Howard Dean, on the other hand, has a quite reasonable record in this area. In contrast, George Bush does not. He has been one of the most protectionist Presidents in recent memory, and in a way that is nakedly, avidly about personal political gain. In some ways, a philosophically committed protectionist might be better from the standpoint of sound fiscal management, because at least in that case, the protectionist in question might not make policy on the basis of seeking votes in Pennsylvania but instead with a strategic economic vision in mind. Bush's protectionism is of a piece with his drift towards crony capitalism, and any conservative whose political ideology is primarily about sound economic policy ought to view that drift with alarm, given the devastating impact of similar economic policies in much of southern and eastern Asia. Leaving that aside, the President's staggering indifference to deficit management and his heedless off-loading of fiscal burdens onto state and local governments ought to be equally troubling. Whether Bush is really a big-government spendthrift or is just cynically forcing some later Administration to radically downsize government because he lacks the political strength or will to do it himself, he is still appalling.
You might think that a strong-defense, national-security conservative would at least find Bush preferable to Dean, but at least for the rational-pragmatic school of conservatives, Dean is a better candidate. I grant that Dean's unwavering commitment to pull back from Iraq is going to cause a number of problems: his election would immediately destabilize Iraq still further (if that's possible). But that mess is not of Dean's making, and cleaning it up will be hard for anyone. What is more important for someone whose primary concern is with maintaining American strength in the world is that Bush's once and future mismanagement of the most crucial challenge of our times is a mortal danger to American influence, not a strengthening of it. The war in Iraq, or more specifically, the bluntly incompetent handling of it by Bush and his advisors, has done enormous damage to the power of the United States, damage that it will take a generation of leaders to undo. Dean is not the man to begin that work, but he will at least staunch the bleeding and prevent further self-inflicted wounds. Dean is not the ideal candidate for a national-security conservative, not the man who best knows how to be strong where the U.S. needs to be, and in the ways it needs to be, but he is by any standard preferable to Bush. He can begin the process of reconstructing our influence and strengthening the struggle against terrorism simply by not being George Bush.
I suppose it should be obvious that a neo-isolationist, narrowly nationalistic conservative like Patrick Buchanan should be opposed to Bush, but given that the net effect of Bush's policies is isolationist, perhaps that's not so.
So what's left on the right? Who should really want Bush rather than Dean? Only two kinds of conservatives, as far as I can see. First, neoconservatives, who, as a colleague of mine has observed, are really the strongest contemporary disciples of the Wilsonian tradition of idealist American foreign policy, the naïve belief that the United States can compel the world by military force to become the world we desire. I am not the first to observe in this light that it is hardly surprising that many of the neoconservatives have intellectual and personal roots in the statist left, and that their actions have been largely consistent with a philosophy that celebrates the possibilities of compulsion exercised by an overwhelmingly strong government, with little interest in the constraints imposed by respect for the rights and freedoms of the governed. One has to wonder where popular anti-intellectualism is when you need it, because the influence of neoconservatives on the Bush administration is vastly out of proportion to their actually existing demographic or political presence in the electorate. They don't speak for anybody besides a fairly narrow if influential group of inside-the-Beltway elites, but if you're a committed neoconservative (make that a latter-day Wilsonian idealist who believes that military power alone is sufficient to compel the world to be as we wish it to be), then by all means, vote for Bush. He's your man.
Who else? Well, the one major demographically important segment of American conservatism that ought to be for Bush rather than Dean is the religious or cultural right. For a conservative who couldn't care less about the size of government, or about pragmatic assertions of national strength in the world, or about sound fiscal management, and who primarily sees the President as the leader of a moral crusade to purify American society, Bush is clearly the best choice, not only over the Democrats but even within the Republican Party. No Republican leader in the past forty years has had the will and boldness to pursue the chosen agenda of this constituency with such unrestrained gusto. If this is your conservatism, there's no question about who you ought to vote for.
What I don't understand is why libertarian-leaning or pragmatic conservatives are willing to go along with the modern Republican Party's captivity to interests that they ought to view as anathema. The Western Republican Party has become a kind of impotent wart on the ass of the Southern Republican Party. In this respect, the elections of Arnold Schwarzenegger and Jesse Ventura really should serve as an indication of the electoral viability and political legitimacy of a genuine third party in the United States: a socially and culturally libertarian, fiscally prudent, pragmatic party that is strongly committed to checking the authority and size of government without compromising its necessary functions and positive capabilities, strongly supportive of market capitalism but anti-monopoly and anti-cronyism.
This is the political faction that speaks for what Jonathan Rauch and others have called the "radical center": not a center that is the proverbial dead armadillo in the middle of the road, choosing a little of this and a little of that from the ideological smorgasbord in order to bolster poll numbers (as technocratic, managerial politicians like Michael Dukakis, Bill Clinton, George Bush the Elder or Al Gore have done). This is a center that has a coherent political philosophy, a consistent set of convictions that sets it apart from both the old statist, unionist, urban core of the Democratic Party and the cultural fundamentalism and neoconservative idealism of the current Republican Party. This faction is ill-served by both parties, but at the moment, the most pressing threat to its interests and needs by far comes from George Bush.
Any conservative who is not a committed member of the religious right or a neocon needs to give serious thought to Howard Dean. He cannot possibly be worse than Bush for the abiding interests and beliefs of those conservatives, for the "radical center", and he quite possibly could be substantially better. If you're a libertarian, a fiscal conservative, or a pragmatic conservative, and you would like to see candidates that you can vote for with passion, then the time has come for you to consider leaving your party altogether--just as there are Democrats who ought to think about doing the same. But that's for the future. What matters now is that the majority of Americans, conservative, centrist, liberal, libertarian, what have you, need to stop George Bush before the wounds he is inflicting on America become mortal--even if that means pulling the lever for Howard Dean.