May 20, 2004

Busy couple of weeks here with grading, Honors exams, and some family matters, so blogging has been and will be lighter than usual for a bit.

May 20, 2004
Preparing a Place in the Museum of Failure

Norman Geras
argues
strongly that as a supporter of the war in Iraq, he bears no responsibility
at all for Abu Ghraib. I agree that those who supported the war with a rigorously reasoned case do not have to feel personally responsible for Abu Ghraib. I do think it is appropriate to hold war supporters directly responsible if (and only if) they fail to regard systemic abuse there and at other American military prisons as a grave concern by the very same criteria by which we held Hussein's misrule a concern.

Abu Ghraib
does have serious consequences for at least some of the arguments in favor
of the war, and I don't think one can dodge those consequences. It's possible
but highly unlikely that this is merely seven bad apples doing bad things--even
if that were so, this is where the basic point about oversight comes in.
A failure to have effective oversight is a guarantee of "bad apples"
having impunity to do what they do. The furtive, paranoid unilateralism of the current Administration, its stonewalling of entities like the Red Cross, its apparent indifference to due diligence within its own institutional frameworks, made Abu Ghraib inevitable. Beyond that, however, the evidence is considerable that this abuse was not merely an accident of mismanagement but a deliberate policy, deeply integrated into the Administration's entire approach to the war on terror.
Supporters of the war do need to regard that as a serious issue for their
case, because the war cannot be supported as an abstraction. It can only
be supported as a concretized, real-world project, and if it is done badly
in the real world, it eventually will do (and I think already has done) as much damage as the things it set out to fight. If you support the war
as part of a battle against illiberalism, then illiberal conduct by your
own "side" in the war has to mean something to you, have inescapable
implications for your struggle. You can't just shrug off the creation
of a gulag in Guantanamo where people have no rights, or evidence of a
consistent policy of humiliation and abuse.

To understand this as a conflict that is resolvable strictly through military means or through the imposition of formalist structures is, to my mind, to absolutely and completely misunderstand the nature of the larger conflict against terrorism. To extend the military trope, it's the equivalent of fighting the wrong battle with the wrong weapons in the wrong place--and in military history, that's how you lose a war even when you may have superior resources and force at your disposal.

Those who do misunderstand it this way almost all share two things. One is a belief in the universal and inescapable obligations of modern liberalism. It's no accident that some Marxists, some liberals and many neoconservatives have found the war attractive, because they all derive tremendous intellectual strength from universalist frameworks. This I find laudable and important, and I recognize many supporters of the war who take this approach as intellectual cousins. (With those who do not share this commonality, like the parochialists and chauvinists on the American right who have endorsed brutality at Abu Ghraib, I recognize no connection.) But these
supporters on both left and right share another attribute which I do not
share: a belief that liberalism comes from above, that it can be imposed
by power, that it emanates from the structure of the state and is guaranteed
by securing a working monopoly on the means of violence. Equally, these
thinkers share a belief that illiberalism and oppression emanate from
the top, have their source in malformed states and ruling elites who have
illegitimately seized control of the state in spite of the natural and
rational desire of most people for liberal democratic norms. In essence,
many of them--some from the left, some from the right--are statists. This
is what the shorthand of "Wilsonian" is all about: a grab-bag
aggregate that usefully links ideologically diverse arguments through
their common understanding of the nature of political change and the sources
of illiberalism in the world.

Fundamentally, this is a clash between different models of change-by-design in the world, of how one does praxis. Even when I was more strongly influenced by Marxism, I was always drawn to the Gramscian vision of politics, to the notion of a "war of position," because that seemed to me much closer to how meaningful, productive, generative change in the world actually
comes about, in the messiness of everyday life, in the small and incremental
transformation of consciousness. I do not believe, and have never believed,
in revolutionary change, in the proposition that a sudden, sharp disjuncture
between the flawed present and the shining future can be produced by a
seismic transformation of social structure directed by the state, by political
vanguards or other major social institutions that possess strong governmentality.

Real revolutions
happen in history, and they are genuinely disjunctive, deeply and abruptly
transformative. The ones that are productive largely happen by accident.
They happen because smaller social transformations have been building
towards a point of criticality, towards a sudden phase change. They do
not happen by design or intention. Real revolutions can be guaranteed
by changes at the top, by the creation of laws and rights and constitutions,
but they don't come from those things.

False revolutions
happen in history, and they are much less disjunctive than their supporters
pretend. These are the classic political revolutions, the ones that try
to force history into a new mold by totalizing design, from above. They
can do almost nothing generatively useful at the level of real social
change: they can only destroy and terrorize. They cannot create. The only
good example we have in modernity is the American revolution, and it is
notable that its most fundamentally radical achievement was to specify
constraints on its own transformative capacities. Its moderation was the
essence of its radicalism, and the source of its long-term fecundity.
Power has a thermodynamic character: good things can happen when more energy is added to an existing system, but only if those bringing power to bear have modest ambitions and tremendous respect for serendipity and unintended consequences, for the organic evolution of events. The more ambitious the design, the more totalistic the ambitions, the more fatal and destructive the consequences are likely to be. A human world fully imbued with the humanistic values of the Enlightenment is a world we all should desire, and we should judge the world harshly where it falls short of that. But this is where we have to have faith in the desirability of those values, and play the game steadily towards victory.

It is the
velvet revolutions of 1989 and the 1990s at which we should cast our covetous eyes. The fall of the Berlin Wall and the defeat of apartheid are the real triumphs of our age. No invasions or interventions have a share in those victories, but the resolute moral and political will of many states, civil institutions and individuals--backed where necessary by military power--can claim a great share of the credit. I don't
deny that on occasion, positive revolutionary-style change does come from
above, but this is a rare circumstance, and all the historical stars have
to be in alignment for it to happen. That was not the case with the war
in Iraq.

The Iraq War's structural failure is that it is closely allied to the false revolutionary project, to statism, to the belief that social practice can usually be made highly responsive and conformant to the will of strong power, if only that power articulates its will clearly. This is the failed conceit at the bottom of the well, and where Iraq differs from Afghanistan. Afghanistan I support because its primary logic was self-defense, and its secondary logic put forward a sensible, consistently reasoned proposition: that failed states represent a clear and imminent danger to the security of liberal democratic nations. The national security logic of Iraq, in contrast, was weak before the war and has gotten dramatically weaker since. Alongside this deep philosophical shortcoming, the failure at Abu Ghraib is indeed a sideshow. It is the deeper failure that the reasoned supporters of the war need to hold themselves accountable for. The Iraq War will take its place eventually as an exhibit in a museum alongside Cabrini-Green, state-run collective farming, Brasilia, the Great Leap Forward, Italian fascism, and other attempts to totalistically remake the substance of social practice from above.

May 13, 2004
Welcome to Paragon City

I'm
supposed to write an assessment of Star Wars: Galaxies, and I've been putting it off because I feel I need to go in and play the game again just to challenge my established prejudices. The conventional wisdom is that a massively multiplayer online game needs a year to be judged. But I'm dreading it: I follow news about the game, and it seems to me that there may just be things about it that can't be fixed. SWG left a bad taste in my mouth about MMOGs. All that expertise, all that prior experience, all that money, and a franchise that you'd think was a can't-miss proposition, and the result was a worse-than-average experience in a genre of games that is already very unsatisfactory.

As a consequence, I have been looking at every other MMOG coming down the pike with equal presumptive hostility. In particular, I was sure that City of Heroes, an MMOG with a superhero theme, would be a disaster. When the committed cynics at Waterthread started saying nice things about the late beta, I began to wonder. Now I've been playing it for a couple of weeks, mostly on the Protector server, with a martial artist character named "Faust", and I have to admit it: I was wrong.
Maybe the most astonishing thing about the game is just that the first day of play went off without major technical glitches, and that so far, there are very few disastrous bugs or technical problems. The major issue at the moment is that one type of mission doesn't work correctly, but it's easy to avoid doing those missions. There's a lesson here that's crucial. The only other game of this kind to launch well was Dark Age of Camelot, and it shares with City of Heroes a basic simplicity and cleanness of design. It's clear: don't try to do too much by your launch, and keep your design as minimalist as you can. I'm also hugely impressed by the communication from the developers: they tend to be very forthright, very out in front of problems.

Many small
features in City of Heroes are well-implemented. For example, I really like that when I get missions from my contacts, after a certain point I can just call them remotely to tell them the mission is completed--I don't have to run all over creation to tell them. There are a few classic MMOG issues that are in some ways worse in City of Heroes than in any other game: what people call "kill stealing" is for some reason uniquely aggravated in the evolving culture of its gameplay. The game also has a treadmill just like any other MMOG, and I still maintain that's unnecessary, that designers are not thinking properly about how to scale challenges over time, and insist on making "hard" mean "time-consuming". And finally, as is de rigueur for MMOGs, there are some really dumb and unoriginal names and designs for characters out there. I've seen huge numbers of Wolverine and Punisher clones. On the other hand, I haven't seen a single Legolas yet.

There are
also some things I'll be looking for the designers to do in the months to come that will help the game be more evocative of comic books. For one, I'm getting very tired of fighting cookie-cutter enemies: there should be colorfully individual supervillains at every level of experience. That's the essence of the genre, and it's sadly missing from the lower-level gameplay and even from the mid-game. In fact, how about every character created getting an archenemy, a supervillain who pops up from time to time to attack your character? There are other elements of superhero narratives that need implementation in some way eventually. Secret identities and all that comes with them are completely absent. The mission storylines are pretty decent--I saved a mechanic and his family from some robots and now civilians remember that I did so--but there need to be more plot types, more content that evokes classic superhero tales. There need to be major public events--say, each city zone being attacked by giant robots, with everyone pitching in to repel the menace.

I'm still going to play SWG later this month to be a responsible critic, but when I want to have fun, I'm going to be battling evil in Paragon City as the mysterious and inscrutable Faust.

May 12, 2004

Email woes: if you've sent me email in the past week, it's been sitting in the spool waiting for a very bad technical problem with my email to be ironed out. Problem solved, but be patient--it's going to take me a while to work through 300+ emails.

May 12, 2004
In Nothing We Trust

"Free us from oversight," said the Bush Administration on September 12, 2001, "because you can trust in our professionalism and our ethical constraint. We're the good guys. We won't do anything bad."

President Bush more or less repeats this mantra today in response to the escalating scandal of American prisons in Iraq, Afghanistan and Guantanamo: that it was just a few bad apples, that we're a democracy, and that this shows how great democracy is, that it can expose a few isolated misdeeds. Trust us. The world's collective jaw drops. Does he really believe that? If so, he's even more isolated and naïve than anyone has suspected. If not, then he and his inner circle are just as calculatingly grotesque as the most spectacular conspiracy theorists have portrayed them as being.

Look at what
those photographs show. What anybody with an ounce of common sense knows is that the scenes being staged in them were not dreamed up by a bunch of reservists. It's got the stench of military intelligence all over it. I'm sure we'll hear in the courts-martial to come that no direct orders were given to have any of this happen, or that it was only private contractors. How stupid do they think we are? You can see it easily: an intelligence chief says to the grunts, "Hey, we need some information out of these guys. See if you can figure out a way." A few minutes later he says, "Hey, I heard from a buddy that Muslim men really freak out about nudity." No orders were given, sure. John McCain's fury at Rumsfeld during the hearings was clearly about this issue. We all know how it works, and we all know that what happened in the prisons goes right to the top. Not in the abstract, "I take responsibility" sense (though what does that mean? In what respect is Rumsfeld or Bush doing anything when they say that?), but in the quite concrete sense that permission to torture and humiliate Iraqis was sanctioned by the highest reaches of the hierarchy.

A few months
back, Mark Bowden published a spectacularly foolish article on interrogation and torture in the Atlantic Monthly in which he mistook a kind of abstract ethical bullshit-session thinking about torture for the actual institutional practice of it. I agree that there is a kind of thought experiment on torture and coercion that we have to undertake in an open-minded manner. If you knew with 100% certainty that a suspect in your custody knew where a nuclear weapon was hidden in a major American city, and that if you didn't find out its location within 24 hours, it would be detonated, I think most of us would say, "Whatever it takes to find out, do it."

That is a fiction, a thought experiment. Bowden's defense of torture, in response to many angry letters observing that it is very rare for interrogators to actually know who is guilty or who possesses information that justifies coercion, was basically, "Well, I'm only justifying this in the case of the real professionals, who know what they're doing, and won't ever misuse coercion or torture for anything less than a vitally necessary end."

Welcome to
yet another fiction or abstraction, and a remarkably stupid one at that.
When is this ever the case? What real people in the world have the necessary
professionalism and the necessary factual knowledge of the specific information
held by a prisoner? In practical terms, none. As Adam Ashforth has argued
in his work on commissions of inquiry in South Africa, states use coercion
or torture largely to demonstrate that they can. It's a performance of power--and that's mainly what US soldiers have been doing in prisons, torturing and humiliating captives just to demonstrate that they can do so. Bowden says, "Trust them." The whole point is that you can't and you mustn't, regardless of how clear-headed or fair-minded the aspirant torturer might be.

The domestic
and international terrain on these issues intertwines. Many critics of
the Bush Administration charge it with an assault on the U.S. Constitution.
Sometimes these charges get hung up on the details of particular cases,
or on antipathy towards particular individuals like John Ashcroft. The
charge is accurate, but what we have seen in the last month is that it's not just or primarily about a set of specific attacks on civil liberties. The Bush Administration is attacking the core philosophy of the Constitution at every moment and in every way that they say, "Trust us."

Amid the wreckage of American legitimacy, nothing stands out more than Theodore Olson and other lawyers from the US Solicitor General's office standing
before the Supreme Court of the United States arguing that in war, the
federal government can do anything that it judges to be a prudential necessity
for winning that war, that no constraints apply and that no explicit powers,
Constitutional or statutory, need be granted to the federal government
to do that which it sees as needful. That the executive branch and the
military need no oversight or review by any other branch of the government.

To hear the official legal voice of the United States government making that argument is the most shameful thing I have heard in my life. The pictures from Iraq are nothing next to it. Olson's argument was the equivalent of watching him drop trousers and take a crap on the Constitution. The
central genius of the Constitution is that it constrains the government,
that it says that government has no powers save those granted to it by
the Constitution. It thoroughly rejects the claim that government must
be free to expand its powers expediently.

That is the
living, beating heart of the United States: that as a people and a nation,
we are suspicious of power. That we recognize that we must never just
trust in power, whether it is interrogators or the President. This has
nothing to do with whether the people who hold power are good or bad people.
Good people, people you like and trust, can misuse power. In fact, thinking
probabilistically, it is a certainty that they will. I can trust an individual
as an individual, but that is very different from writing individuals
in my government a blank check.

Abu Ghraib is about more than the Iraq War, and more than Donald Rumsfeld. It is the purest revelation of the consequences of the Administration's contempt for the core values of American democracy, a contempt that they are spreading insidiously throughout the government of the United States. We have a precious few months to remove that cancer, to uproot that tree root and branch. If we fail in November--and make no mistake, it will be we, it will be the majority of Americans who make the wrong choice, who fail--then I think historians are likely to write that this was the beginning of the end of the American democratic experiment, the moment where the mob handed the reins to Augustus and so guaranteed that one day they would burn, too, under the serenade of Nero's violin.

May 5, 2004
Primal Scream

"Stop with the hindsight," says one writer. "Be patient," says another.

Oh, no, let's not stop with the hindsight. Not when so many remain so profoundly, dangerously, incomprehensibly unable to acknowledge that the hindsight shows many people of good faith and reasonable mien predicting what has come to pass in Iraq. Let's not be patient: after all, the people counseling patience now showed a remarkable lack of it before the war.

One of my great pleasures in life, I am ashamed to say, is saying "I told you so" when I give prudential advice and it is ignored. In the greatest "I told you so" of my life, I gain no pleasure at all in saying it. It makes me dizzy with sickness to say it, incandescent with rage to say it. It sticks in my throat like vomit. It makes me want to punch some abstract somebody in the mouth. It makes me want to scrawl profane insults in this space and abandon all hope of reasonable conversation.

That's
because the people who did what they did, said what they said, on Iraq, the people who ignored or belittled counsel to the contrary, didn't just screw themselves. They screwed me and my family and my people and my nation and the world. They screwed a very big pooch and they mostly don't even have the courage to admit it. They pissed away assets and destroyed tools of diplomacy and persuasion that will take a generation to reacquire, at precisely the moment that we need them most.

Noah
Millman, for one example, is a very smart person who says many useful and valid things, but I find it impossible to understand how he can give George Bush credit for being right on big principles like the principled need to defend liberty, while conceding that Bush appears unable to understand the complicated constraints of real life. The principled defense of liberty is nothing if it cannot be enunciated within the terms of social reality. It's just an empty slogan, and worse, one that makes no distinctions between political actors. Does Millman really think John Kerry--whom he sees as inadequate to the task of leadership--is a principled critic of liberty? Just about everyone besides Robert Mugabe, Kim Jong-il, ANSWER and Doctor Doom believes in the principled defense of liberty. George Bush gets no credit for being right in this respect, and deserves to be soundly rejected for being so, so wrong where it really counts, in the muck and mire of real life. That's the only principled defense that counts: the one whose principles can be meaningfully reconciled with human truths. A policy that insists on living in a squatter's tent in Plato's Cave is a non-policy.

There is
a struggle against terror, injustice, illiberalism. It is real. It will
be with us all our lives. We must fight it as best we can. The people
who backed the war in Iraq, especially the people who backed it uncritically,
unskeptically, ideologically, who still refuse to be skeptical, who refuse
to exact a political price for it, who refuse to learn the lessons it
has taught, sabotaged that struggle. Some of them like to accuse their
critics of giving aid and comfort to the enemy. Right back at you, then.
You bungled, and you don't even have the grace or authentic commitment to your alleged aims to confess your error.

After 9/11,
I wrote about my disenchantment with one very particular and relatively
small segment of the American left and its dead-end attachment to a particular
and valorized vision of sovereignty and national self-determination,
seeing those as the only moral aims of international politics. I criticized
the need to see the United States as a uniquely demonic actor in world
affairs. I still hold to that criticism, and I still think it addresses
a real tendency. I'm sure I'll say it again in the future. I
do regret saying it as much or as prominently as I did. That was about
my own journey, my own arc of intellectual travel from my origins, not
about a national need to smack down a powerful ideology. The subject of
my criticisms was not especially powerful or widespread in general, and
is even less so now.

I regret it because I and others like me helped the blindly naive Wilsonian proponents of the Iraq War to caricature their critics as Chomskyites all. The Bush Administration had its fixation on WMD; Andrew Sullivan, James Lileks, Michael Totten and a supporting cast of thousands had a fixation on the "loony left". That allowed them to conduct echo-chamber debates with straw men, in which the proponents of the war were defenders of liberty and democracy and opponents were in favor of oppression, torture and autocracy.

Small wonder that they won that debate--but constructing it as such allowed them
to miss the very substantial arguments by other critics, who said, "The
war on Iraq cannot accomplish what you would like it to accomplish in
producing a democratic and liberal state in Iraq, no matter how noble
your aims are. The war on Iraq will not enhance the war on terror, in
fact, it will severely damage it. The war on Iraq cannot be justified
on humanitarian grounds without arbitrarily and inaccurately defining
Hussein's Iraq as a worse situation than many comparable others--and an arbitrary humanitarian claim damages the entire edifice of humanitarian concern."

There were plenty of people making arguments like these--perhaps even within the Administration--and they were shouted down or completely ignored before
the war and even early in the occupation. From these arguments, most of
what has come to pass was predicted. Not because of mismanagement--though there has been that, in spades. Not because of the misdeeds of individuals--though there has been that aplenty, both within the Beltway and on the ground in Iraq. Not because the Bush Administration lacked a free hand to do what it wanted--it has had that, more than any US government in memory. But because of deep, irreparable flaws in the entire enterprise.

A war on
Iraq where the build-up was handled much more intelligently and gradually,
with much more attention to building international consensus steadily.
An Administration not addicted to strident purity tests and not irremediably
hostile to both internal and external dissent. An argument for the war
that took pains to build bridges rather than burn them, and that accepted
gracefully constraints on its own claims and objectives. An occupation
that was methodically planned and clear about the challenges ahead. These
are the preconditions for even imagining the ghost of a hope that the
war could succeed in its humanitarian purposes. In their evident absence
from the first moment, the war could not overcome its handicaps.

Liberalism and democracy do not come from formalisms slapped down on top of a social landscape: they come from the small covenants of everyday life, and rise
from those towards formalisms which guarantee and extend their benefits
rigorously and predictably. Constitutions, laws, procedures: these are
important. But they cannot be unpacked from a box alongside a shipment
of MREs and dispensed by soldiers. They do not make a liberal society
by themselves.

To be midwives
to a liberal and democratic society, occupiers have to blend in to that
society, to become a part of it, to work from below, to gain a rich anthropological
sense of its workings and everyday logics. To do that, occupiers must
become vulnerable to insurgents and terrorists; they must hesitate to
use violence. The two imperatives pull in opposite directions, as they must. Smart management can ameliorate or cope with that tension for a while, and there have been success stories of individual American commanders who straddled the two effectively for a while. But the whole enterprise has not, could not, and DAMN IT, some of us knew that it couldn't.

So now the
oscillations grow more extreme. To fight insurgents, one must sabotage
liberty, become not just occupiers but oppressors. To promote liberty,
one must be vulnerable to insurgents, and even risk losing the struggle
outright to them. You can have the rule of law--but if you do, you can't have prisoners kept forever as "enemy combatants" or handed over to military intelligence for reasons of expediency. The law must bind the king as well as the commoner, or it is worth nothing and teaches no lessons about how a liberal society works. Yes, the enemies of liberty will use that freedom against you. That's where the real costs of it come in. That's where you have to sacrifice lives and burn dollars and be vulnerable to attack. That's where you take your
risks.

That this administration, and most of the proponents of the war, would be risk-averse in this way was predictable, inevitable, and not altogether ridiculous. It is hard to explain to military commanders why their troops cannot defend themselves behind barbed wire and walls. It is hard to explain to soldiers why they have to do jobs they're largely untrained to do--to administer, to anthropologically investigate and understand another society, to bow to the cultural norms and sensibilities of others, to advocate and practice democracy. To be risk-averse about liberty is to lose the war, as we are losing it. Not just the war in Iraq, but the broader war on terror. You can achieve liberalism only with liberalism.

Hindsight is 20/20, but some of us had 20/20 foresight. You could have it, too--it would just take joining us in the difficult messiness of social and historical reality.

May 3, 2004
No Longer a Bird in a Gilded Cage

I am sorry
to see that Erin O'Connor is leaving academia. Some see a pattern in recent departures from academia announced on blogs, but the pattern, if it exists, is mostly that scholars who have been trapped in the labyrinth of part-time teaching or work at the periphery of the academy have decided to let go.

O'Connor is different. It's very rare to see a tenured academic in the humanities voluntarily leave a post, particularly one at a good institution, and to choose to do so for ethical and philosophical reasons. I've known of only a handful of similar cases, and they've mostly involved people seeking some form of personal fulfillment or emotional transition that they think is unavailable in their academic lives. O'Connor seems to have a little of that in mind as well, but her explicit reasoning is more that she views the contemporary academic humanities as unreformable and corrupt. I have a lot of respect for her enormous courage in choosing to leave. Not only is it hard to turn your back on the gilded cage of lifetime job security, it is hard to leave behind that part of your own self-image that is founded on being a scholar in a university environment.

I share many, if not all, of O'Connor's misgivings about the American academy in its present form. Academia, especially in the humanities, often seems to me narrow-minded, parochial, resistant to the forms of critical thought that it allegedly celebrates, and possessed of a badly attenuated sense of its communicative and social responsibilities to the publics which sustain it. In many research universities, teaching remains privately, sometimes even openly, scorned. There, scholars are sometimes rewarded for adherence to self-confirming orthodoxies of specialization and mandarin-like assertions of bureaucratized privilege. And what one exceptionally dissatisfied respondent at Crooked Timber said is all too close to the truth: some academics are tremendously pampered and intensely unprincipled, in ways only truly visible to insiders behind the sheltered walls of academic confidentiality.

However, I'm not leaving, or even contemplating leaving, and perhaps that would make me one of the noisome defenders of academia that O'Connor criticizes. I am not leaving because I'm happy. I enjoy my teaching, am satisfied with my scholarship, and generally am quite pleased with my institution and my local colleagues here. I like many of the people I know in my discipline and my fields of specialization. I learn many new things every week, read widely, live the life of the mind, make good use of the freedom of tenure. Swarthmore
does a pretty damn good job in many ways. I think I do a good job, too.
I am proud of it and proud to be a part of it all. It is easier to be happy when the basics of my situation are so comfortable. I am paid well, I have tremendous autonomy in my teaching and scholarship, I have many compensations and benefits. I have bright and interesting students about whose future I care deeply. I have many colleagues whose company and ideas enlighten me. I have lifetime job security. My institution is in good financial shape, prudentially managed, led wisely. What's not to like?

What is not to like, note O'Connor, the Invisible Adjunct and many others, is that my situation is unusual in the totality of academia. I think some
of that is the difference between a liberal arts undergraduate college
and a large research university: it is the latter kind of institution
that I think is the locus of most of the problems afflicting academia
at present. There is also one aspect of this that I do not take to be
particular to academia, but instead is true of all institutions, that
some jobs are better than other jobs, some institutions are run better
than other institutions. It is better to work for a top law firm than
to work for a miserable firm of ambulance-chasers. It is better to work
for Google than it is to work for Enron.

What is a
bit different, however, is that academics mostly cannot pursue market-rational
strategies that respond to those differences intelligently and predictably,
and the distribution of talent in faculties cannot meaningfully be said
to meritocratically map against the good jobs and bad jobs. I do not imagine
that I am here because I am so much better than many of the people in
jobs where they teach 5/5 loads, have alienated students, get no sabbaticals, have poor benefits and low wages, and face indifferent or even hostile administrations.
I think I am good at what I do, but so are many of the people who seek
jobs in academia, and who ends up where is a much more capricious thing
in the end than in many other fields of work. And once you're established
enough wherever you land, if you're tenured, you're there as long as you
want to remain--or trapped if you want to move elsewhere. The conditions
of labor at the more selective institutions feed on themselves in good
ways: with regular sabbaticals, strong students, and institutional resources
you can improve both as a scholar and a teacher. With heavy loads and
no support, you're hard-pressed just to stay afloat. If the end states of a tenured faculty member at the University of Chicago and one at the State University of East Nowheresville are different, that often has a lot to do with conditions of employment along the way.

It is hard
to know what the solution to all this disparity is. I am not into sackcloth-and-ashes
myself, so I'm not going to punish myself for my good fortune by
leaving or donating half my salary to adjuncts. If I were, teaching at
a good college would only be the beginning of the good fortunes for which
I must apologize, and a relatively trivial one at that in comparison to
being a white male American who grew up in suburban California in material
comfort with supportive and loving parents. I do not see any magic way
to make every academic institution wealthy overnight, nor would I want
to eliminate the weaker or more impoverished institutions--the diversity
and number of colleges and universities in the United States seems one
of our national strengths even in comparison to Western Europe.

Instead,
I think that the smaller, simpler solutions which many academic bloggers
have described are the real beginning of meaningful reform.

Graduate
institutions should dramatically cut their intake of doctoral students.
Yes, that would simply move the principle of relatively arbitrary distinctions
of merit to an earlier moment in academic careers, but that's the
whole point, to keep people from devoting seven years of their lives to
a system that frequently does not pay off that investment of labor.

Graduate
pedagogy needs to shift its emphases dramatically to meaningfully prepare
candidates for the actual jobs they ought to be doing as professors, to
getting doctoral students into the classroom earlier and more effectively,
to learning how to communicate with multiple publics, to thinking more
widely about disciplines and research. At the same time, doctoral study
also needs to reconnect with and nurture the passions many of us brought
to our academic careers at the outset--passions often nurtured in
bright undergraduates by strong liberal arts institutions like Swarthmore.
The excessive professionalization and specialization of academic work
is killing its overall effectiveness and productivity. The possible purposes
of graduate training need to be opened up, not merely as a compensatory
gesture to disappointed academic job seekers, but as a deep and meaningful
reform of the day-to-day labor of professors presently teaching graduate
classes. The passive-aggressive combination of complacency, conformism
and defensiveness that often afflicts academic culture needs to give way
to something bolder, more infused with joy and creation, more pluralistic
and varied in its orthodoxies and arguments.

Tenure as
an institution needs to be rethought. If not actively abandoned, it should
at least not be automatically, reflexively defended as inviolate, because
it presently serves very few of the purposes which are often attributed
to it. The use of adjunct teaching in its present form at many institutions
should simply be abolished outright. Non-tenure-track faculty should be hired on 1-year or 3-year contracts, with benefits, at a salary comparable to a tenure-track assistant professor's, to teach a normal load of courses, never on a per-course basis, save in those cases where short-term emergencies arise (such as serious illness or other unplanned short-term unavailability of a tenure-track faculty member).

There's more that I could suggest, but I think many of these reforms would squarely confront the problems that have driven Erin O'Connor to leave academia. How we get to them is really the crux of the matter. I think the role of insiders who love academia but want to see it realize its potential--and possibly stave off the threat of a collapse--is essential. But so too, perhaps, are people who walk away. Here's to the guts to walk away from a sure thing.

April 21, 2004
Cry Me a River

The Chronicle of Higher Education has an article this week about single academics and their problems, the extent to which many of them feel like outsiders in the culture of academia. (Online version now available to nonsubscribers.)

Feeling like
a social outsider is one thing, and always worth discussing empathetically,
as a human concern for one's fellow humans. Particularly in small, rural
colleges, faculty social life is the main source of community, and if
that community coheres around marriages, life can be very difficult for
a single person, whether or not that single person is seeking a partner
themselves. A goodly portion of the Chronicle's article is taken up with these kinds of issues, and I sympathize and welcome any thoughts about ways that individuals and communities can help address these feelings, show a solicitous concern for the problems of others, and strengthen human ties with an appreciation of differing situations.

The notion,
given much airing in the article, that feeling like a social outsider
is something for which one ought to be formally and structurally compensated,
that all such feelings represent forms of injustice or inequity, is silly.
Some of the single faculty quoted in the Chronicle article cry
out for parity in benefits, arguing that if faculty with children receive
tuition discounts for their children or health care for families, single
childless faculty should receive some equal benefit. If I never have a cavity, I'll never make full use of my dental benefits: should I receive a comparable benefit to someone who gets a new filling every five months? No, because I have the same benefit if I develop the same condition. Same for the single faculty: the marriage and child benefits are there for them too if at some point in their life cycle they apply to them. As one administrator says in the article, "Fair doesn't necessarily mean equal." I paid taxes toward educating other people's kids long before I had a kid, and I welcomed doing so--because in paying those taxes, I was underwriting the labor of social reproduction, which, as a member of society, I benefit from when it is done well and suffer from when it is done poorly.

In some ways,
the article documents just how perniciously the trope of "minority status" and its associated moral landscape has spread to every single discussion of how communities are constituted. To talk of single people as an underrepresented minority in academia, as Alice Bach of Case Western Reserve University does in the article, makes no sense. Underrepresented in the sense that academia sociologically is not a perfect mirror of American society as a whole? Well, yes, of course. But Bach seems, like some of her aggrieved single compatriots, to be saying that this lack of mimetic
resemblance places a moral burden on the faculty of each particular academic
institution to fix the problem, that the mere fact of a difference constitutes
a moral failure. By that standard, every academic institution needs to
designate a proper proportion of faculty to be paid below the poverty
line, to be left-handed, to suffer the proper proportion of death and
injury at the proper ages, to be polyamorous, to be Goths, to be Mennonites,
to be hired with only a high school diploma and so on. If someone can
demonstrate that at the time of training or hiring, single faculty are
specifically identified and discriminated against and therefore that their
underrepresentation is the consequence of discriminatory behavior, then
that person has a legitimate point.

Otherwise,
in the absence of that evidence (and I think such evidence will never
be forthcoming), the aggrieved singles in the article are talking about
the culture of academia, which simply is, in the same way that
academia is intensely bourgeois. To argue that academia ought not
to be bourgeois or dominated by married folk is something that one can
legitimately do--but not from a social justice standpoint, only from an argument about aesthetics and cultural preference, or from the standpoint that bourgeois society per se or marriage per se are corrupted social institutions that we collectively need to destroy or reject. That's fine; go ahead and make that argument if you like. Laura Kipnis has. Don't cloak it in complaints about underrepresentation or stigma or minority status. Those ideological or cultural claims are not arguments about discrimination and egalitarianism--they're a different kind of argument.

It gets especially
silly when one of the complaints of single academics described in the article is that they're not married--that the solitary nature of academic work is too stifling when you're not with a partner or children, or that household tasks are more time-consuming because there's no one to divide the labor with. At that point my head is spinning: so single faculty are discriminated against, but one of the remedies for discrimination would be to get a partner and kids? That it is an injustice that they're not married and with kids? The comparable benefit to health insurance for families or maternity leave would be what, a college subsidy of a cleaning service or landscaping business for single faculty to simulate having a partner who can do household chores? How about we give single women a subsidy for a male-run cleaning service that only does 25% of the chores after promising to do 50%, and also subsidize a service that will come into the houses of single faculty and throw toys all over the floor and triple the laundry load on a regular basis?

The person who really drove me nuts in the article was Benita Blessing, a historian at Ohio University. Colleagues who have children or spouses, she says, are free to leave boring faculty meetings, while she can't just say that she wants to go home and watch reruns of Buffy the Vampire Slayer. I really, really do try to see things the way other people see them, but this particular statement stopped me in my tracks. There are a million genuine and feigned ways that she could slip out of meetings if she likes: I feel no guilt for her lack of creativity. Then she complains that her department doesn't have parties for people getting tenure or promotions, only bridal and baby showers. Could it just be that this is her department? The whole article is so shot through with freakish anecdotal reasoning from alleged academics who one would think should know better. Somebody throw Benita Blessing a party already, though I'm guessing that she's going to complain even if they do. Envy combined with a discourse of entitlement rarely respects restraints.

April 19, 2004
How Not to Tell a Story

Hellboy
is a really enjoyable film. Matrix: Revolutions is not. I saw both
about a week ago. The contrast was a reminder that you can talk plainly
about the technical skill of telling a story.

At a basic level, the problem with the storytelling in Matrix: Revolutions is that it rejects both of the forks in the road that the mediocre Reloaded laid down, both possible conclusions.

The first
is to play an entertainingly intricate and escalating series of tricky games with the nature of reality, to return to the basic dilemma of The Matrix and ask, "What is real?" The storyteller, in that scenario, has to have a final answer in mind, but to allow his characters to be confused about the answer, to have the action of the plot be a labyrinth, an ascending series of false answers and red herrings, a fan-dance tease. You can even end that story with a little wink of doubt after the supposedly final answer is revealed, but you do need to have a real and satisfying climax. This is the more intellectualized story--it demands a high level of clever playfulness to work. It also would require taking the story back into the Matrix itself for most of the film--as Gary Farber notes, the odd thing about Revolutions is that almost none of it takes place in the Matrix.

One possible
strategy for this kind of tricky, layered plot: suppose we find out in
Revolutions that the whole humans-in-vats-as-energy-sources thing is just as absurd as it sounds, that it's just a higher-order simulation designed to deceive the remaining humans, that what Morpheus and pals are doing is actually just what the machines want them to do? What if the machines are really trying to liberate humanity from the Matrix, and it turns out to be humans who put themselves in it? What if the Architect is the good guy and the Oracle the bad guy? And so on. In the right hands, this kind of escalation of doubt and confusion can work beautifully--but
it takes a storyteller who has thought it all out in advance, who has
an exquisite sense of how to use reversal and surprise as a way to structure
storytelling. It also takes a storyteller who is both playful and willing
to make some rules for his game and stick to them.

The only other way to go is to play completely fair with anyone who has followed
the story to that point and reveal everything. Make the movie Matrix:
Revelations, not Revolutions. Solve all outstanding questions,
lay out the secrets, explain it all. Make those secrets basic, simple,
and dealt with quickly through exposition.

That also is not what Revolutions did--instead it dropped some more murky, oblique characters into the mix, went on some time-wasting excursions to see old characters whose pointless, plot-arbitrary nature was confirmed (the appallingly annoying Merovingian and his squeeze), offered some incoherently faux-profound dialogue about the plot's events, blew a shitload of things up hoping nobody would notice how hollow the rest of the film was, and then threw in two incomprehensible conclusions (Neo's defeat of Smith and the final scene with the Oracle, the Architect and Sati). Along the way there were isolated cases of really excruciating badness--Trinity's death scene was so protracted and excessive that I found myself screaming at the television, "Die already! Die DIE DIE!" I'm sure there are Matrix fanboys out there who can explain all this, but a dedicated fanboy can claim to see a pattern in
a random piling of trash in a garbage dump, too.

I got it right in my comments on Reloaded: the Wachowskis want too badly to come off like philosophers, but they think philosophy is about incomprehensible slogans, meaningfully enigmatic glances and Ray-Bans. In Revolutions, there's no hiding the naked emperor: they clearly don't have the faintest idea what their story is actually all about, and so they perform the cinematic equivalent of alternating between mumbling and shouting. I can see how they could have played fair and explained it all. For example, make it clear that the machines created self-aware software upon which they are now dependent, and make the Matrix a literal "Third Way" in the human-machine conflict--make it hardware vs. software vs. meatware, and make the software dictate a peace to both humans and machines. Maybe the fanboys will claim that's what was going on anyway, but that takes much more generosity than I'm prepared to show.

So. Hellboy.
Hellboy gets it right because it tells a story honestly. As one of my students noted, when you see an opening quote about the Seven Elder Gods of Chaos, you know that you're deep in the heart of Pulpville. The storytellers know that too, and they satisfyingly plunk their butts down right where they belong and stay there consistently throughout the film. The story moves along smoothly (well, there's a slow bit in the beginning, maybe), the plot is transparent to its viewers and to its own genre conceits, and everything is played more or less fair. If the movie were a little more dour or took itself seriously enough, one might ask questions like, "Why does an agency of paranormal law enforcers seem to know so little about the paranormal?" (then again, just look at the 9/11 Commission to find out how law enforcement agents can fail to know a lot about what they're supposed to know) or "Isn't it wise, when you're dealing with a quasi-immortal villain, not to assume he's dead?" You don't ask these questions seriously because the story is robustly built and has an assuredness to it at all times. It knows what it is.

These are great examples for a straightforward discussion of the technical craft of storytelling. What's important about that discussion is that it can very rapidly scale up into much more critically complex conversations about genre, audience reception and audience formation, the history of representation, the indeterminate meanings of cinema as a form and much more besides--but it also shows that we need not (and in fact often do not) lose sight of a technically-focused ground floor of cultural criticism in moving towards more difficult questions.

April 16, 2004
The Raines Must Fall

Having finally made my way through Howell Raines' full postmortem of his tenure at the New York Times (it's only a bit shorter than Neal Stephenson's The Confusion), I can't say that I feel any great sympathy for him. Even when he's making points I agree with about the Times, he comes off fairly badly, writing a weird jambalaya of self-pity, arrogance and gracelessness.

He means
to convince anyone reading that he was done in by a cabal of no-talent hacks protecting their jobs--and I walked away convinced that there were in fact entrenched no-talent hacks at the Times when Raines was brought in (this is hardly news)--but he mostly ends up providing convincing confirmation that the sniping criticism of Raines during his tenure was valid. This is not a guy you'd want being the executive of anything, though he might make a good second-banana "bad cop" for a person with real leadership skills.

Raines takes
credit, and deserves credit, for shaking up the Times' utterly arteriosclerotic coverage of culture. In the mid-1990s, it was stunningly irrelevant and stultifyingly boring, both in the daily and Sunday paper. (Even the snotty high-culture coverage was often so late, as Raines observes, that the Post, of all papers, was doing a more timely job on the same beat.) Raines helped the paper figure out that you don't send a man to do a boy's job, and got reviewers like Elvis Mitchell to write about culture they both understood and enjoyed. However, the Sunday Arts & Leisure section is still a snooze: the revolution is only partially complete.

In fact,
the Sunday edition is in general still pretty boring. Raines seems to think he accomplished a lot in this respect, but I don't see it. The Book Review is mostly predictable, the Week in Review flounders uselessly most of the time, and this was the same under Raines as it was before and since. The Sunday magazine has a better track record for interestingly controversial articles, but I thought that was one of the stronger parts of the Sunday edition before Raines arrived.

One small change that I have loved about the Times, which I think came in during Raines' tenure, though he doesn't mention it (I think) in the Atlantic piece, is the back end of the Saturday Metro section, with its really intriguing pieces on current debates and ideas in academic and intellectual circles.

Raines doesn't
talk that much about columnists, and that's not surprising: the Times went from bad to worse during his tenure, keeping some of the same boring old pissants and adding some new boring younger pissants. Even the people I agree with are boring me. And it becomes clear as one reads along that many of his strongest internal critics were on the news staff, where the Times was in pretty good shape, especially in international coverage, before Raines started his tenure, and, from the perspective of many external critics, actually got worse during his time. When I compare the international coverage under Lelyveld to the Times in the 1980s, it's like night and day. The ideological hacks mostly disappeared, and lightweights like Christopher Wren were mostly swept out, replaced by much more interesting and energetic writers. The Africa coverage went from being something that persistently annoyed me to being something I learned from and found usefully distinctive from anything else in the mass media.

The domestic coverage has been uneven for a decade, but to be honest, all I ask of the Times in that regard is that it be solid, detailed, and fairly comprehensive, because that's its purpose in my household: to serve as the paper of record. When I want something more, I go read The Economist, the features on the front of the Wall Street Journal, the Washington Post for inside-the-Beltway stories, and the Internet.

That's the first and major thing Raines doesn't seem to get. He represents himself as coming in all gung-ho to expand the paper's subscriber base, widen its appeal, reach new audiences, freshen up the Grey Lady. The core readership probably doesn't want or need the paper to do that in most of its news coverage. I'm very glad to have the Times be more interesting in the places where it was just mind-bogglingly dull and snobbish, but when it comes to news, I demand that it first be respectable and meticulous. This is something that Raines didn't and still doesn't seem to think is important: his vision, by his own account, was all about making every single part of the paper equally provocative and edgy. That's not your market niche, man. Raines ventriloquizes for both actual readers and potential readers and says, "This is what the public wanted, but my do-nothing, hack-job, fusty staff wouldn't let me." This reader says: no, that's not what I want when it comes to news and the Times. The Times is not required to be boring, but neither does it require a front-to-back overhaul. I don't require the Times to get there first, and I don't require it to get there sexily. I just require the paper to get it right and get it all. If Raines had spent more time worrying about shoring up standards of craftsmanship and meticulousness in reporting and less time worrying about sexing up the writing, he might not have had the Jayson Blair problem.

Raines reports
breathlessly on the internal culture of the Times as if he learned for the first time as executive editor how office politics works, and as if the Times is an unprecedented hive of professional cripples. Shouldn't an executive editor be a seasoned old hand when it comes to issues like unproductive senior writers, recalcitrant underlings, or peer networks that rally to support each other? On banal questions of group dynamics, Raines acts like a sixty-year-old spinster shocked by a first encounter with the birds and the bees.

Aside from that, it's really hard to sympathize with Raines as he comes off in this piece simply because he sounds like an unlikeable prick. That's a pretty bad sign when you write an exculpatory account of your own behavior and you still manage to come off like an asshole. It's like watching a television commercial for food where they can't even manage to make what they're selling look appetizing. That generally means that the food in question is authentically nasty. Same thing here. He manages to get in some truly graceless shots at his former colleagues, some of them by name, others only indirectly. It's one thing for a crusader unmistakably brought down by reactionary forces to shout defiance at his enemies, and another thing to pass the buck as aggressively as Raines does in the wake of a mistake that he at least had titular responsibility for. It makes me wonder if any leader in American public life will ever have the grace to just assume responsibility for failure on his watch and manfully go down with the ship. At the very least, Raines could say a lot more about his shortcomings: there are very few unqualified or straightforward confessions in the article, and the half-hearted apologies take up a very small amount of the total space in the essay.

April 12, 2004
Readings and Rereadings #3

Lauren Slater, Opening Skinner's Box

April 8, 2004
Footnotes on Death in Iraq

I know this is a common fact about war, that many combatant deaths come only indirectly from military conflict, but has anyone noticed how many deaths of coalition forces in Iraq are non-combat deaths? I got interested when looking at the roster of the dead on CNN's web site and started a quick and imprecise count. It looked to me like almost a third of the deaths were attributed to either "non-combat gunshot wounds or injuries" or various kinds of vehicle accidents, with a few cases of death from illness or unspecified medical conditions. A few of the vehicle accidents probably had to do with the pressures of near-combat situations, but a lot of them were due to things like embankments crumbling or boats capsizing or just plain old traffic collisions. I thought for a minute that maybe some of the non-combat gunshot or injury deaths (about a third of the total non-combat deaths) were suicides, but the recorded causes of death as given at CNN actually note suicides explicitly as such. I assume most of these deaths are actually training accidents or due to equipment malfunction.

April 8, 2004
3-Year Oldisms Emma: Today
is Monday! Us: No, it's
Tuesday. Emma: No,
today is Monday. Us: You have
your swim lesson on Tuesdays, and you just got back from your swim lesson.
Ergo, it is Tuesday. Emma: There was a fairy who changed my swimming lesson to Monday this week. And she also gives people candy. --------------------------------------------- Me: Emma, should I shave off my beard? [A frequently asked question.] Emma: No! Me: Why not? Emma: Because you would look more like a boyfriend than a daddy. April 7, 2004 Emergence
and the Metahistory of Efficiency I'm
going to gingerly venture in this space for the first time into waters
that I've been heavily exploring for two years with other faculty
and through very active reading, namely, complex-systems theory, complexity
theory, nonlinear dynamics, emergent systems, self-organizing systems
and network theory. I am a very
serious novice still in these matters, and very much the bumbler in the
deeper scientific territory that these topics draw from. (Twice now I've hesitantly tried in public to talk about my non-mathematical understanding of the travelling salesman problem and why an emergent-systems strategy for finding good approximate answers to such otherwise intractable problems is useful, and I suspect that I could begin a career in stand-up comedy in Departments of Mathematics all around the nation with this routine.)
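Since the travelling salesman routine keeps coming up, here is a minimal sketch of the kind of emergent-systems strategy I have in mind: a toy ant-colony heuristic in Python, where many simple agents deposit and follow shared pheromone trails and good tours emerge without any central planner. The cities, parameters and code are all invented for illustration; this is one such strategy among several, not a serious solver.

import math
import random

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(20)]  # made-up map
n = len(cities)
pheromone = [[1.0] * n for _ in range(n)]

def dist(a, b):
    return math.hypot(cities[a][0] - cities[b][0], cities[a][1] - cities[b][1])

def build_tour():
    # One "ant": start somewhere, repeatedly pick a next city at random,
    # biased toward short hops and strong pheromone. No ant sees the whole map.
    tour = [random.randrange(n)]
    unvisited = set(range(n)) - {tour[0]}
    while unvisited:
        here = tour[-1]
        weights = [(pheromone[here][c] / (dist(here, c) + 1e-9), c) for c in unvisited]
        r = random.uniform(0, sum(w for w, _ in weights))
        for w, c in weights:
            r -= w
            if r <= 0:
                break
        tour.append(c)
        unvisited.discard(c)
    return tour

def length(tour):
    return sum(dist(tour[i], tour[(i + 1) % n]) for i in range(n))

best = None
for generation in range(100):
    tours = [build_tour() for _ in range(20)]
    # Evaporate old trails, then let each ant reinforce its own tour in
    # proportion to its quality: the shared trail map is the only "memory."
    pheromone = [[0.9 * p for p in row] for row in pheromone]
    for tour in tours:
        L = length(tour)
        if best is None or L < best:
            best = L
        for i in range(n):
            a, b = tour[i], tour[(i + 1) % n]
            pheromone[a][b] += 1.0 / L
            pheromone[b][a] += 1.0 / L
print("best tour length found:", round(best, 3))

Nothing in that loop optimizes anything globally; decent tours condense out of cheap, repeated, local choices, which is the whole point of the routine.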
I do have some ideas for useful applications of these ideas to the craft of history--Alex Pang wasn't the only one burning the midnight oil with an NSF application recently. More generally,
I think there is one major insight I've gotten about many of the systems that get cited as either simulated or real-world examples of emergence. The working groups I'm in have been thinking a lot about the question of why so many emergent systems seem to be "surprising" in their results, why the structures or complexities they produce seem difficult to anticipate from their initial conditions. Some complex-systems gurus like Stephen Wolfram have very strong ontological claims to make about the intrinsic unpredictability of such systems, but these are questions that I am not competent to evaluate (nor am I much interested in them). I tend to think that the sense of surprise is more perceptual, one part determined by the visual systems of human beings and one part determined by an intellectual metahistory that runs so deep into the infrastructure of our daily lives
that we find it difficult to confront. The visual issue is easier to recognize, and relatively well considered in A-Life research. It's why I think some simulations of emergence like the famous flocking models are so readily useful for artists and animators, or why we're weirdly fascinated by something like Conway's Game of Life when we see it for the first time. Emergent systems surprise us because they have a palpable organicism about them--they move in patterns that seem life-like to us, but in contexts where we do not expect life. There's a deep human algorithm here for recognizing life that involves a combination of random movement and structural coherence, which is just what emergence does best, connecting simple initial conditions, randomness and the creation of structure. Purely random movements don't look lifelike to us; top-down constructions of structure appear to us to have human controllers, to be puppeted. So we are surprised by emergence because we are surprised by the moment-to-moment actions of living organisms. When I look at ants, I know in general what they will do next, but I don't know what exactly any given ant will do in any given moment. This, by the way, is why most online virtual worlds still fail to achieve immersive organicism: play enough, explore enough, and you know not only what the general behavior of creatures in the environment is, but precisely what they will do from moment to moment.
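For readers who have never actually run Conway's Game of Life, a sketch of it fits in a few lines of Python; the rules are almost embarrassingly simple, and yet the glider pattern below "walks" diagonally across the grid in exactly the life-like way I mean. The grid size and starting pattern here are arbitrary choices of mine.

SIZE = 12
live = {(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)}  # a "glider"

def neighbors(cell):
    r, c = cell
    return {((r + dr) % SIZE, (c + dc) % SIZE)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)}

def step(live):
    # Birth: a dead cell with exactly 3 live neighbors comes alive.
    # Survival: a live cell with 2 or 3 live neighbors persists. That's all.
    candidates = live | {m for cell in live for m in neighbors(cell)}
    return {cell for cell in candidates
            if len(neighbors(cell) & live) == 3
            or (cell in live and len(neighbors(cell) & live) == 2)}

for generation in range(8):
    print("\n".join("".join("#" if (r, c) in live else "." for c in range(SIZE))
                    for r in range(SIZE)) + "\n")
    live = step(live)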
What I think is deeper and harder to chase out is that we do not expect the real-world complex systems and behaviors we actually know about to be possible through emergence, in the absence of an architect, blueprint or controller. Some of this expectation has rightfully been attributed by Steven Johnson and others to a particular set of presumptions about hierarchy, the so-called "queen ant" hypothesis. But I also think it is because there is an expectation deeply rooted in most modernist traditions that highly productive or useful systems achieve their productivity through some kind of optimality, some tight fit between purpose and result, in short, through efficiency.
My colleague Mark Kuperberg has perceptively observed that Adam Smith has to be seen as an early prophet of emergence--what could be a better example than his bottom-up view of the distributed actions of individuals leading to a structural imperative, the "invisible hand"--but as digested through the discipline of economics, Smith's view was increasingly and to my mind unnecessarily parsed in terms of models requiring those agents to be tightly optimizing.
That's what's so interesting about both simulated and real-world examples of emergence: they create their useful results, their general systemic productivity, through excess, not efficiency. They're not optimal, not at all, at least not in their actual workings. The optimality or efficiency, if such there is, comes in the relatively small amount of labor needed to set such systems in motion. Designing a system with a seamless fit between purpose, action and result is profoundly difficult and vastly more time-consuming than setting an overabundance of cheap, expendable agents loose on a problem. The agents may reach a desired end-state more slowly, less precisely, and more expensively in terms of overall energy expenditure than a tight system that does only what it needs to do, but that excess doesn't matter. They're more robust to changing conditions even if less adapted to the specificities of any given condition.
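A crude way to see the trade-off in a few lines of Python: hunt for a target cell on a grid using dumb random walkers. Each walker wastes nearly all of its steps, yet a throwaway crowd of them is far more reliable than any single one, and nobody had to design a route. All the numbers below are arbitrary; this is a sketch of the logic, not a model of anything in particular.

import random

random.seed(2)
GRID, TARGET, STEPS = 20, (15, 15), 400

def walker_hits_target():
    # One cheap, expendable agent doing a blind random walk from the corner.
    x, y = 0, 0
    for _ in range(STEPS):
        dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        x = max(0, min(GRID - 1, x + dx))
        y = max(0, min(GRID - 1, y + dy))
        if (x, y) == TARGET:
            return True
    return False

def swarm_succeeds(n_walkers):
    # The swarm "solves" the problem if any one of its members stumbles home.
    return any(walker_hits_target() for _ in range(n_walkers))

for n in (1, 10, 100):
    wins = sum(swarm_succeeds(n) for _ in range(200))
    print(n, "walker(s): succeeded in", wins, "of 200 trials")

The total energy spent rises with every extra walker, and almost all of it is wasted; what the excess buys is robustness.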
We go looking for efficiencies and thriftiness in productive systems partly because of a deep underlying moral presumption that thrift and conservation are good things in a world that we imagine to be characterized by scarcity--a presumption that Joyce Appleby has noted lies very deeply embedded in Enlightenment thought, even in the work of Adam Smith. And we do so because of a presumption that productivity and design, fecundity and cunning invention, are necessarily linked--a presumption that I am guessing is one part modernist trope and one part deep cognitive structure. We are disinclined to believe it possible that waste and excess can be the progenitors of utility and possibility. Georges Bataille's answer to Marx may be, as Michael Taussig has suggested, far more important than we guess. Marx (and many non-Marxists) assume that surplus must be explained, that it is non-natural, that it is only possible with hierarchy, with intentionality, with design. It may be instead that excess is the key, vastly easier to achieve, and often the natural or unintended consequence of feedback in both human and natural systems. The metahistory
that I think I see lurking in the foundations here is a tricky one, and
a lot of effort will be required to bring it to light. We will have to
unlearn assumptions about scarcity. At the scale of living things, making
more copies of living things may be thermodynamically incredibly cheap.
At the scale of post-Fordist mass production, making more material wealth
may be much cheaper than we tend to assume. We will have to root out our
presumptions about efficiency and optimality and recognize that many real-world
systems whose results we depend upon, from the immune system to the brain
to capitalist economics, depend upon inefficient effectiveness (productive
non-optimality, wasteful utility). I also think exploring this metahistory of our unconscious assumptions might help us confine emergence and complex-systems theory to a subset of relevant examples. Some of the people working in this field are too inclined to start sticking the label "emergent" on anything and everything. You could actually come up with a prediction about the limited class of systems that can potentially be emergent or self-organizing (and I'm sure that some of the sophisticated thinkers in this field have done just that): they would have to be systems where many, many agents or discrete components can be made exceptionally cheaply, and where simple rules or procedures for those component elements not only produce a desired end-state but also intrinsically terminate or contain their actions within the terms of that result--and probably some other criteria that might be identified by unthinking our prevailing assumptions about efficiency and design, say constraints on the space, environment or topology within which inefficiently effective systems might be possible.
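That termination criterion is easy to make concrete. Here is a deliberately humble sketch of my own (not anyone's canonical example): agents that each see only one adjacent pair of numbers and swap them if they are out of order. No agent knows whether the whole row is sorted, but the rule can only fire where the result has not yet been achieved, so activity dies out exactly when, and because, the end-state arrives.

import random

random.seed(3)
row = [random.randrange(100) for _ in range(15)]
print("start: ", row)

rounds, active = 0, True
while active:
    active = False
    # Visit the adjacent pairs in a random order; the local rule fires only
    # where the desired end-state (sortedness) does not yet hold.
    for i in random.sample(range(len(row) - 1), len(row) - 1):
        if row[i] > row[i + 1]:
            row[i], row[i + 1] = row[i + 1], row[i]
            active = True
    rounds += 1
print("sorted in", rounds, "rounds:", row)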
April 6th, 2004 Don't Play It Again, Sam I never dreamed when I started my current book project in 1997 that writing about the
British system of indirect rule in colonial Africa as a central issue
would turn out to be relevant not just to understanding how African societies
like Zimbabwe got to where they are today, but also to understanding how
current events in another part of the globe are unfolding minute by minute.
But here
we are. The United States is trying to get an approximately colonial system
of indirect rule up and running in Iraq after June 30th, one more limited
in its conception and at least notionally shorter in its projected lifespan
than the early 20th Century British equivalent, but one nevertheless.
It certainly makes me feel like I had better finish my project as soon
as I can before I feel compelled to once again rethink what I'm writing in light of the latest developments. I'm
very hesitant about casual comparisons and analogies, like most historians,
but this general resemblance seems to me to be unmistakable. This resemblance
also clarifies for me why I do not view Iraq and Vietnam as strongly analogous.
Cheap rhetoric aside, Vietnam was not an imperial war, and US power in
South Vietnam was not a form of colonial rule. The configuration of political
authority, the nature of the military conflict, the rhetorical framing
of the struggle, the developmental timeframe of the war: they were all
quite different. The Cold War was its own distinctive moment in the history
of the 20th Century. So too is today, but it is closer to the colonial
past than any other moment since the 1960s. The fighting
in the past week has been unnerving in its intensity, and seems today
as if it will get worse with news of a major ambush of US Marines in Ramadi.
The question is, does the analogy to British indirect rule help us understand
what is happening now and what may happen in the future? I think yes,
and the news is not very good. Many defenders
of the current war say that the critics have too short a time frame for
assessing success, and they may have a point. British rule in Africa (and
elsewhere) was pockmarked with short-lived uprisings and revolts which
seemed briefly significant at the time, but which never really threatened
British colonial authority fundamentally until the 1940s and the simultaneous
challenge of major labor strikes, mass nationalist protest and the Mau
Mau rebellion in Kenya, at a time when British economic and military strength
was at relatively low ebb and the nature of international politics and
morality had fundamentally shifted against empires. So can the
US simply endure these small rebellions similarly, by taking the long
view? Well, probably not, and here are some reasons why. First
reason: the British could simply ignore African resistance at times:
lack of mass media and a global public sphere meant that many uprisings
or acts of unrest were known only to local authorities until considerably
later, and left more or less alone to burn out without fear of public
reaction. Second
reason: colonial rebels lacked access to communicative methods for
mobilizing and coordinating unrest over larger areas, which is not true
in Iraq today. Third
reason: overt racism among British authorities and British society meant that they regarded Africans in so dehumanizing a manner that they usually did not have to worry, officially or in public debate, about what Africans thought, felt or desired, a rhetorical option no longer as open to the United States government today, though there's certainly some hint of it now and again. (Official white-supremacist practice contradicted by implicit liberalism under British rule has given way to official, explicit liberalism contradicted by implicit racial or cultural categorization of colonial subjects.) Fourth
reason: the British were restrained by some humanitarian considerations
in exerting their power, and when those constraints were egregiously overstepped,
as at Amritsar in India, there were consequences--but still, and particularly in the early history of colonial rule, it was possible for British forces to retaliate very openly with enormous force and with relatively little regard for due process or rights against even suspected rebels or dissidents. The US probably can't do the same thing, both because the world has changed since 1890 and because massive retaliation against suspected sources of rebelliousness carries the risk of further inflaming resistance among previously neutral civilians. Fifth
reason: Iraqi society is much more plugged into regional and global
networks that can reinforce and amplify resistance to US occupation in
comparison to most African societies in the early 20th Century. Sixth
reason: British indirect rule, for all its rhetoric of the "civilizing mission," was ultimately much more modest in its ambitions in most
cases than American rule in Iraq is today. The bar for declaring success
was much lower then. Seventh
reason: British indirect rule existed in an international system dominated by European states that normalized imperial authority in general and racial hierarchy in particular. American indirect rule in Iraq exists in a world
that is largely mobilized against imperial ambitions, often insincerely
or instrumentally, but mobilized nevertheless. Eighth
reason: The direct relation of American popular opinion and elections
to the continuance of an imperial policy is structurally very different
from what pertained in Britain from 1880 to the 1930s. What was sustainable
then politically is not sustainable now without an even more seismic shift
in American culture and society. Ninth reason, and perhaps the most concretely important: British military power in relation to non-Western societies in 1890 was the most technologically asymmetrical the world has ever seen: there was an unbridgeable distance in terms of firepower, logistical capability, and much else besides. Britain rarely exerted this power directly after the initial era of conquest in Africa and elsewhere, but when it did, there was simply no question of armed resistance succeeding. This is no longer the case today. American military power is still massively asymmetrical to the military power of armed insurgents in Iraq, but in ways that are of no use in exerting indirect-rule authority over Iraq--you cannot assert indirect rule through bombing campaigns, artillery assaults, or nuclear deterrents. You can only do it with ground troops--and here automatic weapons and homemade explosives in the hands of insurgents, coupled with the ability to vanish into the general population, are enough to bring Iraqi combatants up to the point that they can exert meaningful force against American authorities. You can leave aside all the other comparisons, but I think this alone is a devastating difference between the world of 2004 and 1890. Now "they" do have the Maxim Gun, more or less. April 6, 2004 When I was
a graduate student, I once queried the local free weekly about writing
a column of course reviews drawn from universities and colleges in the
region--I thought I'd contact the instructor, get hold of a syllabus, slip into the back of the room for two or three lectures in a large course or listen in on two or three discussions, and then write a review. I'm glad they declined the offer (ignored it, actually), given that I couldn't possibly have written such a column and remained a viable graduate student. Moreover--and I didn't know this at the time--very few professors would allow such a thing, and in some institutions, they'd probably be prohibited from giving a stranger permission to sit in on two or three course sessions. Now there's
a way, sort of, to accomplish something rather like this, and that's
to take advantage of online syllabi and find really great examples of
courses out there worthy of praise. A great syllabus isn't necessarily
a great course, but it is likely to be, and a great syllabus is in its
own right something useful, interesting and compelling. Syllabus design
is one of the subtle but central arts of academic life. A good online
syllabus is the antithesis of an online education: it doesn't pretend to be teaching or instruction, just to be a form of publication, a sort of annotated bibliography. Syllabi are good to think. I'll start locally. My Bryn Mawr colleagues Anne Dalke and Paul Grobstein are teaching a course this semester called The Story of Evolution and the Evolution of Stories. This course is a great model for cross-disciplinary--not interdisciplinary--teaching in a liberal arts institution. Sadly, we don't have much that's comparable at Swarthmore. Our cross-disciplinary co-teaching tends much more to the dour and respectable, to rendering service to sequential curricula or established interdisciplinary programs of study. This course, in contrast, is the kind of course that many different kinds of students from different disciplines could come to and gain a new perspective on their work without necessarily having that sense of having climbed a necessary rung of a preset ladder. I've long thought that most institutions of our type should have a few courses every year that are about a discovered conversation between two colleagues--not to be taught again and again, perhaps to be taught only once, and as exploratory for the faculty as for the students. As I look over the idea of the course, I can see a lot of places where I would have a different point of entry into the conversation. I thought immediately of a paper by Peter Bogh Andersen, "Genres as Self-Organizing Systems", that I encountered via William Tozier, or Gary Taylor's book Cultural Selection. This strikes me as a sign of excellence in a course of this kind: that many different people could look at it, immediately "get" the basic concept behind it, and think of some other texts or materials to bring to the table. April 5, 2004 Miscellanea 1) I see Ralph Nader has been popping up in the media expressing bewilderment at the vehemence that his candidacy raises among its critics. Reacting in particular to the similarity of the many letters he received from friends and admirers begging him not to run, he says, "It's a virus," claiming there could be no other explanation for the resemblance between the appeals. No, Ralph. When everyone disagrees with you in the same terms, it's not a virus. It just means everyone sees the same thing. Generally, the only people who conclude in the face of that kind of mass concord that they themselves must be right and everyone else must be wrong are narcissists, paranoids or the Son of God. Take your pick. 2) Will Baude of Crescat Sententia draws attention to the all-important campaign to get George Lucas, another isolated destructive narcissist of the first order (he'd make a great running mate for Nader), to include the original version of the Star Wars trilogy on this fall's DVD set. I don't much care about most of the original scenes, and in fact, most of the CGI additions were kind of nice, if sometimes rather busy. But I do care--enormously--about Han Solo shooting first in the cantina. I don't know if any other director has ever so revoltingly and pointlessly mutilated his own work (voluntarily!), but please, George, give us the option to undo your foolishness. 3) 3-year-olds are a hoot! Every day brings some fascinating new angle from the peanut gallery on something I'd never thought of that way, or some recombinant view of something that makes a kind of weird, interesting sense. This week's special, while watching the gazebo scene early in The Sound of Music: (Rolf shows up to meet Liesl in the gazebo.) Emma: Who is he? Us: He's the Nazi Boy. Emma: What is his name? Us: Rolf. Emma: Who named him? Us: His parents. We think.
Emma: I didn't know that Nazi Boys had parents. April 5, 2004 Piling On Intelligent Design Everywhere
I click in the last few weeks, folk are talking about Intelligent Design
theories and working themselves into a mighty froth over the theories
and the tactics of those who advance them. Rather than
joining the pile-on right away--though as you'll see, I'll get around to it eventually--I thought it might be worth taking a deep breath beforehand, partially because it doesn't seem to me absolutely and intrinsically impossible that one could find evidence of intelligent design in the universe. I suppose that's why I now class myself as an agnostic rather than an atheist. I see no reason at all to think that such a designer exists, but I'm an open-minded guy. So perhaps
the first reaction one should have to intelligent design theories is to
specify in advance what real, meaningful evidence could reasonably occasion
a scientifically-sound hypothesis in favor of an intelligent designer.
There are lots of personalized ways ID could be confirmed. Dying and finding
oneself in an afterlife where a Supreme Being personally affirmed that
he was in fact the designer of the universe would be one such source of
evidence. A bit hard to repeat the experiment, though. Revelatory personal
contact with God would be confirmation for a single person (though there
would always be the possibility that you were suffering from mental illness)
but that also cant be shared or repeated. What external,
repeated evidence could there be? What would ID predict? God or his agents
could appear physically in the world and cause events to happen for which
there could be no other explanation save divine power, where we widely
agreed that we had witnessed such events, or had repeatable confirmation
via video or other recording devices that such events had happened. God
could put a genetic watermark into the DNA of all living things that spelled out "Organism by God" in English. Equally unmistakable signs--and we're not talking about a Rorschach-blot picture of a weeping Jesus on a tree stump here--would be enough. We could probably list them predictively
with some reasonable precision. What would
not suffice, as many have noted, is a demonstration that our current theories
cannot explain some aspect of observable reality. That proves nothing
about an intelligent designer. And as many have also noted, even if one
conceded most of the ID arguments of this kind, they would tell you nothing
about the identity of the intelligent designer--it could just as easily
be Yog-Sothoth or a mad scientist from Dimension X as it could be God.
The thing
that is puzzling in a way is why most Christians would bother with intelligent
design. Modern Christianity, even high rationalist Catholicism, acknowledges
the special role of faith. Who is intelligent design intended for? A Christian
who needs such an exotic, Rube-Goldberg crutch to achieve faith is very
possibly a person whose belief in God is already on the verge of collapsing,
unless they're a strict blind-watchmaker deist. And yet,
if this were the point of ID, I personally would have no real problem with
it. Carl Sagan, Richard Dawkins and various atheists who have made a point
out of confronting and pursuing religious people have typically misunderstood
three things: first, the social, cultural and psychological generativity
and productivity of religious beliefs; second, the conditional rationality
of many of them (e.g., they're reasonable readings or interpretations
of certain events or phenomena but only in the absence of additional information);
third, the degree to which it is completely rational (indeed, scientific)
to be skeptical about the authority of science and scientists, especially
when that authority is marshalled behind the making of public policy.
If any individual
needs ID to bolster his personal faith, that's fine with me. If believers
want to share ID among themselves, then that too is fine, but considered
purely as a matter of intellectual and social history, that says something
interesting and perhaps even comforting about the ascension of scientific
reason as the dominant spirit of our age, that Christian faithful would
feel the need to translate their faith into the terms and norms of pseudo-science
in order to legitimate it among themselves. This is not
what the struggle over intelligent design is about, however. Its proponents
do not use it as a private foundation for their faith, or a shared liturgy.
They want it to stand equally with evolution within the architecture of
public reason. This is where the opponents of ID draw the line, and rightfully
so, because what it reveals is that ID is a highly intentional mindfuck
of the first order. It's not intended for the faithful, and it's not based on credible evidence. It's intended for those who do not
believe in God. It is a tool of subversion intended to produce conversion.
It is a Trojan Horse attempt to speak in the language of the non-Christian
Other while actively trying to obfuscate and sabotage that language. It
is dishonest. This is why Brian
Leiter and many others are perfectly right to react with such intensity
to ID, because it is often a quite conscious attempt to pollute and despoil
the utility and value of scientific thought and illicitly limit its domains
within the social and intellectual life of the nation. Christians have as much right as anyone to address their fellow citizens persuasively on behalf of their own cultural and social projects. However, participating in public reason in a democratic society obligates us all to honesty, to placing all our cards on the table. I will gladly hear a Christian try to persuade me to their faith, even if they talk in terms of arguments for intelligent design, as long as it is a two-way conversation, transparent to the public sphere. Trying to convert me by monkey-wrenching the general productivity of my everyday working epistemologies is a different matter, however. I react to that about as well as I would react to someone slashing the tires on my car. March 29, 2004 Readings and Rereadings #2 Holly Elizabeth Hanson, Landed Obligation: The Practice of Power in Buganda March 28, 2004 Category
Error Sasha
Issenberg's typically entertaining, elegant critique of David Brooks
(I get to call it typical because I graded his confident, intelligent
and meticulous writing quite often when he was a student here) is already
getting some justified
and mostly
positive attention from webloggers. One of the
best features of Issenberg's article is his coverage of Brooks' reaction to the piece. Issenberg finds that most of Brooks' characterizations of red and blue America, or anything else, are based on stereotypes rather than reportage, inference rather than data, clichés rather than research. Brooks protests that this is a pedantic objection, that Issenberg doesn't
get the joke, or is being too literal. (And tosses in to boot a condescending
jab about whether this is how Issenberg wants to start his career.) I think Brooks
would have a point were he another kind of writer, or inclined to claim
a different kind of authority for his work. If Bobos hadn't been sold and framed as a work of popular sociology, but instead merely as witty, personal social observation in the style of Will Rogers or Roy Blount Jr.--basically as a folklorist of the professional classes--then Brooks would be entirely justified in putting on his best Foghorn Leghorn voice and saying, "It's a joke, son." But as Issenberg
observes, that's not the weight class Brooks tries to fight in. He
wants to be the latest in a line of popular sociologists diagnosing the
American condition. Issenberg perhaps overstresses the untarnished respectability
of that lineage: there are a few quacks, cranks and lightweights scattered
in there. Is Brooks the latest such lightweight? I think Issenberg makes
a good case that he is. This is not
to say that there isn't some truth in what Brooks has to say, but
the odd thing is that the truthfulness of his writing has to do less with
how we now live than how we think about how we live (and more, how others
live). It's not that you can't buy a $20 meal in Franklin County, but that the professional classes of Blue America think that you can't. Red and Blue America works not just because it's backed by some sociological
data (not collected by Brooks) but because once named, we all recognized
its stereotypes and their correspondence to mostly private, mostly interior
kinds of social discourses in contemporary American life--a point Issenberg
makes astutely in his article. When professional middle-class urbanites
talk amongst themselves about gun control--which they mostly favor--they
often lard up their conversations with references to gun racks on pickup
trucks and other visions of the rural Other, and it works in the other
direction too. If Brooks can ask Issenberg whether this is how he wants to start a career (seems a pretty good start to me), then I think Issenberg and others are justified in asking Brooks whether this is how he wants to sustain one: by feeding us back our stereotypes and acting as if he has accomplished something simply because many of us nod in recognition. If Brooks would like to move beyond that, to showing us something about what we already--and often incorrectly--think we know about ourselves and our fellow Americans, he'll probably have to get serious about those trips to look for $20 dinners. Or he can hang up the sociologist's hat and settle into the role of observer and witticist--but even there, we often most treasure and remember the observers who do something more than hold up a mirror to our confirmed prejudices. March 24, 2004 Middle-Earth Online: A Prediction I was just joining a group-whine in one discussion forum about the failures of massively-multiplayer persistent-world computer games, and we were commenting in particular on how freakishly bad the initial experience of gameplay is in most of them. MMOGs, almost
ALL of them, go out of their way, almost by design, to make the initial
experience of a player as boring and horrible as possible. Which doesn't fit the ur-narrative of the "level up" heroic fantasy, if you think about it. In the ur-narrative, the protagonist usually begins his or her heroic career in the middle of a contented or at least static life (Frodo Baggins, Luke Skywalker), but the hero's journey doesn't start with ten hours of killing household pests. It starts with a bang: with tension and high stakes, with ringwraiths and stormtroopers. If heroic fantasy were written to match an MMOG, nobody would ever get to Chapter Two. So I thought about that a bit more. Since there is going to be an MMOG based on Tolkien's Middle Earth, I wondered a bit what the novel Lord of the Rings would look like if it were based on a Middle Earth-themed MMOG. Here's what I came up with: Continue reading "Middle-Earth Online: A Prediction" March 23, 2004 You Don't Know What You've Got Till It's Gone Invisible Adjunct is closing her blog and giving up her search for academic employment. There are two things I'm sure of. The first is that this is pretty solid evidence that academia collectively misses the boat sometimes when it comes to hiring the best and the brightest, or uses profoundly self-wounding criteria to filter between the elect and the discarded. I think anyone who read Invisible Adjunct's site could see unambiguous evidence that this was a person with a productive, committed and deeply insightful understanding of what academic life is and what it could be, the kind of understanding I take to be intrinsically connected to effective teaching. There was also ample evidence of a fine scholarly mind on display, combining passion for her subjects of expertise with precision of knowledge. The second is the collective value of a good weblog. Invisible Adjunct's site is what made me excited about doing this for myself, and it connected me to people who shared a moderate, proportionate, and reasonable critical perspective on academia--something that is hard to find anywhere, in the world of weblogs or anywhere else. I don't think there is anything else that even comes close to serving that function, and it is clear that it was possible not just because of the domain name or the topics, but because of the rich table of information and useful provocation that the host so regularly set, and the tone of moderated principle she struck day after day. March 23, 2004 Waiting for Menchu Not for the
first time, a reported campus hate crime has
turned out to be a hoax, this time at Claremont. A part-time instructor
reported that her car had been vandalized and hate slogans scrawled on
it, sparking campus-wide efforts to confront racism at Claremont. It now
appears likely that the instructor did it herself. "Ah-hah!" say many critics of academic life and campus identity politics. "This just proves that hate crimes on campus are exaggerated and the culture of victimization has run rampant." "Nothing to see here, move along," say those
who remain deeply concerned about questions of racism and discrimination
within higher education. This exchange
reminds me in many ways of the debate over the fabrications of Rigoberta
Menchu. For many of the combatants, that affair became a battle first
and only briefly about Menchu herself and Guatemala, and more potently
about the ulterior motives of her defenders or her critics. For the critics,
it was evidence of the conscious, premeditated and instrumental lies of
the academic left; for the defenders, it was evidence of the lurking malevolence
of a conspiratorial right and the need to maintain solidarity in the face
of this threat. There were
more than a few people who also threaded the needle in between in some
manner, most prominently David Stoll, who revealed Menchu's prevarications. What struck me most powerfully was that Menchu's real story, had it been written in her autobiography, would still have been interesting and valid and important and reasonable testimony to the struggles of Guatemalans under military rule. The question for me was, why did she, with assistance from interlocutors, refashion herself into the most abject and maximally oppressed subject that she could? The answer to that
question, the fault of that untruth, lies not so much in Menchu but in
her intended audience. Here I think
the academic left, that portion of it most invested in identity politics
(which is not the whole or necessarily even the majority of the academic
left), takes it on the chin. Menchu is what some of them most wanted,
a speaking subaltern. You build
a syllabus and go looking: is there any text, any material, that will
let you say, "This is what illiterate peasant women in this place think. This is what ordinary migrant laborers in 1940s South Africa thought. This is what serfs in medieval Central Europe thought. This is what slaves in classical Greece thought." You know those people existed and presume
they had thoughts, feelings, sentiments. You want those thoughts written
in teachable, usable, knowable form. You want
what people in my field call the "African voice." If you don't have it in the syllabus, in your talk, in your paper, in your book, somebody's going to get up in the audience and say, "Where is the authentic African voice?" and mutter dire imprecations when you say, "I don't have it. I can't find it. It doesn't exist." You may quote or mention or study an African, or many, but if they're middle-class, or Westernized, or literate, or working for the colonial state, somebody's going to tell you that's not enough. The light of old anthropological quests for the pure untouched native is going to shine through the tissue paper of more contemporary theory. You may move into more troubled waters if you say, as you ought, "I don't need it and there isn't any such thing. There's just Africans and Europeans and anybody else: everything that anyone has ever written or had written down about them is grist for my mill. A thousand voices, no Voice." Some people
wanted Rigoberta Menchu. They wanted La Maxima Autentica, the most subalterny
subaltern ever. They bought her book, taught her book, willed her into
being. She fit. I don't blame Menchu for giving an audience its desire, and I don't really blame the audience for that desire either. It's not the highly conscious, totally instrumental, connivingly ideological scheme that some on the right made it out to be. It's a needy hypothesis gone deep into the intellectual preconscious, a torment over knowledge unknowable. Somewhere there probably is a peasant woman who lived Menchu's fictional life, more or less. We don't have her book, her words, and probably if we did or could, they'd be more morally complex, more empirically ambivalent, more reflective of the lived contours of an actuality (suffering included) than the searingly unambiguous j'accuse
that some visions of the world require. This is all
similar to when someone fabricates a hate crime on a campus, or exaggerates
the modestly offensive stupidity of a drunken student into the raving
malevolence of Bull Connor. There is an overdetermination here, an always-already
knowledge, a neediness. Of course some people are going to fabricate such
crimes, following the logic of a moral panic, a deep prior narrative,
a chronicle of a deed foretold. Everyone "knows" such crimes exist--and of course (this is important) they do. But they
are presumed to exist more than they exist, they are needed to exist more
than they exist, because our received narratives of racial and sexual
injustice tell us that institutional and cultural racism is the iceberg
below the sea, an iceberg signaled by the visible tip of extraordinary
hate crimes. Crime has an intentionality that is tangible, a concretization:
from it we infer the concrete intentionality of what is hidden from view. So campuses
mobilize at every blackface, at every act of minor vandalism, at every
hostile word or mysterious epithet. The sign is given! But no one
knows how to deal with subtle, pervasive forms of discrimination, and
that's partly because the discourses we have available to us about
fighting discrimination hold that it is equally bad regardless of its
form or nature, that the harm suffered by being misrepresented, slighted,
overlooked, denigrated, condescended to is one part of a seamless and
unitary phenomenon that includes being lynched and put in the back of
the bus. And they are connected. They are part of a connected history,
but they are not the same. History contains continuity and rupture both. The gleeful
critics of campus politics roll their eyes at this equivalence and take
it as evidence of the triviality of the academic left. I agree with conservative
critics that it's a mistake to stress the continuities between the
brutalities of Jim Crow and the subtleties of unconscious stereotype and
subtle exclusion in present practice, but this is not to say that the
latter is non-harmful, or just something to shrug off. One thing I learned
by being a white man living in a black country is that it is an incredible
psychic drain day after day to know that you are marked as a stranger,
as socially different, by mere fact of your physiognomy. It exacts a real
toll on you, and every subtle thing that people do to remind you of it,
without any malice, digs the psychic claws deeper and deeper. This innocent
wounding, this cumulative stigma, is the core of the problem. Many look
for, expect or anticipate hate crimes on campus as the visible signs of
a pervasive malevolence, an illegitimate system of holding power, as an
indication of a willfulness and agency that is the illegitimate and contestable
cause of the sense of alienation and unease that some students, some faculty,
some people have within white-majority campuses. Those crimes come less
often than predicted, and when they come, they mostly dont seem
to be the acts of Simon Legree's spiritual descendants, deliberate
acts rich in the intentionality of power, but accidents and oversights,
misspeech and crudity. Some see in these incidents the secret of racial
conspiracy revealed, rather like Eddie Murphy's brilliant sketch on Saturday Night Live where, disguised as a white man, his character
finds that white people give each other free money and privilege once
racial minorities are out of sight. They overdeterminedly read a synecdoche,
a single moment that contains a hidden whole. And when the right number
and type of crimes do not come, some make them come, certain that even
if the incident is false, the deeper truth is not. Rigoberta Menchu's real story is still interesting and powerful: a woman with some education, some status, some resources, some agency, in confrontation with a state and a social order, witness to terror and suffering. Its ambiguities are what could teach us, not its stridency. If we want to confront racial alienation on campuses, we will equally have to embrace its ambiguities, its subtleties, and recognize that it cannot be easily marched against, confronted, protested, forbidden by statute or code, expelled. It is in us, it is us, and the world has changed in the time we have all come into being and found ourselves where we are. It is not dogs and firehoses now, but small words and the pain of a thousand pinpricks. Until that is fully understood, there will be occasions when stressed, needy people tired of waiting for Godot try to summon into being the spirit whose ubiquity they have too busily prophesied. March 23, 2004 Via Pandagon, evidence that whatever was funny or smart about Dennis Miller has evaporated into dust and blown away. And I regret it, because I do think he was both funny and smart once upon a time. This judgment has nothing to do with ideology. I am perfectly prepared to credit and like some of the transformations in his politics he's talked about in the media, presuming they're for real and not just somebody trying to make a career move; some of what he talks about resonates with me. But this is as shameful a meltdown as anything Dan Rather or anyone else has ever had on live media. Miller likes to talk as if he's got cojones: well, anybody with real balls would get up the night after pulling this kind of stuff and apologize unreservedly to his rapidly shrinking audience and to his guest. Been playing the full version of Unreal Tournament 2004 the last few nights for about an hour or so each night (more than that and I feel like my eyeballs are bleeding). It's really good, at least the Onslaught mini-game, which is clearly influenced by Halo. What's nice is that though I haven't played an FPS for two years or so, I'm actually not a complete noob at it--I'm doing pretty well. It seems to me that multiplayer games like this have only a short "golden age," though, before cheats and hacks become widespread and cheeseball tactics take hold. Onslaught is pretty well designed to prevent some of the cheesiest tactics, like tank rushes, but I can already see a few stunts that could spoil the fun if lots of people start to pull them. Speaking of Unreal Tournament, the Penny Arcade guys have come up with one of the funniest and most spot-on summaries of the online world in general with this cartoon. March 22, 2004 Beyond the Five-Paragraph Essay When I hand
back analytic essays, I try to leave room to do a collective post-mortem
and talk about common problems or challenges that appeared in a number
of essays. I think it helps a lot to know that the comment you got is
a comment that other people got, and also to know how some people dealt
more successfully with the same issue. All anonymous, of course, and following
my Paula-like nature, nothing especially
brutal in terms of the actual grades dispensed. I usually base my comments on some scrawled meta-notes I keep as I work through each batch of essays. Sometimes there are unique problems that arise in relation to a particular essay question, which is sometimes a consequence of my having given certain students enough rope to hang themselves in the phrasing of the question. Often there are problems I've seen before and commented upon. Read the Rest of "Beyond the Five-Paragraph Essay" March 16, 2004 Readings and Rereadings #1 I've been meaning to use this blog to compel myself to tackle the backlog of 50 or so books of various kinds that are sitting on my "to be read" shelves. So here I go. What I plan to do in this part of the blog is record short or long reactions to some or all of a book--not formal "book reviews." Some of these may amount to no more than a paragraph. If I can stick to it, I hope to tackle 2-3 books a week this way most weeks. So here goes with number one: Noam Chomsky, Hegemony or Survival. March 16, 2004 Anybody else like Tripping the Rift on the Sci-Fi Channel? It's sort of like "Quark" meets "South Park". Obscene, a bit stupid at times, tries too hard, but still funny. I feel a fragging coming on: the demo for Unreal Tournament 2004 is kind of fun, especially the "Onslaught" game. It's been a while since I've done this kind of thing: I may actually get the full edition. This is a mirror of a really fascinating archive of photos from the region around Chernobyl. March 15, 2004 Terrorist Tipping Points Following
up more prosaically on my thoughts about the "hard men," the atrocity of March 11th makes me think again
about what moves below the surface in the conflict with terrorism. Somebody
put those bombs on those trains in Spain, and yet that same somebody doesn't
wish to stand forward and be clearly identified, or tie these acts to
some concrete goal or demand. So someone someplace has a model of causality
in their head, that to do this thing without any clear public explanation
will still somehow produce a result they deem desirable. But what? A general
climate of fear? An unbalanced response by governments? A sick morale-booster
for terrorists embattled elsewhere? A victory for the Spanish opposition?
Or nothing more than a nihilistic desire to act somehow, with no real
conceptual belief about what action will accomplish? Particularly if it turns out to be ETA that was responsible for March 11th (something that appears increasingly unlikely), that last is about the only plausible interpretation. What March
11th really demonstrated, however, is that any time a small group of people
decides to do something like this in the United States or Western Europe,
they probably can. Given the degree to which Americans have portrayed
al-Qaeda as boundlessly blood-thirsty and completely unprincipled, the
question of the day is thus not "What will they do next?" but "Why haven't they done more?" The answers, I think, are
uncomfortable. First, the
strength of US reaction to 9/11, particularly in Afghanistan, when we
were still focused on al-Qaeda and international terrorism rather than
the Administration's unhealthy obsession with Saddam Hussein, communicated
something very valuable and important, that major terrorist attacks would
have major consequences. Osama bin Laden and his lieutenants may have
reckoned that 9/11 would result in the lobbing of a few more cruise missiles
at deserted camps and innocuous factories in Sudan. Having seen that this
was incorrect, having suffered severe damage to their movement's fortunes,
they and others may hesitate to act again against civilians within the
domestic borders of the United States for fear of even graver consequences.
On the other hand, this is where March 11th is a sobering reminder--because
it may demonstrate that a terrorist movement which has nothing left to
lose has no more fear of consequences. The worst atrocities might come
paradoxically when a terrorist organization is closest to being defeated.
Second, for
all of my anger at aspects of the Bush Administration's homeland
security initiatives, I still have to concede that many of the precautions
taken and the investigative work completed have made it more difficult
for existing terrorist cells in the United States to act. It is easy to
be cynical about all the orange alerts, not the least because the Administration
has been so willing to use security concerns to bolster its own narrow
partisan fortunes (not something a genuine "War President" ought
to do) but even Administration critics have to concede the very real possibility
that the alerts and accompanying measures have prevented one or more attacks. But that
still leaves us with one additional consideration, which is the possibility
that existing terrorist cells capable of acting have chosen not to act.
This is what is so difficult to calculate. Everyone is looking at the
political results in Spain and asking, "Is that what the terrorists wanted? Will that reward them?" Precisely because we have to treat
terrorists as people with their own agency, making their own moral and
political choices, we have to consider the possibility that they might
refrain from attacking for any number of reasons, even including, impossible
as it seems, their own contradictory and hellishly incoherent form of
moral scruples. This is a
critical issue. Even in the best case scenario, we have to assume that
there are still people at large in the United States and Western Europe
who could stage terrorist attacks. Anybody who devotes even a small amount
of time to thinking of plausible targets knows that not only is there
a huge surplus of such targets, there must always be so in democratic
societies. The train attacks in Spain could easily have happened on Amtrak:
in the past ten months, I've sat on Amtrak trains where people in
my car have left a backpack on a seat and gone to the bathroom or club
car, or so I've assumed. If they were leaving a bomb instead, how
could any of us tell? Trains only scratch the surface: a hundred ghastly
scenarios spring to mind. Without any effort, I can think of ten things
that a handful of suicide bombers could do in the US or Western Europe
that would have devastating psychological and possibly even economic consequences
at the national and international level. If there are terrorist cells in the US and Western Europe capable of acting, and they have not acted, we can perhaps console ourselves that Afghanistan taught them to fear the consequences. We can also imagine that they are intimidated by security precautions, unimaginative in their choice of targets, or incompetent in their logistics. More than that, though, this all raises the question: what do they want, how do they imagine they will get it, and how does that dictate their actions? For all that it is soothingly simple to imagine them as mindless killers who would commit any atrocity, we nevertheless face the complicated fact that they likely could have committed atrocities beyond those already inflicted. What internal calculus tips a small group of men over to the commission of horror? There is no invasion force that can keep that tripwire permanently still: there is nothing to invade. The worst dilemma, however, is that we do not know and perhaps cannot know what the terms of that calculus are, whether it moves to action because of rigidity and repression or in the absence of them, whether it seeks anything concrete in terms of results or reactions. If it only seeks pure destruction and the maximization of pain, then I don't really understand why there have not been more attacks already. There must be more to it than that. March 10, 2004 Triumph
of the Will, or in the name of my father Because one
of the major themes of the book I'm writing now is the nature of human agency in historical processes, I've been thinking a lot about
whether some individuals are able to act in the world through drawing
on unpredictable determination or mysterious inner strength, through a
ferocious desire to make things happen. Through will. Will gives
me a thrill. If there's anything in President Bush's defense
of his post-9/11 strategy that resonates in me, it is the invocation of
will, of a steely determination to stay the course. I know I'm weak and frightened. I've always been. When I am traveling or working
in southern Africa, I preemptively flinch at even the slightest hint of
tension. In my first stay in Zimbabwe in 1990, when a policeman politely
but quite earnestly commented that he would have to shoot me if I didn't stop walking while the president's motorcade went past, and then meaningfully swiveled his gun towards me, I waited frozen and then returned to my apartment instead of proceeding on to the archives. I crawled inside like a rabbit frightened by predators, emerging only the next day. I don't
mean to overstate. I have willingly gotten into strange and sometimes
threatening situations every time I have spent time in Africa. Not with
fearless bravado, rather with a kind of sweet and stupid cheerfulness,
a determination not to listen to the warning bells going off in the back
of my head. I listen to my anthropologist friends who programmatically
seek out opportunities to attend unnerving religious rituals and tense,
near-riotous political situations and I wonder wistfully why I'm so scared and they're so brave. I know that if it came to it, I'd piss my pants in a minute. Big Brother wouldn't
need a cage full of rats on my face in Room 101 to get me to betray my
deepest commitments. I found that
out when I traveled with my father in South Africa. When we were confronted
with a rather trivial example of a shakedown by a corrupt official in
a game park, I was ready to unload my rands on the man in a country minute,
just because he had a knife and a walkie-talkie (and, I imagined, a bunch
of tsotsi pals waiting down the trail to ambush us). But Dad just
stared him down, and the guy caved. Yet here
I am willing, perpetually willing, to talk about what we ought to do in
a world where people want to kill us, want to kill me. What good am I?
There's
more than one flavor of will in the world, though, and all of them can
make things happen that would not otherwise happen. There's
a pure will to violence and survival that's a highly masculinized combination
of sadomasochism and swagger. We mostly see it in our fictions, in Rocky
films or in the umpteen thousandth time that Wolverine staggers through
a comic book stoically bearing the pain of a hundred knife thrusts to
his abdomen, but it really exists. The trick is not minding that
it hurts. Mostly in the real world this amounts to nothing: lacking
mutant powers or cinematic magic, the man of a thousand wounds usually
staggers towards death, perhaps performing some small miracle of salvation
or destruction on the way. Sometimes it is more, a person who shrugs off
pain and fear to stagger through to some better day. This kind
of will is related to but not identical to the soldier's will, the
will to fight when necessary or ordered, the will to act remorselessly
if need be, to defend what is yours and take what you must. My father
had some of that. When a crazy man with a gun killed people at another
branch of his law firm, Dad wished he'd been there, believing that
he could have stayed calm under fire and stopped the man before anyone
died. Dad used to tell me how the Marines taught him to kill or disable
someone by striking their windpipe hard. I don't think any of this
was bravado, or something he was proud of. They were quiet facts, stated
calmly, based on a belief that if it came to it, he could do what was
needed without pause or regret. I believed him. The soldier's
will is not the will of the hard man. The hard man is the man who haunts
our nightmares. The hard man is the man who disproves the easy, lazy adage
that violence never solves anything or causes anything meaningful to happen.
The hard man can drive history like a whipmaster drives a horse, frothing,
eyes-rolling, galloping heedlessly ahead. The hard man dreams not of the
world he desires: his will is fire, and burns down thoughts of better
days. The hard man only knows what he does not want and cannot accept,
and his determination to strike out against the object of his fury is
mighty. The hard man bombs pubs and buildings and planes; he cuts ears
off defeated rivals, hands off innocent children, heads off journalists.
When we think of will, the hard man is the one we both fear and yet sometimes secretly desire. He laughs contemptuously at the doubts that afflict us, sure that he floats above us like an iron balloon, unyielding and untouched. We forget too easily why fascism authentically, legitimately attracted many before 1939: not just the purity of its conception of nation, not just its focus on essence, but also the hardness and clarity of its commitment to transformation, its baptismal yearnings. The hard
man's close cousin is the fierce dreamer, the obdurate idealist, the person
who looks at today and can only see the ways in which it is not some ideal
tomorrow. I may be too quick to accuse some of utopianism--that will require
some reflection--but I do not think I am wrong to fear the utopian's will
and regard with suspicion anything redolent of it. None of these
are the will to do the right thing even if all the world says otherwise.
To do the right thing, but not quickly, not eagerly, not with braying
certainty. The will to do the right thing comes from men and women bound
by honor, directed by wisdom, burdened by a mournful understanding of
their duty. Atticus Finch does not rush ahead, beating his chest and howling
a war cry. Will Kane seeks allies and the support of his community, even
when he wearily understands that he is all alone. There is no eagerness
in him. The lonesome righteous can make horrible mistakes, auto-imprisoning
himself in obligations, like Captain Vere in Billy Budd. He or
she can end up staring with melancholy regret at his or her dirty hands. This
is the kind of will I most admire, the kind of courage which stealthily
rises to lift the whole world on its shoulders and reluctantly hurl it
into a new orbit. Against the hard man, we raise the quiet man as his
opposite. Dad may have
had the resolve of a soldier, but he also had this kind of determination
as well. He would have stayed the course even if he was the last person
left to hold the rudder. There was a rawness to his integrity: it was
like sandpaper, flaying the sensitive nerve-endings of some around him.
It was uncompromising both when it ought to have been and sometimes perhaps
when it would have been better to bend rather than break. Nor was he tested
as sorely as some have been: he never had to risk his own career, his
livelihood, his future the way that some whistleblowers have. I think
he would have, though, if it had ever come to it. This is the
will I long for now, and it's not what we're getting. Oh, they'd
like to have us think so, but the lonesome righteous doesn't scorn
allies, doesn't rush to the last stand at the OK Corral. He does
his best to avoid the fatal breach in the social order. He doesn't
talk tough and swagger. I'd trust in Atticus Finch, not Napoleon. I'd trust in Omar Bradley, not George Patton. I won't trust the hard men or the men who playact at being hard men, those who theatrically declare they will be stopped by nothing. I won't listen to the men who shake their heads sadly at our inability to travel all the way to atrocity, who tell us we must act by any means necessary. But neither will I trust those who lack the will to justice, the will to fight if they must, the will to defend, those who snidely declare in advance that they will blow with the least wind and worry more about their own personal purity than the larger obligations of our times. I may be weak and frightened, but I'm not having any of that. I'll trust in the people who both love and defend; I'll trust in the will of the fierce and quiet. I'll listen for the distant echoes of my father's footsteps. March 3, 2004 Battle of the Moms Lots of recent
online (and, I suspect, offline) discussion about Caitlin
Flanagan's article in the Atlantic Monthly that criticizes
working women and praises stay-at-home mothers. At least
some of the bad juju circulating in those discussions (and Flanagan's
piece) concerns settling old scores within feminism. There are many who
have never forgiven the feminists of the 1970s for the evident disdain
they demonstrated towards middle-class women who remained in the home.
With good reason: women who felt liberated from domesticity tended to
falsely assume that all women should want the same. Just as a matter of
politics, that mistake was costly, alienating many women who might have
been sympathetic to a more loosely conceptualized feminism. The women's
movement has never really recovered from that blunder, losing the sympathy
both of middle-class women who have chosen domesticity and working-class
women for whom the workplace is not liberation but brutal necessity. Taking a
step back, it's interesting that the conversation continues to pit
two sets of women against each other, each vying for the title of best
mother, each notably defensive about their own choices and lives
while projecting the charge of defensiveness onto their opponents. It's
a struggle that's hard to imagine between men about fatherhood, for
a lot of reasons. For one, there's a wider plurality of archetypes
of the good father out there: men can get kudos for being domestic and
attentive or being strong and above the fray, for being breadwinners or
slackers. It's also clear that men don't fight about fatherhood
because they don't feel defined by it: the battle over manhood is
sited somewhere else. Women, on the other hand, can't escape motherhood
even when they're not mothers: they are held accountable to it by
society, and hold each other to it as well. There are
brush fires that burn in the struggle over parenting -- say, for example,
the question of whether or not to ferberize kids. (We tried it, and it
didn't really work for us, both in terms of the emotional impact
it had on us and our daughter, and in terms of results.) Then there's
the wildfire of staying at home versus day care versus nannies. In either
case, the small or the large, everyone involved would gain a lot of perspective
by reading Ann
Hulbert's Raising America, a history of advice aimed at
American parents by various experts. One thing I take away from Hulbert's
book is a confidence that kids are resilient, that the parental choices
we treat as momentous have far less import than we might guess. Another
thing I take away is a wisdom about how remarkably stable the long-term
terms of contestation over parenting (permissive vs. strict, involved
vs. distant) have been within the American middle-class, and how much those
contests are about middle-class manners and self-presentation rather than
a disinterested evaluation of the development of children. One thing
in Flanagan's piece and the reaction to it where I feel a bit distant
from almost everyone in the debate has to do with Flanagan's charge
that middle-class feminists are exploiting and thus betraying other women
by using them as domestics and nannies. In a way, it's a silly point,
because it's awfully hard to contain to domesticity. What's
the difference between a once-a-month cleaning service and all the other
kinds of service jobs that the middle-class makes use of? If the charge
of exploitation attaches generically to domestic work (not to specific
low-wage conditions of employment), then it attaches to all service-industry
labor and Flanagan's critique is suddenly a lot less about child-raising
and much more a back-door socialism. But I feel
differently about it also because I've spent a substantial amount
of time living in southern Africa. During my first fieldwork in Zimbabwe,
I was intensely phobic about domestic service, and felt as Flanagan does,
that it was exploitation. I'd read Maids
and Madams, I knew that domestic workers in southern Africa were
exploited. So I was determined to wash all my own clothes and clean my
own apartment (there were no laundromats in Harare, even in the good old
days of the late 1980s and early 1990s). The family
who lived in the small home behind my apartment building had a different
opinion about domestic service, since they provided it for everyone else
in the building. From their perspective, I was a selfish prick. I could
pay to have my clothes cleaned, but here I was occupying a unit in the
building and refusing to employ them. They weren't at all happy about
it, and once I became aware of that, I really didn't know what to
do. I went on washing my underwear in the bathtub but grew more and more
puzzled about my reluctance to do what virtually everyone around me regarded
as the right thing, including local leftists I knew whose commitment to
fighting racial segregation and colonialism had been deep and abiding
for the entirety of their lives. I began to
realize that it really wasn't about exploitation for me -- that
was just a superficial thing, a cheap ideology, a slogan, and not at all
consistent with my casual willingness to take advantage of other people's
affordable labor in other spheres of my life. What
it boiled down to was that I was intensely uncomfortable about having
strangers inside my domestic space. Not racially phobic, but generically,
universally so. I didn't want any people seeing my dirty clothes,
my books, my things, my way of life, if they weren't very close friends
or family. I still feel that way, actually. For a very long time, I blocked
my wife from hiring a once-a-month comprehensive cleaning service for
this same reason, even though we were finding it increasingly impossible
to handle that kind of cleaning with a toddler around. I just didn't
want them seeing the normal material conditions of my life. (I still don't
allow them in my home office.) I was eventually convinced--and now view that
service like any other comfort in my life provided by human labor, made
possible because I earn more than the people whose labor I purchase. I
do it because I can. If I don't like it, that's for different reasons
entirely. I wonder a little if the stay-at-home moms' argument doesn't come from some of the same attempts to assert privacy, to cocoon some of our lives away from the world, to close the circle of family and shield ourselves from the world. I have some of that same attitude myself -- but I'd like to avoid draping myself in laurel leaves and anointing myself Ace Exploitation-Fighter for having what is ultimately less a principle and more a phobia. February 24, 2004 Purge the PIRGs The discussion
of Ralph Nader online has produced an interesting eddy in its wake, namely
an equally passionate attack on Public Interest Research Groups (PIRGs),
which Nader played a role in founding. I don't
actually map my feelings about PIRGs onto Nader, though their mutual connection
doesn't help me get warm and fuzzy about either of them. In many
ways, I object more fundamentally to PIRGs. They're a scam. Like
Jane Galt, I first reached that conclusion as a canvasser for a PIRG
one summer in the early 1980s. I only lasted about three weeks before
the toxic combination of viciously exploitative labor practices and a
recognition of the organization's total lack of concern for political
commitment completely alienated me. If you pounded the pavement all evening
but came in just shy of quota, you didn't get paid at all for your
work. The people running the canvassing operation had zero interest in
the issues or the ideas: they were in manner or functioning little different
from the boss of a sweatshop factory floor. Keep the money rolling in
and send it along to the state organization: that was the sole priority.
The spiel we were told to memorize was a frankly deceptive presentation
of the organization and its activities. PIRGs have a long habit of parasitically
attaching themselves to legislation and claiming credit for it -- and
only if they deem it something fuzzy and blandly liberal enough that it
is likely to raise more money or garner good publicity. There's no
coherent agenda beyond that, and never has been. My antipathy
deepened when a PIRG came rolling into town at Wesleyan University, when
I was an undergraduate, seeking an automatic fee attached to the tuition
bill. The whole presentation was slimy both in content and style. First,
they dangled an internship in front of student officers, and then they
shifted abruptly to left-baiting and bullying when anyone (a class of
people that most definitely included me at that point) asked why on earth
a PIRG should get an automated chunk of money every year when no other
student group had the privilege -- a chunk of money which would be completely
non-transparently spent, moreover. As a magnanimous gesture, they finally
offered a system where you could come and get a refund of your PIRG money
if you were willing to show up at a basement office during a one-day window
once every academic year and ask for it. This is all standard for PIRGs
then and now: they push for mandatory fees, and accept as a fall-back
an opt-out. It's
not just that PIRGs are sleazy in their fund-raising and opportunism.
Reading Jane Galt's essay, I wonder a bit at whether they haven't
played a subtle but important role over two decades in disillusioning
young liberals and leftists and driving them rightward as a result. Based on my own experience and the experience of people close to me, I'd say that liberal nonprofits in general are usually not what they seem on the outside, or at least, rarely apply their outward convictions to internal matters. They often have unfair, exploitative or even discriminatory labor practices. They're often intensely hierarchical, non-democratic and non-transparent in their internal organization. But PIRGs are in a class of their own. At least with something like the ACLU or Amnesty International, whatever their internal cultures are like, they stand for something consistent politically and socially. PIRGs don't even have that. February 23, 2004 The Old
Man and the Flame The inner
flamer. It's such a temptation to let it loose. I feel like Black
Bolt of the Inhumans: if I but speak, half the city could be destroyed.
In my salad
days, I could crack off a mighty flame. Ah! In the Usenet days of alt.society.generation-x,
when the resident objectivist could drive me to the dark side of the Force
with a single post. Or rec.arts.startrek.current, when all it took to
set me off was the resident feminist Voyager fan praising
Captain Janeway and telling all the critics that they were misogynists
for hating the show. Many Shubs and Zuuls knew what it was to be roasted
in the depths of the Sloar that day, I can tell you. These days,
there's only one moment where I feel completely and gloriously justified
in letting Mr. Hyde off his leash, and that's in conversations dealing
with Ralph Nader and his defenders. Not at Nader himself, really, because
it's obvious what his problem is. It's the people who still
defend him and proudly announce they voted for him in 2000 and they'll
do it again who drive me out of my tree. They're a minuscule number
of people overall, and not really that influential -- but I suppose
they could be just influential enough, which is very bad. As I said over
at Chun
the Unavoidable's, the incoherent mish-mash of justifications
for voting Nader, as well as the complete shamelessness of those offering
them, just brings out the worst in me. I sometimes
wonder why I can't flame more often, or when exactly it was that
I developed a helpless compulsion to fairness. Maybe there's something
to this notion that the older you get, and the more comfortable
and attached to responsibilities you become, the higher the cost of acting up and the
more you become a kept creature of the system. Maybe I've just become
The Man. Maybe. I'd
like to think it's something more, that it is about taking the ethical
responsibilities of my profession seriously -- something that I feel
the usual Punch-and-Judy responses of both right and left inside and outside
of academia don't do, no matter how strenuously they claim to. More
pressingly, it's about efficacy, about how you make your convictions
meaningful and powerful in the world. The flamer
really has only a few roads to efficacy and influence. There's one in
which he or she forces everyone to accept his or her view through command
over institutional power (in which case the flame itself is no more than
a calling card for other kinds of compulsion). There's another in which
achieving results in the world doesn't matter, in which only the
unsullied narcissistic purity of expression is the issue. I agree that
the latter view sometimes produces beautiful prose -- a brilliantly
written flame, curse or diatribe is a pleasure to read. So thank god for
the occasional narcissist, but only if they also happen to be brilliantly
bilious stylists. I suppose sometimes the flamer might hope to change the behavior or views of others through shame, and thats the one time I still think its worth it to let the beast out (as I do against Nader voters): when only outrage and defiance has a hope of breaking through a wall of denial and stupidity. That's my only defense in that case: Nader voters appear so unpersuadable by any other means--in fact to be proud of their near-total invulnerability to any persuasion--that there's nothing left besides flinging feces at them. There are others on the American political landscape similarly cursed with mule-headedness, but I don't flame them because either I don't understand or know them well enough to be sure of their unpersuadability (whereas I feel like I understand Nader voters very well) or because, frankly, they're powerful enough numerically or culturally that it's important to keep trying to push the boulder up the hill no matter how Sisyphean the task. That's one other thing a flame can do: when your cause is lost and hopeless and yet you are still certain that you are right, a flame can be the last thing you do before defeat, a refusal to go gentle into that good night. In that case, a flame is an admission of fatal weakness and should be read as such. Perhaps that's me and Nader voters: I know nothing can stop them so why the hell not scream at them, just to get my own catharsis.Finally,
the flamer can be a blackmailer who demands he or she be given what he
or she wants or he or she will lay waste to any possibility of a reasonable
exchange between equals. That's the Ann Coulter approach to the public
sphere: I am become Flamer, Destroyer of Worlds. Being addicted
to mediation and fairness, to exploration of complexity, is actually pretty
exhausting. You get a lot of shit from everyone in all directions, and
very little thanks for it. Some days I'd rather be an anarchic free
spirit, rather go back to dancing in private glee after dropping the bomb
on the weakest link, the most suspect argument, the most clueless fool,
rather go back to being the hairy-eyebrowed bombthrower hurling discord
from the back of the room. This other guy who usually comes out to play
here and elsewhere in my professional life, well, he's not the guy
I imagined I'd become. He's dour and perpetually disappointed
in the weaknesses of himself and even more of other people. In one virtual
community I have participated in, a departing member who took the time
to spew venom on his way out said that I was a person who wanted to be
superior to other people and liked by them because of it. I remember that
because there's something to it. I suppose it's actually confirmation
of its accuracy that I don't think it's all that terrible a
thing to be. I also admit that a commitment to reasonable persuasiveness
and unvarnished if polite truth-telling can often be a quite satisfyingly
contrarian, dissenting, provocative thing in its own right. Still, flaming seems a more glorious, free thing to be and do. It would be liberating to stop bothering to instruct, cajole, plead, work with, mediate and persuade, to worry about nothing but one's own blazing righteousness and care little for the consequences or the results. That's rather like voting for Nader. Which reminds me of why I really stopped doing it, because I saw again and again that when even a few people flame, the whole discursive house burns down. February 20, 2004 On How to be a Tool I just saw
a call for a March 3rd rally against the Comcast-Disney merger led by
PennPIRG, Media Tank, Prometheus Radio Project, the Communication Workers
of America, and Jobs with Justice. Joining this
rally is about as good an example of being a tool as I can think of. Media
monopolization is a real issue, but rushing to the barricades to defend
Disney from Comcast is about the worst possible way I can think of to
deal with the overall problem. Disney executives ought to come outside
and give anyone at the rally $10.00 coupons to the Disney Store in thanks.
The fact that PennPIRG is apparently the key organizer just reinforces
my low opinion of the opportunistic and amateurish nature of PIRGs in
general. It's
actually hard to know who to sympathize with in the Comcast-Disney struggle.
I've had a low opinion of Comcast's services for a while. Their
technical management of their high-speed Internet service after Excite@home
went belly-up was horrible. The hysterically overwrought, manipulative
drumbeat of attacks against satellite dishes on Comcast channels is a
pretty good argument against media monopolization. Their current level
of service in their On Demand offerings is beyond lame. It's
no wonder they want to acquire Disney to provision content, because the
content that they generate themselves is usually one bare step above the
kinds of public-access channels that have recently released mental patients
who've gone off their meds hosting talk shows. If Comcast succeeds,
expect a whole lot of problems of integration between the two operations:
the learning curve will be by far the steepest on the Comcast side. On the other
hand, if Disney shareholders can't see that Michael Eisner and his
inner circle of sycophants are dragging the company down, they aren't
paying attention and deserve to lose value on their investment as a result.
Any parent with young children can see it: the company has foolishly surrendered
the long-term stability of the high ground in children's media by
relentlessly cannibalizing its own properties, spewing a tide of made-for-video
junk that permanently degrades the value of its most lucrative properties.
There are so many misfires coming out of Disney lately that it's
a wonder that there have been any successes like Pirates of the
Caribbean at all. It used to be that you could see a real difference
between the weak storytelling and cheaper animation of non-Disney kidvid,
as in the work of Don Bluth. Now Disney has voluntarily sunk to the bottom
in pursuit of a few quick bucks. Tack on to that Eisner's evident
inability to attract, recognize and maintain talent, almost certainly
because of his own authoritarian management style, and you have a company
that is heading for a serious crisis. If I owned a lot of stock in Disney,
I'd sure want to give Eisner the boot, and if it took Comcast to
do it, I might well cheer them on. It probably isn't going to be a story that ends happily ever after for anyone, least of all the consumers -- but in a world where there's a lot to protest (including media monopolization), being a puppet for Michael Eisner strikes me as a low priority. February 20, 2004 Quicksilver and Foucault I am finally
almost done with Neal Stephenson's Quicksilver
(just in time for the sequel!) Stephenson reminded me of why I find early
modern Europe so interesting, but also of why the work of Michel Foucault
was so appealing to me and to many other historians when we first encountered
it. It is easy
to label postmodernism as a single agglomerated villain and attribute
to it every bad thing in the past thirty years. It gets blamed (sometimes
in one breath by the same person) for dragging intellectuals into total
irrelevance and for accomplishing a devastatingly comprehensive subversion
of Western civilization. In academic arguments, a generalized postmodernism
often functions as an all-purpose boogeyman in declensionist narratives,
the singular explanation for why the young turks aren't as good as
the old farts. (Though that may be shifting: the genuinely ardent postmodernists
are themselves becoming the old farts, and will presumably shortly be
blaming something else for falling standards.) This general
posture allows people to get away with some appalling know-nothingism
at times. When reading E.O. Wilson's
Consilience, I was excited at first by his ambitions to achieve
the "unification of knowledge," to re-create the practice of
the Enlightenment when science and philosophy, interpretation and empiricism,
were joined together. Then I began to realize that Wilson meant unification
roughly the same way that Hitler meant to unify the Sudetenland with Germany.
Nowhere was this more evident than in his treatment of Foucault. Wilson basically
admits that he just read a bit of his work, haphazardly, and thought, "Come
on, get over it, things aren't so bad." I say all
this as someone who does often talk about an agglomerated postmodernism
rather loosely, and who certainly views it quite critically. I reject
almost all of the deeper ontological claims of most postmodernists and
poststructuralists, and I find the epistemologies that many of them propose
crippling, useless or pernicious. And yes, I think that a lot of them
are bad writers, though let's leave that perennial favorite alone
for once. But I still recognize the ontological challenge that postmodernism,
broadly defined, offers as a very serious, substantial and rigorous one.
Nor do I just brush off the epistemological challenges that postmodernists
have laid out: they're real and they're important. (Though yes,
at some point, I think it's perfectly fair to say, "Yeah, I
get it, I get it," and move on to other things. You're not required
to read and read and read.) The thing I regret most about casual rejectionism of a loosely conceptualized postmodernism (or any body of theory) is that it seems to deny that it is possible to read a single work and extract some insight or inspiration from it that is not really what the author's full theory or argument is meant to lead you to. It's rather like one of the professors I encountered in graduate school, who would circle words or terms he didn't like and ominously ask, "Do you want to be tarred with that brush?" It's a theory of citation as contagion. Taken in
totality, I think Foucault is doing his damnedest to avoid being pinned
down to any particular vision of praxis or anything that might be summarized
as a "theory," in a way that can be terribly coy and frustrating.
Inasmuch as he can be said to have an overall philosophy, I find it despairingly
futilitarian and barren, and I accept very little of the overall vision.
Taken instead as a body of inconsistent or contradictory suggestions,
insights, and gestures, his work is fairly fertile for historians. If nothing
else, he opened up a whole range of new subjects for historical investigation
from entirely new angles: institutions like prisons or medicine and their
practices, forms of personhood and subjectivity, and sexuality. It's
interesting that the historical work which Foucault inspired often ended
up documenting that he was wrong on the actual details and often even
the overall arguments, but even then, you can clearly see how generative
his choices of subjects were. What Foucault
does most for me comes from his attempt to write genealogies instead of
histories, his attempt to escape forcing the past as the necessary precursor
to the present, to break the iron chain and let the past be itself. That's
what brings me back to Stephenson's Quicksilver and early
modern Europe in general. The temptation
is so powerful to understand early modern Europe as the root of what we
are now, and everything within it as the embryonic present, all its organs
already there, waiting to grow and be born. But what I find so dizzying
and seductive about the period is also its intimate unfamiliarity, its
comfortable strangeness. I don't feel as epistemologically and morally
burdened by alterity as I do when I'm dealing with precolonial African
societies, where there's so much groundwork seemingly required to
gain the same sense of interior perspective, but on the other hand, I
always feel that around every corner in early modern European societies
the familiar makes itself strange right before my eyes. The genesis of
the present but also the possibilities of other histories; the world we
have inherited but also all its doppelgangers and ghosts. That's
what I feel Foucault's idea of genealogies helped me to explore and
understand, and what I think Stephenson manages to deliver in Quicksilver.
The thrill of natural philosophy unbinding the world, so much a part of
the more whiggish history of science, is there, but also its distance.
The Royal Society are ur-geeks and proto-technophiles and yet, they're
also aliens. Jack Shaftoe is the libertarian dream, the free man cutting
loose of the constricted world around him -- but he's also the
passive, drifting inhabitant of a commercial and social landscape unlike
anything we know today, to whom events happen, recapitulating the narrative
structure of the picaresque. Reading Quicksilver is like wearing
bifocals: you can switch in and out of being able to locate yourself within
its episteme. I'm not entirely sure it's a good modern
novel, really, nor is it good history -- but it is a good genealogy
as well as a genealogical simulation of the narrative roots of the novel
form. This isn't a pleasure limited to representations of the early modern world: Jeff Vandermeer's edited anthology of pseudo-Victorian/Edwardian medical prose, The Thackery T. Lambshead Pocket Guide to Eccentric and Discredited Diseases, delivers some of the same satisfactions through simulation (rather like Philadelphia's Mütter Museum does by simply showing you the medical artifacts and exhibitionary vision of the same era). But simulations or explorations of the Victorian usually feel much more like recursions of the present than trips to a fever-dream alternative universe. Quicksilver, like Foucault, travels farther and tries harder to give us a way of representing the early modern European world that doesn't just make it into a toddler version of our own times. February 18, 2004 And Now For Something Completely Different Well, not quite--I see my colleague Prue Schran has joined the conversation about Swarthmore and speech. Actually, I quite agree with a lot of her observations--they're relevant to what I was writing about in "A Pox on Both Houses", as well as some older posts at Easily Distracted about the conservatives-in-academia question. But attitudes and formal speech policy are absolutely not the same thing, and if attitudes rather than policy are the issue, the lever that will move them really is subtle, sympathetic moral and intellectual suasion, or at least that's my feeling. Feeling restricted or ostracized by the pervasive attitudes or unspoken orthodoxies of colleagues is very different from being formally restrained by a quasi-legal code--though of course the existence of the former phenomenon is why it is hard to trust to any procedures outlined in formal policy. There's also the more arcane issue of how to untangle policies on harassment and speech. I think FIRE is overly sanguine about how easy it is to disaggregate them, both legally and conceptually. Also, O'Connor offers some extra tricky arguments on top of that about the alleged legal invulnerability of academic institutions to federal employment law (is that really true? Where's the Volokh Conspiracy when you need it?) and the legal freedom of colleges and universities to restrict speech if they want unless they otherwise claim that they're not restricting speech, in which case O'Connor sees them as open to legal claims of fraud. At that point my head spins a bit: if colleges have published speech codes or harassment policies which O'Connor and FIRE say clearly and unrestrainedly restrict freedom of speech, and O'Connor acknowledges that colleges and universities are legally free to do so, then by their reading, wouldn't a charge of fraud be legally untenable? Where's the fraud if you have a published speech code that restricts speech and you're legally free to do it? Unless, of course, the kind of thing I've been suggesting is true, that there is a reading of many college policies as also trying, authentically, to promise academic freedom, and that it is the authenticity of that intent which makes its contradiction by speech codes potentially fraudulent. Maybe this is an indication that the only solid ground for challenging speech codes is a moral or ethical one--that we shouldn't have codes because they're wrong to have, because liberty is the bedrock value of academic life, and leave the legal issues aside.
That's certainly one of FIRE and O'Connor's most salient and consistent observations, that whatever their merits or problems on paper, faculty- or administration-authored speech codes are basically a case of amateurs meddling in the construction of bodies of pseudo-law, hoping to direct the power of a quasi-state entity (their institution) to regulate local behavior. Anyway, on to more diverting things. A couple of days ago, my toddler and I found a great new thing at Noggin's website, called ScribbleVision. Basically, you color in a bunch of things and then ScribbleVision animates your colorings in a series of scenes featuring the hand-puppet characters Oobi and Grampu. It's one of those things that will very rapidly have the adults taking the mouse away from the kids. I was especially proud of my scene of Sauron's Lidless Eye dawning over Oobi's house, with a demonic rooster crowing in the background. Let's say that my impression of Oobi and Grampu's animated actions and expressions changed somewhat against that backdrop. February 17, 2004 The Argument Clinic (Apologies to Monty Python) There is a real difference between my reading and Erin O'Connor's reading of Swarthmore's policies on speech, one which may reflect some very deep differences in the ways we approach working with the interpretation of texts and much else as a whole. There are
also stylistic differences: I'm long-winded, obsessed with nuance
and ambiguity, and uninterested in calling people to the barricades even
when there is an evidently urgent need to get them there. O'Connor
is trying to mobilize people, and to do so with as much immediacy and
intensity as she can. On the whole, I think we agree about a lot of the
problems facing academia, and in particular, about the dangers to speech
rights in academia today. O'Connor's way of framing these issues
is certainly much more powerful in getting people to acknowledge and confront
those dangers. But I still worry about collateral damage on the way. Sometimes,
I think complexity really is important, not just as an aesthetic preference
but as the heart and soul of an issue. Perhaps on speech rights, what
is more important is the root principle of the matter, and assertions
of complexity are an unhelpful distraction. I would rather build bridges
and mediate between opposing sides, playing for small positional gains.
O'Connor would rather burn bridges and achieve victory in our time.
You make the call, dear reader. There are reasons to prefer either approach,
and reasons to think that in either case, we are kids with hammers who
think everything in the world looks like a nail. O'Connor
raises some real potential problems with Swarthmore's policies, most
of which we broadly share with all colleges, and indeed, all institutional
entities with sexual harassment or anti-discrimination policies. Here are
three probing questions, which I think are pretty cogent, that I get out
of O'Connor's second post on this subject:
I have a
straightforward answer to the first question, which is that as I read
it and understand it, our policy on non-harassing speech takes precedence
over everything else, that it is the largest and most expansive principle
we assert on the issue of speech. Harassment (sexual, general, discriminatory)
is only a situational, contextual exception from the general principle,
and only becomes meaningful when it can be proven to exist according to
a defined set of precise criteria. In this sense, harassment under Swarthmore's
policy functions rather as defamation or incitement to violence
function in relation to the First Amendment. The First Amendment is the
bedrock principle; defamation or incitement are special cases which restrict
speech only in relation to a judicial finding, and only within narrowly
constrained and defined bounds. They exert no prior restraint: you cannot
in advance define particular acts of speech, particular words, particular
phrases as defamation or incitement. It's all about context. If you
take Swarthmore's policies on harassment to completely cancel out
or obviate the existence of a comprehensive protection of speech in our
policy, as O'Connor does, then you are basically setting yourself
up as a free speech absolutist in general, and arguing that any circumstantial
restriction on speech annihilates a foundational protection for speech,
that the existence of libel laws definitionally and intrinsically cancels
out the First Amendment. You can make that case, and some do. I think
it's incorrect. I'm not clear if this is O'Connor's
general position on speech rights. I might also
note that to take this position is to argue that Swarthmore (or any other
college) can never actually articulate a policy that sanctions harassment
which makes reference to speech acts. I'd actually be curious to
see whether O'Connor thinks that it is notionally possible for a
university to reserve the right to expel a student who personally harasses
another student on a repeated basis but commits no direct violence against
them. If one student followed another student around campus saying, "Faggot.
Faggot. Faggot" continuously for a week, are there any legitimate
grounds for saying, "Listen, that's a problem that goes
beyond moral persuasion directed at the harasser"? If so, is there any
way to construct a policy that legitimizes administrative action without
making reference to speech? We went out of our way, at any rate, to avoid
defining that speech as a class of speech like "hate speech"
which would be definable without reference to context. In fact, it doesn't
really matter what one community member says to another if there's
a finding of general harassment here: the content of the speech is irrelevant.
If the content is irrelevant, I really think it's not about a restriction
on speech. Except for
the sexual harassment and discriminatory harassment policies, and here
I can only reiterate that I believe -- I hope -- our general protection
of speech is firmly understood to be the bedrock principle that has precedence
over those policies. On the second question, of whether the sexual harassment policy is a ticking time bomb or slippery slope, in particular because it is adjudicated through a grievance procedure which has no due process protections as they're commonly understood, well, that's a real point. It's my big problem with most such policies on college campuses, and the major place where they are typically mischievously misused. O'Connor is right to say that I essentially trust my colleagues and my institution and trust that nothing will go wrong, but she is also right to suggest that this is a flawed approach. I agree here that we share in common with most academic institutions a serious problem that could well obliterate any of the best intentions of our policies. I would also underscore, as I did in my first post on this subject, that I regard hostile environment standards as intrinsically dangerous. (Though I suppose here too I wonder whether O'Connor thinks that there is anything that would constitute sexual harassment besides quid-pro-quo, and how you could identify it in a policy without reference to speech acts.) On the other
hand, I think O'Connor simply shrugs off the question of legal exposure
and liability -- and easy as it would be for me to do so, I have enough
empathy for those who have a legal responsibility to safeguard the resources
and wealth of this institution to recognize that you can't have a
revolution in one country on these issues. Barring a serious statutory
reform of harassment law in general, it is insane for any single institution
to voluntarily expose itself to serious liability by failing to conform
to existing legal standards, whatever the weakness of those standards.
On the third
question, I have to confess that I'm busily inquiring about just
where the policy statement on discriminatory harassment came from. I remember
the debate on the general harassment policy and the question of "hate
speech," and how we came to the policy we have. I remember the same
for the sexual harassment policy. But I'm honestly puzzled about
this smaller statement, and where it came from, particularly because it
seems more pressingly contradictory to the statement on general harassment
and speech rights. I'd sum up by saying, however, that I really think O'Connor simply doesn't give Swarthmore enough credit for drafting a policy which is actually quite different from the campus norm, and which was actually intended to repudiate the idea of a speech code, with its prior restraint on defined classes of speech acts. I don't see the policy as a "Trojan horse" with sinister conspirators inside, much less see myself as one of the Greeks waiting to pillage. As I see our existing policy, students here could hold all the affirmative action bake sales they like without any fear of sanction or judicial action by the college against them (though not without fear of being criticized for doing so). O'Connor chooses to portray me as a person who conveniently "pretends" otherwise. No, I just think it's more complicated than she lets on, and that there is as much reason for optimism as there is for criticism, that the devil -- at least in this case -- is in the details. February 17, 2004 Thanks for Playing Well, at
least this time, Erin
O'Connor has it really wrong. Swarthmore
has no speech code. The community specifically rejected having a speech
code when we considered the issue. We specifically insisted that speech
which might be regarded by some as offensive is non-adjudicable, and underscored
that the college administration can bring no sanction against individuals
for what they say regardless of how offensive it might seem to others. There is
only one way that speech can be an adjudicable issue at Swarthmore, and
that is if it is part of a repeated, persistent attempt by one individual
to personally harass another individual. The standards for this are very
precisely enunciated in our policy on general harassment. You cannot generically
harass a social group or identity. There is no one-time instance of general
harassment -- a single statement cannot be taken by one individual to
represent an act of general harassment by another individual directed
at them: it must be persistent and repeated. Our sexual
harassment policy, from which O'Connor draws almost all of her quotes,
was adopted at a different point from our general speech and harassment
policy, and, I agree, has a few emphases which differ from the overall policy,
in that it states that it is possible for a one-time action to represent
a hostile environment against which someone might have a grievance.
Three things are worth noting about this policy, however. First, the general
speech policy supersedes it, as I understand things, i.e., the specific
protections granted free speech are the most important policy dictates
we have on this subject, and the sexual harassment policy does not contradict
or contravene those protections. Second, the sexual harassment policy
contains an important qualifier which O'Connor notably fails to cite:
"The perception of conduct or expression as harassing does not necessarily
constitute sexual harassment," and it goes on to state that every complaint
must be carefully examined on its own merits. No statement or idea or
expression is categorically identified, outside of the context of a specific
complaint, as prohibited or punishable. A grievant is directed to ask
a perceived harasser to stop, and if they do not do so, is given the option
to pursue a grievance procedure -- but there is no a priori finding
that any given expression creates a hostile environment. Third, I would
note that aspects of this policy take the form that they do in order to
achieve compliance with existing federal law on sexual harassment: if
there is an issue here, it is an issue whose locus is far beyond this
campus. This is not
to say that I'm entirely comfortable with the content of this specific
policy: I found it overly broad in several respects when the faculty voted
on it, and I'm especially concerned about the ways a hostile
environment standard can be and has been misused on college campuses -- but
it is specifically the hostile environment standard which
federal law has legitimated. To expressly repudiate it in college policy
is an invitation to a devastating liability claim against the college
at some future date, because it would place the college at odds with a
clear body of legal precedent. (When institutions or employers lose such
lawsuits, it is often precisely on the grounds that they were made aware
of a hostile environment and did nothing to correct it. Were we to state
outright that we reject that a hostile environment can actually exist,
we'd be wide open to such a finding.) Still, I
have to stress again that the impression O'Connor gives about even
this aspect of the sexual harassment policy is downright wrong even beyond
her mischaracterization of it as an overall policy governing speech. A
Swarthmore student or member of the faculty expressly cannot be punished
merely for saying something that has the characteristics described in
the sexual harassment policy -- which O'Connor implies. There is
nothing adjudicable unless there is a grievance from a specific grievant,
and that grievance must meet the specific test of being harassment with
specifically sexual intent. John Ashcroft couldn't file a grievance
against Arthur Schlesinger under Swarthmore policy unless he thought Schlesinger
was making a quid-pro-quo demand for sexual favors from Ashcroft or
was making Swarthmore a hostile working environment in a sexually
demeaning way. (Since neither of them works here, the hostile environment
standard wouldn't apply in any event.) Let me quote
from the Swarthmore College policy statement on uncivil or demeaning non-harassing
speech, since O'Connor didn't see fit to share this with her
readers (although speechcodes.org
does reprint this policy in full): "As a member of Swarthmore College, one's moral responsibilities extend beyond formally sanctionable conduct. All of us, therefore, have a responsibility not to indulge in gratuitous offensive expression just because it may not be subject to official sanctions. Anonymous offensive expression is generally inexcusable, but the risk of harm in making adjudicable all forms of offensive expression would not only outweigh the benefits of official proscription, it would also seriously endanger academic freedom." "Even when individuals (or groups) admit authorship, however, they act irresponsibly if they are unwilling to engage in a defense of their views, especially with those targeted. Perpetrators of alleged non-adjudicable but uncivil expression should engage the objects of their attacks through discussion and, possibly, mediation. If they do not, however, no disciplinary action will be taken, though College officials or anyone else may publicly decry the content and manner of such expression." "It
needs stressing again that the College will in no way formally discourage
any argument, provided it does not include threats of violence, though
what is said may be deplorable and very possibly more diatribe than argument."
That's not a speech code. It's the antithesis of a speech code. It's a specific protection extended to speech, and a specific forbidding of judicial and quasi-judicial forms of sanction against speech by the administration or the community. February 16, 2004 A Pox on Both Houses, or Conservatives in Academia (again) It's
Punch-and-Judy Show time, with academic blogs trading knocks over the
question of whether conservatives are discriminated against in academia.
Let me once again go over some of the important complexities that seem
to me to be absent from most of the discussion.
Now add some new points about the latest wave of discussion on this issue:
February 11, 2004 Short notes 1. Regarding my earlier woes with my home PC, to my amazement, PestPatrol's tech support gave me a fairly simple command line to run through one PestPatrol utility to fix the aftermath of cleansing SahAgent off our home PC, and it worked, restoring all Internet functionality without any further difficulties. That is just about the first time ever that a tech support person has been able to give me straightforward advice that had straightforwardly good results. I've been reading up more about Winsock2 Layered Service Provider spyware like SahAgent and if anything I'm more appalled than I was before. (For the curious, a sketch of where these LSP hooks actually live follows the next post.) Is there any question in anyone's mind that this sort of thing should be illegal? I don't see any difference between it and a virus in terms of destructive intent. 2. I'm fairly embroiled in an interesting discussion of what makes for a good childhood over at Crooked Timber--very high quality conversation, particularly the comments from Russell Arben Fox. Short summary of my arguments: I don't like "prosocial" children's programming, which I've hammered at before in Saturday Morning Fever. Not even Veggie Tales. And I let my daughter play "Morrowind" and "Neverwinter Nights" with me on my PC, monster-slaying and all. (When the PC is working.) Anyone who fears for her future goodness, take heart: she won't let me steal anything with my thief character and consistently tells me that I shouldn't kill monsters. Unless they're mean ones. 3. Responding to Laura over at Apt. 11D: I do most of the cooking, clean the dishes about 25% of the time, do all the stereotypically manly jobs like assembling toys and furniture or lifting heavy objects, dress and diaper if I'm closest (and did most of the diapering and baby care from months 3-12) and many other sundry acts of parenting. I also read the bedtime story. I am sorry to admit that I aggressively evade doing the laundry as for some reason I pathologically hate doing it. I would say I definitely don't pull 50% of the domestic weight, so yeah, I kind of suck. But I also think I'm more one of those slacker boys Laura is talking about who has cut back at work to spend time with family rather than the Type A achievement-chaser, which maybe I once was to a greater degree. Which is, I'm beginning to sense, a more complicated choice in its professional consequences and ego-impact than it first appears. No wonder men (and Type-A superwomen) get all angsty and weird at middle-age. 4. Quite a trollish thread over at Kuro5hin on blogging. My short response to the troll who kicked it off would be that yes, of course most personal webpages of any kind are banal. That's hardly a new thing, nor a result of Movable Type. I remember very well that one of the reasons Justin Hall's links.net, whose tenth anniversary makes me feel very old and decrepit, got such a readership at the outset--it wasn't just nekkid pictures and stories about sex and drugs that drew people, but also that almost every other "home page" out there was a bunch of links to other links and nothing more, while Justin was putting up new and interesting material almost every day. Content then and now is king, and can come from anywhere, whether a blog or Atlantic Monthly Online. Blogs that originate content are more interesting to me, and more what I aspire to for myself, than blogs that do nothing more than link to content elsewhere. But even in collective banality, there are interesting things to see and think about.
Even at their worst, the Web in general and blogs in specific represent an interesting sociological aggregate, a way to track the massed preoccupations of various constituencies and the movements of various memes. February 3, 2004 From Hell's Heart I Stab At Thee Well, somehow
my wife took an accidental wrong turn while web-surfing on my home PC,
I think because she misspelled a common domain name for a children's media
site. I came home to find something squatting malevolently in the computer
called SahAgent, which seems related to Gator. Busily infesting
our PC, it kept popping up advertising windows every few minutes onto
the desktop while keeping a log of all our web-surfing and downloading
or unzipping more and more executables of various kinds that wanted access
to the Internet. I rushed to get an application I'd used once before
to search for spyware called PestPatrol (I know, you're all screaming,
"Use AdAware instead, dummy!" "Be really careful removing SahAgent,
idiot!" I am today a bit more knowledgeable than I was on Friday.)
PestPatrol quickly recognized and then supposedly straightforwardly cleaned
the system of tons of SahAgent-associated crap (also lots of things related
to a driver called WildTangent that I think I may have foolishly
allowed on the machine when visiting Shockwave's site and playing
games there.) Unfortunately
that was also the end of our home Internet functionality altogether: browser,
email, everything has gone bye-bye. PestPatrol's tech support has
some ideas that I'll try tonight, but I have the bad feeling I'm
going to end up reinstalling Windows from scratch. Bye-bye two days of
my life if so. I know, I know, all the techier-than-thou out there are
rolling their eyes and saying, "Use Linux, asshole" or "That's
your fault for using Internet Explorer, fool." Blaming the victim,
guys. I find myself so gobsmacked at the very nature of the experience (like so many others before me). If I happened to dial a wrong number on my telephone, and the mere act of doing so more or less destroyed my telephone, I suspect there would be real legal consequences for whoever was keeping the telephone-destroying answering machine out there. There are some strange differences in both law and practice in the case of computers and the Internet that to me seem inexplicable. With SahAgent or Gator or what have you, somehow, somewhere, somebody real is making real money by hijacking other people's computers and sending them advertisements whether they want them or not, downloading software involuntarily onto their machines and the like, and yet that person or persons is basically legally untouchable. Somebody somewhere is paying to squat on domains that are misspellings, just waiting for an accidental visitor so they can seize control of their computers. Whoever these people are, they've cost me time and money. They're going to end up costing Microsoft money as well, because I've been weighing whether having a PC in order to play games and get access to a relatively wide range of software is worth the hassle, and this has pretty well decided it -- it's probably not worth it, and our next machine may be something else, while my gaming shifts to consoles. (PC games are dying, anyway.) Grrr.
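A technical footnote on how this class of pest works: an LSP hijacker like SahAgent registers its own DLL as a link in the Winsock2 provider chain, so every network call on the machine passes through it -- which is also why ripping the DLL out without repairing the chain kills networking entirely, exactly the failure described above. The sketch below is only a hedged illustration, not the PestPatrol utility's fix: it assumes the standard Winsock2 catalog location in the registry, and it assumes that each PackedCatalogItem value begins with the provider DLL's path as a NUL-terminated string. All it does is list the installed providers, so an unfamiliar DLL can be spotted before anything gets deleted.

    import winreg  # Windows-only standard library module

    # Assumed standard Winsock2 catalog location in the registry.
    CATALOG = (r"SYSTEM\CurrentControlSet\Services\WinSock2"
               r"\Parameters\Protocol_Catalog9\Catalog_Entries")

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CATALOG) as root:
        # QueryInfoKey()[0] is the number of subkeys, one per provider entry.
        for i in range(winreg.QueryInfoKey(root)[0]):
            name = winreg.EnumKey(root, i)
            with winreg.OpenKey(root, name) as entry:
                packed, _ = winreg.QueryValueEx(entry, "PackedCatalogItem")
                # Assumption: the provider DLL path is the leading
                # NUL-terminated ANSI string in the packed blob.
                dll = packed.split(b"\x00", 1)[0].decode("mbcs", "replace")
                print(name, "->", dll)

The reason listing comes before deleting is the whole story of this post: if a provider's DLL vanishes while its catalog entry remains, the chain is broken and nothing on the machine can open a socket until the catalog is repaired.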