

August 20, 2004

The Rule of Four and the Romance of the Professorial Life

I just finished The Rule of Four over the weekend. It was pretty weak stuff, and I’m not clear why it got the reviews or attention that it did. It’s mostly a padded and inauthentic coming-of-age narrative mixed with some alumni nostalgia for the (exaggeratedly portrayed) student culture at Princeton, topped off with a bit of unimaginatively warmed-over Name of the Rose. The inner voice and experiences of the main character didn’t remind me of any undergraduate I’ve ever taught, met, or been.

One thing I did get out of it is that the authors carried away a woefully inaccurate sense of academic culture from their time at Princeton, but one whose inaccuracies are drawn from some deeper archetypical representations of academia and professors.

Academics know which authors do get it right, or somewhat right, when they write about academia: David Lodge, Jane Smiley, Randall Jarrell. Many of these satiric or comic treatments of academic life have a wider readership. There are also hysterically wrong portrayals of academics that I don’t think anyone regards as credible or intended as such—did anyone besides me see Lou Gossett Jr. as an anthropologist in a television mystery series a few years back?

But The Rule of Four is another matter. It draws on an older, deeper expectation that people have about academics, that they are engaged in a romantic, eccentric, and often dangerously obsessive quest for truth and knowledge. The humanities professors and students in The Rule of Four are all driven by the thirst for discovery—they’re basically Indiana Jones and Belloq dueling amid the dusty library stacks, trying to be the first person to properly understand a 15th Century manuscript.

There are so few aspects of work in the humanities that are or ever have been like this. There’s an occasional flare-up of that kind of drama-queen “I have the secret to all knowledge locked in my office” stuff around Shakespeare, particularly around the issue of the “true” authorship of his plays. But mostly, it’s not about discovery in that old romantic, explorers-in-unknown-lands sense. Even science isn’t like that operationally: the era of discovery, always something of a tarted-up mythology suitable largely for third-grade hagiographies of Newton, Curie and Edison, has got nothing to do with contemporary scientific research.

The Rule of Four is even worse on the details. One of the key turns in the plot is a nefarious professor plotting to take credit for the work of an undergraduate by preemptively accusing the undergraduate of plagiarism. Now something like this actually does happen, though it’s mostly professors vs. graduate students, and without the florid conspiracies. What’s wrong about it in the book is that the professor in question has been keeping up a long correspondence with an academic journal about a major article that he’s planning to submit (it will be the plagiarized version of the undergraduate’s research). Um, I just don’t think any journals in the humanities are waiting with bated breath for years for the hot scoop that some professor has promised them, no matter how prestigious the academic in question is. There isn’t a humanities equivalent to Nature, a journal that publishes work which is read avidly not just by academics but by others waiting to hear announcements of major new discoveries about books, culture or philosophy.

There’s also another character lurking about in the book, a graduate student who is planning to apply for tenure-track posts using the same tactic, claiming responsibility for the undergraduate’s research; he’s been drafting letters to various prestigious universities telling them that his research is almost done and that he will soon be available if they want to hire him. I can tell you where a letter like that would go even if its graduate student author could plausibly claim to have found a new book of the Bible hidden inside some shopping lists written by Thomas Aquinas: way back into a file that would never be looked at again. Or the garbage can.

Now some of this is just part of the general clumsiness of this particular book. But I do think that this is still what a lot of Americans think academics are—basically a combination of the Nutty Professor, Professor Kingsfield from The Paper Chase, Dr. Frankenstein, and various and sundry novelistic alcoholics and lechers. People with secrets, people with strange and monastic passions, people with eccentric manners and esoteric knowledge, people who are sometimes horribly unprincipled but usually in an ethereal and otherworldly way. It’s not utterly wrong, but it’s not especially true either.

I wonder how much of this image, which is in an odd way complimentary—it makes the academic into a kind of liminal creature, a modern shaman—has to do with the savage disappointment that one class of academic hopeful experiences after several years of graduate work. I know I had a touch of this idea in me when I started my Ph.D.: one of the things drawing me to African history was the idea of “discovering” things which were unknown, and one of the things drawing me to academia as a whole was my perception that it combined the aesthetic freedoms and personal expressiveness that we associate with writers or artists with the austere purity of a social institution devoted to knowledge. Naïve, I know, but I do think that was in the back of my mind somewhere.

There are many things that I really do see as inadequate or flawed about contemporary academia, particularly the way it goes about reproducing itself through graduate education. But this one disappointment I don’t hold academia especially responsible for. Most institutions have a romance connected to them that gets shattered in the face of the banal, humdrum reality of their everyday functioning. I hear all the time from undergraduates who’ve been disabused of their hopeful fantasies about politics or government work, about nonprofit and charitable organizations, about development work in the Third World, about K-12 education. I even hear some disappointment from former students who went to work on Wall Street or other big businesses, but at least there is a shorter distance between the exalted image and the grubby reality in those cases.

When someone is bitter about academia for that reason alone (that it isn’t about the pure, passionate, eccentric pursuit of truth and beauty, that being a professor isn’t like being a free-spirited writer or artist who also gets a health care package and a regular salary), I don’t know what to say. I think the image is a lovely one. I had it in mind myself a bit when I chose this life. I’d like there to be more freedom, more passion, even more honest eccentricity in academic life. (Though not the murderous rivalries that this leads to in The Rule of Four, of course.) But anyone who has an angry bone to pick with academic institutions that is meant to be a serious call to reform or change has got to have more (and there is more, much more, that could be had) on their bill of particulars.

[permalink]


August 18, 2004

Nothing R Us

Last week, the news came out that Toys R Us is planning to sell off its toy business. This sounds like a business plan by Magritte: Ceci n’est pas un toy store.

This would turn Toys R Us into Babies R Us, which is apparently the only thing they’re doing that’s making a profit. The explanation offered by most analysts was that the company couldn’t keep up with Wal-Mart and other huge superstore discounters.

No doubt some companies do just plain old get beaten at their own game by a new player who does it better, but many just fumble the ball all by themselves. Toys R Us might be about to get out of the toy business simply because they’re really bad at it. I’ve had to spend more time in their hallowed halls than I might like in recent years, and the simple truth is, they suck. Their stores are usually dirty and badly laid out. Finding a clerk to ring you up—don’t even think of asking for help with the merchandise—can be a real chore. Their selection of merchandise is spotty and inconsistent.

Their e-commerce division is even worse. The Toys R Us neighborhood of Amazon.com is the one domain of Amazon’s that is almost guaranteed to have problems. I’ve given up ordering from them. (I’ve tried getting things for my daughter and also (blush) action figures for my own collection.) They suddenly cancel open orders or allow you to order goods they don’t have and never intend to get, even when you can find the things you ordered if you go to a brick-and-mortar store.

I can think of a lot of other retailers that have sabotaged themselves pretty effectively. Our local supermarket, Genuardi’s, is a great example. When it opened near us, we were very happy. Before that, when I first arrived at Swarthmore, the nearest grocers were uniformly horrible, and then Genuardi's came and saved the day. At first, Genuardi’s was a smallish chain with high standards, but not a gourmet or boutique grocer like Whole Foods. It was well kept and well managed, with good-quality ordinary produce and meats.

Then Safeway bought out Genuardi’s and proceeded to pretty well destroy it. The worst thing they did was to aggressively move in a big line of their own branded generics in every part of the store, removing many brands previously stocked. This unsurprisingly drove many customers away, particularly because the Safeway brand was often inferior. I quickly got to the point that I wouldn’t buy it even when it was actually pretty good, just out of annoyance. The meat and produce standards went way down. Many brands of goods I rely on and buy regularly began to be stocked inconsistently. The whole store was reorganized for no particular reason. Safeway has apologized to its Genuardi’s customers, but I haven’t seen any actual changes to the stores themselves—if anything, standards have gone down even further since 2002. Nor have they brought back many brands that customers want. I actually tried calling their complaints line once to report specific brand absences—a phone number advertised on "apology" banners within the store—and got buried in an automated phone maze, which I took to be deliberate.

I suppose in terms of “competition”, this is what tennis players call an “unforced error”. If Toys R Us were a better toy store, I think they wouldn’t have so much of a problem with Wal-Mart. Since they suck, most people figure, “Why not go to Target or Wal-Mart? It isn’t like Toys R Us is more reliable, or cheaper, or even especially fun to walk around and shop in.” If Safeway had just bought Genuardi’s and left it alone, or added product without subtracting others, they’d be doing great. But some corporate wizard decided that if you dump product lines and replace them with your own, you double-dip your profits. Not if people stop buying altogether: Genuardi’s hasn’t sold a single cold-cut to me since they dumped Boar’s Head for their own inferior Safeway brand. (Yeah, I know the official line: they didn’t dump Boar’s Head, Boar’s Head withdrew when Safeway stuck its own line in. Amounts to the same thing: Safeway chose to push their crap on me at the cost of letting me buy the brand I want.) When you make me go to two or three or four stores in order to get what I feel is minimum acceptable quality, you make me start looking for a comprehensive alternative, besides losing all that business.

I always find it amusing when someone claims that privatizing a government service will make it more efficient and responsive. What they’re confusing is the abstract efficiencies that market mechanisms and competition produce—which I agree they do produce—and the internal organizational character of corporations. Corporations are often inefficient, lumbering bureaucracies no different from government bureaucracies in their ability to make, conceal and mendaciously defend bad decisions, no different in the way they allow middle-managers of few talents to fumble the ball and screw up the lives of thousands of employees, not to mention disrupt the everyday existence of numerous customers. Dilbert was funny for a reason (before it got stale and old): because that’s the way most corporations are, really are. At some moments, “Office Space” is more documentary than fiction. The interesting thing is, capitalist idealism aside, a giant lumbering company can fester to the brim with dysfunctional crap and still survive on a combination of inertia, accumulated capital and crony capitalist manipulation of public policy for decades.

Yes, there are business failures that I think are much more mechanistic, much less contingent. Krispy Kreme is clearly struggling with an overly rapid pace of expansion; the same thing happened to Boston Market a while back. There’s a kind of automatic, unplanned character to that sort of problem: it’s like bread dough rising too much when you put in too much yeast.

Yes, there are smart companies that simply outmaneuver and destroy their rivals by being smarter and better when their rivals haven’t done anything particularly wrong. (Or by being more ruthless and amoral, like Wal-Mart is with their employment policies. That’s something Wal-Mart can’t just wash away by giving money to NPR.)

But I think a lot of business problems incubate in the dysfunctionality and unresponsiveness of corporations as social institutions. There’s no reason to prefer them to government bureaucracies—in fact, they’re much worse, because as “private” institutions, they can hide information about their own incompetence and malfeasance much more effectively.

I’ve thought about this a little in some more formal and scholarly contexts. The business-case method favored in MBA programs pays a lot of attention to failure as well as success, and a lot of business consultation touts reorganization and innovation (sometimes in ways that simply create a new layer of middle-management crap to interfere with common sense). It seems to me that a lot of economic history, business history, history of advertising and related fields could do the same. One of the things we don’t study enough is failure, and when we do, we tend to study huge, messy, Enron-level earth-shaking failures. But I suspect the smaller failures—the ad campaign that falls flat, the marketing decision that goes haywire, the product line that dies on the vine, the business venture that sprouts and goes to mold within two years—are by far the more compelling things to study, and could introduce a healthy dose of contingency and agency to both the orthodox core of business and economic history and to left-leaning critiques of the history of capitalism. I even think there's considerable room for a quasi-Foucauldian take on the history of the corporation as institution: Dilbert and "Office Space" are already half-way there, as are John Bruce's tales of corporate bureaucracy.

[permalink]


August 13, 2004

Crisis on Earth-Fanboy

Jason Craft does a wonderful job of explaining why DC Comics’ new mini-series Identity Crisis is really interesting and deeply disturbing for me all at once.

I’ve written before about my wish that the purveyors of what Craft calls “proprietary, persistent, large-scale fiction systems” (I really like his terminology) try to align their fictional conception of ordinary humanity and everyday life with their representations of the fantastic and superhuman.

The author of this new mini-series, Brad Meltzer, is trying to do just that, I think. Beware of what you wish for, because you may get it.

He’s got a very fresh approach to a lot of comic-book tropes: his big fight scene in the third issue of the mini-series is a compelling reconfiguration of the same-old same-old of superheroic battle, professionalizing it in rather the same way a writer of police procedurals might with descriptions of police work.

But Meltzer is also going to one of the deepest tropes of superhero fiction—the secret identity—and positing that the only reason the bad guys haven’t figured out what the secret identities of the superheroes are is that the superheroes have been magically erasing those memories systematically every time such a discovery is made.

It’s not as if this kind of thing hasn’t happened from time to time in comics, but it’s usually because of an accident—the supervillain learns the secret and then immediately does something like fall off a cliff and suffer brain damage, that kind of thing. Occasionally there's the villain who knows the secret but won't tell other bad guys or act on it out of some kind of arrogant anti-hero sense of honor. It’s also not as if clever writers haven’t played with the issue, whether it’s Frank Miller’s run on Daredevil, where he showed just how devastating it might be if a bad guy found out who the good guy was, or John Byrne’s somewhat silly if amusing one-shot suggestion that Lex Luthor would arrogantly reject the idea that Superman could want to be an ordinary person and so erase a researcher’s conclusion that Clark Kent and Superman were the same.

Closer to the mark of the current series, James Robinson told a story in Starman that hinted at three heroes murdering a villain who knew their identities and threatened their families. Meltzer is pretty well going balls to the wall with this theme, though. The story posits that villains find out who the good guys really are on a routine basis and then frequently threaten their families, and that the heroes have an organized conspiracy to erase that knowledge on an equally routine basis.

That’s interesting enough. But he goes from there to somewhere that is good, consistent storytelling and yet really squicks me out. So far, the wife and ex-wife of two superheroes have been horribly murdered by an unknown suspect. We’ve also found out that one of the two women was brutally raped by a supervillain in the past, which led the good guys to administer a magical lobotomy to the villain in question.

There’s just something in me that says this is really not a good place to go, that maybe one of the essential fictions of comics is that somehow, for some reason, it’s really hard for most people to guess who a superhero really is. Maybe I’m wrong. I’ve been really interested in how almost all the successful comic-book-inspired films of recent years have essentially used the revelation of the hero’s identity to the villain and/or to friends and loved ones as an almost routinely climactic moment. Both of Tim Burton's Batman films had the hero’s secret revealed. Both Spider-Man films have done the same. The comics have kept pace with this to some extent. In current issues of Batman it’s getting hard to remember which of his enemies doesn’t know the secret. Lois Lane is married to Superman now. Wally West (aka The Flash) has an identity known to everyone. If you couple the secret identity being less of a storytelling fetish with the rise in supervillains whose villainy is much more consciously if hyperexaggeratedly modeled on “real-world” criminality, you have to ask why the bad guys don’t do what the old-style heroes always feared the villains would do, and that’s target the hero’s loved ones.

The line that Meltzer crossed that might be hard for me to accept is rape. It virtually never happens in the standard comics. The few times that writers have tip-toed in that direction—Mike Grell, in a truly ugly and unnecessary part of his Green Arrow mini-series, or Alan Moore in The Killing Joke—there’s been some attempt to keep it out of frame, implied, contained. With Meltzer it is pretty much front-and-center, though not at all voyeuristically depicted. And because his story attempts to normalize and distribute so many of its proposed revisions of the superhero canon, you’re left asking, “Well, why is this the only time that has happened, given how bad these bad guys are?” And then I find myself thinking I just don’t want to pick up a regular, ordinary, non-pornographic superhero comic from Marvel or DC to find that superheroines are getting raped on a regular basis.

It’s a really well-done comic-book story. It seems to speak to some of the problems I have with superhero comics. And yet I find myself sort of wishing that it hadn’t been done, or it had been safely contained as some kind of “imaginary” or “alternate reality” story.

[permalink]


August 10, 2004

Al-Qaeda on the Inside

So far, I haven’t seen much conversation about Alan Cullison’s fascinating article in the current issue of the Atlantic Monthly (available online only to subscribers at the moment, unfortunately) that centers on information he gleaned from an al-Qaeda laptop acquired right after the fall of the Taliban in Afghanistan.

Perhaps that’s because the information in the article tends to be discomforting for both the ardent defenders of the Bush Administration and some of its strongest detractors.

Cullison’s account fills what I feel is an extraordinary gap in our national—indeed, our international—discourse about al-Qaeda in particular and militant Islamists in general. There are some interesting intellectual histories of contemporary Islamic fundamentalism out there, including Paul Berman’s contentious linking of fascism to Islamism. There are some good institutional histories of the spread of particular Islamic educational and ideological projects under Saudi patronage. There are good accounts of the social roots of Islamism in contemporary Arab nations, and of the role of the war against the Soviets in Afghanistan in providing actual military experience to future jihadis.

But there’s almost nothing that really gets ethnographically inside of an organization like al-Qaeda, that gives us a good model of how they think and operate on a day-to-day basis. All we have had so far is a lot of loose talk about Islamofascism from people who have zero curiosity about the enemy they propose to fight, or on the other side of things, a lot of lazy assumptions about the relationship between terrorism and past U.S. hegemony, as if US policy is a kind of “just add water, create terrorism” thing. In one case, we have terrorists as remorselessly unidimensional, in the other case, as people without real agency who exist as a kind of social formation produced automatically and monolithically by events.

Cullison discovered some interesting things on the laptop he acquired that finally begin to flesh out the complex reality more meaningfully. On one hand, it seems to me that defenders of the Bush Administration’s “war on terror” can actually come out of the article armed with some new support for their views. First, it’s very clear that 9/11 was not a strategic aberration, that the current security alerts may well be warranted and legitimate, and that al-Qaeda, whatever it is and however it is constituted, intends to attack the United States, Western Europe and indeed “Western” influences wherever it can, however it can. If 9/11 wasn’t convincing enough, Cullison’s information should convince more people: al-Qaeda’s plans for terrorism are serious, substantial and of long standing.

More to the point, much of what Cullison found tends to confirm something that George Bush and his associates have said since 9/11, and sometimes been mocked for saying: that al-Qaeda’s principal motivations for planning attacks against the West have a great deal to do with abstract hatred for Western freedoms. Cullison found, for example, that news broadcasts from the West were carefully saved and compiled on the laptop by al-Qaeda observers, but that the image of female newscasters was always covered over. More generally, I see considerable evidence in what Cullison describes of a non-negotiable philosophy of total struggle against the West. There’s nothing as tangible and achievable as a simple withdrawal from Saudi Arabia or a simple ending of support for corrupt Arab autocracies here. It might be that those moves would undercut the larger popular enthusiasm for Islamism in parts of the Arab world, but they would do nothing to placate the core of al-Qaeda’s membership as it stood in late fall 2001. There's also some very interesting and sometimes rather funny material that indicates that al-Qaeda has been actively trying to figure out how to obscure the differences between its members and other Muslims or Arabs and has given serious thought to how to move unmolested across borders and through airports.

Now on the other hand, you can’t just take what you want from the article and ignore the rest. If you go to it and find support for the proposition that the fight against al-Qaeda really is total war, and that a tight focus on homeland security is justified, you have to also deal with another fact that the article extensively documents: that the strongest hope that some al-Qaeda members took into planning for 9/11 was that the United States would respond over-aggressively and clumsily to the attack and entrap itself in a no-win war close to where Islamist insurgents might inflict heavy and continuing damage on the Americans. In other words, what many critics of the Iraq War said before the invasion, that the Iraq War would turn out to serve al-Qaeda’s interests, to grant al-Qaeda's fondest wish, appears to be something that al-Qaeda also believed.

It doesn’t mean that this is necessarily true—another thing that the article does wonderfully is to capture al-Qaeda’s leaders as fallible and capable of serious miscalculation, financial mismanagement and petty in-fighting over small prerogatives—but I think the article, read seriously and honestly, is yet another nail in the coffin of the war in Iraq, and yet more confirmation that anyone serious about the war against terrorism should have been against that war from the outset, and should turn against it now.

[permalink]


August 6, 2004

Well, it looks like the college is going to implement Movable Type on our server, so once it's in place, I'll do the work of migrating onto it. I'm also informed that someone is kindly aggregating me already over at Bloglines, so if you're looking for RSS before I take care of it myself through MT, there you go.

 

A brief rant: I was expecting to receive my DVD of Dr. Syn: Alias the Scarecrow this week, which is truly one of the most memorable things I've ever seen on TV. It's an old Wonderful World of Disney special featuring Patrick McGoohan as a Robin-Hoodish smuggler, but it's astonishingly atmospheric, spooky, and compelling. Well, for reasons unknown, Disney abruptly cancelled the scheduled release of the DVD, which had been heavily preordered at Amazon. All I can say is that they'd better get their act together. Get the thing out or give me some detailed explanation of what the hold-up is. Don't make me hold my toddler's taste for Disney stuff hostage, because I'm just crazy enough to do that. Want to sell more toddler underwear with Princess Ariel on it, Eisner? Then get Dr. Syn out the door. NOW.


August 6, 2004

Powerpoint, Presentations and Persuasion

In the past two years, I’ve attended a number of talks, workshops and conferences where scientists or “hard” social scientists were the dominant presence. Up to that point, I’d always had a kind of envy for what I assumed were the advantages of a conference format where PowerPoint and poster sessions ruled the day. I thought that such a format would allow presenters to get to the good stuff quickly and efficiently, to integrate visual material into presentations easily, and to open up the general conversation and mutual learning processes.

Another beautiful hypothesis slain by an ugly fact.

I knew about some of the common criticisms of PowerPoint, and I’ve recently been enlightened further by some guidance from Eric Behrens. Having seen it used intensively, I give a lot of credit to those criticisms, and feel much less envy for meetings dominated by its use. The one thing that I still really like about PowerPoint and software like it is the ability to integrate visual information into a presentation—a skilled user can make images, films, graphs and so on be a part of an argument or presentation, rather than an illustrative sideline to it. On the other hand, it doesn’t seem to me to be any better at engendering discussion or conversation between a presenter and an audience—which seems to me ought to be one of the major points of having a conference rather than simply posting fifty presentations on a web site and letting people consume them remotely.

It’s not as if I like the norm at a humanities or social science conference any better. Technology rarely enters in any form: papers get read, generally woodenly, by their authors. Few authors bother to write a paper that is meant to be read aloud, instead taking a draft of a journal article or chapter and skipping passages as they go, usually being forced to hurry more and more near the end. Most of the time, there’s as little conversation about the presentation as I’ve seen at science meetings.

It finally did occur to me that even if PowerPoint didn’t have some of the conceptual problems that it does, it would still be a problem for most humanities and social science presentations.

I was recently reading an interesting essay, “What Is Originality in the Humanities and Social Sciences?”, in the April 2004 issue of the American Sociological Review, by Michèle Lamont, Joshua Guetzkow and Grégoire Mallard, that brought this home to me. (It is not available online.) The article crossed my desk because I was one of the informants interviewed for it, due to my work with the Social Science Research Council and the American Council of Learned Societies.

The researchers were looking at how academics involved in judging competitive research grants defined and operationalized “excellence”. Among the things they found was that humanists (including most historians) tended to regard the originality of a proposal as a moral attribute that couldn’t be easily distinguished from the character of the author of the proposal, that originality wasn’t a property that could be neutrally disaggregated from the rhetoric and structure of the proposal itself. The “harder” social scientists, in contrast, tended to have a wider set of metrics for understanding originality, one that included this kind of intertwining of an author and an idea but also potentially appreciated an original approach or hypothesis on its own merits.

That struck me as more or less true, and more or less a reasonable description of some of the ways I’ve operated myself as a judge of proposals. It’s rare that I read a proposal by a historian where some hypothesis or evidentiary finding simply stands on its own, valuable by itself. It’s always tied into the craftwork of the author, the ways in which they write and think, the form their arguments take, the integrity of their use of material. I can think of many historical monographs where two authors are making a similar “finding” but where one monograph seems absolutely original to me because it’s written compellingly and confidently and the other seems dull and tedious to me because it’s imitative, derivative and evasive, because the author doesn’t seem to understand why what they’re saying is potentially interesting.

Take this passage from an essay called “’Voyage Through the Multiverse’: Contested Canadian Identities”, recently quoted by John Holbo over at his blog:

"Here, I want to look at the ways in which Canadian rap and dub poetry make and reconfigure the boundaries of Canada and Canadianness - those contested spaces that often lose their intelligibility outside of state managerial apparatus. But I am interested in how both dub poetry and rap music are often positioned as not constituting "Canadianness" given how rap and dub poetry disrupt and contest the category "Canadian." I am also interested in how state administrative practices aid in positioning blackness as both part of and outside of the state's various forms of management and containment. Blackness is then understood as having a diploctical relation to nation in its resistance and complicity; and its performances are also regarded as something otherwise.”

Like John, I wince while reading it. So imagine that the passage said something like this instead:

“Canadians know that they live in a multicultural society, but also are conventionally portrayed as The Great White North, a country of Caucasian Mounties and beer-drinkers. Many outsiders—and perhaps some Canadians—might regard “Canadian rap” as a humorous oxymoron. The point is not to protest angrily that there is too Canadian rap, and then demand that it be taken seriously and incorporated wholesomely within an official multiculturalism. Because Canadian rap is itself not entirely sure that it is or wants to be Canadian, or in what ways, and neither is the civil society or state to which it relates. It is a good example of the characteristic ambiguities of much global popular culture: of the nation and outside of it, posed as resistance but also as eager for incorporation and acceptance.”

Same argument, same “finding”, as I see it. But I know which of the two I’d be attracted to if I were handing out the money, and it’s not just because the second passage is my own paraphrase. In either case, the argument isn’t a particularly scintillating one, and the finding is pretty intuitive, but I get no sense of command or mastery over the project from the first passage, no sense that the author really knows or is making sense of what he studies.

I recall very intensely being a part of an interdisciplinary center very early in my career where there was a person who habitually waited until the question/comment time was almost over so that he could make the last remarks. (I’m notorious for being the opposite: I’m like Hermione in Harry Potter or Horshack on Welcome Back, Kotter; the moment a talk is over, I go "Ooo! Ooo! Over here! Me! I have a question!") This guy would get the last remark, and it would be so long and arcane and overtheorized that no one could say anything else, both because time was up and because nobody got it anyway. Then one day Mr. Last Question slipped up and said something early, and we all pounced on it and interpreted it and translated it, and we basically got it down to: “I liked the paper, and I think some people are being too critical of it.” In its first incarnation, the comment had been more like (I’ve always remembered this unusually clearly): “I want to affirm the gestural field being initiated in the discursive economy of the paper, the refusal of incorporative strategies, the reconfiguration of tropes, the simultaneous translation and retranslation of language that it proposes to undertake…” and so on.

The thing of it is, when we got the commenter down to agreeing that his comment amounted to, “I liked the paper and some people are being too critical of it”, I think he was surprised to discover that that’s what he had said. It was a rather innocent thing, and made me like Mr. Last Question much more than I had before. He hadn’t known: he was mastered by academic language rather than the master of it.

I still dream of conference formats that no one uses at major professional meetings. I’d rather that most formal presentations of scholarly work—whether the writing of a humanist or the findings of a scientist—be delivered in ways intended to involve audiences, ways that make productive use of the face-to-face meeting of scholars. I’d rather there be more small workshops and roundtable sessions scheduled at large meetings. I’d rather that all scholars at conferences be required to give presentations that are meant to be heard, and meant to be responded to.

I don’t have PowerPoint envy any longer, though. The PowerPoint thing is never going to work for humanities scholars. We don’t have highly concretized knowledge that can be delivered in bullet points to an audience in a way that preserves the novelty or contribution of our work in that compressed form. Scientists and maybe some hard social scientists really can say, “Ok, we found out something that we didn’t know before, and here’s the facts, in the most efficient form we can deliver them to you”. Humanists almost never can do the same.

[permalink]


August 4, 2004

Calling all readers. I crave some advice about the future development of this weblog. I've gotten enough email asking for an RSS feed of some kind that I feel obligated to do it, and as long as I'm at it, I've decided that it's about time I had my own comments rather than just parasitically waiting for someone else to link to my essays.

This weblog is a very primitive, hand-rolled affair. I write my essays, drop them in Dreamweaver, and update to the college's webserver. I kind of like it that way, but there are obvious hits to its functionality as a result. (And it's more labor-intensive to boot.)

I can see three ways forward:

1) I move the whole operation out of the college's domain into Typepad and maintain the weblog myself entirely with all the standard Typepad gee-gaws and doodads.

2) I handroll an RSS feed to go along with the rest of the handrolled stuff and stay put right where I am. I have some attachment to keeping the weblog inside swarthmore.edu because I'm interested in arguing that this sort of writing is a part of what I legitimately do as an academic, not a private hobby. Looking at the materials on RSS (I don't use it myself to read blogs), that doesn't seem too impossible (there's a rough sketch of what such a feed might look like after this list)--but comments are another matter, and something I'd have to continue to forego.

3) Either with IT support or by my own efforts, I get Movable Type or something similar working on the Swarthmore webserver and get all the extra functionality that provides. MT makes me a little nervous both technically and in terms of licensing, etcetera. Our IT staff is overburdened enough: I don't want to push for something that has the potential to go wrong or be a burden later on (with comment spam, or security issues, or a licensing issue, for example).
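
On option 2, for what it's worth, a bare-bones RSS 2.0 feed is just a static XML file, so a short script could regenerate it whenever I post. Here's a minimal sketch of what I mean; everything in it (the feed title, the entry, the output filename) is a made-up placeholder rather than anything real on this site:

    # Sketch of a hand-rolled RSS 2.0 feed. All titles, links and
    # descriptions below are hypothetical placeholders.
    from email.utils import formatdate
    from xml.sax.saxutils import escape

    # One (title, link, description) tuple per weblog entry.
    entries = [
        ("A Sample Entry",
         "http://www.swarthmore.edu/SocSci/tburke1/#sample",
         "Placeholder description of the entry."),
    ]

    items = ""
    for title, link, desc in entries:
        items += ("  <item><title>%s</title><link>%s</link>"
                  "<description>%s</description></item>\n"
                  % (escape(title), escape(link), escape(desc)))

    feed = ('<?xml version="1.0"?>\n'
            '<rss version="2.0">\n'
            '<channel>\n'
            '  <title>A Placeholder Weblog Title</title>\n'
            '  <link>http://www.swarthmore.edu/SocSci/tburke1/</link>\n'
            '  <description>Essays and commentary</description>\n'
            '  <lastBuildDate>%s</lastBuildDate>\n'  # RFC 822 date
            '%s'
            '</channel>\n'
            '</rss>\n' % (formatdate(), items))

    # Write the feed where the webserver can serve it.
    open("rss.xml", "w").write(feed)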

Advice on any of this from people more adept than I with these questions?


August 4, 2004

Indian Guides

Among the many family pictures I scanned while I was at my mother’s was this one:

It was strange to see that image again. My father and I at a meeting of Indian Guides, a kind of pre-Boy Scouts thing for young boys and their fathers. We were in the “Cherokee Tribe”. I was “Little Red Hawk”. He was “Big Red Hawk”.

I only remember fragments about the experience—I was probably in it for only a year or so. They would beat drums at meetings. You wore warpaint. You wore leather hand-made badges of the tribe around your neck. I think we did some kind of craftwork stuff at meetings, whatever it was that six-year-old kids could do that was plausibly “Indian”. We went on some kind of trip up to the Angeles mountains, and there was still a good deal of snow up there, to my chagrin, since my mother sent me without any snow gear at all.

It had all the basic embarrassing goofiness of all organizations of its type, that kind of ur-Shriner or Rotary Club manly associational thing, but also the extra absurdity added on top of a bunch of Anglo men and their boys playing (badly and inauthentically) at being Native Americans. Many years later, I reminded my dad of Indian Guides, and he commented, “Of all the dumb things that a dad has to do for his kids, that was the number one dumb thing by a mile”. I couldn’t agree more.

The more complicated thought I have about the experience, however, is that the long-time pressure to change Indian Guides (now known as “Y-Guides” as a result) is a good example of where the cultural politics of identity went badly wrong and ended up as the punching bag known as “political correctness”. That journey, from limited, useful claims to free-floating puritan superego and finally to despised omnipresent caricature, is an arc that centers on the culture wars of the 1980s. Within that history, there’s a moment of metastasis, where the practice of identity politics, which had a fairly complicated intellectual genesis, escaped the left and became generically distributed but remained strongly associated with the left by the general population, even after that practice was no longer at home there, even after Americans of all political persuasions learned to use and wield the claims of “political correctness” when they provide a tactical advantage. (I’d say that conservatives today are actually among those most prone to deploy “identity politics” type claims on their own behalf.)

Deep down in its foundation, what came to be glossed as “political correctness” drew on two reasonable propositions:

1) Racial or other forms of social identity and associated forms of social inequality seem to have a lasting character, persisting in the United States and other societies even when major legal and political discrimination ends. There must be something about social identities which draws its force unconsciously from everyday practice and culture rather than formal legal and political structures.
2) Language and representation are not “just” words, but acts. The identity of a speaker, the social context in which he or she speaks, the relation of a speaker to an audience, and so on, make an important difference. The same words in two different contexts mean two entirely different things, and the context in which those words are interpreted also changes their meaning.

These two observations had fairly different intellectual histories. Brought together, however, they led to a third proposition:

3) Speech acts and cultural representation are an important part of the maintenance of discrimination and the definition of social identity.

You can argue with either of those foundational propositions. You can certainly argue with the combined argument. But they’re not obviously silly or trivial; they have a lot of validity to them. They’re serious arguments, not intrinsically leading to the kind of schoolmarmish politics that later came out of them and that are today a rhetorical staple of institutional and cultural politics for some groups on the left and the right.

Where did those ideas go so badly wrong? Well, I think it might have something to do with the way something like Indian Guides or the Atlanta Braves tomahawk chop ended up being understood: first, with a lack of proportionality; second, with a lack of proper historical perspective; third, with a lack of interest in intentionality; fourth, with a lack of curiosity about the general phenomenon of impersonation and identity play in American society.

The lack of proportionality is the easiest mistake to catch, and is the chief reason that “political correctness” is now such a punching bag. If Indian Guides was a part of a system of representation connected to the oppression of Native Americans, it was a ragtag, left-over bit of trivial effluvia, not, as some activists put it, the centerpiece of the "dehumanization" of Native Americans. The foundational arguments behind political correctness insisted on the seamlessness and coherency of oppressive systems of representation, claiming that every symbol and sign with even the least visible hint of a stereotypical referent to race, ethnicity or gender is imbricated with equal vigor in lynchings or violence against women or the Trail of Tears. One typical example: after a student showed up at a Halloween event in blackface, we had a “teaching event” here at Swarthmore at which one of the students in the audience managed to leap casually from talking about the history of blackface to being angry that some of his peers mistake him for his brother—a person to whom he has a close genetic relationship.

Part of the reason for this lack of proportional differentiation between the way that different symbols in different contexts are tied to the maintenance of discrimination has to do with a lack of interest in the cultural, intellectual and social history of these representations. Indian Guides in the 1960s was a harmlessly stupid thing in part because it was an impotent and discarded leftover of a much more charged, violent and painful history. I don’t doubt that as such it could give pain or offense to a Native American who feels a victim in the aftermath of that history, but the fact that a symbol or practice invokes some past practice or representation for an individual does not make it equal to that past practice. I might see an allusive hint of past oppressions in a present-day text, but to collapse the distinction between then and now is to live in a kind of hallucinatory atemporality. We sometimes run into students here who insist that the present condition of African-Americans in the United States is indistinguishable from antebellum slavery, and thus that the acts of representation which seem to them to have a racial component must be equally indistinguishable from the cultural experience of enslavement. That collapsing of distinctions trivializes past suffering while also making it impossible to have a real and tangible politics in the present: it denies any motion or change in the past, and so cannot imagine a condition of change in the future. This collapsing of distinctions is at its worst when it applies itself to culture, speech or representation.

“Political correctness” seems to have most profoundly grated on general sensibilities when it ran (and still often runs) roughshod over intentionality, when it discards any interest in why someone said or did something and takes the determination of meaning in a speech act as entirely dependent on what someone hears or feels it to be. No one involved in the creation or perpetuation of Indian Guides was setting out to say anything about the social position of Native Americans in the United States in the 1960s. The participants might be said to have been blissfully, foolishly ignorant of and uninterested in how a bunch of white guys calling themselves the “Cherokee Nation” and making lanyards might play in that wider world, or look to the descendants of the Cherokees, but the innocence of the participants is also a material and political fact worth taking seriously before making any criticisms. Audiences at Atlanta Braves games don’t set out to say anything when they do the tomahawk chop: they’re just doing what fans at Braves games do. To tell them that what they are doing means something that none of them actually intends to say is unsurprisingly alienating to some. So many of the conflicts and critiques spurred by identity politics borrow the rhetoric of legality and charge people with crimes. But if we’re going to talk about crimes, we have to talk about intentionality: it’s a centerpiece of our ideas about justice and injustice.

To get easily ruffled by Indian Guides, or anything comparable, is also to miss a more complex history underneath it of racial and gendered impersonation, of people playing at being other identities not to mock or hurt, but honestly to explore and make creative use of the experiences of others, however ineptly. Indian Guides isn’t that different from the kind of cultural cross-dressing associated with the legacy of Karl May in Germany, or with any number of other kinds of practices that are at least as complicated politically and culturally as various practices of drag or transgender performance that tend to get exalted rather than attacked within identity politics. To me this is the most important thing that we've lost sight of, the most interesting thing to re-examine.

None of this is to say that Indian Guides shouldn’t have turned into Y-Guides or what have you, or that Braves fans shouldn’t rethink the tomahawk chop. But whether those things happen or not is simply not terribly important: no one’s politics should be built around pressing hard for those kinds of changes. (Or defending strenuously against them, perhaps.) There’s not that much at stake, and a much messier cultural history, with more meanings and possibilities, than conventional identity politics is inclined to credit.

[permalink]


July 28, 2004

Two Essays on Massively-Multiplayer Games

1) The first essay I post here, as a PDF, is a paper I presented at a conference in Bristol in the summer of 2001 (see Ben Talbot's report on that conference for more). I should have circulated it widely then, but I didn't. First because I had hoped it would be part of an anthology I was trying to assemble that eventually collapsed because the other contributors had other obligations or couldn't commit after all, and then second, because I sent it in to a fairly well-known cultural studies journal. I sat on it for a while after getting peer comments. Some of those comments were useful and valid enough, and others--as several colleagues and friends predicted--amounted to a kind of disciplinary gatekeeping from new media and CMC researchers who didn't think I had paid my dues in that field. (One of the reviewers went through a long song-and-dance about how cultural studies always should address questions of production, circulation and audience, and that my piece was too focused on only one aspect of the audience's consumption of the game text. Grandmother, here's how you suck eggs.) So I finally decided to just get the thing out of my closet, for whatever it's worth. Though I updated it a bit in early 2002, it's now badly out of date in a number of ways, most crucially because it only takes passing note of the work of Edward Castronova, who has pretty much changed everything known on this topic. (As has Julian Dibbell.) It's also fairly out of date in its summary of the state of affairs in EverQuest, Asheron's Call and Ultima Online, and in the wider world of MMOGs. (There's been some good material at Waterthread lately on the current state of things, not to mention the usual fine content at Terra Nova.) But here it is: Rubicite Breastplate, Priced to Move Cheap: How Virtual Economies Become Real Simulations.

2) The second essay is a shorter think-piece about a different way to design massively-multiplayer online games, about how to move more decisively towards making virtual worlds. It's called The Narrative-Nudge Model for Massively-Multiplayer Games.

There's a third essay that's a sort of missing link between the two, which is an almost-completed, fairly lengthy review essay on the (infinitely many) problems of Star Wars: Galaxies, and how studying SWG made me change my mind completely about the conclusions in the paper I presented at Bristol. That will be appearing (I hope) in an online journal in the not-too-distant future.


July 27, 2004

As I Would Not Be A Slave, So I Would Not Be A Master

I have rarely paid much attention to the party conventions, but this year is different in every respect. I’ve been finding the gosh-wow stupidity of the television journalists about the presence of bloggers unintentionally hilarious—listening to Jeff Greenfield on CNN explain the exotic idea of a “link” as if he were trying to explain superstring theory, followed by some reporter minion of his practically wetting himself over the intricacies of some strange new-fangled thing called “the Web”, was especially rich.

The more interesting thing to me was something that came out in Gore’s mercifully brief speech and reverberated occasionally throughout the night, what I saw of it. Even before 9/11, one of the things that really bothered me about Bush and his administration was their sick arrogance, their lack of respect for the thinness of their electoral margin and what that should have told them about their mandate or lack thereof. It bothered me before I even knew that it did, or why it did. It bothered me early and angers me now because that arrogance has dragged American society to a very seriously dangerous juncture in its history.

I’m not talking about my usual opinion on Iraq and the response to 9/11. If you read this blog, you’ve heard that all before. That’s reason enough to vote against Bush.

However, there’s something deeper and wilder here, a fire that will more than burn the hands of kids playing with matches—and there’s been a lot of playing with matches since November 2000.

The New York Times has lately been assuring us that ordinary Americans are not bitterly divided on partisan grounds, and in one sense, I believe it. Yes, I know that there are a great many issues on which there exists some degree of consensus, and probably many issues beyond that where there might be disagreement between Americans, but of a mild and unexciting sort. In another sense, I think the Times’ polls are full of crap. Among the Americans who actually vote, who are attuned to political issues, there’s a high-strung sense of tension and anxiety that I’ve never experienced in my lifetime. Maybe 1968 would compare: I was more concerned with my tricycle at that point, so I can’t say in a meaningfully experiential way.

I used to say, around October 2000 or so, “Ok, so what if Bush wins? It won’t be that bad. He’ll do some things I don’t like, but he’ll be fairly constrained both by the size of his victory [which we could all see would be small if it came to pass] and by a prudential need to appease the political center.” I was seeing Bush as his father’s son, and his presidency as the mirror image of Bush Senior’s presidency.

This was staggeringly wrong. The second Bush presidency has been unprecedented in its ideological extremism and arrogance. I think that reflects very badly on Bush and his associates. If we want to talk about Bush’s lies, let’s start with his promise to govern with all Americans in mind, as a uniter and not a divider. That’s his biggest lie of all, one that can’t be qualified as an accidental error based on faulty intelligence or a modest distortion. There’s no way to argue that Bush has governed with the intent to unite, to overcome partisan division. Al Gore called Bush on this lie last night, and rightfully so.

This is about more than Bush. One of the reasons I chide people on the left for not seeking dialogue and consensus, one of the reasons I am constantly looking for the presence of reason and the possibility of connection across the political spectrum, is that if we get ourselves into a situation where 51% of the voting population or a narrow majority of electoral votes is imposing a total and coordinated social and political agenda on the almost-as-large minority who has a radically different and equally coordinated social and political vision, we’re staring at the threshold of a very scary future, regardless of whom the 51% is or what they stand for.

In this respect, we have to see past George Bush and his poor leadership for a moment, and see the people who strongly stand behind him. It is they who really matter, their choice which will shape the next four years. It is to them that I make my most desperate, plaintive appeals, my eleventh-hour plea not to pull the trigger. To choose Bush is to choose to impose the starkest, most extreme formulation of the agenda that Bush has come to exemplify on a population of Americans to whom that agenda is repellent. To choose Bush is to choose Tocqueville’s tyranny of the majority (or even, judging from the popular vote in 2000, tyranny of the almost-majority). To choose Bush now—not in 2000, when he plausibly could have been many things—is to aspire to tyranny, to ruling your neighbors and countrymen. That some on the left have had or even achieved similar aspirations from time to time doesn’t change things: it’s wrong whenever it is the driving vision of political engagement, for whoever holds it.

I know that there are socially and culturally conservative Americans, many of them Christians, who already feel that they live in a Babylonian captivity, that they are already at the mercy of a secular culture. But the vigor of evangelical Christian culture in the past decade—the profusion of Christian books, movies, television shows, and so on—demonstrates to me that a secular, consumerist America is one where even nonsecular or dissenting Americans are free to make their own way, form their own communities, choose their own culture. A culturally conservative crusade led from the White House is not the same thing, not a mere flipping of the coin, a karmic reversal. An evangelical Christian can refuse to consume pornography, but if pornography is outlawed, then anyone who wishes to view it is a criminal. Feeling the need to avert one’s eyes and being subject to criminal penalty are very different things. It’s the difference between freedom and unfreedom, between the Bill of Rights and a series of wrongs.

If Kerry is elected and imposes upon the Americans who oppose him an extremist political vision, root and branch, comparable to what Bush has done (I don’t see how he could, given that Congress is likely to be Republican in any event), then we’ll know that there is no possible consensus for us all, that a kind of final struggle has been joined in which every American will end as either tyrant or slave. I choose to believe and hope and trust that we’re not there yet. I choose to believe that we can have leaders who will not push us to that brink, and that we can have voters who also forbear to do so. If Bush is chosen, it may signal that there's no way out. I yet believe we can find the place where ordinary American decencies live, where most of us can go along to get along, where “don’t tread on me” and the City on the Hill belong in the same neighborhood, are part of the same love of country, are equally part of the American Dream.

[permalink]


July 26, 2004

Brief update on the garden situation. The tomato thieves are deer. I caught them at it this morning, one walking around with a big tomato right between its teeth, and to my surprise, they're getting in not by jumping over, but by squeezing through a very thin gap in the fence at one point. So I've tied that gap off with wire and we'll see if that makes a difference.


July 26, 2004

The Limits to Generalism

I spent three days at the 3rd International Conference on Autonomous Agents and Multi-Agent Systems in New York last week.

I was a little disappointed, in some ways. I had hoped the meeting would be a bit more interdisciplinary, despite its strong connections to the Association for Computing Machinery. It was pretty much computer scientists all the way down. But that’s where multi-agent and autonomous agent systems live intellectually. One should not be surprised that the sun is in the sky during the daytime.

The consequence for me was that I understood very little of what I saw and heard. Every once in a while, the light broke through the clouds, generally in papers that were very explicitly devoted to using multi-agent systems for social simulation, those more concerned with the conceptual design and application of their simulations and less concerned with the formalisms, protocols or algorithms underlying them. I was able to grasp one presentation on the simulation of social insects and pheromones (due to the intensely well-travelled nature of the example) and even to see that the presentation offered relatively little that was new on that topic. I really liked one presentation that proposed a formalism for generating irrational agents, or at least for nesting normal Bayesian game-theoretic rationalities one step away from the functioning of a multi-agent system. It seemed very innovative and intelligible, particularly because I was struck by how utterly reliant the whole field has become on rational optimizing designs for agents. I was also struck by the extent to which the demand for application to commercial needs drove the vast majority of presentations.
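For readers who have never watched one of these simulations run, here is a minimal sketch in Python of that well-travelled social-insect example—purely my own toy illustration, not anything presented at the conference; the grid size, evaporation rate and the little Ant class are all arbitrary choices of mine. Agents wander a grid, start laying pheromone once they stumble on food, and bias their movement toward stronger pheromone; trails emerge without any single agent knowing the plan.

import random

GRID = 20                    # illustrative grid size
NEST = (GRID // 2, GRID // 2)
FOOD = {(15, 15), (3, 17)}   # arbitrary food locations
EVAPORATION = 0.95           # pheromone decays 5% per tick

pheromone = [[0.0] * GRID for _ in range(GRID)]

class Ant:
    def __init__(self):
        self.x, self.y = NEST
        self.carrying = False  # becomes True once the ant finds food

    def step(self):
        # Weight the four neighboring cells by their pheromone level,
        # plus a small constant so movement never becomes deterministic.
        neighbors = [((self.x + dx) % GRID, (self.y + dy) % GRID)
                     for dx, dy in ((0, 1), (0, -1), (1, 0), (-1, 0))]
        weights = [pheromone[nx][ny] + 0.1 for nx, ny in neighbors]
        self.x, self.y = random.choices(neighbors, weights)[0]
        if (self.x, self.y) in FOOD:
            self.carrying = True
        if self.carrying:
            # Laying pheromone is the ant's only "communication".
            pheromone[self.x][self.y] += 1.0
        if (self.x, self.y) == NEST:
            self.carrying = False

ants = [Ant() for _ in range(50)]
for _ in range(500):
    for ant in ants:
        ant.step()
    for row in pheromone:      # evaporation keeps old trails
        for i in range(GRID):  # from dominating forever
            row[i] *= EVAPORATION

Nothing in this toy optimizes anything, and yet print the pheromone grid at the end and you can see deposits concentrating around the food and the paths back to the nest. That bottom-up, emergent quality is what made the social-simulation papers legible to an outsider like me in a way that the protocol and formalism papers were not.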

At most other points, however, not only did I not understand anything, I barely understood what I’d have to understand in order to understand a presentation.

I repeatedly extol the virtues of generalism, but it cannot do everything. The sinking feeling I had again and again during the conference came from knowing that even to get to the point where I grasped the substantive difference between the algorithms or formalisms proposed by many of the researchers there—where I could meaningfully evaluate which were innovative and important, and which were less attractive—would take me years of basic study: in mathematics, in computer science, in economics, areas where I’ve never been particularly gifted or competent at any point in my life. To get from understanding to actually doing or teaching would be years more from there, if ever.

The reverse movement often seems easier, from the sciences to the social sciences or humanities, and in truth, it is. There’s an important asymmetry that I think is a big part of the social purpose of the humanities, that intellectual work in that domain returns, or should return, broadly comprehensible and communicative insights rather than highly technical ones, and thus, that the barriers to entry are lower.

The ease of that move is deceptive, however. It’s the kind of thing that leads someone like Jared Diamond or other sociobiologically inclined thinkers, especially evolutionary psychologists, to what I call “ethnographic tourism”. Operating out of a framework that requires the assumption of universalisms in order to make cogent hypotheses about human history and behavior, scholars coming along that path often quickly scoop up the studies and accounts which support the foundational assertion of a universal and ignore those which do not, or casually dismiss them as biased or “culturalist”, regardless of the methodology those studies employ. That’s what leads to their peculiar preference for the work of Napoleon Chagnon on the Yanomamo, for example. Bogus or wild-eyed controversies about immunizations and manipulation aside, there’s reason, from an utterly mainstream, meticulous, scrupulous and disinterested perspective, to view some of his methodologies as debatable and to take seriously the work of other scholars who have made very different findings. There’s a selectivity principle at work in ethnographic tourism that would not be tolerated in the scientists’ own fields: cherry-picking the anthropological scholarship they like and ignoring contradictory work.

That is not atypical of what can happen when scientists pressing towards generalism think they understand disciplines outside the natural sciences. Similarly, it’s become easy to mock and ignore scholarship in the humanities for being too theoretical, fashionable, incoherent, and so on, which it very often is. Alan Sokal’s hoax hit a real target, but if you want to think and write about problems like the nature of existence and knowledge, or about why and what a cultural work means to its audiences, sometimes you really are going to have to go into deep waters that require a complex conceptual framework. Some scientists tend to forget that on a series of crucial issues, skeptics in the humanities were closer to the truth for decades than scientists, most notably in the early debate between philosophers of mind, neuroscientists and computer scientists working on artificial intelligence about how easy it would be to create AI.

That debate is an important reminder, however, of what a kind of disciplined drift towards generalism can bring. The intensely fertile contemporary practice of cognitive science draws from all those areas and more besides. It almost seems to me that a good generalist ought to combine an overall curiosity and fluency in the generality of knowledge with a structured search of the possibility space of the intellectual neighborhoods which are just far enough away from their specialization to return novel possibilities and angles of attack, but just close enough to be potentially accessible with a reasonable amount of scholarly labor. To think about generalism in this way is to realize that different generalists are not going to end up in the same place. Their mutual engagements or conversations will have to happen in places of accidental overlap, since the concentric circles of one’s generalist competency differ according to the initial specialization they originate from.

Proximity to your own discipline and specialization can also be deceptive. I’m planning another version of my Primary Text Workshop course for academic year 05-06. I’d like it to involve the students in doing the preparatory work that would be required for making a virtual reality environment based on a historical community—the speech trees, the knowledge of clothing and other material culture, the architectural and geographical knowledge, the understanding of everyday life rhythms, and so on. I’d prefer it be about a city whose history I know very well—Johannesburg or Cape Town spring to mind—but access to primary materials will obviously be limited. On the other hand, late colonial Philadelphia seems an apt choice, but I find myself simply overwhelmed by the literature I’d have to read in between now and then in order to achieve a basic comfort level. It’s not enough to have read Alan Taylor, Timothy Breen, Gordon Wood and so on about the colonial and revolutionary era—I’d need to go far deeper historiographically than that, and at that point, you begin to wonder whether it isn’t just smarter to hand the class off to a colleague who already specializes in that era.

I’ve been thinking about how to calculate the wider bounds of generalism beyond the discipline. In my case, for example, some of the ideas associated with complex systems, emergence, autonomous agents and multi-agent systems and so on are close enough conceptually that I can make use of them and contribute insights to colleagues working in those areas, but they’re just far enough away that I should not ever expect to do original work directly in computing applications myself. Sociobiology might be close enough that I could reasonably expect to offer some critical insights into its methods, but not close enough that I could expect to do my own original research into population genetics. Theoretical physics would be far enough away in every respect that I might not ever reasonably expect to understand it, let alone do it, given that much of it cannot even be translated from its mathematical conception into broadly communicative prose. At that point, you have to have enough faith in the entire system of knowledge production to just say, “I trust you to do what you do, and to do it how you do it”—and if it becomes imperative to do more, as it does in the case of tenure review, you just have to outsource the job of deciding whether another scholar’s work is original or skilled to someone else, to have the humility to know where the final outer bound of a generalist intellect lies.

[permalink]


July 19, 2004

What Gus Here is Sayin’

Well, criticizing Michael Moore definitely seems to get a rise out of some people, judging from this Crooked Timber thread in which John Holbo springboards from some negative comments I made about Fahrenheit 9/11.

There are criticisms I feel free to disregard—the cry that attacking Moore is breaking ranks or failing to play for the home team. The political rap across the knuckles, the call for left solidarity, is one of the surest signs of intellectual weakness that I know of, and a major reason I no longer have any interest in whether I’m considered to be on “the left” or not. Equally disregardable is the reflexive, gut assumption that anyone who fails to genuflect to Moore must be a defender of the war on Iraq. Hardly, as anyone who reads this weblog knows very well.

A number of commentators protest what they see in my original comments or in John’s argument as an equivalence between Moore and the Bush Administration, or between Moore and the most grotesque liars and rabid animals of the polemical right like Ann Coulter or Michael Savage. I agree there’s an asymmetry. In the first instance, because the people who lead the country and the people who comment on that leadership are simply very different in the consequences of their views. There’s no question that the intellectual dishonesty and closed-mindedness of the Bush Administration’s key war planners is vastly worse, and of vastly more concern, than anything Michael Moore has to offer. And I don’t see anything in Fahrenheit, for all that I dislike it, that compares to someone like Ann Coulter wishing that Timothy McVeigh had blown up the New York Times building. There are differences of proportion in either comparison, and Moore is hardly job one or even job one hundred on a very long and filthy list.

But what some CT commentators seem to me to be saying is this: politics is a dirty, hard business, and we have to play dirty to win. They’re saying: don’t come in here with your effete intellectualism, your Marquess-of-Queensberry rules, your naïve pomposity. Moore works, he’s down with the people, he’s telling it like the American people need to hear it.

This is precisely what I took up in my Cliopatria essay: is Moore effective, and effective at what? So I don’t disagree with the CT commentators who say that you have to play politics to win, and that if Moore is effective, that’s a countervailing virtue that outweighs any pedantry one might unload at him. What I think the CT commentators are actually revealing, however, is why the American left is on a persistent losing streak in the tough game of political struggle (not to mention a nasty little streak of intellectualized anti-intellectualism that is another classic kind of left-wing panic button).

They assume that fairness and intellectual discipline are somehow antithetical to the crafting of effective political argument and rhetoric, and they assume rather than demonstrate that Fahrenheit is positively influencing the constituencies whose mobilization against the Iraq War and the Bush Administration is useful or needed at this point.

Fairness and open-mindedness are a pretty crucial part of my own political and intellectual voice. That’s first because I assume they are a positive good, an ethical position, and that to adopt an ethical mode of acting in the world is itself a political strategy. It is a commitment to the dispensation that one hopes to build. I assume, very deeply and I hope not unreasonably, that enormous social good would come to pass if the American public sphere were everywhere authentically marked by fairness, open-mindedness, and mutually agreed-upon standards for rational argument and the use of meaningful evidence.

This, the critics would be right to say, is an insufficient reason to criticize anyone failing to reach that standard. By itself, it is a luxurious high-mindedness. However, fairness also works as politics in the operational sense. An operatic, performative commitment to decency, an over-the-top acknowledgment of the legitimacy of potentially legitimate arguments, an attempt to reduce cheap shots, a showy restraint in saying only that which can be said based on strong evidence: these all function as powerful tools in political struggle within the American public sphere.

Who brought Joe McCarthy down in the end? Not somebody playing “dirty”, down in the same gutter with McCarthy, but someone who waited for their moment and caught McCarthy in a decency trap, who revealed the man’s fundamental unfairness and viciousness in part by being scrupulously decent themselves. How did Archibald Cox defeat Richard Nixon? By walking the straight and narrow. Being decent and fair and meticulous isn’t intellectual wankery: it’s hardball.

It’s especially important in the context of the metapolitics of weblogs as a subdomain of the public sphere. Crooked Timber’s contributors regularly take other webloggers to task for the inconsistency of present arguments with past positions, or for their contradictory use of evidentiary standards. That kind of critique only has political influence—that is, the capacity to alter the way that others think and act—inasmuch as it is a performative, demonstrated constraint on those who offer it. This is what I understand John Holbo to be talking about most centrally in his own comments. If you hold someone else accountable to standards that you do not maintain when you're talking in the public sphere about someone on your "home team", you've shot your wad, you've blown your credibility, you've lost political capital.

That’s the league that Michael Moore is in: the public sphere, weblog and otherwise. Within that league, there are or ought to be rules. Playing by the rules earns you political capital—and if you have political capital, and spend it wisely, you’re effective in influencing other players in the public sphere, even sometimes those who may pretend not to care about those rules. If you have none, you never get the chance.

All this might be, as some CT commentators suggest, purely academic—or at least confined to a sparsely inhabited region of the public sphere where the air is thin—if Fahrenheit were a boffo smash with those American audiences who have yet to commit to the struggle against the Bush Administration. Some CT commentators assume this rather than demonstrate it, presumably on the basis of the movie’s impressive ticket sales to date. But by that same standard, one would have to assume that The Passion of the Christ converted huge numbers of previously secular Americans to Christianity. Ticket sales, even in the land of Mammon, can tell a thousand different sociological stories, and it takes more than that to know what a particular film, book or weblog is doing out there in the world. There’s nothing harder than studying an audience's mindset. But at the least, we already know enough about where Fahrenheit is doing well to suspect that it is largely preaching to the converted.

My own intuition—just as thin evidentiarily as that provided by the usual working-class-heroes cheerleader squad—is that Moore’s particular confabulation of conspiracy theory, left-wing writ, smarminess, and powerfully affecting and moving scenes of suppressed truths is only sporadically persuasive for those American constituencies which are potentially moveable in their views on the war or on George Bush, and may at times be actively counterproductive. Much of what irritates me about Fahrenheit is that it is often self-indulgent, unnecessary, superfluous, appealing mostly to the very intellectuals who then turn around and tell me that appealing to intellectuals is effete and ineffective. Though it might be aesthetically less satisfying and entertaining, something much more conventionally melodramatic or Ken-Burns-respectable might be more powerful by far, crucially because of a performance of "fairness". The curious thing that moves through at least some defenses of Fahrenheit is an assumption that Ma and Pa Kettle aren't gonna come out and see a documentary unless it has plenty of bread-and-circus pleasures, lots of yuks, unless it goes down smooth and easy. To me, that defense isn't just vaguely condescending, it's also wrong. I think you could sell $100 million in tickets for a de-Mooreified Fahrenheit that had all of the heat, all the anger, all the revelation, but without all of the bullshit.

Some reply further at this point in the argument that the effectiveness of Fahrenheit is not measured in whether it changes any hearts and minds, but in mobilizing and energizing the left for the struggle ahead. First of all, come on: how much angrier and more mobilized can people on the American left possibly get without having an aneurysm? YEAH! YEAH! I’M SO ANGRY! GRRRR! GONNA TAKE BACK MY COUNTRY!! GRRR!!

More to the point, I can’t think of anything less effective politically. Guess what happens to a boxer who gets wildly pissed off and starts taking huge swings at his opponent? He ends up tired and leaves himself wide open for jab after jab. Maybe he gets Buster-Douglas lucky once in a great while, but most of the time he ends up on the canvas.

[permalink]


July 15, 2004

The Swarthmore Tomatofield War

Along with travelling, I’ve been gardening. My faculty rental comes with access to a very nice, large garden plot that is some ways away from the house, on the verge of a large wooded area that descends to Crum Creek.

My first year of having a vegetable garden, in 2002, was the best. I had a fabulous yield of tomatoes, peppers, zucchini (way too much zucchini), pumpkins, tomatillos, sunflowers and herbs. The only thing that got absolutely annihilated was my corn, which some animal stripped bare just as the ears appeared.

My second year I gave up on corn and added string beans. These grew fabulously well and were hugely tasty. But this time my sunflowers were absolutely destroyed before they could even germinate—something systematically dug up all the seeds for two plantings (this happened again this year).

The 2003 tomatoes were also subject to heavy assault by unknown vermin. The zucchini died of some kind of rot that covered the leaves with a grey mold and then turned the stalks to mush. The herbs did really well, though, except for rosemary, which just doesn’t seem to grow out here.

This year, I’m doing ok. I planted 18 tomato plants because the whole point of this garden, really, is to get me the tomatoes I can’t buy anywhere, and to get me lots of them. But something has assaulted them again—they’re disappearing just as the first streak of red appears. So I’ve taken to picking them when they’re yellow and letting them ripen inside. The string beans sucked this time, but I think that’s mostly the seeds I planted—not as good as the heirlooms I planted the first year. The zucchini is rotting again. The pumpkins died quickly for some reason. The herbs, as always, are terrific—self-sustainingly great now. The tomatillos are growing, though, like last year, they have been slow to flower. Carrots, to my surprise, are flourishing—I tried the previous two summers and couldn’t get any to germinate. Peppers are doing well (poblanos, jalapenos, serranos, thai bird peppers). Eggplants are limping along—I’ve never had much success with those, either.

It’s the tomatoes that are on my mind all the time now. They are what I want and crave. I managed to get one off and it ripened and I just put a bit of salt on it and devoured it. Nothing like it in the stores, not even the fancy-schmancy ones.

Here’s what I do to protect the garden: a 5-foot wire fence that I set into a one-foot-deep trench to prevent digging under. Bobcat and coyote urine in the corners of the garden and soaked into cotton tags hanging from the tomato cages. An egg white-capsaicin-vinegar repellent mix sprayed on the tomatoes themselves. And they’re still disappearing. This year, they’re disappearing outright—I’m not finding the half-eaten corpses I’ve found in the previous two years. And they’re disappearing well before they ripen.

So I’m working on hypotheses about what’s doing it.

1) Chipmunks and squirrels. I’ve seen both of them raiding the tomatoes in the past; chipmunks were clearly the guilty parties last year. But they usually leave half-chewed tomato corpses and they usually only want semi-ripe ones.

2) Woodchuck. I don’t think the local ones can get inside my garden as it stands, and I’ve never caught them going for tomatoes anyway. I suspect them instead for other assaults in past years, including the Great Corn Massacre of summer 2002.

3) Rabbit. The young ones can unquestionably get in through the fence—I’ve spotted babies and juveniles in the garden before. But why aren’t they eating the carrots, which are almost ready to be picked? Maybe they don’t know what they are. And how are they reaching tomatoes that are two feet off the ground?

4) Deer. I thought I’d really made it so they couldn’t jump over, but there’s plenty of tales of deer jumping 5 feet, so maybe. No tracks, though. Do they eat tomatoes? Not sure, but I’ve seen other things that make me think they’ve been in the garden (plants that look trampled).

5) Raccoons. We got ‘em, they’re clever, and they could easily carry away tomatoes if they can get in. But damned if I know how they might climb the fence—I wouldn’t think it would support their weight, and there’s nothing dug under it anywhere. They might be pushing in past my improvised “gate” but I doubt it—it always is “tight” whenever I come out to the garden myself, with no signs of disturbance.

6) People. I’m afraid this is my current working hypothesis. The garden is a long ways from the house and people can come and go in it without being observed from any house at all. No footprints, though, even with the recent rain. I have had thefts from the garden before, though—several ripe pumpkins disappeared in September 2002 just as they were ready to pick, for example.

I don’t think there is much left I can do to keep varmints of all kinds out, though. Maybe lock the gate to test the “people” hypothesis, though that seems extreme. I tried stringing chicken wire around the top of the fence to discourage deer, but it ended up looking like a vegetable gulag rather than a garden. I used to put chicken wire around the tomato area, but the chipmunks just laughed at that. I kind of wish I had an old hound, a rocking chair and a shotgun—I’d sit out there for a few nights and see what’s what. Except that it’s illegal and I don’t think the college would be too wild about me blasting away at various critters on their property, not to mention my neighbors. I suppose I could put traps in the garden and see what gets caught, but that’s like catching a few raindrops and thinking you’re going to get a sunny day.


I keep thinking about that bit in Robert Lawson’s Rabbit Hill where the kindly (and evidently very rich) Folks put out a crapload of vegetables and such every single night in order to keep all the animals out of their own garden. “There is enough for all”, they said. I’m guessing that this is not the case—that I could plant 50 tomato plants and still watch them get stripped by the Mystery Vermin. So it’s war—if only I could figure out what I’m fighting and how to fight it.

[permalink]


July 15, 2004

How I Spent My Summer Vacation (So Far)

Spent a good while visiting family in Southern California in June and July, which was a lot of fun.

I had a chance to visit the gallery my brother runs in Los Angeles’ Chinatown. It’s called Oulous Repair Shop, and I really like what he’s done so far. The web page for the gallery doesn’t actually list the address, which is 945 Chung King Road.

Speaking of which, he’s trying to put together material for a show sometime this fall on fringe technological designs. He’s been writing a few scientists to see if they receive and keep letters or inquiries from fringe inventors or technologists, both to try and get names of people to contact and to see if he can gather together any sketches or visual material that were included in such inquiries. If you’ve got any ideas or sources of possible material, contact him at xing@oulous.com .

I also spent a bit of time at my mother’s store, Mixt, which is in the Riviera Village shopping district in Redondo Beach, 1722 South Catalina. It’s a great place—she’s got a nice mix of little doo-dads and very interesting high-end craftwork.


Los Angeles as a whole still puzzles me. I like it (and California as a whole) a lot better than I did when I was a surly teenager. In fact, some of Southern California’s best features are tailor-made for the middle-aged: good food, good booze, great weather, easy living. It’s a tough place to live if you don’t have money—the housing market there now staggers me, after two decades on the East Coast.

One of the interesting things about LA to me now is that the ceaseless reworking of its built landscape seems to have slowed somewhat. I remember a period from about 1980 to 1995 or so when I would visit and find that the retail and residential landscape had shifted once again within a very short time frame. We’d go to places where there had never been houses and lo! Giant developments sprawling as far as the eye could see, people moving in who were facing daily commutes of two hours in each direction. You’d go back to a mall or neighborhood with stores you liked and they were all gone. There are areas which are still very much in flux, but since about 1995 much of the core of the LA Basin seems to me to have been far more static. Maybe I’m wrong—it’s hard to know when you only visit twice a year or so. My brothers have often pointed out that there's much more visible, physical history to California's built landscape than most people, including locals, think.

It also seems to me that high-end food retail nationwide has caught up somewhat with California—it’s much easier now around us to find very good produce and meats, and quintessentially California businesses like Trader Joe’s are now national (though I think Trader Joe’s is going to be very hard-pressed to maintain anything close to its traditional quality/price ratio at its present rate of expansion).

But even with overdevelopment, pollution, crowding, traffic and the like, I’m pretty hard-pressed to think of anywhere on the East Coast that has the attractive mix of weather and landscape that a number of California cities do, including Los Angeles. I just can't work up enthusiasm for East Coast beaches or East Coast mountains in comparison. My Dad, who was born in California, always used to say whenever the Rose Bowl was on, showing people playing football on a sunny day to the rest of a miserably cold nation, “Well, here comes another 50,000 assholes”.

[permalink]


July 14, 2004

21st Century College: An Outline

I've been messing around with some ideas about a fantasy college, about what kind of institution of higher education I might build given $500 million and total autocratic power. This is what I've come up with. The sketch I lay out deals with three interwoven issues: first, the overspecialization of the academy; second, the insularity of academic life; and third, the increasing over-management of academic communities and the heedless expansion of the "full-service" approach to higher education.

The result will doubtless not be particularly palatable to many, perhaps most—hell, in a few ways I'm not even sure that I would want to teach there myself. But on the other hand, I think it is sometimes useful to imagine systematic alternatives in order to understand how—or whether—what we already have might need to be changed.


July 14, 2004

On Third Partyism

A modest rebooting of this blog now that I’m back, on the recently discussed subject of whether it is a kind of infantilism to support political reforms in the U.S. to allow third parties to compete fairly at the polls.

The simple answer: it depends, but yes, often third-partyism is an infantilism, one I have myself been guilty of in the past. The basic problem with the most devout third-partyists is that they either have woefully unrealistic models of the likely prospects of their own preferred third party or lack any sense of a comprehensive alternative idea of political competition, arguing for third-party competition as a purely ad hoc response to some particular dissatisfaction with the Republicans and the Democrats.

Greens or Libertarians, for example, would probably poll only marginally better after a breakup of the two-party duopoly than they do now. They might have regional strongholds that they’re denied now, and be able to send a few representatives to Congress or to state legislatures, but in Presidential races or even state-wide ones, I don’t see them being competitive for the foreseeable future.

This is even more pressingly true for the progressive wing of the Democratic Party. Third-partyism here, especially in its Naderite form, really is a kind of head-in-the-sand wish-fulfillment scenario, a belief that progressive or left politics, once freed of its captivity to the Democrats, could be a powerful electoral force in general.

I would agree that a Democratic candidate who ran with some conviction and a strong sprinkling of populism might well be a roaring success with independents for the same reason that John McCain is, but that is a question of character: what is liked about such a politician is their honesty, their authenticity, not their ideology.

Stripped of a compensatory attraction due to the character of a candidate, a strongly left politics in most areas of the country would be a major electoral failure, and a simple third party with a progressive character that was permitted to compete fairly within the present system would go nowhere, both on its own terms and in terms of its influence on the Democratic Party, which would probably move even further to the right in order to compete for the larger pool of independent voters rather than the small pool of hard-core progressive votes. As a voting base, progressives simply don’t compare in fervor, geographic rootedness or numbers with the religious right, and can’t hope to accomplish what the religious right has in terms of seizing control over the Republican Party.

In conventional terms, the only third party that might benefit from a relaxation of the standard barriers to competition would be something like the Reform Party, a kind of independent-bloc, soft-libertarian party that could give a home to backers of McCain, Schwarzenegger and other Republicans who don’t fit with the Bible Belt social conservatives who have seized control of the Republican Party, while also drawing off some suburban Democrats and possibly working-class “Reagan Democrats” as well. If that’s what’s on offer, no, that’s not an infantilism; it’s a reasonable if unlikely third-party ambition—a parallel to the 19th Century formation of the Republican Party, a response to new social constituencies who found themselves effectively without any political party corresponding to their interests and outlook. The same may be true now for a variety of Americans, or it may not, but the most unrepresented American constituencies in this sense are not urban voters or rural ones, but the “swing” constituencies who are perpetually wooed by both parties but are the bedrock voting base of neither. This is the only conventional “third-party” movement that I can see making any real political headway at this moment in American history.

Such a third party could also only succeed by first pursuing electoral success at the state level and in Congressional races, and by making reform of winner-take-all politics its first and primary agenda item. The more comprehensive and specific its alternative political platform was, the less headway it would make, as the most appealing parts of its agenda would be cherry-picked by the other two parties and electoral reform left quietly by the wayside. In a sense, this party would have to enter the political system by agreeing to back the agenda of either of the other parties in exchange for systematic reform of the current electoral system to create a level playing field, in a very conscious process of horse trading. That mission accomplished, the new party could then begin to flesh out an independent political platform of some kind. This is quite evidently not the strategy being pursued by Ralph Nader, now or in 2000, nor is it the strategy that any of the third-party Presidential candidates of previous years have pursued.

Third-partyism also makes some degree of sense if it’s articulated as a comprehensive program of political change designed to transition American politics to a more parliamentary mode, with many parties that have narrow ideological or political agendas and serve highly particular constituencies. This is a comprehensive change, rather than the normal argument for one or two “third parties” achieved through minor tweaking of winner-take-all voting or reform of ballot-qualification requirements. This is not what most third-partyists in the United States seem to be arguing for, possibly because most people recognize that this particular reform is far from self-evidently desirable.

I used to think that a greater degree of ideological “sharpness” in our political system would be a good thing, but that seems far less desirable to me now: I don’t want to have to choose between two exaggeratedly single-view philosophies. A parliamentary politics with many, many parties seems an even more unsatisfactory halfway house between republicanism and direct democracy than our present system. I’d rather vote for a representative who strikes me as rational and fair-minded, even one who takes positions different from my own, than have to choose from a diversely sectarian menu and so divide my own political beliefs into fragments.

[permalink]


June 17, 2004

The Great Escape

Unlike Chun the Unavoidable (assuming he's not joking), I haven't given up blogging. I've just been away helping my mother-in-law to move out of her home and also trying to focus exclusively on several articles I owe to various publications. I'm going to be travelling some soon so blogging will have to wait a bit longer, but I have a number of substantial entries and materials being worked up that should be appearing here in mid-July, including long think-pieces on MMOGs, a blueprint for a new kind of college, and some more "Readings and Rereadings".


May 20, 2004

Busy couple of weeks here with grading, Honors exams, and some family matters, so blogging has been and will be lighter than usual for a bit.


May 20, 2004

Preparing a Place in the Museum of Failure

Norman Geras argues strongly that as a supporter of the war in Iraq, he bears no responsibility at all for Abu Ghraib.

I agree that those who supported the war with a rigorously reasoned case do not have to feel personally responsible for Abu Ghraib. I do think it is appropriate to hold war supporters directly responsible if (and only if) they fail to regard systemic abuse there and at other American military prisons as a grave concern by the very same criteria by which we held Hussein's misrule a concern.

Abu Ghraib does have serious consequences for at least some of the arguments in favor of the war, and I don't think one can dodge those consequences. It's possible but highly unlikely that this was merely seven bad apples doing bad things—and even if that were so, this is where the basic point about oversight comes in. A failure to have effective oversight is a guarantee of "bad apples" having impunity to do what they do. The furtive, paranoid unilateralism of the current Administration, its stonewalling of entities like the Red Cross, its apparent indifference to due diligence within its own institutional frameworks, made Abu Ghraib inevitable.

Beyond that, however, the evidence is considerable that this abuse was not merely an accident of mismanagement, but a deliberate policy, deeply integrated into the Administration’s entire approach to the ‘war on terror’. Supporters of the war do need to regard that as a serious issue for their case, because the war cannot be supported as an abstraction. It can only be supported as a concretized, real-world project, and if it is done badly in the real world, it eventually will (and I think already has) do as much damage as the things it set out to fight. If you support the war as part of a battle against illiberalism, then illiberal conduct by your own "side" in the war has to mean something to you, have inescapable implications for your struggle. You can't just shrug off the creation of a gulag in Guantanamo where people have no rights, or evidence of a consistent policy of humiliation and abuse.

To understand this as a conflict that is resolvable strictly through military means or through the imposition of formalist structures is, to my mind, to absolutely and completely misunderstand the nature of the larger conflict against terrorism. To extend the military trope, it’s the equivalent of fighting the wrong battle with the wrong weapons in the wrong place—and in military history, that’s how you lose a war even when you may have superior resources and force at your disposal.

Those who do misunderstand it this way almost all share two things. One is a belief in the universal and inescapable obligations of modern liberalism. It’s no accident that some Marxists, some liberals and many neoconservatives have found the war attractive, because they all derive tremendous intellectual strength from universalist frameworks. This I find laudable and important, and I recognize many supporters of the war who take this approach as intellectual cousins. (Those who do not share this commonality, like the parochialists and chauvinists on the American right who have endorsed brutality at Abu Ghraib, I recognize no connection with.)

But these supporters on both left and right share another attribute which I do not share: a belief that liberalism comes from above, that it can be imposed by power, that it emanates from the structure of the state and is guaranteed by securing a working monopoly on the means of violence. Equally, these thinkers share a belief that illiberalism and oppression emanate from the top, have their source in malformed states and ruling elites who have illegitimately seized control of the state in spite of the natural and rational desire of most people for liberal democratic norms. In essence, many of them--some from the left, some from the right--are statists. This is what the shorthand of "Wilsonian" is all about: a grab-bag aggregate that usefully links ideologically diverse arguments through their common understanding of the nature of political change and the sources of illiberalism in the world.

Fundamentally, this is a clash between different models of change-by-design in the world, of how one does praxis. Even when I was more strongly influenced by Marxism, I was always drawn to the Gramscian vision of politics, to the notion of a “war of position”, because that seemed to me much closer to how meaningful, productive, generative change in the world actually comes about: in the messiness of everyday life, in the small and incremental transformation of consciousness. I do not believe, and have never believed, in revolutionary change, in the proposition that a sudden, sharp disjuncture between the flawed present and the shining future can be produced by a seismic transformation of social structure directed by the state, by political vanguards or by other major social institutions that possess strong governmentality.

Real revolutions happen in history, and they are genuinely disjunctive, deeply and abruptly transformative. The ones that are productive largely happen by accident. They happen because smaller social transformations have been building towards a point of criticality, towards a sudden phase change. They do not happen by design or intention. Real revolutions can be guaranteed by changes at the top, by the creation of laws and rights and constitutions, but they don't come from those things.

False revolutions happen in history, and they are much less disjunctive than their supporters pretend. These are the classic political revolutions, the ones that try to force history into a new mold by totalizing design, from above. They can do almost nothing generatively useful at the level of real social change: they can only destroy and terrorize. They cannot create. The one good exception we have in modernity is the American Revolution, and it is notable that its most fundamentally radical achievement was to specify constraints on its own transformative capacities. Its moderation was the essence of its radicalism, and the source of its long-term fecundity.

Power has a thermodynamic character: good things can happen when more energy is added to an existing system, but only if those bringing power to bear have modest ambitions and tremendous respect for serendipity and unintended consequences, for the organic evolution of events. The more ambitious the design, the more totalistic the ambitions, the more fatal and destructive the consequences are likely to be. A human world fully imbued with the humanistic values of the Enlightenment is a world we all should desire, and we should regard harshly the world where it falls short of that. But this is where we have to have faith in the desirability of those values, and play the game steadily towards victory.

It is the “velvet revolutions” of 1989 and the 1990s that we should cast our covetous eyes at. The fall of the Berlin Wall and the defeat of apartheid are the real triumphs of our age. No invasions or interventions have a share in those victories, but the resolute moral and political will of many states, civil institutions and individuals—backed where necessary by military power—can claim a great share of the credit. I don't deny that on occasion positive revolutionary-style change does come from above, but this is a rare circumstance, and all the historical stars have to be in alignment for it to happen. That was not the case with the war in Iraq.

The Iraq War’s structural failure is that it is closely allied to the false revolutionary project, to statism, to the belief that social practice usually can be highly responsive to and conforming to the will of strong power, if only that power articulates its will clearly. This is the failed conceit at the bottom of the well, and where Iraq differs from Afghanistan. Afghanistan I support because its primary logic was self-defense, and its secondary logic put forward a sensible, consistently reasoned proposition that failed states represent a clear and imminent danger to the security of liberal democratic nations. The national security logic of Iraq, in contrast, was weak before the war and has gotten dramatically weaker since.

Alongside this deep philosophical shortcoming, the failure at Abu Ghraib is indeed a sideshow. It is the deeper failure that the reasoned supporters of the war need to hold themselves accountable for. The Iraq War will take its place eventually as an exhibit in a museum alongside Cabrini-Green, state-run collective farming, Brasilia, the Great Leap Forward, Italian fascism, and other attempts to totalistically remake the substance of social practice from above.

[permalink]


May 13, 2004

Welcome to Paragon City

I’m supposed to write an assessment of Star Wars: Galaxies and I’ve been putting it off because I feel I need to go in and play the game again just to challenge my established prejudices. The conventional wisdom is that a massively-multiplayer online game needs a year to be judged. But I’m dreading it: I follow news about the game and it seems to me that there may just be things about it that can’t be fixed.

SWG left a bad taste in my mouth about MMOGs. All that expertise, all that prior experience, all that money, and a franchise that you’d think was a can’t-miss proposition, and the result was a worse-than-average experience in a genre of games that is already very unsatisfactory.

As a consequence, I have been looking at every other MMOG coming down the pike with equal presumptive hostility. In particular, I was sure that City of Heroes, an MMOG with a superhero theme, would be a disaster. When the committed cynics at Waterthread started saying nice things about the late beta, I began to wonder.

Now I’ve been playing it for a couple of weeks, mostly on the Protector server, with a martial artist character named "Faust", and I have to admit it: I was wrong.

City of Heroes still has the basic problems of all MMOGs, but as far as the genre goes, it is one of the best. It’s actually fun to play, and even more amazingly, fun to play as a “casual” player—I can drop in for 30 minutes and still find something pleasurable to do. Even the feature that I was certain would suck, which was building your character around “archetypes” that made more sense in terms of MMOG conventions than the comic book narratives the game borrows from, works pretty well without seriously violating the sense that one is a superhero in a universe of superheroes. Basically, it’s one of the few MMOGs that has kept a clear head about fun being the number one objective.

Maybe the most astonishing thing about the game is that the first day of play went without major technical glitches, and that so far, there are very few disastrous bugs or technical problems. The major issue at the moment is that one type of mission doesn’t work correctly, but it’s easy to avoid doing them. There’s a crucial lesson here. The only other game of this kind to launch well was Dark Age of Camelot, which shares with City of Heroes a basic simplicity and cleanness of design. The lesson is clear: don’t try to do too much by your launch, and keep your design as minimalist as you can. I’m also hugely impressed by the communication from the developers: they tend to be very forthright, very out in front of problems.

Many small features in City of Heroes are well-implemented. For example, I really like that when I get missions from my “contacts”, after a certain point, I can just “call” them remotely to tell them the mission is completed—I don’t have to run all over creation to tell them. There are a few classic MMOG issues that are in some ways worse in City of Heroes than in any other game: what people call “kill stealing” is for some reason uniquely aggravated in the evolving culture of its gameplay. The game also has a treadmill just like any other MMOG, and I still maintain that’s unnecessary, that designers are not thinking properly about how to scale challenges over time, and insist on making “hard” mean “time-consuming”. And finally, as is de rigueur for MMOGs, there are some really dumb and unoriginal names and designs for characters out there. I’ve seen huge numbers of Wolverine and Punisher clones. On the other hand, I haven’t seen a single “Legolas” yet.

There are also some things I’ll be looking for the designers to do in the months to come that will help the game be more evocative of comic books. For one, I’m getting very tired of fighting cookie-cutter enemies: there should be colorfully individual supervillains at every level of experience. That’s the essence of the genre, and it’s sadly missing from the lower-level gameplay and even from the mid-game. In fact, how about every character created getting an “archenemy”, a supervillain who pops up from time to time to attack your character?

There are other elements of superhero narratives that need implementation in some way eventually. Secret identities and all that comes with them are completely absent. The mission storylines are pretty decent—I saved a mechanic and his family from some robots and now civilians remember that I did so—but there need to be more plot types, more content that evokes classic superhero tales. There need to be major public events—say each city zone being attacked by giant robots, with everyone pitching in to repel the menace.

I’m still going to play SWG later this month to be a responsible critic, but when I want to have fun, I’m going to be battling evil in Paragon City as the mysterious and inscrutable Faust.

[permalink]



