Archives--Media's Take on the News

Media's Take on the News 6-2-03 to 7-14-03

  • Local Papers: Have They Changed Much?

  • Founding Fathers: Why So Popular Now?

  • Alabama's Revolt Against the Constitution

  • Does the U.S. Need Another MacArthur for Iraq?

  • Do Germans Have a Sense of Humor About Their Nazi Past?

  • Anti-Sodomy Laws: A Long American Tradition

  • Patriotism: As American as Apple Pie?

  • Democrats Can't Be Trusted with National Security

  • Norman Mailer: Bush, Honesty and Democracy

  • Slavery and Freedom and Philadelphia

  • Strom Thurmond's Black Daughter

  • Nixon and Hillary

  • Dumbocracy

  • David Brooks: In Lincoln's Image

  • The British General Who Launched an Invasion and Justified It on Moral Grounds

  • Jews Were Also Uprooted from Their Middle East Homes in 1948

  • The Romantic Illusions of Palestinian Revolutionaries

  • Scientists Say Robert the Bruce Tree Can Be Saved

  • Andrew Sullivan: Sodomy

  • Greenspan's Campaign to Stop Deflation Before It Gets Out of Hand

  • Robert Rubin's Record

  • Middle East Studies: Oversight Needed

  • The U.S. Forgot Liberia

  • George F. Will: The Supreme Court's Mistaken Ruling on Gay Rights

  • The Two Georges: Orwell and Bush

  • JFK's Speech at American University Is as Relevant Now as in 1963

  • India and China: Friends Now?

  • Immigrants Who Join the US Military

  • The Looting of Iraq: No Myth

  • Does the Senate Have a 50-50 Role in the Selection of Court Nominees?

  • 9-11: The Movie (Really)

  • Wesley Clark for President

  • They Lied in 1898 and 1964, Too

  • Truth Is the First Casualty

  • Europe Shouldn't Run from Its Christian Origins

  • Did the West Succeed By Exploiting Others?

  • "Mencken Made Up Stuff, Too"

  • Arabs and Jews: A Visit to Auschwitz

  • Why Is Plagiarism Such a Big Problem These Days?

  • Conservatives Have a History of Exaggerating Threats Posed By Tyrannical Regimes

  • Sidney Blumenthal and Historians

  • 1933 and 2003

  • 40 Years Ago: Standing in the Schoolhouse Doorway

  • Should the Old North Church Receive Federal Assistance?

  • Why Hasn't the Bush Tax Cut of 2001 Led to Job Growth?

  • Bush's Tax Cuts Will Help Just as Reagan's Did

  • How Saddam Can Be Held Responsible for 9-11

  • Trotsky's Influence on the Intellectuals Who Rule the Washington DC Roost

  • Did Arafat Make Up a Phony History of Palestine?

  • Filibuster a Court Nomination?

  • Canadians Don't Know Their History Either

  • Should Congress Open an Investigation into the Use of Intelligence?

  • Did Blair Manipulate Intelligence to Advance His War Aims?

  • A Unipolar World Can Be a Stable One If the U.S. Behaves with Restraint

  • The Disturbing Parallels Between Japanese Internment and 9-11

  • What Happened to Kennedy's Dream of a European-American Union?

  • Will Harvard Take a Donation from an Arab Sheik Associated with Holocaust Deniers?

  • The French Haven't Forgotten D-Day

  • The History Conservatives Have Forgotten

  • Perils of a Weak Dollar

  • Update on the Looting of Iraq

  • What Happens If Congress Is Taken Out in an Attack?

  • Is It Really 1933 in America Today?

  • The Tyranny of the New Grammar Purists

    Local Papers: Have They Changed Much? (posted 7-14-03)

    Diana West, writing in the Washington Times (July 11, 2003):

    One way to size up a local community is to buy its local paper. So I did, forking over $5 for a newspaper originally priced at 4 cents. The price represents quite a mark-up until you realize this particular paper is over 100 years old, and the junk store it came from is bilking the summer tourist trade as best it can before the snow flies — which, judging by the air of mistrust with which true Vermonters regard even a sustained July heat wave, could come any time now.

    One of 2,000 copies printed on Friday, October 5, 1894, this barely tattered and lightly sepia-ed edition of The United Opinion could be the only copy to have survived unburnt, uncrumpled — even unrecycled — the rest of the 19th century, all of the 20th, and the first couple years of the 21st. On the day this eight-page broadsheet was new, Grover Cleveland was into his second term as president, the Pullman strike had recently made labor history, Hawaii was a republic, and a border dispute between Venezuela and British Guiana was raising tensions between the United States and Great Britain.

    None of which is mentioned in The United Opinion. "KILLED HIS SISTER," runs the headline to a story datelined Worcester, Mass., one of two that dominate the front page. The other lead story pertains to a now-forgotten war between China and Japan that China would lose, exposing the weakness of the Manchu dynasty and revealing Japan to be the rising power in East Asia — a rise that would continue unchecked until World War II.

    Not that United Opinion readers had a crystal ball in which to see this. Besides, they were probably more taken with the details of the Carr murder story, a real-life tenement melodrama among the mill workers. "William Carr Expressed no Regret at His Awful Crime. But," the headline continues, "One Dollar Left Him In His Mother's Will — It Filled Him With Rage Against Other Members of the Family." Which about says it all, given that Theodore Dreiser never decided to elaborate. Of course, when it comes to dramatics, nothing in the paper compares with the advertisements: "Can it be that Insanity is Staring Me in the Face?" Try: Dr. Greene's Nervura blood and nerve remedy.

    Founding Fathers: Why So Popular Now? (posted 7-11-03)

    Patrik Jonsson, writing in the Christian Science Monitor (July 11, 2003):

    When descendants of Thomas Jefferson's alleged slave paramour Sally Hemings arrive at Monticello this weekend, they'll come once more as a family apart.

    Despite DNA tests that detected Jeffersonian blood coursing through Hemings's descendants, the 25-year argument over Jefferson's heirs goes not just to science, but to Jefferson's failure to come to terms with slavery. After being refused admittance to the official Monticello family, the Hemingses have gone their own way, with 150 converging for a weekend of storytelling and liturgical dancing at the Virginia plantation where America was dreamed up. "People still doubt us, but we on the black side have always known the truth," says Julia Westerinen, one of Hemings's descendants.

    The rancor over Jefferson's biological legacy represents the seedier part of a broadening search for the nation's founding essence, a history distilled not just in forefathers' trysts and heartbreaks, but in their philosophical moorings. Canonized in the 19th century, picked apart late in the 20th, they're cast, these days, in a more nuanced light: visionary figures, yes, but ultimately humans - enigmatic, inconsistent, and something short of George Washington's infallible cherry-tree virtue.

    Such intense historical introspection - fueled by a flurry of books, conferences, and popular discussion - is unusual.

    It's a debate that, some say, reveals a country adrift, pining for Jeffersonian wisdom and Washingtonian leadership in a new kind of war, worrying deep rips in the social fabric on affirmative action and gay marriage, even sifting through 18th-century economic buzz for discussions on corporations' global role.

    "There's a great desire now, a sort of underlying patriotism, where everybody is going back to the founding fathers to figure out what the country is about," says Ronald Radosh, a senior fellow at the conservative Hudson Institute in Washington.

    From Civil War reenactors to Franklin fanatics, the past has always loomed large in the American psyche. But recently, thinkers are gobbling history at a furious rate - and arguing at a rising pitch.

    It's been a while since the country was at such a crossroads. "This discussion ... first happened in the 1770s and it happened last in the 1930s with the New Deal," says Thomas Hartmann, a writer and radio-show host who deals with the Founders' creeds.

    Alabama's Revolt Against the Constitution (posted 7-9-03)

    Bruce Fein, writing in the Washington Times (July 9, 2003):

    The chief justice of the Alabama Supreme Court, Roy S. Moore, has forgotten that Robert E. Lee surrendered to U.S. Grant at Appomattox Court House, not vice versa. Admired by some political conservatives, Justice Moore denies the constitutional authority of federal courts to issue rulings interpreting the establishment clause of the First Amendment that he is bound to obey.

    That the Alabama chief justice revels in seeking to unravel the rule of law shocks. And what multiplies the shock is the deafening Republican silence over Justice Moore's rebellion against the Constitution despite their characteristic celebration of law and order. The jurist should be removed from office by the Alabama Judicial Inquiry Commission and the Court of the Judiciary for violating his constitutional oath to "support this Constitution [of the United States]." Judges are sacred legal symbols who teach the whole people by their example. To tolerate their defilement of the rule of law with impunity is thus unthinkable.

    The case against Justice Moore leaps from the pages of the unanimous decision of the 11th U.S. Circuit Court of Appeals in Glassroth and Howard vs. Roy S. Moore (July 1, 2003). In his capacity as chief justice, Justice Moore installed a transfixing 2½ ton monument of the Ten Commandments as the chef-d'oeuvre of the rotunda in the Alabama State Judicial Building. He intended to teach the citizens of Alabama that God's law trumps laws ordained by men, such as the U.S. Constitution, if the two conflict. He testified as follows during a trial testing whether the Ten Commandments display violated the establishment clause because it put government in the business of sponsoring religion:

    "Question: [W]as your purpose in putting the Ten Commandments monument in the Supreme Court rotunda to acknowledge God's law and God's sovereignty?"

    "Answer: Yes."

    "Question: Do you agree that the monument, the Ten Commandments monument, reflects the sovereignty of God over the affairs of men?"

    "Answer: Yes."

    "Question: And the monument is also intended to acknowledge God's overruling power over the affairs of men, would that be correct?"

    "Answer: Yes."

    In sum, Justice Moore aimed to proselytize in favor of the supremacy of the Ten Commandments over the Constitution and laws of the United States through the exploitation of his control over the Alabama State Judicial Building. He would not be satisfied with missionary work as a private citizen preaching to persons eager to listen with unbuttoned ears.

    For two centuries since the landmark ruling in Marbury vs. Madison (1803), the Supreme Court of the United States has been understood as the final arbiter of constitutional questions. The case of Cohens vs. Virginia (1821) clarified that state courts are bound to honor Supreme Court decisions. And the Civil War conclusively determined that neither states nor state officers may unilaterally sever their constitutional obligation to accept and enforce federal decrees.

    These principles stem directly from the constitutional text, not from judicial invention. Article VI, Clause 2 of the Constitution unambiguously declares: "This Constitution, and the Laws of the United States which shall be made in Pursuance thereof ... shall be the supreme Law of the Land; and the Judges in every State shall be bound thereby, any Thing in the Constitution or Laws of any State to the Contrary notwithstanding." Clause 3 fortifies the Supremacy Clause by mandating that "judicial Officers" of the several states take an "Oath or Affirmation, to support this Constitution."

    According to unbroken precedents of the U.S. Supreme Court following Lemon vs. Kurtzman (1971), government actions whose purpose or primary effect is to promote religion are constitutional heresy under the establishment clause. Overwhelming evidence proved that Justice Moore's motivation behind the Ten Commandments monument was the exaltation of religion in the affairs of men. During his unveiling catechizing, he preached that "we must invoke the favor and guidance of almighty God," and that the monument "marked the return of the knowledge of God in our land." He also said speeches conceived by mere mortals would never be worthy of equal position and prominence because placing "a speech of any man alongside the revealed law of God would tend to diminish the very purpose of the Ten Commandments monument."

    Does the U.S. Need Another MacArthur for Iraq? (posted 7-9-03)

    Colin Joyce, writing in the London Daily Telegraph (July 5, 2003):

    The enormous Cadillac used by Gen Douglas MacArthur during six years as Japan's overlord is now in MacArthur's Garage, a bar in Atsugi, a few miles from where he landed in August 1945, wearing dark sunglasses and with a pipe jutting from his mouth.

    The conqueror's casual look was shocking to the Japanese but many today view the moment as the birth of democracy in Japan.

    A Cadillac would not be very useful on Iraq's dusty roads but, as the American occupation of Iraq runs into increasing difficulties, US officials can but look wistfully at Gen MacArthur's success.

    The Americans came to Japan as conquerors after the defeated Emperor told his people that the time had come to "bear the unbearable". Some committed suicide rather than face that shame. Now, though, the Japanese have happy memories of their only period of foreign occupation.

    Portraits of the general stare down at customers drinking beer and eating MacArthur beef jerky. The talk is not usually about the general but, when it is, the compliments are warm.

    When Saddam Hussein fell, President George W Bush's administration decided to shy away from a style of rule as personal or colourful as MacArthur's. Jay Garner, a retired general, arrived in Iraq with little fanfare as the American administration sought to avoid the accusation that it was a neo-colonial power. Derided by his critics as dull and ineffectual, he was quickly replaced.

    But, after two months of rule by his more hard-hitting successor, Paul Bremer, the reconstruction effort remains bogged down and US officials may wish they could install a figurehead as charismatic and forceful as Japan's post-war administrator.

    An authoritarian state was transformed into a democracy and the foundations were laid for the most remarkable economic recovery in history.

    In recognition of its own experience, Japan is a generous donor to ruined countries. It has pledged £66 million of aid to Iraq and its neighbours and last year hosted an Afghan reconstruction conference. Gen MacArthur, a committed Christian, brought what the historian John Dower called "messianic zeal" to democratisation and demilitarisation.

    The Japanese military was destroyed as a caste, with a democratic constitution drafted instead. Industrial conglomerates were broken up, trade unions were legalised and farmland was redistributed. The education system was revamped.

    Japanese, used to the idea of revolution from above, warmed to their American shogun. Thousands sent letters of praise to Gen MacArthur and gifts of local delicacies and crafts.

    His office has been preserved in an insurance company's building. A spokesman for the firm said: "It is a piece of history. The occupation did more good than bad. MacArthur taught us Japanese what is freedom."

    The occupation cannot be judged a complete success, though in Japan few dwell on the failures. Historians argue that Gen MacArthur's decision to absolve Emperor Hirohito of blame for the war waged in his name allowed Japanese to escape soul-searching over the war.

    Do Germans Have a Sense of Humor About Their Nazi Past? (posted 7-9-03)

    Kate Connolly, writing in the London Daily Telegraph (July 5, 2003):

    Silvio Berlusconi could hardly have touched a more sensitive nerve when he said the Social Democrat MEP Martin Schulz would be ideal for the role of a concentration camp commandant.

    Sixty years after the last Nazi death camp was liberated by Allied forces, only a handful of comics can get away with viewing the darkest chapter of Germany's history with any sort of humour - and then it is only allowed to be of the blackest kind.

    Several recent attempts go quite some way to dispelling the Italian prime minister's claim that the Germans lack a sense of irony regarding their Nazi era.

    Goebbels und Geduldig, a recent two-part television drama, tells the story of a talented Jewish impersonator and concentration camp inmate who survives the war by posing as the Nazi propaganda minister, Joseph Goebbels.

    But this black comedy, which did not receive high ratings, had to do the rounds of film festivals for two years before television chiefs had the confidence to expose German audiences to it.

    A comic strip centring on Adolf Hitler called Adolf, by the cartoonist Walter Moers, has had considerable commercial success. It ridicules the Führer, imagining that he is still alive and is a drug addict working as a prostitute who is whisked off by an alien.

    A German-based stand-up comedian of Turkish origin, Serdar Somuncu, has made a name for himself across the German-speaking world for bringing Mein Kampf and Goebbels's speeches into cabaret bars and reading tracts to audiences who cannot help but laugh at their seemingly ridiculous content.

    His line of entertainment fits into the growing train of thought that only when the Germans learn to laugh about Hitler will they succeed in undermining his enduring legacy.

    But what Mr Berlusconi's remarks did, and what Germans will not forgive him for, was to bring them uncomfortably face to face with something that still haunts everyday life in Germany. The tendency is shaped, say historians and psychologists, by a concerted, almost obsessive effort to denounce the Nazi era.

    "Germany's whole political culture is shaped by its rejection of the Third Reich," said Etienne Franois, a Berlin professor of history.

    Anti-Sodomy Laws: A Long American Tradition (posted 7-8-03)

    Adam Goodheart, writing in the NYT (July 3, 2003):

    I had a date the other night with a guy who has been dead for almost 400 years. His name was Richard Cornish, and the last time he got involved with another man, he was executed for his crime. That was at Jamestown, Va., in 1624, and his case was the first recorded sodomy prosecution in American history....

    In his majority opinion in last week's ruling, Justice Anthony M. Kennedy dismissed conservative arguments that laws against same-sex intercourse had deep roots in Anglo-American tradition. Sodomy codes, he wrote, originally proscribed both homosexual and heterosexual acts, and in any event were rarely enforced except in cases of rape. Therefore, he wrote, defenders of the Texas law were wrong to claim that history was on their side.

    But Justice Kennedy's well-intentioned evasion slighted the true past. America has a long tradition of laws regulating private sexual conduct, and these laws have been enforced with particular ferocity when the conduct has been between people of the same sex. In the case of Cornish — a sea captain convicted, on flimsy evidence, of sodomy with an indentured servant — not only was he hanged, but when several other settlers grumbled about the verdict, they were whipped or pilloried, or had their ears cut off.

    Similar laws were enforced in the other American colonies. In Massachusetts in 1629, five "beastly Sodomiticall boys" were sent back to England for execution.

    Such colonial codes, inherited from English common law, were the direct ancestors of modern laws. After the American Revolution, their harshness was gradually tempered. No less a civil libertarian than Thomas Jefferson supported changing Virginia's penalty for sodomy from death to mere castration, but even so, as of last week, after four uninterrupted centuries, his state was one of at least 13 where sodomy laws remained on the books.

    Justice Byron White, upholding Georgia's sodomy law in 1986, referred approvingly to the "ancient roots" of proscriptions against homosexuality. These ancient roots were evident even in the language of 20th-century sodomy rulings, language that smacked of witchcraft trials. In 1921, Florida's Supreme Court went so far as to refer to men convicted of sodomy as "creatures" who "are called human beings."

    Now conservatives infuriated by the Supreme Court's decision — and no doubt laying the groundwork for coming political battles over gay marriage — are fulminating about the court's betrayal of "traditional" American values. In one respect, they are absolutely right: laws that penalize homosexuality are, indeed, deeply rooted in our shared traditions. But this should only strengthen our national resolve in undoing them.

    To visit Jamestown is to be reminded that the founders of our nation inherited a great deal of baggage from the past, baggage that has only gradually been left by the wayside. Not only did the persecution of homosexuals in America begin in Jamestown, almost two centuries before independence, but so did the enslavement of blacks. The colonists' break with England was the first conscious step toward creating in the New World a world that was truly new. American history has been a continuing revolution, of which 1776 was only one chapter — as Jefferson himself famously predicted.

    In the eloquent concluding passage of his opinion, Justice Kennedy wrote that America's founders, though they would never have imagined their Constitution being invoked to protect sodomy, "knew times can blind us to certain truths and later generations can see that laws once thought necessary and proper in fact serve only to oppress."

    Last week's decision should therefore be hailed not just as a victory for fairness and equality, but as a step forward in another American tradition: that of clearing out the dust of the past and remaking the world afresh.

    Patriotism: As American as Apple Pie? (posted 7-8-03)

    Janny Scott, writing in the NYT (July 6, 2003):

    Americans like to think of themselves as patriotic. They have been saying as much to pollsters for years. Men, women, old people, younger people, rich people, poor people, whites, blacks, urbanites, farmers: Nearly everyone says roughly the same thing.

    But pollsters tend not to ask what people mean when they say they are patriotic. The meaning of patriotism has always been a moving target. It has meant different things to different people at different times in history. Like the flag, it is open to reinterpretation.

    Over the past two centuries, patriotism has been invoked to make the case for all sorts of things: military sacrifice, conscientious objection, unity, dissent, inclusion, exclusion, anti-Communism, anti-Catholicism, tax cuts, a living wage (not to mention cigars and shopping).

    "Who was the patriot in 1861?" asked Walter Berns, an emeritus professor of history at Georgetown University. "Robert E. Lee or Ulysses S. Grant? In a way, it depends on how you define patriotism. If patriotism is simply a kind of filial piety, my country right or wrong, then the case for Lee can be made."

    "Because, as Lee himself said, he could not raise his hand against his family, his children, his state," Professor Berns said, referring to Lee's decision to decline the offer to command the Union Army. "If, on the other hand, patriotism means devotion to a particular political idea, then clearly Grant was the patriot and Lee was not. That, in a sense, is part of the problem that we face even today."

    To some, patriotism is unquestioning loyalty to the nation. To others, it carries with it expectations that the government will give something in return. Women have experienced patriotism differently than men. Blacks and Indians have experienced it differently than whites.

    In good times, the patriotic reflex weakens. In times of crisis, patriotism thrives.

    What about anxious periods, like the present? David M. Kennedy, a professor of history at Stanford University, finds that periods of chronic anxiety have been known to produce "patriotism of quite a cranky sort."

    In the late 19th and early 20th centuries, for example, anxiety about immigration spawned the American Protective Association, an anti-Catholic, anti-Semitic, anti-immigrant group. The Ku Klux Klan was revived. But there was also a surge of reform under Theodore Roosevelt.

    "These moments of anxiety, it seems to me, have both an unlovely and quite a progressive and productive face to them potentially," Professor Kennedy said. As for today, he said the full patriotic potential is not yet clear. So far, the record is mixed.

    "On the one hand, we get what some would say is the overreaction on the part of the Justice Department about how much tolerance and diversity we can afford," he said. "On the other hand, you have gestures by the president toward inclusiveness."

    Democrats Can't Be Trusted with National Security (posted 7-8-03)

    Lawrence F. Kaplan, senior editor at the New Republic, writing in the Wall Street Journal (July 8, 2003):

    Rather than claim the mantle of Truman, John F. Kennedy or even Bill Clinton, the Democratic presidential field lately seems to be taking its foreign policy cues from the New York Review of Books. There is, to begin with, Sen. John Kerry, who claims the president "misled every one of us" into backing the war in Iraq--a claim echoed by Sen. Bob Graham--and who still cannot decide whether he supported the effort. Then there is Howard Dean, surging in the polls and unsure whether Iraq is better off without a genocidal maniac in power, along with Dennis Kucinich, whose campaign signature is a proposal to create a "Department of Peace." As doubts over weapons shade into doubts about the virtue of the war itself, prominent Democrats have even begun to predict another Vietnam in the making.

    Before following Dr. Dean any further down this road, party leaders would do well to cast a glance backward, for this is hardly the first time they have traveled there. The transformation of the party of Truman into the party of McGovern began, of course, in the jungles of Vietnam. By 1972, the conviction that American power was tainted, marred by deceitful use in a "criminal" war, prompted the Democratic presidential nominee to demand that America "come home" from the world. And while such candor tended to be the exception rather than the norm--among others, Henry "Scoop" Jackson and Daniel Patrick Moynihan directly challenged the isolationism that had seized the party's ranks--a barely concealed suspicion of American power lingered in Democratic foreign policy salons for the next two decades. For the likes of Jimmy Carter and Michael Dukakis, what had begun as opposition to the war in Vietnam had, by the eve of the 1991 Gulf War, hardened into a reflexive opposition to the use of force.

    The Clinton presidency put all this to rest. Bill Clinton's motives for employing military power in Haiti, Bosnia and elsewhere may have been less than exalted, but if a commander in chief earned a battle ribbon each time he sent U.S. troops into action, Mr. Clinton would be wearing a chestful today. Assisted by the end of the Cold War and a keen eye to the polls, Mr. Clinton largely drained foreign-policy debates of their philosophical substance, much as he did with American politics as a whole. The result was that, with the exception of a few party activists frozen in amber, he triangulated the "peace Democrats" out of existence.

    The end of the Cold War had the opposite effect on the GOP, which counted on the 20-point advantage Republicans traditionally enjoyed on national security matters. During the '90s, though, national security barely registered among the concerns of voters. A 1995 Times Mirror poll found that only 9% of respondents identified defense and foreign policy as the most important issues of the day, a sharp decline from the 42% who put national security at the top in 1980. Indeed, opinion surveys on the eve of the 1996 election showed that Americans actually trusted Mr. Clinton to do a better job of handling foreign affairs than his Republican opponent.

    In the aftermath of Sept. 11, however, voters appear to have reverted to Cold War type. During the 2002 midterm election cycle, polls found that most voters rated national security as the country's top priority, even more important than the economy. And as defense and foreign policy issues have re-emerged, so too has the Republican advantage.

    Norman Mailer: Bush, Honesty and Democracy (posted 7-8-03)

    Norman Mailer, writing in the New York Review of Books (July 17, 2003):

    Democracy, more than any other political system, depends on a modicum of honesty. Ultimately, it is much at the mercy of a leader who has never been embarrassed by himself. What is to be said of a man who spent two years in the Air Force of the National Guard (as a way of not having to go to Vietnam) and proceeded—like many another spoiled and wealthy father's son—not to bother to show up for duty in his second year of service? Most of us have episodes in our youth that can cause us shame on reflection. It is a mark of maturation that we do not try to profit from our early lacks and vices but do our best to learn from them. Bush proceeded, however, to turn his declaration of the Iraqi campaign's end into a mighty fashion show. He chose—this overnight clone of Honest Abe—to arrive on the deck of the aircraft carrier Abraham Lincoln on an S-3B Viking jet that came in with a dramatic tail-hook landing. The carrier was easily within helicopter range of San Diego but G.W. would not have been able to show himself in flight regalia, and so would not have been able to demonstrate how well he wore the uniform he had not honored. Jack Kennedy, a war hero, was always in civvies while he was commander in chief. So was General Eisenhower. George W. Bush, who might, if he had been entirely on his own, have made a world-class male model (since he never takes an awkward photograph), proceeded to tote the flight helmet and sport the flight suit. There he was for the photo-op looking like one more great guy among the great guys. Let us hope that our democracy will survive these nonstop foulings of the nest.

    Slavery and Freedom and Philadelphia (posted 7-4-03)

    Stacy A. Teicher and Walter H. Robinson, writing in the Christian Science Monitor (July 3, 2003):

    At the very moment they were in Philadelphia declaring that all men are created equal, many of America's Founding Fathers were slave owners. Activists are now demanding a fuller accounting at democracy's birthplace.

    When visitors alight from their tour buses for Friday's opening of the National Constitution Center in Philadelphia, they'll be celebrating Independence Day in a place where the story of American freedom is anything but simple.

    The center is part of the 45-acre Independence National Historical Park, where ongoing revitalization plans are a flash point for fierce debates over whose truth is being told, and how.

    The symbolic fault lines can be found just beneath the soil. Under the bus drop-off area, for instance, lies the historic homesite of James Dexter, a former slave who cofounded The Free African Society in 1787. The site would have been paved over with no exploration, if not for a concerted drive by local African-American activists.

    Now that it has been excavated and commemorated in an exhibit, the site can lend context to a more familiar event of 1787 - the Constitutional Convention. Delegates met three blocks from Dexter's home, at what is now known as Independence Hall, to hammer out the terms of the new nation - including counting each slave as three-fifths of a person and extending the slave trade for the next 20 years.

    Although disputes continue to bubble up, many historians, activists, and park officials say this juncture is a unique opportunity to examine the paradoxes at the country's very foundation. While popular history often relegates slavery to the shadows when celebrating the Founding Fathers, now its inextricable links to the economic and political beginnings of the United States are being brought to light.

    "It will challenge old ideas [for people to see that] America's most-agreed-upon 'birthplace of freedom' was 'complicated' by slavery - that is something that Americans need to know," says Clement Alexander Price, a history professor at Rutgers University, and a consultant to the park service on the site where Presidents George Washington and John Adams lived and governed.

    "Liberty for some Americans came at the price of enslavement for other Americans," he says. "The 1790s was a very critical period, because it was the first decade of the new republic, and at end of it, the country had pretty much decided that it was not going to deal with slavery."

    Some activists say decisionmakers today are still not dealing adequately with the slavery issue - and they worry that, with over $100 million of federal money being poured into the park's revitalization, this key opportunity might be squandered. At the time of publication, two groups were planning demonstrations: a candlelight memorial walk Wednesday to Philadelphia sites where Africans were sold as slaves; and a "Black Independence Day" Thursday on Independence Mall, to "free" the stories of African ancestors.

    For their part, representatives of the Constitution Center (an independent nonprofit organization) and the park say they've never had any intention of shying away from slavery and other complexities of early American history.

    Central to that story are the slavery compromises at the Constitutional Convention, which left a mixed legacy of prosperity and victimization, unity and civil war, treasured diversity and modern-day racism. Efforts to portray that legacy are often caught between two camps: those who say there's not enough truth-telling about racial injustice, and those who say that a patriotic view of the Founding Fathers will be unnecessarily sullied by dwelling too much on slavery.

    "There are lots of people who think that kids will not want to be American if they learn [about slavery and] genocide against Indians," says Gary Nash, a historian at the University of California, Los Angeles, who has done extensive research on Colonial Pennsylvania and has advocated for greater representation of African-American history at Independence Park. But for the most part, he adds, "Americans do not think that it is unpatriotic to talk about our blemishes."

    The saving grace in Philadelphia may be that the story of the African-American experience there is not just a story of slavery, but also of "free" blacks and their crucial role in establishing black churches, the Underground Railroad, and other key institutions.

    "The best way to shed light on the juxtaposition of freedom and slavery," Professor Price says, "is to point to the fact that as more Americans of African ancestry were gaining their freedom during and after the American Revolution, they put a very high premium on freedom - hence enhancing the meaning of freedom in a society that still had not made up its mind as to how free or how enslaved it was going to be."

    Indeed, the Constitution Center aims to celebrate "the struggles of generations to realize the promise" of that revered document, says research director Stephen Frank.

    Its exhibits and multimedia presentations address not just Colonial history, but the Civil War, the civil rights movement, and various ways that citizens today can make a difference. The center will also incorporate artifacts from its own building site - including evidence of enslaved African and native American populations. Excavated in 2001, the site is considered the richest trove of Colonial American artifacts ever found in an urban area.

    The center's presentations relating to slavery, Mr. Frank says, reflect the most recent scholarship.

    That body of scholarship - on everything from the economic impact of slavery to antislavery sentiment at the time of the revolution - has been building for decades, gradually correcting longstanding mistaken notions.

    Prior to World War II, for instance, it was widely believed that slavery had not been profitable, says Barbara Solow, an economic historian retired from Boston University. Americans held to images of Pilgrims landing and intrepid frontiersmen pushing West, while largely ignoring the economic underpinnings of plantation slavery.

    But in recent decades, historians have shown that slavery provided the primary financial support of the Colonies and the United States in its first 50 years, she says. "If you look at what was moving across the Atlantic [during the Colonial period], it was either slaves, the products of slaves, supplies to sustain slaves, or things bought with the earnings of slave labor. Seventy-five percent of Colonial New England's exports went to the Caribbean to support the slave system."

    From 1807 to 1865, adds Harvard economic historian Sven Beckert, "the center of our economy was cotton," and both North and South profited.

    "The old history separated 'American capitalism' and 'democracy' from 'slavery.' " Dr. Beckert says. "In the 'new history,' the three are organically connected."

    Much of this scholarship is not yet widely appreciated outside the realm of economic historians themselves, Solow says. But it is getting more attention, especially from advocates of the slavery-reparations movement.

    Nash agrees that only in this generation have historians come to accept the idea that "Slavery allowed there to be liberty... There could not have been as much liberty as the colonists gained had there not been enslavement of a fifth of the population."

    But that's not to say there wasn't strong opposition to slavery at the time. It arose alongside colonists' arguments against the tyranny of Britain. "The paradox became part of the whole Revolutionary discourse," Nash says.

    That's why he rejects the argument that a politically correct move is under way to impose 21st-century values on 18th-century individuals. It's simply a matter of acknowledging that the Founding Fathers embodied contradictions, he and others say.

    Of the 55 Constitutional Convention delegates, 25 owned slaves. At one time Benjamin Franklin owned a total of five slaves, but it's not clear if any were still serving him at the time of the convention, Nash says. By 1787, Franklin was president of the Pennsylvania Society for Promoting the Abolition of Slavery, but he declined to read a statement condemning slavery at the convention. There were two main obstacles to people like Franklin using their moral capital to eliminate slavery, Nash says: the question of how to compensate slave owners for the slaves they regarded as their property, and the concerns that black and white people would not be able to live together peaceably on equal footing.

    Strom Thurmond's Black Daughter (posted 7-3-03)

    Diane McWhorter, writing in Slate (July 1, 2003):

    In all the words spent on Strom Thurmond's life and times since his death last week, I have seen no acknowledgment of the most interesting of his sundry racial legacies. She is Essie Mae Washington Williams, a widowed former school teacher in her 70s, living in Los Angeles. Presumably she did not show up for any of the obsequies even though Strom Thurmond was almost certainly her father. Williams is black.

    Jack Bass and Marilyn W. Thompson present persuasive evidence in their 1998 biography, Ol' Strom, that Thurmond sired a daughter in 1925 with a black house servant named Essie "Tunch" Butler, with whom he reputedly had an extended relationship. Though "Black Baby of Professional Racist" would seem to sail over the man-bites-dog bar of what is news, the story has never really gotten traction. The particulars of this family saga simply do not fit into the "redemption narrative" Americans tend to impose on our more regrettable bygones: Better that ol' Strom "transformed" from the Negro-baiting Dixiecrat presidential candidate of 1948 to One of the First Southern Senators To Hire a Black Aide in 1971.

    In contrast to, say, George "I Was Wrong" Wallace, Thurmond has always been an ornery redemption project. He did not repent. Even so, his illegitimate daughter further complicates the moral picture. Does she mean that he was even more heinous than we knew? Or that—dude!—he wasn't such a racist bastard after all?

    We need not dwell on the obvious mind-boggling hypocrisies here: that someone who ran for president on an anti-pool-mixin' platform was party to an integrated gene pool. Or that Thurmond's other signature political achievement—the 24-hour-without-bathroom-break filibuster against the Civil Rights Act of 1957—was done in the name of sparing the South from "mongrelization." This form of duplicity has been a Southern tradition dating back to those miscegenating slave owners. Their peculiar conflation of shame and honor was captured in 1901 Alabama, at a constitutional convention called to disfranchise blacks. A reactionary old ex-governor known for being good to his mulatto "yard children" was aghast that the insincere anti-Negro propaganda fomented by him and his peers might bring actual injury to its objects. He demanded to know why, "when the Negro is doing no harm, why, people want to kill him and wipe him from the face of the earth."

    Even as Thurmond was making a career of segging against his own flesh and blood, he himself wasn't a complete cad. If he didn't exactly claim Essie Mae Williams, neither did he disown her. He gave her money and paid her regular visits (and probably tuition) at the black South Carolina college where she was a "high yaller" sorority girl while he was governor of the state. And in some ways, Williams has played the dutiful daughter, insisting over the long years that Thurmond was merely a "family friend." (Efforts to reach her failed.)

    Nixon and Hillary (posted 7-3-03)

    John Taylor, executive director of the Nixon Foundation (July 3, 2003):

    RN and Hillary – peas in a pod?

    That’s the thrust of a thoughtful post on the Nixon Forum composed by Doug Sidewand in Florida, one of our regular posters. “People like this are often brilliant and have outstanding leadership qualities,” he writes. “Yet many disagreed and vilified [them]. It's what happens when you're a leader.” (If you want to participate in the Forum, just visit www.nixonlibrary.org and follow the link on the upper right-hand corner of the page.)

    I do agree with Doug that both RN and Mrs. Clinton are polarizing figures. Perhaps polarizers can be defined as colorful politicians who stand out from the pack early in their runs. The young RN attracted his most persistent critics because of his anti-communism and particularly the Alger Hiss case. Mrs. Clinton’s political persona was also defined early in her national career. Many on the right identified the Clintons, fairly or unfairly, with the worst characteristics of the sixties generation. Then the newly-elected President Clinton set his first lady’s legacy in concrete, or rather red tape, by asking her to formulate a plan that, if enacted, would have amounted to the nationalization of U.S. health care. So if her critics still see her as an ambitious, power-grabbing, big-government liberal, it’s entirely her and her husband's doing.

    Besides their politics, the biggest difference between RN and Mrs. Clinton is that she gets a near free pass from the media. In his scores of post-Presidential interviews, RN was asked hard, sometimes even savage questions about Watergate and other controversies. In two major network interviews recently, Mrs. Clinton was asked almost no substantive questions about the Clinton scandals, and although she is a U.S. senator and probably the '08 Democratic frontrunner, she was asked almost nothing about where she wants to take the country. One has the impression the major media are trying to foist her on us as an inexorable phenomenon, too big and important to be asked petty questions about mysteriously reappearing legal records or lucrative commodity trades, about taxes or defense.

    As a matter of fact, I bet Mrs. Clinton would be pleased to discuss more of her policy views, if someone would just ask. David Letterman got her rolling a couple of weeks ago, and suddenly there she was: Pro-big government, pro-taxes -- hardly the centrist the New York Times recently described in a lengthy feature. Are the media trying harder to package her than she is herself? Watching some of these TV personalities interviewing her, I can feel their desire for her election radiating from the screen.

    For all our sakes, they should do their jobs and push her harder. The most disappointing thing about Mrs. Clinton is the way she draws lessons from her worst experiences. She acts as though she and her husband were the first victims of a political vendetta. That's incredibly ironic, since during Vietnam and the scandal it sparked, she was part of an at times ruthless movement to unseat RN. Her memoirs do contain a few pages about her work as a staffer for the House Democrats' impeachment committee in 1974. Writing of her pride in her colleagues' precision and professionalism, she (or her researchers) misconstrued the content of the all-important June 23, 1972 tape, whose release led to RN's resignation. Mrs. Clinton also errs in repeating the canard that RN once called all student protestors "bums" instead of those who used violent tactics. The Washington Post got it right in its headline about President Nixon's famous comment in April 1970: "Nixon Denounces Campus 'Bums' Who Burn Books, Set off Bombs."

    When President Clinton erred, there was indeed a movement to unseat him comparable to the anti-Nixon storm of Watergate. That's the way politics work. Mrs. Clinton and family acolyte Sidney Blumenthal have chosen to characterize this as a conspiracy. So will she stipulate that she and anti-Vietnam War activist Blumenthal were part of a conspiracy to oust RN? Likely not. In her apparently unexamined anger at those who wounded her, Mrs. Clinton reminds me of another formidably inflexible leader, England’s Mary Tudor. After he became king, Edward VI and his over-zealous protectors pressured Mary to give up her Catholicism. Surviving that experience and becoming queen, Mary put her half-sister Elizabeth in exactly the same awful position about her faith. Mary would probably say: The difference, of course, is that I was right and the Protestants were wrong. And that's precisely the view of the Clinton-Blumenthal wing: We were running a noble campaign against Nixon and his war, whereas the right was running an ignoble conspiracy against us. And soon it may be their turn in power again.

    Her book also contains an odd account of her brief stint in the Rockefeller campaign in 1968. She writes that RN’s nomination signaled the dawn of right-wing ascendancy in the Republican Party, an assertion that ought to provoke howls of laughter among those in the conservative movement who consider RN to be a liberal-leaning apostate. Do she and Blumenthal consider everyone to the right of Nelson Rockefeller a right-winger? That’s a mighty big cabal.

    Beyond that, Mrs. Clinton has relatively little to say about the President she helped destroy. She does repeat a kind word RN said to her, when visiting the White House family quarters in the early 1990s, about how he too had tried to address the health care crisis. It was just like the President to try to put people at ease in such situations. Mrs. Clinton writes that she responded in kind, saying we’d be better off if RN’s landmark bill had been adopted. While she doesn’t say so, the bill died because Senator Kennedy and other self-styled health care reformers in control of Congress weren't willing to let Richard Nixon score a win on their issue. And so the cycle continues.

    Dumbocracy (posted 7-1-03)

    Phillip Adams, writing in the Australian (July 1, 2003):

    The popular appeal of stumbling political utterances speaks volumes for democracy....

    English historian Simon Schama contrasts the grandiloquent voice of British imperialism with the dumb and dumber dialogue of its US counterpart. He quotes George Nathaniel, Viscount Curzon, waxing lyrical on Britain's manifest destiny: "To me the message is carved in granite: out of the rock of doom -- that our work is righteous and shall endure." Whereas consul Jay Garner, musing on the prospects of the US imperium in Iraq, said: "If we make headway in a lot of major things, we will put ourselves in a marvellous up-ramp where things can begin happening. If we don't do that, we're on the negative ramp." Thus the ventriloquial voice of Bush speaks through many mouths.

    The internet provides an endless selection of Bush jokes -- principally his public utterances. While half the downloads on the net are pornographic, much of the rest are photos and transcripts of the incumbent President, suggesting he's one of the greatest fools in history.

    Yet the joke is on us. He's the President and we're not. He's running the world and we're running scared. In a sense, it's his ignorance that gives him invincibility, whereas his critics, so eloquent and articulate, achieve invisibility....

    In the wit and wisdom of George W. we've learned that the French don't have a word for entrepreneur and are constantly reminded that God is an American. Although he'd still find it hard to find Iraq on a map, that doesn't stop George lobbing missiles on Baghdad or reducing decades of diplomacy to words you'd see stencilled on a T-shirt or stuck on a bumper sticker. "You're either with us or against us" is a classic case. Everyone understands that -- whether you're in Jerry Springer's audience or the President of France in the Élysée Palace.

    The lesson is simple. Dull down your language if you want to deaden discourse and dull debate. Welcome to dumbocracy.

    David Brooks: In Lincoln's Image (posted 7-1-03)

    David Brooks, writing in the NYT Magazine (June 29, 2003):

    We're at an odd cultural moment. There's no dominant image of business success. Neither dot-com millionaires nor the Wall Street whizzes seem alluring. The risk-taking, push-the-envelope executives no longer inspire confidence. The charismatic C.E.O.'s just seem like overplayed blowhards. And yet nobody gets inspired at the thought of being the safe, secure, highly anal Organization Man.

    So how about Abraham Lincoln as the defining capitalist figure for our age? As the Yuppie was to the 80's and the dot-commer was to the 90's, so maybe Abraham Lincoln could be to the coming decade. Not the great statesman Lincoln -- the president Lincoln -- but rather the middle-aged corporate-lawyer Lincoln, the guy who in the 1850's represented railroads and banks, the guy who traveled relentlessly around the legal circuit handling cases big and small, the guy who, when he made some money, added a second floor to his house so his family could have more space, the guy whose ambition, as his law partner famously said, knew no rest. That middle-aged Lincoln represents all the sometimes homely but invariably dreamy pushers who are what American striving is really all about.

    Lincoln began life with high anticipations of glorious success. When he was young, he had a little boat, which he kept on the Ohio River. One day a pair of travelers asked him if he would row them to the middle of the river, where they could intercept a steamboat. Lincoln took them out, and as the men boarded the steamboat, they each threw a silver half-dollar into the bottom of his boat. "You may think it was a very little thing," Lincoln later recalled, "but it was a most important incident in my life. I could scarcely credit that I, a poor boy, had earned a dollar in less than a day. . . . The world seemed wider and fairer before me."

    He became a fervent believer in social mobility and came to see, as the historian Allen C. Guelzo has pointed out, that self-transformation is almost a moral responsibility for the aspiring American.

    Many people start out like Lincoln, fervently convinced that easy and quick riches lie just over the horizon. Four-fifths of American college students, according to a Jobtrak.com study, believe it will take them 10 years or less to achieve their career goals. Three-quarters of U.S. college students expect to become millionaires, and 52 percent expect to have achieved this stratospheric status by the time they are 50.

    But success didn't come quickly for Lincoln, just as it doesn't come quickly for most people. Recent research has indicated that the United States is, and always has been, a less mobile society than we think. Americans do move upward as we age. Only 5 percent of the individuals who were in the bottom income quintile in 1975 were still there in 1991. But an individual's mobility is likely to be measured in decades, not years. We rise as we age and as we get gradual promotions, not because we strike it rich. That's what happened through most of Lincoln's life. The Lincoln of the 1850's was prosperous and apparently a brilliant lawyer, but he felt that his greatest dreams were not realized. And that, too, is not atypical. For every Bill Gates and Jack Welch, there are millions of men and women doing well but not spectacularly, somehow not fulfilling the media image of corporate heroism.

    The British General Who Launched an Invasion and Justified It on Moral Grounds (posted 7-1-03)

    Ben Macintyre, writing in the London Times (June 28, 2003):

    A few hundred yards from Downing Street, on the west side of Trafalgar Square, stands a man who invaded another country on a dubious pretext after an act of horrific terrorism; who rationalised the action on moral and humanitarian grounds, and relied for justification on two official dossiers of evidence which have since turned out to be distinctly dodgy. His name was General Sir Charles Napier, and his statue is a reminder of the imperial ethos, unacknowledged but unmistakable, that underpins the war in Iraq and its aftermath.

    In 1843 Napier annexed the rich Indian province of Sind in what is now Pakistan, an operation that is best remembered for the one-word, bilingual pun the general is said to have sent back after his victory: "Peccavi" - Latin for "I have sinned" (Sind).

    This bold act of regime change provoked some outrage back in London, but Napier and his imperial masters stood by their official explanation of events. There was no BBC to claim that the evidence had been "sexed up", and so the controversy died down, Napier got his vote of thanks in Parliament and, in time, his bronze statue.

    But, 160 years later, the parallels seem more than coincidental, offering a lesson in the enduring nature of imperial conquest, and subsequent spin.

    Leaving aside his astonishingly luxurious facial hair, Napier might have been the Donald Rumsfeld of his day - swashbuckling, utterly convinced of his own righteousness and possibly slightly nuts. Like Rummy, he knew there would be loud complaints about his actions, and could not have cared less. "We have no right to seize Sind, but we shall do so," he declared. "And a very advantageous, useful, humane piece of rascality it will be."

    The world superpower, then as now, had been badly rattled by a horrific massacre of men, women and children. In 1841, 16,000 people, an entire British army, had been massacred during the disastrous retreat from Kabul.

    First, an army of retribution descended on Kabul and killed large numbers of Afghans, some of whom might have been guilty of something. Then the empire turned its sights on nearby Sind, "like a bully who has been kicked in the street and goes home to beat his wife in revenge", in the words of one critic. The cradle of the Harappan civilization, Sind was the place where large-scale agriculture originated 5,000 years ago; but Napier was more concerned about its place in a geopolitical security strategy. "Whose lands are these?" he asked, as he toured the province with an acquisitive imperial eye.

    The reasons given for invading Sind were vague and shifting: it was said that its Amirs had been negotiating with rebellious neighbouring tribes (the 19th century al-Qaeda connection), although no proof was produced; the Sindhis were accused of violating international treaty obligations, but few observers believed that they posed any immediate threat. The real casus belli may have been economic. The lucrative Chinese opium trade passed through Sind; opium was the oil of the 19th century.

    Sind had some characteristics of a rogue state, and Napier, like Tony Blair, espoused a moral, tyrant-toppling rationale. "These villains of Amirs are my aversion," he wrote. "Thank God the treacherous assassins are not to be restored, and I shall have the pleasure of doing good in recovering and cheering up a fine race of oppressed people." Replace "Amirs" with "Saddam Hussein's regime", and the words might be those of Alastair Campbell.

    Napier went in with massive firepower and high technology, blowing up key installations such as the fortress at Imamghar. This was Shock and Awe, 1840s style. The regime leaders fled; their ragged army was cut down like corn.

    With British rule established, the victors basked in a glow of moral rectitude.

    "It is allowable to indulge feelings of satisfaction when the course of events brings the fall of barbarians and selfish rulers," reflected Lord Ellenbrough, the Governor-General. Whatever the nit-pickers might argue, Britain, he said, had been morally right "to extend...the benefits of a beneficent and enlightened rule".

    Peel's Government came under fire, but there was a limit to how far the Opposition could object, having done similar things itself when in power. This, of course, is the conundrum facing the Tories, who can hardly damn the second Gulf war, having gloried in the first.

    Two parliamentary blue books of official documents were published to explain and justify what had happened. Sarah Ansari, an historian at Royal Holloway, University of London, however, has now compared the official with the unofficial account, to uncover a remarkable work of Victorian spin: phrases were tweaked, key passages omitted, while only the most supportive data were included, even exaggerated. Campbell would have called these "presentational" amendments and "drafting suggestions". The official version was accepted; the die was cast and so, in bronze, were the impossibly whiskery features of Napier.

    Jews Were Also Uprooted from Their Middle East Homes in 1948 (posted 7-1-03)

    The Ottawa Citizen (June 28, 2003):

    Most people know that the creation of Israel sparked a refugee crisis, as thousands of Arabs fled from villages that are now part of Israel. Less well known is that Israel's founding also sparked a Jewish refugee crisis. Thousands of Jews who for generations had lived in Arab lands, from Syria in the east to Algeria in the west, were suddenly uprooted. As inconvenient as it may be to people who want to claim that only Palestinians lost homes and land in 1948, it's impossible to talk of the Arab dispossession without talking of the Jewish one, too.

    This week in New York, a major study of the Jewish exodus was released, co-authored by David Matas, an internationally known human rights advocate from Canada. Also on hand was Liberal MP Irwin Cotler, another human rights legal expert, and former U.S. diplomat Richard Holbrooke, who brokered the 1995 Dayton Peace Accord that ended the fighting in much of the former Yugoslavia. These men -- and an increasing number of historians -- believe the displacement of Jews from Arab countries is a forgotten tragedy. We agree, and hope the study released this week will help remedy that.

    It was at 5 p.m. on May 14, 1948, that the Jewish colony in Palestine declared its independence, henceforth to be known as the State of Israel. Within hours, the surrounding Arab armies of Iraq, Egypt, Syria, and Transjordan (now Jordan) were on the march, hoping to strangle the Jewish state before it could take its first breath. Israel fought off the invaders, and it was in the chaos of that regional war that most of the mass displacements occurred.

    Some Arab Palestinians fled of their own volition, planning to come back once the Jews had been pushed into the sea. Other Arabs were driven out by the Israelis. It was a similar story for the Jews in Arab lands. Some left voluntarily to help build Israel, but others left only after Arab governments revoked their citizenship, confiscated their property or sponsored anti-Jewish pogroms.

    The reason the world seems to remember only the Palestinian dispossession is that to this day those original Palestinian refugees and their children remain stateless. Nearly two dozen Arab countries have, for more than half a century, denied refuge to their Palestinian brethren, preferring to let them stew in camps as a testament to Israel's "illegitimacy." Israel, on the other hand, took in every Jewish asylum-seeker who wanted to live there. As the just-released study points out, "Little is heard about these Jewish refugees because they did not remain refugees for very long."

    The Romantic Illusions of Palestinian Revolutionaries (posted 7-1-03)

    Martin Peretz, writing in the St. Petersburg Times (Florida) (June 27, 2003):

    Every failed revolution in modern times has had its fellow travelers, a phenomenon hard to define but easy to recognize. Picasso was one; Jean-Paul Sartre, another; FDR's vice president, Henry Wallace, a third. Two, three decades later, Susan Sontag would also put her words to work for the brutal engineers of soul and society.

    There were literally tens of thousands of these influentials in the United States and elsewhere in the West. And the revolutions of the left did not have a monopoly on fellow-traveling. In the 1930s, there were lots of fellow travelers of Nazism, too: Charles Lindbergh, Ezra Pound, the duke of Windsor and many others. Many fellow travelers went exuberantly from one decaying communism to another, seriatim, from the Soviet Union to the People's Republic of China to Castroite Cuba and Vietnam and then to Sandinista Nicaragua, never quite realizing they would soon feel the need to move on again.

    But move on they would, armed as always - as author David Caute put it - with their usual arsenal of "bifocal lenses, double standards, a myopic romanticism."

    Of course, there is now no world revolution in which these deluded folk can invest their ardors, as yesteryear's fellow travelers did when extolling the nonexistent - but exemplary - democratic virtues of Stalin's Russia or of some other transformatory idyll. Only certified kooks are in the business these days of changing the nature of man.

    So the present-day romantics, who at home typically despise the idea of the nation-state and the realities of national interest, are left with often contrived and almost always murderous nationalisms to adore. The nationalism du jour is Palestinian nationalism.

    It was the British political historian David Pryce-Jones who, I think, first made the analogy between the old fellow travelers and the new, between those who romanticized the Soviets and those who now romanticize Palestinian (and Islamic) terrorism. Not that all Palestinians are terrorists, not at all, although polls show an overwhelming proportion of them to be supporters of terrorism. But terrorism happens to be the defining paradigm of the Palestinian cause.

    Scientists Say Robert the Bruce Tree Can Be Saved (posted 7-1-03)

    Jim Mcbeth, writing in the Scotsman (June 27, 2003):

    SCIENTISTS using the latest sonic imaging technology have found a way to rescue the King's Tree, one of the most significant natural landmarks associated with Robert the Bruce's wars of independence.

    Experts and historians had feared the ancient yew, an image of which was worn on the clothing of the Bruce's army at the Battle of Bannockburn, was doomed.

    Until now, the technology did not exist to establish if the tree, which overlooks Loch Lomond, could be saved.

    However, The Scotsman and Roddy McGregor, one of Scotland's leading tree experts, arranged for Ian Isaacs, the business development manager of Fujikura Europe, to come north with ultra-modern equipment, which has only been available for months, to create the first internal picture of the tree. The world's most sensitive electronic sensors, connected to the Picus Tomograph, a "holy grail" of imaging technology, were stretched around the 2,000-year-old yew, which is regarded as an irreplaceable national treasure.

    In 1306, the tree became inextricably linked to the future king when he used it as the means to rally his twice-defeated army after it had escaped from the western to the eastern shore of the loch.

    The Picus sensors' readings established that more than half of the tree is decayed.

    But the good news is that there is enough life in the yew to ensure survival if remedial work is carried out on the tree and the surrounding area, in order to allow more light to feed it....

    The yew was ancient even in the Bruce's time, when it found its place in turbulent history.

    The future king and his guerrilla force of 200 were running from two defeats, the first against the Earl of Pembroke at Methvenwood and then at Dalrigh by McDougall of Lorne, seeking vengeance for the death of the Red Comyn at the Bruce's hand.

    The Bruce's force had to escape by crossing Loch Lomond in a leaking boat capable of accommodating only three men at a time.

    The Bruce, Sir James Douglas, and an oarsman were first across, and it took 24 hours to ferry the remaining warriors. As each contingent arrived, the Bruce stood under the tree raising the spirits of his men with songs and jokes. He used the yew to symbolise their struggle, extolling its strength and ability to endure.

    Eight years later, the Bruce won independence at Bannockburn, and many of his fighters wore a depiction of the yew. The King's bowmen also used arrows harvested from yew trees on Loch Lomondside.

    Andrew Sullivan: Sodomy (posted 7-1-03)

    Andrew Sullivan, writing in the LA Times (June 27, 2003):

    Have you committed sodomy lately?

    You may be surprised to know that, in all likelihood, you have. Sodomy, after all, is not theoretically restricted to homosexuals. It's an act that can be engaged in by two people of the same or opposite sex.

    And as a legal matter, it has by no means been restricted to anal sex between two men (its most popular meaning). Sodomy statutes -- not unlike the one just struck down by the U.S. Supreme Court in Texas -- have long included a whole variety of sexual behaviors, specifically fellatio and cunnilingus, whether heterosexual or homosexual.

    Theologically, the definition is broader still. The natural-law tradition, which invented sodomy as a concept in the Middle Ages, defines it as any sexual activity outside reproductive heterosexual intercourse -- that is, masturbation, coitus interruptus, using contraceptives and even, in some texts, incorrect sexual positions. In fact, it's relatively hard to have anything we might call sex today -- including foreplay -- that doesn't have some sodomitic aspect to it.

    But then a modern understanding of sex in general has less and less relationship to the theories of celibate monks in the early Middle Ages. Given their narrow knowledge of sexual desire, experience and expression, this is hardly a surprise.

    Why was sodomy of such concern to the medieval theologians? Partly, it seems, because sodomy was widespread among the officially celibate clergy (some things never change), but mainly because in the early Middle Ages it was widely believed that sperm contained everything necessary to make a human being. The woman was a mere incubator of a new human being and added nothing to the process. So "wasting" semen was tantamount to killing off human life itself.

    Thomas Aquinas, the magisterial thinker who largely created the anti-sodomitic theological tradition, was particularly strict on this point. "After the sin of homicide whereby a human nature already in existence is destroyed," Aquinas wrote, "this type of sin appears to take next place, for by it the generation of human nature is precluded."

    Next to murder! One medieval church manual, uncovered by Emory University historian Mark Jordan, gave instructions on how to judge and absolve every sin known to man -- from pride to sloth to envy and gluttony. But the subcategory of sodomy amounted to 40% of the entire text. It took up more space than anger, sloth, envy, pride and gluttony combined.

    No wonder that when civil authorities adopted many ecclesiastical codes as law in the Renaissance and afterward, the death penalty was often prescribed for the sin of Sodom. No surprise either that early Americans -- often even stricter than their European peers -- swiftly made sodomy a capital offense in this land as well.

    Greenspan's Campaign to Stop Deflation Before It Gets Out of Hand (posted 7-1-03)

    Peter Hartcher, writing in the Australian Financial Review (June 28, 2003):

    What's the cause of deflation? And is it a real risk to the lives and livelihoods of the world population, or is it just some mumbo jumbo to keep economists busy?

    The cause is in hot dispute, but one person took a step back - several steps, actually - to look at the big picture. In 1996, the economic historian David Fischer published The Great Wave, an investigation of inflation over the past 800 years.

    He identified four great waves of inflation - he calls them price revolutions - in the past eight centuries, each running for about a century.

    And he drew the tentative conclusion that the fourth of these, the one that dominated the 20th century, was already crashing to a conclusion.

    Since his book was published, the evidence has continued to mount that he was right.

    So how real is the risk?

    Greenspan's short answer is that no one knows - and that is precisely what makes this such a dangerous new era.

    The great guru of the global economy would never admit publicly that he hasn't a clue, but he did concede as much in private to an old friend a couple of weeks ago.

    The Fed chairman invited the eminent US economist (and a friend of Greenspan's for 30 years) Allan Meltzer to his office to discuss the issue.

    After their conversation, Meltzer, a professor at Carnegie Mellon University and a former adviser to the US Treasury and the Fed, told Weekend AFR: "There are many people, including people at the Fed, who don't believe the deflation story, but they don't include the chairman. He feels strongly about it. You can't win an argument with him. You can try to influence his thinking - I made him think about his reasons."

    At the core of Greenspan's argument is uncertainty - that no one knows enough about the working of the modern global economy, not even Greenspan himself, to be sure that a dangerous deflation can be averted.

    His thinking seems to have been influenced by an important research paper by 13 Fed staff economists, entitled Preventing Deflation: Lessons from Japan's Experience in the 1990s. It contains a cautionary tale about complacency and intellectual arrogance.

    It says: "Japan's deflationary slump was not anticipated. This was true not only of the Japanese policymakers themselves, but also of private-sector and foreign observers, including Federal Reserve staff economists.

    "Moreover, financial markets had no better handle on the economy's prospects . . . The failure of economists and financial markets to forecast Japan's deflationary slump in the early 1990s poses a cautionary note for other policymakers in similar circumstances: deflation can be very difficult to predict in advance.

    "In consequence, as interest rates and inflation rates move closer to zero, monetary policy perhaps should respond . . . to the special downside risks in particular, the possibility of deflation."

    In short, the US economy may be in the process of stepping Through the Looking Glass, but you can't be sure about it until you wake up one day and find yourself there, in a strange new world where prices don't go up but down, and banks pay you to take their money.

    Robert Rubin's Record (posted 7-1-03)

    Economist Dean Baker, commenting on a June 22 article in the NYT about the new head of the Council of Economic Advisors (June 30, 2003):

    This article profiles Gregory Mankiw, the new head of President Bush's Council of Economic Advisors. At one point it contrasts President Bush's economic advisors with President Clinton's, commenting that none of President Bush's advisors enjoy the credibility that Robert Rubin did as Treasury Secretary. It would have been appropriate to note that Mr. Rubin's policies were associated with the stock bubble -- the collapse of which has led to the current period of recession and stagnation. They were also associated with the "strong dollar," which has led to a current account deficit that is presently running at a $550 billion annual pace, and has added several trillion dollars to the nation's foreign indebtedness. If none of President Bush's advisors has achieved the same credibility that Mr. Rubin enjoyed as Treasury Secretary, it is also the case that none has been associated with such large-scale economic failures.

    At one point, the article lists as one of Mr. Mankiw's mistaken predictions the assertion in the late eighties that inflation-adjusted housing prices would fall by 50 percent in the next two decades due to the retirement of the baby boom generation. It is not yet possible to know that this prediction is wrong. A person who predicted in 1993 that stocks would have below normal returns for the next decade would have appeared very wrong in 2000 -- but would subsequently have been proven correct. When two decades have elapsed it will be possible to know whether or not Mr. Mankiw's prediction on housing prices was correct.

    Middle East Studies: Oversight Needed (posted 7-1-03)

    Peter Wood, writing in frontpagemag.com (July 1, 2003):

    I am about to do something I never thought I would: call for more regulation of higher education. I’m confronted with a case for which I see no other remedy.

    On June 19, the House Subcommittee on Select Education held a hearing on whether Title VI of the Higher Education Act needs any fine-tuning. Title VI is the little corner of the vast federal budget where you can find the $86.2 million a year that the feds spend on helping academics to study international relations. It doles out funds for 118 “national resource centers” and, among other things, gives grants to graduate students who want to study languages and cultures in which the United States has some “security” interest—which is just about every language and culture.

    In the wake of 9/11, Congress got the idea that maybe we should increase funding to encourage scholars to study the Middle East and other potential hot spots. The trouble is that “Middle Eastern studies” is a hotbed of ideological disdain for America and American foreign policy. Not that anyone mistakes “area studies” as the place in the university where one would be likely to find deep historical knowledge or richly informed understanding of the world. It is a hotbed, for sure, but a hotbed of breezy formulations. Think of it as the Weber Grill of higher ed, holding a perpetual cookout of charcoal-broiled American interests.

    Congress was inclined to overlook this little foible and send over a few more bags of taxpayer dollars, but about a year ago Stanley Kurtz, writing in National Review Online, seized the issue. Kurtz was especially distressed by the boycotts that the African Studies Association, the Latin American Studies Association, and Middle East Studies Association had deployed against the program that aims to give graduate students David L. Boren Fellowships under the National Security Education Program (NSEP). But he was alarmed as well by the unseemly role the Middle East Studies Association (MESA) was playing in other aspects of publicly-funded scholarship.

    Kurtz traces the history of Middle Eastern studies in American universities from its emergence as a scholarly specialization in the 1940s to the point in the late 1970s where the field flung away its commitment to objectivity in favor of radicalism. Kurtz, writing in The Weekly Standard, observed:

    The way was cleared for [the radicals] to wrest power from the Middle East studies establishment when Edward Said’s Orientalism (1978) crystallized a new understanding of the field. The founding text of postcolonial studies, Orientalism effectively de-legitimated all previous scholarship on the Middle East by branding it as racist. Said drew no distinction between the most ignorant and bigoted remarks of nineteenth-century colonialists and the most accomplished pronouncements of contemporary Western scholars: All Western knowledge of the East was intrinsically tainted with imperialism.

    One of Kurtz’s targets is John Esposito, a professor of Islamic Studies at Holy Cross College and past-president of MESA. In Kurtz’s reading, Esposito adopted and adapted Said’s views to develop a program of Middle East studies that was indifferent to the growing threat of Islamic terrorism. In Esposito's account, Islamic fundamentalism was to be seen as an indigenous form of democracy. And it is thanks to Esposito and his like-minded colleagues that Middle East studies singularly failed to keep Osama bin Laden in focus.

    Kurtz also emphasizes the acute shortage in the American government of competent Arabic speakers. Thanks to that shortage, we were unable to translate enough of the intercepted transmissions before 9/11 to stop the terrorist outrage, and still today we are hampered by the scarcity of Arabic speakers. The domination of the field by people enamored with Edward Said's or John Esposito’s approach may have something to do with the dearth of students appropriately trained in the Arabic language.

    So much for background. What has my attention at the moment is what happened at that Committee hearing on June 19. Kurtz presented his argument about how some Title VI money was being used to fund organizations that actively thwart U.S. policy, as, for instance, in promoting boycotts of National Security scholarship programs. At one point in his testimony, Kurtz explained:

    For at least a decade, the African, Latin American, and Middle East Studies Associations have sponsored a boycott against NSEP. Since 1981, the directors of Title VI African National Resource Centers have agreed not to apply for, accept, or recommend to students any military or intelligence funding from the Defense Intelligence Agency, the NSEP, or any other such source. Shamefully, a mere two months after September 11, Title VI African Studies Center directors voted unanimously to sustain their boycott of military and intelligence-related funding, including the NSEP.

    Kurtz proposed a solution: add an oversight board.

    I endorse that additional layer of regulation that Kurtz calls for. But it isn’t an easy step for me. During my sixteen years as an academic administrator at Boston University, I spent a lot of time fighting excessive government regulation. In fact, I spent a lot of time as well fighting excessive non-government regulation, because higher education has a surfeit of both.

    The U.S. Forgot Liberia (posted 6-27-03)

    Kathryn Westcott, writing on the website of the BBC (June 26, 2003):

    The United States is coming under increasing international pressure to intervene in the fierce fighting in Liberia. The US has long historical and cultural ties with the West African nation but, over the years, it has become increasingly disengaged from it.

    In the early 1820s, hundreds of freed US slaves were sent to coastal Africa by anti-slavery societies.

    When, in 1847, they founded the continent's oldest republic, they gave it a constitution and a flag modelled after the country they had come from.

    But, despite its strong association with the US, Liberia does not have a conventional colonial history.

    It was never ruled from Washington in the same way as most other African countries were ruled by colonial powers - such as Zimbabwe from London and Ivory Coast from Paris.

    For most of the country's history, Americo-Liberians, descendants of the early black colonisers, ruled the country.

    While the country's name means 'Liberty' and its coat of arms reads "The love of liberty brought us here", the Americo-Liberians went on to enslave the nation's indigenous people.

    Liberia was for a long time economically and strategically important to the United States.

    In the early part of last century, the US relied on one of its natural resources, rubber, to compete with Britain in the rapidly growing automobile industry.

    This natural source of latex rubber was also vital to the allies during World War II.

    During the Cold War years, Liberia was viewed by the US as an ideal post from which to fight the spread of communism through Africa.

    A mutual defence pact was signed and the US established a massive air base and built communications facilities to handle intelligence traffic and relay a Voice of America signal throughout the continent.

    But, when the Cold War came to an end, US political interests faded.

    Now, America is coming under increasing pressure to turn its attention again to Liberia, particularly from the UK, which has suggested the US lead a military mission to the country.

    George F. Will: The Supreme Court's Mistaken Ruling on Gay Rights (posted 6-27-03)

    George Will, writing in the Washington Post (June 27, 2003):

    Given the Supreme Court's 6 to 3 ruling yesterday that Texas's anti-sodomy law violates the constitutional privacy right, lap dancing -- like prostitution, for that matter -- looks like a fundamental constitutional right. Consider the discontinuities in the evolution of that right, which the court first explicitly affirmed in 1965, more than 17 decades after the Constitution was ratified.

    In 1965 the court said a Connecticut law banning the sale and use of contraceptives violated a constitutional right of privacy. But the court connected this right to society's stake in an institution -- marriage, "an association that promotes a way of life." Marriage is grounded in nature, in the generation and rearing of children, a matter about which every society legislates.

    The privacy right is most famously associated with Roe v. Wade, the 1973 abortion decision. But the radicalism of that decision was in severing the privacy right from any relationship with any social institution. Rather, the court said in 1973 that the privacy right encompasses the individual's right of choice. In sexual conduct, the right to choose is the right to consensual activity.

    In the 1973 severing, the court said the privacy right involves "freedom from government domination in making the most intimate and personal decisions." Such as to choose to engage in sodomy. So the court contradicted its 1973 privacy right ruling when, in 1986, it voted 5 to 4 to affirm a Georgia law criminalizing consensual adult sodomy. And one justice in that majority, Lewis Powell, later said he regretted his vote.

    Yesterday the court held that Texas's law "furthers no legitimate state interest" that can justify abridging the privacy right to consensual adult homosexual activity. The logic of the ruling, which the court flinches from recognizing, is that no legitimate state interest is served by any law for the promotion of a majority's convictions about sexual morality.

    In the 1986 case, the court said it was being asked to "announce . . . a fundamental right to engage in homosexual sodomy. This we are quite unwilling to do." Yesterday the court seemed to think it still had not done so. It was mistaken.

    Today laws criminalizing homosexual sodomy are rare and rarely enforced. They should be repealed. In most states they have been, by democratic persuasion.

    But "unconstitutional" is not a synonym for "unjust" or "unwise," and the Constitution is not a scythe that judges are free to wield to cut down all laws they would vote to repeal as legislators. Legislators can adjust laws to their communities' changing moral sensibilities without creating, as courts do, principles, such as the broadly sweeping privacy right, that sweep away more than communities intend to discard.

    The Two George's: Orwell and Bush (posted 6-26-03)

    From the website of DemocracyNow.org (June 25, 2003):

    100 years ago today, author and journalist George Orwell was born. We’ll spend the hour hearing excerpts from his classic work 1984. The book introduced the terms "Big Brother," "thought police," "newspeak" and "doublethink." We'll also hear clips from President Bush, Attorney General John Ashcroft, Secretary of State Colin Powell, Fox News' Bill O'Reilly, Defense Secretary Donald Rumsfeld, Sen. Robert Byrd and broadcast footage of Donald Rumsfeld meeting with Saddam Hussein in 1983.

    But we begin at the Pentagon, yesterday. Defense Secretary Donald Rumsfeld told reporters: “I don't know anybody that I can think of who has contended that the Iraqis had nuclear weapons… I don't know anybody in any government or any intelligence agency who suggested that the Iraqis had nuclear weapons. That's fact number one.”

    Well, we thought of someone who suggested otherwise: his boss, President Bush:

    “The evidence indicates that Iraq is reconstituting its nuclear weapons program. Saddam Hussein has held numerous meetings with Iraqi nuclear scientists, a group he calls his ‘nuclear mujahideen’ -- his nuclear holy warriors… Facing clear evidence of peril, we cannot wait for the final proof -- the smoking gun -- that could come in the form of a mushroom cloud.”

    That was Bush speaking last October in Cincinnati days before Congress voted to give him the authority to wage a preemptive attack.

    Now we will turn to another George: George Orwell, who wrote about the rewriting of history.

    He was born Eric Arthur Blair in India on June 25, 1903. He moved to England in 1907 and would eventually become one of the country’s most heralded writers. It was not until 1933 that he took the name George Orwell. He fought fascism in the Spanish civil war. During the early 1940s he worked as a journalist and editor for the BBC, the Observer and the Manchester Evening News. He published Animal Farm in 1945 and 1984 four years later. He died on January 21, 1950, at the age of 46.

    Orwell’s most famous book, 1984, is a warning about a futuristic totalitarian government that controls the public by spreading propaganda, monitoring citizens, changing language and rewriting history. In 1984, Oceania is in perpetual war. The enemy may regularly change, but the state is always at war. And there seems to be no end.

    JFK's Speech at American University Is as Relevant Now as in 1963 (posted 6-26-03)

    Editorial published by the Pittsburgh Post-Gazette (June 25, 2003):

    Four decades ago this spring, in the last six months of his life, John F. Kennedy delivered a speech that was little remarked upon at the time and is hardly remembered now. It did not possess the soaring sentiments of his Inaugural Address, when he accepted the torch of leadership from an earlier generation, nor the drama of his speech in Berlin, when in the shadow of the Wall he identified himself with the forces of freedom.

    And yet there was something about Kennedy's American University speech, delivered 40 Junes ago, that lingered in the minds of historians. Indeed, in his remarkable new biography of the 35th president, "An Unfinished Life," Robert Dallek describes the speech as "one of the great state papers of any 20th-century American presidency." And in the definitive American biography of Nikita S. Khrushchev, published only months ago, William Taubman quotes the Soviet leader calling his rival's address "the best speech by any president since Roosevelt."

    This June, so different from that springtime so long ago, is a good time to re-examine the Kennedy speech that would not be forgotten, the one that the president rushed into delivering so as to have it on the record before a critical Sino-Soviet summit in July 1963 -- the speech that, in an extraordinary break from custom, the Communist government actually allowed to be published in translation for distribution in the Soviet Union. Here are some excerpts, annotated for our time:

    "What kind of peace do we seek? Not a Pax Americana enforced on the world by American weapons of war. Not the peace of the grave or the security of the slave."

    The Kennedy speech was an examination of the nature of peace, but it also stands as a remarkable period piece. At that time, not even a year after the Cuban Missile Crisis, the notion of a Pax Americana was inconceivable. But then, as now, when talk of a Pax Americana is not unknown, worries about weapons of mass destruction -- a threat known to Kennedy but a phrase unfamiliar to him -- dominated the worries of Washington. And then, as now, freedom was the dominant ideology, and the dominating rhetoric, of American politics.

    "I speak of peace . . . as the necessary rational end of rational men. I realize that the pursuit of peace is not as dramatic as the pursuit of war -- and frequently the words of the pursuer fall on deaf ears. But we have no more urgent task."

    Just this spring the historian James MacGregor Burns, the author of a Kennedy biography prepared before the 1960 election, wrote a book arguing in part that few leaders achieve greatness without taking their people into war. In this excerpt, Kennedy elevates the challenge posed by peace to that posed by war. A note on an ironic antiquarianism: The use of the word "men" as a shorthand for "humankind" was common throughout the decade that launched the contemporary women's movement.

    "Too many of us think it is impossible. Too many think it unreal. But that is dangerous, defeatist belief. It leads to the conclusion that war is inevitable -- that mankind is doomed -- that we are gripped by forces we cannot control. We need not accept that view. Our problems are man-made -- therefore, they can be solved by man. And man can be as big as he wants."

    India and China: Friends Now? (posted 6-26-03)

    Robert Marquand, writing in the Christian Science Monitor (June 26, 2003):

    In the post-Sept. 11 era, as the US moves toward more fluid coalitions of interest, India and China may be following suit by pursuing better ties with Washington and with each other. As Chinese leader Hu Jintao stated in a meeting with Prime Minister Vajpayee, "history will show we are partners, not rivals."

    Or, as a Western scholar in Beijing puts it, "There is enough maturity in both India and China to contemplate a relationship based on Asian proximity and Asian solidarity, something often talked about, but never realized."

    That sounds good on paper. Trade between the two nations has increased from several hundred million dollars in the 1990s to $5 billion today, according to Chinese Foreign Ministry figures. Direct flights between Delhi and Beijing have been under way for more than a year.

    There remains, however, significant anti-China feeling in Delhi. When the current Hindu nationalist government came to power in 1998, its stated rationale for testing a nuclear device only days later - an act that forced Pakistan to test - was a perceived threat from China. The security brain trust in Delhi has viewed China as a regional danger, and its intelligentsia have chafed at a perceived cultural dismissiveness of India by the Chinese.

    Brief mid-century bright spot

    Both countries emerged from occupation and a colonial past in the mid-20th century. There was a brief halcyon period in the late 1950s when Indian Prime Minister Jawaharlal Nehru thought the two nations would link arms, redefine Asia, and create a developing-world socialist paradise. When Chinese premier Zhou Enlai visited India in 1956, Indians thronged the streets shouting "Hindi-Chini Bhai Bhai!" - India and China are friends.

    But in 1962, China invaded border areas along India's northern frontier. The Indian Army was humiliated, and some historians say Nehru's sense of betrayal and disgrace was total. He died shortly after, and relations went into a deep chill for years as China emerged as a champion of rival Pakistan - and Nehru's daughter Indira Gandhi became a force in Indian politics, and a friend of Moscow.

    This month, Chinese officials briefing reporters carefully avoided trying to account for the 1962 border debacle - attributing it to a grandly obfuscatory category of "history" and "colonialism." Indian officials are now describing 1962 as just a "clash."

    But given decades of enmity, do this week's expressions of cooperation represent more than diplomatic boilerplate?

    Maybe. Both countries are more confident, more integrated in global markets, and free from cold-war ties, experts note. Both are nuclear powers, and India is pushing to be a permanent member of the UN Security Council. Children of the Brahmin elite in India and of the party and business elite in China study at US colleges, own laptops, and have probably seen "Matrix Reloaded."

    It is too early to tell whether Vajpayee's visit suggests the origins of a China-Russia-India "counterbalance" to the US, as some observers say, or simply a construction of useful ties that support mutual interests. The idea of a "strategic triangle" was long a dream of such Indian statesmen as I.K. Gujral, who saw it as a counterweight to NATO. But efforts to build such ties during the Kosovo campaign, in particular, failed.

    In any event, both Delhi and Beijing are establishing their own relations with Washington. Former Chinese leader Jiang Zemin's trip to Crawford, Tex., last fall made this clear. Indian leaders regularly tread the hallways of power in Washington.

    Immigrants Who Join the US Military (posted 6-23-03)

    Vanessa Hua, writing in the San Francisco Chronicle (June 22, 2003):

    Almost 60,000 immigrants serve on active duty in the U.S. armed forces, accounting for 5 percent of enlisted personnel. More than 20,000 are naturalized, and 37,000 are noncitizens who are legal permanent residents.

    Of the noncitizen soldiers, 32 percent are Hispanic, 36 percent are Asian or other, 10 percent white, and 21 percent black, according to the Department of Defense. The Pentagon does not track such information for naturalized immigrants....

    In July, President Bush signed an executive order granting immediate consideration of citizenship for noncitizens in the U.S. military on active duty since Sept. 11.

    Previously, noncitizen members of the military in peacetime could apply for citizenship after three years of service. By comparison, civilian legal residents can apply for citizenship after five years, while those married to U.S. citizens can apply after three years.

    This order only applies while the United States is engaged in the war on terror. However, a bill now moving through the Senate would permanently cut the three-year wait period to one year, waive application fees, make the process accessible to those serving overseas, and clear up technical barriers faced by non-citizen relatives of soldiers killed in the line of duty. The House approved a similar bill earlier this month.

    ProjectUSA, an anti-immigrant group based in Washington, D.C., questions the practice of awarding naturalization to noncitizens in the Armed Forces. The noncitizen involvement is a sign that the nation is "overextending itself," the group argues.

    In Iraq, Arab immigrants have used their cultural and language skills to assist the U.S. military.

    Soldiers hailing from impoverished countries may also bring a personal understanding of what civilians are going through, observed Jan Scruggs, founder of the Vietnam Veterans Memorial.

    The participation of immigrants in the military has helped accelerate their integration into American society, historians say.

    In World War I, the foreign-born accounted for 800,000, or 20 percent, of the U.S. armed forces -- many of them draftees drawn from the waves of newcomers from Italy and Eastern Europe.

    The rapid, large-scale World War I buildup transformed the small, largely Anglo Protestant U.S. military. Leaders brought in English teachers, built interdenominational chapels and reached out to immigrants in other ways, said Chris Sterba, author of "Good Americans: Italian and Jewish Immigrants During the First World War."

    Upon their return, veterans could point to the sacrifices they made for the country, he said.

    Then came a backlash. Native-born Americans felt they were losing ground to foreigners, that U.S. institutions were at risk, Sterba added. Lawmakers curtailed immigration, and anti-Semitic and racist groups rose to power.

    Two decades later, during World War II, the government tapped citizens of the Philippines -- then under U.S. rule -- to enlist, promising them American citizenship and veterans' benefits.

    A 1946 law reversed the wartime pledge, with some in Congress arguing that Filipinos should help pay for the islands' liberation from the Japanese.

    While a 1990 act granted U.S. citizenship to Filipino veterans, many still lobbied for veterans' benefits. About 3,200 are in the Bay Area, living on Supplemental Security Income.

    Luciano Simangdan served as a guerrilla messenger for the U.S. military in World War II, spying on the Japanese. He became a U.S. citizen in 1992 and moved to San Francisco alone. His nine children live in the Philippines.

    "We're fighting for equality," said Simangdan, 75, who gathered with other veterans in April to commemorate the 1942 fall of Bataan in the Philippines.

    The Looting of Iraq: No Myth (posted 6-21-03)

    Eleanor Robson, a council member of the British School of Archaeology in Iraq and a fellow of All Souls College, Oxford, writing in the Guardian (June 18, 2003):

    What is the true extent of the losses to the Iraq Museum - 170,000 objects or only 33? The arguments have raged these past two weeks as accusations of corruption, incompetence and cover-ups have flown around. Most notably, Dan Cruickshank's BBC film Raiders of the Lost Art insinuated that the staff had grossly misled the military and the press over the extent of the losses, been involved with the looting themselves, allowed the museum to be used as a military position, and had perhaps even harboured Saddam Hussein. The truth is less colourful.

    Two months ago, I compared the demolition of Iraq's cultural heritage with the Mongol sacking of Baghdad in 1258, and the 5th-century destruction of the library of Alexandria. On reflection, that wasn't a bad assessment of the present state of Iraq's cultural infrastructure. Millions of books have been burned, thousands of manuscripts and archaeological artefacts stolen or destroyed, ancient cities ransacked, universities trashed.

    At the beginning of this year, the staff, led by Dr Dony George and Dr Nawala al-Mutawalli, began to pack up the museum in a well-established routine first devised during the Iran-Iraq war. Defensive bunkers were dug in the grounds. Early in April, Dr John Curtis, head of the Ancient Near East department at the British Museum, described a recent visit to Baghdad during which the museum staff were sandbagging objects too big to be moved, packing away smaller exhibits, and debating "the possibility of using bank vaults and bunkers if the worst came".

    The worst did come. On April 11 the news arrived that the museum had been looted. We later discovered that there had been a two-day gun battle, at the start of which the remaining museum staff fled for their lives. Fedayeen broke into a storeroom and set up a machine gun at a window.

    While senior Iraqi officials were begging for help in Baghdad, the US Civil Affairs Brigade in Kuwait was also trying from April 12 to get the museum protected. They already knew that its most valuable holdings were in vaults of the recently bombed Central Bank. The museum was secured on April 16, but it took until April 21 for Civil Affairs to arrive.

    Captain William Sumner wrote to me that day: "It seems that most of the museum's artefacts had been moved to other locations, but the ones that were looted were 'staged' at an area so that they would be easier to access. It was a very professional action. The spare looting you saw on the news were the excess people who came in to pick over what was left." In other words, there was no cover-up: the military were informed immediately that the evacuation procedures had been effective. Suspicions remained that a single staff member may have assisted the core looters. But, Sumner says: "It might have been one of the grounds people, or anybody. I suspect that we will never know."

    Within a week the museum was secure enough for George to travel to London. At a press conference he circulated a list of some 25 smashed and stolen objects which the curators had been unable to move from the public galleries before the war. They included the now famous Warka vase, which had been cemented in place. Last week it was returned in pieces. Other losses came from the corridor where objects were waiting to be moved off-site. George was understandably reluctant to reveal the location of the off-site storage to the Civil Affairs Brigade as security was still non-existent.

    Does the Senate Have a 50-50 Role in the Selection of Court Nominees? (posted 6-20-03)

    David G. Savage, writing in the LA Times (June 19, 2003):

    The White House gave a "thanks, but no thanks" reply Wednesday to an offer from Senate Democrats to consult with the president before he nominates any justices to the Supreme Court.

    President Bush's top legal advisor left no doubt that the choice -- when or if there is one -- will be the president's alone.

    "If a Supreme Court vacancy arises during his presidency, President Bush will nominate an individual of high integrity, intellect and experience," White House Counsel Alberto R. Gonzales said in a letter to Senate Democrats.

    Then "the Senate will have an opportunity to assess the president's nominee and ... to vote up or down," he added.

    White House Press Secretary Ari Fleischer also dismissed the Democrats' offer as a "novel new approach" to choosing Supreme Court justices.

    In the last week, several Democrats have written Bush to say that he could avoid a battle over the Supreme Court by talking with them about a consensus nominee.

    "I stand ready to work with you to help select a nominee or nominees to the Supreme Court," Sen. Patrick J. Leahy (D-Vt.), the ranking Democrat on the Judiciary Committee, said in a June 11 letter to Bush.

    He noted that Senate Judiciary Committee Chairman Orrin G. Hatch (R-Utah) has taken credit for advising President Clinton to select Ruth Bader Ginsburg and Stephen G. Breyer for the high court.

    "Meaningful bipartisan consultation in advance of any Supreme Court nomination" would prevent a "divisive confirmation fight," said Senate Minority Leader Tom Daschle (D-S.D.) on Tuesday.

    Sen. Charles E. Schumer (D-N.Y.), another Judiciary Committee member, offered Bush a few possible nominees, including Sen. Arlen Specter (R-Pa.).

    Republican leaders and conservative scholars say they are taken aback by the Democrats' claim to have a role in the nomination process.

    "I am astounded by those letters. Does Charles Schumer think he is the president?" asked law professor John Eastman.

    A former clerk to Supreme Court Justice Clarence Thomas, Eastman teaches at Chapman University Law School in Orange and recently advised Senate Republicans on the constitutionality of filibusters. "The president has the sole power to nominate, and only then does the Senate give its advice and consent," Eastman said.

    Sen. John Cornyn (R-Texas), the newest Republican on the Judiciary Committee, also urged Bush to ignore the Democrats. "Few things would politicize our judiciary more than to hand over control of the process for selecting Supreme Court justices to individual members of the Senate," Cornyn said. "Presidents, not politicians, nominate justices."

    The debate about the Senate's role in confirming judges is an old one, and it tends to flare up when Supreme Court seats are at stake. At the moment, there is speculation that one or more of the justices will retire this month at the end of the court's current term.

    The Constitution says the president "shall nominate, and by and with the advice and consent of the Senate, shall appoint ... judges of the Supreme Court."

    On Wednesday, Sen. Barbara Boxer (D-Calif.) cited this "advice and consent" clause in describing judicial appointments as "a 50-50 deal."

    She added: "The president, in this process, is not more important than the Senate, and the Senate's not more important. They have to work together."

    Boxer and other Democrats base their view on historians who say that early drafts of the Constitution gave the Senate the power to appoint officials. The final version of the Constitution, though, made it clear that the power to nominate judges and other officials rests with the president.

    Liberal and conservative activists are gearing up for an all-out battle, and the Senate Judiciary Committee has cleared its calendar for possible hearings this summer.

    But none of the justices has hinted at retirement; instead, they have spoken of their plans for the fall session.

    9-11: The Movie (Really) (posted 6-20-03)

    Paul Farhi, writing in the Washington Post (June 19, 2003):

    In the hours after the terrorist attacks of Sept. 11, a bold, forceful President Bush orders Air Force One to return to Washington over the objections of his Secret Service detail, telling them: "If some tinhorn terrorist wants me, tell him to come and get me! I'll be at home, waiting for the bastard!"

    Well, the president didn't actually speak those words. But it's close enough for the Hollywood version of events. In a forthcoming docudrama for the Showtime cable network, an actor playing the president spits out those lines to his fretful underlings in a key scene.

    The made-for-TV film, "D.C. 9/11," is the first to attempt to re-create the events that swirled around the White House in the hours and days immediately after the strikes on the Pentagon and the World Trade Center.

    The quintessentially American story was shot primarily in Toronto, where drafts of the movie's dialogue were leaked to the Globe and Mail newspaper.

    Sources here confirmed the generally heroic portrayal of the president and his aides, including the dramatic scene in which Bush is hopscotching the country in Air Force One as a security precaution. When a Secret Service agent questions the order to fly back to Washington by saying, "But Mr. President -- , " Bush replies firmly, "Try 'Commander in Chief.' Whose present command is: Take the president home!"

    The two-hour film, to air around the second anniversary of Sept. 11, 2001, stars Timothy Bottoms as Bush, reprising a role Bottoms played for laughs on the short-lived Comedy Central series "That's My Bush!," which went off the air a week before the Sept. 11 attacks. Many of the movie's secondary roles, such as Vice President Cheney and Secretary of State Colin Powell, are played by obscure New York and Canadian actors. Among the familiar faces in the cast are Penny Johnson Jerald (she plays the president's ex-wife on the Fox series "24"), who appears as national security adviser Condoleezza Rice, and George Takei (Sulu on the original "Star Trek" series), who plays Transportation Secretary Norman Mineta. The movie's veteran director, Daniel Petrie, made such films as "Eleanor and Franklin," "Sybil" and "A Raisin in the Sun."

    The writer-producer of "D.C. 9/11," Lionel Chetwynd, declined to discuss specific scenes or dialogue in the film. But he defended its general accuracy, saying: "Everything in the movie is [based on] two or three sources. I'm not reinventing the wheel here. . . . I don't think it's possible to do a revision of this particular bit of history. Every scholar who has looked at this has come to the same place that this film does. There's nothing here that Bob Woodward would disagree with." Woodward, a Washington Post assistant managing editor, is the author of "Bush at War," a best-selling account of the aftermath of Sept. 11.

    Chetwynd said his approach to the post-Sept. 11 story was similar to that of a 1974 TV movie, "The Missiles of October," a dramatization of the showdown between President Kennedy and Soviet Premier Nikita Khrushchev over Soviet missile emplacements in Cuba in 1962. "This is about how George Bush and his team came to terms with the reality around them and led the country in a new direction," he said.

    Wesley Clark for President (posted 6-19-03)

    James Taranto, writing in the Wall Street Journal's "Best of the Web" (June 18, 2003):

    There's a move afoot to draft retired general Wesley Clark, who oversaw the liberation of Kosovo as NATO's supreme allied commander, as the Democratic presidential nominee. "The Democratic Party must lead on issues of national security to win the Presidency," declares the Web site DraftClark.com. "We must nominate a candidate with the ability and ideas needed to resolve these problems and renew the peace."

    But shouldn't a president at least know something about his country's history? Clark appeared on NBC's "Meet the Press" Sunday and made this astonishing statement by way of explaining why he opposes President Bush's tax cuts:

    I thought this country was founded on a principle of progressive taxation. In other words, it's not only that the more you make, the more you give, but proportionately more because when you don't have very much money, you need to spend it on the necessities of life.

    As radio host Neal Boortz points out, the notion that the nation "was founded on a principle of progressive taxation" is staggeringly ignorant. In a brief history on its Web site, the Century Foundation points out that there was no income tax at all for 118 years after America's founding, except for a temporary levy during the Civil War. Congress enacted a highly progressive income tax in 1894, but the Supreme Court declared it unconstitutional the following year--which would have been a neat trick if progressive taxation had been one of the nation's founding principles. The income tax didn't return until October 1913, when Woodrow Wilson signed it into law. This was possible only because of a constitutional amendment--the 16th, ratified six months earlier.

    America does have a history of electing generals to the presidency; 12 of the 42 men who've served as president held the rank of general in either the Army or a state militia. But in recent times general-presidents have been the exception rather than the rule. The last general to serve before Dwight Eisenhower was Benjamin Harrison (1889-93).

    Do generals make good presidents? The record is decidedly mixed. The 10 generals-turned-presidents ranked in the Federalist Society/Wall Street Journal survey of scholars (William Henry Harrison and James Garfield are excluded, since both died soon after taking office) include the first and greatest president, George Washington, and two "near greats," Ike and Andrew Jackson.

    Of the remaining seven presidents in this group, all fall below the median ranking (and they don't even have the excuse of having gone to school in Sacramento, where half the kids are below the 50th percentile). The highest-rated of the seven is Rutherford Hayes, at No. 22 of 39. Only Hayes and Chester Arthur make the "average" category; Benjamin Harrison, Zachary Taylor and Ulysses Grant are "below average," while Andrew Johnson and Franklin Pierce rank as "failures."

    Another interesting fact: Of the generals who became president, all but four did so after holding some lower elective office. Of those four, three--Washington, Grant and Eisenhower--commanded the troops in the most consequential wars of American history. Liberating Kosovo is nothing to sneeze at, but it's just not up there with winning independence, preserving the Union or defeating the Nazis. At the risk of sounding cruel, the Draft Clark crowd's aim seems to be to give us another Zachary Taylor.

    They Lied in 1898 and 1964, Too (posted 6-18-03)

    Kennan Ferguson, author of The Politics of Judgment, writing in Newsday (June 17, 2003):

    Americans have historical experience with specious justifications for warfare. The past century of American history provides two notably similar instances, parallel both for impelling a skeptical American population toward war and for their ultimate misrepresentation.

    The first, the 1898 sinking of the battleship USS Maine, triggered the Spanish-American War. Most historians have concluded that the incident in Havana harbor was not caused by the Spanish, but was the result of an internal explosion. Nonetheless, it led not just to war but also to the U.S. occupation of Cuba, Puerto Rico, Guam and the Philippines.

    The second, the 1964 Gulf of Tonkin incident, occurred when President Lyndon B. Johnson's administration fabricated a torpedo attack on U.S. warships off North Vietnam in order to justify a massive escalation in the Vietnam War. In the latter case, unlike the Maine, the American government could ascertain that the "enemy" had done no such thing.

    These examples do not precisely match the case of Iraq, of course, but both shed light on the claims of both opponents and champions of Bush's war. In each case, U.S. power expanded in the war's wake, even as doubts arose about American integrity. The United States emerged as an imperial power at the beginning of the 20th century, and Vietnam clearly showed the world that the United States stood ready to go to war to protect its interests. Our government's chicanery may even have strengthened its place in the world, as it became less predictable and more dangerous.

    Truth, then, has little to do with power. In both of these cases, the falsity of the claim did not lessen U.S. power in the world. Each time, strong domestic pressures also supported the deception. The fires of the Spanish-American War were fed by William Randolph Hearst's chain of newspapers, whose bias and lack of accuracy make today's Fox News look like The Nation. Johnson had the support of a Democratic Party eager to be "tough on communism" and a mass media all too willing to accept whatever the president declared.

    Like today, the aftermath of the falsehoods not only reinforced the nation's power abroad, but also that of the presidency. Those on the left should not be surprised that Bush continues to be taken seriously by Arab governments and by most Americans.

    But the effects of fabrications are not limited to how others are influenced; if they were, deception would only be morally wrong if uncovered. Philosophers from Socrates to Kant have argued that lying is itself morally damaging, that those who lie are themselves harmed by the act. If true, then the results of duplicity are measured not by influence in the world, but by its effects on the soul.

    Again, these two historical examples illuminate this point. Vietnam not only destroyed Johnson's legacy but also profoundly damaged the nation, turning Americans against their institutions and one another. Our seizure of the Spanish empire became a moral failure; the occupation ended up killing more than half a million Filipinos over the years, and we managed our relationship with Cuba so badly that many Cubans eventually welcomed the rule of Fidel Castro.

    In both cases, the worst aspect was that those who lied were affected by their own fictions, justifying their deceptions in the name of something bigger. America ostensibly started these wars to free people from oppression, but the falsity of the reasons for the wars resulted in conflicts far less moral than those waged for honest reasons.

    Truth Is the First Casualty (posted 6-18-03)

    Michael Getler, writing in the Washington Post (June 15, 2003):

    In 1917, with World War I raging, Sen. Hiram W. Johnson of California observed: "The first casualty when war comes is truth." In 1975, a book by British journalist Phillip Knightley titled "The First Casualty" advanced that theme by examining wartime reporting by correspondents from the Crimean War in the 1850s through Vietnam in the 1960s and '70s. Knightley's book, his publisher noted, "suggests that our attitudes to history are molded by what we read in wartime and that what we read too often bears little resemblance to reality."

    Knightley's volume was quite popular, and it is likely that other journalists or historians soon will probe at length into whether truth was a casualty of this year's U.S.-and-British-led war against Iraq. Only this time, the focus needs to be at home rather than abroad, and on the government as well as the press.

    Were flawed or exaggerated intelligence estimates of Iraqi weapons of mass destruction, of the immediacy of the threat posed to this country by Iraq, and of the links between Iraq and the events of 9/11 used to help propel the United States into war? However popular and beneficial it may be to have removed Saddam Hussein from power, the question of whether the administration took the country to war on questionable premises is central to U.S. credibility and government.

    A fair number of readers are agitated about these questions, but finding authoritative answers won't be easy. Congress, which might be expected to produce a public record of investigative hearings and accountability, has not done much so far. Many newspapers and newsmagazines are struggling to get at these questions. But it is a murky, sensitive and highly classified field of inquiry, and one that is vulnerable to being politicized. It is an area in which some of the people who know some of the things and are willing to talk to reporters are unwilling to speak on the record. Although reporters may trust these sources and may have checked their information with other sources, quoting people anonymously still erodes the confidence of some readers and gives administration spokesmen an edge in the battle for making a case and molding attitudes.

    Europe Shouldn't Run from Its Christian Origins (posted 6-18-03)

    Kenneth L. Woodward, contributing editor at Newsweek, writing in the NYT (June 14, 2003):

    Next week, leaders of the European Union will meet in Greece to vote on a proposed constitution that will govern the lives of 450 million Europeans. The most agitated debate at the convention that produced the draft focused on the preamble, specifically whether God in general, and Christianity in particular, ought to be mentioned among the sources of the "values" that produced a common European culture and heritage.

    Though the Vatican did not have a representative at the convention in Brussels, Pope John Paul II has been the most outspoken of the European churchmen who have argued that Christianity should be listed among the inspirational sources that have shaped European culture. That's no surprise, since the pope has long insisted that Christianity is the cultural link between the people of Western and Eastern Europe. Ten East European countries, including Catholic Poland, are expected to join the union next year. Opponents have argued that a reference to God belies the constitution's secular purpose, and that a specific reference to Christianity would alienate Western Europe's 15 million Muslim immigrants -- not to mention Muslim Turkey, which is eager to join in the union's eastward expansion.

    For the moment, the secularists have won. In the draft that the convention approved yesterday, the preamble refers abstractly to "the cultural, religious and humanist inheritance of Europe." That seemed awfully vague to me as I sipped brandy one recent night in the Piazza San Marco after hours of communing with the mosaics of the Christian saints inside Venice's majestic Basilica of St. Mark the Evangelist. Indeed, it seems as if one cannot find a Venetian public square that does not also have a church, many of them decorated with frescos by masters like Titian, Tintoretto and Tiepolo. The next evening, in the perfect acoustics of the Church of San Samuele, I listened to two young vocalists sing arias from "Tosca," "La Boheme" and "La Traviata" under the serene gaze of a "Madonna and Child." No one can visit Italy, or the medieval core of any European city, without encountering evidence of the Christian humanism that gives Europe its enduring cultural identity and -- even now -- its particular glow. Who goes to Brussels except on business?

    "At the center of culture is cult," observed Christopher Dawson, the great historian of medieval Europe. And for more than a millennium, the cult or "worship" of Europeans was manifestly Christian. On that basis alone, Christianity has an unrivaled claim to a privileged place among the sources of European culture.

    Did the West Succeed By Exploiting Others? (posted 6-17-03)

    Dinesh D'Souza, writing in the Washington Times (June 17, 2003):

    The idea that America and the West grew rich through oppression and exploitation is strongly held among many intellectuals and activists. In the West, the exploitation thesis is invoked, by Jesse Jackson and others, to demand the payment of hundreds of billions of dollars in reparations for slavery and colonialism to African-Americans and natives of the Third World. Islamic extremists like Osama bin Laden insist the Muslim world is poor because the West is rich, and they use Western oppression as their pretext for unleashing violence, in the form of terrorism, against American civilians.

    Did the West enrich itself at the expense of minorities and the Third World through its distinctive crimes of slavery and colonialism? This thesis is hard to sustain, because there is nothing distinctively Western about slavery or colonialism. The West had its empires, but so did the Persians, the Mongols, the Chinese, and the Turks. The British ruled my native country of India for a couple of hundred years. But before the British came, India was invaded and occupied by the Persians, the Mongols, the Turks, the Afghans, and the Arabs. England was the seventh or eighth colonial power to establish itself on Indian soil.

    If colonialism is not a Western institution, neither is slavery. Slavery has existed in every known civilization. The Chinese had slavery, and so did ancient India. Slavery was common all over Africa, and American Indians had slavery long before Columbus arrived on this continent.

    What is uniquely Western is not slavery but the movement to abolish slavery. There is no history of anti-slavery activism outside of Western civilization. Of course in every society, slaves have strongly resisted being slaves. Runaways and slave revolts occurred frequently in all slave cultures. But only in the West did a movement arise, not of slaves, but of potential slave-owners, to oppose slavery in principle.

    "Mencken Made Up Stuff, Too" (posted 6-17-03)

    Jack Shafer, writing in Slate (June 12, 2003):

    Jayson Blair, Stephen Glass, and Christopher Newton all fabricated details—mundane and spectacular—in their journalism. But why? Reaching for the simplest explanation, I previously wrote that fabulists make stuff up because they don't have the talent or industry to produce copy grand enough to satisfy their egos.

    But if we agree that hacks and loafers resort to lies because they don't know how else to make great journalism, what can we say about reporters from the Pantheon who marbled their journalism with fiction? I'm thinking of H.L. Mencken, A.J. Liebling, and Joseph Mitchell, all of whom made stuff up. None of them suffered much in the way of reputation injury when their inventions were discovered. What sort of double standard is this?

    The most egregious prevaricator was Mencken. In the second volume of his memoirs, Newspaper Days, Mencken gleefully confesses to concocting stories at the turn of the century while working as a young city reporter for the Baltimore Herald. When the Herald promoted him to the City Hall beat, Mencken and the American's reporter asked the Sun's City Hall guy if he would like to pool his reporting with them in the name of efficiency. (Mencken had just left a similar arrangement on his previous beat.) When the Sun reporter resisted, Mencken and his pal on the American planted fake stories in their papers "with refinements of detail that coincided perfectly, so all the city editors in town … accepted it as gospel." Mencken and his pal steadily escalated "from one fake a day to two, and then to three, four, and even more," making the Sun editors think the American and Herald were consistently beating their reporter. Finally, the Sun reporter relented and joined the pool.

    Elsewhere in Newspaper Days, Mencken brags of publishing a Page One story in the Herald in 1905 about the outcome of a naval battle between Japan and Russia—two weeks before the authentic results were known. Luckily for Mencken, he correctly imagined Tokyo the winner. He also brags of publishing weekly stories about a Baltimore "wild man" he invented.

    Some may excuse Mencken, arguing that journalistic standards were less rigorous back then. Or they might say most of Mencken's crimes against truth were inconsequential practical jokes. But by 1925, newspaper standards were sufficiently strict that the New York Times fired the 21-year-old A.J. Liebling from its copy desk for pranking a Times reporter. According to Raymond Sokolov's Wayward Reporter: The Life of A.J. Liebling, the young Liebling, working on the copy desk, changed another reporter's byline, swapping the middle name "Patrick" for the more officious-sounding "Parnell."...

    Telling the truth is a learned skill, and the best time to teach it is when the subject is young—either in school or on the first day on the job. It's no accident that so many of the recent journalistic prevaricators—Glass, Blair, Jones, Finkel, Forman—were young when apprehended. As my friend Jonathan Chait points out, journalism unmasks most fact-futzers when they're young, leaving few to age into old journalists.

    Arabs and Jews: A Visit to Auschwitz (posted 6-17-03)

    Yossi Klein Halevi, a contributing editor at the New Republic, writing in TNR (June 11, 2003):

    As we emerge from the underground crematoriums, Ali, a leader of the Arab Israeli Scouts movement, links his arm in mine. "Does it make it easier or harder to deal with the past by coming here?" he asks. For a Jew to be comforted by an Arab in Auschwitz is so counterintuitive that all my accumulated rage, from the Holocaust to the intifada, yields to grief.

    We are participants in the first pilgrimage by Arab and Jewish Israelis to Auschwitz. There are 260 of us, almost evenly divided between Jews and Arabs, joined by 200 Muslims, Christians, and Jews from France. The Arab Israelis include professionals for whom coexistence with Jews is routine. One Arab Israeli doctor who heads an emergency ward tells me, "I haven't missed a terrorist attack." The Jews include the rabbi of a West Bank settlement, who says that, when your neighbor tries to feel your pain, you must respond, and a former Knesset member from the ultra-dovish Meretz Party, which expelled him for seeking common ground with rightists. Our starting point is political despair. We are entering the abyss together, hoping to emerge in some way transformed. It's no coincidence that this pilgrimage was conceived by a priest, Emile Shoufani, an Arab Catholic from Nazareth, whom we all call "Abuna," Arabic for father. Abuna speaks grandly of creating "a new human being." Fostering dialogue between Israeli Arabs and Jews seems ambitious enough.

    I didn't want to come. For years, I've avoided the Holocaust, convinced that, as a survivor's son who had experienced militant Zionism in my teens, group therapy in my twenties, immigration to Israel in my thirties, and spiritual journeys in my forties, I had exhausted my ability to find some way to respond to the event that formed me. And I've long resisted linking the Holocaust with Israel's predicament. After all, Jews hardly need Auschwitz to feel vulnerable today.

    Finally, I worried about Arab expectations of a false reciprocity: Your Holocaust for our nakba ("catastrophe"), the term Arabs use for the creation of Israel. But Abuna set a precondition for Arab participants: No comparisons between our suffering and theirs. In the late '90s, I undertook a journey of prayer into mosques and monasteries. When the Palestinians began their current terrorist war, though, I felt too betrayed to continue reaching out. Only an Arab attempt to understand us, I decided, could renew my capacity for empathy. Now, here was a group of Arab Israelis trying to do precisely that. On the bus to Auschwitz, an Arab woman takes the mike. She has joined us, she explains, because she fears the anger that's distorting her. That is why I've come: not to save the Middle East but myself.

    Why Is Plagiarism Such a Big Problem These Days? (posted 6-13-03)

    David Mehegan, writing in the Boston Globe (June 11, 2003):

    Why do they do it? With the Internet making it easy to disseminate and read virtually anything anyone writes, it has become that much easier to catch plagiarists. So why do writers continue to steal the works of others? There are many explanations: gnawing self-doubt, narcissistic self-confidence, haste, pressure from publishers and editors, unrestrained ambition, a self-destructive need to court disaster, and, sometimes, ignorance of what plagiarism is.

    "There has to be some anxiety that motivates it," says Louise J. Kaplan, a New York psychotherapist and author. "It's very much tied up with a person's uncertain sense of personal identity. Tricking people and convincing them that something untrue is true helps them conquer some other anxiety."

    Plagiarists are often accomplished and talented writers, which makes their acts baffling to others. But they may not believe that their own work can be as good as anyone else's. This can even lead to a kind of reverse plagiarism. Take the 18th-century poet Thomas Chatterton, who claimed poems he had written were the work of a 15th-century monk. "He was a talented poet," says Kaplan, who wrote a book about him, "yet he created an entire world because he felt that no one would buy anything written by him."

    Joseph Glenmullen, a clinical instructor in psychiatry at Harvard Medical School, says plagiarism is sometimes rooted in the American dream that anyone can be anything he aspires to be. "I see a lot of students in trouble for plagiarism," he says. "One of the great gifts of our culture is upward social mobility, but it has an Achilles' heel: People in this kind of trouble are often under a lot of pressure from their families to do well and succeed, but sometimes there is a lot of anger and resentment at that pressure."

    Sometimes ordinary self-doubt becomes pathological. "There is often a narcissistic theme," Glenmullen says, "a facade of self-confidence, arrogance masking self-loathing. It can be a college kid or a multi-bestselling author who thinks it's humiliating" not to be able to match that grandiose self-image. Such people, he says, "are often conscious of wanting something, but not of guilt about how they get it."

    A plagiarist can be an egoist with an exaggerated conviction of his own talent, which makes the work of others seem ripe for "improvement."

    "There's usually a feeling that [plagiarizing] doesn't matter very much, a feeling that the important thing is what the plagiarist is going to bring to the project," says Amy Bloom, a therapist, novelist, and short-story writer. "He throws his cloak of talent over [another writer's work], and the substance is transformed from useful information and dross to his very special gold."

    Historians and educators have also become concerned at the degree of ignorance among students, and even professional authors, of what constitutes plagiarism. And they're worried the rise of the Internet has made plagiarism more tempting in times of pressure.

    "The resources for committing plagiarism are expanding," says William Cronon, professor of history at the University of Wisconsin and vice president of the professional division of the American Historical Association. "We are seeing more plagiarism because of the ease with which students can download from the Web."

    The association's newly revised statement on professional standards (found at www.theaha.org) has one of the most detailed definitions of plagiarism to be found. "Plagiarism includes more subtle and perhaps more pernicious abuses than simply expropriating the exact wording of another author without attribution," it reads. "Plagiarism also includes the limited borrowing, without attribution, of another person's distinctive and significant research findings, hypotheses, theories, rhetorical strategies, or interpretations, or an extended borrowing even with attribution."

    Conservatives Have a History of Exaggerating Threats Posed By Tyrannical Regimes (posted 6-13-03)

    Fareed Zakaria, writing in Newsweek (June 16, 2003):

    For decades some conservatives, including many who now wield great influence, have had a tendency to vastly exaggerate the threat posed by tyrannical regimes.

    It all started with the now famous “Team B” exercise. During the early 1970s, hard-line conservatives pilloried the CIA for being soft on the Soviets. As a result, CIA Director George Bush agreed to allow a team of outside experts to look at the intelligence and come to their own conclusions. Team B—which included Paul Wolfowitz—produced a scathing report, claiming that the Soviet threat had been badly underestimated.

    In retrospect, Team B’s conclusions were wildly off the mark. Describing the Soviet Union, in 1976, as having “a large and expanding Gross National Product,” it predicted that it would modernize and expand its military at an awesome pace. For example, it predicted that the Backfire bomber “probably will be produced in substantial numbers, with perhaps 500 aircraft off the line by early 1984.” In fact, the Soviets had 235 in 1984.

    The reality was that even the CIA’s own estimates—savaged as too low by Team B—were, in retrospect, gross exaggerations. In 1989, the CIA published an internal review of its threat assessments from 1974 to 1986 and came to the conclusion that every year it had “substantially overestimated” the Soviet threat along all dimensions. For example, in 1975 the CIA forecast that within 10 years the Soviet Union would replace 90 percent of its long-range bombers and missiles. In fact, by 1985, the Soviet Union had been able to replace less than 60 percent of them.

    In the 1990s, some of these same conservatives decided that China was the new enemy. The only problem was that China was still a Third World country and could hardly be seen as a grave threat to the United States. What followed was wild speculation about the size of the Chinese military and accusations that it had engaged in massive theft of American nuclear secrets. This came to a crescendo with the publication of the Cox Commission Report in 1999, which claimed that Chinese military spending was twice what the CIA estimated. The Cox report is replete with speculation, loose assumptions and errors of fact. The book it footnotes for its military-spending numbers, for example, does not say what the report claims.

    Iraq is part of a pattern. In each of these cases, arguments about the threat posed by a country rest in large part on the character of the regime. The Team B report explains that the CIA’s analysis was flawed because it was based on too much “hard data”—meaning facts—and neglected to divine Soviet intentions. The Chinese regime is assumed to be a mortal danger because it is Leninist. Saddam was assumed to be working on a vast weapons program because he was an evil man.

    Sidney Blumenthal and Historians (posted 6-12-03)

    Transcript of an interview posted on the website of the Washington Post (June 10, 2003):

    washingtonpost.com: Mr. Blumenthal, thank you for joining us today. A lot of Clinton book news lately -- Hillary's "Living History" just came out and just before that, your "The Clinton Wars." Yours has been described as "part history, part memoir." It's 800 pages. What was your reason for writing the book and how are you reacting to your critics of your account of Clinton's second term?

    Sidney Blumenthal: As someone who was able to be a witness and to participate in the real West Wing of the Clinton White House, I wanted to write about the history in order to establish the record as I understand it. Journalism has been called the first rough draft of history, and the first draft on the Clinton presidency was very rough and often wrong.

    With some perspective and new facts, I hope my book sheds new light for all readers and especially for historians on this crucial period in our history. Reviews of my book have been divided. No one, however, has challenged the historical accuracy or the facts in The Clinton Wars, and even my harshest critics, even a conservative writer like Andrew Sullivan, have conceded that my account is not only accurate but the best account of these tumultuous events. Four historians have so far written about The Clinton Wars -- including Robert Dallek, David Greenberg of Columbia, Sean Wilentz of Princeton and Jack Bass of the University of Charleston -- and all of them have acclaimed the book as a valuable contribution to history and, as Dallek wrote in the New York Times Book Review, the place to begin to understand the Clinton presidency.

    However, there have been some critics -- all members of the press corps -- who have been extremely critical, and virtually all of them were participants in the events described in The Clinton Wars. These critics have been self-defensive about their own actions involving what I consider either pseudo-scandals or serving as tools of right-wing operatives, Republican congressional staffers and/or Ken Starr's office. What their intense defensiveness reveals is that they haven't come to terms with these events or their own role in them. It is the historians who have been able to approach The Clinton Wars with more perspective.

    1933 and 2003 (posted 6-12-03)

    Bernard Weiner, former editor at the San Francisco Chronicle, writing in The Crisis Papers (June 9, 2003):

    If my email is any indication, a goodly number of folks wonder if they're living in America in 2003 or Germany in 1933.

    All this emphasis on nationalism, the militarization of society, identifying The Leader as the nation, a constant state of fear and anxiety heightened by the authorities, repressive laws that shred constitutional guarantees of due process, wars of aggression launched on weaker nations, the desire to assume global hegemony, the merging of corporate and governmental interests, vast mass-media propaganda campaigns, a populace that tends to believe the slogans and lies it's fed without asking too many questions, a timid opposition that barely contests the administration's reckless adventurism abroad and police-state policies at home, etc. etc.

    The parallels are not exact, of course; America in 2003 and Germany seventy years earlier are not the same, and Bush certainly is not Adolf Hitler. But there are enough disquieting similarities in the two periods at least to see what we can learn -- cautionary tales, as it were -- and then figure out what to do with our knowledge.

    The veneer of civilization is thin. We know this from our own observations, and various writers -- from Shakespeare to Sinclair Lewis ("It Can't Happen Here") -- have shown us how easily populations can be manipulated by leaders skillfully playing on patriotic emotion or racial or nationalist feelings.

    Whole peoples, like individuals, can become irrational on occasion -- sometimes for a brief moment, sometimes for years, sometimes for decades. Ambition, hatred, fear can get the better of them, and gross lies told by their leaders can deceive their otherwise rational minds. It has happened, it happens, it will continue to happen.

    One of the most outrageous and horrific examples of an entire country falling into national madness probably was Hitler's Germany from 1933-45. The resulting world war was disastrous, leading to more than 40,000,000 deaths.

    A good share of what we know about how this happened in Germany usually comes to us many years later from post-facto books, looking backward to the horror. There are very few examples of accounts written from the inside at the very time the events were unfolding.

    One such book is "Defying Hitler," by the noted German journalist/author Sebastian Haffner. The manuscript was found, stuffed away in a drawer, by Haffner's son in 1999 after his father's death at age 91. Published in 2000, the book became an immediate best-seller in Germany and was published last year in English, translated by the son, Oliver Pretzel. (His father's original name was Raimund Pretzel; as Sebastian Haffner, he went on to a highly successful career, writing in England during the war and then later back in Germany. He authored "From Bismarck to Hitler" and "The Meaning of Hitler," among many other works.)

    "Defying Hitler" is a brilliantly written social document, begun (and ended abruptly) in 1939; even though it fills in the reader on German history from the First World War on, its major focus is on the year 1933, when, as Hitler assumed power, Haffner was a 25-year-old law student, in-training to join the German courts as a junior administrator.

    You find yourself reading this book in amazement; there is so much historical perspective, so much sweep of what was going on and predictions of what later was to happen, so many insights into what led so many ordinary Germans to join with or acquiesce to the Nazi program -- how could anyone so young be so prescient in the midst of the brutal sordidness that was Nazism? (Indeed, some critics claimed that Haffner must have rewritten the book decades later; every page of the original manuscript was sent to laboratories, which authenticated that it indeed had been composed in 1939.)

    What distinguishes "Defying Hitler," in addition to its superb writing, is that Haffner focuses on "little people" like himself, rather than on the machinations of leaders. He wants to explore how ordinary Germans, especially non-Nazi and anti-Nazi Germans, permitted themselves to be swallowed whole into the Hitlerian maw.

    Haffner makes occasional broad pronouncements about German character traits ("As Bismarck once remarked in a famous speech, moral courage is, in any case, a rare virtue in Germany, but it deserts a German completely the moment he puts on a uniform"), but he devotes a good deal of his attention to the question of personal responsibility. If you read ordinary history books, he says, "you get the impression that no more than a few dozen people are involved, who happen to be 'at the helm of the ship of state' and whose deeds and decisions form what is called history.

    "According to this view, the history of the present decade [the 1930s] is a kind of chess game among Hitler, Mussolini, Chiang Kai-Shek, Roosevelt, Chamberlain, Daladier, and a number of other men whose names are on everybody's lips. We anonymous others seem at best to be the objects of history, pawns in the chess game, who may be pushed forward or left standing, sacrificed or captured, but whose lives, for what they are worth, take place in a totally different w orld, unrelated to what is happening on the chessboard.

    "...It may seem a paradox, but it is nonetheless the simple truth, to say that on the contrary, the decisive historical events take place among us, the anonymous masses. The most powerful dictators, ministers, and generals are powerless against the simultaneous mass decisions taken individually and almost unconsciously by the population at large...Decisions that influence the course of history arise out of the individual experiences of thousands or millions of individuals."

    Haffner tries to solve the riddle of the easy acceptance of fascism in Hitler's Third Reich. In March of 1933, a majority of German citizens did not vote for Hitler. "What happened to that majority? Did they die? Did they disappear from the face of the earth? Did they become Nazis at this late stage? How was it possible that there was not the slightest visible reaction from them" as Hitler, installed by the authorities as Chancellor, began slowly and then more quickly consolidating power and moving Germany from a democratic state to a totalitarian one?

    All along the way, Hitler would propose or actually promulgate regulations that sliced away at German citizens' freedoms -- usually aimed at small, vulnerable sectors of society (labor unionists, communists, Jews, mental defectives, et al.) -- and few said or did anything to indicate serious displeasure. In the early days, on those rare occasions when there was concerted negative reaction, Hitler would back off a bit. And so the Nazis grew bolder and more voracious as they continued slicing away at civil society. Many Germans (including some of Hitler's original corporate backers) were convinced Nazism would collapse as it became more and more extreme; others chose denial. It was easier to look the other way.

    40 Years Ago: Standing in the Schoolhouse Doorway (posted 6-12-03)

    From NPR (June 11, 2003):

    Forty years ago today, Alabama Gov. George Wallace stood at the door of Foster Auditorium at the University of Alabama in a symbolic attempt to block two black students, Vivian Malone and James Hood, from enrolling at the school. The drama of the nation's division over desegregation came sharply into focus that June day. NPR's Debbie Elliott reports.

    It was the same year that civil rights marchers had been turned back with police dogs and fire hoses in Birmingham, Ala. The year began with Wallace vowing "segregation now, segregation tomorrow and segregation forever" in his inaugural speech.

    During his campaign, Wallace talked of physically putting himself between the schoolhouse door and any attempt to integrate Alabama's all-white public schools.

    So when a federal judge ordered Malone and Hood be admitted to the University of Alabama in Tuscaloosa that summer, Wallace had the perfect opportunity to fulfill his pledge, Elliott reports.

    Cully Clark is dean of the university's College of Communication and author of The Schoolhouse Door: Segregation's Last Stand at the University of Alabama. He says President Kennedy and his brother, Attorney General Robert Kennedy, personally negotiated what was to happen in Tuscaloosa that summer, but they weren't sure what Wallace would do.

    "They knew he would step aside," Clark says. "I think the fundamental question was how." Just in case, Clark says, National Guard troops had practiced how to physically lift the governor out of the doorway.

    On June 11, with temperatures soaring, a large contingent of national media looked on as Wallace took his position in front of Foster Auditorium. State troopers surrounded the building. Then, flanked by federal marshals, Deputy Attorney General Nicholas Katzenbach told Wallace he simply wanted him to abide by the federal court order.

    Wallace refused, citing the constitutional right of states to operate public schools, colleges and universities. Katzenbach called President Kennedy, who federalized the Alabama National Guard to help with the crisis. Ultimately, Wallace stepped aside and the two students were allowed to register for classes.

    But the incident catapulted the governor into the national spotlight and he went on to make four runs at the presidency. It was also a watershed event for President Kennedy, who in staring down the South's most defiant segregationist aligned himself solidly with the civil rights movement.

    Vivian Malone Jones, then a 20-year-old transfer from an all-black college, said her goal was simply to sign up for accounting classes. "I didn't feel I should sneak in, I didn't feel I should go around the back door. If [Wallace] were standing in the door, I had every right in the world to face him and to go to school."

    Two years later, she became the first African American to graduate from the University of Alabama.

    Should the Old North Church Receive Federal Assistance? (posted 6-12-03)

    Terry Eastland, publisher of the Weekly Standard, writing in the Washington Times (June 11, 2003):

    Interior Secretary Gale Norton recently announced a grant of $317,000 to help preserve an aging edifice of historical importance to the nation. Whereupon Americans United for the Separation of Church and State objected. Why? Because the group sees a manifest violation of church and state in the new policy under which the Park Service made the grant.

    The edifice happens to be Boston's Old North Church, where Paul Revere got the signal that the British were advancing. And the Old North Church still is — as a horrified Barry Lynn, the executive director of Americans United, told reporters — "an active church." Were it inactive, were it one of the dead cathedrals of Europe — if its 150 members had dispersed to other churches — why, in Mr. Lynn's reckoning, the grant would be fine.

    Mr. Lynn says Americans United might sue the Interior Department. It would lose. The department's policy happens to be right. And it is one that the Old North Church, though this hardly was its intent, helped bring about. Twice now, you could say, it has proved nationally important.

    Fourteen months ago, the Old North Foundation applied to the Park Service for a historic preservation grant. The nonprofit was established to "support the maintenance of Old North Church [now 280 years young] and its associated buildings as a symbol of freedom." And the grant it sought reflected its mission. Among other things, the church's original windows needed repair.

    The Park Service awarded the grant, noting the church's "standing and importance in the history of America." But a month later, the Park Service reversed course and withdrew the award on the grounds that Episcopalians own the Old North Church and actually worship there.

    Apparently, the Park Service had processed the application as though it involved a nonreligious entity. You can see how that might have happened, since the Old North Church operates a museum and gift shop and is open daily to the public for tours. Indeed, on its Web site, the Park Service says visiting the Old North Church brings to life such "American ideals" as freedom of speech and self-determination. Once someone realized the Old North Church is more than just a historic site, the Park Service applied what had been policy since the late 1970s.

    That policy wasn't committed to writing until 1995, when the Interior Department asked the Justice Department to advise on how a court might regard the direct award of a historic preservation grant to a church. Acknowledging that the question was a "very difficult one" and that "the Supreme Court's jurisprudence in this area is still developing," the Justice Department concluded that a court would say an award to "an active church" violates the First Amendment.

    The Park Service later did what responsible government agencies do: It asked the Justice Department whether its 1995 opinion reflects its view of the law today. And the department said it didn't. The overriding opinion takes into account more recent Supreme Court decisions in concluding that the First Amendment doesn't bar historic preservation grants to the Old North Church or other active houses of worship that qualify for such assistance. The opinion observes that many regulations ensure that grants made to religious entities won't be used to advance religion — that they will be spent only for authorized purposes relating to historic preservation, not on religious services.

    The Park Service's new policy embraces the worthy principle that where government assistance is being provided to a broad class of beneficiaries, defined without reference to religion and including both public and private institutions, entities that are religious shouldn't be denied such assistance if they otherwise are eligible for it.

    Why Hasn't the Bush Tax Cut of 2001 Led to Job Growth? (posted 6-12-03)

    Jerry Bowyer, writing in National Review (June 9, 2003):

    We have had four presidents in the past four decades of American history who signed major tax cuts: Kennedy, Reagan, Clinton, and Bush. Although these men also raised some taxes, they stand out in history for their large tax cuts.

    The first three times, the tax cuts led almost immediately to significant drops in unemployment. In the three years following the Kennedy tax cut of 1964, unemployment fell by 32.14 percent; and in the three years that followed the Reagan tax cut (which took effect in 1982), unemployment fell by 15.12 percent. When Clinton signed the NAFTA agreement, a tariff-rate cut, unemployment fell by 23.29 percent. And following the Clinton capital-gains cut of 1997, unemployment fell 24.53 percent. On average, for Kennedy, Reagan, and Clinton, unemployment went down by 23.77 percent in the three years following their tax cuts.
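
    To read those percentages correctly: they are relative declines in the unemployment rate, not percentage-point drops. As a worked example using hypothetical rates, if unemployment falls from 5.6 percent to 3.8 percent over three years, the relative decline is (5.6 - 3.8) / 5.6 = 0.3214, or about 32.14 percent, even though the rate itself fell by only 1.8 percentage points.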

    BuzzCharts feels that the mass media has overly hyped recent unemployment statistics (USA Today led the story with the headline, “Unemployment Rises Sharply”). Nevertheless, it is clear that the current recovery has been one in which employment growth has lagged. Why is this?

    Employment tends to lag no matter what the circumstances. If you are the manager of a small enterprise and business starts to fall off, you wait as long as you can before laying people off. Conversely, when the orders start pouring back in again, you tend to wait a long time before staffing up again. This means that unemployment tends to remain steady during the beginning of a recession, peak toward the end of a recession, and stay high for some time following a recession.

    That’s the way it works under the best of circumstances. But as far as employment goes we are not under the best of circumstances. The economy has undergone an amazing trend in productivity, driven by information technology. As recently as ten years ago, some establishment economists were saying that they could "see computers everywhere but in the productivity statistics." No one is singing that song any longer. Even during the recession of 2000-2001 when productivity ought to have been driven down by economic dislocation and uncertainty, productivity continued to grow. In fact, the most recent reports on productivity indicate substantial productivity growth in the first half of this year.

    Bush's Tax Cuts Will Help Just as Reagan's Did (posted 6-11-03)

    Donald Regan, writing in the Wall Street Journal (June 11, 2003) in an article published shortly after his death:

    The recent debate over President Bush's tax proposal had so many echoes of the Reagan era that I could almost recite the parts of the various players from memory. Amid all the clamor, few have stepped back -- as President Reagan did in 1981 -- and asked three basic questions: Where are we, where do we need to go, and how are we going to get there? Unfortunately, we have no GPS that could pinpoint where we are and how to navigate from here to there.

    The 1981 tax debate occurred when the economy had serious ills. It was entangled in a strange phenomenon known as stagflation -- a combination of slow growth and inflation that could not be accounted for by the dominant Keynesian economic theory of the time. Keynesian economists believed that slow growth -- or no growth -- was the cure for inflation, but somehow both were happening at the same time. Tax cuts, it was feared, would only create more deficits, stimulate more inflation and raise interest rates -- already in the high teens -- further into the stratosphere.

    Ronald Reagan, however, was not troubled by this state of affairs. To answer the question of where we were, he recognized that growth was held back by high tax rates and excessive regulation. As for where we needed to go, his answer was that the first priority was economic growth; other problems would take care of themselves as long as the Federal Reserve maintained a steady and moderate rate of monetary expansion. And his policy for how we were going to get there was breathtakingly simple -- the government was going to get off the back of the American people by taxing and regulating less.

    The opponents of Mr. Reagan's program were saying many of the same things they said in response to the Bush tax cut proposal: We can't cut taxes, it will increase inflation and raise interest rates; deficits are already too high; tax cuts will only deprive the government of the revenues it requires to meet the many needs of the American people.

    Thanks to President Reagan, we know a lot more today, although it seems that many in Congress didn't get the memo. We know that tax cuts spur economic growth by improving incentives to work and invest and by making more money available for new ventures and small business, where the real job growth occurs in our economy. There are many examples of this in recent history, from the Kennedy tax cuts of 1962, through the Reagan cuts of 1981 and 1986. We also know that deficits do not cause inflation or cause interest rates to rise. Although the deficits during the Reagan period were higher (as a percentage of gross domestic product) than the deficits projected today, interest rates declined after the Reagan tax plan was adopted.

    How Saddam Can Be Held Responsible for 9-11 (posted 6-11-03)

    Sherry Eros and Steven Eros, writing in the Hudson Institute's American Outlook (June 10, 2003):

    Foreign and domestic opponents of the war in Iraq claimed (and continue to claim) that it merely distracted America from what should have been an exclusive concentration on the War on Terrorism, arguing that there was no compelling proof of Saddam Hussein’s complicity in the September 11 attacks or involvement with Osama bin Laden.

    In response to the latter claim, Walter Russell Mead, senior fellow for U.S. foreign policy at the Council on Foreign Relations, asserted a direct causal link between Saddam Hussein and the September 11 attacks, in the March 12, 2003, issue of the Washington Post. Mead correctly observed that Saddam Hussein’s noncompliance compelled U.S. forces to stay in Saudi Arabia ever since the 1991 Gulf War in order to maintain the sanctions regime and related restrictions imposed by the United Nations (UN). Saddam Hussein’s persistent cheating and mistreatment of his own people, coupled with his unrelenting threat to his highly vulnerable oil-rich neighbors, Kuwait and Saudi Arabia, made it impossible for America to extricate itself from its military bases in Saudi Arabia.

    Mead noted further that it was principally in order to expel the U.S. forces from Saudi Arabia that Osama bin Laden created the al Qaeda terrorist network. Saudi Arabia is the home of the two most important Islamic holy sites in the world and is the birthplace of the Prophet Mohammed. As such, bin Laden considered it unacceptable for military forces of an “infidel” nation to be stationed there.

    Hence, Mead argues, “The existence of al Qaeda, and the attacks of September 11, 2001, are part of the price the United States has paid to contain Saddam Hussein.” He concludes, “This is the link between Saddam Hussein’s defiance of international law and the events of September 11; it is clear and compelling. No Iraqi violations, no September 11.”

    By this logic, however, one could also lay blame for September 11 directly on the United States, for if the American presence had anything to do with bringing on the attack, the United States certainly had ultimate control over the decision to deploy its forces in Saudi Arabia. (“No American interventionism, no September 11.”) One could equally blame Kuwait’s political and military weakness for its needing American intervention against Iraq in the first place. (“No Kuwaiti weakness, no September 11.”) The UN itself might be indicted for instituting unworkable sanctions, the violation of which by Saddam Hussein led to the U.S. intervention. (“No UN sanctions, no September 11.”) Or one could hold Britain accountable, for failing to draw effective borders in fashioning the Iraqi state decades ago. (“No Iraq, no September 11.”) This won’t do. Basic logic teaches that such “counterfactual conditionals” are insufficient to establish either causality or criminal responsibility.

    In spite of this, it is possible to establish positively that Saddam Hussein is legally and morally responsible for the September 11 attacks. The key is to combine two tools of criminology: forensic timeline analysis, and basic principles of legal criminal responsibility.

    One longstanding legal principle applies the felony murder statute to criminal behavior that causes “accidental” death or harm. If I commit a serious crime and in doing so accidentally cause an innocent person to die, then I may be held guilty of murder—even if his death was not my direct intention. For instance, if I set an arson fire in a building I believe vacant, and an unseen occupant dies, then I am responsible for his murder even if his death was “unintentional.”

    This is where forensic timeline analysis proves Saddam Hussein’s responsibility for the September 11 attacks. In the case at hand, we know that, in violation of international law, Saddam Hussein invaded Kuwait and threatened to attack Saudi Arabia. Having been expelled by coalition forces during the Gulf War, as Mead notes, Saddam knew that his continuous flouting of the Gulf War cease-fire agreements, UN-imposed sanctions, and the continuing threat of conquest he posed to his Kuwaiti and Saudi neighbors were compelling American forces to remain in Saudi Arabia.

    Trotsky's Influence on the Intellectuals Who Rule the Washington DC Roost (posted 6-10-03)

    Jeet Heer, writing in Canada's National Post (June 7, 2003):

    More than a decade after the demise of the Soviet Union, Stalin's war against Trotsky may seem like quaint ancient history. Yet Stalin was right to fear Trotsky's influence. Unlike Stalin, Trotsky was a man of genuine intellectual achievement, a brilliant literary critic and historian as well as a military strategist of genius. Trotsky's movement, although never numerous, attracted many sharp minds. At one time or another, the Fourth International included among its followers the painter Frida Kahlo (who had an affair with Trotsky), the novelist Saul Bellow, the poet André Breton and the Trinidadian polymath C.L.R. James.

    As evidence of the continuing intellectual influence of Trotsky, consider the curious fact that some of the books about the Middle East crisis that are causing the greatest stir were written by thinkers deeply shaped by the tradition of the Fourth International [a group of several thousand people Trotsky organized to rally his supporters].

    In seeking advice about Iraqi society, members of the Bush administration (notably Paul D. Wolfowitz, the Deputy Secretary of Defence, and Dick Cheney, the Vice-President) frequently consulted Kanan Makiya, an Iraqi-American intellectual whose book The Republic of Fear is considered to be the definitive analysis of Saddam Hussein's tyrannical rule.

    As the journalist Christopher Hitchens notes, Makiya is "known to veterans of the Trotskyist movement as a one-time leading Arab member of the Fourth International." When speaking about Trotskyism, Hitchens has a voice of authority. Like Makiya, Hitchens is a former Trotskyist who is influential in Washington circles as an advocate for a militantly interventionist policy in the Middle East. Despite his leftism, Hitchens has been invited into the White House as an ad hoc consultant.

    Other supporters of the Iraq war also have a Trotsky-tinged past. On the left, the historian Paul Berman, author of a new book called Terror and Liberalism, has been a resonant voice among those who want a more muscular struggle against Islamic fundamentalism. Berman counts the Trotskyist C.L.R. James as a major influence. Among neo-conservatives, Berman's counterpart is Stephen Schwartz, a historian whose new book, The Two Faces of Islam, is a key text among those who want the United States to sever its ties with Saudi Arabia. Schwartz spent his formative years in a Spanish Trotskyist group.

    To this day, Schwartz speaks of Trotsky affectionately as "the old man" and "L.D." (initials from Trotsky's birth name, Lev Davidovich Bronstein). "To a great extent, I still consider myself to be [one of the] disciples of L.D," he admits, and he observes that in certain Washington circles, the ghost of Trotsky still hovers around. At a party in February celebrating a new book about Iraq, Schwartz exchanged banter with Wolfowitz about Trotsky, the Moscow Trials and Max Shachtman.

    "I've talked to Wolfowitz about all of this," Schwartz notes. "We had this discussion about Shachtman. He knows all that stuff, but was never part of it. He's definitely aware." The yoking together of Paul Wolfowitz and Leon Trotsky sounds odd, but a long and tortuous history explains the link between the Bolshevik left and the Republican right.

    To understand how some Trotskyists ended up as advocates of U.S. expansionism, it is important to know something about Max Shachtman, Trotsky's controversial American disciple. Shachtman's career provides the definitive template of the trajectory that carries people from the Left Opposition to support for the Pentagon.

    Throughout the 1930s, Shachtman loyally hewed to the Trotsky line that the Soviet Union as a state deserved to be defended even though Stalin's leadership had to be overthrown. However, when the Soviet Union forged an alliance with Hitler and invaded Finland, Shachtman moved to a politics of total opposition, eventually known as the "third camp" position. Shachtman argued in the 1940s and 1950s that socialists should oppose both capitalism and Soviet communism, both Washington and Moscow.

    Yet as the Cold War wore on, Shachtman became increasingly convinced Soviet Communism was "the greater and more dangerous" enemy. "There was a way on the third camp left that anti-Stalinism was so deeply ingrained that it obscured everything else," says Christopher Phelps, whose introduction to the new book Race and Revolution details the Trotskyist debate on racial politics. Phelps is an eloquent advocate for the position that the best portion of Shachtman's legacy still belongs to the left.

    By the early 1970s, Shachtman was a supporter of the Vietnam War and of strongly anti-Communist Democrats such as Senator Henry Jackson. Shachtman had a legion of young followers (known as Shachtmanites) active in labour unions and an umbrella group known as the Social Democrats. When the Shachtmanites started working for Senator Jackson, they forged close ties with hard-nosed Cold War liberals who also advised Jackson, including Richard Perle and Paul Wolfowitz. These two had another tie to Trotskyism: their mentor was Albert Wohlstetter, a defence intellectual who had been a Shachtmanite in the late 1940s.

    Shachtman died in 1972, but his followers rose in the ranks of the labour movement and government bureaucracy. Because of their long battles against Stalinism, Shachtmanites were perfect recruits for the renewed struggle against Soviet communism that followed the Vietnam War. Throughout the 1970s, intellectuals forged by the Shachtman tradition filled the pages of neo-conservative publications. Then in the 1980s, many Social Democrats found themselves working in the Reagan administration, notably Jeane Kirkpatrick (who was ambassador to the United Nations) and Elliott Abrams (whose tenure as assistant secretary of state was marred by his involvement with the Iran-Contra scandal).

    The distance between the Russia of 1917 and the Washington of 2003 is so great that many question whether Trotsky and Shachtman have really left a legacy for the Bush administration. For Christopher Phelps, the circuitous route from Trotsky to Bush is "more a matter of rupture and abandonment of the left than continuity."

    Stephen Schwartz disagrees. "I see a psychological, ideological and intellectual continuity," says Schwartz, who defines Trotsky's legacy to neo-conservatism in terms of a set of valuable lessons. By his opposition to both Hitler and Stalin, Trotsky taught the Left Opposition the need to have a politics that was proactive and willing to take unpopular positions. "Those are the two things that the neo-cons and the Trotskyists always had in common: the ability to anticipate rather than react and the moral courage to stand apart from liberal left opinion when liberal left opinion acts like a mob."

    Trotsky was also a great military leader, and Schwartz finds support for the idea of pre-emptive war in the old Bolshevik's writings. "Nobody who is a Trotskyist can really be a pacifist," Schwartz notes. "Trotskyism is a militaristic disposition. When you are Trotskyist, we don't refer to him as a great literary critic, we refer to him as the founder of the Red Army."

    Paul Berman agrees with Schwartz that Trotskyists are by definition internationalists who are willing to go to war when necessary. "The Left Opposition and the non-Communist left comes out of classic socialism, so it's not a pacifist tradition," Berman observes. "It's an internationalist tradition. It has a natural ability to sympathize or feel solidarity for people in places that might strike other Americans or Canadians as extremely remote."

    Christopher Phelps, however, doubts these claims of a Trotskyist tradition that would support the war in Iraq. For the Left Opposition, internationalism was not simply about fighting all over the world. "Internationalism meant solidarity with other peoples and not imperialist imposition upon them," Phelps notes.

    Though Trotsky was a military leader, Phelps also notes "the Left Opposition had a long history of opposition to imperialist war. They weren't pacifists, but they were against capitalist wars fought by capitalist states. It's true that there is no squeamishness about the application of force when necessary. The question is, is force used on behalf of a class that is trying to create a world with much less violence or is it force used on behalf of a state that is itself the largest purveyor of organized violence in the world? There is a big difference." Seeing the Iraq war as an imperialist adventure, Phelps is confident "Trotsky and Shachtman in the '30s and '40s wouldn't have supported this war."

    This dispute over the true legacy of Trotsky and Shachtman illustrates how the Left Opposition still stirs passion. The strength of a living tradition is in its ability to inspire rival interpretations. Despite Stalin's best efforts, Trotskyism is a living force that people fight over.

     


    Did Arafat Make Up a Phony History of Palestine? (posted 6-10-03)

    Diana West, writing in the Washington Times (June 6, 2003):

    In 1857, the British consul in Palestine, James Finn, told his government that Palestine "is in a considerable degree empty of inhabitants and therefore its greatest need is that of a body of population."

    Ten years later, Mark Twain visited the Holy Land, recording his impressions in "The Innocents Abroad." Jericho was "a moldering ruin," he wrote. About the Galilee, he noted "a desolation ... that not even imagination can grace with the pomp of life and action." Which is pretty desolate. As for the land around Jerusalem, "The further we went ... the more rocky and bare, repulsive and dreary the landscape became," Twain wrote. "There was hardly a tree or shrub anywhere. Even the olive and cactus, those fast friends of a worthless soil, had almost deserted the country."

    Things hadn't changed much by 1881, when British cartographer Arthur Penrhyn Stanley observed, "In Judea it is hardly an exaggeration to say that for miles and miles there is no appearance of life or habitation." To be sure, there were at this time Arabs (and Jews) living in Palestine, but 1881 hardly marks the shimmering high point of civilization Yasser Arafat would describe to the United Nations in 1974, when he conjured visions of "a verdant land, inhabited mainly by an Arab people in the course of building its life and dynamically enriching its indigenous culture."

    Why the world came to accept the mendacious vision of a terror-kingpin over a wealth of historical impressions recorded by writers, scientists and officials is a tantalizing question. (On another memorable U.N. occasion, this same terrorist-fabulist hallucinogenically said "Jesus Christ was the first Palestinian fedayeen" — or Muslim fighter of Christians.) Somehow, the weight of the world's collective understanding of history flipped: Myth turned to fact, and the facts were forgotten. I don't know when this happened. I just know I never came across such vivid eyewitness accounts of 19th-century Palestine as those above until I read them (and others) in the 2000 edition of Benjamin Netanyahu's excellent Middle East primer, "A Durable Peace: Israel and Its Place Among the Nations."

     


    Filibuster a Court Nomination? (posted 6-10-03)

    C. Boyden Gray, White House counsel for President G.H.W. Bush, writing in the Wall Street Journal (June 10, 2003):

    In response to filibusters of federal court nominees Miguel Estrada and Priscilla Owen, Senate Majority Leader Bill Frist is pushing an amendment of Senate rules that would explicitly rule judicial filibusters out of order. The filibustering Senate Democrats argue that Mr. Frist's proposal represents a radical departure from precedent and a power-grab by the president. In reality, it is the Democrats who have broken brazenly from the past with the current filibusters.

    In defending their filibuster, Democrats have cited the Abe Fortas nomination as a precedent that proves Republicans have engaged in judicial filibusters too: So, what was good for the Fortas goose is good for the Estrada/Owen gander. But theirs is a mistaken position.

    On June 26, 1968, President Johnson nominated Justice Fortas to Chief Justice Warren's seat. Senators from both parties opposed the Fortas nomination for a variety of reasons, some plausible (e.g., that Justice Fortas, who had been a trusted adviser to President Johnson before his nomination, had continued to participate in White House decision making during his tenure on the Court), some not so plausible (e.g., that President Johnson should not be allowed to choose Chief Justice Warren's successor because the president was a lame duck). Whatever the merits, the criticisms did not prevent a relatively rapid decision on the nomination, which was reported out of the Judiciary Committee (by divided vote) in the middle of September and was opened to floor debate on Sept. 24.

    A filibuster followed, but not for long. On Oct. 1, the Senate voted on a motion for cloture that would have ended debate on the nomination and allowed an immediate vote on whether to confirm Justice Fortas as chief justice.

    The Congressional Record for Oct. 1, 1968, shows that 45 senators voted for cloture, 43 voted against. However, if the senators who did not vote are taken into account, we find that 48 were on record as opposing cloture, 47 as favoring it. Indeed, at least one of the senators who voted for cloture, Republican John Sherman Cooper of Kentucky, said that he would vote against the Fortas nomination if it came to a vote. Another who voted for cloture proposed immediately after the vote that the president withdraw the nomination and submit a name that could be quickly confirmed. This evidence alone shows that of the 47 on record for cloture, at least one, if not more, was actually opposed to the Fortas nomination.

    Perhaps that is the reason why Justice Fortas decided to ask that his nomination be withdrawn, and why President Johnson promptly complied on Oct. 4. The point is, at least 49 senators -- a majority of the 95 senators whose positions were identified in the Congressional Record -- either opposed allowing a confirmation vote or opposed confirmation on the merits. This evidence -- which suggests that, if anything, Justice Fortas might have had a majority opposed to his confirmation -- casts doubt on the likelihood that a committed plurality of 50 senators (who, with Vice President Humphrey, would have constituted a majority) would have voted for Justice Fortas's confirmation had the filibuster not prevented it.

    By contrast, Mr. Estrada and Ms. Owen would win a confirmation vote today if the Democrats allowed them one. Even assuming that the Fortas filibuster was legitimate, it is not a precedent for Mr. Estrada's case or for others where a declared majority of senators favors confirmation of the nominee, and where the nominee reacts to the filibuster not by throwing in the towel but by standing his ground.

     


    Canadians Don't Know Their History Either (posted 6-10-03)

    Mike Blanchfield, writing in the Montreal/Quebec Gazette (June 9, 2003):

    A new poll has added weight to the argument that Canadians are ignorant of their history.

    A majority of those surveyed could not identify Samuel de Champlain as the founding father of the first European settlement in what was to become Canada.

    Forty per cent of those surveyed by Environics Research Group/Focus Canada for the Association for Canadian Studies picked Jacques Cartier - the French explorer who discovered Canada - while only 28 per cent selected Champlain.

    A combined total of 23 per cent gave the credit to Henry Hudson, who opened up the fur trade in Northern Canada, or to Christopher Columbus, who discovered the Americas in 1492.

    "I think we know generally the level of historic knowledge in our country isn't strong," said pollster Jack Jedwab.

    But Jedwab said Canadians will get a history lesson next year, because the federal government is launching a series of events to mark the 400th anniversary of Champlain's arrival in Canada.

    Canadians shouldn't be faulted too much for confusing Cartier and Champlain, Jedwab said.

    But he added, "It suggests the federal government has a fair bit of work ahead of it insofar as this commemoration is concerned, in making Canadians more aware of Champlain."

     


    Should Congress Open an Investigation into the Use of Intelligence? (posted 6-10-03)

    Gail Russell Chaddock, writing in the Christian Science Monitor (June 9, 2003):

    The failure to find weapons of mass destruction in Iraq and new claims that evidence of the threat was manipulated are edging Congress toward a position many members had hoped to avoid: that of challenging a popular wartime president.

    After voting President Bush broad powers to use force in Iraq, members on both sides of the aisle are now questioning whether the move to war was based on sound intelligence - or at least demanding a fuller accounting.

    Last week, the Republican chairman and ranking Democrat of the Senate Armed Services Committee called for a "thorough" investigation into possible intelligence lapses. House and Senate intelligence committees are reviewing the documents that backed up the administration's case for war, and Democrats say that discussion should be public, noting the threat to US credibility in the world.

    "It is important that this investigation not only include open hearings, but also a comprehensive, fact-finding review. We need to get started," said Sen. John Rockefeller (D) of West Virginia, vice chairman of the Senate Select Committee on Intelligence, last week....

    In the past, congressional investigations into the conduct of wars began only after the war ran into trouble. Senate hearings into the 1964 Gulf of Tonkin incident, which provided the rationale for the Vietnam War, did not begin until 1966. The Gulf of Tonkin Resolution was not repealed until 1969.

    "There's always been a reluctance to carry on an investigation while troops are in harm's way," says Donald Ritchie, associate historian of the Senate. "When the war is over, that's the time when investigations seem to be more appropriate."

    Bush administration officials insist it's too early to say weapons won't be found. "This was a program that was built for concealment," said National Security Adviser Condoleezza Rice on television talk shows.

    Meanwhile, top Republicans say it's a mistake to focus on weapons of mass destruction as the main cause for going to war.

    "To dredge all this up as a scandal is nonsense," says Sen. Richard Lugar (R) of Indiana, chairman of the Senate Foreign Relations Committee. "Intelligence is an inexact science, and when it comes to weapons of mass destruction, it is not very good. They may be gone in hours, not just misplaced but destroyed."

    Even if weapons of mass destruction are never found in Iraq, "there is no doubt ... that if [Saddam Hussein] had been left alone, he would have continued to try to develop these weapons," said Sen. John McCain (R) of Arizona last week.


    Did Blair Manipulate Intelligence to Advance His War Aims? (posted 6-10-03)

    Francis Elliott, Colin Brown and David Bamber, writing in the London Sunday Telegraph (June 8, 2003):

    The briefing note circulated to ministers on Thursday afternoon was unequivocal - "on no account" were they to suggest that elements of the intelligence service were seeking to undermine the Government.

    An incendiary attack on "rogue elements" within Britain's intelligence service by Dr John Reid, the Leader of the House, on Wednesday had - not for the first time in recent weeks - panicked Number 10 into an urgent damage limitation exercise.

    Yet as the Prime Minister's plane touched down at Heathrow's Royal VIP suite at 4.30am on Tuesday, Tony Blair already knew that he was coming home to a political storm. His week-long world tour ending in the G8 summit at Evian, France, had been overshadowed by claims that Number 10 had ordered intelligence assessments to be "sexed up" for publication.

    In particular Mr Blair faced the charge that a Government dossier, Iraq's Weapons of Mass Destruction, had, after pressure from Number 10, included the claim that Saddam could launch a chemical attack in 45 minutes. Categoric denials by Mr Blair and John Scarlett, the head of the Joint Intelligence Committee and a former high-ranking officer at MI6, who cleared the dossier, failed to quell the story. A ferocious attack by Clare Short, the former International Development Secretary, in this newspaper last week had rocked Mr Blair further when she accused him of "duping" the nation into believing that Saddam posed an imminent threat.

    By the time the Prime Minister arrived home one of his backbenchers was claiming that the row "was more serious than Watergate". It was a remark that Malcolm Savidge, the MP for Aberdeen North, would come to regret as Number 10 coordinated a tooth-and-nail fightback to restore the Prime Minister's credibility.

    John Prescott, the Deputy Prime Minister, cornering Mr Savidge in the Commons tearoom on Tuesday, asked menacingly: "Have you had your re-selection meeting yet, Malcolm?" Later that day the hapless Labour MP was forced to flee the Strangers' Bar in the Commons under the verbal blows of his peers, his pint of bitter - which he had, as ever, asked for in a jar with a handle, an affectation they loathe almost as much as they dislike him - untouched.

    It was in this febrile atmosphere that Dr Reid made his intervention. After being approached by a journalist from the Times on Tuesday, he said: "There have been uncorroborated briefings by a potentially rogue element - or indeed elements - in the intelligence services. I find it difficult to grasp why they should be believed against the word of the British Prime Minister and the head of the Joint Intelligence Committee."

    Senior officials claim that it was simply a piece of ill-judged freelancing not explicitly authorised by Alastair Campbell, Mr Blair's director of communications. "John came back from holiday and asked Alastair if he could go out and do some work on this but Alastair never told him to use the words that he did or speak to the people he did," said one.

    Cabinet ministers privately disown Dr Reid's remarks. One said: "There is absolutely no conspiracy among the intelligence services trying to bring down the Government."

    Not all members of the intelligence service are convinced that Dr Reid's attack on "rogue elements" was unsanctioned. They point out that Hilary Armstrong, the Chief Whip, was reported to have made similar charges at the same time. The row also proved a useful distraction from the main allegations against Mr Blair, they say.

    Dr Reid, as a former Communist and a keen political historian, is well-versed in the mythology about Labour and British intelligence which begins with the Zinoviev letter, believed now to be an MI6 forgery, that helped to bring down the Labour government of Ramsay MacDonald by linking it with a Soviet-inspired uprising in Britain in 1924.

    Peter Wright, in his 1987 book, Spycatcher, gave credence to claims by Harold Wilson that MI5 had plotted against him. Dr Reid, Mr Campbell and Peter Mandelson all saw first-hand the damage done to Neil Kinnock as Labour leader - by a relentless campaign of media vilification that some believe was inspired by the intelligence service - and have had their outlooks shaped by the experience. However, once in power New Labour quickly provoked suspicions that it was not above abusing intelligence for its own narrow political purposes. Within the civil service there are bitter memories of the day the news was breaking in the News of the World of Robin Cook's affair with his secretary, Gaynor Regan, in August 1997. In order to distract attention from the story, a senior government minister, thought to have been Peter Mandelson, leaked to the Sunday Times the story that MI6 was probing the former Governor of Hong Kong, Chris Patten, allegedly for leaking classified documents to his biographer Jonathan Dimbleby.

    One former intelligence officer said later: "It was an extraordinary moment when we realised that for the first time we had a government that would use information supplied by us for political purposes, and in particular, for media manipulation."

    It was not the only change that was noted with alarm. The intelligence services share Whitehall's consternation over New Labour's open willingness to blame civil servants for Government failures. What really damaged relations with the Blair government, however, was not the first dossier published in September last year but the second, Iraq - Its Infrastructure of Concealment, Deception and Intimidation, handed to journalists accompanying the Prime Minister on a trip to America at the end of January. It was this document, parts of which were later exposed as having been plagiarised from an article by a US-based PhD student, that led to acute unease about Number 10's abuse of intelligence. Mr Blair, introducing the first dossier, said it was "based in large part on the work of the Joint Intelligence Committee". For the second dossier, it was said only that it "draws on a number of sources, including intelligence material". This nice distinction was lost in the blast of ridicule prompted by the exposure of the so-called "dodgy dossier".

    Senior intelligence officers felt that their reputations had been damaged by a piece of inept propaganda. "We are not responsible for this bastard offspring," said one at the time, furiously, about the second dossier. "It devalued the currency, there is no question about that," admitted one senior official. "There is a dispute about who saw what but it is clear that the JIC was not involved. It was a monumental cock-up."


    A Unipolar World Can Be a Stable One If the U.S. Behaves with Restraint (posted 6-10-03)

    William Choong, writing in the Straits Times (Singapore) (June 8, 2003):

    [W]ith the Sept 11 terrorist attacks in the US, anthrax scares, a war in Iraq and talk of more to come, there is a certain longing for the time when only two superpowers ruled the political landscape.

    As the Cold War wound down in 1989, US under-secretary of state Lawrence Eagleburger began to express nostalgia for the 'remarkably stable and predictable atmosphere of the Cold War'.

    And writing 10 years after the fall of the Berlin Wall, eminent historian John Lewis Gaddis reminisced about the demise of what he called the 'Long Peace'.

    By the late 1990s, victims of genocide were found in Central Europe; African rivers choked with dead bodies; armed teenagers ruled Third World cities from the backs of pick-up trucks.

    Lastly, terrorists were striking with deadly efficiency where one might least expect it: the American heartland.

    But the most intriguing thing about the Long Peace was its deadening predictability.

    Yes, there were all the high-level feints, manoeuvring and shadow-boxing behind the Iron Curtain, as the US and the Soviet Union played the high-stakes game of nuclear brinkmanship.

    Ironically, however, mutually assured destruction - nuclear Armageddon following a pre-emptive nuclear strike - never materialised.

    And unlike now, when the US is having difficulty keeping track of the number of terrorist groups arrayed against it, the Cold War made for an easier time to win friends and influence people.

    Seen through the rivalry between the US and the former Soviet Union, geopolitics was akin to the tidy chessboard game of 'your friend is my friend and your enemy is my enemy'.

    To any astute observer of global politics, however, so what?

    The key here is that how states perceive the world determines how they act within it. This simple rubric is tied to how anarchy - the unpredictable state of world affairs - is managed.

    International relations experts classify this under three modes: a multipolarity of many powers (very unstable), bipolarity (which supported the stability of the Cold War), and lastly, unipolarity (fairly stable).

    In this sense, it is becoming increasingly evident that an America triumphant after Afghanistan and, more recently, Iraq has turned the world truly unipolar.

    There are benefits to unipolarity - it is more stable than multipolarity, and the strength alone of a unipolar power could reduce 'disorder', or anarchy, within a 'system'.

    But it has taken the US more than a decade to realise this.

    It was in 1990 that Pulitzer Prize-winning columnist Charles Krauthammer came up with the term 'unipolar moment' - a moment in which untrammelled American power ruled the world following the collapse of the Soviet Union.

    But the moment faded away, as Washington grappled with problems like the Balkans crises, its relations with a unified Europe, and the emergence of Japanese economic power.

    So much so that nearly a decade later, in 1999, political scientist Samuel Huntington proclaimed unipolarity had been replaced by 'uni-multipolarity' - shorthand for a major power fringed by not-so-major powers.

    Even before Iraq, polarity watchers were hailing the emergence of what could be America's second 'unipolar moment'.

    Two political scientists have alluded to Washington's overwhelming superiority in all fields - military, economic and technological.

    'If today's American primacy does not constitute unipolarity, then nothing ever will,' wrote Drs Stephen Brooks and William Wohlforth in the journal Foreign Affairs.

    Some analysts have even gone on to tag America's second 'unipolar moment' with the dreaded 'E' word - Empire.

    'More American neo-conservative pundits are embracing the idea of an American informal empire, this meaning hegemonic leadership, not the direct rule of earlier empires,' Columbia University political science don Richard Betts told Sunday Review.

    The key to sustaining America's second unipolar moment, however, depends on how it wields its power.

    If the US again loses grip on unipolarity this time round, it would not be because of tacky post-Cold War problems, but because of its own doing.

    Drs Brooks and Wohlforth said unipolarity can be sustained only if Washington looks beyond its own immediate needs.

    'Unipolarity makes it possible to be the global bully - but it also offers the United States the luxury of being able to look beyond its immediate needs to the world's long-term interests,' they wrote.

    This includes the forestalling of resentment against Washington by working with partners on diverse issues such as the environment, disease and migration.

    The US would also need to improve the lot of the developing world by lowering protectionist trade barriers. The US has been doing that, with its recent proposal to transform a post-Saddam Middle East with a slew of free trade agreements.

    It is what happens after Iraq, however, that could see Washington slink back to its old bullying ways. Already, before the Iraq conflict, the US rode roughshod over the United Nations by eschewing strong multilateral cooperation.

    If the US decides to do an Iraq elsewhere in the world, this could prove to be costly, said Prof Betts.


    The Disturbing Parallels Between Japanese Internment and 9-11 (posted 6-10-03)

    Teresa Watanabe, writing in the Los Angeles Times Magazine (June 8, 2003):

    After Pearl Harbor, nearly 8,000 Japanese immigrants were arrested and interned as potentially dangerous enemy aliens, says University of Cincinnati professor emeritus Roger Daniels; not one was found guilty of espionage or sabotage. Since Sept. 11, about 4,000 men, mostly Arabs and Muslims, have been arrested and detained, according to Georgetown University law professor David Cole; among them, he says, only a minuscule number have been charged with crimes related to terrorism.

    These developments have provoked unease among many Japanese Americans, 120,000 of whom--including my grandmother and her children--were removed from their West Coast homes and locked up in desolate camps after Pearl Harbor. Many speak of a sense of deja vu. Japanese Americans are making documentaries, staging performances and holding forums and vigils--from Seattle to Los Angeles--to underline their concerns that innocent people are again being trampled upon in the name of national security.

    There are "disturbing parallels with post 9/11 experiences in Arab, Muslim and South Asian communities all over the U.S.," says John Christgau, who wrote a 1985 book, "Enemies," on the alien internment program and who helped compile a current exhibition at UCLA's Powell Library on the 31,000 Japanese, Italian and German immigrants and their families who were interned during World War II.

     


    What Happened to Kennedy's Dream of a European-American Union? (posted 6-9-03)

    Martin Walker, writing in the AP (June 9, 2003):

    Just 40 years ago this week, President John F. Kennedy at the Paulskirche in Frankfurt suggested that the ties between the United States and Europe were so close and so essential that they should consider not only an economic union, but possibly even a political union between these two pillars of the West.

    How times change.

    White House officials now say the new U.S. policy toward the ever-larger European Union is "disaggregation," to distinguish between the friendly Europeans and the less reliable, or even the potentially hostile. Or as Defense Secretary Donald Rumsfeld has put it, between "Old Europe" and the new.

    A clue to what this means is on display in Iraq, where British troops have fought alongside their U.S. allies from the start. Now Polish troops are leading a new peacekeeping contingent of 7,500 troops from Denmark, Norway, Lithuania, Romania, Bulgaria, Slovakia and Ukraine, with NATO providing the logistic, communications and intelligence support.

    Beyond that, there is very little sign the new U.S. rhetoric of picking and choosing between Europeans means anything at all. The White House is preoccupied with the Middle East. The U.S. Trade Representative works with the European Union's Commissioner for Trade, Pascal Lamy, because the EU requires that all its members subordinate trade negotiations to him.

    The State Department has yet to rethink, let alone examine, its 50-year assumption that a united Europe is by definition in U.S. interests. British officials are still told by their State Department colleagues they will be taken more seriously in Washington the more they are engaged in Brussels, the EU capital.

    Try telling that to Tony Blair, who is vilified across Europe for his loyal support of President Bush over Iraq. Only last week, the Greek Bar Association began drafting war crimes charges to haul Blair before the International Criminal Court. Forty years ago, France's President Charles de Gaulle said that Britain would be "America's Trojan Horse" inside Europe. Now the pro-American Poles, voting in a referendum this weekend to ratify the terms of accession to the EU, are dubbed "the Trojan donkey" in the French and German media.

    The assumption that America can only benefit from a Europe "whole and free" and ever more integrated has been coming under increasing question. Two years ago, Henry Kissinger, in his book "Does America Need a Foreign Policy?", suggested that it was high time to reconsider the old assumptions. He concluded that the economic ties between the United States and EU were too important to put at risk, but that a great deal more American political engagement with the Europeans would be needed.

     


    Will Harvard Take a Donation from an Arab Sheik Associated with Holocaust Deniers? (posted 6-6-03)

    Rachael Lea Fish, a student at Harvard Divinity School, writing in the Wall Street Journal (June 6, 2003):

    In July 2000, the Harvard Divinity School accepted $2.5 million from the president of the United Arab Emirates, Sheik Zayed bin Sultan al-Nahyan, for the creation of an endowed professorship in Islamic religious studies. A professorship in Islamic studies is long overdue at Harvard Divinity School, something I especially appreciate as a student of Islam. But when I learned more about the donor, Sheik Zayed, I became dismayed.

    Why? Amnesty International has repeatedly documented the terrible human-rights record of Sheik Zayed's country: its lack of elections, use of corporal punishment on political prisoners and trafficking in Bangladeshi child slaves. Sheik Zayed has ruled the United Arab Emirates as unelected president since 1971.

    That is bad enough. But, perhaps more important, Sheik Zayed also funds the Abu Dhabi-based Zayed Center for Coordination and Follow-Up, a prominent think tank of the Arab League, founded in 1999. The Zayed Center, described on its Web site "as the fulfillment of the vision of Sheikh Zayed," promotes Holocaust denial, anti-American conspiracy theories and hate speech in its lectures, symposiums and publications.

    In August 2002, the Los Angeles Times quoted Mohammed Murar, the executive director of the Zayed Center, saying about Jews that "the truth is they are the enemies of all nations." His comment came on the heels of a Zayed Center report stating that "the Zionists are the people who killed the Jews in Europe."

    The Zayed Center has a history of giving Holocaust deniers like David Irving a forum to promulgate their ideas. In 1998, Sheik Zayed's wife donated $50,000 to finance the defense of infamous Holocaust denier Roger Garaudy in a French court.

    In April 2002, the Zayed Center hosted Thierry Meyssan, the French author of "The Appalling Fraud," which claims that the U.S. military staged the 9/11 attacks. The center translated Mr. Meyssan's book into Arabic, hailed its publication and widely advertised the work. A month later, Lyndon LaRouche, the fringe political figure who has made disparaging remarks about Judaism, was an honored guest. Just last month, the center hosted Umayma Jalahma, a professor of Islamic Studies at King Faisal University, who declared: "The Jewish people must obtain human blood so that their clerics can prepare for holiday pastries."

    Despite Sheik Zayed's track record, Harvard Prof. William Graham, now dean of the Divinity School, hailed his donation. "This endowment," he told the Harvard Gazette in September 2000, "is a most welcome gift. We are delighted with this encouraging development."

    At the time, Mr. Graham was probably not aware of Sheik Zayed's links to hate speech and Holocaust deniers. So a group of Divinity School students, including me, went to him this March with a dossier of evidence and a request that Sheik Zayed's hate money be returned. Mr. Graham told us that he was going to have an "independent" researcher look into the matter and that he would get back to us in four to six weeks. We're still waiting. It should be noted that Mr. Graham has not been afraid to take a public stand on Harvard's ties to the Middle East--last year he signed a petition calling for the university to disinvest from Israel--but so far he has not spoken out on Sheik Zayed's gift.


    The French Haven't Forgotten D-Day (posted 6-6-03)

    Joe Ray, writing from Paris in the Houston Chronicle (June 5, 2003):

    For months, the question has loomed in the minds of many Americans: "Have the French forgotten?"

    Now, with today's 59th anniversary of the D-Day invasion of France at hand, most French would answer, "Non."

    Whether through words, deeds or observances, many French seem as grateful as ever to the Americans, as well as their British and Canadian liberators, who invaded the beaches of Normandy early on June 6, 1944, and began the Battle of Europe that eventually crushed Nazi Germany's armies.

    For some French, the invasion has even become part of their family histories, with stories about the soldiers handed down from generation to generation.

    During the emotional debate that preceded the Iraqi war at the United Nations, several media outlets and elected officials in the United States had a field day vilifying the French for their staunch antiwar stands.

    One U.S. newspaper replaced French Foreign Minister Dominique de Villepin's head with a weasel's in a photo from the U.N. Security Council. Some editorial cartoonists suggested rolling up the American cemeteries in France to bring the war dead back home. A few editorial writers and politicians even derided the French as the "cheese-eating surrender monkeys."

    For many French, though, the anniversary of the Normandy invasion has little to do with global politics. Genevieve Brame, an author who splits her time between Paris and her village in Normandy, calls the anniversary "part of my heritage."


    The History Conservatives Have Forgotten (posted 6-5-03)

    John Judis, writing in the New Republic (June 5, 2003):

    History is not physics. Studying the past does not yield objective laws that can unerringly predict the course of events. But peoples do draw lessons from history and change their behavior accordingly. Western European countries, for instance, took the experience of two world wars as reason to change radically their relations with one another. The United States took the experience of the Great Depression as reason to alter the relationship between government and the market.

    Historical lessons can also be unlearned or forgotten. The New Left of the 1960s, for instance, forgot the lessons of an earlier "God that failed" and projected the same hopes for a communist utopia onto Castro's Cuba or Ho Chi Minh's Vietnam that earlier generations had projected onto the Soviet Union. And, today, the right is going through its own bout of historical amnesia. Conservatives, forgetting the lessons of the early twentieth century, are attempting to rehabilitate the long-discredited strategy of imperialism.

    The revival is centered in East Coast journals and think tanks, from National Review and The Wall Street Journal editorial page in New York to the American Enterprise Institute, The Weekly Standard, Policy Review, and the Project for the New American Century in Washington. In an October 2001 Weekly Standard cover story, Max Boot called on the United States "unambiguously to embrace its imperial role." In Foreign Affairs last July, Thomas Donnelly, a former Lockheed official who is a senior fellow at the Project for the New American Century, wrote that "American imperialism can bring with it new hopes of liberty, security, and prosperity." In Policy Review last April, Stanley Kurtz called for a new "democratic imperialism."

    Although the Bush administration's foreign policy is a mix of different ideologies, it has clearly been influenced by this new imperialism. Evidence can be found in the cultlike popularity of Theodore Roosevelt, the president many conservatives take as their guide to a neo-imperial strategy. (George W. Bush has declared Roosevelt his favorite president, and Donald Rumsfeld displays a plaque quoting TR on his Pentagon desk.) More important, it is evident in the administration's attitude toward international institutions, its arguments for invading and occupying Iraq, its case for preventive war, and even its international economic strategy.

    This new imperialism differs in some respects from the older U.S. imperialism of Roosevelt and Senator Henry Cabot Lodge—the new imperialists don't assume, for instance, the superiority of the Anglo-Saxon race or seek the spread of Christian civilization—but it is sufficiently similar to raise the question of whether these new imperialists are reviving a strategy that failed the United States 80 years ago. That failure was understood most clearly by Woodrow Wilson, who offered not only the most compelling critique of U.S. imperialism but also the most thoughtful alternative—a liberal internationalism that served the United States well in the second half of the twentieth century and could guide Americans again today.

     


    Perils of a Weak Dollar (posted 6-5-03)

    Robert Samuelson, writing in Newsweek (June 2, 2003):

    To anyone with a sense of history, the Bush administration’s decision to bless a cheaper dollar must seem disquieting. It may be defensible as economic policy or simply as acceptance of the inevitable. After all, huge U.S. trade deficits (now roughly $500 billion annually) have flooded the world with so many dollars that a sizable currency decline was, at some point, likely. But the cheaper dollar, by making U.S. exports more price competitive and hurting other countries’ exports, raises the unsettling specter of “beggar thy neighbor” policies.

    OK, “beggar thy neighbor” is a mouthful. In plain language, it means that countries protect their own industries—through cheaper currencies, trade barriers, subsidies and regulatory preferences—at the expense of other countries. It’s not free trade; it’s political trade. In the 1930s, this sort of economic nationalism arguably contributed to World War II by weakening opposition to Germany. Countries don’t easily cooperate when they’re blaming each other for their economic problems.

    We need to recall this history—and avoid repeating it. In early June, leaders of the world’s wealthy democracies will meet for their annual economic “summit.” Squelching economic nationalism ought to top their agenda. The war in Iraq has already strained relations between the United States and many countries. A further poisoning of the climate could hamper negotiations on everything from trade to terrorism to AIDS. Preventing this sort of calamity is a no-brainer. But it may be hard for two reasons: (1) the world economy is already weak, and (2) a cheaper dollar threatens so many other countries.

    Consider: in 2001, the United States bought 88 percent of Canada’s exports, 30 percent of Japan’s, 21 percent of South Korea’s, 20 percent of China’s, 11 percent of Germany’s and 9 percent of France’s. Except for China and South Korea—more on this in a minute—the cheaper dollar threatens them all. Look at the European reaction. “Recession fears grow as U.S. launches ‘weapon of mass destruction’ against Europe,” warns The Guardian (U.K.). EUROPE LOSES THE CURRENCY WAR, says Le Monde (France). HARD EURO, WEAK NERVES—“The weak dollar causes problems to German exporters,” says Der Spiegel (Germany).

    The headlines speak volumes about politics; a weaker dollar feeds anti-Americanism. By contrast, the true economics are more complicated.

    First, the Bush administration didn’t cause the cheaper dollar, even if it likes the result. What’s lowered the dollar’s exchange rate is an imbalance between supply and demand: foreigners have become less willing to hold all the dollars they earn from their exports to the United States. If they sell dollars and buy euros, the dollar falls and the euro rises.

    Second, the dollar’s drop has so far reversed only about a third of the previous 34 percent increase between mid-1995 and early 2002. Other countries “benefited hugely [through exports] from the appreciation of the dollar,” says Fred Bergsten of the Institute for International Economics. It’s “churlish” for foreigners to complain now about a modest “correction.”

    Finally, Europeans (and others) can deal with the consequences of a cheaper dollar. One obvious response is to stimulate their domestic economies. “The European Central Bank [the equivalent of the Federal Reserve] can cut interest rates,” says economist Barry Eichengreen of the University of California, Berkeley. He suggests the ECB should cut its main interest rate from 2.5 percent to 1.5 percent.


    Update on the Looting of Iraq (posted 6-5-03)

    Bruce Craig, writing in the newsletter for the Coalition for History (June 5, 2003):

    It has been several weeks since we last updated you on what's going on in the effort to protect and recover antiquities, museum artifacts, and archival records in Iraq. Iraq also has more than 10,000 registered archaeological sites, and archaeologists say that treasure hunters continue to tear into them, stealing antiquities that often date back 3,000 years and more. According to a 23 May New York Times article, experts say the real threat is to 15 to 20 major sites atop ancient cities like Larsa, Fara, and the great Sumerian city of Erech. "We believe that every major site in southern Iraq is in danger," said Donny George, director of research at Iraq's State Board of Antiquities and Heritage, which oversees all archaeological excavations in Iraq.

    A few days later (on 26 May 2003) the Times reported that Iraqi officials say that they asked American military leaders for help in securing major archeological sites from looting over a month ago. Reportedly, military officials were hesitant to provide help, claiming that their time needed to be spent on more important needs, including food and water for the Iraqi people. Colonel O'Donohue, for example, stated that "we don't have anywhere near enough Marines to police every fixed site in the country. Our view is that if it's a fixed site, it's primarily an Iraqi responsibility."

    The Colonel also offered to help train and arm Iraqis to guard the sites. However, arming Iraqis would stand in violation of a new edict posted by the top American administrator in Iraq, L. Paul Bremer III, which forbids most Iraqis from carrying guns outside their homes. The Marines have sent patrols to investigate looting, but many times it has been a reactive response, too late to prevent the looting from occurring. According to the Times reporter Edmund Andrews, "the looting of archaeological sites, if unchecked, could prove far more devastating…with looters in some locations extracting more in two weeks than archaeologists had unearthed in two decades."

    Pietro Cordone, the former Italian ambassador to the U.A.E., recently appointed by Coalition forces to restore the Iraqi Ministry of Culture, including the museums and archeological sites looted in recent weeks, reassured those who worried about the protection of archeological sites. "The army has appointed forces to protect, round the clock, all the main museums and sites, such as Abraham's ancestral home in Ur, south of Iraq," he said. "In addition, in the remote areas, on-the-spot investigations are conducted constantly by using helicopters." However, reports have also surfaced about a new group of looters -- American soldiers. In a London Observer article (18 May 2003), aid workers claimed that American soldiers had vandalized the ancient city of Ur. The reports included incidents of graffiti on walls, and the theft of clay bricks.

    The looting of archeological sites is the "second wave" of cultural theft in Iraq. The first took place in the country's museums, and Coalition forces in cooperation with various cultural organizations worldwide have been working to find and return those missing items to Iraq. The good news is that on 7 May 2003, U.S. Customs agents announced that approximately 700 artifacts and 39,400 manuscripts have been recovered since the looting began. The search continues for other items that officials believe may have left the country or entered the Iraqi black market. Some items that were presumed stolen have been found in secret vaults below the museum. However, some of these recovered artifacts are now unaccounted for, raising concern that some museum staff may be trading in artifacts. Even more items have been discovered in the vault of Baghdad's Central Bank, where they await recovery.

    American officials have taken several steps to establish an accurate checklist of what was missing, and have created a program dubbed "Operation Iraqi Heritage." In addition, a two-day Interpol conference was held in Lyon, France on May 5-6, which was called to create a database of the missing artifacts from the museum. Also on 6 May, the American Coordinating Committee for Iraqi Cultural Heritage was formed in New York City. Chaired by former Smithsonian director Robert McCormick Adams, the committee will help Iraqi officials re-establish the museum, restore records, and train new curators.

    The list of participants in the Interpol conference is located here: <http://www.interpol.int/Public/WorkOfArt/Iraq/ListParticipants.pdf>. The meeting agenda is available at: <http://www.interpol.int/Public/WorkOfArt/Iraq/Programme.pdf>, and the final recommendations of the meeting are posted at: <http://www.interpol.int/Public/WorkOfArt/Iraq/finalRecommendations.pdf>. The full minutes for the meeting can be found at this address: <http://www.interpol.int/Public/WorkOfArt/Iraq/Minutes.asp>.


    What Happens If Congress Is Taken Out in an Attack? (posted 6-3-03)

    Editorial in the Wall Street Journal (June 3, 2003):

    America's Founders were remarkably prescient about many things. But who can blame them for not anticipating a hijacked airplane diving at the U.S. Capitol?

    That understandable oversight is why, in the wake of September 11, a bipartisan commission has been exploring how to ensure the continuity of government in case of another terrorist attack on Washington. Its objective is to help prepare for an orderly and legitimate transfer of government. The commission's first report -- on Congress -- will be released today. Reports on the Presidency and Supreme Court will follow.

    Congress was first on the list because (as always) it's the most problematic. The Constitution specifies that vacancies in the House can be filled only by special election -- a process that takes an average of four months. In the Senate, the 17th Amendment provides for vacancies to be filled by temporary appointment by governors.

    There's also the problem of the quorum, which House rules define as a majority of living members. Under that interpretation (which some scholars dispute), if five Members were to survive a terrorist attack, a quorum of three might end up selecting a new Speaker, who could in turn end up in the White House should the President and Vice President also be killed.

    A President picked by only three people raises serious questions of legitimacy -- especially if the Supreme Court weren't around to weigh in. Or what about incapacitation? If a large number of Members were struck down by smallpox, say, it's conceivable that neither House would be able to conduct business for lack of a quorum.

    The commission recommends a Constitutional amendment to give Congress the power to address these problems. As for specifics, it offers two main ideas: Let governors make temporary appointments, or have current Members draw up a succession list in advance. The commission wants to keep the amendment simple (see proposed text nearby), so Congress has flexibility to make and amend the rules.

    A Constitutional amendment requires passage by a two-thirds majority of both houses of Congress and ratification by the legislatures of 38 states -- a cumbersome and lengthy process. Unlike most proposed amendments (such as the one just endorsed by Bill Clinton, ahem, to let Presidents serve more than two terms) this one has a chance of passing. Congress has already held hearings on continuity and more are in the works.

    The Continuity of Government Commission is a project of the conservative American Enterprise Institute and liberal Brookings Institution. Its findings are unanimous -- no mean feat considering that members include such ideological opposites as Newt Gingrich and Kweisi Mfume. It is chaired by former GOP Senator Alan Simpson and Lloyd Cutler, former Democratic White House Counsel.

    The commission's report opens with the following scenario: "It's 11:30 a.m., inauguration day." Al Qaeda has just detonated a small nuclear device on Pennsylvania Avenue between the White House and the Capitol. "Everyone present at the Capitol, the White House, and in between is presumed dead, missing, or incapacitated." Who's in charge? That's a question for Congress to answer now, not amid a national crisis.


    Is It Really 1933 in America Today? (posted 6-3-03)

    James Traub, writing in the NYT Magazine (June 1, 2003):

    Have you heard that it's 1933 in America? God knows I have. Three times in the last few weeks I have been told -- by a novelist, an art historian and a professor of classics at Harvard, none of them ideologues or cranks -- that the erosion of civil liberties under the Bush administration constitutes an early stage, or at least a precursor, to the kind of fascism Hitler brought to Germany. I first heard the 1933 analogy a few months back, when one of the nation's leading scholars of international law suggested at a meeting of diplomats that Bush's advisers were probably plotting to suspend the election of 2004.

    Now, I think I understand the argument that compares the United States with imperial Rome, or with one of the unwitting great powers of 1914. But 1933? Hitler? That's grotesque; and the fact that it has achieved such currency among what the French call the bien pensant is vivid proof that in much of the left, 9/11 and its aftermath have increased the visceral loathing not of terrorism or of Islamist fundamentalism but of President George Bush.

    Like all forms of reductio ad Hitler, the 1933 analogy constitutes a gross trivialization of the worst event in modern history. Do we remember what actually happened in 1933? Hitler ascended to the chancellorship, suspended constitutional rights and banned all opposition political parties, sent the Brown Shirts into the streets and issued the first decrees stripping Jews of their rights. To compare the passage of the U.S.A. Patriot Act and the proposed -- but scotched -- program to get ordinary citizens to pass along tips about suspicious dark-skinned strangers, not to mention the cancellation of Tim Robbins's invitation to appear at the Baseball Hall of Fame because he might criticize the war in Iraq -- to compare these and other inroads on our liberties to Hitler's budding terror state is repellent.

    But 1933 theorists, at least the more sophisticated ones, look beyond current policy to what they consider the structural similarities between contemporary America and various fascist states. In a recent article in The Nation, Sheldon Wolin, an emeritus professor of politics at Princeton, described the contemporary Republican party as "a fervently doctrinal party, zealous, ruthless, antidemocratic and boasting a near majority." The combination of toothless Democrats, a compliant media and "a politically demobilized society" ensures that the Republicans, and their corporate overlords, will face little opposition in their drive for total domination.

    Haven't we been here before? Back in the 60's, when my ideological sympathies were first shaped, practically everybody was considered a fascist, including your junior-high-school principal. Those were genuinely apocalyptic times: National Guardsmen were shooting protesters, black revolutionaries were strapping on bandoliers and President Nixon was turning the F.B.I. into a private investigative force. But the era ended not with the triumph of the state, but with its humbling: the impeachment proceedings against Nixon, the election of the modest Jimmy Carter, the clipping of the C.I.A.'s wings (now being unpinned post-9/11). Civic and state institutions, including the media, the judiciary and the party system, proved stronger than the forces that would constrain them.

    And this is really the fundamental point: fascist states arise not simply because a mesmerizing leader seizes state power in unsettled times but because the democratic institutions that might oppose him have rotted away, as they did in Weimar Germany. Has that really happened here? It's true that today's Republican Party is, by all historical standards, fervently doctrinal, if not necessarily ruthless or antidemocratic. Left to its own devices, the Bush administration, and especially Attorney General John Ashcroft, might be perfectly willing to expand government powers to fight terrorism no matter the cost to individual liberties. But the administration has not been left to its own devices. Opposition from both liberals and libertarian conservatives -- i.e., Republicans -- killed the TIPS program and may already be hindering next-generation Patriot II legislation; organs of the "corporate-controlled media," like "60 Minutes," have reported on the growing threat to civil liberties.

    Much of the left seems to feel that the greatest threat to emerge from 9/11 is an untrammeled Bush administration -- as if the destruction of the twin towers were the functional equivalent of the Reichstag fire, as I have heard one of my friends say. And yet even the most devout civil libertarians recognize that the terrorist threat compels rethinking. Norman Siegel, the former head of the New York Civil Liberties Union and a famous First Amendment purist, says, "The security interests are real, they're legitimate and you have to balance freedom and security in a different way post-9/11." Siegel says that he has been hard put to explain to skeptical audiences that the Patriot Act, for all its problems, does not preclude traditional forms of peaceful protest.


    The Tyranny of the New Grammar Purists (posted 6-2-03)

    Geoffrey Nunberg, writing in the NYT (June 1, 2003):

    Is there a grammatical error in the following sentence? "Toni Morrison's genius enables her to create novels that arise from and express the injustices African Americans have endured."

    The answer is no, according to the Educational Testing Service, which included the item on the preliminary College Board exams given on Oct. 15 of last year. But Kevin Keegan, a high-school journalism teacher from Silver Spring, Md., protested that a number of grammar books assert that it is incorrect to use a pronoun with a possessive antecedent like "Toni Morrison's" — that is, unless the pronoun is itself a possessive, as in "Toni Morrison's fans adore her books."...

    Unlike the hoary shibboleths about the split infinitive or beginning sentences with "but," this one is a relative newcomer, which seems to have surfaced in grammar books only in the 1960's. Wilson Follett endorsed it in his 1966 "Modern American Usage," and it was then picked up by a number of other usage writers, including Jacques Barzun and John Simon.

    The assumption behind the rule is that a pronoun has to be of the same part of speech as its antecedent. Since possessives are adjectives, the reasoning goes, they can't serve as antecedents for pronouns, even if the resulting sentence is perfectly clear.

    If you accept that logic, you'll eschew sentences like "Napoleon's fame preceded him" (rewrite as "His fame preceded Napoleon"). In fact you'll have to take a red pencil to just about all of the great works of English literature, starting with Shakespeare and the King James Bible ("And Joseph's master took him, and put him into the prison"). The construction shows up in Dickens and Thackeray, not to mention H. W. Fowler's "Modern English Usage" and Strunk and White's "Elements of Style." ("The writer's colleagues . . . have greatly helped him in the preparation of his manuscript.") ...

    The ubiquity of those examples ought to put us on our guard — maybe the English language knows something that the usage writers don't. In fact the rule in question is a perfect example of muddy grammatical thinking. For one thing, possessives like "Mary's" aren't adjectives; they're what linguists call determiner phrases. (If you doubt that, try substituting "Mary's" for the adjective "happy" in sentences like "The child looks happy" or "We saw only healthy and happy children.")

    And if a nonpossessive pronoun can't have a possessive antecedent, logic should dictate that things can't work the other way around, either — if you're going to throw out "Hamlet's mother loved him," then why accept "Hamlet loved his mother"? That's an awful lot to throw over the side in the name of consistency.

    But that's what "correct grammar" often comes down to nowadays. It has been taken over by cultists who learned everything they needed to know about grammar in ninth grade, and who have turned the enterprise into an insider's game of gotcha! For those purposes, the more obscure and unintuitive the rule, the better.
