Category: Articles

Welcome to our articles section. The articles below were either written specifically for ButterfliesandWheels or have previously been published elsewhere.

If you’re interested in writing an article for ButterfliesandWheels, please click here for our information for contributors page.

  • The Arts and Cultural Diversity

    Immigrant, ethnic minority, asylum-seeker – slivers of insinuation separate
    the meanings of each term in contemporary Britain. Ethnic minority, black and
    Asian, cultural diversity – clouds of obfuscation have distinguished contemporary
    arts in Britain over the past 30 years.

    That I draw an analogy between socio-political and artistic terminology is
    not incidental: socio-political concerns have determined arts-funding policy
    for the past three decades. Ever since, in fact, the publication of Naseem Khan’s
    seminal report for the Arts Council in 1976, ‘The Arts Britain Ignores’. This
    year sees the launch of yet another arts initiative, designed to heap attention
    on ‘culturally diverse’ arts, aptly titled ‘decibel’ (noise). Why do we need
    a showcase of ethnic arts? And what noise is decibel really making?

    In the same year Naseem Khan’s report was published, the writer Amrit Wilson
    published her compilation of Asian women’s stories – Finding A Voice.
    It seems we ethnics are still deemed to be in need of finding a public voice.
    Hence the decibel showcase, where Arts Council England aims to draw the
    attention of producers nationally and internationally to work they have hitherto
    ignored. This despite ‘ethnic’ artists of the first rank making noises in almost
    every category of contemporary arts, from Anish Kapoor to Ben Okri, Akram Khan
    to Chiwetel Ejiofor, Shobana Jeyasingh to Zadie Smith.

    ‘Access’ and ‘opportunity’ are the current buzzwords, their roots lying in the
    best of British liberal sentiments. It is laudable for any society that considers
    itself civilised to seek to promote an equality of opportunity for all its citizens.
    But when the wheel is having to be reinvented every 10 years or so, it is time
    to question the wheel.

    In 1976, it was Naseem Khan’s report that drew attention to the arts Britain
    ignored. In 1983, the Arts Council sought, through its ‘Glory of the Garden’
    policy, to enforce a minimum representation of ethnic arts in the arts infrastructure
    of the country. In the mid-1990s, the Arts Council adopted the promotion of
    cultural diversity as a central part of its mission statement. In 2002, the
    Arts Council’s Eclipse Report aimed to change the institutionally racist
    face of British theatre.

    Significantly, every liberal political measure undertaken so far to correct
    injustices – the Stephen Lawrence Inquiry into institutional racism being only
    the most recent – has proven ineffectual. Racism is not an intellectual failure
    that can be corrected by a greater dose of education. It is a moral value, however
    much one may abhor such a morality. It is an imaginative construct and so the
    engineers of the imagination – artists – find themselves in the frontline, their
    weapons being the pen or the hand or the body or the voice.

    But when these troops are divided by ethnicity, it makes the prospect of victory
    ever dimmer. Just as we forget that Black Africans, Caribbeans and Americans,
    Indians, Chinese, Afghans and other ‘ethnics’ fought alongside the Allies in
    two world wars, so by institutionalising ethnic divisions we are prone to forget
    that in contemporary British arts there is an ever-present ‘ethnicity’ – and
    forgetting is tantamount to devaluing.

    What, if any, is the artistic significance of Bombay Dreams as a West
    End musical? Is its ability to attract an Asian audience into the West End an
    artistic value? Is, conversely, Jerry Springer – The Opera anything
    more than a ‘pakora’ musical: say ‘cunt’ and ‘fuck’ enough times and you’re
    guaranteed to draw in a youthful audience in droves (mirroring what goes on
    in the Asian comedy scene, where the mention of ‘pakora’ is a sure-fire route
    to cracking up the audience).

    A national theatre critic once admitted to me that he came to review our shows
    not because of the particular play we were producing but because it was ethnic
    and needed to be brought to wider attention. That was in 1989. Has much changed
    since then? And will showcasing culturally diverse work – as the decibel
    initiative purports to do – help that critic to drop his ethnic lens? I very
    much doubt it.

    Of equal concern, however, is how the critic judges: how to evaluate ‘other’
    arts. And this is a real challenge. How does one judge, for example, the ‘fusion’
    dance of Akram Khan, drawing as much on classical Kathak as it does on contemporary
    dance vocabularies? This I take to be the artistic value of ‘cultural diversity’
    – challenging our preconceptions, our imaginations.

    But when a corral is created around cultural diversity, we are fed, and we
    help sustain, difference, rather than being challenged to explore connections.
    Merely beating the drum of culturally diverse arts – as decibel seeks
    to do – will only help to marginalise these artists within the confines of ‘identity’.
    Identity need not be immutable; it can be in dialogue with other identities.
    It is only then that we can all participate in the quality of the artistic experience.

    The contradictions in arts policies were brought home to me at the recent Asian
    Women’s Achievement Awards. The title, of course, is very ghetto. But Cherie
    Blair and other leading New Labour women were there to endorse it – as an instance
    of the multicultural reality of Britain today. Surely a multicultural, integrated
    society would honour women’s achievements, or even just achievements. So are
    we in reality talking multi-culture or separate-cultures?

    We in the arts world have become so dominated by marketing gurus – and their
    dogma of ‘niche marketing’ – that we forget that if I don’t see you in me and
    if you can’t see me in you we might as well dispense with the abiding hope of
    the arts: to connect one human being with another. This fetishisation of marketing
    makes the good ship Arts scythe through the waves of humanism and, like Moses,
    we stand before a divided sea.

    Jatinder Verma is artistic director of Tara Arts, a theatre company he co-founded
    in 1977. Tara Arts is a regular client of Arts Council England, has toured extensively
    in Britain and overseas, and co-produced with theatres including the Royal National
    Theatre and the Lyric Theatre, Hammersmith. Tara Arts’ latest show, A Taste
    For Mangoes, about the Indo-British love affair before the Raj, will open in
    London’s oldest music hall, Wilton’s, in November 2003.

  • Who Mourns the Gepids?

    The answer to the question in the title is "No one," but it will
    take a while to get to the reasons. I thought about the Gepids as I drove through
    the Navajo Reservation in Arizona and New Mexico through incomparable scenery,
    a lot of history, and often uncomfortable knowledge about the present, much
    of it filtered through the novels of fellow Oklahomans, Tony Hillerman and Ron
    Querry. Their books and other sources touch on problems of the contemporary
    Navajo, but they are more noted for their celebration of the coherence of Navajo
    culture and the sense of "hozho," of oneness with the beauty of the
    world. This theme is attractive to many Anglos who buy into a nostalgia for
    a culture they don’t know and who, as they learn more about Navajo-white relations,
    may not only feel guilty about overrunning other cultures but come to believe
    that these cultures are superior to our own. While this attitude is understandable,
    does credit to the goodheartedness of those who hold it, and is to a degree
    laudable, it is sometimes based on mistaken assumptions about or ignorance of
    human beings and the process of human history. More important, it can be a way
    of avoiding social and individual problems rather than dealing with them realistically.


    Not that it is wrong to find Navajo country and its people attractive. The
    landscape is spectacular, and the human scene looks exotic to anyone who has
    not driven across other stretches of the American West. Scattered across the
    valleys and up gradual slopes to the mountains are ranches and farms, some with
    traditional hogans, some with modern, chiefly manufactured, housing, some with
    both, some with hogans that have two stories or ells. Here and there, there
    are small, dusty towns with wide streets, along which not many people move,
    and not very quickly. Except for the vernacular architecture, it doesn’t look all that
    different from west Texas.


    Of course, it is different. The people here are darker and more heavily built
    and have a different lilt to their speech. And along the highways trudge a few
    pedestrians who don’t bother to put out thumbs to the passing cars. Here and
    there a man or boy watches a pair of dogs herd sheep along or across the road.
    The core of the culture – what attracts the attention of outsiders who romanticize
    it – has not, despite missionaries of various sects, been fundamentally changed
    by Christian or European influences.


    But those influences have been and are profound. The U.S. government’s treatment
    of the Navajo, as of other Indians, is not pretty to contemplate: expropriation
    of land, deportation, fiddling with or outright breaking of treaties. Poverty
    and alcoholism rates are high. Educators struggle to keep the Navajo language
    alive, and there is a severe shortage of singers to conduct traditional ceremonies.
    No Anglo with any vestige of conscience can look at all of this without feeling
    guilty about what our people have done.


    But we feel this way because we know what happened and is happening. And that
    brings me back to the Gepids and the question in the title. Few people in the
    U.S., or for that matter anywhere else, have ever heard of the Gepids or know that
    they were one of the tribes overrun and then obliterated by the Magyar incursion
    into the Carpathian Basin, much of which is now Hungary, a bit over a millennium
    ago.


    Even those who have heard of the Gepids don’t know much about them because,
    even if the Gepids had a written culture, the Magyars did not, and so the victors
    would have had no means of recording and transmitting information in the unlikely
    event that they had been interested. Thus there is no cult of Gepid spirituality,
    no attempt to revive Gepid rituals, and no upholding of the Gepid way as superior
    to that of the people who displaced it.


    In modern times, however, the winners have taken care, though not always the
    greatest care, to record information about the losing side, all the way from
    preservation, in translation, of the Aztec codex through Joseph Conrad’s impressionistic
    account of European incursions into the Congo in Heart of Darkness to
    the latest self-styled white shaman’s version of Native American culture.


    Moreover, many contemporary marginalized cultures have representatives able
    to speak for them. Chinua Achebe, a leading African novelist, attacked Conrad
    for presenting issues from the European perspective. N. Scott Momaday speaks
    and writes eloquently about the culture of his forbears and other tribes. And
    this process continues, as in Geary Hobson’s The Last of the Ofos, in
    which the sole survivor of a Louisiana tribe recounts the conditions which he
    has survived and, in the moving last chapter, speaks the language that only
    he knows to the wind and swamp. Among my own ancestors and distant cousins,
    Celts and Confederates have published whole libraries celebrating those ways
    of living and of interpreting the world. Celebrating one’s heritage is not only
    honorable but necessary.


    Of course, these apologists frequently attempt not simply to record ways of
    living but assert quite loudly their spiritual and aesthetic superiority to
    those of victors who depended on brute technology or accident—stirrups in the
    case of the Magyar; horses, steel, guns, and disease in the case of the Europeans
    who came to the Americas. The sins of their fathers are conveniently thrown
    in the memory hole—partly because people defeated thoroughly enough give themselves,
    and frequently get, amnesty for actions which, had they been performed by Europeans,
    would be roundly condemned.


    Claims for the superiority of marginalized cultures are sometimes honored by
    defectors from the majority culture, the more sensitive or disgruntled heirs
    of the victors, for several reasons. For one thing, the hearers are impressed
    by the testimony of representatives of minority cultures. But many members of
    the majority culture are reluctant to examine this testimony critically, making
    the unconscious assumption that the speakers’ status somehow gives their statements
    the stamp of infallibility. And this is a subtle form of condescension.


    These well-intentioned members of the majority have another reason for accepting
    criticism of their society: they can see that what victory has produced is far
    from perfect and coherent. On the other hand, a beleaguered minority group can
    seem more tightly-knit and coherent than the culture from which the observer
    comes. Even as a lapsed Catholic, though not Irish, I found the hair on the
    back of my neck rising when I saw my first Orange Lodge. Those more deeply involved
    with a minority culture can be unified by outside pressure, like various Indian
    tribes who in previous centuries hated, despised, and made war on each other.
    Someone who has never heard a Cherokee talk about Kiowas or vice versa, to take
    one of many examples, might envy Indian unity in contrast to the divisions he
    or she perceives in white society.


    Less supportable, at least without critical examination of the testimony, is
    the claim that what has been displaced or suppressed is superior to one’s own
    culture. Still, it is easy to understand how some people are able to think so.
    For one thing, a little knowledge can be consoling, and a lot of knowledge can
    be disheartening. We know a great deal about our own society and, unless we
    are experts, very little about others. Thus D. H. Lawrence could suppose in
    the absence of any hard evidence that the Etruscans had a culture far more balanced
    and harmonious than that of the vulgar, expansionist Romans who supplanted them
    and, of course, of the far more vulgar and expansionist society in which he
    lived. And, according to an anthropologist friend, many students enter her introductory
    course with highly romantic views of any people who can be termed "primitive."
    The Mayans were once considered to be gentle humanists superior to the Aztecs,
    who had a highly developed civilization but also practiced human sacrifice,
    because, thanks to a Spanish friar, we have known something about the Aztecs
    for a long time. But until recently, when linguists like David Kelley were able
    to translate Mayan glyphs, we knew very little about the Mayans. It turns out
    that they skinned captives alive, among other things, and what they did when
    a king died will make you grab your crotch protectively.


    There is also selective use of evidence—or, perhaps, what Orwell called "doublethink."
    In Legends of the American Desert, Alex Shoumatoff has the "impression
    that most Navajo, even progressives who live in the cities of the Southwest,
    still live by the Navajo Way," which he quotes a woman as defining as
    "being in harmony with everything—yourself, mainly, all the living things,
    the air, Father sky, moon, and on and on." Wouldn’t it be pretty to think
    so. But elsewhere Shoumatoff gives figures that suggest otherwise: "five
    hundred intoxicated Indians freeze to death or are hit by cars in New Mexico
    every year," and in broader terms, alcohol-related deaths, unemployment,
    tuberculosis, suicide, and infant mortality rates are much higher than in the
    rest of the population. Some of the problems are due to Anglo incursion or neglect,
    and whatever their cause, they ought to be corrected as soon as possible. But
    it does no good to insist that the traditional system is perfect or necessarily
    superior to other ways of looking at life when it has so obviously failed to
    deal with these issues.


    A second reason for preferring other cultures to one’s own is the assumption
    that, because the people are not like us in some ways, they must be totally
    different. One example of this attitude can be seen in historical museums, where
    spectators and even curators seem fascinated by the possibility that primitive
    peoples created social structures and artifacts not wholly unrecognizable to
    the people on the other side of the glass in the display case. But for the most
    part observers tend to see so-called exotic peoples as wholly other and to regard
    as pure and unmixed their motives and responses. This purity is hard to discover
    in lived experience. Consider religious rituals. One might ask, though people
    rarely do, whether even the most traditional Hopis or Navajos are so different
    from white Christians that they never have doubts about the efficacy of their
    rituals, or sigh at having to get up and perform a ceremony, or show up because
    it’s expected of them, or enjoy shaking their booty, or wish to be somewhere else.
    Did all participants have equal faith and fervor? Did no one come for social
    or aesthetic reasons?


    My experience tells me otherwise. During my college years, when I was a practicing
    Catholic, trying to observe not only the letter but the spirit of the law, I
    spent weekday mornings one summer vacation singing Requiem Masses at the parish
    church with two other college students. I can’t speak to their motives, but
    mine were minimally spiritual. Ideally, the Masses were supposed to help spring
    the departed from Purgatory. But my decision to get up very early, before going
    to work, was primarily aesthetic rather than spiritual: I enjoyed singing, especially
    this kind of music, and there were very few aesthetic outlets in small-town
    Missouri in the early 1950s.


    But at least I believed. A friend, son of a Texas Baptist minister, has lost
    his faith but not his taste for hymns, and he meets regularly with a group to
    perform "shape note" music, originally religious and now purely recreational.
    And many people seem to go to church because they can get out of the house,
    meet people, and engage in a different kind of ritual, like Shriners riding
    tiny motorcycles in parades.


    It’s possible, of course, that people we regard as exotic are really different
    from you and me. It’s highly improbable that we can know. An outsider like Tony
    Hillerman can imagine what it’s like to be a Navajo in his mystery stories set
    on the reservation. Still, he hedges his bets by concentrating on Joe Leaphorn
    and Jim Chee, who have been to Anglo universities. And in any case, his novels
    deal with solving problems—whodunnit—rather than with presenting problems, like
    alcoholism and alienation from the old ways, in their full complexity. The very
    nature of genre fiction distances the reader from these issues. This, by the
    way, does not make them bad books, but as Hillerman says, "My readers are
    buying a mystery, not a tome of anthropology….the name of the game is telling
    stories; no educational digressions allowed." [Tony Hillerman and Ernie
    Bulow, Talking Mysteries: A Conversation with Tony Hillerman (Albuquerque:
    University of New Mexico Press, 1991), p. 39.]


    Nevertheless, Hillerman, like every sensitive student of another culture, performs
    a valuable service in making readers more aware of the problems of a culture
    obviously under attack. And no one in the majority should regret attempts to
    preserve knowledge about other beliefs and ways of life. But no outsider, no
    matter how well-intentioned, should be allowed to forget the observer effect
    popularly attributed to Heisenberg’s Uncertainty Principle: one cannot study
    a phenomenon without somehow affecting it. Like
    ecotourism, which has much the same effect on an environment as the supposedly
    more vulgar kind, contact with other cultures affects them. To put it another
    way: consider the Prime Directive in the original "Star Trek." If
    Kirk and the crew of the Enterprise had observed it, at least half of the episodes
    could never have existed.


    But many people who sentimentalize extinct or suppressed cultures do so less
    out of love for them than out of distaste for their own. Susan Smith Nash, who
    has worked with and translated indigenous writers, sees this process as becoming
    "a sort of commodification of history that works because it scandalized
    certain behaviors (thus fetishizing them and titillating the audience), and
    it makes history (or the construction of it) an artifice to be retooled each
    season so that it’s a fashion statement as well. Sentimentalizing is a kind
    of sales pitch—what’s being sold? The thrill of voyeuristically contemplating
    atrocities? A way to appear enlightened?" Or, as a Navajo put it to Shoumatoff,
    "You Americans are looking for instant religious satisfaction, like instant
    mashed potatoes." In other words, as with Lawrence and the Etruscans, the
    Other provides convenient and all too easy weapons with which to attack the
    ways of one’s parents, literal or figurative, in the dominant culture.


    But in cooler and more logical terms, it seems unwise to regard the ways of
    other cultures as necessarily superior. Take the film "Koyaanisqatsi,"
    the title a Hopi word meaning "life out of balance," which indicts
    America’s rampant urbanism and mindless embracing of technology. That is a valid
    point – but to make it Godfrey Reggio used a number of advanced technological
    resources, and the review on the Apollo website concludes that
    "To get the full effect on video, you’ll need a big screen, an excellent
    sound system, and the audio turned up good and high." Another on-line reviewer,
    Vladimir Zelevinsky, was left "with the feeling of elation and triumph.
    The sheer complexity of the urban activity and power and variety of humanity
    on display is enough to make one proud to be a part of these extravagant species
    [sic]."


    This was a more eloquent version of my reaction to the film, and it leads to
    a broader issue: don’t be too quick to despise the familiar. There is plenty
    wrong with modern American civilization and with Catholicism. But there is also
    a great deal to celebrate, for the traditions and rituals of both are rich and
    complex. And there is plenty wrong with colonialism, which Joseph Conrad’s Marlow
    defines as "the taking it away from those who have a different complexion
    or slightly flatter noses than ourselves." As he says, it’s not a pretty
    sight.


    But it is a common one. Human beings are opportunistic. Those to whom we did
    it almost certainly did it to someone else. My Celtic ancestors moved, or sometimes
    were moved, through Europe until they ran against the Atlantic Ocean. The Navajos
    and Apaches moved from western Canada to the American Southwest, where they
    harassed and, when possible, despoiled the Pueblo and other neighboring tribes,
    who helped in Kit Carson’s campaign against them. The Apaches were forced farther
    south and west by the incursion of the Comanches. In the anthropologist’s version,
    American Indians came from Asia. If one insists on tribal origin stories, the
    Navajos came up from a lower world to this one, specially created for them.
    That, and the Navajos’ name for themselves, Dineh, The People, sounds a little
    like Manifest Destiny. Or consider the case of the Kennewick Man, whose remains,
    nine millennia old and not indisputably Mongoloid, were discovered in Washington
    state. Scientists wanted to test the DNA; Indian groups maintained that this
    was intrusive and that, since he was discovered on their land, he must be Indian.
    In any case, they argued, they had always been here, so he had to be one of
    their ancestors. There is every reason to respect native culture and traditions,
    but imagine what the response would be to similar claims by white Christian
    fundamentalists, including the Kansas textbook board, which removed textbooks
    that so much as mentioned evolution.


    But the Indian claim points to a serious question: which people can be called
    indigenous? While I was drafting this essay, I heard a talk by a learned and
    earnest young Osage scholar who listed a number of indigenous peoples in the
    Americas, Australia, New Zealand, and Africa. With the exception of the Lapps
    – and perhaps the Basques, they might argue – Europe apparently has no indigenous
    peoples. But the term really means only that no one knows when a people got
    where they are now.


    I raise this issue not to quibble and certainly not to justify regarding any
    other culture as inferior or to excuse some or even most of the actions of those
    who moved in, including Euro-Americans. Nor to make my fellow Americans feel
    triumphant: our turn will come.


    When it does, those of us who survive may take whatever consolation is possible
    in the sight of the disaffected heirs of our conquerors appropriating the outward
    and visible signs of our culture. Perhaps really avant-garde youth will wear
    three-piece suits; drink Chivas Regal; dine at ethnic restaurants serving meat
    loaf, mashed potatoes, and green beans boiled limp; listen to Lawrence Welk
    and Neil Diamond; and attend re-creations of tent revivals. They may even hire
    us as gurus and tour guides.


    In the meantime, it might be well to follow the advice of the strongest character
    in Frank Chin’s novel Donald Duk to a boy who hates being Chinese: "History
    is war, not sport." And like Donald Duk, you have to learn about your history
    in order to celebrate and defend it. And about others’ histories in order to
    understand and respect rather than to sentimentalize them.


    We also have to deal with internal critics who attempt to deconstruct American
    history and institutions. Some of this is necessary: the George Washington cherry
    tree type of history persists in the minds of many Americans, including those
    who infest the editorial page of the largest newspaper in my state. All we can
    say to these criticisms is "Yes."


    And then "but." The young Osage scholar speaks of a responsibility
    to his own past, his own family, his own history. We have the same responsibility,
    made more difficult by the greater burden of knowledge and the even greater
    burden of success. It can be tempting, as Geary Hobson points out in "The
    Rise of the White Shaman as a New Version of Cultural Imperialism," to
    shirk this burden and try to become a reverse image of what blacks, Indians,
    and Asians call oreos, apples, bananas – white only on the outside, red, black,
    or yellow on the inside. This kind of distortion makes the imitator ludicrous
    in the eyes of the people being imitated. As Hobson says, people like this need
    "to restore themselves to their own houses – by learning and accepting
    their own history and culture."


    That doesn’t mean that any of us can afford to ignore, let alone despise, other
    cultures. The Osage hasn’t cut himself off from white language and culture;
    Frank Chin knows about Westerns and flamenco; Ralph Ellison knew about Hemingway
    and Malraux and a whole lot else; and Hobson, a Quapaw-Cherokee-Chickasaw, teaches
    Faulkner. But none of them wants to be an imitation white man, and it would
    be equally mistaken for whites to try to be Navajos or blacks or anything else
    but what we are. And what we are is in part a heritage, a family, and a history,
    and none of us can escape these, or should try. As King Duk says in Chin’s novel,
    "You gotta keep the history yourself or lose it forever….That’s the mandate
    of heaven." Shoumatoff quotes the Navajo phrase about "becoming real"—getting
    rid of false values and seeing the true nature of things. All of us need, Chin
    would agree, to take responsibility for what we have become and will become.
    Perhaps Shoumatoff should consider more directly that there are ways, besides
    the Navajo way, of doing so. Owen Wister, often a little too obsessed with his
    white heritage and values, nevertheless has a point when his Virginian insists
    that there is only one kind of goodness and that he tries to follow it. "And
    when I meet it," he adds, "I respect it." And so should every
    Asian-American, African-American, Navajo, and anyone else who hopes to become
    real—by whatever name it is called and whatever process it is attained.

    This article was originally published in Southwest Review, vol. 86,
    no. 1, 2001.

  • Debunking Edward Said

    This is an edited version of the article, Debunking Edward Said – Edward
    Said and Saidists: or Third World Intellectual Terrorism, which is here.
    For ease of reading, references and bibliographical information have been
    removed from this edited version of the article, but the longer version is
    fully referenced. Interested readers should follow the link!

    Consider the following observations on the state of affairs in the contemporary
    Arab world :

    The history of the modern Arab world – with all its political failures,
    its human rights abuses, its stunning military incompetences, its decreasing
    production, the fact that alone of all modern peoples, we have receded in democratic
    and technological and scientific development – is disfigured by a whole series
    of out-moded and discredited ideas, of which the notion that the Jews never
    suffered and that the holocaust is an obfuscatory confection created by the
    Elders of Zion is one that is acquiring too much – far too much – currency;

    ….[T]o support Roger Garaudy, the French writer convicted earlier this year
    on charges of holocaust denial, in the name of ‘freedom of opinion’ is a silly
    ruse that discredits us more than we already are discredited in the world’s
    eyes for our incompetence, our failure to fight a decent battle, our radical
    misunderstanding of history and the world we live in. Why don’t we fight harder
    for freedom of opinions in our own societies, a freedom, no one needs to be
    told, that scarcely exists?

    It takes considerable courage for an Arab to write self-criticism of this kind;
    indeed, without the personal pronoun ‘we’, how many would have guessed that an
    Arab, let alone Edward Said himself, had written it? And yet, ironically, what
    makes self-examination for Arabs and Muslims, and particularly criticism of
    Islam in the West, very difficult is the totally pernicious influence of Edward
    Said’s Orientalism. That work taught an entire generation of Arabs
    the art of self-pity – “were it not for the wicked imperialists, racists
    and Zionists, we would be great once more” – encouraged the Islamic fundamentalist
    generation of the 1980s, and bludgeoned into silence any criticism of Islam.
    It even stopped dead the research of eminent Islamologists who felt their findings
    might offend Muslim sensibilities, and who dared not risk being labelled “orientalist”.
    The aggressive tone of Orientalism is what I have called “intellectual
    terrorism,” since it does not seek to convince by argument or historical
    analysis but by spraying charges of racism, imperialism, and Eurocentrism from
    a moral high ground; anyone who disagrees with Said has insult heaped upon him.
    The moral high ground is an essential element in Said’s tactics; since he believes
    his position is morally unimpeachable, Said obviously thinks it justifies him
    in using any means possible to defend it, including the distortion of the views
    of eminent scholars, interpreting intellectual and political history in a highly
    tendentious way, in short twisting the truth. But in any case, he does not believe
    in the “truth”.

    Said attacks the entire discipline of Orientalism, which is devoted
    to the academic study of the Orient, accusing it of perpetuating
    negative racial stereotypes, anti-Arab and anti-Islamic prejudice, and the myth
    of an unchanging, essential “Orient”. He also accuses Orientalists
    as a group of complicity with imperial power, and holds them responsible for
    creating the distinction between Western superiority and Oriental inferiority,
    which they achieve by suppressing the voice of the “oriental” and
    by their anti-human tendency to make huge but vague generalizations about entire
    populations which in reality consist of millions of individuals. In other words,
    much of what was written about the Orient in general, and Islam and Islamic
    civilisation in particular, was false. The Orientalists also stand accused of
    creating the “Other” – the non-European, always characterised in a negative
    way, as, for example, passive, weak, and in need of civilizing (western strength
    and eastern weakness).

    But “Orientalism” is also more generally “a style of thought
    based upon an ontological and epistemological distinction made between ‘the
    Orient’ and (most of the time) ‘the Occident’.” Thus European
    writers of fiction, epics, travel, social descriptions, customs and people are
    all accused of “orientalism”. In short, Orientalism is seen “as
    a Western style for dominating, restructuring, and having authority over the
    Orient.” Said makes much of the notion of a discourse derived from Foucault,
    who argued that supposedly objective and natural structures in society, which,
    for example, privilege some and punish others for nonconformity, are in fact
    “discourses of power”. The putative “objectivity” of a
    discipline covered up its real nature; disciplines such as Orientalism participated
    in such discourses. Said continues, “…[W]ithout examining Orientalism
    as a discourse one cannot possibly understand the enormously systematic discipline
    by which European culture was able to manage – even produce – the Orient politically,
    sociologically, militarily, ideologically, scientifically, and imaginatively
    during the post-Enlightenment period.”

    From Pretentiousness to Meaninglessness

    There are, as I shall show, several contradictory theses buried in Said’s impenetrable
    prose, decked out with post-modern jargon (“a universe of representative discourse”,
    “Orientalist discourse”) – and some kind editor really ought to explain
    to Said the meaning of “literally” and the difference between scatological
    and eschatological – and pretentious language which often conceals some banal
    observation, as when Said talks of “textual attitude” when all he
    means is “bookish” or “bookishness”. Tautologies abound,
    as in “the freedom of licentious sex”.

    Or take the comments here: “Thus out of the Napoleonic expedition there
    issued a whole series of textual children, from Chateaubriand’s Itinéraire
    to Lamartine’s Voyage en Orient to Flaubert’s Salammbô,
    and in the same tradition, Lane’s Manners and Customs of the Modern Egyptians
    and Richard Burton’s Personal Narrative of a Pilgrimage to al-Madinah
    and Meccah. What binds them together is not only their common background
    in Oriental legend and experience but also their learned reliance on the Orient
    as a kind of womb out of which they were brought forth. If paradoxically these
    creations turned out to be highly stylized simulacra, elaborately wrought imitations
    of what a live Orient might be thought to look like, that by no means detracts
    from the strength of their imaginative conception or from the strength of European
    mastery of the Orient, whose prototypes respectively were Cagliostro, the great
    European impersonator of the Orient, and Napoleon, its first modern conqueror.”

    What does Said mean by “out of the Napoleonic expedition there issued
    a whole series of textual children” except that these five very varied
    works were written after 1798? The pretentious language of textual children
    issuing from the Napoleonic expedition covers up this crushingly obvious fact.
    Perhaps there is a profound thesis hidden in the jargon, that these works were
    somehow influenced by the Napoleonic expedition, inspired by it, and could not
    have been written without it. But no such thesis is offered. This arbitrary
    group consists of three Frenchmen and two Englishmen; of one work of romantic
    historical fiction, three travel books, and one detailed study of modern Egyptians.
    Chateaubriand’s Itinéraire (1811) describes superbly his visit to the
    Near East; Voyage en Orient (1835) records Lamartine’s impressions of
    Palestine, Syria, and Greece; Salammbô (1862) is Flaubert’s novel of
    ancient Carthage; Lane’s Manners and Customs of the Modern Egyptians (1836)
    is a fascinating first-hand account of life in Egypt, particularly Cairo and
    Luxor, written after several years of residence there; and Burton’s account of
    his audacious visit to Mecca was first published in three volumes in 1855–6.
    Lane and Burton both had perfect command of Arabic, Classical and Colloquial,
    while the others did not; Lane and Burton can be said to have made contributions
    to Islamic Studies, particularly Lane, but not the three Frenchmen.

    What on earth do they have in common? Said tells us that what binds them together
    is “their common background in Oriental legend and experience but also
    their learned reliance on the Orient as a kind of womb out of which they were
    brought forth”. What is the background of Oriental legend that inspired
    Burton or Lane? Was Flaubert’s vivid imagination stimulated by “Oriental
    legend”, and was this the same legendary material that inspired Burton,
    Lane and Lamartine? “Learned reliance on the Orient as a kind of womb…”
    is yet another example of Said’s pretentious way of saying the obvious, namely
    that they were writing about the Orient, of which they had some experience
    and intellectual knowledge.

    Orientalism is peppered with meaningless sentences. Take, for example,
    “Truth, in short, becomes a function of learned judgment, not of the material
    itself, which in time seems to owe its existence to the Orientalist”. Said
    seems to be saying: ‘Truth’ is created by the experts or Orientalists, and does
    not correspond to reality, to what is actually out there. So far so good. But
    then “what is out there” is also said to owe its existence to the
    Orientalist. If that is the case, then the first part of Said’s sentence makes
    no sense, and if the first part is true then the second part makes no sense.
    Is Said relying on that weasel word “seems” to get him out of the
    mess? That ruse will not work either; for what would it mean to say that an
    external reality independent of the Orientalist’s judgement also seems to be
    a creation of the Orientalist? That would be a simple contradiction. Here is
    another example: “The Orientalist can imitate the Orient without the opposite
    being true.” Throughout his book, Said is at pains to point out that there
    is no such thing as “the Orient”, which, for him, is merely a meaningless
    abstraction concocted by Orientalists in the service of imperialists and racists.
    In which case, what on earth could “The Orient cannot imitate the Orientalist”
    possibly mean? If we replace “the Orient” by individual countries,
    say Egypt, India, or Iran, do we get anything more coherent? No, obviously
    not: “India, Egypt, and Iran cannot imitate the Orientalists like Renan,
    Bernard Lewis, Burton, et al.” We get nonsense whichever way we try to
    gloss Said’s sentence.

    Contradictions

    At times, Said seems to allow that the Orientalists did achieve genuine positive
    knowledge of the Orient, its history, culture, languages, as when he calls Lane’s
    work Manners and Customs of the Modern Egyptians “a classic of historical
    and anthropological observation because of its style, its enormously intelligent
    and brilliant details”; or when he talks of “a growing systematic
    knowledge in Europe about the Orient” – since Said does not put sarcastic
    quotation marks around the word knowledge, I presume he means there was a growth
    in genuine knowledge. Further on, Said talks of Orientalism producing “a
    fair amount of exact positive knowledge about the Orient”. Again I take
    it Said is not being ironical when he talks of “philological discoveries
    in comparative grammar made by Jones,…”. To give one final example, Said
    mentions Orientalism’s “objective discoveries”.

    Yet, these acknowledgements of the real discoveries made by Orientalists are
    contradicted by Said’s insistence that there is no such thing as “truth”;
    or when he characterizes Orientalism as “a form of paranoia, knowledge
    of another kind, say, from ordinary historical knowledge”. Or again, “it
    is finally Western ignorance which becomes more refined and complex, not some
    body of positive Western knowledge which increases in size and accuracy”.
    At one point Said seems to deny that the Orientalist had acquired any objective
    knowledge at all, and a little later he also writes, “the advances made
    by a ‘science’ like Orientalism in its academic form are less objectively true
    than we often like to think”. It is true that the last phrase does leave
    open the possibility that some of the science may be true though less
    than we had hitherto thought. Said also of course wholeheartedly endorses Abdel
    Malek’s strictures against Orientalism, and its putatively false “knowledge”
    of the Orient.

    In his 1994 Afterword, Said insists that he has “no interest in,
    much less capacity for, showing what the true Orient and Islam really are”.
    And yet he contradicts this outburst of humility and modesty when he claims
    that, “[The Orientalist’s] Orient is not the Orient as it is, but the Orient
    as it has been Orientalized”, for such a formulation assumes Said knows
    what the real Orient is. Such an assumption is also apparent in his statement
    that “the present crisis dramatizes the disparity between texts and reality”.
    In order to be able to tell the difference between the two, Said must know what
    the reality is. This is equally true when Said complains that “To look
    into Orientalism for a lively sense of an Oriental’s human or even social reality…is
    to look in vain”.

    Historical and Other Howlers

    For a work that purports to be a serious work of intellectual history, Orientalism
    is full of historical howlers. According to Said, at the end of the seventeenth
    century, Britain and France dominated the eastern Mediterranean, when in fact
    the Levant was still controlled for the next hundred years by the Ottomans.
    British and French merchants needed the permission of the Sultan to land. Egypt
    is repeatedly described as a British colony when, in fact, Egypt was never more
    than a protectorate; it was never annexed as Said claims. Real colonies, like
    Australia or Algeria, were settled by large numbers of Europeans, and this manifestly
    was not the case with Egypt.

    The most egregious error surely is where Said claims Muslim armies conquered
    Turkey before they overran North Africa. In reality, of course, the Arabs invaded
    North Africa in the seventh century, and what is now Turkey remained part of
    the Eastern Roman Empire and was a Christian country until conquered by the
    Seljuk Turks in the late eleventh century. Said also writes “Macdonald and
    Massignon were widely sought after as experts on Islamic matters by colonial
    administrators from North Africa to Pakistan”. But Pakistan was never a
    colony; it was created in 1947 when the British left India. Said also talks
    rather oddly about the “unchallenged Western dominance” of the Portuguese
    in the East Indies, China, and Japan until the nineteenth century. But Portugal
    only dominated the trade, especially in the 16th century, and was
    never, as the historian J. M. Roberts points out, “interested in the subjugation
    or settlement of large areas”. In China, Portugal only had the tiniest
    of footholds in Macao. The first decades of the seventeenth century witnessed
    the collapse of much of the Portuguese empire in the East, to be replaced by
    the Dutch. In the early eighteenth century there was a Dutch supremacy in the
    Indian Ocean and Indonesia. However, the Dutch like the Portuguese did not subjugate
    “the Orient” but worked through diplomacy with native rulers, and
    through a network of trading-stations. Said thinks that Carlyle and Newman were
    ‘liberal cultural heroes’! It would be more correct to place Carlyle’s works in
    the intellectual ancestry of fascism. Nor was Newman a liberal; he was rather a
    High Church Anglican who converted to Catholicism. Said also seems
    to think that Goldziher was German; Goldziher was of course a Hungarian. (One
    hopes that it is simply a typographical error in his 1994 Afterword which
    was responsible for the misspelling of Claude Cahen’s name.)

    Tendentious Reinterpretations

    The above errors can be put down to ignorance – Said is no historian – but they
    cast doubt on his competence to write such a book.

    Said also does not come across as a careful reader of Dante and his masterpiece,
    The Divine Comedy. In his trawl through Western literature for filth
    to besmirch Western civilization, Said comes across Dante’s description of Muhammad
    in Hell, and concludes “Dante’s verse at this point spares the reader none
    of the eschatological [sic!] detail that so vivid a punishment entails: Muhammad’s
    entrails and his excrement are described with unflinching accuracy”. First,
    Said does not seem to know the difference between scatological and eschatological,
    and second, we may ask how he knows that Dante’s description is unflinchingly
    accurate. He simply means, I presume, that it was highly graphic.

    Furthermore, these illustrious Muslims were included precisely because of Dante’s
    profound reverence for all that was best in the non-Christian world, and their
    exclusion from salvation, inevitable under Christian doctrine, saddened him
    and put a great strain on his mind – gran duol mi prese al cor quando lo
    ’ntesi – great grief seized me at heart when I heard this. Dante was even
    much influenced by the Averroistic concept of the “possible intellect”.
    The same generous impulse that made him revere non-Christians like Avicenna
    and their nobleness made Dante relegate Muhammad to eternal punishment in the
    eighth circle of Hell, namely Dante’s strong sense of the unity of humanity
    and of all its spiritual values – universalis civilitas humani generis –
    the universal community of the human race. He and his contemporaries
    in the late thirteenth and early fourteenth century had only the vaguest of
    ideas about the history and theology of Islam and its founder. Dante believed
    that Muhammad and Ali were the initiators of the great schism between Christianity
    and Islam. Dante, like his contemporaries, thought Muhammad was originally a
    Christian and a cardinal who wanted to become pope. Hence Muhammad was a divider
    of humanity whereas Dante stood for the unity – the essential organic unity
    – of humankind. What Said does not see is that Dante perfectly exemplifies Western
    culture’s strong tendency towards universalism.

    Self-Pity, Post-Imperialist Victimhood and Imperialism

    In order to achieve his goal of painting the West in general, and the discipline
    of Orientalism in particular, in as negative a way as possible, Said has recourse
    to several tactics. One of his preferred moves is to depict the Orient as a
    perpetual victim of Western imperialism, dominance, and aggression. The Orient
    is never seen as an actor, an agent with free-will, or designs or ideas of its
    own. It is to this propensity that we owe that immature and unattractive quality
    of much contemporary Middle Eastern culture, self-pity, and the belief that
    all its ills are the result of Western-Zionist conspiracies. Here is an example
    of Said’s own belief in the usual conspiracies, taken from “The Question
    of Palestine”: “It was perfectly apparent to Western supporters of Zionism
    like Balfour that the colonization of Palestine was made a goal for the Western
    powers from the very beginning of Zionist planning: Herzl used the idea, Weizmann
    used it, every leading Israeli since has used it. Israel was a device for holding
    Islam – later the Soviet Union, or communism – at bay”. So Israel was
    created to hold Islam at bay!

    As for the politics of victimhood, Said has “milked it himself to an indecent
    degree”. Said wrote: “My own experiences of these matters are in part
    what made me write this book. The life of an Arab Palestinian in the West, particularly
    in America, is disheartening. There exists here an almost unanimous consensus
    that politically he does not exist, and when it is allowed that he does, it
    is either as a nuisance or as an Oriental. The web of racism, cultural stereotypes,
    political imperialism, dehumanizing ideology holding in the Arab or the Muslim
    is very strong indeed, and it is this web which every Palestinian has come to
    feel as his uniquely punishing destiny”.

    Such wallowing in self-pity from a tenured and much-feted professor at Columbia
    University, where he enjoys privileges that we lesser mortals can only dream of,
    and a decent salary, all the while spewing forth criticism of the country that
    took him in and heaped honours on him, is nauseating. As Ian Buruma concluded
    in his review of Said’s memoir, Out of Place, “The more he dwells
    on his suffering and his exile status, the more his admirers admire him. On
    me, however, it has the opposite effect. Of all the attitudes that shape a memoir,
    self-pity is the least attractive”.

    Said’s Anti-Westernism

    In his 1994 Afterword, Said denies that he is anti-Western; he denies
    that the phenomenon of Orientalism is a synecdoche of the entire West; and he
    claims that there is no such stable reality as “the Orient” and
    “the Occident”, that there is no enduring Oriental reality and even
    less an enduring Western essence, and that he has no interest in, much less
    capacity for, showing what the true Orient and Islam really are.

    His denials notwithstanding, an actual reading of Orientalism is enough
    to show Said’s anti-Westernism. While he does occasionally use inverted commas
    around “the Orient” and “the Occident”, the entire force
    of Said’s polemic comes from the polar opposites and contrasts of the East and
    the West, the Orient and Europe, Us and the Other, that he himself has rather
    crudely set up.

    Said wrote, “I doubt that it is controversial, for example, to say that
    an Englishman in India or Egypt in the later nineteenth century took an interest
    in those countries that was never far from their status in his mind as British
    colonies. To say this may seem quite different from saying that all academic
    knowledge about India and Egypt is somehow tinged and impressed with, violated
    by, the gross political fact [of imperialism] – and yet that is what I am
    saying in this study of Orientalism”. [Emphasis in original]

    Here is Said’s characterisation of all Europeans: “It is therefore correct
    that every European, in what he could say about the Orient, was consequently
    a racist, an imperialist, and almost totally ethnocentric”. In other words
    not only is every European a racist, but he must necessarily be so.

    A part of Said’s tactics is to leave out Western writers and scholars who do
    not conform to Said’s theoretical framework. Since, arguably, for Said, all
    Europeans are a priori racist, he obviously cannot allow himself to quote
    writers who are not. Indeed one could write a parallel work to Orientalism
    made up of extracts from Western writers, scholars, and travellers who were
    attracted by various aspects of non-European cultures, which they praised and
    contrasted favourably with their own decadence, bigotry, intolerance, and bellicosity.

    Said makes much of Aeschylus’ The Persians, and its putative permanent
    creation of the “Other” in Western civilization. But Aeschylus can
    be forgiven his moment of triumphalism when he describes the Battle of Salamis
    (480 B.C.), a battle in which he very probably took part, and on which the very
    existence of fifth-century Athens depended. The Greeks destroyed
    or captured 200 ships for the loss of forty, which for Aeschylus was symbolic
    of the triumph of liberty over tyranny, Athenian democracy over Persian Imperialism,
    for it must not be forgotten that the Persians were ruthless imperialists whose
    rule did not endear them to several generations of Greeks.

    Furthermore, had he delved a little deeper into Greek civilization and history,
    and looked at Herodotus’ great history, Said would have encountered two features
    which were also deep characteristics of Western civilization and which Said
    is at pains to conceal and refuses to allow: the seeking after knowledge for
    its own sake, and its profound belief in the unity of mankind, in other words
    its universalism. The Greek word, historia, from which we get our “history”,
    means “research” or “inquiry”, and Herodotus believed his
    work was the outcome of research: what he had seen, heard, and read but supplemented
    and verified by inquiry. For Herodotus, “historical facts have intrinsic
    value and rational meaning”. He was totally devoid of racial prejudice
    – indeed Plutarch later branded him a philobarbaros, whose nearest modern
    equivalent would be “nigger-lover” – and his work showed considerable
    sympathy for Persians and Persian civilization. Herodotus represents Persians
    as honest – “they consider telling lies more disgraceful than anything
    else” – brave, dignified, and loyal to their king. As to the religions
    of the various peoples he studied, Herodotus showed his customary intellectual
    curiosity but also his reverence for all of them, because “all men know
    equally about divine things”.

    It was left to Montaigne, under the influence of Peter Martyr, to develop the
    first full-length portrait of the noble savage in his celebrated essay “On
    Cannibals” (c. 1580), which is also the source of the idea of cultural
    relativism. Deriving his rather shaky information from a plain, simple fellow,
    Montaigne describes some of the more gruesome customs of the Brazilian Indians
    and concludes:

    I am not so anxious that we should note the horrible savagery of these
    acts as concerned that, whilst judging their faults so correctly, we should
    be so blind to our own. I consider it more barbarous to eat a man alive than
    to eat him dead; to tear by rack and torture a body still full of feeling, to
    roast it by degrees, and then give it to be trampled and eaten by dogs and swine
    – a practice which we have not only read about but seen within recent memory,
    not between ancient enemies, but between neighbours and fellow-citizens and,
    what is worse, under the cloak of piety and religion – than to roast and eat
    a man after he is dead.

    Elsewhere in the essay, Montaigne emphasises their inevitable simplicity, state
    of purity and freedom from corruption. Even their “fighting is entirely
    noble”. Like Peter Martyr’s, Montaigne’s rather dubious, second-hand knowledge
    of these noble savages does not prevent him from criticising and morally condemning
    his own culture and civilisation: “[We] surpass them in every kind of barbarity”.

    The attitude of Voltaire can be seen as typical of the entire 18th century.
    Voltaire seems to have regretted what he had written of Muhammad in his scurrilous,
    and to a Muslim blasphemous, play Mahomet (1742), where the Prophet is presented as
    an impostor who enslaved men’s souls: “Assuredly, I have made him out to
    be more evil than he was”.

    But Voltaire, in his Essai sur les Moeurs (1756) and various entries
    in the Philosophical Dictionary, shows himself to be prejudiced in Islam’s favour
    at the expense of Christianity in general, and Catholicism in particular.

    Gibbon, like Voltaire, painted Islam in as favourable a light as possible in
    order to better contrast it with Christianity. He emphasised Muhammad’s humanity
    as a means of indirectly criticising the Christian doctrine of the divinity of Christ. His
    anti-clericalism led Gibbon to underline Islam’s supposed freedom from that
    accursed class, the priesthood. Gibbon’s deistic view of Islam as a rational,
    priest-free religion, with Muhammad as a wise and tolerant lawgiver, enormously
    influenced the way all Europeans perceived a sister religion for years to come.

    The important thing to emphasize here is the biased nature of Said’s apparently
    learned and definitive selection; I could just as easily go through Western
    literature and illustrate the opposite point to the one he is making. Furthermore,
    my selection is not of some peripheral figures culled from the margins of Western
    culture, but of the very makers of that culture, figures like Montaigne, Bayle,
    Voltaire, Gibbon, Lessing, and some I have not quoted, like Montesquieu (The
    Persian Letters, 1721) and Diderot (Supplément au Voyage de Bougainville,
    1772), the latter two exemplifying the European Enlightenment’s appeal to reason,
    objective truth and universalist values.

    Misunderstanding of Western Civilization

    The golden thread running through Western civilization is rationalism. As Aristotle
    said, Man by nature strives to know. This striving for knowledge results in
    science, which is but the application of reason. Intellectual inquisitiveness
    is one of the hallmarks of Western civilisation.

    Vulgar Marxists, Freudians, and Anti-Imperialists, who crudely reduce all human
    activities to money, sex, and power respectively, have difficulties in understanding
    the very notion of disinterested intellectual inquiry, knowledge for knowledge’s
    sake.

    One should remind Said that it was this desire for knowledge on the part of
    Europeans that led to the people of the Near East recovering and discovering
    their own past and their own identity. In the nineteenth and early twentieth
    century archaeological excavations in Mesopotamia, Ancient Syria, Ancient Palestine
    and Iran were carried out entirely by Europeans and later Americans – the disciplines
    of Egyptology, Assyriology, Iranology which restored to mankind a large part
    of its heritage were the exclusive creations of inquisitive Europeans and Americans.
    Islam, by contrast, for doctrinal reasons deliberately refused to look at its
    pre-Islamic past, which was considered a period of ignorance.

    It is also worth pointing out that the motives, desires, and prejudices
    of a scholar often have no bearing upon the scientific worth of his contribution.
    Again, vulgar Marxists, for example, dismiss an opponent’s arguments not on
    any scientific or rational grounds but merely because of the social origins
    of the scholar concerned.

    Said, Sex, and Psychoanalysis

    If Said can be said to have a bête noire, it must surely be Bernard
    Lewis. Said has a sentence where he accuses Lewis of persisting “in such
    ‘philological’ tricks as deriving an aspect of the predilection in contemporary
    Arab Islam for revolutionary violence from Bedouin descriptions of a camel rising”.
    Said, twenty-five years on, still has not forgotten his battle with Lewis on
    the issue of a camel rising, to which I will now turn. In Orientalism,
    Said quotes from Lewis’ essay “Islamic Concepts of Revolution”:

    In the Arabic-speaking countries a different word was used for [revolution]
    thawra. The root th-w-r in Classical Arabic meant to rise up (e.g.
    of a camel), to be stirred or excited, and hence, especially in Maghribi usage,
    to rebel. It is often used in the context of establishing a petty, independent
    sovereignty; thus, for example, the so-called party kings who ruled in eleventh
    century Spain after the break-up of the Caliphate of Cordova are called thuwwar
    (sing. tha’ir). The noun thawra at first means excitement, as
    in the phrase, cited in the Sihah, a standard medieval Arabic dictionary, intazir
    hatta taskun hadhihi ’lthawra, wait till this excitement dies down – very
    apt recommendation. The verb is used by al-Iji, in the form of thawaran
    or itharat fitna, stirring up sedition, as one of the dangers which should
    discourage a man from practising the duty of resistance to bad government. Thawra
    is the term used by Arabic writers in the nineteenth century for the French
    Revolution, and by their successors for the approved revolutions, domestic and
    foreign, of our own time.

    Among Said’s conclusions is:

    Lewis’s association of thawra with a camel rising and generally
    with excitement (and not with a struggle on behalf of values) hints much more
    broadly than is usual for him that the Arab is scarcely more than a neurotic
    sexual being. Each of the words or phrases he uses to describe revolution is
    tinged with sexuality: stirred, excited, rising up. But for the most
    part it is a ‘bad’ sexuality he ascribes to the Arab.

    Can any rational person have drawn a conclusion even remotely resembling Edward
    Said’s from Lewis’s scholarly discussion of Classical Arabic etymology?

    Orientalists’ Complicity in Imperialism

    One of Said’s major theses is that Orientalism was not a disinterested activity
    but a political one, with Orientalists preparing the ground for and colluding
    with imperialists: “To say simply that Orientalism was a rationalization
    of colonial rule is to ignore the extent to which colonial rule was justified
    in advance by Orientalism, rather than after the fact”. The Orientalist
    provides the knowledge that keeps the Oriental under control: “Once again,
    knowledge of subject races or Orientals is what makes their management easy
    and profitable; knowledge gives power, more power requires more knowledge, and
    so on in an increasingly profitable dialectic of information and control”.

    This is combined with Said’s thesis derived from the Coptic socialist thinker,
    Anwar Abdel Malek, that the Orient is always seen by the Orientalists as unchanging,
    uniform and peculiar, and Orientals have been reduced to racist stereotypes,
    and are seen as ahistorical ‘objects’ of study “stamped with an otherness…of
    an essentialist character….”. The Orientalists have provided a false
    picture of Islam: “Islam has been fundamentally misrepresented in the West”.
    Said adds Foucault to the heady mix; the French guru convinced Said that Orientalist
    scholarship took place within the ideological framework he called ‘discourse’
    and that “the real issue is whether indeed there can be a true representation
    of anything, or whether any and all representations, because they are
    representations, are embedded first in the language and then in the culture,
    institutions, and political ambience of the representer. If the latter alternative
    is the correct one (as I believe it is), then we must be prepared to accept
    the fact that a representation is eo ipso implicated, intertwined, embedded,
    interwoven with a great many other things besides the ‘truth,’ which is itself
    a representation”.

    It takes little thought to see that there is a contradiction in Said’s major
    thesis. If Orientalists have produced a false picture of the Orient, Orientals,
    Islam, Arabs, and Arabic society – and, in any case, for Said, there is no such
    thing as “the truth” – then how could this false or pseudo-knowledge
    have helped European imperialists to dominate three-quarters of the globe? ‘Information
    and control’, wrote Said, but what of ‘false information and control’?

    Orientalists Fight Back

    For a number of years now, Islamologists have been aware of the disastrous
    effect of Said’s Orientalism on their discipline. Professor Berg has
    complained that the latter’s influence has resulted in “a fear of asking
    and answering potentially embarrassing questions – ones which might upset Muslim
    sensibilities….”.

    For Clive Dewey, Said’s book “was, technically, so bad; in every respect,
    in its use of sources, in its deductions, it lacked rigour and balance. The
    outcome was a caricature of Western knowledge of the Orient, driven by an overtly
    political agenda. Yet it clearly touched a deep vein of vulgar prejudice running
    through American academe”.

    The most famous modern scholar who not only replied to but who wiped the floor
    with Said was, of course, Bernard Lewis. Lewis points to many serious errors
    of history, interpretation, analysis and omission. Lewis has never been answered
    let alone refuted.

    Negative Arab and Asian Reaction to Said’s Orientalism

    It must have been particularly galling for Said to see the hostile reviews
    of his Orientalism from Arab, Iranian or Asian intellectuals, some of
    whom he admired and singled out for praise in many of his works. For example,
    Nikki Keddie, praised in Covering Islam, talked of the disastrous influence
    of Orientalism, even though she herself admired parts of it:

    I think that there has been a tendency in the Middle East field to adopt
    the word ‘orientalism’ as a generalized swear-word essentially referring to
    people who take the ‘wrong’ position on the Arab-Israeli dispute or to people
    who are judged too ‘conservative’. It has nothing to do with whether they are
    good or not good in their disciplines. So “orientalism” for many people
    is a word that substitutes for thought and enables people to dismiss certain
    scholars and their works. I think that is too bad. It may not have been what
    Edward Said meant at all, but the term has become a kind of slogan.

    Kanan Makiya, the eminent Iraqi scholar, chronicled Said’s disastrous influence
    particularly in the Arab world:

    Orientalism as an intellectual project influenced a whole generation
    of young Arab scholars, and it shaped the discipline of modern Middle East studies
    in the 1980s. The original book was never intended as a critique of contemporary
    Arab politics, yet it fed into a deeply rooted populist politics of resentment
    against the West. The distortions it analyzed came from the eighteenth and nineteenth
    centuries, but these were marshalled by young Arab and “pro-Arab” scholars
    into an intellectual-political agenda that was out of kilter with the real needs
    of Arabs who were living in a world characterized by rapidly escalating cruelty,
    not ever-increasing imperial domination.

    Though he finds much to admire in Said’s Orientalism, the Syrian philosopher
    Sadiq al-‘Azm finds that “the stylist and polemicist in Edward Said very
    often runs away with the systematic thinker”. Al-‘Azm also finds Said guilty
    of the very essentialism that Said ostensibly sets out to criticise, perpetuating
    the distinction between East and West.

    Nadim al-Bitar, a Lebanese Muslim, finds Said’s generalizations about all Orientalists
    hard to accept, and is very skeptical about Said having read more than a handful
    of Orientalist works. Al-Bitar also accuses Said of essentialism, “[Said]
    does to [Western] Orientalism what he accuses the latter of doing to the Orient.
    He dichotomizes it and essentializes it. East is East and West is West and each
    has its own intrinsic and permanent nature….”

    The most pernicious legacy of Said’s Orientalism is its support for
    religious fundamentalism, and its insistence that “all the ills [of
    the Arab world] emanate from Orientalism and have nothing to do with the socio-economic,
    political and ideological makeup of the Arab lands or with the cultural historical
    backwardness which stands behind it”.

  • The Rise of the Info-Novel

    What was it you wanted from that big new novel? If you’re looking for an education
    about Victorian brothels, Dante studies during the 19th century,
    iconography and iconology in art history, the structure and function of railroads,
    the Allied retreat to Dunkirk, British scientific expeditions in the Himalaya,
    Bobby Thomson’s Brooklyn-crushing dinger or any number of other subtopics in
    history, philosophy, business or law, then you’ll likely find it satisfying
    enough. But if you’re looking for the promise of invention, for a world created
    and set in motion, for characters who grapple with ethical and moral dilemmas
    that radically transform their perspective – the elements that make a great
    and true novel – you’ll be disappointed.


    I’m not arguing that the novel is dead. Partly because it’s not – sales of
    both new literary fiction and classics are up over the last few years, if anemically
    – and partly because it’s such a tedious argument to make. (And inevitably,
    someone else will make it.) Rather, it’s become a bloated, creaking mess. The
    contemporary novel, as Jonathan Safran Foer recently remarked, is ‘stuffed with
    crap.’ It’s filled to the gills. What happened?


    Novelists have traded a critical literary mission for one dictated by literary
    critics. Indeed, it’s not even true literary critics who have mislaid the course
    of the contemporary novel. Rather, cultural theorists posing as English professors
    but primarily engaged in interdisciplinary studies seek to understand literature
    in terms of another field that is, invariably, their real interest. The list
    of theorists and their disciplines is long and marvelously varied. Anthropology
    and sociology animated Claude Lévi-Strauss and inform the work of the Structuralists;
    linguistics preoccupied de Saussure, Chomsky and Barthes; communism and the
    relationship of labor to capital drove Terry Eagleton and the dialectical materialists
    years after the end of history; psychoanalysis was the true profession of Lacan
    and Kristeva; queer theory and socially constructed ideas about sexuality were
    Foucault’s overarching interest, just as cultural hegemony and colonialism remain
    Edward Said’s focus (we are to understand Mansfield Park in terms
    of the exploitation of far-flung colonies, which made the English estate such
    a magnificent place for a fling); gender studies is the passion of Judith Butler;
    race the central concern of an array of critics such as Appiah and Gates; and
    of course semiotics and its derivatives the play of Derrida and his merry if
    baffling band of pranksters. These critics have widely divergent aims, yet they
    share a single prejudice: each has relegated novelists to aimless scriveners
    and novels to texts no more authoritative than advertising copy or the back
    of a cereal box. The kind of close reading favored by the New Critics, which
    presupposed a view of literature as worthy of analysis on its own terms, hasn’t
    been seriously undertaken in a half century.


    It’s not surprising that critics celebrate novels which reflect their own prejudices
    and presuppositions. Fiction built on nonlinear narratives, informed by indeterminacy,
    skepticism, and of course moral relativism, and stuffed with reams of data about
    topics in finance, philosophy, technology and history, usually with swatches
    of intertextual material (letters, legal briefs, patents, pop songs, schema
    and diagrams) are fashionably media-friendly. There is also a discernible political
    element to the trend. Theorists tend to reside on the Left, and they invariably
    conflate conservative aesthetics with right-wing politics in their essays and reviews.
    The kind of well-made novel that John Irving produces is implicitly denigrated
    as conservative regardless of its characters’ impressive capacity for extramarital
    sex, drug use and general debauchery.


    Critical theorists have been the real stars of the academy for nearly three
    decades, their supremacy challenged only for a few years in the late 1990s,
    when intellectual property mavens, electrical engineers and an assortment of
    related technologists briefly eclipsed them. The popularity of the technologists
    has proved only modestly more durable than the stock market bubble they inspired,
    while the influence of theorists prevails. It does not stop at the university
    gates; it suffuses the popular, book-reviewing press. Novelists are hip to critical
    theory as well, particularly now that so many of them spend less time in cafes
    than classrooms, the royalties from their art insufficient to cover the cost
    of a latte.


    The Info-novel is the inexorable result. Novelists have proved impressionable,
    quick studies, recalibrating their aesthetic objectives to reflect those of
    the critical theorists they emulate. In an effort to lend weight to narrative
    and justify fiction at a time when literary criticism has displaced literature
    and technology has displaced religion as a source of meaning (to the extent
    it can be ascertained), fiction writers have unmoored blocks of sociological,
    industrial, pop-culture, linguistic and political information and dropped them,
    largely undigested, into their work. Just as theorists have brought interdisciplinary
    studies to bear on fiction, novelists have taken whole subject areas and downloaded
    them into their novels. It is as if, confronted by the infinite amount of information
    available online, they have decided to cram in as much of it as possible. Perhaps
    writers believe information is the representative condition of our time, and
    the novel must reflect it.


    If so, you’d expect characters who tangle with an overwhelming mass of information
    and complexity, who are either defeated or alienated by it – a L’Étranger for
    the information age – but the Info-novel is something quite different. It exists
    to convey information rather than comment upon it. It is more interested in
    its structural and technical components than in utilizing them in the service
    of an imaginative world. And it represents a radical diminution, a failure of
    nerve, a crisis of faith in the project of fiction.


    Mailer once despaired of timid talents, exhorting younger authors to write
    as if they were running for president. (This was the political Mailer of the
    late 60s, and he meant it as a great compliment.) The old pugilist hasn’t mellowed
    – in a recent talk with Charlie Rose, Mailer demanded the same of Franzen –
    proving he’s more perceptive than the critics, who have mistaken the Info-novel
    for literary ambition. Writing in The New York Times Book Review, Judith
    Shulevitz gushed, ‘Novelists, in short, have become our public intellectuals
    – our polymaths, our geographers, our scholars of the material world.’ Shulevitz’
    sole reservation is that the characters in works by DeLillo, Franzen and Eugenides
    aren’t themselves intellectuals, or even particularly perceptive. For a really
    smart and self-aware fictional character, she writes, you have to wade through
    a Richard Powers novel.


    Powers’ characters are a well-educated and hyperarticulate bunch, but they’re
    not, as Harold Bloom wrote of Shakespeare’s leads, capable of interpreting their
    own thoughts and actions, changing and growing as a result. Shakespeare may
    set too high a bar, but even with more modest expectations, encyclopedic novels
    fail.


    It is a truism of creative writing programs that in serious fiction character
    drives plot, not the other way around. In spy novels, murder mysteries and prefabbed
    legal thrillers, characters serve plot, which tends to chew them up and spit
    them out. In literary fiction, characters fill and organize the story around
    them. Fully realized characters demand to be understood on their own terms.
    The big new tomes violate this maxim, which is no less true for being tired.
    Franzen’s The Corrections has much to say about campus politics at second-tier
    schools, the aftermath of the 1990s technology bubble, corrupt business practices
    in post-Soviet economies, even restaurant operations, but its characters can
    barely find their way through the maze and are hemmed in by towers of information.
    They are slathered with it, and drowning. This is not to conflate Franzen, who
    won last year’s Pulitzer Prize, or Andrea Barrett, whose collection Servants
    of the Map was nominated for this year’s honor, with John Grisham. A Grisham
    thriller has very different ambitions, and it usually succeeds on its own
    terms. The same cannot be said for Barrett, whose characters are 19th-century
    botanists, biologists, physicians and explorers, and who exist primarily to
    convey the historical and scientific information that animates them – unless
    Barrett’s real interest is the history of science, particularly Victorian-era
    science, and not fiction. But then why is she wasting our time?


    American critics have trumpeted Franzen, DeLillo and Powers, but the Info-novel
    is hardly a U.S. phenomenon. Michel Houellebecq’s wildly celebrated Atomized
    contains tracts of political and moral philosophy, chemical and nuclear science,
    even modern French social history more befitting a monograph. Julian Barnes,
    who might have been expected to know better, called Atomized ‘a novel
    which hunts big game while others settle for shooting rabbits.’ The sort of
    hunting Barnes got so excited about typically runs something like:

    Individuality, and the sense of freedom that flows from it, is the natural
    basis of democracy. In a democratic regime, relations between individuals
    are commonly regulated by a social contract. A pact which exceeds the natural
    rights of the co-contractors, or which does not correspond to a clear retraction
    clause, is considered de facto null and void.

    And big game included tracts that might otherwise have passed for hack journalism:

    A number of other important events in 1974 further advanced the cause of
    moral relativism. The first Vitatop club opened in Paris on 20 March; it was
    to play a pioneering role in the cult of the body beautiful. The age of majority
    was lowered to 18 on 5 July, and divorce by mutual consent was officially
    recognized on the eleventh, thus removing adultery from the penal code. Lastly,
    on 28 November, after a stormy debate described by commentators as ‘historic’,
    the Veil Act legalizing abortion was adopted, largely thanks to lobbying by
    the left.

    The agnosticism at the heart of the French republic would facilitate the
    progressive, hypocritical and slightly sinister triumph of the determinist
    worldview. In temperate climates, the body of a bird or mammal first attracts
    specific species of flies (Musca, Curtoneura), but once decomposition has
    begun to set in, these are joined by others, particularly Calliphora and Lucilia.
    Under the combined action of bacteria and the digestive juices disgorged by
    the larvae, the corpse begins to liquefy and becomes a ferment of butyric
    and ammoniac reactions.

    Barnes wasn’t alone. Magazines and newspapers described Atomized as
    a novel of ideas. It isn’t. A novel of ideas is something else, in which characters
    grapple with ethical and moral challenges. The Magic Mountain, for example,
    is among other things a richly animated version of Nietzschean tragedy. A reader
    finishing Mann’s masterpiece has a serviceable understanding of Nietzsche’s
    opposing forces: the Apollonian, which represented order, reason, clarity and
    harmony; and the Dionysian, denoting wild creativity, free-spirited and usually
    drunken play. Many philosophers incorrectly interpret Nietzsche’s conception
    of tragedy to elevate the Dionysian over the sterile Apollonian, but Nietzsche
    was a subtler thinker, and Mann was subtler still. Just as Nietzsche demanded
    an ethics beyond good and evil, Mann created characters who balance, at least
    for a time, Apollonian and Dionysian forces in their personality and the world
    they inhabit.


    Other great writers have approached the novel of ideas similarly. Tolstoy understood
    the difference between fiction and historiography, and probably for that reason
    his discourses on military history in War and Peace are segregated in
    separate chapters. It was as if he didn’t want to spoil the novel itself. Even
    without these discursions, War and Peace is a novel of ideas, for in
    one respect it is about how people function in wartime, continue to fall in
    and out of love in a rapidly changing society. Maybe there’s something about
    the Russians. The Brothers Karamazov may be the apotheosis of the novel of
    ideas. Dostoevsky’s fiction examines good and evil, faith and despair, realism
    and mysticism – all without telling the reader anything, without requiring a
    lesson.


    The novel of ideas need not run on for 800 pages, nor is it the province of
    long-dead white guys. Alice Munro has been turning out brilliant stories for
    the past two decades. Sebald evoked memory and time as well as Proust, and a
    lot more economically. And while Updike and Mailer have slowed and turned out
    their weakest work in decades, and Bellow is powering down the laptop, the last
    three or four Philip Roth novels have been daring and brilliant.


    The most celebrated younger writers have ignored these examples to pursue Tom
    Wolfe’s prescription for reflecting modern society whole, attuned to its cartooned,
    pop-culture components. Their novels are usually delivered in a steroid-juiced
    voice that the critic and novelist James Wood has called ‘hysterical realism’:
    clever compilations of consumer references splattered throughout essays on politics,
    business and pop culture. This has happened before. The naturalistic novel,
    particularly in its 19th century French instantiation (Dickens never
    lost sight of his story), was preoccupied with the mechanics of recently industrialized
    society, and its vast plots impressed characters into an overpopulated chorus.
    Naturalism had a long hangover, even in English, as anyone who’s tried to wade
    through Dreiser must know. And yet from naturalism sprang the multivalent brilliance
    of modernism. There is hope.


    The novel has no Platonic form, and there is certainly no requirement that
    writers adhere to a formula or set of rules. The novel is not a haiku or sonnet,
    nor even a movie, with its well-observed limits of length and perspective. Fiction
    must have room to grow, reinvent and reassert itself. As readers, we must stand
    aside and grant it some latitude. Yet we can be both open-minded and demanding.
    As Dr. Johnson said, the essence of poetry is invention. Art strives to answer
    the question, What is life? The Info-novel, bereft of poetry and barren of invention,
    succeeds only as a clever construction, an amalgamation of data, posing again
    and again the same stupid question.


    Peter Lurie is general counsel of Virgin Mobile USA, a wireless voice and
    internet service. The views expressed are his own.

  • The Optimist’s Slaughter

    Early on every thinking man makes the conscious or unconscious decision whether
    to view the cup of life as half full, or dry as the Garagum Desert. Those whose
    cup is half full are the world’s optimists, the Pollyannas and the kind of people
    to be avoided at all costs, particularly at parties. In America they are, according
    to Gallup, the majority (64 percent). These are the same folks who wave flags,
    join the PTA, bet on the Cubs, and get caught in thunderstorms without an umbrella
    and hopefully catch pneumonia. Pessimists, by my estimate, make up about 10
    percent of the American population. The other 26 percent couldn’t care less,
    and were probably too busy watching professional wrestling to bother answering
    the survey. Curiously, Kenyans are the world’s most optimistic people, though
    god knows why. And their neighbors in Zimbabwe are the most pessimistic, which
    raises the question: what do the Zimbabs know that the Kenyans don’t?


    My suspicion is that the reason for the generally low opinion held of the pessimist
    is related to his close ties to the critic, the cynic, the misanthrope, the
    whiner and the curmudgeon. Personally, the only one of these people I find objectionable
    is the whiner. The critic plays a vital role in society by helping us to distinguish
    the gold from the dross, the wheat from the chaff. The cynic – who by definition
    distrusts people’s motives – is the type I want checking other people’s baggage
    at the airport, though not necessarily my own. The misanthrope is a harmless
    cuss, keeping mainly to herself and bothering no one, asking only that you not
    bother her. And the loveable curmudgeon is responsible for most of literature’s
    best quotations, maxims and aphorisms. But the whiner is another animal altogether.
    The whiner has no saving grace whatsoever, and is as annoying as fingers on
    a dry chalkboard.


    I find the pessimist to be, if not exactly pleasant, then at least sincere.
    You always know what a pessimist is thinking, and if you don’t he will likely
    tell you anyway. The optimist, on the other hand, has always struck me as an
    imposter, walking around with her nose toward the heavens as if she inhabited
    a better world than you and me, a land of warm green meadows where the sun always
    shines and there is never a drought because of it. Not only that but the optimist
    is forever hypocritically criticizing your negativity as though it were some
    kind of birth defect. The pessimist does not go around saying, "Quit being
    so dadgum happy!" Or "Stop that smiling will you!" But the optimist
    has no problem trying to change your natural disposition. "Smile!"
    she snaps. "Quit being so pessimistic!" It may also be that I associate
    the optimist with the affected cheerfulness of the politician angling for votes,
    or the faux-friendly, chirpy male or female who goes into public relations,
    used car sales and telemarketing. And anyone who has spent any amount of time
    with one of these phonies can testify that once off the clock they morph into
    the most cynical, black-hearted bastards in the land.


    But even the pessimist can become weary of too much pessimism, and, as in all
    things, moderation is key. It would be folly to think that the pessimist is
    without hope, without expectations. He simply tempers his hopefulness with common
    sense, reason and those lessons gained from hard experience. Every man, from
    time to time, has a few good words for his fellows, even the pessimist, but
    the optimist goes overboard. She gazes at the world through grossly distorted
    glasses, refusing to focus on reality.


    Since the optimist has failed miserably in transforming the pessimist through
    various methods of harassment and blacklisting, she has come to rely more and
    more heavily on pseudo-science and quackery to make her case. One such study,
    undertaken recently by the Mayo Clinic in Rochester, Minnesota, found that optimists
    live longer and are healthier than pessimists. By reviewing medical records
    of 839 people living around Rochester, Minnesota, researchers were able to relate
    a patient’s health and longevity to his or her outlook on life, and found that
    pessimists are less likely to reach their life expectancy. This presupposes
    that pessimists necessarily want to live longer, which is highly debatable,
    particularly if you have to spend your life around Rochester, Minnesota.


    The godfather of the positive-thinking mafia was the Rev. Dr. Norman Vincent
    Peale, whose famous book of sermons The Power of Positive Thinking launched
    the multi-billion dollar self-help book industry and gave wings to the motivational
    speaker racket. Dr. Peale’s book can be summarized simply and succinctly, based
    on this one commandment: Pray more and put your faith in God, and happiness
    and confidence shall be yours. This kind of simplistic direction went over big
    with the Arkansas fishwives, who, I am given to understand, since putting Dr.
    Peale’s wizardries to work have never been happier. And I couldn’t be happier
    for them.


    The psychologist Julie K. Norem has done all us critics a great service with
    her book, The Positive Power of Negative Thinking. Unlike the Rev. Dr. Peale,
    Dr. Norem has her lovely toes planted firmly in the black soil of this world
    and helpfully suggests that one sure-fire way to avoid embarrassment, disaster
    and heart-break is by "imagining all of the worst-case scenarios."
    This is Dr. Norem’s prescription so as not to be taken off guard by sudden and
    unpleasant surprises. Conversely, if Dr. Peale were delivering a talk, and suddenly
    found his notes whisked away by a cyclone wind and his bloomers set afire, his
    positive thinking would scarcely see him through the remainder of his sermon,
    which is no doubt a good thing for those of us in the audience.


    If pessimism has a spiritual godfather it is perhaps the German philosopher
    Arthur Schopenhauer. It has undoubtedly not escaped your notice that the godfather
    of optimism was a quack doctor and backwoods preacher, while the founder of
    the school of doomsaying was a legitimate philosopher. Likewise, I’m going to
    assume you know what Herr Schopenhauer stood for, since it is impolite to talk
    down to your readers, and since I myself can’t make heads or tails out of half
    of what he’s saying. But since he was a pessimist we can safely assume that
    he thought things were pretty rotten in Denmark, or Prussia, or wherever he
    happened to be staying at the time.


    As a philosophy, I cannot say that I find pessimism very useful in its practical
    applications. The fact is I don’t find any philosophies very useful, especially
    on a job resume. The pessimist philosopher holds the doctrine or belief that
    this is the worst of all possible worlds and that all things ultimately tend
    toward evil. I happen to know that the worst of possible worlds is Mercury where
    it is always hotter than a July firecracker! As for all things ultimately tending
    toward evil, well that may be stretching things a bit. I’d say all things ultimately
    tend to suck. Case in point: Have you seen The Simpsons lately?


    Going back even farther, we find the Cynics, a school of philosophers who haunted
    the back roads and academies of ancient Greece in the 4th century BC. The Cynics preached
    that independence and self-control are essential to virtue. Despising the budding
    Greek civilization, as well as money, pleasure, and personal comfort, they advocated
    the simple life, which in the 4th century BC probably wasn’t all that difficult. That is,
    how much money and how many luxuries did a philosophy major in Ancient Greece
    really have to give up? "Instead of driving my Mercedes to my non-job I’ll
    take the train. Wait, there are no trains. So I’ll walk. Wait, everybody else
    is walking too. Maybe I’ll walk backwards. Or crawl…"


    Oscar Wilde believed the basis for optimism was sheer terror – that optimists
    are simply unable to deal with the likely outcomes and common tragedies of life.
    In other words, the optimist lives in a deluded, unreal mind-state whose sunny,
    evergreen landscapes in no way resemble the real world. The optimist, it follows,
    may be said to have mental health issues, which explains Havelock Ellis’ observation
    that "the place where optimism flourishes is the lunatic asylum."
    Pessimism, then, tempered with a fine sense of humor and the ability to laugh
    at life’s numerous and constant absurdities, may be the healthiest response
    of all. And I say that with all of the unbounded confidence and positivity of
    a true sourpuss.


    Christopher Orlet can be emailed here.

  • Poetry and the Politics of Self-Expression

    You say, as I have often given tongue
    In praise of what another’s said or sung.
    ‘Twere politic to do the like by these;
    But was there ever a dog that praised his fleas?

    William Butler Yeats

    Some years ago, a mentor of mine put forth the argument: “Would you try to build a cabinet when you did not possess even the rudimentary woodworking skills or knowledge of the tools necessary to build the cabinet? Of course not; then why do so many people think they can write poetry without an iota of preparation?”

    Still, many do. “Pop vocalists pose as opera singers. Important art museums exhibit installations that the cleaning staff mistakes for trash. Obscenity-riddled recitations, imposed over rhythm tracks, are reckoned to be music.” (Sarah Bryan Miller, St. Louis Post-Dispatch) So why should poetry be held to any standard – other than the “validation” of its author and his inalienable right to self-expression?

    Numerous surveys, declining SAT scores, and classroom anecdotes have established that many (and their numbers are growing) young Americans can barely read, cannot spell or do arithmetic, and know next to nothing of their own history; but they do not let mere ignorance get in the way of self-expression. And this popular wave of “self-expression,” more often than not, takes the form of poetry. This is not to say that we’ve become a nation of Whitmans, Dickinsons, and Frosts. Far from it, the result of this self-expression is far more likely to fall into what a friend of mine refers to as “solipsistic prose arranged in random line breaks.”

    Many, if not most, teenagers write poetry. Most of it is bad. Fortunately, this poetry, like many communicable childhood diseases such as mumps, chickenpox and measles, afflicts its authors for a short time and then they are forever immune to the pathogen. Of course, these “poems” are sincere, but as Oscar Wilde advised us, “all bad poetry is sincere.” These “journals” – notebooks filled with angst, self-loathing, raging egotism and cryptic marginalia – are then shut forever, packed away in mom and dad’s attic, forgotten, and if there is any justice in the universe, eventually incinerated.

    Occasionally, a well-meaning English teacher in a misguided attempt to promote “self-esteem” (of which self-expression is an ancillary component) encourages the young poets to explore their feelings and present these exercises in therapeutic catharsis in some sort of school-sponsored publication. There seems to be little harm done on the surface, but in the ego-centered world of the young and semi-literate, this practice only creates the illusion that there might be a talent denied – cruelly suppressed by the capricious, unfeeling (and profoundly unjust) critical standards of the literary establishment.

    In a recent article entitled “Hip Hop vs. Hip Not: The struggle for Poetic Validity in the Halls of Academia,” a locally acclaimed hip hop artist, writing under the style “Abiyah,” argues for the acceptance of hip hop poetry by the “Eurocentric” gatekeepers of our universities’ English departments. It might be an interesting argument; however, it seems Ms. Abiyah has arrived a little late to the party. The gatekeepers have been sent home, replaced by the functionaries of postmodernism, multiculturalism, and gender studies. I shouldn’t think hip hop will have too long a wait to be welcomed into the hallowed halls of academia. (Why exactly its practitioners seek this “validation” is a bit of a mystery, as they have attained fame, fortune, and public acceptance far beyond the wildest dreams of “academically approved” poets.)

    But my point is not to argue the relative merits of hip hop or any other genre of poetry; it is about self-expression, and Ms. Abiyah has some interesting things to say on that matter:

    Certainly, there are basics of poetry that may need to be learned, but the learning of these techniques may inhibit rather than enhance the Hip Hop poet’s ability to express himself or herself. Academia or academic settings tend to discourage the Hip Hop poet, especially those who are innovative and experimental. Poems cannot and will not be created by recipe. In a classroom setting, particularly one focusing on creative writing, pre-emptive judgment calls by an instructor on the validity of a student’s poetry can be extremely detrimental. The instructor must be well-versed in cross-cultural contexts in order to fairly interpret each individual student’s poems.

    If one were to strip away the highly shellacked multicultural mumbo-jumbo, this passage could be reduced to the single line of an iconic pop reference: “We don’t need no education.” Imagine, if you will, the temerity of a teacher who would actually want to teach rather than “interpret” his students’ poems! Ms. Abiyah generously concedes that students “may” need to learn some “basics of poetry” but that should never, ever stand in the way of a young poet’s need to express herself, regardless of how ill-conceived and poorly executed this form of self-expression might be. After all, any form of criticism might prove to be detrimental to the fragile psyches of these fledgling sons and daughters of the Muse.

    (At this point I must confess to being perplexed by what is meant by “pre-emptive judgment calls.” In a flight of fancy I envisioned a harried teacher, exasperated with his students’ unwillingness or inability to learn, taking aim with personal-sized heat-seeking missiles at the more offending miscreants.)

    Of course the gem in this is the sentence, “the instructor must be well-versed in cross-cultural contexts in order to fairly interpret each individual student’s poems.” Besides being a sterling example of the doublespeak of multicultural prosody, it leaves open the question of quantity. Just how many “cross-cultural contexts” must an instructor have under his belt in order to be worthy of guiding (not criticizing, mind you!) our rights-sodden youth? Ten? Twenty? I cannot help but wonder whether, had I been more insightful in my misspent youth, I would have demanded (fully expecting that my rights would be honored!) that my instructors be well versed in the intricate meters of the 18th century Gaelic poets which best represented my particular cultural context. Even now, I am brought to the brink of weeping knowing that the power elite’s unjust criticism of my garbled syntax was merely a cultural imperialist agenda to eradicate the vestiges of Hiberno-English from my speech and writing.

    All this leaves me wondering how, in the bad old days before political correctness became the law of the land, these little darlings would have fared with raw, unfiltered criticism. I once had an editor (who had obviously skipped his sensitivity training) tell me that I should rewrite my article before I threw it away. Stung, bewildered, and dismayed that he did not think every one of my musings was twenty-four carat gold and that I needed a paycheck that Friday, I set about rewriting the article. Three things occurred: my editor received a coherent, if not brilliant, article he could put in front of his readers, I learned I could take constructive (or for that matter, any other) criticism without too fine a point on it, and I got paid. Perhaps my stock on the self-esteem exchange plummeted momentarily, but as with so many things, I got over it. This same editor was also fond of telling me that writers should have the hides of rhinoceroses and that when he was finished with me, I would be at least an armadillo.

    With the advent of the Internet and inexpensive publishing programs, writing poetry has been thoroughly democratized. Never mind the fact that the demos can barely register anything beyond a yawn, scratching its collective head while wondering what in the world they were saying. In short, anyone who wants to be published can be. Still, many fail to muster the minimal effort this requires. One should, I suppose, never underestimate the power of indolence.

    Small “literary” or “culture” ‘zines now flourish like mayflies in the spring, and enjoy equally short life spans. Their detritus litters cyberspace as well as the physical world. Within half an hour I could find dozens of these screeds of ego-driven self-pity, but for the sake of brevity I shall present one particularly egregious specimen:

    “i’m (sic) trying to get out there,/to make myself known,/ i dont (sic) read other poets/ afraid they’ll mess up my flow.”

    God forbid three thousand years of accomplished verse should “mess up” his flow. The reader, if she is prepared to ignore the lapses in grammar, punctuation and syntax (and after all, they are outdated elitist modes of discourse designed to subjugate their individuality) is left only to conclude that she is dealing with (a) an egomaniac of unparalleled proportions, (b) willful ignorance of unparalleled proportions, or (c) a paranoid amateur who is simply too lazy to pick up a book.

    Hardly an inducement to read further, I would think.

    Talent is a funny thing. Well-honed and practiced, it can delight and enrich the human experience in ways very few things can, while ill-prepared, undisciplined talent can only aspire to disappointment and eventually, tragic waste.

    For those of you who might be less than charitable in regarding me as an “elitist,” “reactionary,” or my personal favorite, “cultural imperialist” (I have visions of a snappy uniform, perhaps some sort of crown, maybe?) or any number of tired invectives that those who cannot be troubled to mount any sort of intellectual response would use, I have this to say: Hey, I’m only expressing myself!

    Barney F. McClelland’s work has appeared in Electric Acorn 2001 (Dublin), The Meridian Anthology of Contemporary Poetry, Aura Literary Arts Review and The New Formalist. In 2001 he was awarded the KotaPress Anthology Award for Poetry. In his spare time, Mr. McClelland enjoys reading the works of dead white European males, smoking cigarettes, and plotting revenge.

  • Anti-realism – what’s at stake? An interview with Jonathan Rée

    There is a certain caricature of philosophers which has it that they spend
    their time arguing about whether things like tables and chairs exist. This is
    just a caricature, but nevertheless there is an element of truth in it when
    it comes to the debate about realism and anti-realism. Put crudely, realists
    – or, more precisely, external realists – think both that the world exists
    independently of our perceptions of it and thoughts about it, and that we can
    reliably know about the world. Anti-realists, for a variety of reasons, doubt
    both these propositions.


    The philosophical debate about realism and anti-realism – which involves arguments
    about, for example, sense experience, language, and the nature of knowledge
    – is complex and esoteric. However, in recent times, as Jonathan Rée
    points out, it has found more public expression in the concern that scientists
    have about the way that their endeavours are treated by the humanities.


    In fact, this is a long-standing concern. It was in 1959 that C. P. Snow gave
    his famous lecture on ‘The Two Cultures’, in which he expressed dismay at the
    division between the arts and the sciences, and the hostility with which the
    practitioners of each viewed the other. It appeared to him that ‘the intellectual
    life of the whole of western society… [was] increasingly being split into two
    polar groups.’ This he considered both culturally and politically damaging.


    More than forty years on, the divide and hostility remain. These came to the
    fore a few years ago in l’affaire Sokal. Inspired
    by what he saw as the obscurity and ambiguity of much post-modernist writing,
    the physicist Alan Sokal hoaxed the journal Social Text into publishing
    an ostensibly serious article on ‘postmodern physics’ that was in fact a clever
    parody. He followed this up with the book Intellectual Impostures, jointly
    authored with Jean Bricmont, which was sharply critical of the work of some
    of the most fashionable names in the humanities. His motivation, he said in
    The Philosophers’ Magazine, had to do with challenging the rise in a
    ‘sloppily thought-out relativism’ and with exposing ‘the gross abuse of terminology
    from the natural sciences in the writings of French, American and British authors.’


    Sokal is not the only scientist to rail against the shortcomings of the humanities.
    Twenty-eight years earlier, Peter Medawar had warned in Science and Literature
    that he ‘could quote evidence of the beginnings of a whispering campaign against
    the virtues of clarity. A writer on structuralism in the Times Literary Supplement
    has suggested that thoughts which are confused and tortuous by reason of their
    profundity are most appropriately expressed in prose that is deliberately unclear.
    What a preposterously silly idea!’ And Richard Dawkins, in a review of Intellectual
    Impostures
    in the journal Nature, encourages people to ‘Visit the
    Postmodernism Generator [http://www.cs.monash.edu.au/cgi-bin/postmodern]. It
    is a literally infinite source of randomly generated, syntactically correct
    nonsense, distinguishable from the real thing only in being more fun to read… Manuscripts
    should be submitted to the "Editorial Collective" of Social Text,
    double-spaced and in triplicate.’


    It is in the context of these vituperative exchanges that Rée, over
    the last few years, has published a number of essays and articles, which argue
    that the ‘Science Wars’ are based more on misunderstanding than on real disagreements
    about the status of scientific knowledge. Why, I ask him, is he relatively unmoved
    by the clash between the purported ‘friends’ and ‘enemies’ of science?


    "One reason that I am unmoved by the melodrama," he replies, "is
    that historically people have had absolutely no problem with the idea that science
    is a social phenomenon. Scientists such as J. G. Crowther and J. B. S. Haldane
    were very keen on the idea that science was the product of various kinds of
    social relations and that these were the social relations that made it possible
    to produce knowledge that was testable and reliable. They celebrated historical
    studies of science as ways of explaining the heroic progress of science towards
    truth. Now if you imagine yourself back in that situation, then you are reminded
    that there is not necessarily a conflict between social studies of science and
    a belief in the truth content of science."


    However, Rée has one caveat. "It seems to me that these scientists,
    who thought there was no conflict between science and history, had a notion
    of scientific progress that I think was superficial – one that nobody really
    ought to believe anymore. Their model suggested that there was a pre-ordained
    destination which scientific enquiry was going to end up at. But I think that
    one can have a very strong idea of scientific progress, without supposing that
    it is predetermined what is going to be the better form of knowledge that emerges.
    So my notion is that there are lots of possible ways in which science could
    progress in the next century, all of them would be progress, but none of them
    would be the only possible way in which progress could be made."


    The importance of this caveat is that it is suggestive of an anti-realist strand
    in Rée’s thought. Particularly, it seems that in his conception, the
    progress of science is not governed by the nature of the objects of scientific
    enquiry. However, a critic might respond that whilst indeed it is impossible
    to predict how science will progress, it is nevertheless the case that its progression
    will be constrained by the nature of the objects it investigates. And moreover,
    that there are certain ways of looking at the world, certain theories, that
    are effectively dead, for example, Lamarckism – the belief that it is possible
    to inherit acquired characteristics. I ask Rée whether this is a point
    that he would accept and, if so, whether he feels there is, therefore, no
    contradiction at all between the objectivity of scientific truth claims and
    the fact that science is a social and historical phenomenon.


    "Yes, I think that I would accept these points," he replies. "But
    one of my proposals for advancing this debate is that there should be an embargo
    both on the word ‘objective’ and on the word ‘relativism’. I mean it’s ludicrous
    to think that adding the word ‘objective’ makes a truth any more true. There
    is a very important distinction between propositions that are true and propositions
    that are false. But I don’t know what further distinction is intended by adding
    the word ‘objective’. It is simply a rhetorical move that does mischief to the
    whole debate. What people need to understand is that the only truths that are
    available to us are those of specific historical contexts, but that they are
    no less true for that."


    In a sociological sense, the claim that truths are necessarily historical is
    unproblematic. However, it does raise the question as to the criteria for assessing
    truth-claims. I ask Rée whether he has any settled thoughts on what these
    are.


    "I don’t think," Rée says, "that there is a useful general
    answer to that. I mean there are textbook distinctions between correspondence,
    coherence, and pragmatism, that kind of thing,
    but it doesn’t seem to me that these add up to very much. I think you need to
    ask in more detail about how particular communities work out methods for attaining
    the kinds of true propositions that they want to get agreement on."


    The problem with this kind of answer is that the absence of general criteria
    for assessing truth-claims does seem to suggest the kinds of relativism that
    scientists find so infuriating. If the claim is that both truths and the criteria
    for truth are constituted in particular discourses, then, without adding in
    some extra ingredient, how is it possible to distinguish true propositions from
    false propositions?


    "Well," responds Rée, "I suppose I should first say that
    one should always be cautious before deciding that something is false. It is
    necessary to establish conversations with people who believe seemingly false
    propositions to determine exactly what it is they believe, then, once you’ve
    understood it, you may well find that there is something true in their belief.
    I think one of the side-effects of getting worked up about the idea of objective
    truth is that people do tend to get too impatient to investigate the possibility
    that there may be something they can learn from things that they are at first
    appalled by. But, of course, that is not to say that there are not some beliefs
    that are completely false."


    But again, if the criteria for truth are themselves constituted within discourse,
    what is it that enables us to privilege certain of these criteria so that we
    can meaningfully say that some beliefs are completely false?


    "I think this is why Rorty, who is quite wise about this matter, says
    that you should talk about intersubjectivity rather than objectivity,"
    replies Rée. "The question is not about different realities and
    how they connect up, but different conceptions, different vocabularies, and
    how they connect up. What you need to do is to experiment with trying to have
    conversations with people and to see whether you can negotiate some kind of
    linkage between the way that you’re talking about things and the way that they
    do. To the extent that this strategy is unsatisfactory, it is because our epistemological
    condition is unsatisfactory. I mean the fact is that it can always turn out
    that the things that we are convinced are unrevisably true might in fact be
    problematic in completely unexpected ways.


    "I said that there are two terms that should be embargoed,"
    continues Rée, "the second one being ‘relativism’. It does seem
    to me that people who put themselves forward as friends of science, use the
    word ‘relativism’ to describe a position that they regard as being totally opposed
    to the notion that there can be such a thing as scientific progress. But if
    that is what relativism is, then I don’t know anyone that believes in it. An
    alternative tactic is to say ‘Sure, we should all be relativists’, because it
    seems to me that when we actually think about what the term relativism means,
    then it is a theory about how you get truth and how you know you’ve got it.
    And it is simply an unfair debating point to suggest that to be a relativist
    is to be someone who does not believe there is such a thing as truth. It is
    just that a relativist is someone who tries to be explicit about the various
    standards by which truth is measured in different contexts."


    All this seems perfectly reasonable. It is, of course, important that people
    who make conflicting truth-claims should attempt to establish points of connection
    in order to examine their respective beliefs and belief systems more closely.
    It is also at least arguable that scientific truths are by their very nature
    provisional. And further, it is the case that truths are constructed within
    particular discourses, and, in that sense at least, they are contextual. But
    a nagging doubt remains. And it is the same point as before. If the validity
    of truth-claims can only be established in terms of criteria that are
    themselves internal to particular discourses, what happens when a person inhabiting
    a non-scientific discourse refuses to accept, despite all attempts at persuasion,
    some of the established truths of science – for example, that the earth is more
    than 6000 years old or that it is not flat? It seems that the logic of the kind
    of position outlined by Rée means that it is not possible to privilege
    the scientific version of truth over the non-scientific version. But surely
    he cannot be happy with that outcome?


    "Well, the truth is," Rée admits, pausing, "that I’m
    not really able to give an interesting answer to the question, as you pose it.
    But I wonder why you put it in terms of beliefs that are so barmy that they
    are scarcely intelligible? What if it were in terms of something like Holocaust
    denial, where there is a genuine disagreement and it’s not really a disagreement
    over criteria. It seems to me that whilst it is undeniably exasperating to find
    people who stubbornly refuse to accept what you take to be pretty conclusive
    evidence, it is not fair to be asked ‘What are you going to do about the fact
    that you can’t change their minds?’ – at some point you just have to shrug your
    shoulders and simply say ‘Well, I can’t.’"


    But there are reasons for posing the question in terms of ‘barmy’ beliefs.
    Firstly, plenty of people believe things which in scientific terms are very
    bizarre – for example, opinion poll data suggests that about a third of Americans
    reject the idea of human evolution, and another third are undecided. And secondly,
    the more bizarre the beliefs, the more it becomes clear what is at stake in
    committing oneself to a conception which holds that the criteria for truth are
    only internal to particular discourses. Specifically, it brings into sharp focus
    the fact that this conception allows no definitive grounds for rejecting propositions
    that we nevertheless are certain are false. So I ask Rée what exactly
    he would say to someone who insisted that the earth was flat or that mermaids
    lived under the sea?


    "What you have to say is that, as far as I can see – and I may always
    be wrong – these beliefs are barmy. I think that the phenomenon that you are
    pointing to is just the fact that people can get into disagreements where it
    is extremely difficult to make any progress. But I think that that is just our
    shared epistemological condition, and I don’t see that claiming that what you’ve
    got is absolute truth and what they have got is not, is going to help. I would
    use the example of barmy beliefs as a way to bring you round to my slogan, which
    is ‘Neither a realist nor an anti-realist be.’


    "Listen," Rée goes on, "everything that the ‘friends
    of science’ want to say about the extraordinary achievements and progress of
    the natural sciences, both in terms of knowledge and in terms of technique,
    all of these things can be said by someone who describes themselves as a ‘relativist’
    and there is no intelligible sense of relativism that would lead you to deny
    the reality of scientific progress."


    So what then about the ultimate structure of the external world? Does the contextual
    nature of all truth-claims mean that this structure is always beyond our reach?


    "Well," says Rée, "I don’t think there is anything more
    satisfactory than invoking the Rorty move that I have already mentioned. This
    consists in saying that there is no real difference between talking in an upbeat
    way about getting to know more about the ultimate structure of the world, and
    talking in a more depressed kind of way about the possibilities of including
    more people in a conversation. It seems to me that they really come to the same
    thing. So the question becomes: how do the particular discourses of specialised
    sciences relate to other scientific discourses and to discourses outside science?


    "If you’re in a conversation with someone who is worried about having
    the ultimate structure of the world taken away from them, then you need to make
    them see that what they’re asking for is beyond what any possible agreement
    in the future about how to look at the world can deliver. They keep saying that
    they want objectivity, but they don’t actually need it, so the point is to close
    the gap and to say ‘You’re worried about being deprived of something that actually
    you haven’t got, and you wouldn’t know if you had.’ It’s a chimera, this thing
    that they’re worried about having taken away from them.


    "Imagine that we’re talking with a scientist," Rée continues,
    "worried about his work not being taken seriously – I think that we’re
    paying all the respect that a scientist could dream that we’d pay to the scientific
    enterprise if we say that relative to human discourses, science improves the
    knowledge and control we have over things that matter to us. Of course, you
    can say ‘Well, it does that because it tells us the truth about the objective
    structure of the world’ – and that’s fine, you can say that, but it’s hardly
    an ontological big deal."


    But if that is what Rée thinks is going on in scientific discourses,
    that they are telling us truths about the objective structure of the world,
    then surely that is a realist position, it is not some kind of half-way house
    position.


    "Yes," admits Rée, "that is what I’m saying, except that
    I think the word objective is a waste of space. Or are you trying to contrast
    the objective structure of the world with its subjective structure? I wouldn’t
    if I were you. But rather than ‘Neither a realist nor an anti-realist be’, perhaps
    I should say, ‘Neither an anti-realist nor an anti anti-realist be!’"


    Selected Bibliography
    ‘Rorty’s Nation’, Radical Philosophy, No. 87, Jan/Feb 1998.


    This interview is extracted from What Philosophers Think, edited by
    Julian Baggini and Jeremy Stangroom, and published by Continuum (2003).

  • SARS in a Wilderness of Mirrors

    There is an old Chinese folk tale in which a fool
    deposits 300 pieces of silver in a hole. In order to conceal his hoard, he
    puts up a sign nearby to announce that “300 pieces of silver do not lie here.”
    The moral of the tale was that the more you try to cover something up, the more
    obvious it is that something is being concealed.


    The Chinese government, fiercely vigilant against
    any manifestation of press freedom, are learning this lesson the
    hard way with regard to the viral condition known as SARS, or Severe Acute Respiratory
    Syndrome. It used to be thought that in China, the only way of confirming that
    a story was true was if the state-owned press had already emphatically and categorically
    denied it. The belief persists.


    When there is a sort of institutional wall of
    silence on every possible issue, information of some description still squeezes
    itself out somehow. The thirst for news must be slaked, even when there is no
    news. During the brouhaha of the APEC conference, held in Shanghai in Autumn
    2001, the city pricked up its ears amid rumours about unexploded bombs in airports
    and a terrorist assault on a hotel on the city’s main thoroughfare, Nanjing
    Road. Similarly, one could hear whispers in local bars following the outrage
    of 9-11, with stories concerning Shanghai-based Muslim terrorists being rounded
    up by the authorities, and several of them going underground or fleeing to the
    remote West, where, according to another rumour, a fierce military campaign
    against insurrectionaries was being conducted by a joint Chinese-Kazakh army.
    Which of these stories was true? None of them were mentioned in the People’s
    Daily, the organ of the Communist Party, and even if they had been, most readers
    would have remained suspicious. In the West, where the press is considered to
    be relatively free, such a “wilderness of mirrors” approach to the honesty of
    the media is quite commonplace in conspiracy theory circles. In China, with
    the media compelled to “follow Xinhua”, the state-run news agency, scepticism
    is obligatory.


    The internet has offered a valuable outlet for
    a nation starved of debate. But with the internet, any cockeyed rumour one cares
    to invent very quickly rolls out of control. Indeed, one of the main scares
    about SARS in Hong Kong was caused by a fourteen-year-old boy who faked a story
    saying that the region had been declared an “infected city”, prompting a wave
    of panic buying throughout the former colony and a volley of denials from the
    authorities.


    No one, of course, trusts the local Chinese press.
    Not even the local Chinese press. Couple that with an insidious network of chatroomers
    and e-mailers, most of whom have heard talk of SARS victims being dragged away
    in the dead of night to secret military hospitals, or of snifflers being dragged
    off aeroplanes just before take-off, and at the very least you have an atmosphere
    of panic. And by the beginning of April, there was a crisis of confidence. Conferences
    were being cancelled from Beijing to Jakarta, and before long, everyone would
    surely be wearing tin-foil hats and gas masks and joining the queue for rations.
    Most of us were running through the list of symptoms, checking our temperatures
    and pulses and wondering whether the malaise, myalgia and dry coughing that
    had been an inevitable part of everyday life over the past several years could
    still be attributed merely to the drinking, smoking and late-night parties.


    The lonely fight for facts, it seemed, would be
    left to the foreign press stationed here in China. Meanwhile, the authorities
    finally decided to make their move.


    At the beginning of April, the municipal government
    in Shanghai finally acknowledged that the virus had hit the city. Most of the
    foreign journalists stationed in Shanghai had arrived at a small conference
    room in the International Hotel. The place hissed with gossip, and there was
    a pervasive sense that something momentous was about to be disclosed, that somehow,
    the big SARS balloon that had been blowing up for weeks was about to burst.


    Although the Health Bureau official announced
    that there was only one SARS case in Shanghai, the assumption was, naturally
    enough, that he was lying. At best, a turning point had been reached. This seemed
    to be a compromise, face-saving revelation that would allow more frankness further
    down the line. After all, the government had responded with blanket denials
    up to then, and to shift suddenly from total disavowal to the announcement of
    a dozen deaths would have been too great a volte-face. As I write, the
    official figure in Shanghai has risen, but only to two – the victim’s father
    had also succumbed to the illness – and there were also a number of suspected
    cases.


    The reporters at that first press conference were,
    of course, spitting feathers. “The Shanghai government are a responsible government,”
    the spokesman said, to a chorus of groans.


    The natural assumption was, and remains, that
    behind all the reassuring smiles, the city’s hospitals actually resembled scenes
    from Night of the Living Dead. The government had denied the issue for
    so long that it came as no surprise that no one believed them when they began
    to make noises.


    And so, sensitive observers were suddenly noticing
    the ambulances nipping through the traffic, presumably rushing to deal with
    the latest sighting of SARS. Late at night, cleaners were seen to emerge, spraying
    the streets with gallons of Dettol. Minor coughs and colds, common at this time
    of year, were thought to be the beginning of a viral cataclysm.


    It seemed at one point that the scourge of SARS
    was about to bring (a) economic growth, and (b) civilization to an end. It was
    probably unfair to conclude that, finally, after years covering Communist Party
    puppet shows and international business junkets, the hacks at last had something
    to get really indignant about: after all, the panic was palpable, and
    the statistics – particularly from Hong Kong – were thoroughly disturbing. And
    it was undeniable that the Chinese government, despite 20 years of economic
    reform, remained culturally disinclined towards the sort of openness and transparency
    that might have curtailed the spread.


    Last weekend, following the revelation that there
    were as many as 339 diagnosed cases in the capital, almost ten times the previously
    acknowledged figure, a couple of scapegoats were identified, with the mayor
    of Beijing and the head of the Ministry of Health both dismissed. The government
    are now promising full disclosure, but the e-mails continue to flow, the rumours
    keep on rumbling. Who the hell believes them now?


    And so, the panic grows. Staff at the US consulate
    here in Shanghai were reported on Friday April 11 to be suffering from SARS-related
    symptoms, and Reuters breathlessly issued a report. “The US consulate
    in Shanghai said in an email seen by Reuters that two Americans were
    among nine being treated at the Shanghai Pulmonary Disease Hospital with symptoms
    of SARS.” That night, an ominous air seemed to pervade the clubs and bars scattered
    throughout Shanghai.


    But it turned out to be untrue. The victims were
    released from hospital shortly afterwards, having been shown to be suffering
    from nothing more than severe colds.


    Meanwhile, the Australian government have now
    put SARS in the same category as plague, cholera and yellow fever. This
    follows comments by the New Zealand prime minister, Helen Clark, warning that
    the problem could be even worse than the 1918 flu epidemic.


    Many of the news agencies based in China have
    imposed emergency measures. Among the foreign community, rumours spread very
    quickly, and an otherwise luxurious existence has been laced with danger.


    Even sceptics are hedging their bets, citing the
    belated reaction of the Chinese government as well as the traditional sources
    at the World Health Organization. Public occasions are marked with a curious
    sense of esprit de corps in a climate of pre-storm calm, almost as if
    we are amazed that we have found the courage to leave our homes. The vocabulary
    of foreign residents is now larded with the jargon of virology, and their tales
    – hushed and weary – resonate with all the old fears. We listen to third-hand
    hearsay about the woman in the flat below or some friend of a friend currently
    manning the hospital barricades.


    It is better to be safe than sorry, say some.
    Hong Kong – which has already seen its confidence battered in recent years –
    has been shaken to its boots by the SARS crisis. Receipts at Hong Kong’s restaurants
    are reportedly down 50-80%, and its cinemas are empty. Housing sales are down
    65%. There are 70% fewer tourists. One estimate suggests monthly economic losses
    of HK$8 billion.


    Analysing the current figures is a difficult business,
    not least because they are rising every day. It is worth noting, however, that
    even if the accumulated total number of cases in Beijing reaches 1,000 in the
    coming days or weeks, it still represents only 0.0075% of the total population,
    and that of all the cases, there seems to be a survival rate of well over 90%.
    The rate is much more worrying in Hong Kong, with an infection rate of about
    0.02% and a fatality rate currently at 6.7%.


    But is the figure worrying enough to justify the
    recent claim made by the popular local e-newsletter and website, www.c-biz.org,
    unerring advocates of the “we’re all going to die” school of journalism? Citing
    the Hong Kong Standard, the newsletter claimed that preventative measures had
    come too late, and that the virus was now “threatening virtually all the country’s
    1.3 billion people”.


    There are other things happening throughout China,
    of course, none of which are covered by the domestic media. There are mass lay-offs
    at state-owned companies leading to strikes and police crackdowns. There is
    the endemic corruption and gangsterism throughout the provincial-level cities.
    There is the ongoing scandal of the Falungong, a tawdry, superstitious little
    cult that was transfigured by the excesses of the State into a pious order of
    martyrs, to which Western pilgrims regularly pay homage on Tiananmen Square
    before being unceremoniously expelled.


    The spread of HIV/AIDS in China has also become
    a subject of concern, with the official number of sufferers currently standing
    at 850,000 and thought to underestimate the real total. According to the darkest
    predictions cited by Kofi Annan in his visit to China last year, that figure
    threatens to multiply to as many as 10 million by 2010. Meanwhile, a recent
    report revealed that tuberculosis still kills 2 million people worldwide every
    year, 98% in developing countries, and that a third of the world’s population
    is infected with the TB bacillus.


    There is something about infectious diseases that
    brings out the worst in us. On April 20, the International Herald Tribune,
    reporting from Hong Kong, told of yet another side effect of SARS. Victims,
    the report said, were being ostracised. In accordance with government regulations,
    buildings in which infections have taken place are marked clearly with a sign.
    Hong Kong officials were initially reluctant to impose harsh restrictions for
    fear that victims would be “driven underground”.


    Amid the mayhem, I look for rational voices. I
    am consoled by the words of Christine McNab, a spokeswoman for the WHO speaking
    to The Guardian, who said that most of the cases were in a hospital setting.
    “The rest seem to be people in very close contact with affected people. That
    narrows the risk to the general population. The pattern of how this is moving
    does not indicate at this point that there is a widespread risk to the general
    population.”


    I listen carefully to the words of John Oxford,
    professor of virology at the Queen Mary School of Medicine in London, who said
    that more attention should be paid to the potentially greater threat posed by
    new strains of influenza. “SARS does not look like it is explosively infectious,”
    he said. “Most of the cases seem to have come out of hospitals, among doctors
    and nurses, all of whom have been in very close contact with ill people. It
    is causing problems in certain environments, but it’s not zipping around the
    globe.”


    And yet, by now the panic is more infectious than
    the virus itself, and even the sturdiest of observers are struck regularly by
    the thought that infection is just a stray droplet or contaminated elevator
    button away. As is the case in most health scares, the very act of paying attention
    – of isolating, analysing and comparing statistics – creates a potent symbolic
    space, one which draws on all our fears about mutating bacteria, outfought immune
    systems, and mass pandemics, and leaves very little room for a sense of proportion.

    David Stanway is a writer, editor and translator living in Shanghai.

  • Shiva the Destroyer?

    Postmodernist anti-science thought was once primarily associated with European
    and North American academics in the humanities. Now not only has its influence
    become international, but it has become integrally intertwined with a number
    of other issues such as anti-globalization, anti-transgenic technology in agriculture,
    and conservation. Nobody can fault the prevailing internationalism of postmodernists
    and their respect for different cultures and peoples (except for the culture
    of those who are committed to modern science/technology and its benefits). Nor
    can we fault their argument that all of us have biases, though they fail to
    comprehend the vital role that scientific method plays in helping to overcome
    the limitations which personal and cultural biases impose. Their belief in the
    worth and dignity of all human beings is unexceptionable. Some of us critics
    would suspect, however, that in going global, postmodernist thought does not
    necessarily impact on other political/cultural traditions in a way which upholds
    the worthy ideas that most postmodernists claim to espouse. To the extent that
    these postmodernist ideas have become part of the globalization debates, there
    is a legitimate issue of consistency if in fact what is being forcefully advocated
    produces adverse outcomes contrary to what its proponents claim for it.


    None of us are totally consistent in all our beliefs, nor can we find total
    consistency in the various political or social movements we may be committed
    to. Life and the world of ideas are messy, and so we can take heart with Ralph
    Waldo Emerson’s strictures against that foolish consistency which is the hobgoblin
    of petty minds. A little untidiness and a few gaps in our knowledge here and
    there are probably healthy, and facilitate the emergence of new ideas. However,
    the argument to be pursued here is that there is a basic inconsistency, or more
    accurately, a fundamental contradiction between what has been advocated
    by a type of postmodernist thought, and its practical outcome in developing
    countries. It is a contradiction that is often so blatant as to undermine whatever
    merit there may be in the avowed postmodernist respect for other cultures. Stated
    baldly, the respect for “local ways” of knowing, rather than promoting multi-culturalism,
    ends up instead promoting crass forms of cultural chauvinism and intolerance
    that can devolve into violence. In our internet/information age, there is no
    excuse for those who have entered various globalization debates without knowing
    the outcomes and implications of their advocacy.


     


    Local knowledge and reactionary politics


    Dr. Vandana Shiva is likely the world’s most celebrated holistic ecofeminist,
    deep ecologist, postmodernist luddite, anti-globalizer, and spokesperson for
    those she claims are without a voice. Because she has advanced degrees in science,
    Shiva is useful for providing legitimacy to a range of anti-science views on
    the part of those who mistrust scientific inquiry (except where they think that
    it will promote their ideological agenda). Contemporary ecofeminist literature
    is almost unreadable, particularly on the Green Revolution, which ecofeminists
    deem to be a failure, and on “organic” agriculture, which they favor. Being
    able to cite Shiva as a presumed authority allows them to talk about global
    agriculture without any substantive knowledge of how peoples around the world
    raise crops and feed their families. One wonders how many academics obtained
    tenure on the basis of books and articles for which Shiva was a major source.


    One leader does not fully define a movement, to be sure, but Shiva with her
    condemnation of “scientific reductionism” has become so preeminent in the global
    deep ecology/ecofeminist movement against modern science that raising serious
    questions about her does in many respects raise questions about the entire movement.
    Shiva’s ideas, which are shared and promoted in the West by ecofeminists and
    others as radical and revolutionary, often turn out to have reactionary consequences
    where they are practiced in India.


    This may come as a shock to the true believers, but for many the faith in the
    fundamental rightness of Shiva’s message is so firm that it would be a near
    impossibility to convince them otherwise. The philosopher of science Meera Nanda
    shows that the much revered “holistic way of knowing … lies at the very heart
    of caste and gender hierarchy in India” (Nanda 2002, 54). “The role that the
    goddesses and the idea of sacredness of nature have played (and still play)
    in perpetuating the oppression of actual women is not adequately understood
    by the enthusiasts for alternative sciences” (Nanda 2003a). It is the much venerated
    “local knowledge” of the Hindu cosmology of “Karma and caste” which was used
    to justify the repression of Dalits (the crushed or oppressed – untouchable).
    The liberation of women is “linked” to overcoming the “kind of cultural assumptions
    about sacredness and holism” that are promoted by Shiva (Nanda 2003c).


    Many of those now promoting the virtues of “local ways of knowing” were, we
    hope, opponents of it in its pre-postmodernist manifestations. From 1948, with
    the election of the National Party in South Africa, to the early 1990s, a similar
    reverence for “local ways of knowing” appropriate to the culture was proclaimed
    and promoted as “Bantu education.” It was called Apartheid and many of us spent
    most of our adult life in active opposition to it, as, undoubtedly, did many
    of today’s activists who tout the special virtues of local knowledge.


    Among the many reasons for opposition to Apartheid and its repressive policies,
    was that the so-called “Bantu education” would handicap the student even in
    a non-Apartheid society by not providing her or him with the knowledge necessary
    to survive economically. Today we have what is misnamed as “Science Studies”
    promoting a “Navajo way of knowing” (which is “assuredly more spiritual and
    holistic than European ways”) in learning mathematics by “teaching calculus
    before fractions” (Olson 1999). Among many problems with this method of teaching
    is the “difficulty of expressing the slope of a line, one of the fundamentals
    of calculus, in any way other than by using a fraction or decimal” (Olson 1999).
    Thus, “while well-meaning teachers puzzle out such difficulties, Navajo children
    are … to grow up without learning how to compute sales tax” (Olson 1999).
    From the elite precincts of Western universities, “multi-culturalism” has spread
    to other parts of the world. Across the border from where Shiva’s ecofeminism
    lends support to Hindu chauvinism, Pakistani proponents of “Islamic science”
    and “Islamic epistemology” have been:




    citing the work of feminist science critics in their campaign to purge
    many Western ideas from the schools, and certain feminist professors in
    the West–perhaps caught up in the thrill of having their work cited half
    a world away–have favorably cited the Islamicists right back (Olson 1999).




    Not to be outdone by Shiva’s Indian advocacy, in the United States there are
    advocates of a mysterious entity called “feminist algebra” (Bookchin 1995, 212).
    When the right-wing Bharatiya Janata Party (BJP) came to power in Uttar Pradesh,
    India, in 1992, they sought to awaken “national pride” by making “Vedic mathematics
    compulsory for high school students” (Nanda 1996). “Hindu ways of knowing” involved
    government-approved texts replacing standard algebra and calculus with sixteen
    Sanskrit verses. Leading Indian mathematicians and historians examined the verses
    and found “nothing Vedic about them,” thinking them merely a “set of clever
    formulas for quick computation” and not a “piece of ancient wisdom” (Nanda 1996).
    According to Meera Nanda (1996), “in the name of national pride, students are
    being deprived of conceptual tools that are crucial in solving real-world mathematical
    problems they will encounter as scientists and engineers.”


    Hinduization extends beyond mathematics to promoting the “Aryan race” together
    with a disdain for all “foreigners including Muslims.” The BJP along with the
    VHP (Vishva Hindu Parishad or World Hindu Council) are offspring of the RSS
    (Rashtriya Swayamsevak Sangh or Organization of National Volunteers) which has
    been actively promoting hatred of Muslims and Christians in India, and has been
    involved in the destruction of Muslim and Christian places of worship and fostering
    deadly riots against non-Hindus. Postmodernist/ecofeminist multi-culturalism
    might be a worthy idea in some ways, but when it is integrated with a “suspicion
    of modern science as a metanarrative of binary dualism, reductionism and consequently
    domination of nature, women and Third World people” it supports Hindu reactionary
    modernists who claim the “same holist, non-logocentric ways of knowing not as
    a standpoint of the oppressed but for the glory of the Hindu nation itself”
    (Nanda 2000, 2001a).


     


    The Chipko "Movement"


    Many activists like Shiva, who are promoted in the West by the anti-globalization
    Greens and who receive uncritical acclaim, are often the object of very severe
    criticism in their own countries, a fact which goes largely unreported. After
    an article in a Malaysian newspaper talked about Shiva in highly flattering
    terms, claiming that she was a leader of the famed Chipko (tree huggers) movement
    in India, the Chipko local activists sent a letter of protest to the editor,
    arguing that the interview was based on false claims and noting that it had
    angered many people. Those writing the letter saw themselves as being the “real
    activists,” who do not understand why Shiva is “reportedly publishing wrong
    claims about Chipko in the foreign press.”


    Shiva uses Chipko as a model for Green ideologies from deep ecology to eco-feminism.
    Jayanta Bandyopadhyay, a distinguished scientist and environmentalist, examines
    each of these ideologies and deems them myths without any basis in fact (1999).
    He is an active supporter of the Chipko villages, in which he finds “a movement
    rooted in economic conflicts over mountain forests,” and a “social movement
    based on gender collaboration” and not a “feminist movement based on gender
    conflicts” (Bandyopadhyay 1999).


    Chipko is but one example where external activists, even well-intentioned
    idealists, in effect hijack a movement and use it to promote an
    ideological agenda. The original motivation for “participating in Chipko protests”
    was to gain local control of forest resources in order to create a forest-based
    industry which offered the Himalayan villagers the possibility that their kinsmen
    who had to migrate to find work, might be employed closer to home. Further,
    increased local access to forest resources might “have offered women the possibility
    of adding to their meagre incomes and insuring themselves from potential crisis
    if remittances ceased or became intermittent” (Rangan 2000, 199-200).


    Chipko is one of many cases of environmental groups in developed countries
    co-opting a cause like wildlife or habitat conservation, or a local movement
    with legitimate grievances, and then subverting them. In the case of Chipko,
    the co-option was initially by people from the urban elite in India, who received
    international acclaim as a result. As with other cases that I have examined,
    in places like Africa and the Americas, not only do local concerns get brushed
    aside, but often the locals are worse off because of the external “support.”
    This is particularly true in case after case that I have examined for conservation
    projects, be they in Africa, Central America or India, where local interests
    are swept aside in favor of saving the environment from those who live there
    (DeGregori, 2004, Chapters 4, 10 & 11 and DeGregori 2002, Chapter 2).


    One of Shiva’s ‘Chipko women’ from the Pindar Valley in Chamoli District, Gayatri
    Devi, bitterly states that the movement has made life worse in the valley:




    Now they tell me that because of Chipko the road cannot be built [to her
    village], because everything has become paryavaran [environment] … We
    cannot get even wood to build a house … our haq-haqooq [rights and concessions]
    have been snatched away (Rangan 2000, 42).




    This helps to answer the questions which Rangan raises:




    Why do words like environment and ecology make so many people living in
    the Garhwal Himalayas see red? Why do so many of them make derisive comments
    when the Chipko movement figures in any discussion? Why is it that in most
    parts of Garhwal today, local populations are angry and resentful of being
    held hostage by Chipko, an environmental movement of their own making (Rangan
    1993, 155)?




    When the world community was ready to hear the claims of the Garhwal Himalayan
    villages,




    their voice in the Chipko movement had all but ceased to exist. The brief
    love affair between Chipko’s activists and the state had resulted in the
    romantic ideal that the Himalayan environment by itself mattered more than
    the people who eked out their existence within it.




    Rangan adds that:




    if some of the communities are ready to banish their axes today, it must
    be seen as yet another attempt to affirm themselves and give voice to the
    difficulties of sustaining livelihoods within their localities (174-175).




    From Agarwal and Narain, we learn that the situation has driven some to advocate
    practices that violate laws which the urban conservationists have imposed. “Uttarkhand,
    the land which gave birth to the Chipko movement, now even has a Jungle Kato
    Andolan (cut the forest movement). Thanks to the ministry of environment, ‘environment’
    is no longer a nice word in Uttarkhand” (1991). Rangan argues that the Chipko
    today is a “fairy tale,” a myth sustained and propagated by a few self-appointed
    spokespeople through conferences, books, and journal articles that eulogize
    it as a social movement, peasant movement, environmental movement, women’s movement,
    Gandhian movement–in short, an all-encompassing movement (Rangan 1993, 158).


     


    The Green Revolution


    Dr. Vandana Shiva, in a book-length diatribe against the Green Revolution,
    frequently refers to its voracious demand for chemical fertilizers and indicates
    that there are alternative ways, more benign, of achieving these outputs (Shiva
    1991). Plants need ingredients (nutrients) in order to grow. If a molecule is
    in the plant, it or its constituent elements must come from somewhere. Except
    for carbon dioxide from the atmosphere, plants derive their nutrients from the
    soil or, in the case of nitrogen not supplied by fertilizer, from atmospheric
    nitrogen fixed by cyanobacteria. More plant output means more nutrient input.
    The often repeated claim that Green Revolution plants need more fertilizer has
    about as much meaning as saying that it takes more food to raise three children
    than it does to raise one. If sufficient nutrient is not in the soil, it must
    be added. Shiva’s argument in essence is that one can grow plants without nutrients
    or that one can achieve the same output as Green Revolution seeds yield without
    providing nutrient input other than available “organic” sources. This is patently
    nonsensical and violates our fundamental knowledge of physics.


    Shiva has made a number of preposterous statements over the years about yields
    in traditional Indian agriculture or traditional agriculture elsewhere such
    as among the Maya. Even before the Green Revolution dramatically increased the
    demand for and use of synthetic fertilizer, there was a large difference between
    the nutrients extracted from the soil in India and the “organic” nutrients available
    to be returned to it. In fact, nearly twice as much nutrient was being withdrawn
    from the soil as was being returned. Contrary to Shiva’s assertions, this process
    was not sustainable. Given the dramatic increases in Indian agricultural output
    over the last four decades (which more than accommodated a doubling of the population),
    the deficit in “organic” nutrient must be vastly greater today. Shiva cites
    Sir Albert Howard, whose vitalist ideas on “organic” agriculture were developed
    in colonial India (Howard 1940). But though he was a strong proponent of composting
    (“Indore method”), Howard recognized the need for additional synthetic fertilizer
    and improved seeds, which means he might have favored GM crops if he were alive
    today.


    Shiva believes that “food crops for local needs” are “water prudent” (Shiva
    2000). In the Green Revolution grains, the edible output makes up a larger
    percentage of the plant (the harvest index), so less nutrient input is required
    per unit of output. These gains in agricultural efficiency and in yields per
    hectare, particularly for the Green Revolution grains, have accommodated a doubling
    of the world’s population, with about a 30% increase in per capita food consumption
    and only a slight increase in land under cultivation (about 4% for grains).
    For rice, the gains in water use efficiency have been nothing less than astounding.
    According to a recent FAO (UN Food and Agriculture Organization) report, “the
    modern rice varieties have about a threefold increase in water productivity
    compared with traditional varieties” (FAO 2003, 28). Overall, for water use
    in agriculture, “water productivity increased by at least 100 percent between
    1961 and 2001” while per capita water use fell by about half (FAO 2003,
    25-26). What the FAO is primarily describing is the yield increases and greater
    plant efficiency of the Green Revolution technologies so sharply criticized
    by Shiva.


    Biotechnologists are working to create even more efficient plants, a goal which
    is opposed by Shiva and her followers. In her paeans in praise of cow dung,
    Shiva’s pre-Green Revolution Indian agriculture is one of a healthy, self-sufficient,
    calorically adequate, nutritious food supply produced in an ecologically sustainable
    manner (Avery 2000; for a critique of Shiva by an Indian scholar, see Nanda
    1991, 1997 & 1998). Why hundreds of millions of peasant agriculturalists
    in India and around the world have forsaken this utopian existence and adopted
    the Green Revolution’s crops and modern agricultural technologies is never explained.
    Maybe those actually raising crops and feeding their families know something
    about agriculture that Shiva and her fellow activists don’t?


    Equally unexplained is why, if, as Shiva argues, modern technology is pauperizing
    populations and in many cases driving people to suicide, life expectancies have
    risen so dramatically throughout Asia for both rural and urban populations.
    Even more difficult to explain is why those in developed countries, who are
    presumed to be educated and informed, uncritically accept her musings and pay
    her homage, including selecting her to give prestigious presentations such as
    the Reith Lecture (Shiva 2000 and Scruton 2000).


     


    Contradictions, Mistakes and Double Standards


    Contradictions and mistakes are all too prevalent in the work of Shiva and
    those who revere her. For example, in a public lecture in Toronto, Canada, she
    claimed both that the price level of food in India was doubling and that it
    was falling. Arguing that the technologies of the Green Revolution have failed,
    she has the price of food in India doubling so that consumers can no longer
    afford it. But when she wishes to criticize the United States for “dumping”
    food on the Indian market, pushing Indian farmers to commit suicide, she claims
    that subsidized foreign food is “driving down prices” (O’Hara 2000 and Oakley
    2000).


    The following excerpt from a news item on Shiva’s visit to Houston in October
    2000 is indicative. Shiva appears not to know the difference between a field
    of rice and one of weeds.




    Shiva walked across the road and looked out into a shaggy field.
    “They look unhappy,” she said. “The rice plants. Ours at home look very
    happy.”
    “That,” RiceTec reports, “is because it’s not rice. That’s our test field,
    it was harvested in August. That’s weeds” (Tyer 2000).




    Shiva-inspired anti-technology criticism reached its true nadir when humanitarian
    aid for people in need was attacked because of the technology used to produce
    it. In India, following a “super-cyclone,” a team from Vandana Shiva’s “research
    foundation” gathered samples of donated grain while involved in “relief work”
    and had them tested in the United States to see if they were genetically modified.
    Claiming that they were genetically modified, Diverse Women for Diversity
    then demanded that the government of India “immediately withdraw the corn-soya
    blend from Orissa,” seemingly preferring starvation for the cyclone victims
    to a presumed but unproven contamination from GM food (RFSTE 2000, Devraj 2000,
    Lean 2000 and Jayaraman 2000c).


    Possibly, Shiva could arrange for “organic” agriculturalists like Prince Charles
    to provide famine relief using funds from Greenpeace and other environmental
    groups with annual budgets running into the tens of millions of dollars. And
    once again, it is appropriate to ask: how many poor farmers have Shiva’s Diverse
    Women for Diversity or The Research Foundation for Science, Technology and
    Ecology helped to grow more food? How many of those in need have they helped
    to feed? And in the name of transparency, what are the sources of their funding?


    These questions are legitimate because too many groups that raise and spend
    significant amounts of money and help feed no one, demand transparency from
    others and criticize groups and individuals who have assisted those in need
    by helping them to grow more food or by providing relief food that modern agricultural
    surpluses facilitate. Many “Civil Society” groups in developing countries are
    largely and in some cases fully funded by developed country NGOs, so one can
    legitimately ask questions about the independence of their judgements in much
    the same way that one would question the independence of a statement by a developing
    country employee of a multinational corporation (see DeGregori 2002c).


    Nanda accuses “populist intellectuals like Shiva” of being “guilty of hypocrisy
    and double standards” for failing to recognize that “their own growth as intellectuals
    and activists owes a tremendous debt” to the very ideas that they disparage
    (Nanda 1991, 55). It has not gone unobserved that those like Shiva who are most
    critical of modern science have gained favor in Western universities and have
    often benefited greatly as a result.




    Furthermore, the jet-setting, globe-trotting neopopulist intellectuals’
    propensity to project the life style of the poor as being morally superior
    and socially richer than that of the Western oppressors is hypocritical
    to say the least … (and) fails to offer a progressive and feasible program
    for change (Nanda 1991, 39).




     


    Local Knowledge versus Modern Knowledge


    We talked earlier about the Chipko movement in the Himalayan Garhwal region
    of Uttar Pradesh, India, for which Shiva presumes to speak and for which she
    has won international acclaim. When the Chipko movement’s battle for local
    control of vital forest resources was taken up by Shiva and other “deep ecologists,”
    the local struggles for resources and development were sacrificed to global
    environmental concerns by groups that “tacitly support coercive conservation
    tactics that weaken local claims to resource access for sustaining livelihoods”
    (Rangan 2000, 239; see also Peluso 1993).


    Those who champion local wisdom too often respect it only so long as it is
    in line with their ideological agenda. Ideas that are presumed to liberate end
    up being instruments of oppression. Their advocates in developed countries seem
    to live in a virtual Potemkin village, blissfully unaware that local knowledge
    and control privilege traditional elites, who tend to be dominating upper-class
    males who find the rhetoric of ecofeminism useful, but not its desire for equality
    of classes, races and genders. Anyone who has been involved in economic development
    is aware of the importance of local knowledge and the need to use it along with
    any other available knowledge. But there is a very big difference between using
    local knowledge and being dominated by it. And it is important to distinguish
    between local knowledge and local myth, particularly myths of domination that
    deny some people access to productive resources.


    Intellectual elites in some developing countries such as Mexico promote local
    use and custom (usos y costumbres) with the same outcome of male domination.
    The modernism which opened up society and allowed racial and other minorities
    to demand equal rights and women to challenge male domination is being denied
    those who are most in need of change in poorer countries. “The oppressed Others
    do not need patronizing affirmations of their ways of knowing, as much as they
    need ways to challenge these ways of knowing” (Nanda 1996 and Nanda 2003b).


    Modern knowledge allowed Nanda to escape from such practices as forced marriage
    and other forms of domination but still allowed her to retain a sense of shared
    identity with the culture of her origin. It is the rationality of the Enlightenment,
    science and modernity that were instrumental in the creation of more tolerant
    multi-cultural societies. As Nanda states it, “We Are All Hybrids Now” (Nanda
    2001). I would add that we have been hybrids for some time. Over 60 years ago,
    the anthropologist Ralph Linton sketched a “solid American citizen” awakening
    in a “bed built on a pattern which originated in the Near East”, traversing the
    day taking for granted the diverse global origins of the items of his daily
    routine, and ending it by thanking a “Hebrew deity in an Indo-European language
    that he is 100% American” (Linton 1963, 326-327).


    More important, modernity allows one the freedom to participate fully in modernity
    while still being able to retain a more localized personal identity. This is
    a tolerance for diversity which is rare in the traditional societies that Shiva
    seeks to promote. Modern science and technology are central to this hybridity.
    As many of us (including Nanda) have long argued, to call science and technology
    “Western” is to accept the 19th-century claim of exclusive authorship of what
    has been and remains a universal endeavor to which all peoples have contributed,
    just as they contributed to the artifacts of Linton’s 100% American.


    Shiva and others can call modern science logophallocentric reductionism and
    any number of other pejorative slogans in contrast to Prakriti or the feminine
    principle, but, in fact, modern knowledge is liberating. Shiva and her cohorts
    may feel “victimized” by “alien” ideas, but it is doubtful that this is the
    case for many throughout the world who have benefited from it, whether by a
    larger crop or lives saved by immunization or antibiotics. Nanda suggests that
    it would be “interesting” to see the reaction of “untouchables” to the “knowledge
    that DNA material … has the same composition in all living beings, be it brahmin
    or bacterium. Or what would a woman do with the knowledge that it is the chromosome
    in sperm that determines the sex of the newborn?” (1991, 38).


    May we add that over 99.9% of the human genome is shared by all human beings,
    and that of the less than 0.1% that differentiates us, only about 3 to 5%
    lies between groups, with about 95% being intra-group variation (Rosenberg
    et al. 2002). If Shiva wishes to help women and those in need in India, she
    should be promoting an understanding of DNA and molecular biology and its liberating
    implications rather than fostering false fears of its use for human betterment.
    Not only is the genome that unites us as humans vastly greater than that which
    differentiates us, but the portion of the genome that defines our individual
    biological differences within our culture is vastly greater than the minuscule
    portion of the genome, 0.05%, that defines differences between groups (Rosenberg
    et al. 2002, King et al. 2002 and Wade 2002).


    We can argue as to how far we have come on the road to a more just society
    or how much farther we have to go, but it is undeniable that in countries like
    the United States, the rights of minorities and women have been greatly expanded
    over the last decades. Shiva has been promoting a road to a past that never
    existed and to a future where nobody really wants to go, including those who
    blindly follow her.


     


    *(The article is largely drawn from the author’s book manuscript, Origins
    of the Organic Agriculture Debate, Ames: Iowa State Press, A Blackwell Scientific
    Publisher (in press), http://store.yahoo.com/isupress/0813805139.html. Additional
    material is taken from two recently published books: Thomas R. DeGregori, The
    Environment, Our Natural Resources, and Modern Technology, Ames: Iowa State
    Press, A Blackwell Scientific Publisher; and Thomas R. DeGregori, Bountiful
    Harvest: Technology, Food Safety, And The Environment, Washington, D.C.: Cato
    Institute, which was originally published as Agriculture and Modern Technology:
    A Defense, Ames: Iowa State University Press. Author’s homepage is
    http://www.uh.edu/~trdegreg.)


     


    References


    Agarwal, Anil and Sunita Narain. 1991. "Chipko People Driven to Jungle
    Kato [Cut the Forests] Stir", Economic Times (India), 31 March.


    Agarwal, Radha Raman. 1965. Soil Fertility in India. Bombay: Asia Publishing
    House.


    Anker, Peder. 2001. Imperial Ecology: Environmental Order in the British
    Empire, 1895-1945. Cambridge, MA.: Harvard University Press.


    Avery, Alex. 2000. "Vandana Shiva Antoinette: Let Them Eat Weeds!"
    Global Food Quarterly (30):6, Spring.


    Bandyopadhyay, Jayanta. 1999. "Chipko Movement: Of Floated Myths and Flouted
    Realities". Mountain People, Forests, and Trees, Mountain Forum’s on-line
    library. (http://www.mtnforum.org/resources/library/bandj99a.htm)


    Bookchin, Murray. 1995. Re-enchanting Humanity: A Defense of the Human Spirit
    Against Antihumanism, Misanthropy, Mysticism, and Primitivism. London and
    New York: Cassell.


    DeGregori, Thomas R. 2001. Agriculture and Modern Technology: A Defense,
    Ames: Iowa State University Press.


    DeGregori, Thomas R. 2002a. The Environment, Our Natural Resources, and
    Modern Technology. Ames: Iowa State Press, A Blackwell Scientific Publisher.


    DeGregori, Thomas R. 2002b. Bountiful Harvest: Technology, Food Safety,
    And The Environment. Washington, D.C.: Cato Institute.


    DeGregori, Thomas R. 2002c. "NGOs Don’t Speak for the Hungry", AMERICAN
    COUNCIL ON SCIENCE AND HEALTH, Health Facts and Fears, 26 August. (http://www.healthfactsandfears.com/featured_articles/aug2002/ngo082602.html).


    DeGregori, Thomas R. 2004. Origins of the Organic Debate: Vitalist Junkscience
    vs. Scientific Inquiry. Ames: Iowa State Press, A Blackwell Scientific Publisher
    (in press).


    Devraj, Ranjit. 2000. Cyclone Victims Are Guinea Pigs for Mutant Food.
    Inter Press Service atimes.com online, 13 June.


    FAO (Food and Agriculture Organization of the United Nations). 2003. Unlocking
    the Water Potential of Agriculture. Rome: Food and Agriculture Organization
    of the United Nations. (http://www.fao.org/ag/AGL/aglw/aquastat/kyoto/index.stm)


    Friedmann, John and Haripriya Rangan. 1993. "Introduction: In Defense
    of Livelihood". In In Defense of Livelihood: Comparative Studies on Environmental
    Action, edited by John Friedmann and Haripriya Rangan, pp. 1-21. West Hartford,
    Conn.: Kumarian Press.


    Howard, Sir Albert. 1940. An Agricultural Testament. Oxford: Oxford University
    Press.


    Jardhari, Vijay. 1996. Letter dated 01 May 1996 from Vijay Jardhari and
    other Chipko activists of Tehri Garhwal to the editor of the Star (Selangor).


    Jayaraman, K.S. 2000. "GM Food ‘Dumped on India as Food Aid’", Nature
    405(6789):875, 22 June.


    King, Mary-Claire and Arno G. Motulsky. 2002. "Human Genetics: Mapping
    Human History", Science 298(5602):2342-2343, 20 December.


    Lean, Geoffrey. 2000. "Rejected GM Food Dumped on the Poor". The Independent
    (London), 18 June.


    Linton, Ralph. 1963. The Study of Man. New York: Appleton-Century-Crofts.


    Nanda, Meera. 1991. "Is Modern Science a Western Patriarchal Myth? A
    Critique of the Populist Orthodoxy", South Asian Bulletin XI(1&2):32-61.


    Nanda, Meera. 1996. "The Science Question in Postcolonial Feminism".
    In The Flight From Science and Reason, edited by Paul R. Gross, Norman
    Levitt and Martin W. Lewis, pp. 420-436. New York: The New York Academy of Sciences.


    Nanda, Meera. 1996. "The Science Wars in India", Dissent 44(1),
    Winter.


    Nanda, Meera. 1997. "History Is What Hurts: A Materialist Feminist Perspective
    on the Green Revolution and Its Ecofeminist Critics". In Materialist Feminism:
    A Reader in Class, Difference, and Women’s Lives, edited by Rosemary Hennessy
    and Chrys Ingraham, pp. 364-394. New York: Routledge.


    Nanda, Meera. 1998. "The Epistemic Charity of the Social Constructivist
    Critics of Science and Why the Third World Should Refuse the Offer". In
    A House Built on Sand: Exposing Postmodernist Myths About Science, edited
    by Noretta Koertge, pp. 286-311. New York: Oxford University Press.


    Nanda, Meera. 1999. "Who Needs Post-Development? Discourses of Difference,
    the Green Revolution and Agrarian Populism in India", Journal of Developing
    Societies 15(1):1-31.


    Nanda, Meera. 2000. Dharma and the Bomb: Post-Modern Critiques of Science
    and the Rise of Reactionary Modernism in India, paper read at the American
    Sociological Association, August.


    Nanda, Meera. 2001. "We Are All Hybrids Now: The Dangerous Epistemology
    of Post-Colonial Populism", The Journal of Peasant Studies 28(2):162-187.


    Nanda, Meera. 2002. Breaking the Spell of Dharma: A Case for Indian Enlightenment.
    Delhi: Three Essays Press.


    Nanda, Meera. 2003. "Do the Marginalized Valorize the Margins: Exploring
    the Dangers of Difference". In Development or Post Development: Which Way
    for Women in the 21st Century, edited by Kriemild Saunders. London: Zed Books.


    Nanda, Meera. 2003a. "Anti-Science". In The Oxford Companion to the History
    of Modern Science. New York: Oxford University Press.


    Nanda, Meera. 2003b. Prophets Facing Backwards: Postmodern Critiques of
    Science and Hindu Nationalism in India. New Brunswick, NJ.: Rutgers University
    Press; New Delhi: Permanent Black (in press).


    Nanda, Meera. 2003c. Do the Marginalized Valorize the Margins: Exploring
    the Dangers of Difference. (In press in an edited book; citations taken
    from manuscript provided by the author.)


    Oakley, Aaron. 2000. "Hating Modern Agriculture", The New Australian,
    No.151, 10-16 April.


    O’Hara, Kathleen. 2000. "The Stolen Harvest", The New Australian,
    No. 151, 10-16 April.


    Olson, Walter. 1999. "Benighted Elite: Postmodernist Critics of Science
    Get Their Comeuppance", Reason online, June.


    Pearce, Fred. 2003. "The Greening of Hate: An Interview With Betsy Hartmann",
    New Scientist, 177(2383):44-47, 22 February.


    Peluso, N. 1993. "Coercing Conservation: The Politics of State Resource
    Control", Global Environmental Change 4(2):199-217.


    Rangan, Haripriya. 1993. "Romancing the Environment: Popular Environmental
    Action in Garhwal Himalayas". In In Defense of Livelihood: Comparative
    Studies on Environmental Action, edited by John Friedmann and Haripriya
    Rangan, pp. 155-181. West Hartford, Conn.: Kumarian Press.


    Rangan, Haripriya. 2000. Of Myths and Movements: Rewriting Chipko into Himalayan
    History. London; New York: Verso.


    Randhawa, Mohindar Singh. 1983. History of Agriculture in India. New
    Delhi: Indian Council of Agricultural Research, Vols 1-4.


    RFSTE. 2000. US Government Dumping Genetically Engineered Corn-soya Mix
    on Victims of Orissa Super-cyclone. New Delhi: Press Release, Diverse Women
    for Diversity, The Research Foundation for Science, Technology and Ecology,
    2 June.


    Rosenberg, Noah A.; Jonathan K. Pritchard; James L. Weber; Howard M. Cann;
    Kenneth K. Kidd; Lev A. Zhivotovsky; and Marcus W. Feldman. 2002. "Genetic
    Structure of Human Populations", Science 298(5602):2381-2385, 20
    December.


    Scruton, Roger. 2000. "Herbicide, Pesticide, Suicide: Seed Merchants Prosper
    and Farmers Wither; That’s the Truth of Global Agribusiness", Business, Weekend
    FT Magazine, Financial Times, London, 6 June.


    Shiva, Vandana. 1988. Staying Alive: Women, Ecology and Survival in India.
    London: Zed Books.


    Shiva, Vandana. 1991. The Violence of the Green Revolution: Third World
    Agriculture, Ecology, and Politics. London: Zed Books.


    Shiva, Vandana. 2000. BBC Reith Lectures 2000, BBC online network, 12 May.


    Tyer, Brad. 2000. "RiceTec Paddy Whack", Houston Press, 23 November.


    Wade, Nicholas. 2002. "Gene Study Identifies 5 Main Human Populations",
    The New York Times, 20 December.

  • Life’s Lethal Quality Control

    One day in 1995, biologist Armand Leroi walked into Manhattan’s Strand Bookshop
    and made a remarkable discovery. He came across a rather plain-looking remaindered
    volume bearing the title Cancer Selection. The postdoc had not heard of the
    book or its author, James Graham. But, Leroi recalls: “I’m a sucker for odd
    theories of evolution, so I bought it.” It was an impulse decision that was
    to have profound implications. For buried in the book was a bold new idea that
    has become a muse to the young scientist.


    The book was lying on a table in front of him when I visited his South London
    flat. Leroi, a reader in evolutionary developmental biology at London’s Imperial
    College, is an articulate and rather intense man with a hint of an accent betraying
    his Dutch origins. He says he knew the book was not a work by a professional
    scientist: “In fact, Graham’s very frank about this. His whole thing is he’s
    an outsider.” At this point Leroi reads aloud from the book’s cover blurb. “James
    Graham began work on his theory in 1977 while working as a senior executive
    in a large multinational corporation. He now devotes his full time to writing.
    He is a member of Mensa.” He pauses. “Well, these are obviously not standard
    scientific credentials. And scientists who are members of Mensa don’t usually
    advertise it.”


    In 1995, Leroi read Graham’s book but then put it aside. He now credits it
    as having spurred him and two colleagues at Imperial to reconsider a piece of
    received wisdom in the field of evolutionary biology. Like the book, Leroi’s
    paper in last month’s Nature Reviews Cancer is titled “Cancer Selection”. In it, he asks whether cancer may
    have played a hitherto overlooked role in the evolution of complex animal life.


    Biologists view events that are both common and life-threatening as among the
    driving forces of evolutionary change. The occasional creature born genetically
    endowed to cope more successfully with such an event has a greater chance of
    surviving to reproduce and pass on whatever useful genes underpinned its good
    fortune. Thus is evolution guided by the adaptive hand of natural selection.


    Cancer is a common occurrence, and certainly a life-threatening one. But biologists
    generally regard it as having no effect on evolutionary change because it mostly
    afflicts people beyond their child-bearing years. As a result, cancer cannot
    affect their chances of passing on any protective genes they may possess to
    the next generation.


    But not all malignancies are confined to the elderly: a minority of children
    also develop lethal cancers. Leroi and his co-authors suggest that perhaps childhood
    cancer mostly affects organs that have undergone recent and rapid evolution.
    It is a radical proposal. On the first page of their paper is an acknowledgement:
    “The idea that changes in morphology and life-history can expose animals to
    an increased risk of cancer has been argued forcefully by James Graham in his
    1992 book Cancer Selection.”


    Graham – now in his early 70s and living in Lexington, Virginia – started to
    think about cancer after reading The Selfish Gene
    by Richard Dawkins. He had no particular knowledge of the disease.
    What Graham knew about was manufacturing. He had spent much of his working life
    in multinational corporations, and had become the chief financial officer of
    a leading cosmetics company. From his industrial background he was well aware
    that design improvements in a product often lead to an initial fall-off in the
    quality of the manufactured goods. To retain the advantages of the improvement
    while restoring the lost quality, adjustments have to be made to the production
    process. Among living things, Graham reasoned, death caused by cancer might
    play a similar role. It could serve to eliminate those individuals who inherited
    a genetic programme unable to cope with any damaging side-effects associated
    with change. Cancer, in other words, is evolution’s method of quality control.


    Having received what he laughingly calls this “gift from the unconscious” he
    began garnering the evidence. “I guess I went about it backwards,” Graham says.
    “The typical scientist studies for years before he even attempts to deal with
    evolutionary theory. I knew nothing when I had this idea.”


    In a determined attempt to persuade others, Graham started writing to learned
    journals, initially with some modesty. “I was looking for someone to come back
    with intelligent comments about my inept efforts, and to advise me. That didn’t
    happen. The reception I got was more like swatting a fly.”


    He also wrote to evolutionary biologists. “I knocked on so many doors my knuckles
    were bleeding. My dream was that someone would say to me, ‘Mr Graham, you’ve
    got a good idea here. Why don’t we collaborate?’ I would have jumped at the
    opportunity.” This didn’t happen either. Graham’s modesty began to evaporate.
    One journal, Evolution, did take a more sympathetic view, but an attempt to set up a collaboration fell through.


    Eventually, he managed to get two severely pruned letters published in the Journal of Theoretical Biology.
    “I then put on my PR hat,” Graham says. “I wrote a press release, and even hand-delivered
    it to media people in Manhattan.” But there was still no serious consideration
    from the scientific community.


    Nevertheless, Graham persevered. “I was convinced I was correct, and if an
    idea is correct it’s important.” As simple as that. And his conviction drove
    him to the only remaining option: to publish himself. He did so under the imprint
    Aculeus Press (aculeus is Latin for needle or sting). The book received several
    reviews, not least in Nature, whose critic wrote: “I, at least, like the idea.”
    But then silence.


    Frustrated at years of being ignored by the scientific establishment, Graham’s
    tone had, by this time, grown belligerent. Many biologists reading his book
    would balk at the first chapter. Provocatively titled “Biology’s dirty little
    secret” it dismisses conventional thinking about Darwinian evolution as “utter
    nonsense” and an “intellectual error of the rankest sort”. Although the prose
    is clear and readable it is also assertive, didactic and sometimes patronising.
    The reader is constantly warned that all contrary views are foolish, absurd
    or self-evidently wrong: “Unlike the old theory, mine is correct.”


    Graham goes on to compare himself to Darwin (neither had been educated as a
    scientist) and Alfred Wegener of continental drift fame (as a meteorologist,
    not a geologist, Wegener too was an outsider). Thomas Kuhn’s classic The
    Structure of Scientific Revolutions
    is wheeled out to remind us that people responsible for the “fundamental inventions
    of a new paradigm have been either very young or very new to the field
    whose paradigm they change”. (The italics are Graham’s.) He even
    compares his method of work to Albert Einstein’s.


    At this point it is difficult not to lose patience. “Its whole tenor as it
    rails against the biological establishment is that scientists are just too thick,”
    Leroi says. “They’ve missed it all. They’re stuck in their paradigms of conventional
    Darwinian evolution. But James Graham is going to set them right.”


    Happily for Graham, Leroi had the forbearance to judge the work on its merits.
    “When I’d originally read Graham’s book, I thought there was something in it.
    But I forgot it,” he admits. Then he moved to the UK and started working with
    another evolutionary biologist, Austin Burt. “I can’t remember whether he was
    in my office or I was in his, but we looked on each other’s bookshelves and
    said, ‘Gosh, Cancer Selection, you’ve got it too’.”


    It turned out that Burt had also picked up a remaindered copy of the volume
    in Moe’s Bookshop, Berkeley, while in California as a postdoc.


    An extraordinary chain of coincidences that began with two postdoc biologists
    buying the same rare book on opposite sides of America has now brought them
    together in London with a third colleague, Vassiliki Koufopanou, to pen their
    own thoughts on cancer.


    They accept Graham’s premise that the disease is ancient and ubiquitous and
    can be a force in natural selection. “That idea is the central one,” Leroi admits.
    Needless to say, Graham doesn’t stop there. He reinterprets the diversity of
    animals in the light of his discovery, opining, for example, that snails evolved
    shells to protect themselves not from birds but from ultraviolet light and hence
    cancer. “You can apply this approach to any feature of the biological world,”
    Leroi says, “and Graham does so, willy-nilly.”


    Graham claims that cancer selection is not merely a driving force but the
    driving force in the emergence of complex animal life. “He believes that with
    good, clear thinking one can arrive at an answer,” Leroi says. “But this isn’t
    enough. Because something could be a certain way doesn’t mean that it actually
    is that way. All those biologists who spend their time trying to test evolutionary
    theory – well, he thinks they’re just number-crunchers who can’t see the big
    picture.”


    Nevertheless, spurred on by Graham’s reasoning, Leroi began to consider cancer
    in children. It turns out that only a few of the body’s organs are particularly
    susceptible to childhood cancer, such as the bones and the brain. Both incorporate
    recent evolutionary novelties. “As Homo sapiens, we are famous for our big brains that have evolved so enormously
    over the past few million years,” Leroi says. And then there’s our pubertal
    growth spurt.


    “Chimpanzees don’t have it. It’s possible that the osteosarcomas that kids
    get in the bones undergoing the growth spurt are a consequence of this being
    an evolutionary novelty.” Although admitting that the evidence is circumstantial,
    Leroi also thinks it is persuasive.


    Childhood cancer kills before the age of reproduction, and is therefore amenable
    to selection. Yet it continues to exist. Why? Because natural selection has
    not yet had the time to deal with it.


    The possibilities for speculation are limitless. For example, one outcome of
    evolution can be an increase in size. But bigger animals have more dividing
    cells, and therefore more cells to turn cancerous. So why don’t elephants suffer
    cancer more than mice? Maybe they do; maybe it’s the success of large animals
    in developing better anti-cancer mechanisms that has allowed them to become
    large.


    As diseases go, cancer is seldom far from our attention. The thought that this
    much-feared disease might have played a major role in shaping our evolution
    is tantalising. But it took an outsider to see this possibility, and the chance
    interest of an insider to draw it to our attention.


    This article first appeared in the Times Higher Education Supplement.

  • Don’t Bury the Bones

    A committee has met behind closed doors in London over the last two years to
    decide the future of old bones in British cultural and scientific institutions.
    Their deliberations and decision will have consequences for all of us. The skeletons
    in the closets could tell us about history, humanity and our health, if only
    we would let them.


    There is a growing feeling amongst many in the museum profession that
    old human remains should be returned to where they were originally found. Tony
    Blair raised the issue of repatriation in 2000 when he agreed to increase efforts
    to send back remains from Australian indigenous communities. The Department
    for Culture, Media and Sport subsequently set up a working group to examine the
    issue and consider how the law might be changed to allow institutions to repatriate
    all human remains.


    The working group is made up of a few lawyers, museums professionals and anthropologists,
    among them Dr Neil Chalmers, director of The Natural History Museum; Norman
    Palmer, Professor of Commercial Law at University College London, and Tristram
    Besterman, director of the Manchester University Museum, who was until recently
    the convenor of the Museums Association Ethics Committee.


    The group was asked to examine the legal status of human remains in the collections
    of publicly funded Museums and Galleries in the UK and the powers of these institutions
    to deaccession the remains. They had to consider the desirability of deaccession,
    the form of changes in legislation that would be necessary, and a statement
    of principles for guidance. The group will also make recommendations on how
    to include non-human remains associated with human remains in these changes.


    The group is expected to issue recommendations to government soon. The main
    suggestion will be the relaxation of laws that currently prevent institutions
    from parting with bones. Overall it will advocate the return, for moral reasons,
    of skeletons presently held in national collections.


    The bones are evidence from the past that speak to us about life from one
    century to many thousands of years ago. Under scrutiny they reveal patterns
    of migration, the effect of environment upon body form, and the relationship
    between different populations. We can learn who lived where and when, about
    patterns in health, origin, gene flow and microevolutionary change.


    When the law changes, large and significant collections could be broken up
    and sent away. A survey by the committee found human remains from all over the
    world in more than sixty British museums. The Natural History Museum, for example,
    has a broad collection of at least twenty thousand remains that are used extensively
    by scientists for comparative research. University collections include those
    held at Edinburgh and Cambridge.


    If returned, the collections will probably be treated as sacred and then buried.
    This has already happened in similar cases, and the likelihood that it will
    continue was reinforced at the annual conference of the Museums Association.
    The keynote speech was given by Rodney Dillon, a Tasmanian Aboriginal and Torres
    Strait Islander Commissioner who travels the world campaigning for the return
    of old aboriginal skeletons. Speaking to a welcoming audience Dillon proclaimed,
    ‘We take pride in our people’s past. Without our remains where they should be,
    buried where they belong, we can’t cope. People are walking around with their
    heads down as their ancestors are not there.’


    The pending ruling won’t remove all the material, of course. Not every group
    wants the remains returned, and in some situations no link can be found to any
    group at all. Some remains are of no research value, so there is no reason for
    them to be in a lab collecting dust. And research may well continue around the
    world. But on the whole it is likely that some of the most crucial material
    about humanity will be lost.


    America has gone further down this path and indicates what could happen in
    the UK. In 1990, NAGPRA (the Native American Graves Protection and Repatriation
    Act) set new criteria for who should make the decisions regarding the disposition
    of human remains and artefacts. It is a mandate for researchers and museums
    to return all human remains to their closest hereditary or cultural descendants.
    The descendants decide the future of the bones, whether they throw them into
    the sea, examine them, or bury them six feet under.


    There has been a steady impact upon collections. Museums backed by government
    have sent back vital collections and remains, most of which have been covered
    in soil. It is estimated that the Smithsonian alone has transferred more than
    3,335 sets of human remains. In 1999 the Peabody Museum based at Harvard University
    returned remains of nearly two thousand individuals to the Pecos and Jemez Pueblo
    in New Mexico.


    The Pecos were at their peak between 1300 and 1600 and ruled over a trade path
    between the Pueblo farmers of the Rio Grande and tribes of the buffalo plains.
    The bones have been studied since their discovery in 1915. The collection was
    the largest available skeletal population from a single community and was large
    enough to be statistically significant. As a result we have learned about the
    influence of diet and disease on populations. We know more about osteoporosis,
    head injuries, and the development of dental cavities. This was brought to an
    end upon return of the collection when the bones were covered in earth. We can
    learn nothing further from these bones.


    The Kennewick Man, found on a riverbank in Washington state in 1996, is one of
    the most important skeletons ever found, but it cannot at present be examined.
    Initial radiocarbon samples showed the bones to be around 9,500 years old, proving
    the skeleton to be of Paleo-Indian age, and one of the oldest prehistoric individuals
    to be found in North America. Preliminary analysis suggested the bones were
    not American Indian but possibly European. Before further research could be
    done, the bones were confiscated. Under NAGPRA any human remains found in North
    America that predate Columbus (1492), no matter how old the bones are, are
    considered American Indian.


    The case has been contested by anthropologists who have gone from court to
    court asking to be allowed to examine the skeleton. Early this year a federal
    judge denied the motion that had put their investigations on hold. Scientists
    and historians held their breath, eager to start work, only to have their hopes
    dashed a month later when another court hearing blocked the study of the bones.
    Eight years after the discovery of an amazing piece of history, it still cannot
    be investigated.


    Legislation backing the right of one group to decide prevents us all from ever
    finding out more, or challenging what we think we know. In the name of protecting
    ancient and sacred beliefs, what ought to be a rational legal system is blocking
    the furthering of our knowledge of humanity.


    At the heart of the battle is the idea that a group identity owns the sole
    rights to investigate the past and can prevent all others from doing so. Yet
    the very idea of fixed groupings and cultural continuity over thousands of years
    is a flawed supposition. The history of human beings is not one of separate
    and permanent cultures, but one of continual migration, amalgamation, fission
    and disintegration. Neither peoples nor languages, and certainly not geographic
    locations, remain stable for more than a short period of time. The idea that there
    is a clear link to thousands of years ago is fundamentally wrong. It also advances
    notions of fixed and separate races that should not be tolerated today. These
    are ideas that science and a rational understanding of history have proven incorrect.


    The idea that one group should dictate to others what can and cannot be investigated
    is a serious and dangerous problem for all. The collections should belong to
    the world rather than any one group. That one group can censor and obscure access
    to knowledge on the basis of an identity from hundreds or thousands of years
    ago is seriously wrong and threatens the future of ideas and understanding.


    At last year’s Museums Association conference, Rodney Dillon exclaimed in his
    keynote address, ‘We have no clean water, there is petrol sniffing and crime
    is rife.’ It was a moving speech that filled me with outrage. But he used this
    terrible situation to argue that the bones should be returned and buried. ‘It
    is no good worrying about the future, we need to think about the past,’ he claimed.


    Destroying history, understanding, and knowledge is not the solution to the
    very serious problems of this community. Indeed there is a great danger in rooting
    today’s pressing problems in the bones from thousands of years ago. The current
    circumstances of Aborigines need to change in the here and now. Worrying about
    the past only obscures the nature and urgency of the problems.


    The UK working group is eager to send the bones back. Members admit their recommendations
    will be "anti-scientific", but those in the nervous and unconfident
    museum profession welcome an opportunity to improve their image. They feel that
    they can benefit from this gesture. At a recent meeting on the bones, someone
    from the Heritage Lottery Fund declared that we ‘need to understand the spiritual
    role of these objects and sacred artefacts that can help us find our place.’
    It seems that many are turning their backs on the scientific project of making
    new discoveries, and instead want to find new meaning in old myths.


    Secrets about the past that lie at our fingertips are being covered up. The opportunity
    to explore and ask questions of our ancestors, to reaffirm or challenge conventional
    views, to evaluate what we discover against what we have been led to believe,
    is at stake. Those in charge of cultural institutions should not turn their
    backs on their responsibility to honour access to knowledge, for the sake of
    humanity’s past and all our futures.

    Tiffany Jenkins is a director at the Institute of Ideas.

  • A Bluffer’s Guide to Science Studies and the Sociology of “Knowledge”

    Ever since science became a going concern in the ancient world, people have
    asked: “What is this thing called science?” An early answer was given by Aristotle
    in his Organon, its focus being largely on the logic and methodology
    of scientific reasoning. Even if its substantive claims are now no longer central,
    it inaugurated a tradition of philosophical thought about science that has had
    wide acceptance by many scientists and philosophers; in their different ways
    recent philosophers such as Carnap, Popper, Lakatos and the Bayesians are all
    within this tradition. It involves belief in, and the application of, principles
    of logic, methodology and of rationality generally; on the whole such principles
    have been instrumental in leading scientists, if not others, to hold the scientific
    beliefs they do.


    But these days this tradition has fallen out of fashion and has been replaced
    by the burgeoning fields of sociology of science, cultural studies of science,
    constructivism, postmodernism, and the like. Reasons for the change in fashion
    are several, one having to do with the political, economic and social uses of
    science, some of which, rather than enhance our lives, threaten our very existence.
    By blaming these ills on science itself, rather than, say, the uses to which
    it is put by the military, industry, commerce, governments and others, advocates
    of anti-science have placed the philosophical "ideologists" who talk
    of the rationality of science under a cloud of suspicion.


    Another reason has to do with the status claimed by science. In countering
    claims about its rational basis, attempts have been made to de-legitimate and
    "demystify" science. Here the attack on the Aristotelian tradition
    is at its most profound. Science is alleged to be no more legitimate than many
    other non-scientific practices such as those embodied in "local knowledge",
    "ethnoscience" and what the postmodernist Lyotard calls "narratives".
    To claim otherwise is to indulge in philosophical metanarratives towards which
    Lyotard invites us to be incredulous, this being his definition of the postmodern.
    Nor, it is also alleged, can science claim to give us a picture of what the
    world is like, or a picture that is better than that given in non-sciences.
    It is not that science gives us, as many philosophers claim, either an ideal
    model of the world that approximates reality, or a picture that has only truthlikeness.
    Rather science is likened to a discourse that gives no picture at all since
    it fails to represent anything. Here critics of the Aristotelian tradition join
    hands with those who have strongly empiricist and anti-realist or irrealist
    inclinations about science.


    So, what are the causes of our beliefs in matters scientific if the canons
    of rationality are to be abandoned as an unbelievable metanarrative, or if scientific
    beliefs do not represent? The Ancient Greek Presocratic philosopher Xenophanes,
    rather than Aristotle, gives us a clue. Xenophanes was a sceptic who denied
    that knowledge could be obtained by us humans; at best we merely have beliefs,
    the truth or falsity of which will remain largely unknown to us. Our beliefs
    are not a response to reality – but something else. Xenophanes illustrates his
    view in the case of belief about God, but it has wider application. He says
    of the causes of beliefs in the gods, or God: ‘Each group of men paint the shape
    of the gods in a fashion similar to themselves; the Ethiopians draw them dark
    and snub-nosed, the Thracians red-haired and blue-eyed’. And he has similar
    remarks about the gods that cows, horses and lions would draw if they had hands;
    they would, respectively, look just like cows, horses or lions.


    One of the several points being made here is that if the gods are believed
    to be dark and snub-nosed, then the cause of this belief has nothing to do with
    the gods themselves; rather its cause has to do with some feature of ourselves.
    The gods themselves are not causally involved in our representations of them;
    rather it is something about ourselves that leads us to make the representations
    we do. It is as if the gods drop out of the picture as far as the causes of
    our beliefs are concerned; something completely non-god-like plays a causal role
    in the production of belief. In so far as this something else concerns social
    aspects of ourselves, then Xenophanes is the first sociologist of "knowledge"
    or, as we should more correctly say, the first sociologist of belief.


    Subsequent sociologists have extended Xenophanes’ views on the causes of beliefs
    about God to scientific belief itself. A Xenophanes-like account of scientific
    belief has come to be adopted by a wide range of people such as Marx, Mannheim,
    contemporary sociologists of scientific belief, Foucault, Nietzsche, to mention
    a few. Surprisingly, even though they differ markedly over what they claim are
    the specific causes of belief, they all espouse the same general form of explanatory
    theory which is intended to replace explanations that appeal to scientific rationality.
    For many of them, scientific belief is not a rational response to the world;
    scientific "knowledge", as the title of David Bloor’s influential
    book has it, is nothing but social imagery.


    Karl Marx was one of the first to suggest that the sciences, along with
    ideology, forms of consciousness, religious belief and the like, are also determined,
    shaped or caused by the prevailing forces and relations of production. Marx
    proposed a two-tiered view of all social factors in his doctrine of historical
    materialism. Forces and relations of production constituted the economic foundation
    of society while anything that has to do with ideas ("forms of consciousness")
    is to be placed in the superstructure, which depends, in some unspecified way,
    on the foundation. Marx bequeathed to the sociology of belief a problem that
    it has never been able to solve, viz., what exactly the relation of dependence
    is between beliefs or ideas and their alleged foundation.


    In his programmatic pronouncements about historical materialism, Marx does
    not specifically mention science in the superstructure. But some of his other
    comments do indicate that he saw science this way, while yet other comments
    indicate that perhaps science might not fit into such a simple two-tiered model
    after all. Either the model is deficient, or science is simply a third item
    outside the two-tiered model. His followers from Engels onward were not so ambivalent;
    they see science as something "determined" by forces and relations
    of production.


    Put this way, there is an evident confusion between the very content of science,
    such as its laws or theories, and other aspects of science such as the choices
    as to which lines of research to pursue, what programmes to fund, what applications
    of theory might be the most commercially promising, and so on. A case might
    be made for forces and relations of production playing some limited role concerning
    the latter; but they play no role in determining the former, the very content
    of science. It is here that those within the Aristotelian tradition of understanding
    science would claim that methodological principles play an important role (along
    with other factors) as a cause of belief. But this is denied by those who follow
    Marx and Engels in regarding such beliefs, even in the very content of science,
    as arising from the interplay between forces and relations of production.


    As an illustration of not just the claims that Marxists might make, but also
    those of most sociologists of scientific "knowledge", consider Forman’s account
    of how German physicists in the Weimar period were caused to believe in physical
    acausality. In Xenophanes’ style, the causes have nothing to do with their purported
    objects, viz., indeterministic laws and happenings in the physical world. Rather
    it is the social milieu of the physicists of the Weimar period with its Spenglerian
    hostility to science and causality that is the cause of their beliefs. Forman’s
    story does not appeal to any forces and relations of production; so it does
    not fit Marx’s model. But as will be seen it supports other sociological stories
    about the cause of belief.


    Such sociological explanations of scientific belief can be used to expose the
    "false consciousness" of the kind of explanations offered by philosophers
    in terms of (belief in) principles of rational methodology, and to debunk them.
    Physicists in Weimar Germany were deluded if they thought that methodological
    principles played a role in bringing about their scientific beliefs. Their beliefs
    are "social imagery" caused by "socio-cultural conditions" (or
    by belief in such conditions). They are not caused by any (belief in) rational
    principles which accompany their theoretical and experimental endeavours. The
    "ideological" pretensions of rationalist philosophers, still working
    within the misleading framework bequeathed to us by the enlightenment, is now
    exposed, and debunked, by the rival explanations of the sociologists of "knowledge".


    However, one might well ask how the sociologists manage to establish their claims
    about the causes of belief if they do not accept some principles of rationality.
    What they need to show, but do not, is that the physicists’ current beliefs
    in physics, and in methodology, are causally impotent in bringing about the
    Weimar physicists’ belief in acausality; what allegedly does all the work is
    their concurrent socio-cultural circumstance, or their beliefs about this. A
    smokescreen is thus needed to obscure the fact that they themselves must employ
    causal methodology at some point.


    Karl Mannheim held that the two important progenitors of the sociology of
    “knowledge” were Marx and Nietzsche. We will turn to Nietzsche shortly.
    Mannheim himself proposed a sociology of "knowledge" in which, as
    he obscurely puts it, there is an “existential determination of knowledge”.
    Mannheim does not get much further than talk of bare relations of thought or
    knowledge to "historical-social existence". But he does importantly
    suggest that there are some areas of thought or knowledge, such as mathematics
    and science, that are independent of historical-social existence and that evolve
    according to their own “inner dialectic” or “immanent laws”. This liberality
    towards the independence of science is criticised by advocates of the Strong
    Programme in the sociology of scientific "knowledge". They claim that
    Mannheim lost his nerve in failing to extend the programme of the sociology
    of “knowledge” to science and mathematics.


    The most recent incarnation of the basic ideas of Marx and Mannheim can be
    found in the Strong Programme in the sociology of scientific "knowledge".
    Its central causality tenet tells us that all scientific belief or "knowledge"
    is to be causally explained by social, historical or cultural conditions (or
    belief in such conditions, an important ambiguity in many formulations
    of the doctrine often passed over). These operate in conjunction with non-social
    causes which, because of their general occurrence across humanity, cannot be
    used to explain variation in belief; this is something only variable social
    factors can provide. Importantly, there is no mention of (belief in) any norms
    of method in the causality tenet as a possible cause of scientific belief. These
    are ruled out as explainers not only on the basis of the naturalism espoused
    by the Strong Programme, but also on the basis of the all-important symmetry
    tenet which says that the same kind of explanation must apply to all beliefs
    regardless of their truth or falsity, or their rationality or irrationality.
    Since on their view the only viable explanations are those which appeal to socio-historical
    causes, the symmetry tenet then rules out all explanation of scientific belief
    on the basis of normative methodological principles. Such normative explanations
    are said to be an unnatural intrusion upon the causal realm in which only naturalistic
    social (and naturalistic non-social) factors can be causally efficacious in
    bringing about belief. Such restrictions are imposed by the scientism of the
    Strong Programme in which, like any other science, only naturalistic causal
    factors and causal laws are to be admitted.


    While there is some debate among sociologists of scientific “knowledge” as
    to the applicability of all the tenets of the Strong Programme, the symmetry
    tenet remains central. Explanations on the basis of rational principles of method
    are out. If such principles are admitted, then they are only accepted locally
    as what the community endorses; they have no further underlying authority or
    status. At this point advocates of the Strong Programme recruit Wittgenstein’s
    doctrine of rule following to their cause. In fact they adopt the communitarian
    interpretation of rule following in which what the community determines is the
    ultimate Court of Appeal and there is no further fact of the matter concerning
    the correctness of any rule of method. The crucial issue here is whether advocates
    of the Strong Programme deny that there is such a thing as scientific rationality
    expressed in methodological principles; or whether they think there are such
    principles but they are only locally accepted as such, and so are simply more
    grist to the mill of the causality tenet of their programme. If the latter,
    then advocates of the Strong Programme have undercut the authority of methodology
    (they believe it can be given no account) and instead of continuing the traditional
    discussion of scientific rationality they have, in effect, changed the subject
    under debate.


    Much criticism of the Strong Programme centres around the many case studies
    its advocates have given of episodes in the history of science. One example
    already mentioned is that of the physicists in Weimar Germany and the allegedly
    social causes of their beliefs in acausality in physics. As always, the alleged
    causes of belief are all socio-cultural with no role for principles of method.
    But as already noted, principles of causal methodology have to be assumed in
    order to establish any causal link between scientists’ social circumstances
    and their scientific beliefs. So appeal to methods that are not merely locally
    accepted as such cannot be avoided even within the Strong Programme if any case
    studies in support of the central causality tenet are to be established. The
    general verdict of outsiders is that causal methodology has been badly applied
    and no convincing case has been established.


    The power/knowledge doctrine of Michel Foucault bears a striking resemblance
    to the claims of Marx, Mannheim and advocates of the Strong Programme. Where
    he differs is in his claims about power and its alleged efficacy in bringing
    about “knowledge”; in turn, “knowledge” itself brings about further power relations,
    and so on in a spiral of successive connections. While Foucault denies that
    “power is knowledge”, he never makes fully clear what connection there
    might be between power and "knowledge"; but he always assumes that
    there must be one and never considers that there might be none. He talks of power
    producing "knowledge", or of there being no "knowledge"
    without power, without further exploring the assumed connection. But as we have
    seen, lack of clarity about such connections is endemic in social studies of belief.


    Foucault has many perorations about the nature of power. Since he conceives
    it broadly as the effect that the actions of one person can have on the actions
    of another (either opening them up or closing them down but not completely),
    then power is simply everywhere, as Foucault notices. But this is more a defect
    in his implausibly broad notion of power than a new interesting discovery about
    its ubiquity. Nor does Foucault always talk about “knowledge” as being related
    to power. Power is also linked to a whole host of other items such as truth,
    discourses of truth, or simply discourses. Though many claim that Foucault restricted
    his claims about power to "knowledge" in the human sciences, there is in
    fact evidence that Foucault also intended his doctrine to apply to other sciences
    such as chemistry and mathematics. Whatever the scope of Foucault’s doctrine,
    the boundaries of a plausible sociological investigation of aspects of science
    have been burst and extended in a quite unfounded way into the sociology of
    scientific “knowledge”, that is, the very cognitive content of the sciences
    themselves. Finally Foucault does claim that his own genealogy of power is a
    better explainer of belief than, say, Marxism or Freudian analysis. But since
    he never discusses what he means by "better explanation" in a methodological
    context, or modifies the scope of his "power/knowledge" doctrine,
    this quite late appeal to methodology can have little force.


    Finally, let us turn very briefly to Nietzsche. Nietzsche’s doctrine of the “will to
    power” strongly influenced Foucault’s "power/knowledge" doctrine.
    Nietzsche’s doctrine shares the same form as all the other doctrines mentioned.
    But it is broader in that the "will to power" is both a metaphysical
    and psycho-social force at work in all of nature and life, including human life.
    Understood as a primitive psychological drive within people, it is causally
    efficacious in bringing about not only our beliefs in ordinary matters and in
    morality, but also our presuppositional philosophical beliefs in logic, in the
    identity that ordinary objects have and in the existence of ordinary everyday
    objects themselves. Nietzsche’s doctrine is best illustrated in the case he
    makes about the origins of Christian moral values such as altruism, pity and
    the like. He famously claims that they are due to the resentment of those “slaves
    of morality” who advocated them while overthrowing the values of “master”
    morality. Here the will to power operates as a psychological drive of resentment
    in people causing them to bring about, and maintain, certain moral beliefs.
    In allegedly uncovering the sordid origins of Christian morality in resentment,
    Nietzsche hoped to debunk it.


    Nietzsche is a master at proposing theories of the origins of our beliefs that
    rival those that are commonly accepted. In this way he hopes to unmask them,
    and then debunk them. The double unmasking/debunking move makes him the darling
    of postmodernism. This double move can be played out not only in the sphere
    of moral belief, but in any sphere of belief, such as our beliefs concerning
    logic, or truth, or our everyday framework of belief about objects. One of Nietzsche’s
    prime targets in this respect is Kant who, like a good modernist, attempted
    to give, as far as is possible, rational grounds for our ordinary beliefs and
    for morality. But one can be a critic of much of Kantian philosophy without
    accepting Nietzsche’s critique and its alternative worldview. What is important
    for our purposes is that the Nietzschean unmasking and debunking moves have
    wide application. Nietzsche extended them to a "genealogical" critique
    of truth and our pursuit of it; and they can be extended to the sciences and the
    methodological principles that many of a rational persuasion believe have been
    instrumental in the growth of scientific knowledge.


    Those of a modernist persuasion hold that principles of rationality and of
    methodology can play an important role in bringing about beliefs in science
    and elsewhere, for some of us some of the time. But this is not so for the writers
    mentioned. Marxists regard all scientific belief as a response to the forces
    and relations of production. Mannheim holds that most belief is a response to
    the conditions of social existence in which we think and believe. Advocates
    of the Strong Programme postulate a strong causal role for social, historical
    and cultural factors in bringing about all belief. Foucault sees power as the
    prime mover of all belief in the sciences. And finally Nietzsche claims that
    the will to power operates even in the sphere of belief.


    All these theorists have an in-house disagreement about what does the causal
    work, be it forces and relations of production, existential conditions, socio-historico-cultural
    factors, power or "will to power". But they all agree that, whatever
    it is, it cannot be anything rational. Rational explanations of belief are mystifications
    that need to be dragged out, unmasked and debunked. There are alternative explanations
    for why we believe what we do (though the grounds for thinking them better
    explanations remain obscure). In some cases they might be right – but not always.
    Each case needs to be determined on its merits. Moreover, one might well ask
    what kind of explanations do they offer. Often their model of explanation is
    one that a rationalist could also accept; what they resist are any explanations
    that appeal to rationality or methodology. But their reasons (assuming they
    believe that holding reasons can be efficacious) for the wide scope of their
    claims are quite lame.


    What these theorists might also balk at is that their own belief in the very
    doctrines they proclaim is itself merely an instance of the social, historical,
    cultural or psycho-causal theories of belief they advocate. More often than
    not they advance what they take to be profound truths; but at the same time
    they take themselves to be unmasking and debunking the failed modernist programme,
    part of which is the unmasking and debunking of the very notion of truth they
    employ. Their own views often fall victim to arguments similar to those Plato
    first advanced against the advocates of power, rhetoric and relativism about
    truth that he encountered in his own time from Protagoras to Callicles. The
    first time these considerations were played out they might have been tragedy
    (for Plato’s opponents); but to repeat them now is only farce.

    Robert Nola’s most recent book is Rescuing Reason: A Critique of Anti-Rationalist Views of Science and Knowledge (Boston Studies in the Philosophy of Science, V. 230), published by Kluwer. He is a professor of philosophy at the University of Auckland.

  • Postmodernism and truth

    Here is a story you probably haven’t heard, about how a team of American researchers
    inadvertently introduced a virus into a third world country they were studying.(1)
    They were experts in their field, and they had the best intentions; they thought
    they were helping the people they were studying, but in fact they had never
    really seriously considered whether what they were doing might have ill effects.
    It had not occurred to them that a side-effect of their research might be damaging
    to the fragile ecology of the country they were studying. The virus they introduced
    had some dire effects indeed: it raised infant mortality rates, led to a general
    decline in the health and wellbeing of women and children, and, perhaps worst
    of all, indirectly undermined the only effective political force for democracy
    in the country, strengthening the hand of the traditional despot who ruled the
    nation. These American researchers had something to answer for, surely, but
    when confronted with the devastation they had wrought, their response was frustrating,
    to say the least: they still thought that what they were doing was, all things
    considered, in the interests of the people, and declared that the standards
    by which this so-called devastation was being measured were simply not appropriate.
    Their critics, they contended, were trying to impose “Western” standards in
    a cultural environment that had no use for such standards. In this strange defense
    they were warmly supported by the country’s leaders–not surprisingly–and little
    was heard–not surprisingly–from those who might have been said, by
    Western standards, to have suffered as a result of their activities.


    These researchers were not biologists intent on introducing new strains of
    rice, nor were they agri-business chemists testing new pesticides, or doctors
    trying out vaccines that couldn’t legally be tested in the U.S.A. They were
    postmodernist science critics and other multiculturalists who were arguing,
    in the course of their professional researches on the culture and traditional
    “science” of this country, that Western science was just one among many equally
    valid narratives, not to be “privileged” in its competition with native traditions
    which other researchers–biologists, chemists, doctors and others–were eager
    to supplant. The virus they introduced was not a macromolecule but a meme (a
    replicating idea): the idea that science was a “colonial” imposition, not a
    worthy substitute for the practices and beliefs that had carried the third-world
    country to its current condition. And the reason you have not heard of this
    particular incident is that I made it up, to dramatize the issue and to try
    to unsettle what seems to be current orthodoxy among the literati about
    such matters. But it is inspired by real incidents–that is to say, true reports.
    Events of just this sort have occurred in India and elsewhere, reported, movingly,
    by a number of writers, among them:


    Meera Nanda, “The Epistemic Charity of the Social Constructivist Critics of
    Science and Why the Third World Should Refuse the Offer,” in N. Koertge, ed.,
    A House Built on Sand: Exposing Postmodernist Myths about Science, Oxford
    University Press, 1998, pp. 286-311.


    Reza Afshari, “An Essay on Islamic Cultural Relativism in the Discourse of
    Human Rights,” in Human Rights Quarterly, 16, 1994, pp.235-76.


    Susan Okin, “Is Multiculturalism Bad for Women?” Boston Review, October/November,
    1997, pp. 25-28.


    Pervez Hoodbhoy, Islam and Science: Religious Orthodoxy and the Battle for
    Rationality, London and New Jersey, Zed Books Ltd., 1991.


    My little fable is also inspired by a wonderful remark of E. O. Wilson, in
    Atlantic Monthly a few months ago: “Scientists, being held responsible
    for what they say, have not found postmodernism useful.” Actually, of course,
    we are all held responsible for what we say. The laws of libel and slander,
    for instance, exempt none of us, but most of us–including scientists in many
    or even most fields–do not typically make assertions that, independently of
    libel and slander considerations, might bring harm to others, even indirectly.
    A handy measure of this fact is the evident ridiculousness we discover in the
    idea of malpractice insurance for . . . literary critics, philosophers, mathematicians,
    historians, cosmologists. What on earth could a mathematician or literary critic
    do, in the course of executing her professional duties, that might need the security
    blanket of malpractice insurance? She might inadvertently trip a student in
    the corridor, or drop a book on somebody’s head, but aside from such
    outré side-effects, our activities are paradigmatically innocuous.
    One would think. But in those fields where the stakes are higher–and more direct–there
    is a longstanding tradition of being especially cautious, and of taking particular
    responsibility for ensuring that no harm results (as explicitly honored in the
    Hippocratic Oath). Engineers, knowing that thousands of people’s safety may
    depend on the bridge they design, engage in focussed exercises with specified
    constraints designed to determine that, according to all current knowledge,
    their designs are safe and sound. Even economists–often derided for the risks
    they take with other people’s livelihoods–when they find themselves
    in positions to endorse specific economic measures considered by government
    bodies or by their private clients, are known to attempt to put a salutary strain
    on their underlying assumptions, just to be safe. They are used to asking themselves,
    and to being expected to ask themselves: “What if I’m wrong?” We others seldom
    ask ourselves this question, since we have spent our student and professional
    lives working on topics that are, according both to tradition and common sense,
    incapable of affecting any lives in ways worth worrying about. If my topic is
    whether or not Vlastos had the best interpretation of Plato’s Parmenides,
    or how the wool trade affected imagery in Tudor poetry, or what the best version
    of string theory says about time, or how to recast proofs in topology in some
    new formalism, if I am wrong, dead wrong, in what I say, the only damage I am
    likely to do is to my own scholarly reputation. But when we aspire to have a
    greater impact on the “real” (as opposed to “academic”) world– and many philosophers
    do aspire to this today–we need to adopt the attitudes and habits of these
    more applied disciplines. We need to hold ourselves responsible for what we
    say, recognizing that our words, if believed, can have profound effects for
    good or ill.


    When I was a young untenured professor of philosophy, I once received a visit
    from a colleague from the Comparative Literature Department, an eminent and
    fashionable literary theorist, who wanted some help from me. I was flattered
    to be asked, and did my best to oblige, but the drift of his questions about
    various philosophical topics was strangely perplexing to me. For quite a while
    we were getting nowhere, until finally he managed to make clear to me what he
    had come for. He wanted “an epistemology,” he said. An epistemology.
    Every self-respecting literary theorist had to sport an epistemology that season,
    it seems, and without one he felt naked, so he had come to me for an epistemology
    to wear–it was the very next fashion, he was sure, and he wanted the dernier cri
    in epistemologies. It didn’t matter to him that it be sound, or defensible,
    or (as one might as well say) true; it just had to be new and different
    and stylish. Accessorize, my good fellow, or be overlooked at the party.


    At that moment I perceived a gulf between us that I had only dimly seen before.
    It struck me at first as simply the gulf between being serious and being frivolous.
    But that initial surge of self-righteousness on my part was, in fact, a naive
    reaction. My sense of outrage, my sense that my time had been wasted by this
    man’s bizarre project, was in its own way as unsophisticated as the reaction
    of the first-time theater-goer who leaps on the stage to protect the heroine
    from the villain. “Don’t you understand?” we ask incredulously. “It’s make
    believe. It’s art. It isn’t supposed to be taken literally!”
    Put in that context, perhaps this man’s quest was not so disreputable after
    all. I would not have been offended, would I, if a colleague in the Drama Department
    had come by and asked if he could borrow a few yards of my books to put on the
    shelves of the set for his production of Tom Stoppard’s play, Jumpers.
    What if anything would be wrong in outfitting this fellow with a snazzy set
    of outrageous epistemological doctrines with which he could titillate or confound
    his colleagues?


    What would be wrong would be that since this man didn’t acknowledge the gulf,
    didn’t even recognize that it existed, my acquiescence in his shopping spree
    would have contributed to the debasement of a precious commodity, the erosion
    of a valuable distinction. Many people, including both onlookers and participants,
    don’t see this gulf, or actively deny its existence, and therein lies the problem.
    The sad fact is that in some intellectual circles, inhabited by some of our
    more advanced thinkers in the arts and humanities, this attitude passes as a
    sophisticated appreciation of the futility of proof and the relativity of all
    knowledge claims. In fact this opinion, far from being sophisticated, is the
    height of sheltered naiveté, made possible only by flatfooted ignorance
    of the proven methods of scientific truth-seeking and their power. Like many
    another naif, these thinkers, reflecting on the manifest inability of their
    methods of truth-seeking to achieve stable and valuable results, innocently
    generalize from their own cases and conclude that nobody else knows how
    to discover the truth either.


    Among those who contribute to this problem, I am sorry to say, is my good
    friend Dick Rorty. Richard Rorty and I have been constructively disagreeing
    with each other for over a quarter of a century now. Each of us has taught the
    other a great deal, I believe, in the reciprocal process of chipping away at
    our residual points of disagreement. I can’t name a living philosopher from
    whom I have learned more. Rorty has opened up the horizons of contemporary philosophy,
    shrewdly showing us philosophers many things about how our own projects have
    grown out of the philosophical projects of the distant and recent past, while
    boldly describing and prescribing future paths for us to take. But there is
    one point over which he and I do not agree at all–not yet–and that concerns
    his attempt over the years to show that philosophers’ debates about Truth and
    Reality really do erase the gulf, really do license a slide into some form of
    relativism. In the end, Rorty tells us, it is all just “conversations,” and
    there are only political or historical or aesthetic grounds for taking one role
    or another in an ongoing conversation.


    Rorty has often tried to enlist me in his campaign, declaring that he could
    find in my own work one explosive insight or another that would help him with
    his project of destroying the illusory edifice of objectivity. One of his favorite
    passages is the one with which I ended my book Consciousness Explained
    (1991):


    It’s just a war of metaphors, you say–but metaphors are not “just” metaphors;
    metaphors are the tools of thought. No one can think about consciousness without
    them, so it is important to equip yourself with the best set of tools available.
    Look what we have built with our tools. Could you have imagined it without them?
    [p.455]


    “I wish,” Rorty says, “he had taken one step further, and had added that such
    tools are all that inquiry can ever provide, because inquiry is never ‘pure’
    in the sense of [Bernard] Williams’ ‘project of pure inquiry.’ It is always
    a matter of getting us something we want.” (“Holism, Intrinsicality, Transcendence,”
    in Dahlbom, ed., Dennett and his Critics. 1993.) But I would never take
    that step, for although metaphors are indeed irreplaceable tools of thought,
    they are not the only such tools. Microscopes and mathematics and MRI scanners
    are among the others. Yes, any inquiry is a matter of getting us something we
    want: the truth about something that matters to us, if all goes as it should.


    When philosophers argue about truth, they are arguing about how not to inflate
    the truth about truth into the Truth about Truth, some absolutistic doctrine
    that makes indefensible demands on our systems of thought. It is in this regard
    similar to debates about, say, the reality of time, or the reality of the past.
    There are some deep, sophisticated, worthy philosophical investigations into
    whether, properly speaking, the past is real. Opinion is divided, but you entirely
    misunderstand the point of these disagreements if you suppose that they undercut
    claims such as the following:


    Life first emerged on this planet more than three thousand million years ago.
    The Holocaust happened during World War II.
    Jack Ruby shot and killed Lee Harvey Oswald at 11:21 am, Dallas time, November
    24, 1963.


    These are truths about events that really happened. Their denials are falsehoods.
    No sane philosopher has ever thought otherwise, though in the heat of battle,
    they have sometimes made claims that could be so interpreted.


    Richard Rorty deserves his large and enthralled readership in the arts and
    humanities, and in the “humanistic” social sciences, but when his readers enthusiastically
    interpret him as encouraging their postmodernist skepticism about truth, they
    trundle down paths he himself has refrained from traveling. When I press him
    on these points, he concedes that there is indeed a useful concept of truth
    that survives intact after all the corrosive philosophical objections have been
    duly entered. This serviceable, modest concept of truth, Rorty acknowledges,
    has its uses: when we want to compare two maps of the countryside for reliability,
    for instance, or when the issue is whether the accused did or did not commit
    the crime as charged.


    Even Richard Rorty, then, acknowledges the gap, and the importance of the
    gap, between appearance and reality, between those theatrical exercises that
    may entertain us without pretence of truth-telling, and those that aim for,
    and often hit, the truth. He calls it a “vegetarian” concept of truth. Very
    well, then, let’s all be vegetarians about the truth. Scientists never wanted
    to go the whole hog anyway.


    So now, let’s ask about the sources or foundations of this mild, uncontroversial,
    vegetarian concept of truth.


    Right now, as I speak, billions of organisms on this planet are engaged in
    a game of hide and seek. It is not just a game for them. It is a matter of life
    and death. Getting it right, not making mistakes, has been of paramount
    importance to every living thing on this planet for more than three billion
    years, and so these organisms have evolved thousands of different ways of finding
    out about the world they live in, discriminating friends from foes, meals from
    mates, and ignoring the rest for the most part. It matters to them that they
    not be misinformed about these matters–indeed nothing matters more–but they
    don’t, as a rule, appreciate this. They are the beneficiaries of equipment exquisitely
    designed to get what matters right but when their equipment malfunctions and
    gets matters wrong, they have no resources, as a rule, for noticing this, let
    alone deploring it. They soldier on, unwittingly. The difference between how
    things seem and how things really are is just as fatal a gap for them as it
    can be for us, but they are largely oblivious to it. The recognition
    of the difference between appearance and reality is a human discovery. A few
    other species–some primates, some cetaceans, maybe even some birds–show signs
    of appreciating the phenomenon of “false belief”–getting it wrong. They
    exhibit sensitivity to the errors of others, and perhaps even some sensitivity
    to their own errors as errors, but they lack the capacity for the reflection
    required to dwell on this possibility, and so they cannot use this sensitivity
    in the deliberate design of repairs or improvements of their own seeking gear
    or hiding gear. That sort of bridging of the gap between appearance and reality
    is a wrinkle that we human beings alone have mastered.


    We are the species that discovered doubt. Is there enough food laid by for
    winter? Have I miscalculated? Is my mate cheating on me? Should we have moved
    south? Is it safe to enter this cave? Other creatures are often visibly agitated
    by their own uncertainties about just such questions, but because they cannot
    actually ask themselves these questions, they cannot articulate their
    predicaments for themselves or take steps to improve their grip on the truth.
    They are stuck in a world of appearances, making the best they can of how things
    seem and seldom if ever worrying about whether how things seem is how they truly
    are.


    We alone can be wracked with doubt, and we alone have been provoked by that
    epistemic itch to seek a remedy: better truth-seeking methods. Wanting to keep
    better track of our food supplies, our territories, our families, our enemies,
    we discovered the benefits of talking it over with others, asking questions,
    passing on lore. We invented culture. Then we invented measuring, and arithmetic,
    and maps, and writing. These communicative and recording innovations come with
    a built-in ideal: truth. The point of asking questions is to find true
    answers; the point of measuring is to measure accurately; the point of
    making maps is to find your way to your destination. There may be an
    Island of the Colour-blind (allowing Oliver Sacks his usual large dose of poetic
    license), but no Island of the People Who Do Not Recognize Their Own Children.
    The Land of the Liars could exist only in philosophers’ puzzles; there are no
    traditions of False Calendar Systems for mis-recording the passage of time.
    In short, the goal of truth goes without saying, in every human culture.


    We human beings use our communicative skills not just for truth-telling, but
    also for promise-making, threatening, bargaining, story-telling, entertaining,
    mystifying, inducing hypnotic trances, and just plain kidding around, but prince
    of these activities is truth-telling, and for this activity we have invented
    ever better tools. Alongside our tools for agriculture, building, warfare, and
    transportation, we have created a technology of truth: science. Try to draw
    a straight line, or a circle, “freehand.” Unless you have considerable artistic
    talent, the result will not be impressive. With a straight edge and a compass,
    on the other hand, you can practically eliminate the sources of human variability
    and get a nice clean, objective result, the same every time.


    Is the line really straight? How straight is it? In response to these questions,
    we develop ever finer tests, and then tests of the accuracy of those tests,
    and so forth, bootstrapping our way to ever greater accuracy and objectivity.
    Scientists are just as vulnerable to wishful thinking, just as likely to be
    tempted by base motives, just as venal and gullible and forgetful as the rest
    of humankind. Scientists don’t consider themselves to be saints; they don’t
    even pretend to be priests (who according to tradition are supposed to do a
    better job than the rest of us at fighting off human temptation and frailty).
    Scientists take themselves to be just as weak and fallible as anybody else,
    but recognizing those very sources of error in themselves and in the groups
    to which they belong, they have devised elaborate systems to tie their own hands,
    forcibly preventing their frailties and prejudices from infecting their results.


    It is not just the implements, the physical tools of the trade, that are designed
    to be resistant to human error. The organization of methods is also under severe
    selection pressure for improved reliability and objectivity. The classic example
    is the double blind experiment, in which, for instance, neither the human subjects
    nor the experimenters themselves are permitted to know which subjects get the
    test drug and which the placebo, so that nobody’s subliminal hankerings and
    hunches can influence the perception of the results. The statistical design
    of both individual experiments and suites of experiments, is then embedded in
    the larger practice of routine attempts at replication by independent investigators,
    which is further embedded in a tradition–flawed, but recognized–of publication
    of both positive and negative results.


    What inspires faith in arithmetic is the fact that hundreds of scribblers,
    working independently on the same problem, will all arrive at the same answer
    (except for those negligible few whose errors can be found and identified to
    the mutual satisfaction of all). This unrivalled objectivity is also found in
    geometry and the other branches of mathematics, which since antiquity have been
    the very model of certain knowledge set against the world of flux and controversy.
    In Plato’s early dialogue, the Meno, Socrates and the slave boy work
    out together a special case of the Pythagorean theorem. Plato’s example expresses
    the frank recognition of a standard of truth to be aspired to by all truth-seekers,
    a standard that has not only never been seriously challenged, but that has been
    tacitly accepted–indeed heavily relied upon, even in matters of life and death–by
    the most vigorous opponents of science. (Or do you know a church that keeps
    track of its flock, and their donations, without benefit of arithmetic?)


    Yes, but science almost never looks as uncontroversial, as cut-and-dried,
    as arithmetic. Indeed rival scientific factions often engage in propaganda battles
    as ferocious as anything to be found in politics, or even in religious conflict.
    The fury with which the defenders of scientific orthodoxy often defend their
    doctrines against the heretics is probably unmatched in other arenas of human
    rhetorical combat. These competitions for allegiance–and, of course, funding–are
    designed to capture attention, and being well-designed, they typically succeed.
    This has the side effect that the warfare on the cutting edge of any science
    draws attention away from the huge uncontested background, the dull metal heft
    of the axe that gives the cutting edge its power. What goes without saying,
    during these heated disagreements, is an organized, encyclopedic collection
    of agreed-upon, humdrum scientific fact.


    Robert Proctor usefully draws our attention to a distinction between neutrality
    and objectivity.(2) Geologists, he notes, know
    a lot more about oil-bearing shales than about other rocks–for the obvious
    economic and political reasons–but they do know objectively about oil
    bearing shales. And much of what they learn about oil-bearing shales can be
    generalized to other, less favored rocks. We want science to be objective; we
    should not want science to be neutral. Biologists know a lot more about the
    fruit-fly, Drosophila, than they do about other insects–not because
    you can get rich off fruit flies, but because you can get knowledge out of fruit
    flies easier than you can get it out of most other species. Biologists also
    know a lot more about mosquitoes than about other insects, and here it is because
    mosquitoes are more harmful to people than other species that might be much
    easier to study. Many are the reasons for concentrating attention in science,
    and they all conspire to make the paths of investigation far from neutral;
    they do not, in general, make those paths any less objective. Sometimes, to
    be sure, one bias or another leads to a violation of the canons of scientific
    method. Studying the pattern of a disease in men, for instance, while neglecting
    to gather the data on the same disease in women, is not just not neutral; it
    is bad science, as indefensible in scientific terms as it is in political terms.


    It is true that past scientific orthodoxies have themselves inspired policies
    that hindsight reveals to be seriously flawed. One can sympathize, for instance,
    with Ashis Nandy, editor of the passionately anti-scientific anthology Science,
    Hegemony and Violence: A Requiem for Modernity, Delhi: Oxford Univ. Press,
    1988. Having lived through Atoms for Peace, and the Green Revolution, to name
    two of the most ballyhooed scientific juggernauts that have seriously disrupted
    third world societies, he sees how “the adaptation in India of decades-old western
    technologies are advertised and purchased as great leaps forward in science,
    even when such adaptations turn entire disciplines or areas of knowledge into
    mere intellectual machines for the adaptation, replication and testing of shop-worn
    western models which have often been given up in the west itself as too dangerous
    or as ecologically non-viable.” (p. 8) But we should recognize this as a political
    misuse of science, not as a fundamental flaw in science itself.


    The methods of science aren’t foolproof, but they are indefinitely perfectible.
    Just as important: there is a tradition of criticism that enforces improvement
    whenever and wherever flaws are discovered. The methods of science, like everything
    else under the sun, are themselves objects of scientific scrutiny, as method
    becomes methodology, the analysis of methods. Methodology in turn falls
    under the gaze of epistemology, the investigation of investigation itself–nothing
    is off limits to scientific questioning. The irony is that these fruits of scientific
    reflection, showing us the ineliminable smudges of imperfection, are sometimes
    used by those who are suspicious of science as their grounds for denying it
    a privileged status in the truth-seeking department–as if the institutions
    and practices they see competing with it were no worse off in these regards.
    But where are the examples of religious orthodoxy being simply abandoned in
    the face of irresistible evidence? Again and again in science, yesterday’s heresies
    have become today’s new orthodoxies. No religion exhibits that pattern in its
    history.


    1. Portions of this paper are derived from “Faith in the
    Truth,” my Amnesty Lecture, Oxford, February 17, 1997.
    2. Robert Proctor, Value-Free Science?, Harvard Univ. Press, 1991.

    This is the final draft of a paper given at the 1998 World Congress of Philosophy. Daniel Dennett’s most recent book, Freedom Evolves, has just been published by Viking Press.

  • Relatively Speaking

    There are philosophers (‘absolutists’) who like to stress truth, objectivity, rationality, and knowledge. Then there are others (‘relativists’) who like to stress contingency, mutability, culture, historicity, situatedness. The first group think that the second group have no standards. The second group are accused of encouraging ‘postmodernism’, or the licentious thinking and bullshitting that goes on in some parts of the humanities. The second group think the first group are conservative and complacent, and that their words simply mark fetishes.


    I like to illustrate the way these groups talk past each other with an anecdote about a friend of mine (I apologise to readers of my book Being Good, where I also tell this story). He was present at a high-powered ethics institute which had put on a forum in which representatives of the great religions held a panel. First the Buddhist talked of the ways to calm, the mastery of desire, the path of enlightenment. The panellists all said ‘Wow, terrific, if that works for you that’s great’. Then the Hindu talked of the cycles of suffering and birth and rebirth, the teachings of Krishna and the way to release, and they all said ‘Wow, terrific, if that works for you that’s great’. And so on, until the Catholic priest talked of the message of Jesus Christ, the promise of salvation and the way to life eternal, and they all said ‘Wow, terrific, if that works for you that’s great’. And he thumped the table and shouted: ‘No! It’s not a question of whether it works for me! It’s the true word of the living God, and if you don’t believe it you’re all damned to Hell!’


    And they all said: ‘Wow, terrific, if that works for you that’s great’.


    The joke here lies in the mismatch between what the priest intends – a claim to unique authority and truth – and what he is heard as offering, which is one more saying like all the others. Of course that person talks of certainty and truth, says the relativist. That’s just his certainty and truth, made absolute in his eyes, which means no more than: made into a fetish.


    Having said this, the relativist need not attack people for putting words like ‘true’ on their doctrines. Of course people do this, because to have a belief and to hold it to be true are the same thing. In the story the priest will not be the only one who seizes on the word ‘true’: a Buddhist holds Buddhist doctrine to be true, and a Hindu holds Hindu doctrine to be true, just as inevitably.


    So far the absolutists seem to be on the defensive. The relativists mock them for adding nothing with their big words, or disapprove of them for being insufficiently tolerant of other perspectives and points of view. And toleration is surely a Good Thing. But is the relativist view really so attractive?


    Suppose I believe that fox-hunting is cruel and should be banned. And then I come across someone (Genghis, let us call him) who holds that it is not cruel, and should be allowed. We dispute, and perhaps neither of us can convince the other. Suppose now a relativist (Rosie) comes in, and mocks our conversation. ‘You absolutists’, she says, ‘always banging on as if there is just one truth. What you don’t realize is that there is a plurality of truths. It’s true for you that fox-hunting should be banned – but don’t forget that it’s true for Genghis that it should not’.


    How does Rosie’s contribution help? Indeed, what does it mean? ‘It’s true for me that hunting should be banned’ just means that I believe that hunting should be banned. And the same thing said about Genghis just means that he believes the opposite. But we already knew that: that’s why we are in disagreement!


    Perhaps Rosie is trying to get us to see that there is no real disagreement. But how can that be so? I want people to aim at one outcome, that hunting be banned, and Genghis wants another. At most one of us can succeed, and I want it to be me. Rosie cannot stop us from seeing each other as opponents.


    Perhaps Rosie is trying to get us to respect and tolerate each other’s point of view. But why should I respect and tolerate another point of view simply on the grounds that someone else holds it? I already have my suspicions of Genghis: in my book he is perhaps cruel and insensitive, so why should his point of view be ‘tolerated’? And in any case, I should be suspicious of any encouragement to toleration here. The whole point of my position is that hunting should not be tolerated – it should be banned. Tolerating Genghis’s point of view is too near to tolerating Genghis’s hunting, which I am not going to do.


    Rosie seems to be skating on thin ice in another way as well. Suppose she gets ruffled by what I have just written: ‘Look’, she says, ‘you must learn that Genghis is a human being like you; respect and toleration of his views and his activities are essential. If you did not fetishize absolute truth you would see that’. I on the other hand say ‘toleration of Genghis is just soggy; it is time to take a stand’. If Rosie thumps the table and says that tolerating Genghis is really good, then isn’t she sounding just like the fetishists she mocked? She has taken the fact that there are no absolute values to justify elevating toleration into an absolute value!


    Rosie has to avoid that contradiction. So perhaps she needs to say that she has her truth (tolerating Genghis is good) and I have mine (tolerating Genghis is bad) and that’s the end of it. But that sounds like bowing out of the conversation, leaving Genghis and me to go on arguing exactly as before. In practice, Rosie’s intervention hasn’t helped at all. She hasn’t made foxes, or those who hunt them, look one jot more or less likeable. Her intervention seems just to have been a distraction.


    Perhaps Rosie wanted to stop the conversation: she is like someone asking ‘Will you two just stop bickering?’ This can be a good thing to say. Some conversations are pointless. If you and I are in an art gallery, and I say Rembrandt is better than Vermeer and you say Vermeer is better than Rembrandt, and we start bickering about it, the best advice may well be that we stop. Perhaps we can agree to differ, because nothing practical hangs on our different taste. It is not as if we have enough money to buy just one, and I want it to be one and you want it to be the other (on the other hand, it does not follow that our conversation is useless. We might be forcing each other to look closer and see things we would otherwise have missed, or to reconsider what we find valuable about art in general).


    But however it may be in the art gallery, in moral issues we often cannot agree to differ. Agreeing to differ with Genghis is in effect agreeing to tolerate fox-hunting, and my whole stance was against that. Moral issues are frequently ones where we want to coordinate, and where we are finding what to forbid and what to allow. Naturally, the burden falls on those who want to forbid: in liberal societies, freedom is the default. But this cannot be a carte blanche for any kind of behaviour, however sickening or distressful or damaging. It is just not true that anything goes. So conversation has to go on about what to allow and what to forbid. Again, Rosie is not helping: she seems just to be a distraction.


    So why do people like to chip in with remarks like ‘it’s all relative’ or ‘I suppose it depends on your point of view’? What you say of course depends on your point of view, and whether another person agrees with it depends on their point of view. But the phrase is dangerous, and can be misleading. The spatial metaphor of points of view might be taken to imply that all points of view are equally ‘valid’. After all, there is no one place from which it is right to look at the Eiffel Tower, and indeed no one place that is better than another, except for one purpose or another. But when it comes to our commitments, we cannot think this. If I believe that O.J. Simpson murdered his wife, then I cannot at the same time hold that the point of view that he did not is equally good. It follows from my belief that anyone who holds he did not murder his wife is wrong. They may be excusable, but they are out of touch or misled or thinking wishfully. It is only if I do not hold a belief at all, but am just indulging in an idle play of fancy, that I can admit that an inconsistent fancy is equally good. If I like fancying Henry VIII to have been a disguised Indian, I am not in opposition to someone who enjoys fancying him to have been a Chinese. But that’s just the difference between fiction, where the brakes are off, and history, where they are on.


    Relativists are more apt to stay away from mundane historical truth. Relativism really grips us when we are talking of contested moral issues, although it also rears its head when we think of difficult theoretical issues. In these cases we are more apt to think that ‘there is no fact of the matter’. Some philosophers think that this is true in such areas, and that our commitments are better seen as taking up stances or attitudes, rather than believing in strict and literal truths. But to have a stance is to stand somewhere, and in practical matters just as in history, that means being set to disagree with those who stand somewhere else.


    If relativism, then, is just a distraction, is it a valuable one or a dangerous one? I think it all depends. Sometimes we need reminding of alternative ways of thinking, alternative practices and ways of life, from which we can learn and which we have no reason to condemn. We need to appreciate our differences. Hence, in academic circles, relativism has often been associated with the expansion of literature and history to include alternatives that went unnoticed in previous times. That is excellent. But sometimes we need reminding that there is time to draw a line and take a stand, and that alternative ways of looking at things can be corrupt, ignorant, superstitious, wishful, out of touch, or plain evil. It is a moral issue, whether we tolerate and learn or regret and oppose. Rosie the relativist may do well to highlight that decision. But she does not do well to suggest that it always falls out the one way.


    Simon Blackburn is Professor of Philosophy at Trinity College, Cambridge.

    This article was originally published in the Royal Institute of Philosophy journal, Think. It has a web site here.

  • Claiming Darwin for the Left: an interview with Peter Singer

    Peter Singer looks a very tired man. It’s not
    so much the early morning start of the interview, but the weeks of media scrutiny,
    misrepresentation and criticism, which seem to have taken their toll.


    Singer came to England to talk about “A Darwinian
    Left”, but no sooner had he stepped off the plane than the Daily Express
    was reviving the old controversy over Singer’s view that in certain circumstances,
    it may be better to end the life of a very severely handicapped baby in a humane
    way, rather than use all modern medicine can do to let it live a painful and
    often brief life. Singer tried to defend himself on Radio Four’s Today
    programme, but in such a brief news item, his calm reasoning was always likely
    to have less impact than the emotive pleas of his opponent.


    So once again, what Singer really wanted to say
    was overshadowed by his reputation. Which is a pity, because in
    his LSE lecture, A Darwinian Left?,
    which formed the centrepiece of his visit, Singer challenges a rather different
    taboo: the exclusion from left-wing thought of the ideas of Charles Darwin.


    Singer argues that the left’s utopianism has failed
    to take account of human nature, because it has denied there is such a thing
    as a human nature. For Marx, it is the “ensemble of social relations” which
    makes us the people we are, and so, as Singer points out, “It follows from this
    belief that if you can change the ‘ensemble of social relations’, you can totally
    change human nature.”


    The corruption and authoritarianism of so-called
    Marxist and communist states in this century is testament to the naïveté
    of this view. As the anarchist Bakunin said, once even workers are given absolute
    power, “they represent not the people but themselves … Those who doubt this
    know nothing at all about human nature.”


    But what then is this human nature? Singer believes
    the answer comes from Darwin. Human nature is an evolved human nature. To understand
    why we are the way we are and the origins of ethics, we have to understand how
    we have evolved not just physically, but mentally. Evolutionary psychology,
    as it is known, is the intellectual growth industry of the last decade of the
    millennium, though it is not without its detractors.


    If the left takes account of evolutionary psychology,
    Singer argues, it will be better able to harness that understanding of human
    nature to implement policies which have a better chance of success. In doing
    so, two evolutionary fallacies have to be cleared up. First of all, we have
    evolved not to be ruthless proto-capitalists, but to “enter into mutually beneficial
    forms of co-operation.” It is the evolutionary psychologist’s work in explaining
    how ‘survival of the fittest’ translates into co-operative behaviour which has
    been, arguably, its greatest success. Secondly, there is the “is/ought” gap.
    To say a certain type of behaviour has evolved is not to say it is morally right.
    To accept a need to understand how our minds evolved is not to endorse every
    human trait with an evolutionary origin.


    When I spoke to Peter Singer, I wanted to get
    clearer about what he thinks Darwinism can do to help us understand ethics.
    Singer is a preference utilitarian, which means he thinks the morally right
    action is that which has the consequences of satisfying the preferences of the
    greatest number of people. Singer seems now to be saying
    that the importance of Darwinism is that if we take it into account, we will
    be better at producing the greatest utility – the satisfaction of people’s preferences.


    “That’s my philosophical goal,” acknowledges Singer.
    “I was speaking more broadly for anyone who shares a whole range of values.
    You don’t have to be a preference utilitarian. But I think it would be true
    generally that anyone who has views about how society should end up will have
    a better chance to achieve that if they understand the Darwinian framework of
    human nature.”


    Singer also argues that Darwinism has a destructive
    effect, in that if you accept it, certain other positions are fatally undermined.
    For example, the idea that God gave Adam, and by proxy, us, dominion over the
    animal kingdom is a view “thoroughly refuted by the theory of evolution.” I
    was unsure that those victories are always so straightforward. For example,
    there are, presumably, many Christians who don’t buy the Adam and Eve creation
    myth as literal truth. Nevertheless, can’t they live with Darwinism and have
    their ethics?


    “I don’t think Darwinism is incompatible with
    any Christian ethic,” Singer is happy to allow, “except a really fundamentalist
    one that takes Genesis literally. And it’s not even incompatible strictly with
    the divine command theory, it just means the divine command theory is based
    on all sorts of hypotheses which you don’t need because you’ve got other explanations.”


    So how is the divine command theory undermined
    by evolution? Couldn’t the Christian, for example, say, yes, evolution is how
    man came to be, but given there is an is/ought gap, can’t the ethical commands
    come from on high, as it were?


    “Entirely possible. I was just saying that a lot
    of the impetus for a divine command theory comes from the question ‘where could
    ethics come from?’. It’s something totally different, out of this world, so
    therefore you have to assume we’re talking about the will of God or something.
    Once you have a Darwinian understanding of how ethics can emerge, you absolutely
    don’t have to assume that, but it’s still possible to assume it. It’s really
    the ‘I have no need of the hypothesis’ rather than ‘that hypothesis is hereby
    refuted’.”


    The question of how far evolution can help us
    understand the origin of ethics is perhaps the most contentious part of evolutionary
    psychologists’ claims in general and Singer’s thesis in particular. Singer believes
    Darwinian theory gives us an understanding of the origin of ethics, because,
    for example, it gives an evolutionary explanation of how reciprocity came to
    be. Put crudely, if you model the survival prospects for different kinds of
    creatures with different ways of interacting with others – from serial exploiters
    to serial co-operators and every shade in between – it turns out that the creatures
    who thrive in the long run are those that adopt a strategy called ‘tit for tat’.
    This means that they always seek to co-operate with others, but withdraw that
    co-operation as soon as they are taken advantage of. Because this is the attitude
    which increases the survival value of a species, it would seem to follow that
    humans have evolved an in-built tendency to co-operation, along with a tendency
    to withdraw that co-operation if exploited. Hence, it is argued, an essential
    feature of ethics – reciprocity – is explained by evolution.


    But, I put it to Singer, does the is/ought gap
    reappear in a historical version if you follow that theory? When we give an
    evolutionary explanation of how reciprocity came to be, so far we’re only describing
    evolved behaviour, but it’s quite clear that what you think ethics is now goes
    beyond a mere description of our evolved behaviour. So how, historically or
    logically, is that gap bridged?


    “It’s not bridged historically at all. Of any
    culture and people you can describe their ethic, but that remains entirely on
    the level of description. ‘The Inuit people do this and this and this, the British
    people do that and that and that’. You can describe that ethic but you don’t
    get from the answer to ‘what ought I to do?’ So the gap is a logical one and
    it just arises from the fact that when we seek to answer the question, ‘What
    ought I to do?’ we’re asking for a prescription, we’re not asking for a description.
    Any description of existing morals in our culture or the origins of morals is
    not going to enable us to deduce what ought we to do.”


    But, I insist, doesn’t evolution then merely explain
    the descriptive part of how certain behaviours came to be? It doesn’t really
    explain our ethics, it explains social codes, rules of social conduct. If ethics
    is a prescriptive field rather than a descriptive one, how does evolutionary
    explanation of how merely described behaviour comes to be explain how ethics
    came to be?


    “I think in a way that’s so obvious that it doesn’t
    need any explanation,” retorts Singer. “That’s just that we have the capacity
    to make choices and that we make judgements which are prescriptive: first person,
    second person or third person judgements. So, in a way, that is not what I’m
    trying to explain the origins of, although you can see how if you add it to
    the kinds of accounts I’ve given, we have language and we are social animals,
    you can see why we end up talking about these things and discussing them. It’s
    not something that I talked about. We know that we do that, and that’s a process
    you would expect beings, once they had a certain degree of language, faced with
    these choices, to do.”


    The question is important, because some prominent
    workers in the area of decision-theory and evolution argue that evolution explains
    how it comes to be that we have social rules and that in fact understanding
    these origins shows us that there’s no extra moral dimension to these things.
    They are merely evolved and we deceive ourselves if we think there is an ethical
    dimension.


    I tried to probe this apparent gap between evolution
    and ethics by considering two of Singer’s examples of how our ethics must account
    for our evolved human nature. If we take into account the fact that we feel
    more protective towards our own offspring than towards children in general,
    it’s a good rule that parents should take care of their children because there’s
    a greater chance it will increase the general happiness. On the other hand,
    the double standard towards female and male sexual behaviour, even though it
    may have an evolutionary explanation, is something that should not be tolerated.
    I put it to Singer that it follows that the moral judgements that we’re going
    to make are going to be of the sort, ‘If the evolved behaviour is going to lead
    to the morally desirable result, follow it; and if the evolved behaviour does
    not lead to the morally desirable result, don’t follow it’. So isn’t the observation
    of what has evolved going to drop out of the equation? It’s not going to feed
    at all directly into what our moral rules are going to be.


    Singer’s answer reveals more precisely the limited but important role he believes
    Darwinian explanations play in our ethics. “I
    think the Darwinian is going to alert us to what rules are going to work and
    what rules are going to meet a lot of resistance and I think we have to bear
    that in mind. But always there’s a trade off between how important the values
    are to us and the strength of the evolved tendency in our natures.”


    Given Singer’s willingness to challenge established
    views, I was a little surprised that he still talks in terms of the left and
    right, particularly as it seems his conception of the left is a long way from
    any traditional view. Singer characterises the left as being concerned with
    eliminating the sufferings of others and of the oppressed. A lot of people on
    the left would consider that quite a diluted view of the left, which is generally
    thought to have something to do with common ownership. I wondered if it was
    useful to maintain the label ‘the left’.


    “The label’s kind of there to stay,” replies Singer.
    “It’s been there so long. We’re not about to get rid of it. You would have to
    be rather far on the left now to think that a lot of common ownership is a good
    idea, beyond some major utilities. I wouldn’t say the left ought to be committed
    to common ownership. Common ownership is possibly a means to achieving the goals
    of the left. That debate should continue. But I wouldn’t say it was a prerequisite
    for being part of the left.”


    But is Singer’s view really leftist at all? Take
    what he says about tit-for-tat, for example. He argues that tit-for-tat would
    appeal to people on the left because it is a ‘nice’ strategy, but presumably
    a lot of people who wouldn’t identify themselves as left-wing would be keen
    on adopting what he calls nice strategies. And similarly a number of people
    on the left might be against the nicer strategies – the more revolutionary left
    wing, for example. So I’m left wondering if there really is a significant distinction
    to be made between the left and other political stances that are committed to
    the reduction of inequality.


    But it’s quite clear that Singer, though keen
    to identify himself as being on the left, isn’t as interested in this particular
    issue as I am. He simply replies, “I think there’s a lot less to the distinction
    [left/right] than there was, undoubtedly.”


    Singer’s interest in Darwin pre-dates the current
    revival in evolutionary explanations, and goes back to his earlier work in animal
    liberation.


    “It was there in the background. It wasn’t central
    to it but I did talk about it a bit in Animal Liberation. I certainly
    was interested in it before Wilson’s Sociobiology came out, but my interest
    in it as an aid to understanding what ethics is really does date from Sociobiology
    because I then addressed that in the Expanding Circle, which was published
    in 1982, and that was explicitly a response to Wilson.”


    What, I wondered, explains the current explosion
    of interest in Darwin, particularly in philosophy and psychology?


    “Well, Darwin’s been around for such a long period
    of time but understanding Darwinian accounts of human affairs has not been about
    for such a long time. It’s been neglected after Darwin himself. And then there
    was this great taboo against applying Darwinian hypotheses to human social behaviour
    until Wilson’s Sociobiology in 1975 and that was greeted with a huge
    amount of hostility, which is evidence of the taboo. Even then a lot of people
    shuddered because they saw it as something that was associated with nasty, right-wing
    biological determinism, which is not really true. So it’s only more recently
    than 1975 that the taboo has broken down and people have started to accept that
    there are interesting and important insights into human affairs that come from
    applying Darwinian thinking to human social conduct.”


    Singer may feel that his new take on Darwin ought
    to have been the main focus of his visit to London. But aside from Singer the
    academic philosopher, there is also Singer the campaigner and polemicist. If
    the media have focused on other things, it is at least partly due to Singer’s
    own outspokenness about issues that matter to him.


    One of the first controversies to blow up in the
    press during Singer’s visit was his withdrawal of a lecture he was due to give
    at the King’s Centre for Philosophy, because of its sponsorship by Shell UK,
    and his very public letter to the Guardian explaining why.


    Singer’s reason for pulling out is that, “I did
    not really want to appear on a programme that says ‘supported by Shell’ and
    is seen as therefore promoting the idea that Shell is a good corporate citizen.
    It’s not that I’m against taking corporate money under any circumstances. I
    think there are some circumstances in which I would take it, but I think that
    you always have to be careful about taking corporate money. At present Shell’s
    record, particularly in Nigeria, is really lamentable. I think that you can
    see a connection between the money that is going here [to the King’s Centre]
    and the profits made out of the extraction of oil in Nigeria, with all of the
    consequences that has for the Ogoni people, both in terms of environmental damage
    to their land and the way in which Shell revenues support the Nigerian dictatorship,
    which is one of the most oppressive around. So I just didn’t want to be part
    of that.”


    Interestingly, at one recent Environmental Ethics
    conference, at which the Shell issue in particular was in the forefront of people’s
    minds, a lot of the people who ran consequentialist arguments at the conference
    actually came out in favour of taking the money because they felt that the benefits
    of having the conference supported would outweigh the very marginal benefits
    that Shell would receive for having its logo in the corner of the posters. People
    said things like “It’s better this money is spent on a conference in environmental
    ethics, which should be discussed, than the money should go to a big billboard
    poster for Shell or something.” What does Singer, as a consequentialist, make
    of this argument?


    “The consequentialist could go both ways, I don’t
    deny that. I don’t think it’s all that important to have another environmental
    ethics conference frankly – there are plenty of environmental ethics conferences
    and discussions about environmental ethics around. There’s certainly an argument
    about what else would happen to the money. But I think that in fact it’s clear
    that as far as my gesture of refusing to take Shell’s sponsorship is concerned
    – and it was a gesture, there’s no doubt about it – it’s had worthwhile consequences.
    What it’s meant is that there’s one lecture in my London programme that did
    not go ahead as sponsored, but in fact that was made up for by the fact that
    I gave a lecture organised at King’s College by some students who were opposed
    to Shell’s sponsorship. So people at King’s still got to hear me give a lecture,
    if that’s what they were interested in. Because I refused and because I wrote
    a letter to The Guardian about my refusing to do so, there was a whole
    lot more discussion of the issue, so people have again become more aware that
    there is a real issue about corporate sponsorship and the question about Shell
    in particular has got aired. So, it seems to me that’s clearly been a good thing.
    In other words, it’s clear that I made the right decision on consequentialist
    grounds.


    “But I think it’s important that people enter
    some discussion, that it’s not just a silent gesture that I didn’t give a lecture
    and no one ever heard about why I didn’t.”


    Singer is always very open in showing the full
    implications, consequences and ramifications of his viewpoint, which doesn’t
    always make him popular. As a consequentialist, how does he feel about the argument
    that the best way to bring about a better society from a utilitarian point of
    view is not to advance complex utilitarian arguments but to appeal to more simple
    concepts?


    “I think people are in different positions and
    different roles. For a political heavyweight involved in strategies for a political
    party to achieve office, it probably wouldn’t be possible to be quite so open.
    But I think philosophers can have a role in clarifying people’s thinking, with
    broader aims than simply saying ‘I want the political party with these views
    to get into office and do this and that’.”


    As an animal rights campaigner, I suggest, his
    roles perhaps are more mixed and I asked Singer whether he felt that being so
    open, and talking about the implications of his views on animals for mentally
    handicapped children, has had the effect of blunting his points on animal liberation,
    because people are inevitably not going to focus on his positive points about
    animals, they focus on the perceived negative implications for the sanctity
    of life.


    “Maybe that’s true. It’s become a larger focus
    in recent years. I’m not quite sure why, but I think that what you’d have to
    say there was that if you take the line that it was a mistake to write Should
    the Baby Live?
    back in 1985. It’s done now and I think the book’s done some
    good in alerting people to the nature of that particular problem and making
    parents of the disabled able to discuss it more openly. I’m not going to deny
    that the conclusions still seem to be sound ones.


    “I think you could say that politically it’s been
    a mistake to accept invitations to debate it. What’s happened in Britain over
    the last couple of weeks is that there was a rather silly article in the Daily
    Express
    that raised this issue, which probably should have been ignored,
    and I was called by the BBC for the Today programme and a lot of people
    heard that, so maybe it would be more prudent to tell the BBC that I didn’t really
    want to discuss that anymore and that wasn’t what I was coming here to discuss
    this time.


    “It’s very hard because on the other hand some
    of the discussions were quite useful and it wasn’t all silly stuff as the one
    on the Today programme I think was. So you have to say, well, it gets
    more attention and more people read about my views, maybe some of them will think,
    ‘Well, this is not so silly and bad, maybe I should look at some of his books’,
    and maybe more people will get involved in it. It’s very hard to say, I think.”


    Singer is always going to be a controversial thinker
    because of his willingness to confront political and ethical issues without
    being constrained by current orthodoxy. His application of Darwin to left-wing
    thought is certainly not going to make him popular with the right, but it is
    also likely to lose him some friends on the left, just as his measured contribution
    to the issue of animal rights challenges society’s attitudes while not going
    far enough to satisfy many activists.


    Singer returned to his native Australia leaving
    behind a big question and a tentative answer. Can the scientific theories of
    Charles Darwin really contribute to our philosophical understanding of ethics?
    Singer has tried to show how they can, but this is a debate which clearly has
    a lot further to run.

    This article was originally published in Issue 4 of The Philosophers’ Magazine.

    Julian Baggini has a web site here.

  • Psychoanalytic Mythology

    During the last decades of the twentieth century researchers showed that much
    of the received history of psychoanalysis consisted of stories that were largely
    mythological. Perhaps the most enduring of all these myths is that Freud postulated
    his seduction theory as a result of hearing frequent reports from his female
    patients that they had been sexually abused in childhood. In this article I
    want to focus on this story, one that for most of the twentieth century was
    taken as historical fact, and is still widely believed to be so.

    According to the traditional account, in the 1890s most of Freud’s female patients
    told him that they had been sexually abused in early childhood, usually by their
    father. How the story continues depends on whether it is based on received history
    or on the revised version embraced by many feminists and popularised by Jeffrey
    Masson. In the orthodox version we are told that within a short time Freud came
    to realise that many of the reports he was hearing were not authentic, that
    the women were fantasizing, and that this led to his epoch-making discovery
    of infantile incestuous fantasies. But according to the feminist account, it
    was the staunch opposition of colleagues outraged by his claims of widespread
    childhood sexual abuse that led Freud to abandon the theory. Previously a sympathetic
    listener, Freud now betrayed the women who had had the courage to reveal their
    terrible experiences of abuse.

    Whichever version you choose to believe, both make dramatic stories, and each
    has its strong adherents. The basic elements are the same, but the interpretation
    of them is very different. I suspect that most people rely on their gut feeling
    and opt for Masson and the suppression of the truth about the widespread sexual
    abuse of girls at that time. But it’s time for a reality check.

    The articles that Freud published in the 1890s, and his correspondence with
    his confidant Wilhelm Fliess, tell a very different story. Putting it briefly,
    Freud’s patients in the mid-1890s did not tell him that they had been
    sexually abused in early childhood. In contrast to what he was to assert in
    his later accounts, at the time he wrote that they assured him “emphatically
    of their unbelief” in the preconceived infantile sexual traumas that he
    insisted they had experienced.

    The essential features of the episode can be outlined as follows. During the
    early 1890s Freud had become convinced that repressed memories of sexual ideas
    or experiences, not necessarily from childhood, lay at the root of the symptoms
    of patients he had diagnosed as hysterics. Then in October 1895, on the basis
    of a speculative notion, he alighted on a theory that he was convinced had solved
    once and for all the problem of the causes of the psychoneuroses. Hysterical
    symptoms were invariably caused by unconscious memories of sexual molestations
    in infancy.

    Using his newly developed analytic technique for uncovering unconscious ideas
    in the minds of his patients, he immediately set about showing that he was right.
    Although he had not previously reported any instances of his having uncovered
    sexual abuse in infancy, within four months of announcing the new theory to
    Fliess he completed two papers in which he claimed that with every one of thirteen
    “hysterical” patients, plus some obsessionals, he had been able to “trace
    back” to infantile experiences of sexual abuse. A few months later he delivered
    a lecture, “The Aetiology of Hysteria”, in which he gave a more detailed
    exposition of his theory, claiming confirmation for eighteen patients diagnosed
    as hysterics.

    How did he manage to access deeply repressed experiences of this nature with
    all his patients in such a short time? Although he claimed that he had induced
    patients to “reproduce” the infantile experiences (what he meant by
    “reproductions” is open to a wide range of interpretations), it is
    evident that he typically arrived at his clinical findings by the decoding of
    symptoms, and the analytic interpretation of patients’ ideas produced under
    the influence of the “pressure” procedure he was using at that time.
    He explained that patients’ symptoms correspond to the “sensory content
    of the infantile scenes” of sexual abuse that he had inferred to lie at
    their root. His analytic procedure, he wrote, was analogous to that of a forensic
    physician who can arrive at the cause of an injury “even if he has to do
    without any information from the injured person”.

    This is exemplified by the case of a patient who had a facial tic and eczema
    around the mouth. On the basis of these symptoms Freud analytically inferred
    that she had in infancy been forced to engage in fellatio. “I thrust the
    explanation at her”, he told Fliess. And when she expressed her disbelief
    he “threatened to send her away” if she persisted in her scepticism.
    Of course for Freud a rejection of his inference was evidence of the patient’s
    “resistance”, providing further confirmation that his analytic reconstruction
    was valid.

    For reasons impossible to deal with in a short space, within two years of announcing
    publicly his solution to the aetiology of the neuroses Freud lost faith in it.
    But instead of this leading him to question the reliability of his newly developed
    technique for reconstructing unconscious memories, he sought to explain his
    claimed findings as patients’ unconscious fantasies. This necessitated some
    retrospective emendation of the original claims to make the new theory minimally
    plausible. In fact the story went through a number of stages before finally
    arriving at the familiar version in New Introductory Lectures in Psychoanalysis
    (1933): “In the period in which the main interest was directed to discovering
    infantile sexual traumas, almost all my women patients told me that they had
    been seduced by their father.” (Incidentally, no one seemed to think it
    odd that it was only in this short period that “almost all” his female
    patients should have reported early childhood sexual abuse.)

    It is important to appreciate that the traditional accounts give no idea that
    the putative “fantasies” were unconscious ideas or memories
    in the patients’ minds that Freud believed he had uncovered (i.e., reconstructed)
    by his analytic technique of interpretation. (Freud’s use of the word Phantasie
    is translated as ‘phantasy’ by James Strachey in the Hogarth Standard Edition,
    but usually as ‘fantasy’ elsewhere in the literature, giving readers the misleading
    impression that Freud was generally referring to conscious ideas that
    patients reported to him.)

    There are a considerable number of anomalies in Freud’s retrospective accounts
    of the episode, too many to be dealt with here. One of these is that originally
    he had claimed that the “infantile traumas” he had uncovered could
    be described “without exception” as “grave sexual injuries”.
    How putative ‘memories’ of experiences that he had described as “brutal”
    and “absolutely appalling” could plausibly turn out to be unconscious
    fantasies of “seduction” that had the purpose (according to his first
    explanation) of “fending off” patients’ disturbing memories of infantile
    masturbation, Freud made no attempt to explain. The same objection applies to
    his later story that the putative “seduction fantasies” were projections
    of patients’ Oedipal desires. In any case, he was in no position to know whether
    his analytic reconstructions represented repressed memories of actual events,
    or patients’ unconscious fantasies — or indeed, as was actually the case, imaginative
    scenarios originating in his own mind.
    A little-known fact is that, in accord with his theoretical requirements, Freud
    claimed in 1896 that for each of his six obsessional patients he had uncovered
    repressed memories not only of passive infantile sexual abuse scenes, but also
    of active sexual experiences at a slightly older age. Nothing was heard again
    of these remarkable clinical ‘findings’, and Freud made no attempt to explain
    how his later unconscious fantasy theory could possibly account for them.
    The above arguments, of course, refute Jeffrey Masson’s version of events as
    well as the received psychoanalytic story, though his case lacks cogency for
    other reasons. He suggested in The Assault on Truth that Freud’s motive
    for abandoning the seduction theory was in part an attempt to ingratiate himself
    with his colleagues, who supposedly were outraged by his clinical claims. This
    thesis is undermined by the fact that Masson’s account of the ostracizing of
    Freud by his colleagues is entirely erroneous. But it is also invalidated by
    the fact that Freud did not reveal his abandonment of the seduction theory to
    his colleagues for some seven years after he had privately renounced it. (Masson
    erroneously stated that “the critical period for Freud’s change of heart
    about the seduction hypothesis” was “during the years 1900-1903”.
    This vague dating effectively closes most of the gap between the abandonment
    of the theory and Freud’s public announcement of his change of view, and tallies
    with Masson’s thesis, but Freud’s letters to Fliess show clearly that he had
    completely given up the theory by the end of 1898.)
    That the traditional story of the seduction theory episode is false in all
    its essentials is especially important in recent times, when it has been drawn
    into the debate about the repression of memories of childhood abuse that are
    supposedly ‘recovered’ some decades later. People need to get the historical
    facts straight before Freud’s supposed early clinical experiences are erroneously
    cited to support the arguments of one side or the other. More generally, as
    Cioffi has emphasized, an accurate account of the transition from the seduction
    theory to its successor fantasy theory calls into question the reasoning which
    Freud was to employ for the rest of his career to reconstruct infantile fantasy
    life and the contents of the unconscious.
    References
    Cioffi, F. (1998 [1974]). “Was Freud a liar?” In Freud and the Question of Pseudoscience. Chicago and La Salle: Open Court, pp. 199-204.
    Esterson, A. (1993). Seductive Mirage: An Exploration of the Work of Sigmund Freud. Chicago: Open Court.
    Esterson, A. (1998). “Jeffrey Masson and Freud’s Seduction Theory: a New Fable Based on Old Myths.” History of the Human Sciences, 11 (1), pp. 1-21.
    Esterson, A. (2001). “The Mythologizing of Psychoanalytic History: Deception and Self-deception in Freud’s Accounts of the Seduction Theory Episode.” History of Psychiatry, xii, pp. 329-352.
    Esterson, A. (2002). “The Myth of Freud’s Ostracism by the Medical Community in 1896-1905.” History of Psychology, 5 (2), pp. 115-134.
    Freud, S. (1953-1974). The Standard Edition of the Complete Psychological Works of Sigmund Freud, ed. and trans. by J. Strachey et al. London: Hogarth Press.
    Israëls, H. and Schatzman, M. (1993). “The Seduction Theory.” History of Psychiatry, iv, pp. 23-59.
    Masson, J. M. (1984). The Assault on Truth: Freud’s Suppression of the Seduction Theory. New York: Farrar, Straus and Giroux.
    Masson, J. M. (ed. and trans.) (1985). The Complete Letters of Sigmund Freud to Wilhelm Fliess 1887-1904. Cambridge, MA: Harvard University Press.
    Scharnberg, M. (1993). The Non-Authentic Nature of Freud’s Observations: Vol. 1. The Seduction Theory. Uppsala Studies in Education, Nos. 47 and 48. Stockholm: Almqvist & Wiksell International.
    Schimek, J. G. (1987). “Fact and fantasy in the seduction theory: a historical review.” Journal of the American Psychoanalytic Association, 35, pp. 937-965.

  • The PC Tyranny

    political correctness (noun): conformity to a belief that language and
    practices which could offend political sensibilities should be eliminated.
    Merriam-Webster’s Collegiate Dictionary

    I’ve been invited to write about political correctness and philosophy
    in the North American academy. What qualifies me? I’m a refugee
    from political correctness. I emigrated from Canada to the USA because
    of an insidious quota system, euphemistically called ‘employment
    equity’, which decrees that there are too many white male philosophers
    in Canadian universities. The Nuremberg Laws excluded Jews from
    Nazified German universities because we were ‘non-Aryan’; Jews are
    now excluded from Canadian universities because we are ‘white’.
    This is a compelling irony. It compelled me to get the hell out.

    Before quitting Canada in 1994, I penned a satire on political correctness,
    called Fair New World. Libertarian lawyer Karen Selick called
    it "the most politically incorrect work of art I have ever
    seen. It’s also hilariously funny and scathingly insightful."
    Since no Canadian publisher had the courage to bring it out, I founded
    my own press, Backlash Books, and published it myself. Fair New
    World continues to be taught in colleges and universities, by
    politically absolutely incorrect professors, all of whom have received
    Backlash Books’ highest award: ‘Offender of the Faith’. So much
    for my political credentials.

    I am currently tenured at The City College of New York, which graduated
    eight eventual Nobel laureates among its illustrious alumni of halcyon
    years, but where, thanks to a generation of open admissions, Great
    Books have been replaced by Comic Books. What kind of refuge is
    this? I offer two stock answers. To the cognoscenti, I reply that
    I have Bertrand Russell’s job. Russell’s appointment at CCNY was
    infamously denied by the New York Supreme Court, which convicted
    him, much as Athens convicted Socrates, of moral corruption. Instead
    of putting Russell to death, they merely denied him employment.
    This is called ‘social progress’. To the incognoscenti, I reply
    that I was hired by CCNY to fill a quota system: New York City was
    running short of Jews, so they imported me.

    By now you should be persuaded that I am politically incorrect enough
    to write this piece. Now let me unpack Webster’s definition. First,
    to which ‘political sensibilities’ does it allude? These generally
    entail a Rousseauesque-cum-Marxist vision of the world, which perceives
    humanity as an innocent and well-meaning horde of erstwhile noble
    savages, inequitably differentiated by race, class and gender by
    an evil conspiracy of white male heterosexual patriarchal hegemonists,
    who use logic, mathematics, science, classics, capitalism, democracy
    and testosterone to disenfranchise politically and deprive socio-economically
    the rest of the world, who are the ‘victims’ of ‘oppression’.

    While Marx’s putative ‘remedy’ was partly predicated on his slogan ‘from
    each according to his ability, to each according to his need’, current
    political correctness is incomparably more surreal: it has no truck
    with ability at all, which it finds intolerably offensive and therefore
    among the first things slated for elimination. For example, many
    primary schools now give ribbons to all children who run in field-day
    races, because they are terrified of ‘offending’ and therefore also
    (by the puerile etiology that informs their world-view) of traumatising
    the children who do not win or place in the contest. Thus they have
    confused fleetness of foot with moral worthiness. This has two serious
    consequences.

    First, at the grass-roots level, political correctness fails to teach children
    that sportsmanship and self-development are the lasting lessons
    of competition. Win or lose, one is morally worthy if one runs the
    race and does one’s best. If Jane is a better runner than Sally,
    there is nothing wrong (i.e. ‘offensive’) about rewarding Jane for
    fleetness of foot. If Jane wins a gold medal and Sally finishes
    out of the medals, it means that Jane is a better runner than
    Sally: it does not mean that Jane is better than Sally. But
    a politically correct race is socially-engineered: all runners must
    finish together, or all must receive identical ribbons regardless
    of place. This is an offence against fleetness of foot. It is typical
    of a pervasive unwillingness to acknowledge natural and acquired
    differences among human beings, which in turn devalues individual
    excellence and obliterates moral worthiness. That is an offence
    against humanity.

    The second consequence marks a death-threat to American democracy. Tocqueville
    had observed presciently that Americans must choose between liberty
    and equality. Any undeluded person knows that equality of opportunity
    leads inevitably to inequality of outcome. However, the inability
    of political correctness to tolerate unequal outcomes in the wake
    of equal opportunities, and its dogmatic commitment to a neo-Marxist
    doctrine that equates justice and fairness with a levelling of outcomes,
    have contorted the North American Academy into a sublime estate,
    in which equal outcomes in higher education are guaranteed by pervasive
    illiteracy, innumeracy and aculturality. The Academy has become
    a neo-Procrustean Inn, whose former halls of learning are converted
    into dormitories of indoctrination, whose patrons (the students)
    have their heads chopped off instead of their legs, so that all
    fit equally into its deconstructed cots.

    The ‘language and practices’ that offend the deepest sensibilities of political
    correctness form the very foundations of Western civilization: the languages
    of logic, mathematics, classics, philosophy – along with the language of Shakespeare
    too – and the practices of science, capitalism, democracy and due legal process – along
    with the inescapably allied and respective notions of reliable method, generation
    of wealth, government by consent of the governed, and protection of inalienable
    individual rights. By metastasising like an opportunistic cancer throughout
    the mind-politic of the academy, political correctness has proceeded, true to
    Webster’s definition, to eliminate the language and practice of Western civilization
    itself, and therefore to kill the very body-politic upon which it parasitically
    feeds. Lest you deem my accusations implausible or exaggerated, I will regale
    you with a few examples.

    Grade inflation is rampant in American universities, to the extent that
    undergraduate degrees are increasingly worthless pieces of paper.
    From the Ivy to the Poison Ivy Leagues, institutions have capitulated
    to ‘egalitarian’ demands that students receive A’s for attendance.
    They graduate hapless victims of victimology, who can neither read
    with comprehension nor write grammatically correct sentences. When
    such students receive D’s or F’s in my upper-level philosophy electives,
    they complain that they are ‘straight-A’ majors in psychology, or
    education, or in some other department that subscribes to the barker’s
    slogan ‘Everybody plays, everybody wins’. By the same token, one
    very bright and hard-working student, who happened to be a black
    female, asked me if she had really ‘earned’ the A she received in
    my course. When I assured her that she merited the grade based on
    her performance and nothing else, she actually wept with gratitude
    – at having been allowed to display her merit. By contrast, politically
    correct ideology systematically deprives excellent students of opportunities
    to excel, so as not to ‘offend’ mediocrity and worse.

    Political correctness eradicates individual liberties as well as merit. Princeton
    University’s Office of Student Life annually prints a handbook lauding
    ‘tolerance’ and extolling the ‘virtues’ of cultural diversity. The
    office also compels attendance at freshman orientation films, one
    of which illustrates methods of contraception and abortion. When
    a Roman Catholic student tried to exit the cinema, asserting that
    she had no need to watch these practices because her religion forbade
    them, she was physically prevented from leaving. She was coerced
    (in the name of tolerance and diversity!) to watch the entire film.
    This is another face of political correctness: rank hypocrisy.

    Freedom of speech was an early casualty. In denial, Katharine Whitehorn wrote
    in the London Observer: "The thing has been blown up out of all
    proportion. PC language is not enjoined on one and all – there are a lot more
    places where you can say ‘spic’ and ‘bitch’ with impunity than places where
    you can smoke a cigarette." She should have been at a Canadian university
    in 1994, when a professor of political science remarked jocularly to a teaching
    assistant noted for her stern grading: "I’ll bet the students think you’re
    a real black bitch." The president of that university promptly shut down
    the graduate studies program in political science, while the teaching assistant
    sued the university and pocketed more than $300,000. (Hey, for that kind of
    cash, you can call me anything you like.) This catapulted UBC onto the national
    news, and cost the president his job. Stand-up comedy proliferates precisely
    because the comics remain at liberty to say what – thanks to political correctness
    – their audiences are increasingly afraid to think.

    Around the same time, Yale University was busily refusing a gift of 20
    million dollars, offered by a Texas oilman and patron of high culture.
    He wanted the money spent on a humanities program that celebrated
    Great Books of Western civilization. Unfortunately, Yale was long since
    committed to the politically correct doctrine that there are no
    great books, that the idea of great books is a pernicious myth used
    to oppress illiterate and innumerate savages, to keep women barefoot
    and pregnant, to exploit the developing world, and to glorify dead
    white European males who apparently plagiarised Western civilization
    from an unidentified tribe of transvestites. Thus Yale could not
    possibly accept 20 million dollars to teach so-called ‘Great Books’,
    either because ‘greatness’ is entirely arbitrary, or because recognising
    a few ‘Great Books’ would be offensive to a great many inconsequential
    ones.

    PC hiring practices are utterly Orwellian. In a Canadian university,
    a male and a female candidate were finalists for a tenurable position
    in philosophy. The male was demonstrably better qualified, but the
    female was offered the position owing to an alleged ‘gender imbalance’.
    Two members of the selection committee were willing to testify to
    the province’s Human Rights Commission that the female’s appointment
    had been politically orchestrated. But when the male finalist formally
    asked the province’s HRC to investigate, his request was summarily
    denied. He was informed by the HRC that, since he was a white male,
    it was impossible for anyone to discriminate against him.

    The siege engines of political correctness have been dragged to the very walls
    of MIT, where cries of ‘gender imbalance’ herald the administrative re-allocation
    of scientific funding to satisfy arbitrary gender quotas. Copious evidence on
    sex difference, much of it accumulated by female researchers themselves, shows
    that males are, on average and by nature, more adept than females at mathematical
    and spatio-temporal reasoning. But any fact that offends regnant political sensibility
    is dismissed as a ‘social construct’, and ignored by wishful thinking. The politically
    correct explanation for the dearth of female Newtons and Einsteins is that female
    geniuses have been ‘oppressed’ by the usual conspiracy of white males, and by
    the very institution of civilization itself.

    And what is philosophy’s explicit role in all this? It varies across
    a continuum. In so far as academic philosophers are political animals,
    prey to the edicts of a brain-dead academy, they either resist political
    correctness, or pay lip-service to it, or embrace it according to
    their respective lights or darknesses. But those who fail to resist
    its fatuous tyranny, or who revel in its egregious self-righteousness,
    become apologists for the deconstruction of the very intellectual
    culture that makes philosophy possible, and accomplices to the sapping
    of the principles which sustain that culture itself. Thus North
    American philosophers who champion group rights and trample individual
    liberties (epitomised by proponents of quota-based hirings), who
    hysterically demonise reason, and who absurdly deny Hume’s distinction
    between fact and value on the alleged grounds that all ideas are
    ‘social constructs’, excepting this idea itself, which they take
    as brute fact (epitomised by Richard Rorty’s flagrant anti-realism)
    – these are not lovers of wisdom, but high priests and handmaidens
    of hubris.

    To philosophy students who can yet read, I recommend J S Mill’s On
    Liberty. His enlightened conception entails

    …liberty of tastes and pursuits, of framing the plan of our life to suit
    our own character, of doing as we like, without impediment from
    our fellow creatures, so long as what we do does not harm them,
    even though they should think our conduct foolish, perverse, or
    wrong.

    Mill’s salient distinction is between offence and harm; its implications
    for political correctness are pellucid. People who are offended
    by others’ languages and practices should not have the liberty to
    eliminate them, as long as such words and deeds are not harmful.
    But once this critical distinction between offence and harm is blurred,
    as it is daily and extravagantly by the politically correct, then
    those who blur it arrogate to themselves the supremely illegitimate
    authority to proscribe whatever conduct they deem ‘offensive’ (for
    example, affairs between professors and graduate students, or ideologically
    unpopular research), to silence whatever speech they deem ‘offensive’
    (such as ethnic humour or sexual innuendo), and to censor whatever
    ideas they deem ‘offensive’ (for example that there are biologically-based
    human differences that may not be eradicable by social engineering,
    or that equal opportunity virtually guarantees unequal outcomes).
    The near-ubiquitous conflation of offence with harm has sanctioned
    a thirty-year reign of political terror in North American universities,
    whose degenerate administrative ideologues daily micromanage the
    minutiae of thought, speech and deed.

    In such a totalitarian climate, philosophers who fail to draw and defend
    Mill’s distinction between offence and harm are not only professionally
    derelict, but are also party to the catastrophe that has ensued
    from its blurring.

    The ‘dark side’ of philosophy is compassed both by what it has failed
    to do in defence and preservation of its own mission – the love of
    wisdom – and by what this failure has permitted the enemies of open
    and reasoned inquiry to entrench in its place – the worship of folly.

    This article was originally published in Issue 14 of The Philosophers’ Magazine.

  • Evolutionary Psychology and its Enemies: an interview with Steven Pinker

    Steven Pinker has a new book out, The Blank Slate. We have been closely observing and reporting on the reception of this particular volume of science for the public, because that reception and the probable reasons for it are closely related to the subject matter of Butterflies and Wheels. Evolutionary explanations of human nature, behavior, and ways of thinking make many people very suspicious and afraid, and hence willing to make some highly dubious arguments.

    But as many people have noticed and pointed out in the last few years (e.g. E.O. Wilson in The Philosophers’ Magazine), the tide does seem to be turning. Pinker’s book has been getting a largely favorable or at least attentively respectful hearing, including even a favorable review in the US magazine The Nation. Steven Pinker generously took some time from his busy schedule, complete with book tour, to answer some questions for us.

     

    Butterflies and Wheels: You wrote The Blank Slate to address the fears people have about evolutionary psychology. Although the days of emptying pitchers of water over peaceable entomologists may be over, I’ve noticed that many opponents of the field still resort to highly questionable tactics, including guilt by association, confusion of terms, and loaded questions. Are there any critics of evolutionary psychology you respect? Any who have doubts about the evidence, the methods, the interpretations, but pose the questions without resorting to rhetoric or consequentialist arguments?

    Pinker: If “evolutionary psychology” just means bringing evolutionary biology to bear on the human mind, frankly I don’t think there could be honest criticism of evolutionary psychology, because it would simply be obscurantism or disciplinary parochialism. It would be in effect declaring that the insights of one discipline must never be brought to bear on another, as if one were attacking neuroscience, or sociolinguistics, or the history of science. This is especially true given that evolutionary thinking is already pervasive in the less politically sensitive areas of psychology, like perception and motivation. It would be perverse to insist that researchers in stereo vision not be allowed to take into consideration the evolutionary function of being able to see in depth, or to condemn scientists who study thirst for analyzing how thirst works to keep the body’s fluids and electrolytes in balance. Ultimately that is what evolutionary psychology is about, but applied to more contentious domains like cognition and the social emotions.

    Now, “evolutionary psychology” has also come to refer to a particular way of applying evolutionary theory to the mind, with an emphasis on adaptation, gene-level selection, and modularity. Obviously that can be criticized, just like any other empirical theory; some of the sharper critics include David Sloan Wilson, Elliott Sober, Robert Boyd, and Sarah Blaffer Hrdy.

    But ultimately “evolutionary psychology” is not a single theory but a large set of hypotheses about particular topics, and any one of them can be, indeed, should be, criticized, just like any scientific hypothesis. Indeed, just about every concrete hypothesis in evolutionary psychology has come under criticism in the technical literature (and been defended in turn), just like the rest of empirical psychology. Did a preference for symmetrical faces evolve because symmetrical organisms are fitter and hence better mates, or does the visual system like symmetrical patterns even in artifacts, where mating is irrelevant? Are people especially good at detecting logical violations when they pertain to social contracts, or can they detect them more generally, whenever such a violation is relevant to our current interests? Can the nongenetic variation in personality be explained in terms of sibling competition over parental investment, in terms of carving out a niche in a peer group, or in terms of sheer chance? The researchers who raise these objections to hypotheses emerging from evolutionary psychology are, of course, doing their job as scientists. Many of these issues can take decades to resolve, again, just like the rest of psychology. It is conceivable that when the dust settles not a single hypothesis motivated by evolutionary biology will survive (but I doubt it).

    B and W: One reservation that I hear from rational people is the “just-so stories” aspect. That evolutionary explanations of human nature can operate the way Freud’s did: simply twist and turn to meet objections, interpret the evidence so that it fits the theory rather than adjusting the theory. Is there any merit to this idea, or is evolutionary psychology just as falsifiable as any other science?

    Pinker: “Evolutionary psychology” is an approach and a set of theories, not a single hypothesis, so no single experiment can falsify it, just as no single experiment can falsify the theory of evolution or the connectionist (neural network) approach to cognition. But particular hypotheses can be individually tested, such as the ones on the relation of symmetry to beauty or the relation of logical cognition to social contracts, and tests of these are the day-to-day activity of evolutionary psychology. Journals such as Evolution and Human Behavior are not filled with speculative articles; they contain experiments, survey data, meta-analyses, and so on, hashing out particular hypotheses. And as I mentioned above, over the long run the approach called evolutionary psychology could be found unhelpful if all of its specific hypotheses are individually falsified.

    B and W: You discuss, in The Blank Slate, the way an excessively optimistic view of human plasticity can lead to social engineering, coercion, and genocide. But you also point out that though we have drives and instincts that served us well in the distant past but don’t serve us well now, we also have a cerebral cortex that can override those drives. Is there any tension between those two thoughts? Is there any way to distinguish between dangerous social engineering on the one hand, and necessary laws that seek to restrain such drives, laws against rape for example, on the other?

    Pinker: I don’t think there is a contradiction because I don’t think the cerebral cortex is an infinitely malleable substance or an all-powerful problem solver. In language, a finite set of rules can generate an infinite set of sentences; not just any old set (million-word sentences, programs in Java, musical notation, humpback whale songs, etc.), but only sets conforming to “Universal Grammar.” Likewise there is an infinite space of possible thoughts and goals, but they are subject to the quirks and limitations of human nature.

    Your question about a middle ground between coercive social engineering and necessary restraints on antisocial behavior is basically the age-old question (debated by Hobbes, Rousseau, Locke, and the framers of the American constitution, among others) of whether there can be a middle ground between anarchy and totalitarianism. Democratic government, the rule of law, and civil-libertarian restraint on the power of government and the police are, in my view, solutions that do make such distinctions. Ultimately the distinctions hinge on the promotion of human well-being and the reduction of human suffering – some medium-sized amount of government coercion (constitutionally limited, and operating with the consent of the governed) seems to maximize this function better than anarchy or totalitarianism.

    B and W: It seemed to me that the audience was at least not hostile when I saw you on your book tour. Indeed, when one man tried the rhetorical move of wondering what use future tyrants might make of your books and you replied that you wouldn’t worry about it as long as they read them with understanding, the audience applauded. Has the reaction been generally favourable so far? And was there any difference between the responses in the UK and the US?

    Pinker: People who come to my talks are not a random sample, of course, but you are correct that I have not received anything like the abuse that greeted E. O. Wilson or Richard Herrnstein in the 1970s. The only truly intemperate reaction was from a British psychoanalyst who (correctly) inferred that if people’s personalities and neuroses are not shaped by parental treatment in the first six years of life, he and his colleagues are guilty of malpractice. I also have received a small number of nasty – and I would say grossly unfair – reviews from academics and journalists who vaguely sensed that their 1960s-era leftism was not being affirmed by the book, who could not put their finger on anything wrong with the arguments, and who resorted to distortion and sweeping dismissal. But that has been true of a minority of the reviews and probably could be expected of any book that takes on controversial subjects. Indeed, with the exception of the man you noticed, I have not received any hostile reaction among the hundreds of audience questions and pieces of correspondence I have received so far.

    One of the reasons is that the climate has changed – I first noticed this a few years ago when the students in my classes at MIT were not outraged by hearing about research on, say, violence or sex differences that would have been inflammatory a few years ago. (They are a whole new generation – it was their parents, or even their grandparents, who were carrying placards in the 1960s and 1970s!) Also, whereas Wilson was blindsided by the attacks, not realizing that his proposals might have political implications, The Blank Slate is about the political implications (and non-implications) of human nature, and shows how an acknowledgment of human nature does not, in fact, justify racism, sexism, reactionary politics, or moral nihilism. Anyone who is morally incensed by the book cannot have read it.

    Steven Pinker is Peter de Florez Professor of Psychology at Massachusetts Institute of Technology. He is the author of The Blank Slate, How the Mind Works, and The Language Instinct.

  • Higher Superstition Revisited: an interview with Norman Levitt

    Paul R. Gross and Norman Levitt’s book Higher Superstition appeared
    in 1994, rattled a good many cages, and prompted the Sokal Hoax. The book describes
    a bizarre situation in American universities in which academics in various (mostly
    new-minted) fields such as Cultural Studies, Literary Theory, and Science Studies,
    plus a few more familiar ones such as Sociology, Comparative Literature and
    the like, make a career of writing about science without taking the trouble
    to know anything about it. Gross and Levitt have a good deal of fun exposing
    the absurd mistakes perpetrated by people who rhapsodise about quantum mechanics
    and chaos theory without having the faintest idea what they’re talking about.


    But hilarity aside, there are serious issues involved. The Cultural Studies
    brigade attack not only the misuses to which science can be put, but scientific
    ways of thinking themselves; not only possible inequities in hiring and promotion,
    but logic, truth and the ‘Enlightenment project’. Gross and Levitt did an admirable
    job of sounding the alarm which Butterflies and Wheels plans to go on sounding.
    Norman Levitt very kindly agreed to answer some questions for us.


     


    Benson: Do you think the situation has improved since you wrote Higher
    Superstition?
    Do people seem any more embarrassed or self-conscious about
    writing ignorant absurdities? Or do they merely congratulate themselves all
    the more on how “transgressive” they are?


    Levitt: This is a complicated question in that it depends on the parameters
    one chooses to measure improvement or deterioration. My main motivation for
    writing HS was to alert scientists to the fact that bizarre views of
    science were being taught and fervently advocated in various enclaves of the
    humanities and social sciences. The main immediate danger, I thought, was that
    "science studies" imperialists were, in many schools, proposing that
    the "science requirement" for non-science students be replaced by
    courses in "science and society" or some such. This was an attractive
    proposition for some of the scientists, who view "Physics for Poets"
    and the like as joyless and time-wasting tasks. But since word has gotten round
    that science studies is a dubious cult and actively hostile to science to boot,
    the prospects for this kind of coup have pretty much crashed. Of course, HS
    can’t by any means claim all the credit. The real shocker was the Sokal Hoax,
    and there were other instructive flaps as well, such as the Science in American
    Life
    exhibit at the Smithsonian.


    Another fact worth noting with some satisfaction is that the enthusiasm for
    extreme constructivist claims regarding the nature of scientific knowledge has
    cooled considerably. Constructivist slogans are no longer reflexively adopted
    and mouthed in literature and sociology departments. This doesn’t mean that
    science studies has become any more worthwhile or intellectually responsible,
    or that its hostility toward science has lessened. But a lot of the rebellious
    glee has gone out of it, as has the smug delight in being outrageous.


    Most important, the ultimate ambition of many postmodern science-studies enthusiasts – that
    is, to become the primary mediators between science and political institutions
    (the commissars, as it were, of science and technology) – has largely been squelched.
    Embarrassing questions were raised far too early in the game, well before any
    successful infiltration of the corridors of power.


    On the other hand, alas, few of the more notorious academic anti-science celebrities
    have lost out, career-wise, as a result of being flayed by HS, A House
    Built on Sand, The Flight from Science and Reason, and so forth.
    There are some partial exceptions to this, most notably the failure of the silly
    attempts to appoint a science studies professor at the Institute for Advanced
    Study. But even the characters gulled by Sokal’s prank remain pretty much immune
    to retribution for their intellectual dereliction. Indeed, being attacked by
    one of those dreaded scientists in warpaint amounts to a crown of martyrdom
    that nicely adorns one’s curriculum vitae. Most so attacked have prospered
    quite nicely, thank you. One laments the injustice of it all!


    But then, we’re not talking about child rape or looting employee pension funds.
    It’s just the university culture being its customary silly self, which is hardly
    surprising or curable.


    Finally, something should be said about the leaching of postmodern antiscience
    attitudes into the more general culture. Some real damage has been done. Schools
    of education have picked up a bit of this nonsense, especially in connection
    with "constructivist" theories of pedagogy. Even worse, so have some
    schools of nursing, which invoke shoddy philosophy and pomo slogans to justify
    their flirtation with worthless alternative medicine dogmas and practices. Some
    of the same stuff works its way into environmental activism or into the kind
    of ethnic activism that has set up western science as an ideological enemy.
    Postmodern cant has also softened up many intellectuals for the renewed assaults
    of creationists, now taking form as "Intelligent Design Theory." (An
    example may be found in The Nation, the best-known American leftist journal,
    which recently ran a bizarre, effectively pro-ID review of Stephen Gould’s last
    book. It was written by an au courant literary critic with scant knowledge of
    biology but a thorough grounding in social-constructivist drivel.)


     


    Benson: Are undergraduates aware of the controversies? Do science teachers
    have to waste much energy combatting trendy notions about the situatedness of
    truth, or is that one area where ignorance is an advantage?


    Levitt: Speaking as a mathematics teacher, I waste a lot of energy,
    perhaps, but not in order to combat postmodern attitudinizing. In lower-level
    courses, I talk and I write stuff on the blackboard, the undergraduates listen
    and take notes (or not). Hence, there’s little interchange that isn’t initiated
    by "Is this going to be on the exam?" In upper-level courses, things
    are a bit more relaxed and clubby, but students who get as far as upper-level
    mathematics are hardly the type to pay much attention to the archdruids of deconstruction,
    the pythonesses of feminist theory, or the jongleurs of multiculturalism.


    Nonetheless, one can get a sense of overall undergraduate attitudes toward
    the spectrum of experiences encountered at a university. I’m specifically talking
    about a large, multiplex American state university, which encompasses, academically,
    all sorts of activity, from hard-nosed engineering to the squishiest, touchy-feely
    "oppression studies," and which serves a host of other functions by
    way of baby sitting young people between late adolescence and the grim, inevitable
    day they have to go out and earn a living. Most students invest most of their
    ardor in having a helluva good time – sex, rock ‘n’ roll, and partying down, with
    the occasional foray to the football stadium or the basketball arena for those
    rituals of mass mindlessness. To the extent that they take education seriously,
    it’s as a gateway to the possibility of making a decent income – upper middle-class
    or better. So they fight like hell to get into the business and management programs,
    and sometimes do some serious grinding to qualify for law or medical school.
    But the issues that raise tempers in purely academic circles – the culture wars,
    the fights over "diversity," and all that rot – are not undergraduate
    issues, by and large.


    The PC/Pomo faction, to describe it as tersely as possible, certainly has a
    death-grip on a lot of instructional turf. Basic courses like expository writing
    are in their hands, and many students have to go through the mill of a pious
    course on "diversity" or some such. There they get a fairly strong
    whiff of academic-left doxology. But the upshot is not, on the whole, a cohort
    of enthusiastic recruits, but rather a mass of skeptical-to-cynical young people
    who have caught on to the fact that their instructors, or at least those who
    give pedagogical marching orders to their instructors, can often be prigs, bores,
    and bigots – and none too bright, at that. PC/Pomo preaching far more readily
    generates disdain for itself than hostility toward its declared foes. This is
    even true for black and Hispanic students. It is an obvious corollary that the
    quirky attitudes toward science common amongst the pomo faithful do not diffuse
    very far or very fast into mainstream undergraduate culture.


    This is not to say that undergrads are generally knowledgeable about science
    or that they have a sophisticated grasp of the canons of reliable knowledge.
    Even science majors have a shallow knowledge of science beyond what they’ve
    specifically acquired in courses. The non-science majors know little and care
    little about science and tend to be clueless and inarticulate when scientific
    matters arise at any level. But PC/Pomo demonology, per se, doesn’t seem to
    have been responsible for this.


     


    Benson: Is the issue still alive among the faculty? Are there arguments,
    debates, quarrels? Do you personally have to deal with bristling, indignant
    colleagues from “Cultural Studies” and such who are outraged by your work?


    Levitt: The basic story is this: A certain camp within the PC/Pomo enterprise
    made a fetish of "science criticism" and, at its high water mark 10
    or 12 years ago, even dreamt of becoming a powerful oracular presence on the
    societal scale, a major player in determining science and technology policy.
    However, it couldn’t fly below the radar forever; its vanguard was spotted and
    chased back to the starting line fairly easily. The "science wars"
    have dissolved such susceptibility to its wiles as might have existed among
    scientists and science administrators. But its own turf is pretty secure, thanks
    to the inertia of the academic world. Its members have their little club and
    retain their power to praise and promote each other for as long as the money
    holds out (which it will for some time to come).


    Hurt feelings persist, and folks like Paul Gross and me, not to mention Sokal,
    are still excoriated for our villainy and obtuseness, evoking imprecations from
    many courses and published papers (and a good thing, too, since it keeps our
    books in print!). Yet people in the science-studies racket have also grown more
    prudent; they are chary of making outrageous epistemological claims with flags
    flying and trumpets blaring the way they used to on a daily basis. Their dreams
    of exercising actual political power over science and scientists are on the
    back shelf. But the academic cult as such keeps grinding away on its own narrow
    terms.


    Debates do continue, in some sense, but they have moved to such arenas as schools
    of education and – quite frighteningly – into medicine, where bits and pieces of the
    radical science-studies litany are now and then recruited to defend quackery.
    But, as I see it, the main thing to keep in mind is that the academic assault
    on science began, when you get down to it, because antiscientific attitudes
    widespread in the general culture penetrated the academy, becoming over time
    increasingly stark and explicit in the thinking of intellectuals who think of
    themselves as radically opposed to middle-class politics, culture, and values.
    In the universities, this antagonism acquired a particular rationale – philosophies
    propounding the situated and socially-constructed character of knowledge claims
    and the malign effects of a Eurocentric episteme. It also acquired a vocabulary
    and a certain characteristic rhetorical style. At that point, it leached back
    into the wider culture, slightly altering the rhetoric, but not necessarily
    the essential substance, of demotic antiscience. But it is well to remember
    that the basic problem is not that of an assault on rationality by a cabal of
    reckless university intellectuals; the assault was going on long before these
    guys came on the scene, and will continue even if each and every poststructuralist
    and feminist-epistemologist were to convert overnight to logical positivism.


    As for my personal experiences; obviously there are people who don’t like me
    at all because of HS et seq. Once in a while I get roundly attacked
    at a conference where I’m speaking. But this is hardly terrifying; superannuated
    as I am, I love being denounced as a dangerous character. In defense of my own
    institution, Rutgers University, I must say that it has treated me rather well
    as a result of my involvement in the science wars (while treating some of my
    local enemies quite well, too!) Post 9/11, we – scientists and intellectuals along
    with all the rest – seem to be headed into "interesting times," which
    takes a lot of the ginger out of academic quarrels. All this science-wars stuff
    may be a fading irrelevancy within a few years.


     


    Benson: If some faculty members are indignant, are others as it were
    recruited? Do you find new allies in departments such as History, Sociology,
    Anthropology, where evidence and the validity of evidence are highly relevant?
    If so, could this be a hopeful sign? Could one by-product be a heightened awareness
    in many disciplines of the need to ground truth-claims and knowledge-claims
    with evidence and logic?


    Levitt: Though one could hardly call it a mass movement, there has been
    a recurrent "Rally round the flag, boys!" mood amongst some scholars
    in areas besieged by relativism and anti-rationality, with the serio-comic episode
    of the science wars providing some ground for renewed hope that terminal silliness
    will not prevail. I’ve been in touch with quite a few of these people in areas
    like psychology, anthropology, and even philosophy. It’s no longer quite so
    easy to sneer at words like "evidence" and "objectivity,"
    and, to a certain extent, it’s become possible to sneer at the sneerers. Things
    are changing slowly, but largely they are changing for the good. The project
    of serious inquiry in all kinds of fields is in better shape than it was a few
    years ago, and the intellectual fops are no longer quite so sure of their ground.


     


    Norman Levitt is Professor of Mathematics at Rutgers University. He is author,
    with Paul R. Gross, of
    Higher Superstition: The Academic Left and Its Quarrels
    with Science (The Johns Hopkins University Press, 1997). Edward O. Wilson
    said about this book that it was "original" and "brilliant".

  • Postmodernism and History

    Postmodernism comes in many guises and many varieties,
    and it has had many kinds of positive influences on historical scholarship.
    It has encouraged historians to take the irrational in the past more seriously,
    to pay more attention to ideas, beliefs and culture as influences in their own
    right, to devote more effort to framing our work in literary terms, to put individuals,
    often humble individuals, back into history, to emancipate ourselves from what
    became in the end a constricting straitjacket of social-science approaches,
    quantification and socio-economic determinism.


    But this is postmodernism in its more moderate
    guise. The literature on postmodernism usefully distinguishes between the moderate
    and the radical. What I call radical postmodernism takes its cue from another
    post, post-structuralism, roughly speaking the idea that language is arbitrarily
    constructed, and represents nothing but itself, so that whenever we read something,
    the meaning we put into it is necessarily our own and nobody else’s, except
    of course insofar as our own way of reading is part of a wider discourse or
    set of beliefs.


    It must be obvious that this idea has a corrosive
    effect on the discipline of history, which depends on the belief that the sources
    the historian reads can enable us to reconstruct past reality. It is just this
    idea that many post-structuralists have attacked. Alan Munslow, for example,
    proclaims: ‘The past is not discovered or found. It is created and represented
    by the historian as a text.’ Keith Jenkins believes that ‘history is just ideology’.
    And Hans Kellner complains that historians ‘routinely behave as though their
    researches were into the past, as though their writings were about “it”, and
    as though “it” were as real as the text which is the object of their labours.’
    The past is unknowable; all we can know about is historians’ writings, so history
    disappears and we are left with historiography as a species of literary endeavour.
    What historians write depends on their own purposes and their own point of view,
    and there is no way of deciding whether one representation of the past is true
    and another, contradictory one, untrue.


    Arguments such as these are thoroughly self-contradictory,
    however. If the statement, commonly made by postmodernists, that truth is always
    relative to a particular society or culture or group in society, is true, then
    it is true in an absolute sense, not a relative one, since as a statement, it
    must hold good for all societies and cultures. Similarly, when postmodernists
    claim that nobody has access to the truth, they must believe that this is in
    fact a true statement, so the person making it does have access to the truth.
    If texts are given meaning by the reader and not the writer, then why
    have so many postmodernists complained that when I have criticized them I have
    been basing my criticisms on a misrepresentation of what they have written?
    Presumably postmodernists believe that the texts they are writing are
    not capable of an infinity of interpretations, that they make their meaning
    unmistakeably clear so that the reader is left with only one way of interpreting
    it. Again, therefore, the postmodernist proposition refutes itself.


    All of these points are in the end fairly obvious.
    Postmodernism of the Jenkins/Munslow variety shows what one might call a naive
    cynicism that is too simple-minded to cope with doubt and imperfection. Let
    me illustrate this by looking at the concept of Truth, a term one usually
    finds in post-structuralist writings placed inside a cordon sanitaire of
    quotation marks, as if it would cause some horrible infection of old-fashioned
    empiricism in the writer or reader if it was let out.


    Of course it is right to say that we can never
    know the whole or absolute truth about anything in the past. But just because
    we can never attain the whole or absolute truth, just because we make mistakes
    in our search for the truth about the past, just because there will always be
    something new to say about any historical subject, it does not follow that there
    is no such thing as the truth at all. In a similar way, just because what is
    accepted as true isn’t necessarily so, does not mean that the concept of the
    truth itself is merely ideological. Truth, as I noted earlier, is not relative
    to perspective, though what is accepted as true is; ‘a statement is true
    if and only if things are as it represents them to be.’ So there cannot be incompatible
    truths; after all, ‘incompatible’ actually means ‘cannot be jointly true’.


    So if we claim that there is no such thing as
    truth, then either that statement is true, in which case there is such a thing
    as truth, or it is not true, which amounts to the same thing. The point is,
    of course, that postmodernists passionately want us to believe that what they
    are saying is true and objective, even when they say that nothing anybody says
    is true and objective.


    You’ll notice that I’ve finally introduced the
    term ‘objective’ here. This has been the source of a lot of confusion. It does
    not mean the same as absolutely, completely and irrefutably true, and postmodernists
    who say it does are setting up a target deliberately manufactured so that it can
    be knocked over without too many problems. Objectivity does not really
    have this strong meaning; it generally means, fairly obviously, a perspective
    or representation deriving from something external to the mind, the object,
    rather than from the mind of the person doing the representing, the subject.
    We see a car coming towards us as we’re crossing the road, and we recognise
    it as an object, so we jump out of its way. The evidence that it’s coming is
    provided by our senses, sight and hearing, possibly also smell, though hopefully
    it’s not such a close call that we have to involve touching and feeling as well.


    When we read a historical document, or look at
    an archaeological site, we can’t read into it, or see in it, anything we want
    to. We can read it for a variety of purposes and in a variety of ways, but the
    possibilities are not unlimited. We bring to our sources all kinds of theories,
    ideas, beliefs, questions, and the more conscious we are of them, the better,
    but what happens when all of this comes into contact with the sources is a dialogue,
    a two-way process, not the simple one-way imposition of our own views on a blank
    sheet of paper or an empty piece of ground. We can argue about what the sources
    tell us, but it’s not the case that one interpretation is always going to be
    as good as another; some are more persuasive than others because they achieve
    a better fit with the evidence, and sometimes, some are actually right and others
    wrong.


    Postmodernists like Keith Jenkins and Frank Ankersmit
    have tried to respond to points like these by insisting that there is a huge
    difference between historical fact and historical interpretation. Facts are
    easy to establish, it’s interpretation that is the problem. Postmodernists,
    Jenkins has recently claimed, have never argued that there was no ‘cognitive
    element in history at the level of the individual statement, only that certainty
    and objectivity were impossible at the level of interpretation (narrative discourse)’.
    But enormous amounts of postmodernist ink have been spilled on trying to prove
    that documents are so unreliable you can never tell anything from them at all,
    that we can never recover the intentions of their authors, and so on. It is
    a fundamental premise of postmodernist critiques of history that a document
    is re-invented and re-interpreted every time someone looks at it, so that it
    can never have any fixed meaning at all. If this claim doesn’t mean that we
    can never use documents to find out basic historical facts, then it doesn’t
    mean anything at all.


    The point here is that it is not really possible
    to distinguish so sharply between fact and interpretation in history as this.
    There’s an element of interpretation, however small, involved in the establishment
    of even the most basic historical facts. It’s not in practice possible to draw
    a clear line between fact and meaning in history; rather, it’s a sliding scale,
    so that the larger the fact, or ensemble of facts, the historian wants to establish,
    the larger the element of interpretation involved. The ultimate test of any
    historical statement is the extent to which it fits with the evidence, but just
    because no fit is ever perfect, just because no fact can be established as anything
    more than an overwhelming probability, doesn’t mean that we can naively and
    impatiently discard all historical statements as mere inventions of the historian.
    Let’s have a bit of subtlety and sophistication here, qualities that postmodernists
    are always urging us to adopt.


    Let me make this a bit clearer by giving you an
    example. In the David Irving libel trial held two years ago, in which I served
    as an expert witness for the High Court in London, Irving was suing Penguin
    Books and their author Deborah Lipstadt for calling him a Holocaust denier and
    a falsifier of history. It was not difficult to show that Irving had claimed
    on many occasions that no Jews were killed in gas chambers at the Auschwitz
    concentration camp. He argued in the courtroom, however, that his claim was
    supported by the historical evidence. The defence therefore brought forward
    the world’s leading expert on Auschwitz, Robert Jan Van Pelt, to present the
    evidence that showed that hundreds of thousands of Jews were in fact killed
    in this way. Van Pelt examined eyewitness testimony from camp officials and
    inmates, he looked at photographic evidence of the physical remains of the camp,
    and he studied contemporary documents such as plans, blueprints, letters, equipment
    orders, architectural designs, reports and so on. Each of these three kinds
    of evidence, as the judge concluded, had its flaws and its problems. But all
    three converged along the same lines, creating an overwhelming probability that
    Irving was wrong.


    Just as important as this was the fact that it
    was possible to demonstrate that Irving’s historical works deliberately falsified
    the documentary evidence in order to lend plausibility to his preconceived arguments,
    principally his belief that Hitler was, as he said on one occasion, ‘probably
    the best friend the Jews ever had in the Third Reich’. Falsifying documents
    involved not just leaving words out from quotes but even putting extra words
    in to change the meaning. For example, quoting an order from Himmler that a
    ‘Jew-transport from Berlin’ to the East should not be annihilated as if it were
    a general order that no Jews at all, anywhere, were to be killed, by the simple
    expedients of adding an ‘e’ to the German word Transport, making it plural,
    and omitting the words ‘from Berlin’, and hoping that other researchers wouldn’t
    trouble to check the source, or if they did, wouldn’t be able to read the handwriting
    (which is actually very clear and unambiguous). Or by adding the word ‘All’
    to the note of a judge at the Nuremberg Trial in 1946 on the testimony of an
    Auschwitz survivor which actually said ‘this I do not believe’, after a small
    part of her testimony, to make it look as if he did not believe any of it. If
    we actually believed that documents could say anything we wanted them to, then
    none of this would actually matter, and it would not be possible to expose historical
    fraud for what it really is.


    This brings me to my final point about the postmodernist
    positions I’ve been describing, and that is, postmodernists tend to think of
    themselves as left-wing, and their views as liberating and emancipatory, but
    in fact they are none of these things at all. Postmodernist hyper-relativism
    has no political implications of a positive kind at all. If history really is
    nothing more than propaganda, then there’s nothing to say it has to be left-wing
    propaganda, it can just as easily be right-wing propaganda, or racist propaganda,
    or neo-fascist propaganda, as the High Court in London decided in the end that
    David Irving’s writings were. If we don’t believe it’s possible to distinguish
    between truth and falsehood, then we have no means of exposing racism, antisemitism,
    and neo-fascism as doctrines of hate built on an edifice of lies, indeed we
    have no real means of discrediting them at all. We can say of course that we
    disapprove of them in moral and political terms, but neo-fascists can just put
    forward opposing moral and political arguments of their own in response, and
    in the end there are no objective criteria by which we can choose between the
    two positions.


    What the Irving trial showed in the end was the
    ability of historians to come to reasoned and persuasive conclusions about the
    past on the basis of a fair-minded and objective examination of the evidence.
    It didn’t show that the evidence in question was totally flawless, but it did
    show that attempts to discredit it rested on demonstrable forgery and falsification.
    If there is such a thing as historical untruth, there must also be such a thing
    as historical truth. And if there is such a thing as a biased, tendentious historian
    who tried to support preconceived ideas about the past by a selective use of
    the evidence and by doctoring the documents, there must be such a thing as an
    objective historian who puts preconceived ideas about the past to the test of
    whether or not they are supported by the evidence, and modifies or abandons
    them if they are not.


    Contribution to the ‘Great Debate on History
    and Postmodernism’, University of Sydney, Australia, 27 July 2002