Category: Articles

Welcome to our articles section. The articles below either have been written specifically for ButterfliesandWheels or are appearing here having been published elsewhere previously.

If you’re interested in writing an article for ButterfliesandWheels, please click here for our information for contributors page.

  • A Curious Accident in Space-Time

    Despite the lack of evidence to support the existence of extraterrestrial intelligence, many people firmly believe in it. If you are skeptical on this matter you are likely to be accused of being arrogant, anthropocentric or even a religious fanatic. However, to consider the possibility that we might be alone in the universe doesn’t necessarily make you any of those things. You can believe both that humans are rare or unique and at the same time that they are a purposeless arrangement of matter or a curious accident in space-time.

    In 1961 the astronomer Frank Drake announced that the number of extraterrestrial civilizations in our galaxy that might contact us could be calculated with the following equation:

    N = R × fp × ne × fl × fi × fc × L

    where N is the number of communicative civilizations, R is the rate of formation of suitable stars, fp is the fraction of those stars with planets, ne is the number of Earth-like planets per solar system, fl is the fraction of planets with life, fi is the fraction of planets with intelligent life, fc is the fraction of planets with communicating technology, and L is the lifetime of communicating civilizations.

    Many people think that this equation actually proves the existence of extraterrestrial intelligence, and some even believe that a close encounter of the third kind could be just around the corner. However, the truth of the matter is that there is no scientific evidence that intelligent life exists anywhere beyond Earth, and the only factor in this equation that can be calculated with some certainty is R, the rate of stellar formation (1). Numbers for the other components are the product of the creative speculations of astronomers, SETI researchers and Star Trek fans.
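    The point about speculative inputs is easy to see by simply multiplying the factors through. The sketch below does exactly that; every value plugged in is an arbitrary illustration of my own, not a figure from the article or from any survey, and the function name is just a convenient label:

    ```python
    def drake(R, fp, ne, fl, fi, fc, L):
        """Drake's estimate N = R * fp * ne * fl * fi * fc * L of the
        number of communicative civilizations in the galaxy."""
        return R * fp * ne * fl * fi * fc * L

    # Two made-up sets of guesses for the unknown factors. Only R, the
    # stellar formation rate, is constrained by observation; the rest are
    # pure speculation, which is why N swings across many orders of magnitude.
    optimistic = drake(R=10, fp=0.5, ne=2, fl=1.0, fi=0.1, fc=0.5, L=10_000)
    pessimistic = drake(R=10, fp=0.5, ne=2, fl=0.001, fi=1e-6, fc=0.1, L=1_000)

    print(optimistic)   # 5000.0 -- a galaxy full of chatty neighbors
    print(pessimistic)  # 1e-06  -- effectively alone
    ```

    The same arithmetic yields thousands of civilizations or essentially none, depending entirely on which guesses you feed it; the equation organizes speculation rather than settling it.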

    Of course, there’s nothing wrong with speculation (or with being a Star Trek fan). After all, if speculation is based on concrete facts and is not just a wild guess, it’s part of science. However, when it comes to evolution the facts are frequently misunderstood. People, including some scientists, tend to regard it as a linear process instead of as a tree of increasing complexity. Many assume evolution works towards achieving a certain goal, like intelligence. For instance, Lemarchand says “The principle of mediocrity suggests a logical progression: the emergence of life will lead to the emergence of intelligence, which will give rise to interstellar communications technology” (2). In the case of Drake’s equation these misconceptions can lead to fi and fc being hugely overestimated. Carl Sagan, for example, considered a guesstimate of one million possible civilizations in the galaxy “to be conservative” (2).

    It is true that wherever life emerges in the universe it’s likely to evolve according to the same rules. However, as Alan Turing explained, incredibly complex and diverse patterns can come into being by following very simple rules. In the same way that no two trees in a forest are identical in foliage or number of branches, there cannot be two identical evolutionary histories (unless they exist in some kind of bizarre parallel universe).

    Similarly, although there are billions of us, all built from the same DNA instructions, we’re all unique (even identical twins). Just as you wouldn’t be yourself if a series of interrelated factors and fortunate events (or unfortunate ones, depending on your self-esteem) had not taken place (for instance, your father meeting your mother, your father’s condom breaking, you being born a boy with green eyes, surviving meningitis, developing a twisted sense of humor, deciding to study philosophy, etc.), we may say that intelligence might never have appeared if a sequence of events and a series of factors had not occurred and interacted in the way they did. A single event, like the asteroid that killed the dinosaurs 65 million years ago, can conspire against or in favor of entire species and permanently modify the structure of evolution’s tree. We don’t know, but the number of events that led to intelligence could be larger than the number of stars in the universe, and the interaction of factors necessary for it to evolve more complex than your girlfriend’s moods. Hence, we can say that timing, luck and the interplay of biological and environmental factors are critical aspects of evolution.

    Even though it can be argued that intelligence, as the ability to gather, process and act on information, is such a useful and common trait (apparent even in slime molds) that it’s likely to evolve elsewhere, the kind of ability you need to build civilizations and technology, and to be aware of it, is in fact rare. Among the billions of species that have evolved on the planet, perhaps as many as 50 billion, we are the only one that has developed that kind of ability. Furthermore, as Jared Diamond has pointed out, compared to other more successful species like rats and beetles, this feature doesn’t seem to be the best way to take over the world.

    So, if the same sequence of events is unlikely to play out the same way elsewhere, if the kind of intelligence you need to build civilizations and technology is rare even on our own planet, and if there’s no actual evidence to support the existence of ET intelligence, there might be reason enough to be a bit skeptical about having an interstellar chat with any space being in the near future.

    The problem is that skeptics are often accused of being led by illegitimate motivations such as arrogance, anthropocentrism or religious beliefs. Of course, in some cases that can be true. However, what’s also true is that there are several moral stances and mistaken assumptions behind the “we are not alone” argument and behind these accusations.

    On the one hand, there’s a kind of “IQ relativism” based on the notion that “there are many forms of intelligence, all different but equally good, valid and/or complex”. The idea of intellectual diversity is used to maintain that there’s nothing special about us, that intelligence is a standard outcome of evolution and therefore species like ours are likely to evolve elsewhere (yes, this may be the herald of intergalactic political correctness; we should perhaps start calling aliens “intellectually-diverse beings” so that they don’t get mad in case they’re listening).

    IQ relativists assume that if you think extraterrestrial intelligence is unlikely, it is because you somehow believe humans are superior and, of course, that’s arrogant. However, one thing doesn’t necessarily entail the other. Rarity or uniqueness is not the same as superiority. You can believe that cephalopods are also fascinatingly unique, and that doesn’t mean you think they are superior. Moreover, it can be argued that viewing intelligence as an inevitable outcome of evolution is what’s truly arrogant.

    On the other hand, there are the Galileans, who react against anthropocentrism by assuming that if you are skeptical about the existence of extraterrestrial civilizations you automatically believe humans are the center of the universe. Thus, you must be some kind of religious fanatic or creationist freak who claims that we are God’s favorite creatures or the supreme objective of “Intelligent Design”.

    However, there’s a difference between thinking that human-like intelligence could be exceptional and thinking that the heart of the universe is in Alabama or that we are the preferred children of some supernatural being. Again, “unique” doesn’t mean “central” or “most important”. Exceptions are also part of nature. And although Homo sapiens could be unique in the universe, so could cephalopods, and that doesn’t make them God’s masterpieces. What’s more, it could be said that:

    • Thinking extraterrestrial intelligence is in some way human-like (i.e., having civilizations and technology) is in fact what’s anthropocentric. (By the way, if they are really like us, are they also arrogant and think they are the center of the universe? Maybe that’s why they haven’t bothered to call, and that would explain Fermi’s paradox.)
    • Believing in extraterrestrial intelligence is as superstitious as believing in God, because there’s no evidence for the existence of either.

    In short, to consider the possibility that we might be alone in the universe doesn’t necessarily mean you are arrogant, anthropocentric or irrational. You can believe both that humans are unique or rare and at the same time that they are a purposeless arrangement of matter, a curious accident in space-time.

    References

    (1) Shermer, M., 2002, ‘Skeptic: Why ET Hasn’t Called’; Scientific American Magazine; August 2002; www.sciamdigital.com

    (2) Lemarchand G., 1998, ‘Is There Intelligent Life Out There?’; Scientific American Presents; Exploring Intelligence; www.sciamdigital.com

    (3) Darling, D. The Encyclopedia of Astrobiology, Astronomy, and Spaceflight; www.daviddarling.info/encyclopedia/M/mediocrity.html

    (4) Darling, D. The Encyclopedia of Astrobiology, Astronomy, and Spaceflight; Fermi’s paradox

  • Old News You Can Use: the denaturing of history

    Who controls the past controls the future. Who controls the present controls the past.

    George Orwell, 1984

    If there were a poll assessing the least favorite subject taught in high school, I would have to put my money on history or its more au courant euphemistic title, “social studies”. If history is not the clear-cut winner, it would certainly be among the top three – my choice, mathematics, I suppose, would also be a strong contender.

    The chronic complaint against history as a subject, you will hear from most Americans, is that it is “old news”. In our up-to-the-minute, media-saturated culture this is an undeniable fact. “That was soooo last year,” is perhaps a bit exaggerated, but hardly far from describing the willful amnesia of most young people today. With students more concerned with the staggering demands of the present tense, is it any wonder they find that knowing the Battle of Antietam took place on September 17, 1862 is of little or no material use in their lives? In fairness, I can think of no occasion where my knowing the date of the bloodiest day in U.S. history has put food on my table or helped to pay the electric bill. The meticulous chronology of momentous dates, more often than not, takes on the appearance of a sadistic ritual perpetrated by underpaid civil servants bent on making their charges suffer for the mistake in their career choice. While mathematics might be equally hated, it at least redeems its existence in the popular consciousness if for no other reason than it is reckoned to be necessary for the development of new and faster video games.

    So what if Johnny and Susie, as the song says, “don’t know much about history”, is it really such a big deal? This is a common response from parents and by extension, school boards – who, in all probability, “don’t know much about history” themselves. (After all, only 49 percent of American adults could identify the Soviet Union as an ally in World War II.) Perhaps not, but it is a peculiar response indeed from a nation that, according to pollsters, places such a high emphasis on what is obliquely referred to as “traditional values”.

    Or, to refer to my earlier supposition, would it not be within the purview of traditional values to know what exactly led 3,600 Americans to their deaths on the killing fields of Maryland in the autumn of 1862? Perhaps knowing the details of a battle that took place 143 years ago might give a sense of proportion to more recent events, most notably the horror of 9/11. Wouldn’t our children benefit from the knowledge that there have been other periods in our history when our future looked frightening? Had there been a clear-cut victory for the Confederacy in a northern state, the British were prepared to intervene on their side and we might have had an entirely different country today. Most of the heavy casualties (over 23,000, north and south combined) were sustained in a four-hour period, nine times those of Omaha Beach in the Second World War (of “Saving Private Ryan” fame, to give the obligatory pop culture citation). Regrettably, few of our children know this; in fact, the majority of them are hard-pressed to name the century in which the greatest danger our nation ever faced, the Civil War, took place.

    In his 1998 essay, “Goodbye to all that: why Americans are not taught history”, Christopher Hitchens found some ghastly statistics:

    According to the last ‘National Assessment of Educational Progress in U.S. History,’ which was undertaken in 1994, we can no longer call upon the traditional schoolmarm concept of history as a pageant, or even as one damn thing after another. In order to argue against this caricature, you would need to know at least the official reason why Pilgrims and Puritans first voyaged to America, which 59 percent of fourth graders were unable to do. You would certainly need to be able to name one of the original thirteen colonies, which was beyond the capacity of 68 percent of that grade. By the eighth grade, matters have got worse, as they are bound to do. Ninety percent of eighth graders could recount nothing of the debates at the Constitutional Convention. Even when prompted by mentions of Yalta, Lend-Lease, and Hiroshima, 59 percent of the eighth grade were unprepared to say which conflict these references brought to mind. In the twelfth grade, 53 percent looked blank when invited to specify “the goal that was most important in shaping United States foreign policy between 1945 and 1990.”

    There is little sign that things have improved. In fact, the national amnesia Hitchens writes about sheds light on a recent comment from Hodding Carter III:

    These results are not only disturbing; they are dangerous…Ignorance about the basics of this free society is a danger to our nation’s future.

    The results that Carter, president of the John S. and James L. Knight Foundation, decries are the findings of the “Future of the First Amendment” research project conducted under the foundation’s auspices. The comprehensive study which surveyed over 112,000 students across the United States found some disturbing trends, exonerating Mr. Carter of the accusation that he was being chickenlittleish in his assessment. A few of these key findings include:

    • High School Students express little appreciation for the First Amendment. Nearly three-fourths say either they don’t know how they feel about it or take it for granted.
    • Students are less likely than adults to think that people should be allowed to express unpopular opinions and only fifty-one percent think newspapers should be allowed to publish freely without government approval of stories.
    • Students lack knowledge and understanding about key aspects of the First Amendment. Seventy-five percent incorrectly think that flag burning is illegal. Nearly half erroneously believe the government can restrict indecent material on the internet.
    • Administrators say student learning about the First Amendment is a priority, but not a high priority.

    This leaves one wondering if our students are learning their civics lessons in 1984’s infamous Room 101. Suddenly, in this context, the garbled outpourings of Pop Tart Britney Spears on the Tucker Carlson show, “Honestly I think we should just trust our president in every decision he makes and should just support that, you know, and be faithful in what happens”, no longer seem like those of a superfluous bimbo but rather those of the spokesperson of her generation.

    If, as Hitchens contends, “the measure of an education is that you acquire some idea of the extent of your ignorance,” why is there not more of an outcry about the dismal performance of U.S. students? Perhaps his next statement could be part of the answer: “…it seems at least thinkable that today’s history students don’t quite know what subject they are not being taught.” It does not help that, according to the National Center for Education Statistics, fewer than 19 percent of high school and middle school social studies teachers had majored (or minored) in history.

    Diane Ravitch’s The Language Police perhaps gives a clearer picture of why schools do such a bad job of firing the imaginations of young scholars in pursuit of history. Compared to the periodic conflagrations that erupt with the regularity of a herpes infection in biology’s evolution/creationism debate or the shamefaced prudery of a faux pas in the sex education class, the trench warfare of history teaching is particularly grinding. Partisans of every stripe weigh in with strident campaigning for their particular “narrative”; Native Americans, conservatives, feminists, Afrocentrists, and environmentalists – to name but a few – lunge and parry, form strange alliances and undo any systematic attempt to develop a comprehensive, or for that matter, coherent plan for teaching history.

    Meanwhile, textbook publishers, whose job it is to sell books, and school administrators, whose job it is to, well, administrate, have firmly staked out the no-man’s land amid the shifting battle lines. Fearing political retribution, the ever-dreaded lawsuit, or still worse, no sales, there is a silent conspiracy of self-censorship and an ardent striving for superficiality. The reasoning, I suppose, is: is history really worth all of this? The result is bland pablum as nutritious as the sugarcoated breakfast cereals their increasingly overweight customers hurriedly consume before climbing onto the school bus.

    Ravitch, in a chapter appropriately entitled “History: The Endless Battle”, concisely elucidates the minefield that ill-prepared teachers (remember, the majority of history teachers have never studied history) step onto in our results-oriented and multiculturally sensitive classroom:

    The states that ignore content are very prescriptive about the skills that students must learn. They call on students to do research, use technology, evaluate information, discover relationships, solve problems, work in teams, communicate, and exercise minutely specified “critical thinking skills.” But they leave blank the historical knowledge to which these skills should be applied.

    With that said, is it any wonder that Hitchens finds his own children “could not tell Thomas Jefferson from Thomas the Tank Engine”?

    Ravitch rails against the “multicultural steering committees” of the left and the “family values” types of the right and their overweening concern for the feelings of their constituencies:

    Historians, like writers of fiction, must be able to write what they know, based on evidence and scholarship, without fear of the censor and without deference to political, religious, ethnic, or gender sensitivities.

    The late Neil Postman argued that history is more an idea than a subject, or rather a meta-subject, and the “single most important idea for our youth to take with them into the future.” Postman argues that all subjects have a history or histories: science and its attendant branches, literature, music, etc. Without the overarching idea of history, it is difficult indeed to benchmark progress (or the lack thereof), and we are left with a vacuous temporality inhibiting real problem-solving skills. Hitchens found this in his own teaching experience:

    Since you can’t teach the American literary canon (indeed, you can’t even teach people to deconstruct it) without some reference to historical context, I began every class with an abbreviated introduction about the period in which the author was writing. I still have my notes and papers sent me by my students, asking why they had to get all the way to college before anyone bothered to fill in this nagging blank.

    Yes, the nagging blank. As if the fictional “memory hole” of Orwell’s dystopia had come to pass without any perspective as to the when or where of its happening. The conservative philosopher George Santayana addressed the danger of this lack of retentiveness, in response to what Leon Edel, Henry James’s biographer, referred to as “America’s cult of impermanence”:

    Progress, far from consisting in change, depends on retentiveness. When change is absolute there remains no being to improve and no direction is set for possible improvement: and when experience is not retained, as among savages, infancy is perpetual. Those who cannot remember the past are condemned to repeat it. In the first stages of life the mind is frivolous and easily distracted, it misses progress by failing in consecutiveness and persistence. This is the condition of children and barbarians, in which instinct has learned nothing from experience.

    If history is the “single most important idea for our youth to take with them into the future,” having only 51 percent of our young believing that newspapers have the right to publish stories without government approval, that future is looking increasingly bleak indeed.

    ©2005 Barney F. McClelland at As I Please

  • ‘More Than a Stretch’

    The first casualty of David Horowitz’s effort to impose ideological “diversity” on American campuses has been the truth. Horowitz initially supported his proposal for an Academic Bill of Rights (ABOR) with “independent” studies pointing to a vast predominance of “leftists” on American campuses. As I pointed out last September, neither of the studies in question seems to be independent of Horowitz’s own Center for the Study of Popular Culture. (Nor does the help of the notoriously mendacious Frank Luntz serve as any guarantee of credibility.) In a subsequent exchange with me, Horowitz underwent breathtaking contortions in an effort to back out of bogus claims he had made in support of his ABOR campaign. For instance, in order to deny that he had repeatedly called for the institution of ideological “balance” on American campuses, Horowitz disclaimed all responsibility for a letter written in the first person, bearing his name and a photograph of his signature, and published in his own FrontPage Magazine. I responded with a reflection on “Academic vs. Horowitzian Truth Standards.”

    Despite the clear widening of his credibility gap, Horowitz has continued to propagate lies about the academy and the ABOR. Just last week, in an article posted on his Students for Academic Freedom website without comment, Horowitz is quoted as saying that university professors are a privileged elite who work between six and nine hours a week, eight months a year, for an annual salary of about $150,000. As anyone with even a passing knowledge of actual university life would know, this is a ‘fact’ that even Frank Luntz would have trouble substantiating.

    Let’s consider a second claim in greater depth. In “A Campaign of Lies” (a recent fulmination against the AAUP and the Council on Arab-Islamic Relations) Horowitz states that

    [w]hen I drafted the Academic Bill of Rights — and before I published it — I took pains to vet the text with three leftwing academics — Stanley Fish, Todd Gitlin and Michael Berube — and with Eugene Volokh, a libertarian law professor at UCLA, who is one of the nation’s leading experts on First Amendment law. Anything in the original draft of the Academic Bill of Rights that so much as irritated these gentlemen I removed.

    This case for a solid academic fan base is audacious even by Horowitzian standards. Could it be true that four genuine intellectuals — leading American scholars with nuanced and varied political views — are actually in favor of the ABOR’s radical diversity agenda? To get to the truth of the matter, I sent all four professors Horowitz’s claim, and simply asked whether they were indeed satisfied with the ABOR as it stands. All four graciously replied. While Volokh declined to comment on his advice to Horowitz, Fish, Gitlin and Bérubé provided me with detailed responses.

    Stanley Fish — an authority in both rhetoric and legal studies, famous for his contention that politics and academics don’t mix — responded by directing me to his year-old essay “Intellectual Diversity,” noting that it displays “the extent of both my agreement and disagreement” with the final ABOR. In this essay Fish tries to give Horowitz the benefit of the doubt, yet he determines that the ABOR’s plea for “intellectual diversity” is “the Trojan horse of a dark design.” According to Fish, “it is precisely because the pursuit of truth is the cardinal value of the academy that the value (if it is one) of intellectual diversity should be rejected.” Pointing to the dangers of two recent calls for “intellectual diversity,” Fish concludes that

    these are not examples of a good idea taken too far, but of a bad idea taken in the only direction — a political direction — it is capable of going. As a genuine academic value, intellectual diversity is a nonstarter. As an imposed imperative, it is a disaster.

    The responses from Todd Gitlin and Michael Bérubé are reprinted below. I leave it to the reader to determine whether the members of Horowitz’s supposed support group are indeed behind the ABOR, and to consider the source of the real “campaign of lies” surrounding the bill.

    Graham Larkin

    Stanford University, Department of Art & Art History
    CA-AAUP VP for Private Colleges and Universities

    ADDENDUM: Todd Gitlin and Michael Bérubé respond to David Horowitz’s claim that he removed “anything in the original draft of the Academic Bill of Rights that so much as irritated [them].”


    Todd Gitlin


    (from an e-mail to Graham Larkin, dated February 15 2005)

    In September 2003, David Horowitz sent me a draft of his Academic Bill. I objected explicitly to a provision that would have required the taping of hiring and tenure meetings for faculty, for scrutiny by university boards and others. We went around about the dangers of such surveillance and in the end he said he would remove that provision. To say that he removed “anything….that so much irritated” this particular gentleman, however, would be excessive. In fact, we did not correspond about the concept that state legislatures or other trans-university bodies should sit in executive or quasi-judiciary authority over faculty bodies charged with defending the academic freedom of students and faculty. I did, and do, object to interventions by such higher authorities, as is envisioned in his current campaigns directed at state legislatures. But the issue didn’t come up in our correspondence. So far as I understood matters then, it was Horowitz’s intention to campaign for university resolutions, not legislative interventions.


    By the way, you might be interested in a piece I have on the current Academic Bill campaign forthcoming in the March/April issue of Mother Jones.

    Michael Bérubé

    (from an e-mail to Graham Larkin, dated February 15 2005)


    I told David that the taping of hiring and tenure meetings was at once intrusive and counterproductive — that is, it would have the effect of making sure that no one said an honest word in those meetings, and conducting the real business of hiring/tenure committees in bars and bathrooms instead. Then David suggested that every candidate rejected for a job should be informed of the basis for the decision in writing, and I replied that David clearly hadn’t been serving on any hiring committees recently — otherwise he’d know how impossible it is to send personalized rejection letters to 500 or 1000 job applicants. So yes, David abandoned those two suggestions.


    But it’s more than a stretch for David to suggest now that I endorsed the final ABOR. In fact, I rather pointedly declined to sign it, as David asked me to, precisely because it would lead to all manner of absurd conclusions, under the seemingly benign banner of “diversity.” We should ask David if he really wants, for example, the al-Qaeda perspective on the Middle East more widely taught in American universities, because right now it is severely underrepresented. Brian Leiter put it best, I think:

    The real difficulty, of course, is that if you create rights, you also have to have remedies. And at some point even the genuinely dumb conservatives will notice that the Horowitz proposal will create causes of action for Marxist economists who can’t be hired by economics departments, for postmodernists who can’t get hired by philosophy departments, and on and on. And what is to stop Intelligent Design creationists from suing biology departments that won’t hire them? Or alchemists from suing Chemistry departments? You get the idea.



    And, of course, I object to “diversity” in academic departments and subjects being mandated by state legislators. Note that Ohio’s bill, introduced by State Senator Larry Mumper, prohibits instructors from “persistently” discussing controversial subjects. His examples of controversial subjects? “Religion and politics.” So that’s what Republican state senators want in Ohio — universities devoted solely to sports and weather.

    To join the fight against the Academic Bill of Rights, get involved with the AAUP, tireless defenders of academic freedom since 1915.

    ©2005 CA-AAUP

    This article was first published on the California AAUP website and is republished here by permission.

  • Rationalist International Bulletin # 140

    Vatican: The Kidnap Program

    After the end of the Second World War, the Vatican issued a secret order to the French church authorities, directing them to keep in their custody all baptized children from Jewish families who had been accommodated in Catholic homes and convents during the Nazi occupation of France. The Vatican had decided that these children should not be returned to their surviving Jewish parents, but handed over to Christian institutions to ensure their Christian education. This secret Vatican order, a document in French dated October 23, 1946, has recently been dug up by Italian church historians and was published in translation in January in the respected Italian daily Corriere della Sera. It triggered yet another controversy over Pope Pius XII, who stands accused of supporting the fascist regimes in Italy, Germany and Croatia during the Second World War. Despite his dubious role during the war, Pope Pius XII tops the Vatican’s list for beatification today. After the new controversy, however, there are demands that the canonization process be stopped and an independent committee be set up to determine how many Jewish children were kidnapped by the Catholic church in Europe after the war.

    The secret order gives detailed instructions on how to operate the kidnap program most effectively and secretly. Most important was to avoid resistance from the Jewish authorities: they should never be given any written reply to their queries, and they should be kept under the impression that each request for the restitution of a child was being carefully evaluated and decided individually. That way they would fight hundreds of allegedly individual decisions individually, missing the chance to put up a political fight against the centrally ordered, systematic kidnap program.

    “The decision was taken by the Holy Office and has the approval of the Holy Father”, says the document. But who actually wrote it, and who received it, is still under debate. Most likely, say historians, author and receiver were the same person: Angelo Giuseppe Roncalli, the then papal envoy to France, who was in any case chiefly responsible for the execution of the secret order. Roncalli, the later Pope John XXIII, is often called “il papa buono”, because he seemed to be one of the few popes in history without dirt on his robe. The document, which is to be published together with Roncalli’s French diaries next year, could change this reputation.

    Saudi Arabia: Women-free democracy!

    The ultra-conservative Islamic kingdom of Saudi Arabia is holding the first public elections in its history. In three phases, starting on 11 February with the capital Riyadh and its surroundings, half the members of the 178 local municipal councils across the kingdom will be elected. The other half will be nominated by the government. The absolute monarchy’s careful first step towards democracy, however, excludes more than half the Saudi population: women are barred from participating. Though the rules say that all citizens over 21 (except military personnel) are entitled to cast their vote, this election is an exclusively male affair. The streets of Riyadh are full of posters of male candidates, all of them wearing traditional Saudi headgear and addressing the “brotherly citizens”. In Saudi Arabia, headquarters of the Sunni Islamic world – much as in the Vatican, headquarters of the Catholic world – it goes without saying that “all citizens” means all men. But there are a few voices – male voices – proposing that next time women should also be allowed to cast their ballots.

    India: Pious spies

    Christian missionaries are very active in India’s Northeast. Many of them, however, are not exactly in search of lost souls. They are busy collecting information and filing reports. Some local people, becoming suspicious, found out that those “missionaries” are working for a project named “Ploughshare”, sponsored by the Institute of Peace and Conflict Studies, Canada. The “Ploughshare Project” is an in-depth study of the many insurgent groups operating in the area, of the action taken by the Indian government against them, and of the positions taken by the neighbouring states. The Northeast is a politically sensitive part of India and of special strategic interest for the whole of Southeast Asia. It comprises the states and union territories of Assam, Sikkim, Mizoram, Tripura, Manipur and Nagaland. Apart from a thin land corridor to mainland India, all its borders are international ones, with Bhutan, China, Burma and Bangladesh. This part of India is constantly troubled by insurgent groups that take advantage of its exposed position.

    Christian churches based in the USA and Canada are sending spying “missionaries” to the area to observe the political and military situation and file regular reports. The information they collect includes studies of the various insurgent groups – their members, equipment, aims and techniques – as well as studies of all “parties in the conflict” and details of their encounters and positions in the border region: the Indian Army, the Burmese Army, the Bhutanese Army, the Bangladesh Army, the Assam police including all special units and commandos, and so on. “Project Ploughshare” seems to be one of several spy rings operating under religious cover in this explosive area.

    Rationalist International Bulletin # 140. Copyright © 2004 Rationalist International.

  • Behe Jumps the Shark

    Nick Matzke has also commented on this, but the op-ed is so bad I can’t resist piling on. From the very first sentence, Michael Behe’s op-ed in today’s NY Times is an exercise in unwarranted hubris.

    In the wake of the recent lawsuits over the teaching of Darwinian evolution, there has been a rush to debate the merits of the rival theory of intelligent design.

    And it’s all downhill from there.

    Intelligent Design creationism is not a “rival theory.” It is an ad hoc pile of mush, and once again we catch a creationist using the term “theory” as if it means “wild-ass guess.” I think a theory is an idea that integrates and explains a large body of observation, and is well supported by the evidence, not a random idea about untestable mechanisms which have not been seen. I suspect Behe knows this, too, and what he is doing is a conscious bait-and-switch. See here, where he asserts that there is evidence for ID:

    Rather, the contemporary argument for intelligent design is based on physical evidence and a straightforward application of logic. The argument for it consists of four linked claims.

    This is where he first pulls the wool over the reader’s eyes. He claims the Intelligent Design guess is based on physical evidence, and that he has four lines of argument; you’d expect him to then succinctly list the evidence, as was done in the 29+ Evidences for Macroevolution FAQ on the talkorigins site. He doesn’t. Not once in the entire op-ed does he give a single piece of this “physical evidence.” Instead, we get four bald assertions, every one false.

    The first claim is uncontroversial: we can often recognize the effects of design in nature.

    He then tells us that Mt Rushmore is designed, and the Rocky Mountains aren’t. How is this an argument for anything? Nobody is denying that human beings design things, and that Mt Rushmore was carved with intelligent planning. Saying that Rushmore was designed does not help us resolve whether the frond of a fern is designed.

    Which leads to the second claim of the intelligent design argument: the physical marks of design are visible in aspects of biology. This is uncontroversial, too.

    No, this is controversial, in the sense that Behe is claiming it while most biologists are denying it. Again, he does not present any evidence to back up his contention, but instead invokes two words: “Paley” and “machine.”

    The Reverend Paley, of course, is long dead and his argument equally deceased, thoroughly scuttled. I will give Behe credit that he only wants to turn the clock of science back to about 1850, rather than 1350, as his fellow creationists at the Discovery Institute seem to desire, but resurrecting Paley won’t help him.

    The rest of his argument consists of citing a number of instances of biologists using the word “machine” to refer to the workings of a cell. This is ludicrous; he’s playing a game with words, assuming that everyone will automatically link the word “machine” to “design.” But of course, Crick and Alberts and the other scientists who compared the mechanism of the cell to an intricate machine were making no presumption of design.

    There is another sneaky bit of dishonesty here; Behe is trying to use the good names of Crick and Alberts to endorse his crackpot theory, when the creationists know full well that Crick did not believe in ID, and that Alberts has been vocal in his opposition.

    So far, Behe’s argument has been that “it’s obvious!”, accompanied by a little sleight of hand. It doesn’t get any better.

    The next claim in the argument for design is that we have no good explanation for the foundation of life that doesn’t involve intelligence. Here is where thoughtful people part company. Darwinists assert that their theory can explain the appearance of design in life as the result of random mutation and natural selection acting over immense stretches of time. Some scientists, however, think the Darwinists’ confidence is unjustified. They note that although natural selection can explain some aspects of biology, there are no research studies indicating that Darwinian processes can make molecular machines of the complexity we find in the cell.

    Oh, so many creationist tropes in such a short paragraph.

    Remember, this is supposed to be an outline of the evidence for Intelligent Design creationism. Declaring that evolutionary biology is “no good” is not evidence for his pet guess.

    Similarly, the fact that some small minority of scientists dissent – most of whom seem to be employed by creationist organizations like the Discovery Institute, the Creation Research Society or Answers in Genesis – does not make their ideas correct. Some small minority of historians also believe the Holocaust never happened; does that validate their denial? There are also people who call themselves physicists and engineers who promote perpetual motion machines. Credible historians, physicists, and engineers repudiate all of these people, just as credible biologists repudiate the fringe elements that babble about intelligent design.

    The last bit of his claim is simply Behe’s standard misrepresentation. For years, he’s been going around telling people that he has analyzed the content of the Journal of Molecular Evolution and that they have never published anything on “detailed models for intermediates in the development of complex biomolecular structures”, and that the textbooks similarly lack any credible evidence for such processes. Both claims are false. A list of research studies that show exactly what he claims doesn’t exist is easily found.

    The fourth claim in the design argument is also controversial: in the absence of any convincing non-design explanation, we are justified in thinking that real intelligent design was involved in life. To evaluate this claim, it’s important to keep in mind that it is the profound appearance of design in life that everyone is laboring to explain, not the appearance of natural selection or the appearance of self-organization.

    How does Behe get away with this?

    How does this crap get published in the NY Times?

    Look at what he is doing: he is simply declaring that there is no convincing explanation in biology that doesn’t require intelligent design, therefore Intelligent Design creationism is true. But thousands of biologists think the large body of evidence in the scientific literature is convincing! Behe doesn’t get to just wave his hands and have all the evidence for evolutionary biology magically disappear; he is trusting that his audience, lacking any knowledge of biology, will simply believe him.

    After this resoundingly vacant series of non-explanations, Behe tops it all off with a cliché.

    The strong appearance of design allows a disarmingly simple argument: if it looks, walks and quacks like a duck, then, absent compelling evidence to the contrary, we have warrant to conclude it’s a duck. Design should not be overlooked simply because it’s so obvious.

    Behe began this op-ed by telling us that he was going to give us the contemporary argument for Intelligent Design creationism, consisting of four linked claims. Here’s a shorter Behe for you:

    The evidence for Intelligent Design.

    • It’s obvious.
    • It’s obvious!
    • Evolutionary explanations are no good.
    • There aren’t any good evolutionary explanations.

    That’s it.

    That’s pathetic.

    And it’s in the New York Times? Journalism has fallen on very hard times.

    This article was first published on Pharyngula and appears here by permission.

  • The Naturalistic Fallacy and Sophie’s Choice

    It’s not hard to accept that there’s a pressing need to find answers to the questions that issues such as cloning, pollution, or genetic manipulation entail. However, it is difficult to agree on what these questions are and what their answers might be, because the debate is often driven by the naturalistic fallacy, the belief that nature is essentially good. The environmentalist movement, for instance, frequently appeals to the goodness of nature as a way to promote its causes. Many of the fears and misconceptions that shape our options and influence our choices are a result of this fallacy. Exposing them is therefore essential to reconcile clashing positions and find solutions that don’t force us to choose between man and nature.

    A friend once told me that he was afraid of genetic manipulation because it could produce Frankensteins. Since selective breeding is also a form of manipulating genes, I wonder if he thinks his French Poodle is some kind of monster. I certainly do, although not for the same reasons.

    Genetically modified food, cloning, sustainable development, and pollution are some of the issues that today demand urgent answers and entail difficult choices. Should we preserve nature or pursue human development? Should we increase our control over nature or reduce it? Do we have the right to change nature? Some of these questions and their possible answers are driven by the naturalistic fallacy, the belief that nature is essentially good. Many of the fears and misconceptions shaping our options and influencing our choices are by-products of this fallacy. From our distrust of artificial things to the fear of tampering with the natural order, the following are some of the most common distortions behind the human vs. nature debate.

    The Good Nature of Nature

    Rape, infanticide and infidelity are not just examples of despicable human behavior that make tabloid headlines. They also illustrate the kind of action you can regularly see on Animal Planet. Filled with predators, parasites, starvation, sickness, cannibalism, extreme temperatures, hurricanes and earthquakes, nature is neither a peaceful paradise nor a wise and kind mother who cares deeply about her children, even though most people equate it with something legitimate, dignified, pure, or at least normal. In fact, nature knows nothing about justice or dignity, and violence and abnormalities are ubiquitous in the natural world. Death from fighting, for example, is more common in most animal species than in the most violent American cities(1). Of course, we shouldn’t feel glad about the extinction of entire species or about oil spilling into the ocean, but there’s something wrong with holding a partial picture of nature, especially one in which man is portrayed as the enemy. The majority of our interactions with the natural world involve some form of control or transformation, so this partial picture makes most of the things we do look ignoble or illegitimate. Any alteration of the natural order is seen as bad and selfish, because nobody wants good and pure things to change or be corrupted.

    Artificial Stupidity

    Man is part of nature, of course, but a line has to be drawn between them to tell those horrible plastic flowers apart from the lovely fresh ones. We can sometimes be really mean to ourselves. As we’ve learned to praise nature, we’ve also learned to despise and distrust all sorts of man-made things, from breast implants to instant coffee. We often think of artificial products as fake, ugly, dangerous or, at the least, suspicious. These feelings become particularly exacerbated when it comes to food. People firmly believe that artificial and genetically modified foods are major health and environmental threats when in fact they are no more dangerous than natural foods. Artificial and natural flavors, for instance, are usually chemically indistinguishable, and when they aren’t, the natural flavor can sometimes be the dangerous one (as in the case of almond extract)(2). Man has been genetically modifying plants and animals through selective breeding and hybridization for millennia. These are the “traditional” ways of producing foods; making them in a lab makes little difference. A recent review of 81 research studies on genetically modified food failed to find any new risks to human health or the environment(3). The problem with these fears, usually irrational on health grounds, is that they can have serious implications. They can make food more expensive, harming consumers and farmers alike, and less accessible to the millions of people in the Third World who suffer from starvation and nutritional deficiencies.

    Thou Shalt not Play God

    Apparently manipulating life was not in the original job description of mankind. From contraception to cloning, no other topic makes us speak more passionately about our role on this earth. Human cloning, in particular, is seen as the new evil that corrupts God’s natural order and will eventually lead us to our own doom. This fear, rooted in the belief that nature is wise and we should not tamper with it, is behind many colossal misunderstandings. People believe that cloning entails things like becoming immortal, engineering “perfect societies”, producing zombies or bringing Hitler back to life. However, clones are nothing more than identical twins born at different times. Cloning will never be able to reproduce a person’s identity, design an entire population, produce people without a “soul” or bring a psychopath back to life.

    Other people are worried about cloning because of the use of human embryos in stem cell research. Arguments for and against frequently focus on the need to find a “biological line” that can define what counts as a person. Of course, they all agree that nature, being the ultimate source of goodness and truth, should have the last word. The problem is that nature is not very specific in this respect. The biological line is too fuzzy. A person emerges from a gradual development, not from a crucial moment. A fourteen-week embryo is not substantially different from one at fourteen weeks and one day, or at thirteen weeks and six days. This is why it is necessary to shift the question from finding a line to consciously choosing one that best trades off the conflicting goods and evils of each dilemma(4). Finding cures for Alzheimer’s and Parkinson’s disease, diabetes, spinal cord injuries, infertility, birth defects, and cancer could depend on it.

    Control Freaks

    Since people believe not only in a natural order but also in the rightness of this order, man’s power to control nature is generally perceived as evil, or at least as wrong. We seem convinced that the control we exert over nature is destroying it, without realizing that protecting or reconstructing nature is another way of exerting that power. Human beings have always controlled their environment; finding increasingly sophisticated ways to do so has been the key to our survival and success. We are among the few species that protect the handicapped and the sick, and the only one that can save other animals and plants from extinction or preserve entire ecosystems. We have even developed ways to control our own harmful behavior (laws, morals, etc.) to protect the environment and ourselves. Control has made our lives safer by reducing accidents and by helping us plan for the future (environmental tragedies are due to a lack of control, not an excess of it). So the question is not whether we should have control over nature, but how we exercise that power.

    Gloomy Forecasts

    Another myth deriving from the naturalistic fallacy is that we are so stupid, greedy and selfish that we will inevitably consume all the resources that good Mother Nature provides for our survival. In fact, according to the 18th-century economist Thomas Malthus, we should all be starving by now. He predicted a cataclysm based on the notion that population increases at a geometrical ratio while subsistence can increase only at an arithmetical one. His predictions failed mainly because he didn’t consider the creative power of people to find innovative solutions. Nevertheless, many people still fear or predict this kind of scenario and think sustainable development is mere wishful thinking. The problem is that the definition of this concept is still fixed on the availability of natural resources. Long-term development depends not on obtaining things like paper or coal but on finding ways to communicate our thoughts and heat our homes. Strategies that focus only on resources can quickly become obsolete. Our forecasts should therefore consider that our relation with the environment includes not only people and resources but also people’s minds and their exponential power to come up with new ideas and solutions.(5)
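    Malthus’s underlying arithmetic is easy to check for oneself. The following short sketch (with arbitrary, purely illustrative starting values and rates, not figures from Malthus or this article) shows why a quantity multiplied by a fixed ratio each generation must eventually overtake one that only gains a fixed increment, no matter how generous that increment is:

    ```python
    # Illustrative sketch of Malthus's claim: a population growing geometrically
    # (multiplied by a constant ratio each generation) eventually outstrips
    # subsistence growing arithmetically (a constant amount added each generation).
    # All numbers below are arbitrary, chosen only for illustration.

    def generations_until_shortfall(population, food, ratio, increment, limit=1000):
        """Return the first generation at which population exceeds food supply."""
        for generation in range(1, limit + 1):
            population *= ratio      # geometric (exponential) growth
            food += increment        # arithmetic (linear) growth
            if population > food:
                return generation
        return None  # no shortfall within the limit

    # Even if food starts far ahead and grows by a large fixed step,
    # a modest growth ratio overtakes it eventually.
    print(generations_until_shortfall(population=100, food=10000, ratio=1.05, increment=500))
    ```

    Of course, as the article notes, the model’s failure lies not in the arithmetic but in its fixed parameters: human ingenuity keeps changing the growth rate of “subsistence” itself.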

    Figuring out the Figures

    Our unconditional love of nature, along with our relentless distrust of humans, has an impact on the way we read information such as statistics and technical reports. We like to make weird correlations and come up with bizarre explanations, and sometimes we just give in to collective hysteria. If, for instance, the probability of an accident occurring in a nuclear plant is 0.1%, we only grasp that something can go wrong, and apply Murphy’s law to predict that it will go wrong. We tend to overreact and pay no attention to relations and quantities. Panic follows when we see the words “apple” and “cancer” in the same sentence (even heavy smokers will feel appalled by the imminent danger). We also think that when a substance is found to be hazardous, a single molecule of it can kill us, and we assume that everything that happens to overdosed laboratory rats will inevitably happen to us. The problem is that if we don’t set our prejudices aside and learn to read the data, we can easily be manipulated by selfish interests, or end up distrusting everybody, unable to discriminate between lies and information supported by scientific evidence.

    Facts and Acts

    In addition to how we read statistical data and technical reports, there’s also how we react to scientific facts and discoveries when they challenge our beliefs about nature. We don’t easily accept, for instance, the fact that violence is pervasive in nature or that there’s no such thing as the Noble Savage. The problem is that we sometimes insist on blaming what we know for what we do. But knowledge is one thing, and what we do with it is another. By attacking research, assaulting scientists, or condemning the facts we won’t be able to stop irresponsible human behavior. Scientific research poses many ethical questions, but facts are just facts. They cannot be moral or immoral, innocent or guilty. Scientific interpretations are ways of understanding the facts, not calls to action. We may question these interpretations, but we cannot say they’re wrong just because they challenge our beliefs. We need to recognize that science doesn’t dictate our choices; it enriches them. We also need to complement facts and findings with formal expressions of our values, and with ways of resolving the conflicts they can yield, so that we can be in a better position to account for what we do with what we know.

    Sophie’s Choice

    Imagine a world in which nature is declared sacred and we are left powerless against its impelling forces. A world without pharmaceutical research and fast means of transportation. One where it is impossible to produce large quantities of food, where there’s no hope for cancer patients and no future for the poor. A world without science and technology is a world without choices. Our long-term interests and our increasingly complex relationship with nature depend on them more than ever. That’s why our options shouldn’t be dictated by those who force us to choose between man and nature. It is not like Sophie’s choice. We can, and should, choose both.

    References

    (1) Pinker, S. 2002. The Blank Slate. Viking; p.163

    (2) Schlosser, E. “Why McDonald’s Fries Taste So Good,” Atlantic Monthly, January 2001; cited in Pinker, S. 2002. The Blank Slate. Viking; p. 230

    (3) “EC-sponsored research on safety of genetically modified organisms-A review of results.” Report EUR 19884, October 2001, European Union Office for publications.

    (4) Green, R. M. 2001. The Human Embryo Research Debates: Bioethics in the Vortex of Controversy. New York: Oxford University Press.

    (5) Romer, P. “Ideas and Things,” The Economist September 11, 1993; Pinker, S. 2002. op. cit. pp. 237, 238.

    Paula Bourges Waldegg has a web page here. She can be emailed at bwaldegg@prodigy.net.mx

  • Academic vs. Horowitzian Truth Standards

    28 January 2005

    Dear Mr. Horowitz,

    Thank you for your response to my recent investigation of your interest in promoting left-right balance. In it, you urge me to comment more on the specific contents of the Academic Bill of Rights, rather than on your statements in defense of the Bill. While I’m more than happy to share my thoughts on the Bill’s contents, it is not easy, in the context of our exchange, to separate this material from your own arguments. Indeed, I think it would be very enlightening to show how your own way of thinking epitomizes many of the things that most trouble me about the Bill. A consideration of competing concepts of truth (or, as some would have it, “truth”) should make the case.

    To my mind, one of the ABOR’s most unsettling features is its encouragement of epistemological relativism. For instance, it states that

    human knowledge is a never-ending pursuit of the truth, that there is no humanly accessible truth that is not in principle open to challenge, and that no party or intellectual faction has a monopoly on wisdom.

    Further down, the Bill refers to “the uncertainty and unsettled character of all human knowledge.” My gut reaction to this kind of radical relativism is a pragmatic one. As the saying goes, it’s good to keep an open mind, but don’t keep it so open that your brain falls out. Unlike those people who only ever put the word “truth” in quotation marks, I feel that some principles, ideas and conventions are more right than others, even in cases when their truth-value is not categorically demonstrable. Otherwise, what is to prevent us from slipping into a dangerous moral relativism?

    When it comes to loosening prevailing standards of truthfulness, you certainly practice what you preach. For instance, in paragraph 11 of your latest response, you imply that you had not read a document which is written in the first person, and published in your magazine, complete with your name and an image of your signature at the bottom. After insisting that the piece was entirely ghostwritten, you go on (in the interest of denying that you ever talk about balance) to disclaim any knowledge of the statement that the ABOR “demands balance in reading lists,” even though you cannot deny my observation that this very passage is actually an unsourced quotation (or variation of a quotation) of something you wrote elsewhere. You wrap up this prodigious little nugget of indirection by simply “plead[ing] guilty to not paying more attending [sic] to my fund-raising mail.”

    To those of us who don’t share your ABOR-endorsed relativism, you’re guilty of a lot more than neglecting to check your mail. By the measure of the enduring civil standards upheld in reputable academic research, this kind of double- or triple-dealing is simply inexcusable. Like every academic I know, I personally make a habit of reading all first-person statements that I authorize to bear my signature. Indeed, I go so far as actually to write any such statements myself. After that, I stand behind my words. In my profession, writing one’s own material, and standing behind it, is nothing short of an ethical imperative. For academics, serious writing is properly viewed as an outward sign of inner integrity — or, as the case may be, lack of integrity. That’s why we consider plagiarism and ghost-writing to be such grave offences. (Another practice that keeps us honest is linking readers to our sources by means of footnotes or other citation methods, even when these sources complicate our argument.)

    In short, by academic standards, your cavalier practice of allowing your name to be attached to an influential document that you didn’t write (even if it is “very obviously a direct mail solicitation,” as if that matters) and your subsequent effort to shirk responsibility for the content amount to a serious abuse of your readers’ trust. Given the discrepancy between the standard academic reverence for truthfulness and your own more nonchalant attitude, is it any wonder that academics question your motivations when you try to force us to submit to “the uncertainty and unsettled character of all human knowledge”? This phrase, coupled with your own instrumental view of truth, makes me worry about how moral relativists might act on the idea of the “never-ending pursuit of the truth,” or on the claim that “there is no humanly accessible truth that is not in principle open to challenge.”

    Let’s see how this last pronouncement breaks down in real life, by applying it to the following truth-claim.

    People should always strive to be honest and free from delusion.

    Does anyone care to challenge this? This statement is certainly not ‘open to challenge’ according to my principles. I believe it to be both ‘humanly accessible’ and absolutely, categorically true. Anyone who wants to pass blanket legislation suggesting otherwise had better come up with a pretty good explanation of what could possibly be wrong with my truth-claim, or my principles, in this particular instance.

    On the subject of truth and delusion, I continue to be astonished by your persistent denial of the fact that the ABOR movement has repeatedly pushed for ideological “balance.” In the face of all my evidence, you have had little choice but to back down just a little, yet you now ask:

    If “left-right balance” were the agenda of the Academic Bill of Rights, or the academic freedom campaign, why wouldn’t it be at the center of both?

    Given the facts of the matter, how can I respond, except by offering yet another example of the very term you initially denied using at all, and by choosing it from the “center” of your campaign? My latest example is a phrase in the Students for Academic Freedom Mission and Strategy statement. It asserts that

    [b]ecause the university is not the arm of any political party but an institution whose purpose is to promote learning and the exchange of ideas, student programs of a partisan nature should be fair and balanced [my emphasis].

    There you have it. That pesky “b” word again, and once again in an explicitly political context. Are parts of the SAF Mission Statement also ghostwritten, and full of things that their authors (whoever they may be) didn’t mean to assert? Or is this sentence something that you’re willing to stand behind? If you do admit to the reality of this call for “balance,” how will you then reconcile it with your insistence that the “balance” issue isn’t (as you now rephrase it) “at the center” of your freedom campaign? If a “Mission and Strategy” statement isn’t “at the center” of a movement’s agenda, then it’s a funny kind of mission statement.

    Thank you for your attention. I look forward to any future responses.

    Graham Larkin

    Stanford University, Department of Art & Art History
    CA-AAUP VP for Private Colleges and Universities

    To join the fight against the Academic Bill of Rights, get involved with the AAUP, tireless defenders of academic freedom since 1915.

    ©2005 CA-AAUP

    This article was first published on the California AAUP website and is republished here by permission.

  • “Chief” Objections: Racism, Rhetoric and Native American Mascots on College Campuses

    The recent success of the University of Illinois at Urbana’s basketball team has distracted attention from a longstanding and contentious issue: the status of school sports mascot Chief Illiniwek. The Chief is one of the last remaining college team mascots modeled after Native Americans – the kind usually portrayed by white students wearing face paint and “traditional” native costumes. The school’s Board of Trustees has debated the fate of the Chief for more than a decade, but a resolution seems no closer. Despite recent statements about the need to retire the Chief, the university continues to delay progress toward this goal. It may be a good time to review this controversy, since doing so may reveal much about the nature of muddled thinking, as well as the baffling attitudes about racial dialogue on modern college campuses.

    The Chief has served as the University’s sports mascot for more than 70 years. His main job, like that of similar team symbols, is to rally crowd support during competitive events. He does this by performing “war dance” rituals, complete with wild drumbeats and tomahawk-style chops. The Chief has apparently been very successful in his efforts to stir the blood of Illini fans, since he remains popular long after most of his fellow “Indian” mascots have vanished. Indeed, when intense debates about the racism of the Chief symbol started in the mid-’90s, many Illini alumni defiantly supported their cherished team symbol. In 1995, Lou Liay and Don Dodds of the Alumni Association presented evidence that some alumni threatened to end their annual contributions to the school if the Chief met retirement. Similar threats reportedly persist to the present day, and seemingly provide the main motivation for the university to avoid taking decisive action. Yet there is reason to wonder what further facets of this issue could possibly require analysis before the university reaches a firm conclusion. At this point, all of the arguments in favor of retaining the Chief have already been examined, and all have been found lacking.

    Does the Chief Honor Native American Traditions?

    Some supporters of the Chief claim that the symbol honors the memory of the Native American Illini tribe who once lived throughout present-day Illinois. Some further argue that modern Native Americans should feel flattered, not offended, by the Chief’s representation of their culture. We should acknowledge that many of those holding this position are sincere in their belief, and hold no real malice toward Native Americans. But we should also admit that the merit of this position depends on its conformity to logic and truth, not simply the intentions of its proponents. We need to clarify the assumptions inherent in this argument before we can determine its validity.

    To begin with, we can ask why we should expect members of an ethnic group to be automatically flattered by our representations of their culture. Even if the portrayal is accurate (which the Chief, as we will see, is not), why should the culture feel honored by it? This seems to be a rather patronizing attitude toward a minority culture – one that demands its appreciation for the scraps of esteem we toss its way. Furthermore, it just isn’t true that any representation in any context can legitimately function as a tribute. Even if the university believes the Chief to be an authentic representation of Native American traditions, it is using those traditions as a novelty act to entertain sports fans. How would supporters of the Chief feel if, say, the Holy Eucharist were re-enacted as a halftime skit at the Super Bowl (admittedly, a rather boring one), or if a mascot dressed as a Bishop chased young boys around during the seventh-inning stretch? Many Christians would doubtless find many reasons to be offended by these antics, but I suggest they’d especially dislike the fact that a symbol they hold sacred is functioning as ribald mass entertainment. They should keep this in mind when arguing that the similar use of tribal traditions should overwhelm Native Americans with gratitude to their great white father.

    The argument that the Chief honors Native Americans also assumes that the Chief reflects genuine Illini traditions. Indeed, without this assumption, the entire argument immediately falls apart. But as research by many sociologists and anthropologists has shown, the Chief is an inaccurate symbol of Illini tribal customs on every level. His clothes and actions resemble the stereotypical “Indians” of old Hollywood cowboy movies, not the real traditions of the Illini. Moreover, the popular Western movie image of Indians was loosely based on tribes such as the Lakota of the American plains who had very different cultures from most other tribes, including the Illini. As Joe Gone argues in his classic article about the Chief, “the Illini were Woodlands people – not Plains people – and as a result evidenced an entirely different material culture than the Lakota people whose clothing the current Chief dons (Gone, 1995).” The University of Illinois’ anthropology department elaborates on the inaccuracy of the Chief by noting:

    Archaeological records inform us that the Illini were primarily farmers and people of trade and commerce who lived in settled villages within a loose political confederacy of twelve tribes. The men did not wear war bonnets, nor were they warriors in the sense of having military societies like the Plains tribes. To represent the Illini with a Plains Indian war bonnet, to name them the “fighting Illini,” and to dress the mascot in the military regalia of a Sioux warrior, is therefore totally inaccurate. It is the direct equivalent of representing Italians or Germans with someone dressed in a Scottish kilt and playing the bagpipes (Letter to the Board of Trustees, 2004).

    Many proponents of the Chief argue that at least the Chief’s dance is authentic, since it reportedly derives from a Lakota ritual known as “the Devil’s Dance.” But this statement is irrelevant even if true, since as we have seen, Lakota rituals are entirely different from those of the Illini. The “Devil’s Dance” is also not an ancient Native American tradition, but a later invention taught to Illini as part of a scouting project. And whatever its origins, there is no question that the Chief’s dance is at best a hokey modification of real dance traditions, one adapted to the demands of the sports arena. Finally, the Chief’s music bears no resemblance to any current or past Native musical styles. It is, as Joe Gone maintains, “a creation of white America.” It seems relevant to ask if Native Americans should feel honored by a symbol that distorts their cherished traditions in such flagrant fashion, especially when similar symbols have served to stereotype and discriminate against them for so long (Munson, 1997).

    The University of Illinois acknowledged these inconsistencies in 1990 by removing all references to the Chief’s authenticity from its official statement about the mascot. Anyone wishing to argue for the Chief’s value as a cultural symbol must therefore honestly confront the fact that neither Native Americans nor the University itself currently consider the Chief to reflect real traditions. If they choose to ignore this information and persist in their arguments, they can only do so through the kind of willful ignorance condemned by Thomas Aquinas – the ignorance preserving our favorite biases from the light of scrutiny.

    Are anti-Chief Lobbyists Just Being “Politically Correct” and Overly Sensitive?

    Other proponents of the Chief don’t bother to argue for his authenticity, and simply express disgust toward those who lobby for his retirement. For some of these people, the movement to eliminate the Chief is the latest example of “political correctness” run amok in America – another example of oversensitivity to ethnic issues we would be wiser to ignore. According to many of these proponents, the Chief mascot is simply harmless fun, and isn’t worth all the attention and controversy. Why do the anti-Chief groups have to make such a big deal out of everything? Shouldn’t they just melt in the pot, instead of trying to stir it?

    It is difficult to address this particular objection, since it is really a confluence of unexamined opinions rather than a coherent position. We might begin, however, by asking what the term “political correctness” means, and why we might think of it as harmful. This is more difficult than it may first appear, since it is an ideologically loaded term meaning different things to different people. Still, it is possible to say that political correctness is usually considered a bad thing when it precludes rational discussion of issues, or shelters harmful ideas under the umbrella of politeness or ethnic tolerance. For example, we could plausibly denounce “speech codes” that stifle criticism of Islamic extremists, or prevent discussions of abortion. In these cases, we could maintain that “political correctness” is an arbitrary and hypocritical attempt to shelter students from ideas they’d prefer not to consider, and that this is harmful to a republic like ours. If we have good reasons for criticizing Islamic suicide bombers or holding public debates about abortion, we should be able to do so, even if some students are offended by what they may read or hear in the process. It is wrong to place arbitrary limits on the kinds of discussions we consider legitimate, since doing so limits our ability to understand the issues facing us. For all of these reasons, we can understand why this form of “political correctness” is a detriment.

    However, we can also understand how the term “political correctness” might be used as a convenient blanket term to condemn superficially similar ideas, especially when the ideas bear on racial or ethnic issues we’d rather not acknowledge. Ironically, this attempt to short circuit dialogue about race is itself similar to the kind of “political correctness” condemned above. The argument that complaints about the Chief lie on the silly side of political correctness exemplifies this tendency. To recognize this, we need only realize that there is a major difference between a university allowing free discussion of Native American culture on campus, and the same university officially establishing an inaccurate and offensive image of Native Americans as its symbol. The first is an example of free speech in action, while the second is an example of institutionalized racism (Munson, 1997). For the same reason we would condemn a university for sponsoring minstrel shows or using the image of a thick-lipped, Sambo-style African-American as its official mascot, we can condemn the use of a Native American stereotype as a university trademark. Lumping condemnation of active, university-sponsored racism with the “political correctness” that stifles free debate only obscures the real issues. Free speech is primarily important as a component of a free republic, in which all citizens are protected from undue discrimination by public institutions. Condoning a symbol of institutionalized racism in the name of free speech therefore undermines the very values free speech is intended to protect.

    What about the opinion that the Chief is a minor issue unworthy of such heated debate? We should note that this position depends on the assumption that the symbols we don’t acknowledge as harmful are truly not harmful – that we can accurately measure the good or harm of any image based on the knowledge we already possess. But this is a dubious position, because most people simply do not make a serious effort to learn about the effects of stereotypes, or empathize with the perspective of cultures other than their own. Most white Americans, for instance, saw nothing wrong with the portrayals of African Americans as grinning simpletons in Hollywood movies of the ‘30s and ‘40s. They thought the images harmless, and often expressed exasperation with anyone arguing that the images were immoral or dangerous. Yet, there is little doubt that these stereotypes helped white Americans to stomach the civic inequalities faced by blacks, and thus helped to hold a race of human beings in subjugation. Likewise, the image of the Chief is a comforting substitute for images of real Native Americans, and perpetuates widely held misconceptions about Native culture.

    We should not forget that the symbol of the Chief, like the earlier stereotypes of blacks, is not just an image imposed upon a minority culture by the dominant culture. It is also an image that directly functioned to give the minority culture its inferior status in the first place. Without the notion that Native Americans were wild, silly people who didn’t deserve the same rights as whites, American settlers would have had a much harder time displacing and killing them. The image of Native Americans as primal, buffoonish savages in newspaper cartoons served this purpose quite nicely – it eased the minds of those who forcibly removed them from their lands, shot them dead, or applauded their demise from infectious diseases spread by white settlers. But hey, that’s no reason for anyone to make such a big deal about the Chief, is it? All that death and sadness is in the past, and we have sporting events right now in need of a comical mascot.

    When images reinforce or even form popular opinion in harmful ways, those who recognize the harm have a right to address the issue. Without the efforts of civil rights activists, stereotypes of blacks would still enjoy unquestioned supremacy in the media. What are the chances that public prejudices against Native Americans will vanish if we do not criticize the images that trivialize and simplify their cultural traditions? Indeed, the sentimental attachment that many Illini fans seemingly feel toward the Chief as a symbol of their university – the kind of mawkish obsession that a child feels toward her teddy bear – helps to ensure that few will bother to question the validity of the image and the harm it perpetrates.

    The Chief and Modern University Life

    So far, we have discussed the arguments made in support of the Chief, and exposed the foggy thinking at their foundation. But it remains to ask how it is even possible for such a stereotypical image as the Chief to cheat death for so long on a modern college campus, where there tends to be so much blustery talk of creating favorable environments for students of all races. In practice, as discussed previously, this concern often expresses itself as the pernicious kind of political correctness. For instance, in an infamous incident at the University of Pennsylvania, Jewish student Eden Jacobowitz faced severe disciplinary action for shouting “Shut up, you water buffalo,” at a group of noisy students outside his dorm window. The students were black, and the school accused Jacobowitz of using the term “water buffalo” as a racial epithet. Jacobowitz repeatedly disavowed any racist intent in his remark, especially since the darkness of night prevented him from even seeing the students who disturbed him. Additionally, he explained that “water buffalo” is a common Jewish slang term for noisy people, and he was able to document this fact through expert testimony. None of this mattered, because the university, hell-bent on punishing “any behavior, verbal or physical, that stigmatizes or victimizes individuals,” wanted to make an example of Jacobowitz. Eventually, after involvement by the ACLU and a great deal of unwanted media attention, the university dropped all charges (Kors and Silvergate, 1998). Still, a similar pattern has occurred at many other public colleges throughout the country. As Alan Charles Kors and Harvey A. Silvergate document in their excellent book The Shadow University, this strain of wrong-minded political correctness has created cultures of oppression on many college campuses, and precludes almost all real racial dialogue in the name of “protecting” students.

    And yet, in the face of so much sensitivity about racial issues, the Chief is still alive and kicking. To be sure, some of the reasons for this may stem from student demographics at the University of Illinois. The university draws many students from small towns in southern Illinois, and such students may have a greater tendency to be annoyed with “liberal” causes such as anti-Chief lobbies. Still, given the current climate on college campuses, the Chief’s stay of execution is surprising. For one thing, the university often makes official statements very similar to those of other colleges, in which it stresses its pursuit of “inclusiveness” and an environment hospitable to students of all races. For another, it must be aware that the Chief represents a detriment to this goal. Indeed, a recent report to the university by the Higher Learning Commission emphasized this very point, stating that “it is incumbent upon any public institution, however, to articulate the rationale for its policies, especially when they are in apparent contradiction with each other (Report of a Focused Visit, 2004).”

    The fact that the university is a “public institution” raises other issues. In lawsuits about campus persecution of “offensive speech,” courts have repeatedly found that public institutions dependent upon federal money must make every effort to uphold civic freedoms. This means, in cases similar to that of Eden Jacobowitz, that the mere fact that some students take offense at speech does not warrant the punishment of the speaker, since such punishment denies the federally protected right to free speech. But as previously mentioned, the federal guarantee of civic freedom also forbids acts of discrimination by institutions, and the Chief seemingly results from just such an act. This is one way the Chief issue differs from those of similar Native American mascots on professional sports teams such as the Atlanta Braves or Washington Redskins. Although the same racism arguably exists in both cases, public institutions have a special obligation toward their constituents, especially when the institution is a college committed to higher learning. Here again, the Chief’s existence represents a direct contradiction of the university’s primary purpose for existing.

    These contradictions are hard to understand, unless we realize that many universities have redesigned themselves as providers of entertainment rather than traditional education. That is, universities try to please the greatest number of students the greatest amount of the time, and often find that principles and ethics just get in the way of this pursuit. As cultural critic Neil Postman shrewdly observes, contemporary education is simply one form of amusement among many others in contemporary society, and the primary value of amusement is the comfort it brings (Postman, 1985). In a brilliant article about this trend, Mark Edmundson notes,

    From the start, the contemporary university’s relationship with students has a solicitous, nearly servile tone. As soon as someone enters his junior year in high school, and especially if he’s living in a prosperous zip code, the informational material – the advertising – comes flooding in. Pictures, testimonials, videocassettes, and CD ROMs (some bidden, some not) arrive at the door from colleges all across the country, all trying to capture the student and his tuition cash. The freshman-to-be sees photos of well-appointed dorm rooms; of elaborate phys-ed facilities; of fine dining rooms; of expertly kept sports fields; of orchestras and drama troupes; of students working alone (no overbearing grown-ups in range), peering with high seriousness into computers and microscopes; or of students arrayed outdoors in attractive conversational garlands (Edmundson, 1997).

    A corollary to the mission of catering to students is the assumption that students shouldn’t have to think about anything they don’t want to think about, since doing so may make them less willing to ante up their tuition payments. The student is the customer, and the customer is always right.

    We can thus see talk of celebrating inclusiveness and diversity as a means to an end – a way to make paying students feel comfortable about their purchase. In practice, this implies more than the usual coddling of students and lowering of academic standards – although it certainly does imply that. It also implies that if a group of students decide to accuse someone like Eden Jacobowitz of racism, the university will sacrifice Jacobowitz’s rights for the good of the many. And it implies that if most students at a university support a racist, puerile stereotype, then by God, so does the university, even if it knows full well it is wrong to do so. Here is the key to explaining how one university could willingly tar one of its students as a racist in order to please others, and how another could indulge in racism itself in order to please its own students, while both claim to espouse values of liberty and inclusiveness. While many critics rightly complain about the dominance of irrational leftwing thought on college campuses, school policies often seem to be products of convenience rather than ideology.

    For evidence that some universities now stand for little more than appeasement of students, we need only review the University of Illinois’ handling of the Chief issue. The university began an “enhanced dialogue process” in 2000, which has entailed solicitation of comments about the Chief from every conceivable constituency at the university. Nearly five years later, the university has yet to act upon this avalanche of information. The primary reason for this inertia is polarization within the student population. In a 2004 student government referendum about the Chief in which more than 13,000 students voted, approximately 69% voted to retain the Chief, with 31% voting for his retirement (Report of a Focused Visit, 2004). Thus, while the majority of the students support the Chief, a very significant minority does not, and the university does not want to risk losing potential dollars from either faction.

    The Board of Trustees therefore continues its unending review of constituent comments, and uses ambiguous language when pressed for progress reports. For instance, the Board recently resolved to “publicly celebrate” Native American culture, but did not clarify what these celebrations would mean for the fate of the Chief (Slezak, 2004). This kind of equivocation thinly disguises the university’s inability to address the issue on its merits, and its primary concern to mollify the greatest possible number of students. As the Higher Learning Commission recently concluded,

    The University of Illinois, and especially the Board of Trustees, have modeled behavior that suggests that the issues of minority rights and cultural identity associated with the Chief can be addressed by providing a forum for all interested parties, regardless of the rationale or logical consistency of their arguments…The approach seems to be that if enough votes are taken, the issue can eventually be decided on that basis alone, without an examination of the merits of the competing opinions. This behavior does not constitute a positive example of dispute resolution nor one that is educationally sound (Report of a Focused Visit, 2004).

    Indeed, it does not. It fails to realize that moral issues cannot be resolved by majority vote, or that the eagerness to cater to prevailing opinions runs contrary to the deeper mission of the university. But such failure is the inevitable result of turning education into a form of entertainment.

    The glacial pace of the university on this issue is all the more vexing in light of the past successes of other universities in eliminating their Native American mascots. While universities still face myriad contradictions about racial dialogue and student rights, many of them have at least been able to eliminate their “Indian” trademarks. As newspaper columnist Carol Slezak notes, Marquette, St. John’s, and Stanford are just a few of the many colleges to have implemented this change without much incident (Slezak, 2004). The report by the Higher Learning Commission also addresses this issue, stating “the list of institutions who have dealt successfully with similar issues is long, and all have moved forward as a result. The list of those institutions still attempting to defer or avoid the obvious solution is very short (Report of a Focused Visit, 2004).” The University of Illinois has the dubious honor of a prominent position on this very short list. Despite widespread negative publicity about the mascot and repeated warnings that the Chief tarnishes the respectability of the school, the university continues to waffle for fear of alienating its students. This kind of cheap pragmatism unites the Chief issue with those of speech codes used at other universities, however ideologically opposed the two phenomena may at first appear. Both are spineless attempts to bypass moral standards for majority rule, and pump up profits in the process.

    What judgments can be made about the Chief’s supporters, in light of this discussion? It’s not quite right to say that most are hardened racists, since as we have seen, at least some of them honestly believe that the Chief honors Native American tradition. However, if they persist in this mistake despite ample opportunities to correct their ignorance, we can conclude that they care more about a team mascot than they do about the dignity and feelings of their fellow human beings. Likewise, we can reach some justified conclusions about a university leadership that pays lip service to democratic values, but undermines those same values through a symbol it knows to be racist and inaccurate. Such a university cares more about placating students than it does about enlightening them, even when the placation jeopardizes the pursuit of higher learning.

    University of Illinois law professor Francis Boyle notes that, in the absence of a lawsuit, the Board of Trustees will continue to debate the subject for another ten years (Slezak, 2004). He may just be right about that. But however the Chief issue is eventually resolved, its implications for university life are disheartening. The problems posed by this age of “education as entertainment” will remain long after the Chief hangs up his feathers, if indeed he does.

    References

    Edmundson, Mark. (1997). On the uses of a liberal education: I. As lite entertainment. Harper’s, v.295, 39-49.

    Gone, Joseph G. (1995). Chief Illiniwek: dignified or damaging? Retrieved on December 4, 2004 from In Whose Honor.

    Kors, Alan Charles and Silvergate, Harvey A. (1998). The Shadow University: The Betrayal of Liberty on America’s Campuses. New York: Free Press.

    Letter to the Board of Trustees from the University of Illinois at Urbana-Champaign Department of Anthropology, dated February 17, 1998. Retrieved on December 4, 2004 from University Senate page.

    Munson, Barbara E. (1997). Common themes and questions about the use of “Indian” logos. Retrieved on December 4, 2004 from “Indian Mascot & Logo Taskforce”.

    Postman, Neil. (1985). Amusing Ourselves to Death: Public Discourse in the Age of Show Business. New York: Penguin.

    Report of a Focused Visit (Commission Mandated) to the University of Illinois-Champaign, Urbana-Champaign, Illinois, April 26-27, 2004, for The Higher Learning Commission, a Commission of the North Central Association of Colleges and Schools.

    Slezak, Carol. (2004). Illinois’ leaders must step up, get rid of Chief. Chicago Sun Times, September 7.

  • Beloved Cartoon Character Comes Out of Retirement

    Bithlo, Florida — With controversy swirling around several prominent cartoon characters, an old hand at children’s entertainment has announced that he is coming out of retirement. In the most recent incident, the wildly popular SpongeBob SquarePants and his sidekick Patrick came under fire from conservative Christian groups led by Dr. James Dobson, the founder of Focus on the Family, who claim that the pair are working to promote homosexuality.

    From his trailer in this seedy Orlando suburb, Mighty Mouse says he is ready to make a comeback.

    “The whole scene just makes me sick,” said the now portly Mouse, who just celebrated his 63rd birthday. “I mean, just look at that Tinky [Teletubby] and tell me he ain’t a little light in the loafers. That Falwell guy was right on the money if you ask me.”

    The aging Mouse of Steel says the recent accusations are hardly any surprise to those who have been watching the “’toon” scene in recent years. “It all started with that Rocky and Bullwinkle, a couple of weirdoes; those two were high all the time.”

    Mouse began life as Michelangelo Mousellini, born under a staircase in Newark, New Jersey to his immigrant Italian rodent parents. After his father was killed in a turf dispute with Irish rats, he set out for Hollywood and fame. That is where he met Izzy Klein of Terrytoon studios. “Izzy told me he liked my looks, but the name had to go,” Mouse reminisces fondly in his Jersey accent. “It was during the war, and Mousellini sounded a little too much like Il Duce. Too ethnic. They called me Mickey at home, but Izzy said he didn’t want any trouble with the Disney people. The ’toon world was pretty rough back in those days. So we came up with Mighty Mouse and it stuck.”

    Mouse went on to star in several animated features during the forties, moving to the small screen in 1955 with his own show on CBS called Mighty Mouse Playhouse. It was here he teamed up with his perennial arch-nemesis Oil Can Harry and Mitzi, the object of his rescue attempts. He later married Mitzi after a whirlwind studio romance in 1957.

    These “mellerdramas” were famous for their opera-style singing and slapstick violence, and adored by millions of American children. “That’s all the kids needed back then,” Mouse says, lighting a cigarette and opening a can of beer. “Some music, some mindless mayhem and violence and a pretty dame. Same thing every week.”

    Not that the cartoon world of the Fifties and Sixties was without controversy. Mighty Mouse Playhouse broke the color barrier with the introduction of “Heckle and Jeckle,” the mischievous magpies. “Nobody could toss a stick of dynamite like them two,” chuckled Mouse. “Kind of a shame how they made the colored acts come to rehearsals through the back door though.”

    During the social upheaval of the Sixties it became apparent that relations among the cast were increasingly frayed. There were rumors of heavy drinking, backstage fighting and extramarital affairs. Mouse is still irritated by this after forty years: “Sure, there was some wild times, but it was just booze and broads, none of this homo stuff like these pansies today.”

    The highly publicized divorce proceedings of Mighty and Mitzi, the accidental death of Jeckle when a stunt went wrong, and Oil Can Harry going into rehab all took their toll on the production, and it aired its last episode in 1967.

    Mouse came back to television briefly in 1979 to reprise his role as the Mouse of Steel, but times had changed and the program went off the air after only sixteen episodes. His voice ruined by a three-pack-a-day cigarette habit and hard drinking, Mouse then went on the road doing dinner theatre on the Chuckie Cheez circuit with Heckle and Pearl Pureheart (who replaced Mitzi in later episodes) and Mighty Manfred the Wonder Dog, who had just lost his partner Tom Terrific to an apparent heroin overdose.

    “Those were tough times,” Mouse reflects, “but now that I’ve accepted Jesus into my heart I feel like I’m ready to try again.” He points to a photo of himself and televangelist Pat Robertson on the mantel. “After Pearl (Mouse and Pureheart were married in 1971) was killed by that hawk, my life was a mess. But Brother Pat turned my life around,” he says, referring to his televised baptism on the 700 Club in 1991.

    A spokesman for Focus on the Family responded that the organization was happy to see the return of Mighty Mouse, “While Mr. Mouse has had his share of troubles over the years, he is still one of our finest entertainers. His unique style of merriment combined violence and romance without a hint of sexual ambiguity. In short, just what our children need today.”

    When asked if he was nervous about returning to the public eye after such a long hiatus, Mouse, lighting one of his signature non-filter Pall Malls, replied, “I’ve got to do this for the kids. They’re the future.”

    Barney McClelland’s website is As I Please.

  • Letter to David Horowitz

    January 20, 2005

    Dear Mr. Horowitz,

    Thank you for joining me and AAUP Associate Secretary, Marcus Harvey, in last Saturday’s exchange on 1360 AM KLSD (Air America Radio, San Diego). I’m glad that you feel you fared so well in that exchange. In the interests of furthering the conversation, I would be delighted to have another live discussion with you, or with any of the so-called Students for Academic Freedom. Perhaps, in the interests of balance, any future debate can be held in a conservative venue.

    To the extent that it continues the dialogue, I also welcome your blog response to our debate, and to my article “What’s Not to Like About the Academic Bill of Rights.” Given your many misconstruals of my own position, I feel compelled to respond. To keep it relatively brief, I will confine myself to some remarks on the central issue of left-right balance.

    In your blog you describe my article as

    …an attack on me personally and on imaginary demands of the academic freedom movement for “balance” and equal representation (there are no such demands).

    Your insistence that the supporters of the ABOR make “no such demands” for left-right “balance” (a claim that I think you also made in the radio interview) is bizarre, since from the start you have successfully framed the whole issue in terms of the need for intellectual or ideological balance, equity and diversity. Whatever else you might have said, you have persistently used the very term “balance” when promoting the ABOR. For instance, in your article The Campus Blacklist (April 18, 2003), you note that

    I have encouraged students to demand that their schools adopt an “academic bill of rights” that stresses intellectual diversity, that demands balance [my italics] in their reading lists, that recognizes that political partisanship by professors in the classroom is an abuse of students’ academic freedom, that the inequity in funding of student organizations and visiting speakers is unacceptable, and that a learning environment hostile to conservatives is unacceptable.

    I’m assuming that this is not a casual phrase, given that you quote a version of this passage, complete with the reference to “balanced” reading lists, in a later article. There, you also note that the ABOR will ensure that “[s]election of speakers, allocation of funds for speaker activities and other student activities will observe the principles of academic freedom and promote intellectual balance [my italics].” Just last month, in your article “It’s Time for Fairness and Inclusion in Our Universities,” you opened a defense of the ABOR by pointing to the ideological “imbalance” among American university faculty. Later in that article you refer again to “the faculty imbalance.”

    The repetition of these key terms is no mere semantic detail, given that politicians and the press have followed your lead in construing the Bill as a means of ensuring balance, or correcting imbalance. To confine myself to a few examples republished on your FrontPage Magazine and Students for Academic Freedom websites, the New York Times reports that your movement was inspired by “political imbalance on faculties,” and a Washington Times editorial supporting the ABOR concludes that “[t]he issue here is balance.” A press release by Rep. Walter B. Jones (R-NC) states that the Bill “encourages university and college officials to even out the imbalance between liberal and conservative influences in higher education.” These examples are taken at random. There are at least a hundred more pleas for “balance” or denunciations of “imbalance” scattered throughout articles describing the ABOR on the Students for Academic Freedom website. This is not to mention the repeated calls, by the Bill’s supporters, for “equality,” “equity,” “evenhandedness,” “fair-mindedness,” and other terms implying balance.

    How, in the face of all this evidence, can you claim that your movement makes “no demands” for balance and equal representation? No amount of retroactive spin can reverse the fact that the ABOR’s supporters have campaigned relentlessly to legislate ideological “balance” in American universities.

    In the same blog, you ask why I didn’t “reel off half a dozen conservatives in [my] own department” in response to your allegations of faculty imbalance. The answer should be obvious to anyone who has actually read my article. It explicates, in meticulous detail, the deficiencies of your jaundiced habit of labeling everyone as either “leftist” or “conservative.” I really mean it when I write that “[i]t is hard to think of any method that would provide us with reliable statistics about such a subtle and complex phenomenon as personal ideology–not least in environments, such as elite humanities departments, which actively cultivate ideological subtlety and complexity.”

    You dismiss as “pure invention” my claim that you compartmentalize (as you put it) “all ideas into only two categories, left and right.” And yet, in the space of the same few paragraphs, you repeatedly brand me as a “leftist.” I would prefer for you to have acknowledged my self-characterization, in the very article you are critiquing, as one of the people “who feel we have little to gain–intellectually, professionally, or financially–by accommodating ourselves to either of Horowitz’s two stifling compartments.” As a self-appointed authority on freedom and diversity, you would do well to respect people’s right to resist your labels.

    Now let’s imagine, for the sake of argument, that I were to correspond to one of your imagined “two sides.” What makes you think I would end up in the “leftist” camp in my opposition to the ABOR? Wouldn’t it more likely be the conservative in me who resents your radical efforts to enforce a “diversity of approaches” and “appropriate knowledge” in the academy? After all, as the libertarians have been quick to point out, such efforts at diversity legislation are little more than a sick parody of political correctness.

    Thankfully, no such legislation has yet taken root in our own state of California, although it might in the coming months. Until such time as the opponents of academic freedom succeed in their devious campaign to prevent me from “[taking] unfair advantage of a student’s immaturity by indoctrinating him or her with the teacher’s own opinions,” I will continue to treat my students like grown-ups, with the blessing of the First Amendment and the sane precepts of the AAUP.

    Thank you for your attention. In the interest of promoting “both sides” instead of just “half the story,” I hope you will be kind enough to link to this letter, or to republish it complete with my links, alongside your own postings on the FrontPage Magazine and Students for Academic Freedom websites.

    Sincerely,

    Graham Larkin
    Stanford University, Department of Art & Art History
    CA-AAUP VP for Private Colleges and Universities

    To join the fight against the Academic Bill of Rights, get involved with the AAUP, tireless defenders of academic freedom since 1915.

    ©2005 CA-AAUP

    This article was first published on the California AAUP website and is republished here by permission.

  • What’s Not To Like About The Academic Bill of Rights

    Locking up my bike on the way to the office on May 3, 2004, I noticed that events were underway in the large pavilion pitched in front of the Hoover Center, the right-wing think tank overshadowing my office in the Nathan Cummings Art Building at Stanford University. The voice on the microphone was introducing prominent ultra-conservative intellectual David Horowitz. As the representative for private universities on the steering committee of the California Conference of the American Association of University Professors (CA-AAUP), I had recently taken a pressing interest in Mr. Horowitz’s activities. He is, after all, the brains behind the mischievously-named-and-crafted Academic Bill of Rights – a document which co-opts post-modern ideas on the situated nature of truth and knowledge, along with politically inclusive language, to counteract what Horowitz depicts as the stranglehold of progressive politics on university campuses. [1]

    Thanks in part, perhaps, to the protestations of the CA-AAUP, a version of this bill (CA Senate Bill 1335) died in committee, with only one vote cast in its favor. And yet, prior to this, another version had actually been passed as law in Georgia with a 41-5 vote, and it is making the rounds elsewhere. Clearly the battle is only beginning. I wanted to see this guy.

    By the time I had dropped off my bag and returned to the doorway of the climate-controlled pavilion, Horowitz was already speaking, to a packed audience consisting mainly of white-haired men with Hoover Center tote bags. To my disappointment, the parts of the speech that I stayed for were not about the university at all. Instead they amounted to a generalized rant about the war in Iraq. “What’s Not To Like About This War?” the speaker intoned repeatedly, with shrill voice and sweeping gestures. With each re-utterance he would offer more proof of how great the invasion had been in every respect. Looking smaller and angrier every minute, Horowitz went on to lash out at the portrayal of the war in the major American media, which he characterized as nothing more than a “megaphone” for “neo-communist” viewpoints.

    It is disheartening to see such an intelligent man resort to such reckless overstatement, even when he’s preaching to a choir in need of a little martial uplift. (Nor did his audience seem especially receptive; I was impressed by their somber lack of reaction to his more strenuously “funny” digs at the war’s detractors.) Realizing that I had not garnered a single piece of substantive knowledge after ten minutes of attentive listening, I returned to my office to check the online news, and to prepare for my afternoon class.

    The news was more of the same–the siege of Falluja, the Bush government’s efforts to suppress any mention of the embarrassing tide of American casualties, and revelations of the Abu Ghraib brutalities. I thought about how Horowitz, whose words were still echoing outside my window, would view these demonstrations of What’s Not To Like About This War. More evidence of the same old neo-communist, anti-American media conspiracy, no doubt. It also struck me that the readings for my afternoon art history seminar, Towards the Modern Museum, could easily be marshaled to support his image of left-dominated American university campuses. The more overtly political of these readings (written in 1980 by two leftists) proposes that the Louvre’s ultimate aspirations to an even-handed inclusiveness belie an inescapable ritual ‘script’ of Western triumphalism. The second reading was not unsympathetic to this view. [2]

    In the class I openly critiqued the hyperbole of the first article, while applauding its attention to the fact that the museum is, indeed, an ideological space. (With Horowitz’s lurid performance fresh in my mind, I even compared the article’s overstated thesis to the conviction–equally widespread among left- and right-wing extremists–that the mainstream American media is simply the mouthpiece of the enemy within.) According to the way of thinking promoted by Horowitz and the Students for Academic Freedom, however, my forbearing critique would hardly have been enough to absolve the stain of the readings. Their embattled, politicized conception of intellectual diversity would require that any such left-wing content be balanced out by readings fostering a divergent ideological agenda. [3]

    In other words, I would be required to find readings that were openly anti-leftist, and which espoused conservative ideas about the neutrality of the great western museums, the sanctity of nationhood, the superiority of classic Western art, and so on. Even if I could find readings intelligently defending such notions, I doubt that they would profitably advance the thinking in the seminar, given that the leftist critique was explicitly dissecting these received ideas. Although I love museums, I designed the class in order to subject ideas and institutions to critical scrutiny–not to perpetuate their uncritical celebration.

    Another Horowitz-approved corrective would be to ensure that for every art historian inclined to assign ‘leftist’ material, the department hire a person who tends toward right-wing thinking. And reading lists are only one of the places in which Horowitz and his followers think university or government administrators should “protect” such ideological “diversity.” His Academic Bill of Rights also tries to ensure a greater spectrum of opinions (by which he invariably means left-to-right political positions) in matters of grading, curriculum development, selection of invited speakers, allocation of university funds, hiring, firing, promotion and tenure review.

    Such legislation would be a very dangerous incursion on academic freedom, for all kinds of reasons. To begin in the broadest terms, I don’t think anyone should ever be forced to conform to the kind of simplistic, two-sided worldview that Horowitz is, in effect, trying to pass into law. Such Manicheanism famously led George W. Bush, in an address to a joint session of Congress and the nation on September 20, 2001, to declare that “either you are with us, or you are with the terrorists.” Although nominally a defense of freedom, these words are really just a heavy-handed effort to force every American citizen (if not the whole world) to acquiesce to the terms of a perilously reductive world-picture.

    Faced with such radically restrictive alternatives, any free-thinking person should, at the very least, resent the lack of a third radio-button that would allow her to opt out of both choices. In a free country, the decision not to consent to the conditions of either Button A or Button B – the decision to actively abstain from any directives to declare one’s loyalties, or categorize one’s self, according to such limited terms – should always be available. This freedom to resist anyone else’s ideological categorization is a fundamental democratic principle. It makes no difference whether the purported opposites are Bush Loyalists and Terrorists, Good and Evil, Freedom Lovers and Freedom Haters, Christians and Non-Christians, Pro-Family Values Folks and Anti-Family Values Folks, or People Who Liked Kill Bill and People Who Didn’t.

    The two kinds of people in David Horowitz’s world-picture are alternatively described as members of the Left and the Right, or as Democrats and Republicans. This view of an ideological yin and yang works just fine for Horowitz, who has enjoyed remarkable political and financial success at being first a left-wing radical, and then a professional hard-line Republican. [4]

    But what about those of us who feel we have little to gain–intellectually, professionally, or financially–by accommodating ourselves to either of Horowitz’s two stifling compartments? The real issue here is not how two people happen to feel about one method of carving up the world. It is, rather, the fact that I am working to preserve (and Horowitz is working to undermine) the liberty of belief and speech implicit in the Constitution and the First Amendment. As Justice Robert H. Jackson marvelously put it in 1943, “[i]f there is any fixed star in our constitutional constellation, it is that no official, high or petty, can prescribe what shall be orthodox in politics, nationalism, religion, or other matters of opinion or force citizens to confess by word or act their faith therein.” (West Virginia State Bd. of Educ. v. Barnette, 319 U.S. 624, 642 (1943)).

    Despite his claim to be a defender of freedom, David Horowitz reveals an unnerving lack of regard for the kind of ideological abstention that the Barnette court was working to defend. This disregard is glaringly evident in the way he arrives at the “statistics” which he regularly evokes as the very reason for implementing the Academic Bill of Rights. In a recent response to the AAUP’s condemnation of the Bill and the thinking behind it, Horowitz baldly asserts that

    a series of recent studies by independent researchers has shown that on any given university faculty in America, professors to the left of the political center outnumber professors to the right of the political center by a factor of 10-1 and more. At some elite schools like Brown and Wesleyan the ratio rises to 28-1 and 30-1.

    He goes on to contend that this “huge correlation between political categories and academic standing” amounts to a “corruption of academic integrity.”

    Because he doesn’t resort to his opponents’ tactic of supplying footnotes, I cannot be certain which “independent studies” produced the “10-1” left-right ratio, but all the circumstantial evidence points to two studies. These are the loopy 2001 “survey” by the Frank Luntz Research Center and the Horowitz-run Center for the Study of Popular Culture, and the complementary study, co-authored by Horowitz and Eli Lehrer, titled Political Bias in the Administrations and Faculties of 32 Elite Colleges and Universities. In a declaration very similar to the one in his retort to the AAUP, Horowitz contends in the latter study that “[t]he overall ratio of Democrats to Republicans we were able to identify at the 32 schools was more than 10 to 1.” This also seems to be the source of the more extreme “statistics” for Brown and Wesleyan.

    If these are indeed the “independent” studies Horowitz has in mind, then the “Democrats” and “Republicans” mentioned in Horowitz’s AAUP retort are the 1,431 professors of Economics, English, History, Philosophy, Political Science and Sociology in various subjectively-selected “elite colleges and universities,” mostly in the Northeast, whose names seem to match up with those of registered party members in voter records. Even if one were able to reasonably extend the resulting findings to represent the ratio of Democrats to Republicans “on any given university faculty in America,” the question remains of how one could possibly use the exact same statistics to “show” just how much “professors to the left of the political center outnumber professors to the right.” Easy! All you need to do is ignore the existence of the 1,891 professors in the same departments who you estimate to be “unaffiliated” in their party loyalty.

    I can think of only two ways of coherently defending such a move. On the one hand, one could argue that the unaffiliated majority simply doesn’t matter, thereby leaving Horowitz free to concoct his 10-1 generalizations about all professors on the basis of less than half his dubious little data sample. On the other hand, one could simply assume that the unaffiliated majority must ‘really’ break down into exactly the same left/right proportions as the card-carrying Democrats and Republicans, leaving us with a 10-1 statistic that reasonably represents everyone.

    Take your pick. Whether Horowitz is declaring the political irrelevancy of the inconveniently-unaffiliated majority, or whether he is presuming to represent their unstated affiliations, his fundamental disregard for their abstention from self-definition is obvious, and his “10-1” ratio is ludicrous. This is the kind of ‘statistic’ you pray your opponents will use. And they do. The Students for Academic Freedom take their endorsement of Horowitz’s tactics to the limit by earnestly disclosing his patented technique of “How to Research Faculty Bias” as a link on their home page. [5]

    Using this simple recipe, even the most clueless ideology buffs can now manufacture impressive-looking facts about professorial politics in no time.

    The problem with such quantification goes beyond the deficiency of Horowitz’s particular method of data fabrication. It is hard to think of any method that would provide us with reliable statistics about such a subtle and complex phenomenon as personal ideology–not least in environments, such as elite humanities departments, which actively cultivate ideological subtlety and complexity. The inherent absurdity of any claim to objective ideological profiling raises the issue of how one could possibly go about implementing the kind of diversity that the Academic Bill of Rights is aiming to institute in the university. After all, to successfully foster “a plurality of methodologies and perspectives” and ensure against “political, ideological, religious or anti-religious indoctrination,” one would first have to develop a sufficiently broad and clear model onto which to map these differences and deviations, and then keep very close tabs on the professors.

    How does Horowitz think one should go about gauging and administering the desired spectrum of opinion? He tends to avoid the subject, although when pressed on the matter in an online forum hosted by the Chronicle of Higher Education, Horowitz chillingly asserted that such details of implementation are not a problem, or at least not his problem. [6]

    Well it should be his problem. It seems to me morally repugnant to promote the legislation of substantial executive powers–powers which could seriously affect the careers of countless individuals–without caring about how (or even whether) such powers could be fairly exercised. Anyone who wants to make professors stick to the “appropriate knowledge” of their respective fields had better lay down some explicit guidelines detailing exactly (1) who’s doing the fostering, (2) what invests them with the special knowledge to have this authority, (3) where their standards of appropriateness are coming from, and (4) how these standards will be implemented. Horowitz’s academic interlocutors in the Chronicle forum were absolutely right to worry about these details of “appropriateness” assessment and enforcement, and he was wrong to dismiss them.

    Horowitz argues that such worries are misplaced, because these details of implementation only have a bearing on the enforcement of ideological appropriateness, which has nothing to do with his own purely negative project of making sure every professor and student is free to pursue his or her own thing. Don’t believe it. Despite all his mollifying talk of freedom and fostering and diversity, it is clear that Horowitz would just love to see knowledge policed, and that he knows how to get it done. Witness the recent Chronicle article in which he takes deep offense at the UC-Denver political-science department for having “office doors and bulletin boards … plastered with cartoons and statements ridiculing Republicans.” In an effort to demonstrate why this material should not be up there, Horowitz asserts that “[w]e do not go to our doctors’ offices and expect to see partisan propaganda posted on the doors, or go to hospital operating rooms and expect to hear political lectures from our surgeons. The same should be true of our classrooms and professors, yet it is not.”

    Excuse me? Even as someone who’s generally bored by propaganda, I would be delighted for my doctor to post political cartoons on his door. Why not give me something to look at, besides faded Norman Rockwell reproductions, while I’m waiting around on a vinyl slab in an over-ventilated smock? I would grant exactly the same cartoon-posting privileges to anyone–even professors in a political science department! As for the cartoons’ criticism of Republicans, what would you expect in early 2004, when Republicans are running the country? Nostalgic Clinton-bashing? It’s not as if we’re talking about kiddie porn here, Mr. Horowitz, and it’s not as if anyone is trying to make you clear the propaganda out of your office. And what’s the point of the analogy about “political lectures from our surgeons”? Surgeons are not lecturers, and surgery is not politics, so yes, a political lecture from a surgeon might be a little weird, at least in the context of an operating room. But professors are lecturers, and one of the things they habitually lecture about is politics – a deeply human enterprise with a bearing on many scholarly domains, including my own. So why do you want to start cleaning off my door and policing my lectures?

    The real reason, at least in the examples regularly provided by Horowitz and the Students for Academic Freedom, is that certain thin-skinned ideologues don’t like the message. This is not a good enough reason to go around rewriting the laws. And in any case, the whole Academic Bill of Rights project is utopian, or dystopian. In order to meaningfully “foster” the kinds of “diversity” it purports to defend, one would first have to come up with objective or reasonable parameters for ideological stock-taking and policing–or, if one prefers, proactive anti-ideological diversity fostering. Whatever you want to call it, this monitoring would deprive people of fundamental liberties of expression, and legislating it would lead to an ethical and administrative quagmire. Don’t believe the doubletalk; Mr. Horowitz and the so-called Students for Academic Freedom are enemies of free thought and free speech.

    David Horowitz’s Response

    Larkin’s Reply

    Endnotes

    1: For unguarded critiques, see Horowitz’s articles in FrontPage Magazine, such as “The Battle for the Bill of Rights,” or “Missing Diversity On America’s Campuses,” where he writes: “Not only are the overwhelming majority of college professors fashionably ‘liberal,’ most faculties have a strong contingent of hard leftists whose views are extreme, and whose concentrated numbers make it possible for them to dominate (and even define) entire academic fields.” As evidence that things have gotten completely out of hand, Horowitz regularly offers the same few outlandish examples of “typical” left-wing behavior. He refers to a peculiar criminology assignment on “Why George Bush is a war criminal” with a compulsiveness not seen since the anti-PC “Jane Austen and the Masturbating Girl” rants of the early ’90s.

    2: The leftist article is Carol Duncan and Alan Wallach, “The Universal Survey Museum,” Art History 3, no. 4 (December 1980). The first footnote of the book from which the other reading is taken (Andrew McClellan, Inventing the Louvre, 1994) describes Duncan and Wallach’s article as “persuasively argued.”

    3: General comments about reading lists in the Students for Academic Freedom Complaint Center (a place for anonymous students to denounce named professors) invariably point to the unhealthy preponderance of leftist material.

    4: See Scott Sherman, “David Horowitz’s Long March” (The Nation, July 3, 2000), which traces Horowitz’s success from the 1962 book titled Student, which sold 25,000 copies, to his enormously well-funded career as a reactionary.

    5: As a rule, Americans don’t like this kind of snooping into public records, as the media-savvy Horowitz clearly senses when pressed on the matter by Alan Colmes in a Fox News interview: “COLMES: Here’s what concerns me, David. Is it true, as the Denver Post claimed, that you are encouraging students on your web site to go to use public records, to go to the county clerk’s office to find teachers’ political affiliations and then create a spreadsheet to have a list of teachers and where they stand politically? Is that accurate? HOROWITZ: Alan, look, I spent many years…you know, I actually was on this show. We had… COLMES: Is that accurate? That’s all I’m asking. HOROWITZ: No, I’m not encouraging people…I have one student who has gone to primary registrations just to show the skew.”

    6: Faced with a very articulate question that begins “How would an Academic Bill of Rights be enforced on a campus level?” Horowitz–a bitter opponent of affirmative action–responds: “There is no enforcement proposed in the Academic Bill of Rights. This would be up to the institutions that adopt it. The university seems to have no problem promoting skin diversity. Why should intellectual diversity pose a problem?” To the next question (which, similarly, ends by asking “on what basis is ‘intellectual diversity’ to be assessed and with what expertise”) Horowitz simply replies: “No one is suggesting that an outside authority make these judgments. Read the Academic Bill of Rights.”

    Dr Graham Larkin, Stanford University, Department of Art & Art History.
    CA-AAUP VP for Private Colleges and Universities

    To join the fight against the Academic Bill of Rights, get involved with the AAUP, tireless defenders of academic freedom since 1915.

    ©2005 CA-AAUP

    This article was first published on the California AAUP website and is republished here by permission.

  • Minority or Citizen? A Roundtable Discussion

    Worker-communist review: The debate surrounding the banning of conspicuous religious symbols in schools and government workplaces in France has raised some fundamental questions about religious freedom and freedom of choice and dress. Is the ban a restriction on religious freedom, choice and dress? How far must a ban go? Why?

    Hamid Taghvaee: In my view, banning religious symbols in schools and workplaces is completely justified. The ban has nothing to do with religious freedom because it is a social and public ban. In civil societies, religion and religious practices must be free as long as they remain private matters. Civil society can only recognise freedom of religion as a private matter; otherwise it will not be civil society anymore. Any interference by religious practices in the affairs and social activities of civil society should be banned.

    In terms of freedom of dress, it is obvious that the veil is not a kind of dress and has nothing to do with freedom of choice. Veiling is a religious requirement for Muslim women; wearing it means abandoning the right to choose any other sort of dress. One can say that by choosing Islam, Muslim women in fact choose not to exercise their civil freedom to wear any kind of dress they like. The veil is a religious obligation that comes with Islam.

    Azar Majedi: This is a restriction on the role of religion in the affairs of civil society rather than religious freedom as such. The ban is aiming to restrict the meddling of religion as an institution in the running of the state and society at large.

    Religious freedom is commonly understood as freedom of religious beliefs and practice. However, depending on your point of view, practicing one’s beliefs takes different dimensions. In a secular society, religion is and must be separated from the state, education, citizens’ formal identification and so on; it must be a private matter. Therefore, from a secular point of view, the state and educational system must not represent any particular religion or religious belief. Using religious symbols, such as veiling, would be considered a denial of the principle of secularism, and contradicts the principles of a secular society. By banning religious symbols in public schools and state institutions, one is aiming to safeguard a freer society where religion remains a private affair.

    To get a clearer picture and to avoid any false assumptions, one must look at the history of the development of modern and civil society. Secularism is the product of this process and one of the pillars of such a society. To eradicate the influence of the church from the affairs of the state, to relegate religion to the private sphere and to restrict the role of religion as an institution are all significant achievements of modern society. The French Revolution is an important historical moment in this process. These restrictions on religion became necessary in order to realize the main slogans of this revolution: ‘Freedom and Equality’.

    Going back to your question, this ban is a restriction on religion but not a restriction on individual freedom or individual rights. In my opinion, this ban is a necessary step towards a freer society, and furthermore, I believe restricting religion will help create a more equal society, particularly for women. By restricting religion, society is in a better position to respect individual/citizen rights. However, I believe that this ban is not enough. We should ban religious schools and the veiling of underage girls.

    Ali Javadi: I agree that banning conspicuous religious symbols in schools and workplaces is a ‘restriction’ on religious freedom and on the freedom of choice and dress. However, I think we should welcome such restrictions. Allow me to explain. There are many restrictions in society that limit the ‘freedom’ of individuals in some shape or form, such as banning driving while intoxicated, smoking in public buildings or the driving of motorcycles without helmets. Even imposing speed limits on highways in some way limits the freedom of individuals to drive at their chosen speed. However, all these restrictions are necessary and essential to safeguard society and individuals from danger – the danger of being killed by a drunk driver, getting lung disease from second-hand smoke and so on. In the same way, I think society should protect individuals from the influence of religion and the religion industry as we protect ourselves from contagious deadly diseases. The banning of conspicuous religious symbols in workplaces and schools has a similar meaning and intent.

    These restrictions are direct applications of the basic principle of secularism, which calls for the separation of state and religion. No adult should be allowed to wear her/his religion on his/her forehead or sleeves, neither at schools nor at workplaces. Children should be kept completely away from the influence of religion and religious institutions, in the same way that we keep medicine and drugs away from children. Children don’t have any religion. In my opinion, religion should be a private matter and not a state matter. The state should be free from the influence of religion. No individual should be allowed to appear at his/her workplace and in schools with conspicuous religious symbols.

    One should recall that the basic premise of secularism was to eradicate religion and the religion industry’s influence from the most important instruments of society, the state and educational system, particularly as far as the shaping of the lives of individuals and citizens are concerned. If we strive for a free society then freeing society from the influence of religion is a precondition for a free society. Secularism is the first step toward the freedom of society from religion. A free society should be free from religion and superstitions and limitations that this anti-human ideology imposes on humanity.

    I should also mention that while I am for such restrictions, I am not for an all-out ban on all religious expression. I believe in freedom of religion and freedom to campaign against religion. I think a complete ban on religion would allow it to survive longer in society.

    Finally, I would like to say a few words about the nature and character of the fight that we are witnessing in France. The fact of the matter is that the Islamists have spread their reactionary wings in Europe and North America and are openly attacking the secular values of western societies. This is an attack on the progressive achievements of these societies. I believe all secularists, all progressives and socialists should fight these attacks by Islamists.

    Worker-communist Review: In the debate around the banning of religious symbols in France, as well as regarding the establishment of a Sharia court in Canada, the issue of minority rights has been raised, along with the claim that minorities and ‘their’ cultural and religious differences need to be respected in a multicultural and pluralist society. Please comment on minority rights. Isn’t there a conflict between minority and collective rights versus individual rights? What about vis-à-vis the concept of citizenship?

    Hamid Taghvaee: Civil societies are based on the concept of the equal and universal rights of all citizens. Philosophically, they are based on the social identity of human beings as opposed to religious, national, ethnic, and any other – one can say – non-humanistic identities imposed on people in bourgeois societies. Society is not a mosaic of different minorities and cannot be based on a collection of different rights for different groups of people; this sort of concept of society is a huge step backwards to medieval societies. The very concept of the minority is not a modern and civil concept. If every citizen, independent of her or his country of origin, religion, race, gender, and so on and so forth, has the exact same rights, then the concept of minority disappears altogether and there will be no need to recognise special rights for non-existent social groups. The problem with contemporary societies is that they first divide people based on their nationality, religion, ethnicity, race and other non-civil – or if you like pre-modernist – factors, and then try to be multi-culturalist and ‘respect’ minority rights and other such nonsense! This is not post-modernism; in fact it is medieval and pre-modernist in the true sense of the word.

    Azar Majedi: If I remember correctly, historically, the concept of minority rights was raised in the US civil rights movement. The struggle against racism and for the recognition of equal rights for black people in the US acknowledged minority rights as a valid and credible legal concept. Later, the concept of respect for minority rights extended to any deprived or disadvantaged section of society, even women. In fact, historically, minority rights meant the recognition of equal and universal rights for all citizens in a given society by extending equal rights to members of a deprived section of that society. In this context, minority rights do not contradict individual or citizens’ rights; on the contrary, they extend those rights to all citizens. Whereas now, in this new context, i.e. respect for multi-culturalism, respect for different cultures, or cultural relativism, minority rights have been transformed to imply the rights of a collective, not the members of that collective. In reality, this practice is discriminatory. Recognising certain rights for a community or a collective based on culture, race, or religion in essence means depriving the individual members of that collective of the universal laws of the larger society. It gives prevalence to the collective vis-à-vis individuals. Thus, contrary to what the defenders of multi-culturalism like to portray, this practice is not egalitarian but discriminatory. In a given society, there must exist one set of laws that applies to all citizens, not different laws applying to different communities.

    Ali Javadi: Frankly, I don’t believe in any ‘minority rights’. This is a totally reactionary concept. I am for the general and undeniable rights of individuals in society. These individual rights are universal and should apply to everyone independent of their race, gender, religion, or ethnicity.

    I am sure that individuals have different ‘cultures’ and ‘religions’; however, subscribing to a different ‘culture’ or ‘religion’ does not automatically give anyone a different set of rights in society. To understand how reactionary this concept is, let me give you an example. In Islam, girls can be forced to marry a man when they become nine years old. Marrying a minor at the age of nine is child molestation or paedophilia and is punished severely in many western societies. ‘Minority rights’ for Islamists would mean allowing little girls to be raped at the age of nine. This is only one aspect of ‘minority rights’. Politically speaking, multi-culturalism is a reactionary theory designed to make concessions and maintain the control of reaction over segments of society referred to as ‘minorities’.

    I am for an egalitarian society, a society in which every individual has equal rights.

    Worker-communist Review: Some say that disregarding the special needs and rights of minorities leads to racism. Is it racist and discriminatory and ‘Islamophobic’ to ban conspicuous religious symbols or oppose a Sharia court in the west?

    Hamid Taghvaee: I think I have already answered this question. I should just add that the opposite is true. Believing in different sets of laws for different groups of people is racism, and anybody who respects universal human rights should stand up against it. We say people are people in every corner of the world and have the same needs and ideals. They should therefore have the same rights everywhere. A few hundred years ago, for the French revolutionaries say, you wouldn’t even have needed to try to prove this. But unfortunately, for our post-modernists today, this is not the case. They apparently live in the pre-French-revolution era!

    Azar Majedi: I addressed the first part of the question above. I should also mention that I do not recognise the concept of ‘special needs of minorities’. Regarding the second part of the question, I should state that not only is it not racist or discriminatory to oppose the Sharia court in the west or to ban conspicuous religious symbols; it is the contrary. The setting up of such courts is a discriminatory and racist act. (I have explained this issue and talked about Islamophobia further in my speech in Canada, which is published in this issue.)

    Ali Javadi: Let me add that some still believe the earth is flat. Nonetheless, these arguments are pure nonsense. Let me ask: Is raping a nine-year-old child part of the ‘special needs’ of ‘minorities’? Is reducing the status of women to second-class citizens in society part of ‘minority’ rights? These arguments are designed to advance the cause of Islamists in their reactionary holy war against humanity. In fact, it is completely the other way around; dividing society into different ‘groups’ and ‘minorities’ is an inseparable part of a racist approach and notion.

    Worker-communist Review: We are told that banning religious symbols and/or a Sharia court will lead to extremism, yet we see a rise in extremism in the west as a result of multi-culturalism and of the identification of people with the political Islamic movement. Please comment.

    Hamid Taghvaee: You are right again! The recent dispute on the veil and Sharia in western societies shows the rising role of political Islam in the west; multi-culturalism is its intellectual ally! It is the Trojan horse that opens the gates for this medieval ‘culture’ and Islamic Sharia that tolerates nothing and respects nothing but Allah’s holy laws. Tolerating Sharia in the name of multi-culturalism is like defending Hitler in the name of Jews and gays and other minorities! Islam, like all other religions, recognises no boundaries and ignores and denies human beings and their needs and rights in the name of God (the most recent experiences are the Islamic Republic in Iran and the Taliban in Afghanistan). Changing God to ‘Multi-culturalism’ makes no difference. Civilised humanity should stand up against political Islam and its multi-culturalist apologists. This is the only way of respecting and defending any humanistic aspect of any culture in the world.

    Azar Majedi: I do not see any direct relation between these two, i.e. the rise in one would result in the rise or fall of the other. As far as political Islam is concerned, the main characteristic of this movement is extreme reaction, and its main tool for political advancement is resorting to terror. The rise in the identification of certain sections of the society in the west with political Islam, especially among the youth, is a result of a more complex situation. I believe that the existing racism in the west, the socio-economic deprivation of the immigrant population, or citizens from non-western origin, the alienation this section feels and so on create fertile ground for resentment towards the west and western values. On this ground, and in the absence of a strong, progressive and humanitarian anti-racist and pro-integration movement, political Islam has been able to recruit using its aggressive methods of propaganda. Political Islam has been able to take the real resentment and frustration of this section of the population hostage and cash in on it.

    Ali Javadi: I don’t believe there is any correlation between the rise of ‘extremism’ and banning of religious symbols in workplaces and schools. The rise of political Islam and its reactionary holy war does not have much to do with these restrictions.

    Political Islam as a movement emerged in the seventies, when the secular-nationalist states in the Middle East and North Africa were in deep political and ideological crisis. The objective of this movement is to restructure political power in these societies in order to have a bigger share. Extremism, terror and the maiming of people are the main instruments this movement uses to advance its reactionary cause. One should not pay any attention to these arguments.

    The above interview was first published in the Worker-communist Review 1 dated June 2004.

  • Are You an Altie?

    A while back on misc.health.alternative, a term was coined to describe people who are so militantly pro-alternative medicine and so distrustful of conventional medicine that they will never admit when conventional medicine is effective and refuse ever to concede that any alternative medical practitioner might, just might, possibly be a quack. (Certain regulars on misc.health.alternative inspired this term. One day perhaps I will discuss a couple of specific examples with actual posts by them to Usenet, so that you can see even more clearly what I mean.) I forget which m.h.a. skeptical regular coined the term, but the term was “altie.” About a year ago, we even came up with a Jeff Foxworthy-like list of traits of alties (“You might be an altie if…”). Several regulars in m.h.a. contributed, after a regular named Rich Shewmaker got the ball rolling.

    DISCLAIMER: Before the hate mail and nasty comments start rolling in, please remember that the following traits (and the term “altie”) are NOT meant to describe all (or even most) users of alternative medicine or people who think certain alternative medicine modalities are useful treatments. They describe a strident, anti-intellectual, and anti-science subset of alt-med users, who tend to make impossibly grandiose claims for their favorite remedy and usually also express a strong distrust (or even hatred) of conventional medicine. The problem is, rational users of alt-med, who have a more realistic concept of where it might and might not be useful, tend to be reluctant to criticize alties, at least on Usenet and web discussion groups. Unfortunately, alties are not hard to find. So, without further ado, here we go:

    YOU JUST MIGHT BE AN ALTIE IF….

    • If you believe that doctors, scientists, and the pharmaceutical companies conspire to suppress your favorite “alternative medicine” modality, you just might be an altie.
    • If you like to claim that science is a religion, you might be an altie.
    • If you accept vague and/or poorly documented anecdotes and testimonials as sufficient evidence that an “alternative” therapy “works,” you just might be an altie.
    • If you make claims for a product or therapy like, “strengthens the immune system,” “restores balance,” “detoxifies the liver,” “cleanses the colon,” or “cleanses the blood,” you may be an altie.
    • If you are impressed by such claims when made by others, you just might be an altie.
    • If you do most of your “scientific” research on websites that exist to sell “alternative health” products, you might be an altie.
    • If you carefully avoid any criticism of any “alternative medicine” practitioner, product, or theory, regardless of how mind-numbingly obviously unscientific, illogical, internally inconsistent, or fraudulent it may be, you might be an altie.
    • If you accept or agree with every vilification of medicine and science as The Truth, regardless of the source or of how obviously irrational, without basis, or unjustified the vilification is, you might just be an altie.
    • If you believe that Hulda Clark is being unjustly “persecuted” by “conventional medicine” and/or “the government” because she is a “threat,” you are very likely an altie.
    • If you absolutely, positively cannot ever admit that a conventional therapy, any conventional medical therapy, can cure a disease, any disease, you may well be an altie.
    • If you believe that vaccines “don’t work” or that they cause autism or other chronic diseases, you just might be an altie.
    • If you believe anything you read at Whale.to or Cure Zone, you just might be an altie.
    • If you regularly post to the message boards on Cure Zone, you’re very likely to be an altie. Explanation: Cure Zone’s message boards are highly moderated. (Translation: censored.) Skeptical posts, no matter how polite, unabusive, or well-reasoned, are often summarily deleted by the moderators. If a skeptic persists in questioning the alt-med dogma there, he/she will usually eventually be banned by the moderators.
    • If you think misc.health.alternative should be a sunny little support group where true believers in alternative healthcare share testimonials and gleefully trash science and medicine without comment from skeptics (in other words, if you want it to be like Cure Zone), you may be an altie.
    • If you think it’s OK for misc.health.alternative (or any other such newsgroup) to be awash in advertising for snake oil quackery and other spam, you may be an altie.
    • If you believe that alternative medicine practitioners are far more caring for their patients and far more moral (and therefore, by implication, less corruptible by money) than conventional doctors, you just might be an altie.
    • If you believe that companies selling alternative medicines have every right to charge high prices for their products (example: Glow Life charging hundreds of dollars for a 150 g tin of Ginseng powder, as I described earlier), but that pharmaceutical companies (which spend hundreds of millions of dollars and several years to get each new drug developed, tested, and approved) don’t, you are very likely an altie.
    • If you dismiss every well-designed randomized clinical study that failed to show a benefit for an alternative medicine or therapy over placebo control as either not proving that the therapy is ineffective or as having been manipulated by nefarious forces (conventional medicine, the pharmaceutical companies, the government, etc.) to produce a negative result, you may well be an altie.

    Feel free to send me suggestions for more “You just might be an altie” items!

    By the way, I’ve got dibs on this one: If you are deeply offended by the above list, you just might be an altie!

    This article first appeared on the blog Respectful Insolence and is published here by permission.

  • Stop the Death by Stoning of a Woman in Iran

    Hajiyeh Esmaelvand lives in the city of Jelfa in Iran. She has been condemned to death by stoning. The Islamic court in Iran has issued a verdict of execution by stoning, to be carried out two weeks from now, for her having had sexual relations outside marriage.

    Think about it: a lot of people all over the world are looking forward to some time off and the celebrations that they are going to have in two weeks’ time. The Christmas and New Year season is just around the corner. In another part of the world, a woman is suffering the trauma and fear of the deadly moment awaiting her.

    The Islamic government of Iran is planning to kill a human being by casting stones at her. She is to be buried in a ditch up to her neck and stoned to death. This is the reality of Islam in power. It must be stopped.

    In the past, through our intensive work and international support we have managed to stop stonings in Iran and Nigeria. We can do it again.

    Join us to save Hajiyeh’s life. Support us in putting pressure on the Islamic Republic of Iran to ban stoning.

    Recommended actions:

    Write to the Iranian President Mohammad Khatami demanding:

    Immediate abolition of stoning and all other forms of punishment for extra-marital relations and all other Sharia laws;

    Immediate release of Hajiyeh Esmaelvand and all those imprisoned for extra-marital relations.

    Email: khatami@president.ir Fax: 0098 21 649 5880

    Please send your letters to us: Mina Ahadi Email: minaahadi@aol.com

    Tel: 0049-01775692413 Fax : 0049 2012488510 www.stopstoningnow.com

  • The Stinking Ninth Class

    It’s a hard life for educated folk. Earlier this year, the Chinese state news agency Xinhua reported that the life expectancy of the Chinese intellectual was, at 58, more than ten years lower than the national average. A survey also showed that 76% of the nation’s journalists died between 40 and 60.

    Many were surprised by the findings. The insanities of Chairman Mao’s “anti-rightist” campaigns and, worse still, the Cultural Revolution, had by now given way to a kind of modus vivendi. Intellectuals were no longer the “stinking ninth class” of society, some way behind criminals, prostitutes and vagrants in a peasant-led pecking order. By now, in the interests of “stability and economic development”, there would be no more mass persecutions. While freedom of speech would not be tolerated, at least the freedom to earn money had now been firmly established. The network of informers began to disintegrate, the role of the Party in ordinary life began to recede, and something approaching a civil society had begun to develop.

    Still, something was obviously not quite right. While intellectuals are no longer being buried alive, as they were during the brutal reign of the Emperor Qin Shihuang, they are marginalized by a Singapore-style authoritarian consumer society and hemmed in by the old-fashioned strictures of the Chinese Communist Party. The recent discussion in the state press about the role of “public intellectuals” is a case in point.

    George Steiner once suggested that the persecution of intellectuals in the old Soviet bloc indicated at the very least a kind of respect for high culture. “Writers were persecuted and killed precisely because literature was recognized as an important and potentially dangerous force,” the old polymath wrote in his 1961 essay, The Writer and Communism. The role and influence of the intellectual was enhanced by the fact that he was being oppressed, and the ultimate compliment a government could pay to its grand penseurs was to lock them up. After all, in bourgeois democratic countries, intellectuals had been sidelined, and their attempts to enrich cultural life had been smothered by the mass media, where thoughtless gratification was the norm.

    In his memoirs, Steiner described his sense of intoxication while listening to classical music recitals and watching the staging of “serious plays” by Sophocles and Brecht in the former East Berlin. After the Wall fell, he said, “virtually overnight, freedom reclaimed its inalienable right to junk food.” The liberated masses rushed not to dissident poets, but to adult cinemas and McDonald’s.

    It was no doubt reassuring for the likes of Blok, Mayakovsky and Akhmatova, and the great Chinese playwright Lao She, hounded to his death by the Red Guards, that at least the government was paying implicit homage to the power and influence of intellectuals.

    Steiner did at least concede that the inculcation of love for Bach fugues and the more recondite poems of Goethe came at a high price, and was hardly worth the gulags. But he could not hide his regret. Modern capitalist democracies paid obeisance to the twin idols of “Madonna and Maradona”, Steiner wrote, and were swamped with porn and filth and fast food. Rare book shops in Prague were being ripped up and replaced by smut sellers and burger joints.

    China, it seems, has the worst of both worlds. One veteran China watcher once described the post-Mao reforms as a Leninist velvet prison with consumer characteristics. Trash and kitsch remain easy to find, and while you might be able to attend the occasional recital of ancient pipa compositions, should you so wish, the sound of that delicate instrument is invariably drowned out by mobile phones.

    “To get rich is glorious,” the late leader Deng Xiaoping urged. Render unto Caesar what should be rendered, and leave everything else to Mammon. We now have a nation where the fairy light is the most prominent cultural symbol, and where the quest for riches has become the simplest avenue of achievement.

    The recent debate about the role of public intellectuals began when the relatively liberal weekly magazine, Southern People’s Weekly, published a list of the 50 most influential cultural figures in China, led by the exiled poet, Bei Dao, and including the veteran Beijing rocker Cui Jian, who was at the height of his subversive influence in the years immediately after the Tian’anmen Square crackdown in 1989. The magazine noted that the market economy had led to the marginalization of public intellectuals, but they had never been more necessary.

    And so, facing marginalization by the market, they now had to face another salvo from the Propaganda Ministry. Two months later, the Shanghai-based government organ, Liberation Daily, published a characteristically reactionary editorial mocking the “imported” notion of “public intellectuals” and accusing the figures on the list of being estranged from the Party and the masses. “Public intellectuals” are arrogant and elitist, and trying to monopolize debate with their own views, it said.

    Communist Party watchers might have been reminded of the case of the Hungarian writer, Tibor Dery, condemned for leading an “organization hostile to the state” in the wake of the Soviet crackdown in 1956. What might this hostile organization be, the joke ran. It was the Hungarian people.

    In any case, in its bilious doctrinal carping, the editorial was quite exemplary, a sinister, jargon-ridden spasm of Stalinesque nastiness. Out of touch, certainly, but beneath its cod-Marxist babble, there were signs that a renewed assault on press freedom was on its way.

    It quickly drew the attention of the foreign media – themselves anxious, in these confusing times, to find evidence that the core values of the CCP had not changed, and that the whole Party edifice tottered precariously on a huge wave of socioeconomic transformation and was now returning, reflexively, to its roots. The Economist was particularly scathing, and attributed the latest government assault to the freedom afforded to the modern Chinese thinker by the internet. According to the Christian Science Monitor, which broke the story, the Propaganda Ministry had now ordered newspapers to refrain from compiling such lists and from paying too much attention to the assortment of poets, writers, environmental activists and other critical voices now on its “gray list” of troublemakers.

    Meanwhile, Chinese newspapers, driven for the first time by market pressures, are filling their pages with the blathering of TV celebs, with salacious tales involving ménages à trois or cocaine busts, failed bank heists and kidnappings, tawdry trysts in saunas and karaoke houses, and the sort of titillating tabloid fodder that attracts readers of every stripe in every country. The papers have been anxious to boost their circulation, and the readers themselves rarely want to read editorials issued by the Propaganda Ministry.

    This has led to tensions, between mass-market philistinism on the one hand, and the government on the other.

    TV stations pile celebrity profiles upon celebrity profiles, regaling us with the eating habits and interests of the latest manufactured Taiwanese pop band or Hong Kong diva, just like western TV, only without the counterweight of serious news and debate. Desperate for advertising revenues, they try filling prime-time with real-life crime shows, and one regional station even introduced bikini-clad weather girls, but they are then rapped on the knuckles for “corrupting the morality of the youth”.

    The crisis of Chinese civilization that hit during and after the collapse of the imperial order in 1911 was regarded as a golden age for intellectuals, but their role has always been slightly different from that in the west. They were traditionally the servants of their nation, motivated not by what Graham Greene called the “duty of disloyalty” but by a genuine desire to serve. And so, thinkers queued up to declare solutions to the national malaise, drawing inspiration from Herbert Spencer, John Stuart Mill, Dewey or even the anarchist Bakunin. After the Bolshevik revolution, they finally turned to Marx.

    Intellectuals were always merely functional, instruments of the state. When we shoot a crossbow at a target, we do not praise the arrow, said Mao Zedong in 1942 during a criticism of the writer, Ding Ling. It was a sign of things to come. The intellectuals had played a crucial role in the revolution, but they would quickly come under attack themselves.

    In the novelist and essayist Lu Xun, China had at least one great writer and political figure. Lu Xun died of tuberculosis in 1936, but was already thoroughly sceptical about the revolution, suggesting that “the oppressed quickly turn into the oppressors”. After his death, he remained a key figure, both lionized and bowdlerized by the regime, with statues and shrines set up to celebrate him as a “champion of the Party”. The only good intellectual, it seems, was a dead intellectual.

  • Sorry to Disappoint, but I’m Still an Atheist!

    Has Antony Flew ceased to be an atheist?

    In a sensationalist campaign on the internet, it is alleged that Professor Antony Flew, British philosopher, reputed rationalist, atheist and Honorary Associate of Rationalist International, has left atheism and decided that a god might exist.

    The controversy revolves around some remarks of Prof. Antony Flew that seem to allow different interpretations. Has Antony Flew ever asserted that “probably God exists”? Richard Carrier, editor in chief of the Secular Web, quotes Antony Flew from a letter addressed to him in his own hand (dated 19 October 2004): “I do not think I will ever make that assertion, precisely because any assertion which I am prepared to make about God would not be about a God in that sense … I think we need here a fundamental distinction between the God of Aristotle or Spinoza and the Gods of the Christian and the Islamic Revelations.”

    This is not the first time that Professor Antony Flew’s atheist position has been attacked. In reaction to an internet campaign in 2001 that tried to brand him a “convert” to religious belief, Professor Antony Flew made the following statement. In 2003 he answered yet another campaign in this direction with the same statement. It remains his official position in this regard.

    Richard C. Carrier, current Editor in Chief of the Secular Web, tells me that “the internet has now become awash with rumors” that I “have converted to Christianity, or am at least no longer an atheist.” Perhaps because I was born too soon to be involved in the internet world I had heard nothing of this rumour. So Mr. Carrier asks me to explain myself in cyberspace. This, with the help of the Internet Infidels, I now attempt.

    Those rumours speak false. I remain still what I have been now for over fifty years, a negative atheist. By this I mean that I construe the initial letter in the word ‘atheist’ in the way in which everyone construes the same initial letter in such words as ‘atypical’ and ‘amoral’. For I still believe that it is impossible either to verify or to falsify – to show to be false – what David Hume in his Dialogues concerning Natural Religion happily described as “the religious hypothesis.” The more I contemplate the eschatological teachings of Christianity and Islam the more I wish I could demonstrate their falsity.

    I first argued the impossibility in ‘Theology and Falsification’, a short paper originally published in 1950 and since reprinted over forty times in different places, including translations into German, Italian, Spanish, Danish, Welsh, Finnish and Slovak. The most recent reprint was as part of ‘A Golden Jubilee Celebration’ in the October/November 2001 issue of the semi-popular British journal Philosophy Now, which the editors of that periodical have graciously allowed the Internet Infidels to publish online: see “Theology & Falsification.”

    I can suggest only one possible source of the rumours. Several weeks ago I submitted to the Editor of Philo (The Journal of the Society of Humanist Philosophers) a short paper making two points which might well disturb atheists of the more positive kind. The point more relevant here was that it can be entirely rational for believers and negative atheists to respond in quite different ways to the same scientific developments.

    We negative atheists are bound to see the Big Bang cosmology as requiring a physical explanation; and that one which, in the nature of the case, may nevertheless be forever inaccessible to human beings. But believers may, equally reasonably, welcome the Big Bang cosmology as tending to confirm their prior belief that “in the beginning” the Universe was created by God.

    Again, negative atheists meeting the argument that the fundamental constants of physics would seem to have been ‘fine tuned’ to make the emergence of mankind possible will first object to the application of either the frequency or the propensity theory of probability ‘outside’ the Universe, and then go on to ask why omnipotence should have been satisfied to produce a Universe in which the origin and rise of the human race was merely possible rather than absolutely inevitable. But believers are equally bound and, on their opposite assumptions, equally justified in seeing the Fine Tuning Argument as providing impressive confirmation of a fundamental belief shared by all the three great systems of revealed theistic religion – Judaism, Christianity, and Islam. For all three are agreed that we human beings are members of a special kind of creatures, made in the image of God and for a purpose intended by God.

    In short, I recognize that developments in physics over the last twenty or thirty years can reasonably be seen as in some degree confirmatory of a previously faith-based belief in God, even though they still provide no sufficient reason for unbelievers to change their minds. They certainly have not persuaded me.

    Copyright © 2004 Rationalist International. The recipients of Rationalist International Bulletin may publish, post, forward or reproduce articles and reports from it, acknowledging the source: Rationalist International Bulletin # 137.

  • Review of Taner Edis’ Ghost in the Universe

    Taner Edis’ excellent book The Ghost in the Universe comes to us at a curious moment for writing about science and religion. Never before have so many books tried to analyze the relationship between theological and scientific views of the world, and never before have so many utterly failed in the attempt. Often, writers distort something essential about both disciplines, and ignore the complexities at the heart of their relationship. Thus, although the bookshelves groan under the weight of volumes contributing to the debate, clear-minded analyses of the fundamental issues are harder than ever to find.

    To better understand the achievements of Edis’ book, we should quickly survey some of the competing contributions. We find many theistic writers enlisting bogus paranormal phenomena to prop up arguments for God’s existence, turning a blind eye to scientific evidence disproving those same phenomena. Meanwhile, creationists tirelessly seek gaps in scientific knowledge to fill with theistic explanations. Then we find the academic theologians, who often write in convoluted, abstract styles that defy clear summary. Partially in response to the sins of believers, writers who oppose religion document its various shortcomings, but also reduce it to a collection of false empirical claims and naïve hopes. In doing so, they fail to address the ethical and social functions from which religion draws much of its appeal.

    Authors sympathetic to religion seek to avoid the mistakes of religion’s most ardent supporters and passionate detractors, but usually commit distortions of their own. The late Stephen Jay Gould, for instance, redefined “religion” as a docile hybrid of wide-eyed wonder and ethical discourse, ignoring its claims to divinely revealed truth about the empirical world. This definition should lead us to wonder why so many theists insist on rejecting naturalistic explanations of the world, if these explanations cannot undermine religious belief. Following a slightly different strategy, writers such as the outstanding science educator Eugenie Scott reduce science to a provincial system of methodological naturalism, incapable of addressing questions about the existence of God. This view is appealingly nonconfrontational, and assuages fears that science dissolves cherished religious doctrines. Yet, it does not explain why its proponents don’t believe in unicorns, leprechauns or other such conjectural entities, all of which also lie outside the imagined boundaries of science. In fact, it doesn’t help us to understand why many of its proponents, including Scott herself, do not believe in God. Slanted reasoning and wishful thinking dominate at all levels of the debate, leaving the honest inquirer struggling to find informed analysis.

    The Ghost in the Universe is an extremely important step in the right direction. The great strength of the book stems from Edis’ mastery of vast quantities of knowledge from different subject areas, and his ability to incorporate this information into solid generalizations. His attention to factual and logical details causes him to resist playing fast and loose with definitions of terms like “science,” “religion” or “God.” Instead, Edis carefully interrogates the assumptions and definitions of most popular discourse about science and religion, and uses historical, scientific and philosophical insights to illuminate every issue he discusses. In fact, the book goes a long way toward explaining why Eugenie Scott and many other scientists no longer believe in God, even if some of these same scientists diplomatically deny causal connections between scientific literacy and non-belief.

    Edis begins the book by surveying the failures of traditional theistic arguments, and examining attempts by contemporary theologians to explain the concept of God. He explains the ideas of theists ranging from traditional creationist Henry Morris to academic theologian Richard Swinburne. In the process, he shows that philosophically sophisticated theologians have rightly rejected the simplistic model of God as a “Great Boss in the Sky,” but have been unable to present a logically coherent alternative. In fact, Edis persuasively argues that these theologians have multiplied the conceptual confusions of theism by presenting God as an abstraction with no tangible connection to the real world. For instance, we read that “God is absolute because the divine experience is composed of the totality of all experience” in the work of process theologian Frank T. Miosi, but what can such opaque rhetoric really tell us about God? Heinrich Heine seemed to have this kind of pseudo-discourse in mind when he noted, “in such a style, the truth cannot be told.” Indeed, many academic theologians seem to have lost interest in truth altogether. As we wade through the swamp of double-speak in their discourse, we find nothing solid on which to build a foundation of belief. Neither we nor they seem to understand why this new, bloodless God should even matter. He seems to be a thin wisp of speculation – a Ghost in the Universe.

    As Edis argues, traditional creationists like Morris understand the problem of God better than their more “respectable” peers in one important way – they realize that an all-powerful, all-loving Creator would leave tangible evidence of his existence. That is, a world designed by a loving, powerful deity should be distinguishable from a world designed by physical laws alone. It doesn’t make sense to claim God as the “ground of all being,” as obscurantist theologian Paul Tillich did, while also retreating from any real claims about what God is, and how we can know he exists. Thus, Morris cites the Second Law of thermodynamics and other empirical details in an attempt to explain why God is necessary. The attempt fails, because he states and interprets the facts incorrectly, but at least he strives for an intellectually coherent theology. As Edis concludes, “Morris’ basic approach of presenting a religious picture of the world and supporting it with science is correct. There is no doubt the creationists’ God makes a difference.” The God of the philosophers, however, is a conceptual muddle:

    We do not know what it means for the unsurpassingly great, wholly unanthropomorphic God of sophisticated theism to exist or not. Devout philosophers still say a God exists and cares about us, but such religiously crucial fact claims have become idle words. We cannot even take a leap of faith against the evidence and accept such a God, for we have no idea what we are supposed to be talking about.

    Thus, we come to the modern dilemma of discussing God. If we try to build a case for theism based on scientific facts and explanations, the case invariably fails. If, however, we try to paper over the conceptual problems of theism with tortured abstractions, we end with a model of God that is both irrelevant and incomprehensible.

    Things weren’t always this way. At one time, God was more than a hypothetical abstraction, and faith in his providence and design buttressed every major discipline of study. Theology was the queen of the sciences, because it confidently claimed insight into the nature of this benevolent God. The natural theology of the 17th and 18th centuries stood upon the belief that growing scientific knowledge would reveal more evidence of God’s providential design. As Edis observes, John Ray’s popular 1691 book The Wisdom of God Manifested in the Works of Creation was typical in its faith that everything in nature had its divinely ordained place, and the great chain of living beings embodied God’s plan for creation. For centuries, this belief changed in detail, but not in essence. In Thomas Burnet’s late 17th century work The Sacred Theory of the Earth (not discussed in Edis’ book), the scholar integrated contemporary geological knowledge into a history of the earth grounded in Christian theology. In this book, God mattered, and science and religion were two complementary and necessary approaches to a cohesive worldly philosophy. The origin of the earth, the Great Flood of Genesis, and the assumed final conflagration of the planet were depicted as historical events in the lifespan of creation, all benevolently presided over by Jesus.

    Within a couple of centuries, things would change. Scientific knowledge about the natural world progressively eroded the credibility of evidence for God’s handiwork. The Copernican revolution removed earth from the center of the universe, and showed that the same materialistic laws governed earth and the most distant stars. Geology showed the earth was much older than most people previously believed, casting serious doubt on the idea that such a vast history could be a planned prelude to mankind’s grand entrance. Darwinian evolution followed up by puncturing arguments from design, revealing the history of life to have nothing to do with predetermined plans and everything to do with survival and contingency. Physics and cosmology have also revealed an immensely old universe with no apparent need for a Creator. Through these developments, the traditional empirical arguments for God have crumbled.

    The rise of objective social-science research has further damaged the case for theism by removing God from the stage of history. We now see that nations and peoples wage war for complex social and cultural reasons having nothing to do with God’s purposes. Higher Biblical criticism has revealed the Bible to be a collection of legends without solid basis in fact. A similar deflation of sacred versions of history occurs when we critically examine the history of Islam or Buddhism. Indeed, one of the merits of Edis’ book is its familiarity with critical histories of other religions, especially Islam. Drawing upon these histories, he shows that stories of miracles and divine revelations have consistently fractured when subjected to rigorous inquiry. Today, these stories are best understood as culturally determined products of human hopes rather than real historical events. This knowledge has understandably crippled our ability to create a meaningful theology. If there were utterly singular and inexplicable events in human history, stories of divine intervention would be more credible. But God is not in the details, after all. As Edis states,

    The practice of critical history is also continuous with that of science. As in science, the results of historical inquiry are not predetermined by principles of interpretation. Historians could have ended in supporting the Bible; we could have discovered that Jewish history, for example, follows a pattern of disloyalty and retribution. We could even have found something about a transcendent force behind events, and changed how we do history accordingly. We did not. In the modern world, we had to risk putting sacred history to the test; unless revelation is an objective historical fact, as conservative theologians fear, God itself increasingly becomes a psychological metaphor. And if critical historians had confirmed the claims of our religions, theologians would surely be celebrating. Instead, we have found that our history fits the naturalistic world of science. The only purposes shaping the courses of events appear to be our own. If theologians now cry foul, this shows the depth of loyalty to their myth, but no more. We have learned something about how to do history, as well as natural science; theological spin-doctoring under the name of “interpretation” is not part of either.

    We may add that the genocides of the twentieth century cast ridicule on the notion that God cares for and protects his children. It stretches credibility to claim knowledge of God’s boundless love and infinite powers while the charred victims of ethnic warfare rot in common graves.

    Edis also argues that the existence of God was a plausible hypothesis when it was integrated into a supernatural model of the world. Just as gravity integrates physical evidence as varied as the orbital motion of planets and the acceleration of falling bodies, the notion of God once seemed to unify a whole range of otherworldly phenomena. To the medieval mind, exposed to the threat of plagues and other deadly diseases, the world seemed to be swayed by mystical forces. Demons and witches allegedly caused epidemics and village fires, and worship of particular saints could hypothetically protect humans from the ravages of unseen evils. Many believed that God heard and answered prayers, while others believed in ongoing divine revelations through prophecies or visitations by angels. As long as people remained ignorant of science or human psychology, such anecdotes about supernatural occurrences could convince even the most well-informed members of society. That is no longer the case. Instead of confirming the existence of angels and demons, science reveals them to be products of human imagination (although admittedly, you’d never know this from watching television). If we had good reasons to believe in the existence of the kind of supernatural beings discussed by most major religions, we’d have better reasons to believe in God as well. Likewise, if we could show a significant tendency for prayers to be answered, we’d have reason to believe in a God who listened to and responded to these prayers. No such evidence exists.

    This brings us to the present day. As we can see, things could have turned out differently. We could have collected evidence showing that the history presented by one of the world’s religions was essentially correct, divine interventions and all. We could have shown that prayer makes a difference, or confirmed reports of visits by angels or other ethereal beings. Science could have led us to conclude that life has been guided by the hand of divine providence, and the world looks exactly like we’d expect it to if a loving God designed it for our use. Instead, as Richard Dawkins once pointed out, the world looks exactly the way we’d expect it to if there is no God. Why would a beneficent and kindly Creator choose such a messy, wasteful and indirect way to create his cherished human beings – why would He choose the one way that makes it look as if He doesn’t exist? All evidence points to the wholesale falsity of the entire supernatural worldview, and thus undermines potential reasons to believe in God.

    Often, contemporary believers congratulate themselves for their open-mindedness, and laugh at the literal thinking of children who believe God is a bearded, kindly old man in the sky. Yet, Edis’ masterful analysis shows that these children correctly intuit something that their parents have trained themselves not to see – a being as important as God should be recognizable in his creation. A God that matters must be more than a linguistic conjuring trick, or a name we impose on our own hopes and fears. Otherwise, religion will have nothing tangible to say about how we should live, or what God might want from his creatures. But if there are no good reasons to believe in God, how can religion have any significance at all in the modern world?

    This problem, as Edis recognizes, is a central dilemma of religion. While recognizing that no definitive answer may exist, Edis insightfully discusses some of the functions of religion that do not fully depend on its truth claims. For instance, the sacred texts of many religions offer compelling narratives which, at their best, can promote ethical reflection and a sense of shared experience. Like all good narratives, they can give us the ability to appreciate the complexities of human life, and to give meaning to our experience of the world. This meaning is valuable even when we know the story itself never literally happened.

    For instance, Edis describes the sense of wonder that he and many other readers experience through the purely fictional work of authors like J.R.R. Tolkien. Drawing a perceptive analogy with sacred literature, Edis argues that Tolkien’s vast body of mythological lore, resulting partially from his many volumes of unfinished material, negates the possibility of carving a fully coherent worldview out of the great author’s work. Yet, against the claims of fundamentalists seeking simplistic and inflexible understandings of scripture, Edis asserts that this complexity is precisely what makes sacred literature so vitally resonant with human needs. As Edis concludes,

    Perhaps we can think of religions the same way. At their best, they are stories we can appreciate regardless of whether they are remotely true, morally uplifting, or practically significant. After all, human hopes and desires are an incoherent mess, so to consistently speak to us, a myth must be able to generate many different, contradicting levels of meaning. So even the strange, disreputable corners of religion – Gnostic visions and mystic cosmologies, demented apocalyptic fantasies, legends of magic and mystery set in ancient times – are wonderful stories.

    Edis surely knows that this argument can only be taken so far – when all is said and done, the religious faithful want to believe that their myths provide meaning as well as disclose deep realities about the purpose of the universe. Sacred stories, as literary critic Daniel Green argues, are like other fictional narratives in their selective recreation of reality, but with the important difference that many readers of sacred texts passionately want the narratives to be true. Still, Edis is probably onto something when he argues that the rigidly literal interpretations of fundamentalists are not the only valid ways of interpreting sacred literature. Fundamentalists opposed to the deeper reading of the Bible sometimes argue that religious texts demand to be seen as more than convenient fictions, or they will serve no function for us. Yet, this assertion ignores the fact that stories don’t have to be true to be useful, and are often most useful when they’re not convenient. There’s nothing convenient about the multileveled tragedies of Shakespeare, or the intricate moral universe of Jane Austen’s Persuasion. Similarly, it is precisely because some of the Bible is so complex and multifaceted that it serves so well as fodder for ethical reflection. Much of the Bible may not be literally true, but all of it would be considerably less valuable if it were as simplistic as fundamentalists claim it to be.

    Thus, while Edis does not consider religion the only or even the best way to engage in moral reflection, he recognizes its usefulness as one possible expression of moral meanings. Religion may be an illusion, but it is at least partially a rational one because of the social functions it serves. Secular humanists often give short shrift to the possible benefits of religion, so Edis’ attention to this facet of theism comes as a welcome contribution. It is this mixture of empathy and clear-minded analysis that finally distinguishes The Ghost in the Universe from most other books about science and religion. No one interested in this subject can ignore this book – it is a masterpiece.

  • Introduction to Creationism’s Trojan Horse

    Introduction

    It used to be obvious that the world was designed by some sort of intelligence. What else could account for fire and rain and lightning and earthquakes? Above all, the wonderful abilities of living things seemed to point to a creator who had a special interest in life. Today we understand most of these things in terms of physical forces acting under impersonal laws. We don’t yet know the most fundamental laws, and we can’t work out the consequences of all the laws we know. The human mind remains extraordinarily difficult to understand, but so is the weather. We can’t predict whether it will rain one month from today, but we do know the rules that govern the rain, even though we can’t always calculate the consequences. I see nothing about the human mind any more than about the weather that stands out as beyond the hope of understanding as a consequence of impersonal laws acting over billions of years.

    Steven Weinberg, 1979 Nobel Laureate in Physics

    Dr. Fox’s Lecture

    Nearly thirty years ago one of the funniest articles ever published in a respectable medical journal appeared. Of course, it was not meant to be funny. Its purposes were serious and sober enough. The conclusions, moreover, were trustworthy and had important implications for education at all levels. In fact, the conclusions had implications for all conveyance of knowledge by experts to intelligent, but nonexpert, audiences. In the Journal of Medical Education, D. H. Naftulin, M.D., and colleagues published a research study entitled “The Doctor Fox Lecture: A Paradigm of Educational Seduction.”1 There is no better way to explain the intention and the results of this work than to quote from its abstract:

    [T]he authors programmed an actor to teach charismatically and nonsubstantively on a topic about which he knew nothing. The authors hypothesized that given a sufficiently impressive lecture paradigm, even experienced educators participating in a new learning experience can be seduced into feeling satisfied that they have learned despite irrelevant, conflicting, and meaningless content conveyed by the lecturer. The hypothesis was supported when 55 subjects responded favorably at the significant level to an eight-item questionnaire concerning their attitudes toward the lecture. (emphasis added)

    For purposes of this experiment, the investigators hired a mature, respectable, scholarly looking fellow, a professional actor. He memorized a prefabricated nonsense lecture entitled “Mathematical Game Theory as Applied to Physician Education.” The better popular science magazines had recently covered (real) game theory and its possible applications, so the title was appropriate. The silver-haired actor was trained to answer affably all audience questions following his lecture, by means, as the authors explain, of “double talk, neologisms, non sequiturs, and contradictory statements. All this was to be interspersed with parenthetical humor and meaningless references to unrelated topics.”2 In two of the three trials of this experiment, the audience consisted of “psychiatrists, psychologists, and social-worker educators,” while that of the third trial “consisted of 33 educators and administrators enrolled in a graduate level university educational philosophy course.” This counterfeit scholar of “Mathematical Game Theory” was called Dr. Myron L. Fox, and a fraudulent but respectful and laudatory introduction was supplied.

    Very interesting data followed from the survey and questionnaire administered after each session in which Fox’s (and other) presentations were made. These were simply the detailed statistics of approval or disapproval. The phony Dr. Fox’s presentations of discoveries in mathematical game theory were strongly approved by these educationally sophisticated, lecture-experienced audiences. But the really funny results are in the “subjective” comments added to the questionnaire, that is, in what listeners wrote as prose responses to the invitation to comment (the following comments are from a number of different respondents). “No respondent [in the first group],” Dr. Naftulin and his co-authors wrote, “reported having read Dr. Fox’s publications. [But] subjective responses included the following: ‘Excellent presentation, enjoyed listening. Has warm manner. Good flow, seems enthusiastic. What about the two types of games, zero-sum and non-zero-sum? Too intellectual a presentation. My orientation is more pragmatic.’” From the largest group of subjects for this experiment, the substantive comments were, if possible, even funnier: “Lively examples. His relaxed manner of presentation was a large factor in holding my interest. Extremely articulate. Interesting, wish he had dwelled more on background. Good analysis of subject that has been personally studied before. Very dramatic presentation. He was certainly captivating. Somewhat disorganized. Frustratingly boring. Unorganized and ineffective. Articulate. Knowledgeable.”3

    We highly recommend this article. It should still be possible to find it in any university, especially one with a good medical or education library. The “educational seduction” of the title refers to what “Dr. Fox” did for (and to?) his listeners. This result and many others like it should have affected all schools of education, if not teachers generally. However, such was not the case. The possibility, indeed the likelihood, of intellectual “seduction” in circumstances such as these is probably increasing as specialization increases. Countless clones of Dr. Fox tread the academic and public policy boards today, as always. Readers familiar with the now universal practice in higher education of using end-of-course student evaluations as key evidence in faculty promotion and tenure decisions will know this: evaluations by students, who lack the requisite knowledge but are called on to judge their professors’ expertise in their disciplines, can determine the academic fate of nontenured faculty and the possibility of merit raises for tenured ones. Intellectual seduction by substantive (“content”) nonsense, offered to audiences who want or like to hear what they are being told, or who simply assume that what they don’t understand must be correct if it sounds scholarly, is nearly universal.

    This book is about a current, national, intellectual seduction phenomenon, not in mathematical game theory, but close enough to it. It is a case, at least formally, not much different from the Dr. Fox lecture, except that the lecturers here actually believe what they are lecturing about, or at least they want very much to believe it, or are convinced that they must believe it. And they are not actors, but executors of a real and serious political strategy. The “audiences” in this case are large; they consist of decent people: students, parents, teachers, public officials across the length and breadth of the United States (and now in other countries of the “developed world”), people who don’t, in most cases, know much about science, especially the modern biological sciences. But they are people who are deeply and justifiably concerned about their religious faith, the state of their society, and the education of their children. They include some people for whom “fairness” and openness to the ideas of “the other side” have become the cherished, even the indispensable, characteristics of our civilization. Their insistence on the equal worth of all earnestly held opinions, whether or not those opinions are well founded, makes them relativists whether they know it or not. This book is about the newest form of creationism, named by its proponents “intelligent design” (ID); but it is, especially, about the organization of the system of public and political relations that drives the movement. That system operates on a very detailed plan, a set of well articulated goals, strategies, and tactics, named “The Wedge” by its executors. It offers an upgraded form of the religious fundamentalist creationism long familiar in America.

    Neo-creationism

    Creationism has been a perennial nuisance for American science education. Despite the persistent fecklessness of creationist arguments and their continued failure in the courts since 1925, the creationists refuse to go away. The attempts to insert religion into public elementary and secondary science education are unceasing, and they now include direct efforts to influence college students as well. Efforts to force it into curricula, especially those having anything at all to do with biology and the history of Earth, have been unremitting since the late nineteenth century, and they have continued into the present. The most notorious recent, nearly successful, attempt was the 1999 deletion of evolution and all immediately relevant geology and cosmology from the Kansas public school science standards, by action of the state board of education. Scientific integrity was restored to those defaced standards only after a protracted political effort to defeat creationist board members and replace them with moderates, who eventually undid the damage to science teaching and to the state’s reputation.

    The defeated have not given up, however; today they are more active than ever in the politics and public affairs of Kansas and other states. And increasingly it appears that pro-evolution (pro-science) victories are secure only until the next election, when old battles may be revived by “stealth” candidates who do not disclose their anti-evolution agenda until after they are elected to office. Soon after the restoration of the integrity of science standards in Kansas, new efforts, even more forceful and better organized than those in Kansas, were mounted in Ohio. More are brewing in several other states, gaining added impetus from the Wedge’s efforts in the United States Congress. Nor is the phenomenon likely to remain limited to the United States; similar efforts are in progress or being planned in a number of other countries.

    This struggle is cyclic; there have been short periods of relative quiet
    after major creationist failures in the courts. But the effects of the struggle
    are being felt today far beyond pedagogy in the schools. They are
    everywhere visible, and except for a few conscientious media outlets,
    they also threaten to lower the already variable and uncertain standards
    of science journalism. Contrary to the perception of most scientifically
    literate people, creationism as a cultural presence has in the recent past
    grown generally stronger, even as its arguments, in the face of scientific
    progress, have grown steadily weaker and more hypocritical. Despite the
    intense activity of creationists, no faction, nor any individual advocate of
    one, and no modern creationist “research” program has as yet come up
    with a new, verifiable, fruitful, and important fact about the mechanisms
    or the history of life or the ancestral relationships among living things on
    Earth. For that reason, the scorecard of scientific successes for any form
    of creationism, including ID theory, is blank.

    Creationists, including the newest kind, the neo-creationist “intelligent
    design theorists” who are the subject of this book, offer an abundance
    of theories. These theories are often decorated with open or only
    thinly disguised religious allusions, and they always include the now-standard
    rejection of naturalism, which is, in these circumstances, the indirect
    admission of supernaturalism. Their contributions to ongoing science
    consist of nit-picking and the extraction of trivialities from the vast
    literature of biology, and of unsupported statements about what, they
    insist, cannot happen: “Darwinism,” that is, organic evolution shaped by
    natural selection and reflecting the common ancestry of all life forms. In the
    face of the extraordinary and often highly practical twentieth-century
    progress of the life sciences under the unifying concepts of evolution,
    their “science” consists of quote-mining: minute searching of the biological
    literature, including outdated literature, for minor slips and inconsistencies
    and for polemically promising examples of internal arguments.
    These internal disagreements, fundamental to the working of all
    natural science, are then presented dramatically to lay audiences as evidence
    of the fraudulence and impending collapse of “Darwinism.” How
    are such audiences to know that modern biology is not a house of cards,
    not founded on a “dying theory”?

    Intelligent Design

    Until a few years ago, “scientific” creationism was led by biblical literalists
    like Duane Gish and Henry Morris, whose Bible-thumping and logic-chopping
    were easy to discount, even for ordinary (nonscience) journalists,
    by exposing the obvious errors of fact and logic, independently of
    the gross errors of actual science. But those old-timers have now been
    eclipsed by a new brand of creationists who have absorbed a part of their
    following: the new boys are intelligent design promoters, mainly those associated
    with the Discovery Institute’s Center for the Renewal of Science
    and Culture (now Center for Science and Culture), based in Seattle,
    Washington. This group operates under a detailed and ambitious plan of
    action: “The Wedge.” Through relentlessly energetic programs of publication,
    conferences, and public appearances, all aimed at impressing lay audiences
    and political people, the Wedge is working its way into the
    American cultural mainstream. Editorials and opinion pieces in national
    journals, prime-time television interviews, and other high-profile public
    appearances, offhand but highly visible negative judgments on evolution
    or “Darwinism” from conservative politicians and sympathetic public intellectuals
    (assisted in their anti-science by a scattering of “feminist epistemologists,”
    postmodernists, and Marxists); all these contribute to a rising
    receptiveness to ID claims by those who do not know, or who simply
    refuse to consider, the actual state of the relevant sciences. In documenting
    and analyzing the political and religious nature of the Wedge, and
    bringing together expert comment on the ID “science” claims, we show
    that such grateful reception of the glad tidings of intelligent design is
    entirely unjustified by the scientific, mathematical, or philosophical
    weight of any evidence offered.

    THE WEDGE’S HAMMERS

    Under cover of advanced degrees, including a few in science, obtained in
    some of the major universities, the Wedge’s workers have been carving
    out a habitable and expanding niche within higher education, cultivating
    cells of followers, students as well as (primarily nonbiology) faculty, on
    campus after campus. This is the first real success of creationism in the
    formerly hostile grove of academe. Furthermore, the Wedge’s political alliances
    reach into a large, partisan elite among the nation’s legislators and
    other political leaders. Armed thus with a potentially huge base of popular
    support that includes most of the Religious Right, wielding a new
    legal strategy with which it hopes to win in the litigation certain to follow
    insertion of ID into public school science anywhere, and lawyers
    ready to go to work when it does, the Wedge of ID creationism is, indeed,
    intelligently designed. To be sure, its science component is not. But
    in a public relations-driven and mass-communications world, that is not
    a disadvantage. In the West, opinions, perceptions, loyalties, and, ultimately,
    votes are what matter when the goal is to change public policy or, for
    that matter, cultural patterns. Serious inquiry and questions of
    truth are often a mere diversion.

    This newly energized, intellectually reactionary enterprise will not
    fade quietly away as the current team of ID promoters ages. It is already
    too well organized and funded, and the leading Wedge figures have invested
    too much of themselves for that to happen. Moreover, there is
    every reason to think that religiously conservative, anti-science agitation
    will increase, especially as the life sciences and medical research continue
    to probe the fundamentals of human behavior. As that happens, the general
    public uneasiness with evolutionary biology and the underlying genetics
    and cell biology becomes simple hostility, not just on the political
    right. Some of the far-left intelligentsia help to fuel the hostility, at least
    in academia. Therefore, we have undertaken to document very thoroughly,
    largely but not exclusively by means of the Wedge’s own announcements
    and productions, its steadily increasing output of anti-evolution
    and more broadly anti-science materials.

    The Discovery Institute’s creationists are younger and better educated
    than most of the traditional “young-earth” creationists. Their public
    relations tricks are up to date and skillful; they know how to manipulate
    the media. They are very well funded, and their commitment is fired by
    the same sincere religious fervor that characterized earlier and less affluent
    versions of creationism. This combination makes them crusaders, just
    as inspired as, but much more effective than, the old literalists, whose
    pseudo-science was easily recognized as ludicrous. And the Wedge carries
    out its program as a part of the evangelical Christian community, which
    William Dembski credits with “for now providing the safest haven for intelligent
    design.”4 The welcoming voices within this community have all
    but drowned out those of its many members who are honest in their approach
    to science, sincere in their Christian faith, and appreciative of the
    protection afforded to both by secular, constitutional democracy. Dembski
    admits that the Wedge’s acceptance among evangelicals is not “particularly
    safe by any absolute standard.”5 Yet in our survey of this issue,
    we see that the evangelical voices most prominently heard, with a few
    notable exceptions, support the Wedge.

    FOCUS ON EDUCATION

    Unfortunately, ID, by now quite familiar among scientifically qualified
    and religiously neutral observers as the recycled, old-fashioned creationism
    it is, drapes its religious skeleton in the fancy-dress language of modern
    science, albeit without having contributed to science, at least so far,
    any data or any testable theoretical notions. Therefore, ID creationism is
    most unlikely in the short term to change genuine science as practiced in
    industry, universities, and independent research laboratories. But the
    Wedge’s public relations blitz (intended to revolutionize public opinion);
    its legal strategizing (intended as groundwork for major court cases yet to
    come); and its feverish political alliance-building (through which the
    Discovery Institute hopes to shape public policy) all constitute a threat
    to the integrity of education and in the end to the ability of the public to
    judge scientific and technological claims. This last threat is not just a secondary,
    long-term worry. Competent, honest scientific thinking is critically
    important now, not only to the intellectual maturation of our
    species, especially of its children, but also to optimal management of
    such current, urgent policy problems as environmental preservation and
    improvement, energy resources, management and support of scientific
    research, financing medicine and public health (including human heredity
    and reproduction), and, in general, the support and use of advanced
    technology.

    Led by Phillip Johnson, William Dembski, Michael Behe, and
    Jonathan Wells, the four current top names of the Discovery Institute’s
    Center for Science and Culture, with a growing group of like-minded
    fellows and co-workers, this movement seeks nothing less than to overthrow
    the system of rules and procedures of modern science and those
    intellectual footings of our culture laid down in the Enlightenment and
    developed over some 300 years. If this sounds overwrought, we ask our readers to
    proceed at least a little way into the following chapters to judge for
    themselves. In any case, the Wedge admits that this is its aim. By its own
    boastful reports, the Wedge has undertaken to discredit the naturalistic
    methodology that has been the working principle of all effective science
    since the seventeenth century. It desires to substitute for it a particular
    version of “theistic science,” whose chief argument is that nothing about
    nature is to be understood or taught without reference to supernatural or
    at least unknowable causes (in effect, to God). The evidence that this is a
    fundamental goal follows within the pages of this book. No matter that
    these creationists have produced not even a research program, despite
    their endlessly repeated scientific claims. Pretensions to the contrary
    notwithstanding, this strategy is not really aimed at science and scientists, whom they consider
    lost in grievous error and whom they regularly accuse of fraud (as we will
    demonstrate), of conspiring to hide from a gulled public the failures of
    modern science, especially of “Darwinism.” It is aimed, rather, at a vast,
    mostly science-innocent populace and at the public officials and lawmakers
    who depend on it for votes.

    A Neo-creationist’s Progress

    In April 2001, ID movement founder Phillip Johnson released on the creationist
    Access Research Network website “The Wedge: A Progress Report.”6
    There he reviewed the Wedge’s goals: “to legitimate the topic of
    intelligent design . . . within the mainstream intellectual community”
    and “to make naturalism the central focus of discussion [meaning “of attack”]
    in the religious world.” He cited the establishment of a “beachhead”
    in American journalism, exemplified by articles in major newspapers.
    He declared that “the Wedge is lodged securely in the crack”
    between empirical science and naturalistic philosophy, which he calls
    “the dominant naturalistic system of thought control.” According to
    Johnson, “the [Wedge] train is already moving along the logical track and
    it will not stop until it reaches its destination. . . . The initial goals of
    the Wedge strategy have been accomplished. . . . [I]t’s not the beginning
    of the end, but it is the end of the beginning.”7

    There is some justification for this aggressive show of confidence. As
    Johnson says, ID has won significant coverage in major U.S. newspapers
    and, more recently, abroad as well. In the New York Times, James Glanz
    wrote that “evolutionists find themselves arrayed not against traditional
    creationism, with its roots in biblical literalism, but against a more sophisticated
    idea: the intelligent design theory.” On the front page of the
    Los Angeles Times, Teresa Watanabe wrote that “a new breed of mostly
    Christian scholars redefines the old evolution-versus-creationism debate
    and fashions a movement with more intellectual firepower, mainstream
    appeal, and academic respectability.”8 And Robert Wright (author of The
    Moral Animal: Evolutionary Psychology and Everyday Life, Vintage Books,
    1994) points out in a critical Slate article that while ID presents no new
    ideas of any significance, the New York Times article “has granted official
    significance to the latest form of opposition to Darwinism.” Wright concludes
    that although ID is just a new label, a marketing device for an old
    product, it is also an effective one.9

    The admirable, but in this particular case misguided, concern of most
    Americans to be fair, “even-handed,” to consider both sides of a dispute
    respectfully, especially the side claiming to suffer discrimination, creates
    a fertile field for ID activists. They have enough financial backing and
    self-righteous zeal to outlast what little effectively organized opposition
    to them presently exists, especially in the higher education community,
    which one would quite reasonably expect to be in the forefront of opposition
    to the Wedge. There is, of course, the further, and very real,
    possibility that the demographics of the judiciary will shift toward creationism
    should there be appointments of judges with strong doctrinal or
    emotional ties to the Religious Right, where one’s views on evolution are
    once again, as they were in the 1920s, a “litmus test.” There is no doubt
    that the Wedge’s immediate goal is to change what is taught in classrooms
    about the basics of biology and the history of life, as we show here
    from its own documents, sources of support, and productions. But based
    on our demonstration in chapter 9 of the religious foundation of the intelligent
    design movement and the importance of this foundation to the
    Wedge’s goal of “renewing” American culture, we also believe that its ultimate
    goal is to create a theocratic state, which would provide a protective
    framework for its pedagogical goals. In an important respect, the
    Wedge is another strand in the well-organized Religious Right network,
    whose own well-documented but poorly understood purposes are
    strongly antagonistic to the constitutional barriers between church and
    state.

    As of March 2001, creationists had launched programs to change
    public school curricula in one out of five states across the nation. During
    the writing of this book, creationists were causing significant problems in
    Ohio, Washington, Idaho, Montana, Kansas, Missouri, Alabama, Georgia,
    Michigan, Pennsylvania, and Wisconsin.10 At present, there are renewed
    rumblings in New Mexico, where a hard-fought battle was presumably
    resolved. These programs have not yet attained their broadest goals, but
    they continue to divert precious educational resources, time, and energy
    from the real problems of public education in the United States toward
    the work of responding to creationist attacks. Even in the small, rural
    state of Louisiana, ID advocates seem to be waiting in the wings to initiate
    a sequel to recent attempts by Representative Sharon Weston-
    Broome to declare the idea of evolution “racist.”11 In Kansas, where creationist
    changes to the state’s science standards have finally been
    reversed, the Discovery Institute is nevertheless actively assisting a satellite
    group, the Intelligent Design Network (IDnet), in pushing ID more
    aggressively than ever. In June 2001, IDnet held its Second Annual Symposium,
    “Darwin, Design, and Democracy II: Teaching the Evidence in
    Science Education,” featuring three key Wedge campaigners: Phillip
    Johnson, William Dembski, and Jonathan Wells.12 The great public universities
    are now a main target of Wedge efforts: a Discovery Institute fellow,
    Jed Macosko, taught ID in a for-credit course at the University of
    California-Berkeley; his father, Chris Macosko, has been doing the same
    at the University of Minnesota.13

    Concern about the Wedge is building, very late but finally, in scientific
    and academic quarters. The American Geophysical Union considered
    ID a problem serious enough to require scheduling at least six presentations
    on it at the spring 2001 conference.14 Philosopher Robert
    Pennock’s eye-opening book, Tower of Babel: The Evidence Against the
    New Creationism (MIT, 1999), analyzed and recounted the philosophical
    and scientific flaws of ID creationism. It is followed by his anthology,
    Intelligent Design Creationism and Its Critics: Philosophical, Theological,
    and Scientific Perspectives (MIT, 2001). These books seem to be making a contribution
    in awakening academics to the need for an effective counterstrategy.
    Similar books are on the way; and in book reviews and a spate of
    recent writings, distinguished scientists are at last taking the trouble (and
    it is troublesome, and time-consuming, and costly!) to rebut, point by
    point, the new creationist claims. Of course, those claims are not really
    new. They are rather pretentious variants of the ancient, and discredited,
    argument from design (aptly renamed for our era, by Richard Dawkins,
    the argument from personal incredulity).

    So far, however, no book has documented the genesis, the support,
    the real goals, and the remarkable sheer volume of Wedge activities. We
    have come to believe that such a chronicle is needed if people of good
    will toward science and toward honest inquiry are to understand the
    magnitude of this threat, not only to education but also to the principle of
    separation of church and state. The chapters that follow are our effort
    to supply the facts: as complete an account, within the limits of a single
    volume and the reader’s patience, as can be assembled, and checked
    independently, from easily accessible public sources. To convince those
    with the indispensable basic knowledge who are in a position to act, that
    they must do so, we must first make the case that (1) a formal intelligent
    design strategy, apart from and above the familiar creationist carping
    about evolutionary and historical science, does exist, and (2) it is being
    executed successfully in all respects except the production of hard scientific
    results, that is, data. To accomplish these aims, we have had to accumulate
    the evidence, which consists of the massive schedule of the Wedge’s own
    activities in execution of the strategy, together with the actual pronouncements
    of Wedge members. We have allowed them to speak for
    themselves here at length and as often as possible.

    The Wedge’s busy schedule of ID activities and its increasing public
    visibility have been accompanied by a steadily evolving public relations
    effort to present itself as a mainstream organization. In August 2002, the
    CRSC changed its name, now calling itself simply the “Center for Science
    and Culture.” This move parallels the Wedge’s low-key phase-out of the
    overtly religious banners on its early web pages: from Michelangelo’s
    God creating Adam, to Michelangelo’s God creating DNA, to the current
    Hubble telescope photo of the MyCn18 Hourglass Nebula.15 But despite
    the attempt to alter its public face, the Wedge’s substantive identity
    remains. Thus, we refer henceforth to the Center for Science and Culture
    by the name under which it has been known during the period covered
    in this book: the Center for the Renewal of Science and Culture (CRSC).

    The readers’ patience may well be tried at times by the repetitiousness
    of Wedge activities: conferences, websites, trade book and media
    publications and appearances, testimony before legislative bodies and
    education committees, summonses to religious and cultural renewal
    predicated on anti-science. The Wedge’s efficient and planned repetitiousness
    is itself one of our main points. In fact, it is one of the most remarkable
    examples in our time of naked public relations management substituting
    successfully for knowledge and the facts of the case, substituting for
    the truth. For that reason alone, it is both interesting and important. It
    must be known and understood if there is to be recognition, among scientists
    as well as the literate nonscientist public, of current anti-evolutionism
    and its aims.

    The Issue

    The issue, then, is not, as ID creationists insist to their increasingly
    large and credulous audiences nationwide, that the biological sciences
    are in deep trouble due to a collapse of Darwinism. The issue is that the
    public relations work, but not the “science,” of the Wedge and of ID
    “theorists” is proving all too effective. It is not refutations or technical dismissals
    of ID scientific claims that are needed. The literature of science
    and the book review pages of excellent journals are already replete with
    those: expert reviews of ID books and other public products are readily
    available to anyone. We provide here what we hope is an adequate sampling
    of those technical dismissals and expert scientific opinions, and we
    document the sound science and the ID anti-science as needed. But in
    the past few years, very detailed disproof has been provided, again and
    again, by the commentators best qualified to speak to the substance:
    some of the world’s most honored evolutionary and physical scientists, as
    well as some of the most distinguished philosophers of mind and science.
    Rather, what is needed now is documentation of the Wedge itself, from its
    own internal and public relations documents, so that the public may understand
    its purposes and the magnitude of its impact, current and projected.
    The issue is not Darwinism or science: the issue is the Wedge
    itself.

    Providing the necessary documentation, including the minutiae that
    can turn out to be important, is always a writer’s strategic problem when
    the intended audience is broader than a small group of specialists. Even
    scholars who demand and are accustomed to copious documentation can
    find it off-putting. Others, members of the most important audience of
    all (curious, able, and genuinely fair-minded general readers), who rarely
    if ever read with constant eye and hand movement between text and references,
    are strongly tempted to give up when confronted with profuse
    supporting data and the necessary but distracting scholarly apparatus of
    notes and references. We do not have a good solution to this problem. The
    endnotes can be taken, however, as running commentary, supplementary
    to, but not essential for, the main text. Our references to literature include,
    whenever possible and therefore in abundance, pointers to sites on
    the World Wide Web.

    No reader needs to use the notes to apprehend the argument and to
    judge its broad justifications, or lack of them. The main text can usefully
    and properly be read for itself alone. But for those readers who decide
    that this argument is to be taken seriously, and who feel the need to arm
    themselves with facts, the facts are here, or a pointer to them is, immediately
    serviceable for anyone with access to a computer and an Internet
    connection. Initially, we envisioned a much shorter response than this
    book to the Wedge’s campaign. We have delayed work on other projects
    to write it, even though we would have preferred not to have found it
    necessary. The more we examined the situation, the more expansive and
    invasive the Wedge’s program proved to be, and the greater, therefore,
    was the need we saw for full public examination and for a proper response
    to it. We have watched and waited for the coalescence of an
    appropriately organized counter-movement, and, indeed, a few small
    organizations and individual members of the scientific and academic
    communities, as well as concerned citizens, have recently mounted admirable
    efforts, with only a minute fraction of the resources available to
    the Wedge. But those active people are few, and they need the help of
    everyone who has a stake in the high quality of our civic, scientific, and
    educational cultures.

    Notes

    1. Donald H. Naftulin, John E. Ware, Jr., and Frank A. Donnelly, “The Doctor Fox Lecture: A Paradigm
    of Educational Seduction,” Journal of Medical Education 48 (July 1973), 630-635.

    2. Naftulin et al., 631.

    3. Naftulin et al., 633.

    4. William A. Dembski, “Intelligent Design Coming Clean.” Posted on Metaviews in November 2000.
    Accessed on May 4, 2002, at this site.

    5. Dembski, “Intelligent Design Coming Clean.”

    6. Phillip E. Johnson, “The Wedge: A Progress Report,” Access Research Network. Accessed on April 21,
    2001, at this site.

    7. See the archive of “Phillip Johnson’s Weekly Wedge Update”.

    8. See James Glanz, “Evolutionists Battle New Theory on Creation,” New York Times, April 8, 2001.
    Accessed on April 22, 2001, at this page. See also Teresa
    Watanabe, “Enlisting Science to Find the Fingerprints of a Creator: Believers in ‘Intelligent Design’ Try to Redirect
    Evolution Disputes Along Intellectual Lines,” Los Angeles Times, March 25, 2001. Accessed on April 22, 2001, at
    http://www.discovery.org/news/EnlistingScience.html.

    9. Robert Wright, “The ‘New’ Creationism,” Slate, April 16, 2001. Accessed on April 22, 2001, at
    http://slate.msn.com/Earthling/01-04-16/Earthling.asp.

    10. “2001 Church/State Legislation,” Americans United for the Separation of Church and State, April 18,
    2001.

    11. Will Sentell, “Baton Rouge Legislator Calls Theory Racist,” The Advocate, April 18, 2001. Accessed
    on April 26, 2001, at http://www.theadvocate.com/news/story.asp?StoryID=20792. At Weston-Broome’s April 17,
    2001, public meeting, when a questioner asked her what alternatives to teaching evolution she would consider, she
    mentioned “the design intelligence [sic] theory.”

    12. Intelligent Design Network, “Second Annual IDNet Symposium.” Accessed on April 26, 2001, at
    this site.

    13. See references to the two Macoskos’ teaching activities in Newsletter of the American Scientific
    Affiliation and Canadian Scientific and Christian Affiliation
    , January/February 2001. Accessed on April 23, 2003, at
    this site.

    14. “2001 Spring Meeting,” American Geophysical Union. Accessed on April 26, 2001, at
    this site.

    15. National Center for Science Education, “Evolving Banners at the Discovery Institute.” Accessed on
    August 29, 2002, at this site.

    Barbara Forrest is Associate Professor of Philosophy, Department of History and Government, Southeastern Louisiana University. Paul R. Gross is University Professor of Life Sciences, University of Virginia (Emeritus).

    This Introduction to Creationism’s Trojan Horse: The Wedge of Intelligent Design, Oxford University Press, is republished by permission. Creationism’s Trojan Horse can be ordered at the OUP website.

    There is a website about Creationism’s Trojan Horse here, with reviews and other material.

  • The Derrida Industry…

    …has been working overtime to salvage the reputation of their man. Things are so bad that Joan Scott (who I’m told is a substantial historian, but apparently not much of a philosopher) actually wrote the following to The New York Times:

    [Your obituary writer] is embarrassingly illiterate in the history of philosophy. His obituary is also terribly one sided. I thought the Times was committed to balance. Where are the appreciative quotes from American philosophers and literary critics? From those (and there are many) who have used his work to great effect and taught whole generations of students how to read [sic] differently [i.e., badly]?

    The obituary author may, indeed, be ignorant of the history of philosophy, but certainly no more so than Professor Scott, whose ignorance extends to the present: there are no “appreciative quotes” from “American philosophers” because American philosophers thought he was a fraud, a betrayal of philosophy’s grand history. Consider this letter from philosopher Bryan Frances (Philosophy, Leeds), who reports that he sent it to The New York Times; there is no doubt he speaks for most philosophers here:

    To the Editor:

    Re ‘Jacques Derrida, Abstruse Theorist, Dies at 74’ (obituary, Oct. 9):

    What if philosophy was baseball and Jacques Derrida a baseball player? Judging by your obituary of Derrida, the reader would get the impression that Derrida was a superstar, with a lifetime .330 batting average and over 500 home runs. In reality, he was a substitute second baseman, hitting about .255 over his career with no more than 100 homers or any other baseball accomplishments. He was a particularly flamboyant and outspoken baseball player, for certain, but one who failed to earn respect for his baseball skills.

    Contrary to your obituary, Derrida’s influence in philosophy is very slim indeed in the US, UK, and Australia. In literature and other areas he might have held some respect, but in virtually all the world’s English-speaking philosophy departments, in which you’ll find attempts to formulate relatively precise views accompanied by rigorous supporting arguments, he is viewed as more a charlatan than a philosopher.

    Tough language, but certainly well supported by the biggest coup for the Derrida Industry: the opinion piece by Mark Taylor (yet another non-philosopher) in The New York Times proclaiming Derrida one of the three great philosophers of the 20th century, along with Wittgenstein and Heidegger. Of course, even Wittgenstein and Heidegger are controversial choices, though in terms of sheer impact they are plainly in a wholly different league from Derrida, so much so that anyone knowledgeable about 20th-century European and Anglophone philosophy and intellectual culture must laugh out loud at Professor Taylor’s dishonest hyperbole. (Why do those in literary studies think the intellectual world revolves around their once proud discipline, now enfeebled by three decades of bad philosophy, bad history, and bad social science?)

    Far more interesting, though, is the quality of argumentation offered in support of Derrida’s importance. Let’s take a few representative paragraphs, to see what it is that accounts for Derrida’s importance according to this PR man for intellectual charlatancy:

    Taylor: “Most of his infamously demanding texts consist of careful interpretations of canonical writers in the Western philosophical, literary and artistic traditions – from Plato to Joyce. By reading familiar works against the grain, he disclosed concealed meanings that created new possibilities for imaginative expression.”

    Leiter: It is impossible in the abstract to assess this proposition, but surely it bears noting that a primary reason for skepticism about Derrida is that overwhelmingly those who engage in philosophical scholarship on figures like Plato and Nietzsche and Husserl find that Derrida misreads the texts, in careless and often intentionally flippant ways, inventing meanings, lifting passages out of context, misunderstanding philosophical arguments, and on and on. Derrida was the bad reader par excellence, who had the gall to conceal his scholarly recklessness within a theoretical framework. He was the figure who did more violence than any other to what Nietzsche had aptly called “the great, the incomparable art of reading well,” “of reading facts without falsifying them by interpretation, without losing caution, patience, delicacy, in the desire to understand” (The Antichrist, sections 59 and 52).

    Taylor: “When responsibly understood, the implications of deconstruction are quite different from the misleading clichés often used to describe a process of dismantling or taking things apart. The guiding insight of deconstruction is that every structure – be it literary, psychological, social, economic, political or religious – that organizes our experience is constituted and maintained through acts of exclusion. In the process of creating something, something else inevitably gets left out.”

    Leiter: This isn’t an insight, it’s a tautology. Necessarily, every X excludes not-X, else it would not be X. As even Professor Taylor notes: “something else inevitably [i.e., necessarily] gets left out.” (Whether the claim is more substantial as a hypothesis about the fundamental workings of language–as Saussure originally conceived it–is a different question, not implicated in Taylor’s formulation.)

    Taylor: “These exclusive structures can become repressive – and that repression comes with consequences. In a manner reminiscent of Freud, Mr. Derrida insists that what is repressed does not disappear but always returns to unsettle every construction, no matter how secure it seems.”

    Leiter: Whether the “excluded” elements “return” depends on the plausibility of Derridean readings of texts–and thus we are back at the first point, which can only be adjudicated by contrasting Derridean readings of texts with readings by other scholars. The resemblance to the Freudian conception of repression is superficial and misleading: Freud presents a scientific account of the psychic economy of the mind, according to which, necessarily, certain kinds of psychic energy and the ideas to which they were originally attached manifest themselves in human behavior long after their original occurrence. It is a straightforward empirical hypothesis, for which various kinds of empirical evidence have been offered, both for and against. There is nothing empirical about the Derridean claim, and no theoretical grounding for the claim that what has been “excluded” from a text will necessarily return.

    Taylor: “To his critics, Mr. Derrida appeared to be a pernicious nihilist who threatened the very foundation of Western society and culture. By insisting that truth and absolute value cannot be known with certainty, his detractors argue, he undercut the very possibility of moral judgment. To follow Mr. Derrida, they maintain, is to start down the slippery slope of skepticism and relativism that inevitably leaves us powerless to act responsibly.”

    Leiter: Perhaps this is what has been at issue among anti-intellectual right-wingers, who generally rival Derrida for dialectical feebleness and scholarly shoddiness. But this has never been the philosophical issue about Derrida–after all, skepticism about the existence of truth and/or absolute value, and our knowledge of either, has been a staple of Western philosophy in one form or another, from the Sophists to Hume to Michael Dummett. The problem with Derrida is that, unlike these other important philosophers, he has no arguments that are both good and original; his case for skepticism is the stuff of bad sophomore-year philosophy papers.

    Professor Taylor ends with an homage to Derrida’s personal kindness and consideration–something I’ve had confirmed by others who knew him. There seems no doubt that unlike, say, Heidegger, who was a personal and moral monster, Derrida really was a decent human being in his interpersonal dealings. But that, I’m afraid, is not what is at issue here. If he had become a football player as he had apparently hoped, or taken up honest work of some other kind, then we might simply remember him as a “good man.” But he devoted his professional life to obfuscation and increasing the amount of ignorance in the world: by “teaching” legions of earnest individuals how to read badly and think carelessly. He may have been a morally decent man, but he led a bad life, and his legacy is one of shame for the humanities.

    Was it entirely an accident that at the same time that deconstruction became the rage in literary studies (namely, the 1980s), American politics went off the rails with the Great Prevaricator, Ronald Reagan? Is it simply coincidental that the total corruption of public discourse and language–which we may only hope has reached its peak at the present moment–coincided with the collapse of careful reading and the responsible use of language in one of the central humanities disciplines? These are important questions, and I wonder whether they have been, or will be, addressed.

    UPDATE: A student at Yale Law School writes: “They are most certainly important questions, and one book that deals with them is a book by a Yale professor of English, David Bromwich; it’s entitled Politics by Other Means. It gives a good and thoughtful lashing both to academic identity politics and to the Reagan administration’s corruption of the public sphere. I commend it to your attention.”

    [Ed.: See this In the Library for another recommendation of David Bromwich’s Politics by Other Means.]

    This article first appeared on The Leiter Report October 31 and is republished here by permission. Brian Leiter is Joseph D. Jamail Centennial Chair in Law, Professor of Philosophy, and Director of the Law & Philosophy Program
    at the University of Texas at Austin. The Leiter Report is here.

  • 1621: A Historian Looks Anew at Thanksgiving

    “A Thanksgiving for plenty. O Most merciful Father, which of thy gracious goodness hast heard the devout prayers of thy church, and turned our dearth and scarcity into cheapnesse and plenty: we giue thee humble thankes for this thy special bounty, beseeching thee to continue this thy louing kindnes unto vs, that our land may yeild vs her fruits of increase, to thy glory and our comfort, through Iesus Christ our Lord, Amen.”

    This prayer of Thanksgiving was not used by the Pilgrims in 1621, but with these words we must begin, if we want to assess the claims that, “The 1621 gathering in Plymouth was not a religious gathering but most likely a harvest celebration much like those the English had known in farming communities back home,” [1] or that the Pilgrims’ rejoicing together in 1621 was a harvest home best described as a “secular event.” [2] The Pilgrims did not use that specific prayer of thanksgiving for a plenteous harvest for the reason that its words are found among those “stinted prayers” prescribed in the Church of England’s Book of Common Prayer and thereby required by state authority to be used by all Englishmen. Although the Pilgrims preferred extemporaneous prayer, these words from the Book of Common Prayer are exactly what “the English had known in farming communities back home,” repeating them year after year in celebrations where, by the combined authority of state and church, a harvest home simply was not a “secular event.”

    Edward Winslow, in Mourt’s Relation, has given us a brief description of the colonists’ first harvest celebration. Wheat and Indian corn had grown well; barley he described as “indifferently good” [3]; but pease were “not worth the gathering.”[4] Winslow continues: “Our harvest being gotten in, our Governor sent foure men on fowling; so that we might after a more speciall manner rejoyce together, after we had gathered the fruit of our labours. They foure in one day killed as much fowle as, with a little help besid, served the company almost a weeke. At which time amongst other Recreations, we exercised our Armes, many of the Indians coming amongst us, and amongst the rest their greatest King Massasoyt, with some nintie men, whom for three days we entertained and feasted. And they went out and killed five deere, which they brought to the plantation and bestowed on our Governour, and upon the Captaine and others. And although it be not alwayes so plentifull, as it was at this time, with us, yet by goodnesse of God, we are so farre from want, that we often wish you partakers of our plentie.”[5]

    Governor William Bradford, in Of Plymouth Plantation, reported that fishing had been good all summer, and, in the fall, “begane to come in store of foule, as winter approached […] And besides water foule, ther was great store of wild Turkies, of which they tooke many, besids venison, etc.”[6] One would suppose that Bradford’s text justifies the assumption that turkey was included when the four Pilgrim hunters returned with “much fowle.” James Deetz, an archaeologist and anthropologist who enjoyed making iconoclastic pronouncements about Pilgrim history, opined, however, that “As for turkeys, it is less than likely, though not impossible, that some may have been taken as well.” How does he arrive at “less than likely”? Deetz surmises that the plentiful presence of migrating waterfowl made shooting turkeys inefficient. (Forget Bradford!) Besides, writes Deetz, “Bradford distinguished between fowl/waterfowl and turkeys, and while turkeys are fowl, the fowl mentioned by Winslow were almost certainly ducks and geese, and, therefore, fall into Bradford’s fowl/waterfowl category.” Bradford’s text mentions fowl, then divides that general category into waterfowl, on the one hand, and land birds, on the other, specifically naming turkeys, “of which they took many.” Bradford does not place turkeys over against a “fowl/waterfowl category.” Ignoring the careful practice of categorization that characterizes much seventeenth-century thought (influenced by the philosopher Petrus Ramus) and is here in a simple form expressed by Bradford, Deetz stretches a quasi-analytical examination of Bradford’s use of vocabulary to reach a nonsensical conclusion, whose only purpose seems to be its denial that today’s Thanksgiving turkey dinner has even a remote origin in the festivities of 1621.

    “The most remarkable thing about Winslow’s brief account is that it makes no mention of giving thanks,” writes Deetz, who is clearly the inspiration for Grace and Bruchac’s version, that, “The English never once used the word ‘thanksgiving’ in association with their 1621 harvest celebration.” How are we to understand this omission? Does it mean there was no Thanksgiving?

    Despite the nearly total absence of any mention by the Pilgrims of witchcraft (a topic noticed explicitly only twice in all their colony’s court records – inconclusively and without any convictions), the Deetzes’ book devotes an entire chapter to the subject of witchcraft in Plymouth Colony. “There Be Witches Too Many” is the misleading title. Magical beliefs and superstition having been common in England and obviously present in other parts of New England, “It is not possible that the men, women, and children who settled in Plymouth Colony would have been free of such influences,” write the Deetzes; “[…] such beliefs would be taken for granted, part of a popular culture that did not need to be detailed.” Well and good — but how is it, then, that one must assume that the Pilgrims, whose history was called into existence by a shared religious conviction and vision, were “free of such influences” (religious influences) when it came to their harvest celebration?

    A more careful examination of Winslow’s vocabulary and of the specific cultural context in which he wrote will illuminate the implications of the words he did choose, indicating the assumptions of that culture “that did not need to be detailed.” Among many examples of the contextual meaning of Winslow’s words, his assertion that, “the Civill Magistrate is the Minister of God, a Revenger to execute wrath on him that doeth evil,” typifies the implications of his vocabulary. It also reveals Winslow’s expectations of his audience. He did not need to state in so many words that he was referring to Romans 13:4. St. Paul was commenting on “the powers that be [and that] are ordeined of God,” when he wrote in that verse that, “he is the minister of God for thy wealth: but if thou do evil, feare: for he beareth not the sworde for nought: for he is the minister of God to take vengeance on him that doeth evil.” The Pilgrims and others in the Puritan and Separatist tradition used the Geneva Bible translation of 1560, where a marginal note explains that, in the Greek text, the verse reads “a revenger with wrath.” Winslow obviously knew that, and he could presume that his readers knew it, too.

    When Winslow described the Pilgrims’ intention, “after a more speciall manner [to] rejoice together, after we had gathered the fruit of our labours,” he was alluding to John 4: 36 and to Psalm 33. The first is, “And he that reapeth, receiueth wages, & gathereth frute vnto life eternal, that bothe he that soweth, & he yt [that] reapeth, might reioyce together.”

    Psalm 33, verses 1-5 and 18-22:

    Reioyce in the Lord, ô ye righteous: for it becometh vpright men to be thankeful.
    Praise ye [the] Lord with harpe: sing vnto him with viole & instrument of ten strings
    Sing vnto him a new song: sing cheerfully with a loud voyce.
    For the worde of the Lord is righteous and his workes are faithful.
    He loueth righteousness & iudgement: the earth is ful of the goodness of ye Lord.
    […]
    Beholde, the eye of the Lord is vpon them that feare him, & vpon them, that trust in his mercie,
    To deliver their soules from death, and to preserue them in famine.
    Our soule waiteth for the Lord: for he is our helpe and our shield.
    Surely our heart shal reioyce in him, because we trusted in his holie Name.
    Let thy mercie, ô Lord, be upon vs, as we trust in thee.

    The upright, whose souls will be delivered from death and who are preserved from present famine, are enjoined by the Psalmist to be thankful. The marginal interpretation in the Geneva Bible, however, explains that the specific form described is no longer literally required: “to sing on instruments was a parte of the ceremonial service of the Temple, which doeth no more apperteine.” What was appropriate, now? The established, traditional forms of Anglican liturgy, or of recurrent Catholic festivals – obviously not! Puritans had not yet become dominant in England and had not yet reformed England’s calendar of medieval superstition, so Calvinist days of thanksgiving or penitence had not yet taken shape there. Radical Protestants like the Pilgrims looked for Christian, biblical precedents. Games and feasting were biblical. Everyone was familiar with the rhymed version of Psalm 33, by Thomas Sternhold and John Hopkins. “Our soule in God hath ioy and game / reioycing in his might: For why? In his most holy Name / we hope and much delight.” “A day of feasting and ioye” was the biblical precedent provided by the celebration of Purim established in Esther 9: 18-22. {The Old Testament Feast of Tabernacles (Deut. 16: 13-14) was a harvest festival lasting “seuen daies, when thou hast gathered in thy corne, and thy wine. And thou shalt reioyce in thy feast, thou, and thy sonne, and thy daughter, and thy servant, and thy maid, and the Levite and the stranger, and the fatherles, and the widow, that are within thy gates.” The biblical injunction to include the “stranger” may have led to the Pilgrims’ inviting their Native neighbors to rejoice with them.}*

    Their exile in Leiden, Holland, had provided the Pilgrims with an even more explicit pattern for how a Reformed people could express its thanks to God. “Every year throughout the city a General Day of Prayer and Thanksgiving [was] held and celebrated on the Third of October, to thank and praise God Almighty that he so mercifully had saved the city from her enemies,” wrote William Brewster’s friend, Leiden’s mayor, publisher, and historian, Jan Orlers, describing the celebration of the lifting of Leiden’s siege in 1574. In Leiden, bread and fish brought in to revive the city’s starving survivors (half the people had died) gave a parallel with the New Testament story of the feeding of the five thousand (Matt. 14: 13-21; Mk. 6: 34-44; Jn. 6: 5-13). The celebration included feasting preceded by prayers of thanksgiving. Festivities lasted several days, with games, militia reviews, and general jollity, besides a free market fair. Leiden’s Thanksgiving on October 3 is not the only source (all agrarian communities have harvest thanksgivings), but it is one of the important sources for understanding how the Pilgrims chose to give form to their thankful rejoicing together in a more special manner. They thanked God for their preservation during their first year in Plymouth, where, as in Leiden’s siege, half the community had died, leaving the survivors to hope for and depend on divine protection and providence.

    Returning to the historical sources for a contextual understanding of Winslow’s words brings no shocking revelations. The Pilgrim leaders undeniably conceived of their lives in religious ways. A thankless or secular harvest festival was unthinkable.

    The interpretive obtuseness indicated by the recent revised version of the 1621 Thanksgiving is not, however, isolated. “The colonists thought they had a right to help themselves to whatever they pleased,” we are told. Never mind that Winslow details the efforts, ultimately successful after several months, to locate and pay Native owners of corn removed from storage baskets – to provide compensation for what must have looked like theft. He repeatedly expressed the Pilgrims’ desire to make it clear that those particular colonists would neither practice nor condone theft from the Indians. Grace and Bruchac also proclaim to their audience of school children that the Pilgrims robbed Indian graves, despite Winslow’s explicit statement to the contrary. Obviously one must assume that the Pilgrims were self-serving liars. Winslow writes that coming on “a bow with rotted arrows” in a mound Pilgrim explorers were investigating, “we supposed there were many other things, but because we deemed them graves, we put in the bow again and made it up as it was, and left the rest untouched, because we thought it would be odious unto them to ransack their sepulchres.” The Pilgrims later removed some objects from the grave of a European sailor but avoided disturbing what they recognized as Indian graves. Contrasting with the grossly insensitive, lying, thieving Pilgrims, the Indians are presented as neo-Romantic idealists who “considered themselves caretakers of this land […] owned by none, but held and used with respect by all.” (In point of fact, Native land tenure in the 17th century was personal and hereditary; see my book Indian Deeds, Land Transactions in Plymouth Colony, 1620-1691 (Boston: N.E.H.G.S., 2002). Communal use appears in the 19th century as a response then to new circumstances.) Sentimental photographs of high quality continue the maudlin iconography of Indians as last representatives of a fine and more noble pristine past, oppressed by crude invaders.

    What is most remarkable is that the National Geographic could get it so wrong! (A few years ago The National Geographic Magazine repeated the fantasy that the so-called Mayflower Barn in England was built with timbers from the ship, even though that myth was expertly demolished more than eighty years ago.) Grace and Bruchac do not pretend to be professional historians and their reliance on the anonymous contribution represented by the phrase “with Plimoth Plantation” did not save them from repeating stereotypical myths that arose in the 19th century as a response to the dominant and equally unrealistic glorification of the Pilgrims as the embodiment of all virtue.

    Our knowledge of the 1621 Thanksgiving comes from Winslow and Bradford. Winslow’s choice of words, understood by his contemporaries, implies to us that the Pilgrims gave thanks to God for their preservation and for the plenty that gave hope for the future. Winslow specifically tells us that the colonists sat down with their Native neighbors and enjoyed several days of peaceful rejoicing together. It is a history with potent symbolism, and it needs neither apology nor distortion.

    *The sentence in brackets { } in paragraph 9, referring to the Feast of Tabernacles, does not appear in the Mayflower Quarterly article, having been lost through computer problems.

    [1] Quotation from Catherine O’Neill Grace and Margaret M. Bruchac, with Plimoth Plantation, photographs by Sisse Brimberg and Cotton Coulson, 1621: A New Look at Thanksgiving (Washington, D.C.: National Geographic Society, 2001), p. 39.

    [2] Quotation from James Deetz and Patricia Scott Deetz, The Times of Their Lives: Life, Love, and Death in Plymouth Colony (New York: W. H. Freeman, 2000), p. 9.
    [notes 3-6 as provided by the editors of Mayflower Quarterly]
    [3] Mourt’s Relation, published in cooperation with Plimoth Plantation by Applewood Books, Bedford MA, Edited by Dwight B. Heath from the original text of 1622 and copyright 1963 by Dwight B. Heath, p. 82. ISBN: 0-918222-84-2.
    [4] ibid.
    [5] ibid.
    [6] Of Plymouth Plantation 1620-1647 by William Bradford. A new edition by Samuel Eliot Morison; First published Sept. 19, 1952; 21st printing Jan. 2001, p. 90.

    Jeremy Bangs (Ph.D. Leiden, 1976) writes about Dutch cultural history, the Pilgrims, and Plymouth Colony. Among his books are: Church Art and Architecture in the Low Countries before 1566 (1997), The Seventeenth-Century Town Records of Scituate, Massachusetts (3 vols., 1997, 1999, 2001), Indian Deeds, Land Transactions in Plymouth Colony, 1620-1691 (2002), and Pilgrim Edward Winslow, New England’s First International Diplomat (2004). He is the author of articles about the Dutch “Remonstrants” and the “Pilgrim Fathers” in the Routledge Encyclopedia of Protestantism (Hans Hillerbrand, ed.). Bangs’ many publications on Pilgrim topics began with articles in The Mayflower Quarterly.

    from The Mayflower Quarterly, 70, nr. 3 (September, 2004), pp. 225-230