Notes and Comment Blog


Maybe a Lottery Would be Better?

Mar 23rd, 2003 10:08 pm

Richard Dawkins likes to outrage people. He’s not the only person in the world who likes to do that; in fact it’s just barely possible that there are one or two people connected with Butterflies and Wheels who don’t mind doing a bit of irritating themselves. However that may be, Dawkins has done it again.

Evil is not an entity, not a spirit, not a force to be opposed and subdued. Evil is a miscellaneous collection of nasty things that nasty people do. There are nasty people in every country, stupid people, insane people, people who should never be allowed to get anywhere near power. Just killing nasty people doesn’t help: they will be replaced. We must try to tailor our institutions, our constitutions, our electoral systems, so as to minimise the chance that such people will rise to the top…And we democracies might look to our own vaunted institutions. Are they well designed to ensure that we don’t make disastrous mistakes when we choose our own leaders? Isn’t it, indeed, just such a mistake that has led us to this terrible pass?

Leaving aside what he says about the war in Iraq (because my colleague doesn’t agree with him on this point, while I think a thousand and one contradictory things), I do think he’s absolutely right about the US method of electing a president. We do keep electing shockingly, embarrassingly unqualified people. I’ve often thought we ought to consider the UK system, where the parties choose the candidates and the voters choose between them. Here we choose the candidates ourselves, and boy do we do a crap job of it. But then again maybe it wouldn’t help. Over there it seems to be accepted that a candidate with brains and skill and competence will be more electable than a bumbling, inarticulate, folksy mediocrity with ‘family values’, but here that is not the case. In the last election I heard with my own ears people rejoice at the fact that George Bush II was an ordinary guy just like the rest of us. That is not, as Richard Dawkins points out, the way a CEO is chosen, so why this job? Who knows.

It’s not new, though. We started off this way. There was much irrelevant nonsense in the very first contested elections, Jefferson being slagged off as a Frenchified intellectual who had children by a slave concubine (which turned out to be true, confoundingly enough). Richard Hofstadter tells of the anti-intellectualism of the Jackson-Adams elections. We elected one military ‘hero’ after another, most of them with no civilian talent at all.

And yet this, as our folksy head of state keeps reminding the world, is the world’s only superpower. So the single most powerful human on the planet is chosen by a process that mingles elements of a high school popularity contest, an ad campaign for the newest most macho SUV, and good old-fashioned backroom bribery. It is a bit of a mismatch.



Solidarity and Group Think

Mar 21st, 2003 10:51 pm

This review by Alan Wolfe is an odd mix of insight and blindness, shrewdness and obtuseness.

Wolfe makes some good points about the inherent difficulties of trying to make a progressive politics out of consumer movements, and about the value of thinking big when writing about history.

For the past two or three decades, historians have been studiously thinking small…As important as social history has been, however, it has also been mind-numbingly narrow in its evocation of detail and in its reluctance to consider the larger meanings of its findings. But Cohen thinks big…One hopes that her book will stimulate her colleagues to take similar risks, even the risk of emulating historians of previous generations whose efforts at intellectual synthesis and grand narrative are treated now with contempt by postmodern pygmies.

But there is also a passage where Wolfe draws a bizarre moral from the segmentation of U.S. consumer markets in the post World War II period.

In theory, consumption, whether we like it or not, ought to unify us, because we all become consumers of roughly similar goods. In reality, marketing specialists discovered in the postwar years that the best way to sell goods is to segment the audience that is buying them…Once again, consumption determined politics. We shopped alone before we bowled alone. Segmented into our zip codes, is it any wonder that our politics became so contentious and our unity around a common conception of the good so impossible?

What can he mean? U.S. politics didn’t ‘become’ contentious after WWII; it has always been contentious. The Depression, WWI, strikes and riots, Wobblies and miners and anarchists, the 1890s, the 1850s, not to mention a contentious little item known as the Civil War. And then again what can he mean in any case? What would a non-contentious politics look like? An ant farm? Clone Nation? There is much to be said for communitarianism, solidarity, and such, but it has to be said with caution. How exactly does one distinguish between solidarity and group think, conformity, organization people in grey flannel suits, outer-directed suburban robots, the pressure of majority opinion that so worried de Tocqueville and Mill? The answer is not self-evident, and not easy.

And then there is the last paragraph, the grotesque last three sentences.

It was not just perversity that led Ralph Nader, a hero of Lizabeth Cohen’s youth, to work so hard on behalf of the Republican Party. He must have realized on some level–and if he did not, then consumers certainly did–that if small cars are unsafe at any speed, one ought to buy SUVs instead. And for that ignoble end, conservative Republicans are the ones to have in office.

That is such an odd thing to say that it actually fooled me: I thought for a minute that Nader had in literal fact been a Republican in some earlier phase. But no, it was merely yet another assertion that It Is Forbidden to vote for a new party, a principle that would have left Lincoln with little outlet for his talents. And what a ridiculous non-argument he presents for it! The SUV! Which took over the universe precisely in the years Clinton and Gore were in office. What did they ever do to push Detroit to engineer better gas mileage, or to change the law so that SUVs would have to meet the same standards that non-bloated cars do? Nothing! Not one thing! They went on bleating about the sacred freedoms of the consumer, that’s what they did, but we should have voted for Gore anyway, because…the SUV situation under Bush is just exactly as bad as it would be under Gore. Huh?



Teach me how to think, please!

Mar 19th, 2003 5:58 pm

I have found something useful for philosophers to do!

Surprising news indeed, but take a look at this paragraph from Helen Salmon, student representative for the Stop the War Coalition.

This is not a war for the liberation of the people of Iraq. The US and Britain were happy to back Saddam’s tyrannical regime, his gassing of the Kurds and his war against Iran until he invaded Kuwait. Nor is this a war against weapons of mass destruction. No evidence of such weapons has been found in Iraq, and no war has been threatened against North Korea, despite its possession of nuclear weapons.

Never in the field of writing about human conflict have so many bad argumentative moves been made in so short a paragraph. Let’s count!

1. The fact that the US and Britain were willing to back Saddam has no necessary bearing on whether their combined action in Iraq is a war for the liberation of the people of Iraq (they may simply have realised the error of their ways, for example).

2. War against weapons of mass destruction. Oh dear. The problem here is that Ms Salmon’s logic compels her to the conclusion that whether there is such a thing depends on how good people are at hiding these weapons. Bad regimes, good at hiding – no war against weapons of mass destruction. Bad regimes, bad at hiding – the war’s on!

3. Terrible logic in the next bit about North Korea. Indeed, it’s Stangroomesque in its awfulness! No more to be said, really.*

So philosophers, the challenge for you, should you choose to accept it, is to teach this person to think. Scary, eh?!

*Yes, I know – that’s no kind of argument!



Philosophers – Shut Up Now!

Mar 17th, 2003 7:37 pm

What is it about philosophers that they can’t resist pontificating about things they know nothing about? The examples are legion. Mary Midgley and David Stove wittering on about Darwinism and selfish genes. Simon Blackburn and Mary Warnock making a mess even of amateur political commentary. And Roger Scruton demonstrating that there’s no start to what he knows about popular music.

And the latest example? Have a look at this from an article in Issue 22 of The Philosophers’ Magazine (a title which sounds vaguely familiar):

Subjects like sociology, psychology, religious studies and history, which adjoin philosophy, all require empirical support, which is interpreted within the lines of a largely unquestioned methodology. Philosophy is the only subject in which the basic assumptions of these other subjects could conceivably be questioned, so if you don’t fall into line with the assumptions predominant in these other subjects it’s no good running to them for refuge. You’ll probably find minds even more closed there than they are in philosophy itself.

So who wrote this? Maybe a (bad) GCSE student. Nope. Robert Ellis, a philosophy PhD.

Needless to say, it is absolute, utter tosh. Sociology, for example, is rife with theoretical and methodological debate. Even at high school level, students are required to understand that there are huge differences, for example, between the way in which positivists and phenomenologists do their sociology. Method is an explicit part of the A-Level examination. Textbooks have been put together and organised around arguments about what constitutes sociology proper.

And, of course, it’s the same in the other subjects (at least the ones that I know something about). So, for example, the history of psychology is at least in part dominated by an argument about the appropriateness of behaviourism as a strategy for finding out about behaviour and the mind.

So here’s my Message to Philosophers: Shut up!* You’re making fools of yourselves.

* You are permitted to talk quietly, amongst yourselves – though preferably not in public – about your own subject.



Yum, Gefilte Fish

Mar 16th, 2003 7:56 pm

Well, this is a fun item for the eve of war. Even, or do I mean especially, if it’s not really true that many Jews worldwide are hailing this nonsense as a modern miracle. Perhaps that’s just a bit of casual journalistic exaggeration, hmm? After all there are only two witnesses, and the fish is no longer talking, to say the least. Surely the smallness, the minusculity, of the number of witnesses ought to give the most credulous believer pause. Two. I ask you. At that rate couldn’t any one of us get any other one of us to join in a fun-loving prank and tell the world any old thing? ‘My garden gnome suddenly recited page 7 of the Nebraska State Highway Code in Finnish, a language I don’t speak.’ ‘My electric kettle sang the Hallelujah chorus as it came to a boil this morning.’ ‘My scone has the face of the Blessed Virgin on it.’ Oh wait, that last one really happened.

Not to mention the interesting and poignant detail that the two witnesses’ reaction to the miracle was to kill the fish. Well there you go. Jahweh incarnates himself as a giant carp in order to shout warnings at a pair of fish-cutters (what better audience after all? Not a couple of journalists or pundits or heads of state, oh no, that would make too much sense for our whimsical deity) and what does he get? Whacked on the head, cut up, and turned into gefilte fish. That’ll teach him. Smarty-boots. ‘If you want to send a message, call Western Union,’ as my high school English teacher used to snarl when we searched for the ‘meaning’ of Wuthering Heights.

Still, the shouting carp corresponds with the belief of some Hasidic sects that righteous people can be reincarnated as fish.

Can be? Can be? What, because this is a reward? ‘Hey, you’ve been so righteous and good and all-around what is needed that you have the option of going back as a fish. Top that! Am I generous or what?’

Oh well, never mind, I never have understood these things, obviously I’m far too shallow and boring and scientistic. I gotta go, the kettle’s boiling.



Fun at Skool

Mar 15th, 2003 8:23 pm

John Sutherland has redeemed himself. I took issue with him a few weeks ago when he wrote a column recommending the UK imitate the US in using athletic scholarships to increase minority access to higher education. I think there are some serious drawbacks to that way of doing things, so I said as much. But I think he’s right on the money here. I’ve nattered about this issue of students as consumers several times on B & W. I’m glad to know other people are noticing. One would think it would be self-evident that 18-22 year olds might possibly want qualities in their teachers other than scholarship or the ability to inspire, and that hence their evaluations would be of limited utility, for the same sort of reason that one doesn’t ask a five-year-old to plan the dinner menu.

It is instructive to note what students rate highly and what royally pisses them off. They like younger professors, generally…Above all, the younger instructors do not “condescend”. Students dislike boring instructors; they avoid waffling instructors who don’t know their stuff; but they loathe, with homicidal intensity, instructors who talk down to them…On the whole, professors know more than a first year undergraduate. How can wisdom and learning “not” condescend when confronted with vacant ignorance? Should you flatter a know-nothing student…?

Exactly so. That is, one would think, what the whole enterprise is about. But of course the idea that a teacher might know more than a student is an awfully ‘elitist’, hierarchical, hegemonic, kind of like colonialist idea, so we’d probably better get rid of it.

The UCLA system demonstrably encourages crowd-pleasing. I have trawled through a few hundred of the review pages and the one criticism which is never made is: “This professor is just an entertainer – there is no substance in his/her class”. Students will happily put up with bad teaching if it is “fun” bad teaching. “Amuse me!”, orders Demos…

Neil Postman wrote an interesting book called Amusing Ourselves to Death. I think he was on to something.



How to Make Bloody-Minded Women

Mar 12th, 2003 7:42 pm

The last women’s college in Oxford has just voted to remain a single-sex college. I’m always interested in these campaigns to keep women’s schools single sex, and the idea (which I tend to believe) that single sex education is good for girls and bad for boys. I went to a single sex school myself, one that combined with a boys’ school the year after I graduated. I regretted it at the time but later decided I’d been lucky. If nothing else, I derived the benefit (at least I think I did) that it never crossed my mind for an instant that women were supposed to shut up and let men do the talking. So when I went to a double-sex university I talked and argued with the best of them, if not more. Maybe I would have anyway, not being a notably compliant person; but I wonder.

It is a difficult question: whether women do better when they’ve had a chance to build up some blithe, unaware confidence in a boy-free zone, or whether that notion merely perpetuates the idea that women are so fragile and malleable and pathetic that they have to live in a bubble to survive at all. Val McDermid chooses the first option in this article from last year by a graduate of St. Hilda’s:

I think the single-sex environment allowed women to flourish in a way that is much harder for them in a male-dominated college. It meant that, when we emerged into the world of work, we had a bedrock of self-confidence that made it far easier for us to compete on the unequal terms we found there.

Former student Katherine Wheatley is definite: ‘Women benefit from a single-sex education, whereas men benefit from a mixed one,’ she says, adding that this ‘is borne out by the results at GCSE and A-levels year on year.’ I think it’s probably true, I’m glad St. Hilda’s stayed single-sex, and yet, and yet…I also wish women didn’t need special enclaves in order to flourish. But then I wish a lot of things, as we all do. If wishes were horses.



I Win I Win

Mar 8th, 2003 8:37 pm

Sometimes I find myself in an odd sort of competition with friends from other countries, specifically the UK: we argue over which of us lives in the more anti-intellectual culture. I say I do, they say they do, and so we improve the shining hour.

But I have a nice little piece of evidence here. Specifically this remark:

One reason people trained as philosophers press so hard for academic jobs is that the United States offers few other opportunities to use their training. Television here, unlike its counterparts in Europe and Asia, almost completely ignores university and intellectual life. So do radio and print journalism, devoting far more airtime and space to sports.

I rest my case. Who can deny it? Is there any equivalent of, say, Radio 4’s ‘Start the Week’ in the US? There is not. Are you kidding? A show on a mainstream (not even the more avowedly ‘highbrow’ Radio 3) radio station where five people talk about serious books and ideas, about books that all five of them have actually read, for a whole hour? I don’t think so! Do we see a lot of people starting their own philosophy magazines in the US and actually making a go of it? Not that I’m aware of!

No, I think I get to declare myself the winner in that particular game.



Oh dear, some journalists should only write about the Spice Girls…

Mar 7th, 2003 10:50 am

It is well-known that most journalists write mostly nonsense most of the time. Happily, this is normally about things like Posh Spice, the g-spot or Iraq. But Zoe Williams clearly has greater ambitions, for she writes nonsense about sociobiology.

This article is so bad that it is hard to know where to start when discussing it. Take this claim:

“There are logical problems with it which it doesn’t take a degree in zoology (even from Oregon) to determine. First, it relies, as so many of these theories do, on the egregious notion that, while women’s fertility is all downhill from the moment they start enjoying The Archers, men suffer no deterioration of sperm quality till they’re one day older than Charlie Chaplin. This is a misconception so basic that it’s incredible to hear scientists still peddling it…”.

Right. But no, it doesn’t. What it relies on is the fact that women have a far greater investment in their genetic material than do men, so it pays them to get it right in a way in which it doesn’t pay men to get it right. Crudely, if a man has sex with a bad genetic risk, he can have sex with a good one the next day. If a woman has sex with a bad genetic risk, she too can have sex with a good one the next day, but if she’s already pregnant, it’s too late for her genes.

Also, there’s an irony in Williams’s comment here. The fact that she talks about the quality of older men’s sperm suggests that she accepts the logic of the argument. In other words, the quality of the older man’s sperm only becomes relevant if the argument would work were there no deterioration in quality. But, of course, then it becomes a purely empirical question. Is there enough deterioration in quality to prevent the gene for fancying older men from propagating…?

The rest of the article is just as bad.

Take this claim:

“The fact that women have more to gain, biologically, from promiscuity, and men have more to gain from fidelity is very rarely touched upon…”.

Here’s Helena Cronin on the same matter:

“Give a man fifty wives, and he can have children galore; but give a woman fifty husbands – no use at all. For men quantity pays, for women quality pays. In the evolution of our species, many men didn’t breed at all whereas most women did; and some men vastly outbred others whereas women had about equal numbers of children. Thus, men’s stakes – their potential gains, potential losses – were immensely higher than women’s. So, generation after generation, down evolutionary time, natural selection favoured men with an appetite for multiple mates and a disposition to strive mightily for them. And, generation after generation, down evolutionary time, natural selection favoured women who chose prudently – for resources, protection, good genes.” [What Philosophers Think, Continuum, 2003].

So who’s right? Helena Cronin, Co-Director of the Centre for Philosophy of Natural and Social Science at the London School of Economics, or Zoe Williams who is… errr.



Navel Gazing Not the Answer?

Mar 4th, 2003 8:28 pm

The other article I had in mind was this one by Lauren Slater in the New York Times last weekend. It’s interesting that both articles express skepticism about the value, especially the curative or therapeutic value, of the talking cure and also of the intervention of therapists after traumatic events. At last! I’ve been rolling my eyes and making sarcastic remarks for years whenever a news story informs us that a plane crashed or a crazed gunman shot up a school/fast-food joint/post office or an earthquake leveled a town, and in the next breath added that ‘counselors are on the way’. As if that helps. As if we can all heave a big sigh of relief because at least professionals will be there to deal with the trauma. As if they don’t in fact often make things worse, hounding people who want to head for a corner and curl up in the fetal position to ‘process the experience’ instead.

And just as Tavris points out in her article, practitioners have a tendency to ignore research and cling to the ideas they’ve grown attached to. There are studies which show that people who don’t want to talk about their traumas in fact do better than people who do. George Bonanno, a researcher Slater talked to, has this to say:

In the 1980’s, trauma became an official diagnosis, and people made their careers on it. What followed was a plethora of research on how to heal from trauma by talking it out, by facing it down. These people are not likely to believe in an alternative explanation. People’s intellectual inheritance is deeply dependent upon a certain point of view.

Yes, and self-absorption is the American way, as Slater wittily points out.

We believe that the human spirit is at its best when it expresses; the individualism that Tocqueville described in his book “Democracy in America” rests on the right, if not the need, to articulate your unique internal state. Repression, therefore, would be considered anti-American, antediluvian, anti-art and terribly Teutonic.

Ah well. If enough people speak up, maybe we will eventually catch on that narcissism isn’t all that good for us after all.



When Therapy Isn’t

Mar 4th, 2003 7:04 pm

There have been a couple of interesting articles on therapy in the past two weeks, each taking a fairly skeptical view of the healing powers of the…discipline? field? trade? What is therapy really?

In this one in the CHE Carol Tavris elucidates the gulf between clinical psychology and therapy on the one hand, and scientific or research psychology on the other, pointing out a number of ironic and/or horrifying facts along the way. For instance there is the fact that in many U.S. states it is against the law to call oneself a psychologist unless one ‘has an advanced degree in clinical psychology and a license to practice psychotherapy’ but it is entirely legal to set oneself up as a ‘therapist’ with no training at all. The results are what one would expect, which leads to another fact that shocked me profoundly: that so-called rebirthing therapy is still legal in Colorado. Silly me, I thought it had been outlawed after what happened to Candace Newmaker, who was tortured to death in a ‘rebirthing,’ suffocated in her own vomit despite her cries and pleas which went on for…hours. (All because the poor child, age ten, didn’t bond with her adoptive mother; gee, imagine that, she must be sick, she must have ‘Reactive Attachment Disorder,’ a fanciful ailment which can actually be found in the notoriously ever-expanding Diagnostic and Statistical Manual, don’t get me started on that…) But no, therapists kicked up a fuss, so the dangerous cruel quackery is still allowed.

The two therapists convicted in Candace’s death are now serving time in prison, but efforts in Colorado to prohibit all forms of “restraint therapy” were defeated by protests from “attachment therapists” in the state and throughout the country. After Candace’s death, one member of the Colorado Mental Health Grievance Board noted with dismay that her hairdresser’s training took 1,500 hours, whereas anyone could take a two-week course and become “certified” in rebirthing. Yet the basic premise — that children can recover from trauma, insecure attachment, or other psychological problems by “reliving” their births or being subjected to punitive and coercive restraints — has no scientific validity whatsoever.

Tavris’ point is that the therapy industry, being completely divided from psychology as a science, does a bad job of checking its own beliefs, and has a great many of them that are discredited by evidence. Therapists are not trained in skepticism and evidence-testing the way scientists are, they don’t correct for their own confirmation bias the way scientists are supposed to, they don’t worry about falsifiability. Tavris also says, however, that clinicians have a depth of insight that is rooted in their work:

I agree that therapy often deals with issues on which science is silent: finding courage under adversity, accepting loss, making moral choices. My clinician friends constantly impress me with their deep understanding of the human condition, which is based on seeing the human condition sobbing in their offices many times a week.

But it is clear that therapy and clinical psychology, and the people they treat, would benefit enormously from much closer connections with research psychologists. Tavris, however, sees no reason to think that’s going to happen. The split between science and the rest of the world does more harm than we generally notice.



Kinds of Fundamentalism

Mar 3rd, 2003 8:03 pm

There is more than one kind of fundamentalism, as Terry Eagleton points out in this essay in the Guardian. Fundamentalism is not so much religious as it is textual, which means it covers a lot of ground.

Fundamentalists are those who believe that our linguistic currency is trustworthy only if it is backed by the gold standard of the Word of Words. They see God as copperfastening human meaning. Fundamentalism means sticking strictly to the script, which in turn means being deeply fearful of the improvised, ambiguous or indeterminate…Since writing is meaning that can be handled by anybody, any time, it is always profane and promiscuous. Meaning that has been written down is bound to be unhygienic…Fundamentalism is the paranoid condition of those who do not see that roughness is not a defect of human existence, but what makes it work.

That’s a brilliant bit of writing. Mary McCarthy examines a similar kind of attitude (one can’t quite call it thinking) in her Memories of a Catholic Girlhood, when she describes the relatives she lived with who, as it were, considered the folds of the brain to be unhygienic.

Fundamentalism makes another appearance in this review of a biography of Irving Howe.

Howe was, by all accounts, a rigid and doctrinaire Trotskyist radical in his youth…The key that opened the door for Howe was literature…If part of his early attraction to Trotskyism had been inspired by Leon Trotsky’s own prowess as an intellectual and literary critic, in good dialectical fashion literature’s open-endedness proved to be a subverter of Howe’s Marxist dogmatism. Plunging into the life of literary criticism, Howe developed, as he himself put it, a “taste for complication, which is necessarily a threat to the political mind.”

The open-ended and complicated, the improvised and ambiguous and indeterminate, the profane and promiscuous, the unhygienic and rough. Not always what’s needed, of course: not in math or engineering, for example. But when it comes to ‘human meaning’ and human politics, it seems a good deal safer than rigidity and paranoia and excessive hygiene.



Self-fulfilling Prophecy

Mar 2nd, 2003 9:08 pm

One of the terms the sociologist Robert Merton, who died last week, was known for was the self-fulfilling prophecy. There’s a lot of that sort of thing about. All the endless assuring each other, for instance, that rationality, secularism, skepticism, atheism are all wrong and mistaken and harmful and stupid because humans have a Deep Need for religion. We have a Longing for ‘spirituality,’ a Hunger for myth, a nostalgia for a Big Daddy to protect us. There is a god-shaped hole at the center of our consciousness and all the silly pointless time-wasting things we do are efforts to fill it. This review of Adam Sutcliffe’s Judaism and Enlightenment, for example, says as much (paraphrasing the argument of the book):

And the prospect of a world without myth is neither possible nor desirable, Mr. Sutcliffe argues: “We need both reason and myth.” Mr. Sutcliffe thus sees his book as more than a contribution to intellectual history. It is also a philosophical argument, he says, a cautionary tale against what he calls “the seductions of rationalist absolutism.”

But is it true? Or is it just something we’ve been told so many times we’ve come to believe it? Along with other mysterious things we’re told over and over until we believe them. Watching tv in the dark is bad for your eyes, swimming immediately after eating will cause you to drown, eating that piece of cake now will Spoil your Dinner, and you mustn’t be an atheist or you’ll spend the rest of your life seeking to fill that damn god-shaped hole. This putative need for myth we hear so much about. Funny, I’ve always (from childhood) been far more aware of the opposite reaction, a feeling of impatience and exasperation when people try to assure me of the truth of obvious fictions. I resented the whole Santa Claus imposition, and from that I went on to resenting similar kinds of fraud. So what about that need then? What about the need not to be systematically lied to all the time by grownups who ought to know better? What about the truth-shaped hole?



Made not Born

Feb 27th, 2003 8:03 pm

I’ve been pondering this business of confusing or blurring the boundaries (see this week’s Bad Moves) between a religion and a group of people, between Judaism and Jews, Islam and Muslims, that I touched on in yesterday’s Note and Comment.

It all has to do with Identity Politics, I suppose, which is a large subject, and one we will be exploring in the future. It’s partly a generational matter. All those children of assimilated Jews who turned on their parents with cries of indignation at having been denied their heritage, their background, their identity, and turned into bland inoffensive no ones in particular when they could have been real Jews. It’s an understandable reaction, and yet it has some unfortunate side-effects, at least I think so.

Just for one thing, another boundary it blurs is the one between cognitive matters and genetic ones, between what one chooses and what one is born into, between ideas and circumstances. We are born women or men, human or dog, animal or plant. Up to a point we are born British or Chinese, black or white. But we’re not born Christian or Jewish or Muslim any more than we’re born Marxist or libertarian or Scientologist. We are not born into a set of ideas. We can be and usually are trained up in such sets, but that is not the same thing, and when we come to man’s estate, sometimes we are resilient and strong and autonomous enough to examine the sets of ideas we’ve grown up in and actually decide whether we agree with them or not. I submit that religion is emphatically one of these sets rather than being in the same category as gender or species or order or even nationality or race. If we forget or conceal that fact, and pretend that the religion of our parents is part of our own ‘identity’, we are consenting to our own imprisonment. We are also abdicating our right to make our own cognitive decisions, and making the area of human choice smaller than it needs to be. We are in fact allowing ourselves to be determined, an idea that religious people usually resist. It’s interesting that people are so eager to accuse Darwinian thinkers like Richard Dawkins of being ‘determinists’ when in fact it’s people who conflate religion with identity and nationality who do the really thorough job of that.



Eating Your Cake and Having It

Feb 26th, 2003 11:26 pm | By

There are some strange assumptions in this review of Adam Sutcliffe’s Judaism and the Enlightenment. For one thing there’s a confusion throughout between Jews and Judaism. For another and related thing, there is a confusion between Judaism as a religion and Jewishness as nationality or ‘ethnic’ ‘identity’. As a result, there is a confusion between criticising a religion and hating people or a people.

There is also a lot of familiar and none the less annoying sneering at the Enlightenment.

The British-born historian is not the first writer to knock Enlightenment thinkers off their pedestals. The period’s “dark side” has been a recurring theme for more than a century now. Critics (among them Friedrich Nietzsche, the Romantic poets, and Michel Foucault) have charged the Enlightenment as an accomplice to a range of crimes that include not only racism, sexism, and “phallologocentrism,” but also bureaucracy, technocracy, ecological devastation, Western imperialism — even fascism.

And that’s the end of it. Did the charges stick? Is the evidence any good? Are the witnesses reliable? Have those zany ‘Enlightenment thinkers’ actually been knocked off their pedestals, or is it just that people have been trying to shove them for a long time? (Not to mention the vagueness on the dates of the ‘Romantic poets’ and the silly gossipy ‘dark side’ business, and the question of what pedestals.) Postel doesn’t trouble to say; he just announces the off-knocking and moves on. And ends up with this untrue bromide:

And the prospect of a world without myth is neither possible nor desirable, Mr. Sutcliffe argues: “We need both reason and myth.” The “mythic resilience” of Judaism calls attention to the limits of the Enlightenment. “Enlightenment fundamentalism,” Mr. Sutcliffe says, can distort our understanding of the Other, or that which we deem to be irrational. Mr. Sutcliffe thus sees his book as more than a contribution to intellectual history. It is also a philosophical argument, he says, a cautionary tale against what he calls “the seductions of rationalist absolutism.”

We need myth, do we. What then? Are those of us who become aware that the myths are in fact myths supposed to keep quiet about the fact, lest we fall into the dangerous embrace of ‘rationalist absolutism’? Are we supposed to lie? Cover up? And what does it even mean to say we need both reason and myth? What, one minute we believe Jesus walked on water and the next we don’t and the next we do again? Is it possible the two are not compatible? Unless we redefine myth so thoroughly that it no longer means what everyone takes it to mean, in which case we have what? A myth about myth? Unvacated pedestals, is what it looks like.



Are We Like Sheep

Feb 24th, 2003 11:55 pm | By

By way of addendum to my Note & Comment of yesterday, here is the essay ‘Dolly and the Cloth-heads’ that Richard Dawkins and others discussed on ‘Start the Week’. The subject is one that has interested and annoyed me for a long time. For instance, in Stephen Jay Gould’s strange little book Rocks of Ages he simply takes it for granted, very oddly it seemed to me, that the way to carve up the world between science and religion is that science should tell us the facts about the world and religion should tell us about morals. What a very peculiar assumption. Also a very common one, to be sure, but not well-founded; I don’t expect unexamined conventional wisdom from people like Gould. Why should religious people have a monopoly on moral issues? Indeed, why should they have any claim to expertise at all? Why are they not, on the contrary, disqualified, since they rely not on actually thinking about moral issues but on authority? What good is that? Especially if you take a look at the authority in question. The Bible, for example: not entirely a paragon of ethical wisdom.

So Dawkins is not convinced of the utility of the ‘representatives’ of the various religious ‘traditions’ and the ‘voices’ from each ‘community,’ which all have to be heard lest any one feel slighted.

This has the incidental effect of multiplying the sheer number of people in the studio, with consequent consumption, if not waste, of time. It also, I believe, often has the effect of lowering the level of expertise and intelligence. This is only to be expected, given that these spokesmen are chosen not because of their own qualifications in the field, or as thinkers, but simply because they represent a particular section of the community.

He then goes on to suggest, daringly, that a certain minimum qualification in the brains department ought to be expected along with being a spokesman for a particular ‘tradition’ or ‘community’. It’s the kind of thing that ought to be blindingly obvious but of course is also the kind of thing that drives people into frenzies of irritation. Not us though.



Genes, Yanks, Ethics

Feb 23rd, 2003 5:09 pm | By

When I have an odd moment, or forty-five of them, I listen to archived editions of BBC Radio 4’s Start the Week. Yesterday I listened to this one from February 10, with Richard Dawkins and Janet Radcliffe Richards, as well as Robert Harvey and, finally extricated from a traffic jam, Andrew Roberts. This is a highly interesting show which touches on a number of issues we are interested in at B and W. Just for one thing, we get to hear Andrew Marr tell Richard Dawkins ‘You’re not a genetic determinist, are you,’ and Dawkins reply that he’s long been plugging that line: that the way we have evolved does not determine the way we have to be. The brain has evolved, he explains, to over-reach what the genes would want if genes could want anything. He cites contraception as the most obvious example of our doing what our genes wouldn’t ‘want’ us to do. That doesn’t mean it’s easy, he elaborates: it can be difficult to wean ourselves off things that have come to us from our Pleistocene past. In reply to a question from Robert Harvey, Dawkins draws a distinction between what Darwinism can tell us about what we ought to do, which is nothing, and where we get our ethical feelings, which is something. Janet Radcliffe Richards adds that a Darwinian understanding of where we get our ethical feelings tells us nothing about which ones we should follow, that’s an entirely separate question, and Dawkins says ‘Exactly’.

So: there you are then: Richard Dawkins is not a genetic determinist. So it’s time for people to stop calling him that.

The discussion then gets into the equally fascinating territory of why panels and committees that are convened to discuss ethical issues such as cloning always include religious figures. It’s not, Dawkins points out, because they’re especially good at reasoning or arguing, it’s not because they’ve earned a place in such discussions, it’s simply because they represent a tradition, and what an odd reason that is for including people. Indeed. There is also some rather painful chat about how weird Americans are, at which I burst into tears and sobbed ‘Not all of us!’ But I can hardly blame anyone for thinking so.



Down With Indifference

Feb 22nd, 2003 9:09 pm | By

There’s been an interesting convergence lately of worry about passion and its absence, detachment and its dangers, or on the other hand about the intrusiveness and intolerance of passion and engagement. The two stances – passion and dispassion – have been exemplified in two thinkers: Richard Dawkins and Louis Menand.

David Bromwich took Louis Menand to task in the New Republic in January for his lack of a ruling passion or driving enthusiasm, excitement or anger, for being too easily unimpressed, too cool, too responsible and distant.

The idea of a radical break in thought is alien to Menand. The leveling of distinctions also serves as an intellectual labor-saving device. Nothing is very new; nothing, maybe, ever was; nothing matters as much as you think it matters.

Then last week Leon Wieseltier renewed the charge, again in the New Republic. This time the subject was George Orwell, and an essay Menand wrote about him for the New Yorker. Wieseltier is far more indignant than Bromwich (in fact it would be an interesting exercise to set up a Passion-o-Meter for all the participants in this argument).

“We don’t live just by ideas,” he observes in his sedative way, as if anybody believes that we do live just by ideas. Of course, it is precisely because we don’t live just by ideas that we must live also by ideas; but I am getting heavy. Menand sneakily makes Orwell over in his own diffident, perspectivist, mildly anti-intellectual image, so as to relieve us of Orwell’s obligations.

It’s exhilarating to see all these middle-aged or elderly intellectuals speaking up for passion and extremism of opinion. But then we hear from a former Anglican bishop who reviews Richard Dawkins’ new book A Devil’s Chaplain in the Guardian. He makes a very interesting comparison between Dawkins and Darwin, likening the latter to the polite tactful non-interventionist Anglican (this is an ex-bishop, remember) and Dawkins to the pesky intrusive intolerant Evangelical.

A friend of mine once remarked that he liked Anglicanism, because it didn’t interfere with your religion or politics, whereas Evangelicalism couldn’t leave anyone alone and meddled endlessly in people’s lives. If Darwin was a non-interventionist atheist, Dawkins is a great believer in the pre-emptive strike.

Well possibly, but then again it’s important to remember that Dawkins is a writer and teacher. They are supposed to intervene, that’s their job, that’s even the socially approved work they do. Teachers are meddlesome and interventionist when they teach pre-literate children to read, too, and innumerate ones to do math, and ignorant ones history and biology and poetry. And a good thing too. Personally I’m with the old geezers speaking up for passion and excitement and commitment. Leave languid tolerance and not caring much to the young, they’re so much better at it.



Chaplains and Evangelists

Feb 16th, 2003 8:28 pm | By

So, we’re agreed then. Comfort and safety and enjoyment are not what’s needed, not unless one is ill or injured or a refugee from a war zone. We need our gadflies and lecturers and correctors and reformers, our troublers of the peace. We need our evangelists.

The Guardian has a review of Richard Dawkins’ new book, A Devil’s Chaplain, today. The reviewer (who, a correspondent tells me, used to be the bishop of Edinburgh) makes an interesting distinction between Darwin’s ‘classically Anglican’ atheism and the classically Evangelical variety Dawkins goes in for.

A friend of mine once remarked that he liked Anglicanism, because it didn’t interfere with your religion or politics, whereas Evangelicalism couldn’t leave anyone alone and meddled endlessly in people’s lives. If Darwin was a non-interventionist atheist, Dawkins is a great believer in the pre-emptive strike.

Well what else are teachers for? That’s their job, isn’t it, that’s what they do and what they’re supposed to do. Isn’t it? Not leaving people alone and meddling endlessly in the contents of their heads? Surely if one actually cares about politics and religion, ‘interfering with them’, i.e. arguing that there are better versions, is the logical thing to do. But then I’m an Evangelistic type myself, so I would think that.



Thorns, Ice, Danger

Feb 15th, 2003 7:44 pm | By

The article by Harvey Mansfield we linked to in today’s News section examines a number of ways students are coddled or spoiled or pampered at Mansfield’s Harvard, rather than being challenged and stretched as he thinks they ought to be and as, surely, is the whole point of education. If we are all perfectly all right just as we are, what do we need education for at all? Decoration? A status symbol, a positional good, bragging rights? A pretext for playing football or getting drunk? An expensive way to postpone getting a job?

The article is accompanied by a colloquy which offers some hair-raising personal testimony on the subject.

A questionnaire I gave students in every class to test their general knowledge led to one minority student claiming that it made him “feel stupid.” I suggested that perhaps the student was reacting in an overly sensitive manner and was informed by the department chairman that there is no such thing as hypersensitivity. (I’m sure this will come as news to paranoid schizophrenics.)

A repeated theme is the idea of education as a consumer item. This is also the subject of an article in Harper’s from 1997 which points out the way student evaluations tend to make teachers want to please and entertain their students more than provoke or push them. The classroom becomes just one more stand-up routine, and students have a good time but are left as they are.

Most of all I dislike the attitude of calm consumer expertise that pervades the responses. I’m disturbed by the serene belief that my function — and, more important, Freud’s, or Shakespeare’s, or Blake’s — is to divert, entertain, and interest…I don’t teach to amuse, to divert, or even, for that matter, to be merely interesting. When someone says that she “enjoyed” the course — and that word crops up again and again in my evaluations — somewhere at the edge of my immediate complacency I feel encroaching self-dislike. That is not at all what I had in mind.

Just so. Enjoyment, pleasure, amusement, and especially ease, comfort, and self-satisfaction are all very well in their way, but they are not enough, and they are not what people need at age eighteen. Discomfort, agitation, fear, excitement, hunger, are what’s needed. Not soft pillows and fluffy blankets and a spot before the fire and a basin of Mr. Woodhouse’s nice thin gruel.