Dude, Where’s My Site of Hegemonic Dominance?

Jan 26th, 2004 2:04 am

John Holbo has a very sly post on the tireless Bad Writing subject on his blog. He read the first three issues of the PMLA – Publications of the Modern Language Association – for 2003 cover to cover, twice. (Then he had a complete blood transfusion and is well on the way to recovery – now cut that out.) And he has some thoughts.

First he quotes Judith Butler explaining why bad writing is necessary and good:

The accused then responds that “if what he says could be said in terms of ordinary language he would probably have done so in the first place.” Understanding what the critical intellectual has to say, Marcuse goes on, “presupposes the collapse and invalidation of precisely that universe of discourse and behavior into which you want to translate it.”

Then he comments:

Mutatis mutandis, understanding how Bad Writing Contests are funny may presuppose the collapse and invalidation of precisely that universe of MLA discourse and behavior that is the butt of the joke. Judith Butler fails to apprehend the crucially performative aspect of this subversive critical work – the way in which Bad Writing Contests provide a voice to those on the margins by challenging the dominant hegemony of ‘theory’ in literary studies.

Naughty Butler, failing to apprehend the performative aspect. Very droll.

Just as you give up the right to insist on hushed reverence for all things academic when you title your paper, “Dude, where’s my reliable symbolic order?”, so you give up your right to demand fair and open-minded consideration of your views when you yourself explicitly advocate adopting language which functions expressly to short-circuit critical dissent by presupposing that one’s opponents are not just wrong but basically in a state of complete intellectual collapse. Given that this is her attitude, Judith Butler was never going to listen to Denis Dutton’s criticisms of her views in any case. He is, as she says, the editor of “a small, culturally conservative academic journal”. Being a figure on the margins – whom Butler is working to silence, by collapsing any language in which he can express his views – what does Dutton have to lose by mocking, rather than arguing?

There’s a good deal in what he says. He also says a good deal more, including his own crude but effective grading system for the articles in the PMLA. OK and Trivial are two of the possible grades, which is quite sensible. And he ends with a sentence that would make a good Thought for the Day:

It is a nice question whether it is worse to perpetrate tautological emptiness with an aura of significance or egregious unsoundness/invalidity with an aura of significance.

I love nice questions, and I plan to think about that one at odd moments for some time.



Reading For Something

Jan 25th, 2004 5:06 pm

One thing (but not the only thing) that prompted this train of thought (or perhaps bus of rumination or minivan of woolgathering or rollerskate of idle daydreaming) was something I read a few days ago in another of Dwight Macdonald’s letters, this one from January 1946, when Macdonald was editing his own magazine Politics.

I suppose you’ve read by now Simone Weil’s article on The Iliad. The response to it has surprised me; I thought it was a great political article, dealing with the moral questions implicit in the terrible events one reads about in every day’s newspaper, which was why I played it up so prominently in the issue…Nothing I’ve printed yet seems to have made so deep an impression. The only people who didn’t understand how such an article had a place in a political journal were – and I think this is profoundly significant – all of them Marxists. To a Marxist, an analysis of human behavior from an ethical point of view is just not ‘serious’ – even smacks a little of religion.

I think the Marxists who didn’t understand must have had a fairly crude understanding of Marxism, but that’s another subject. The relevant aspect is the question of what has a place in a particular kind of journal and what doesn’t – and the fact that Macdonald was thinking about that question. I was already thinking about it when I read that – well in fact I’m always thinking about it, really. Not every second, but every day, usually several times a day. Every time I link to a News item, in fact every time I look for a News item, which in a sense is every time I read anything at all, other than perhaps package ingredients or addresses on envelopes. As Ensign Pulver was looking for marbles all day long, I’m looking for News items all day long. Though not always actively looking – sometimes I’m just reading, like a normal person, and then as I read the act of reading is transformed into the act of reading for something. Though that doesn’t quite describe it either, because I seldom do ‘just read’ any more – or I both just read and read for something. Which is interesting, in a boring sort of way – by which I mean it interests me but I realize it may not interest everyone. Actually maybe I’ve never ‘just read’, at least not exclusively. I think that’s right – but the percentage has changed.

There’s a lot to be said for reading for something. There’s also a good deal to be said for just reading, but on the whole I prefer reading for something, as long as the something I’m reading for is worth it. I thought while I was typing it that all this was an unconscionable digression that I would probably delete, but I’ve changed my mind. It is about the subject under discussion, in a strained sort of way. Why do we read, after all? Surely the way we think about that question has some connection to what kind of thing we want to read, and why, which has some connection to why Macdonald published Weil on The Iliad. She wrote it for something, he published it for something, the readers read it for something.

At any rate, I was thinking about it more than usual even before I read the Macdonald letter. It was because of posting an item about the Bush administration’s approach to science – I was thinking about the fact that that’s not academic nonsense (to put it mildly), so it’s not strictly our subject. I decided that what did make it our subject was the element of bullshit involved. The fact that it’s not just mistaken, but the kind of mistaken rooted in prior commitments. I decided it is worth pointing out occasionally that the academic left certainly does not have a monopoly on that kind of bad thinking. I try to do that kind of thing sparingly, because otherwise B&W will just be about anything and everything; but that’s my reasoning for doing it once in a while. That’s my Iliad.

Because our subject is woolly thinking of a particular kind – thinking that’s fuzzy because it’s distorted, because it starts out from the wrong place. Because it starts not from genuine inquiry but from what Susan Haack calls pseudo-inquiry – not from a real desire to find the truth but from a desire to make a case for a pre-selected conclusion. That’s the bullshit aspect, the bogus element, the pseudo factor. B&W examines the academic manifestations of woolly thinking and pseudo-inquiry and bullshit, but it is worth offering an occasional example from other parts of the world too, I think, if only for epistemological reasons. It is part of the overall story, of the Big Pikcha as I said to my colleague the other day. It’s part of an understanding of how woolly thinking and bad moves work, to be able to recognize them in a variety of habitats, to realize that they’re not confined to one discipline or sector of the economy or political orientation. If we don’t know it when we see it, how can we resist it?



What Was the Question Again?

Jan 23rd, 2004 9:19 pm

I’ve been thinking a good deal about focus lately. About relevance, subject matter, connections. The competing merits of breadth and intensity, range and depth. About how to think about such things, and how to decide between them – metaquestions again. I seem to think about metaquestions a lot – but then that’s not surprising, is it; B&W is essentially about metaquestions. At least I think it is. That’s one of the metaquestions I’ve been thinking about – what is B&W about.

Not that I don’t know, or don’t think I know, or think I don’t know. I do think I know. Or at least I know I have an opinion. But there could be other opinions – and in fact there is at least one other opinion. There are people – one or two at a minimum – who think B&W could have a narrower focus. Who would even prefer it to have a narrower focus – without necessarily going so far as to say it would be better with a narrower focus. Which leads us into some more metaquestions about such things as the difference (if any – that’s a further metaquestion – whether there is a difference) between preferences and moral or aesthetic or epistemological judgments. Between likes and dislikes on the one hand, and moral or cognitive evaluations on the other. Is saying ‘That’s good’ exactly the same as saying ‘I like that’? How do we distinguish between taste and judgment? Much-vexed questions, all of them.

And in a way part of B&W territory – in my view of the matter. But I can see why other views would differ. I think it is (in a way, as I said) because ‘how do we know?’ and ‘how do we distinguish?’ questions are background questions for the more specific, local questions that are our brief. But then I take a broad view, and that’s not the only view possible.

Perhaps it’s more accurate to say I take a variety of broad views. Maybe I take a different one at any given moment – no, I’d better not follow that thought up, or we’ll be here all night, and I’ll never get around to saying what I was going to say. What was that, anyway? Something about what kinds of subjects I think are relevant enough to the concerns and interests of B&W that it makes sense to link to them. Something about epistemology, and what a lot of ground it covers – how hard it is to talk about almost anything without getting into it. Along with something about the interest inherent in the way apparently unrelated things actually fit together, and the thought that what B&W lacks in focus it makes up in range and breadth and interestingness. Or at least that it does to my taste, but that doesn’t mean it does to everyone’s.

But you see the problem. Once I start talking about metaquestions I can’t stop, I start following them as they zip around the room, and pretty soon I’ve got myself tangled up in a horrible knitting-wool snarl of them. Of course that’s not true really – it’s just rhetoric. I could delete and re-write. Leaving the digressions there is part of the point (I have a point? No not really what you’d call a point, just an endlessly-deferred series of approaches to points) – the point being that the subject matter of B&W seems to me to be so large and complicated and branching that a narrower focus would simply leave most of the subject out. But then that depends on how you define the subject.

In fact even the subject of this N&C is too large and branching for one N&C, so I’ll have to do a chapter 2 later.



Investment Tips

Jan 22nd, 2004 1:43 am

I’ve been reading a volume of the letters of Dwight Macdonald lately. One bit I read this morning seemed particularly appropriate for B&W. It’s from a letter in December 1937, to Freda Kirchwey, the editor of the Nation, taking that magazine to task for a number of blind spots, such as being too ‘timid and stuffy-genteel’ in its editorial attitude to the New Deal’s recent swing to the right, and particularly for being hostile to the Commission that was investigating the Moscow Trials (he doesn’t mention John Dewey but I assume that’s the Commission usually known as the Dewey Commission).

While I was at Fortune, the Nation was always to me the great symbol of honest, truthful, intelligent journalism – everything that I missed at Fortune. But it now appears that the Nation, too, has its commitments, its investments, so to speak, just like Fortune. A journalistic investment in Soviet Russia seems to me no more admirable than a like investment in the Greatness and Goodness of Big Business.

Commitments and investments. There it is. Those are great things, necessary things, but they can rip such a hole in our ability to think straight and to see what’s in front of us – and above all to tell the truth about it. The late ’30s were notoriously a rough period for that kind of thing. What with the Moscow Trials, and Trotsky, and the POUM and the anarchists in Barcelona, and what had happened during the forced collectivization – there was a staggering amount of systematic lying, denial, looking away, self-deception in those years. It poisoned the air on the left for years, decades – arguably the left still hasn’t recovered. Watch those investments.



Before After Theory

Jan 20th, 2004 9:19 pm

This is an interesting review by Elaine Showalter of Terry Eagleton’s new book After Theory.

In the ’80s, theory ruled, and the subject formerly known as literature was banished or demoted in the interests of philosophy and aesthetic abstraction.

Hmm. But was it really philosophy? Or was it just little bits of philosophy here and there? That’s fine, it’s no crime to know only a little about something, that’s certainly my situation about almost everything – but one has to be clear about it. One has to be careful, it seems to me, not to confuse sampling philosophy with really studying it, and one has to be equally careful not to confuse Literary Theory with philosophy, because (this is a subtle point, now, but bear with me) they are not the same thing. One does get the impression at times that Literary Theorists think they are doing something more like philosophy than they in fact are. One also often wonders, if they want to do philosophy, why they aren’t in the philosophy department. It is there, after all, all set up for the purpose; why not take advantage of the fact? It seems so perverse to amble off to a different department and then set up to do the subject of another one. People don’t enroll in French departments in order to do engineering, do they? Or engineering departments in order to do French. So why the English or Comparative Literature Department in order to do philosophy? One of life’s little mysteries.

He admits that cultural theory “has been shamefaced about morality and metaphysics, embarrassed about love, biology, religion, and revolution, largely silent about evil, reticent about death and suffering, dogmatic about essences, universals, and foundations and superficial about truth, objectivity, and disinterestedness.” As he says in a characteristically ironic understatement, those constitute a “rather large slice of human existence” to ignore. So he sets out to fill the gaps and propose fresh theoretical approaches to the Big Questions.

Well yes, it is a rather large slice, and that’s part of the problem, surely. Why should cultural (or literary, or critical) theory be expected or able to contribute to all those subjects? Isn’t that expecting an awful lot? Isn’t that expecting it to be an omniscient and universally applicable discipline? What would qualify it to take all that on? That’s another one of those little mysteries.

Eagleton wants to free cultural theory from crippling orthodoxy by challenging the relativism of postmodernist thought and arguing on behalf of absolute truth, human essences, and virtue (which includes acting politically). He engages with definitions of morality.

And thus we see the drawback of pretensions to universal reach and of confusing literary theory with philosophy – one ends up re-inventing the wheel and belaboring the obvious. If fans hadn’t taken ‘postmodernist thought’ too seriously to begin with, Eagleton wouldn’t now have to waste energy on challenging it, but they did, so he does. That’s where fashion gets you.

First, why isn’t literature, rather than theory, the best place to go for help about morality, love, evil, death, suffering, and truth, among other things? Having written a big book about tragedy, Eagleton obviously knows that on some topics, Shakespeare is a lot more relevant than Saussure. Eagleton himself is able to command an encyclopedic range of literary reference, but he takes the side of theory rather than literature, or even a position between the two. “Critics of theory sometimes complain,” he notes, “that its devotees seem to find theory more exciting than the works of art it is meant to illuminate. But sometimes it is. Freud is a lot more fascinating than Cecil Day-Lewis. Foucault’s The Order of Things is a good deal more arresting and original than the novels of Charles Kingsley.”

And then again why are literature and theory the only two choices? And what does ‘help’ mean? What does ‘relevant’ mean? Why Saussure? Why Freud, why Foucault? It’s all a muddle, frankly. One undefined term or unexamined assumption after another. Time for After After Theory.



Eat Your Sugar

Jan 18th, 2004 7:24 pm

This sounds familiar, doesn’t it? We’ve read articles like this before? Only a few days ago in fact? The subject does seem to keep coming up. The Bush administration and profit-making entities on the one hand, and scientific advice and knowledge on the other. Bulldozers make better habitats than rivers do; wetlands pollute; academic scientists who receive grants should be kept off federal peer review panels while scientists with ties to profit-making entities should not. Day is night, up is down, black is red. Do we begin to detect a pattern here?

The President insists fighting fat is a matter for the individual, not the state. But today The Observer reveals how he and fellow senators have received hundreds of thousands of dollars in funding from ‘Big Sugar’. One of his main fundraisers is sugar baron Jose ‘Pepe’ Fanjul, head of Florida Crystals, who has raised at least $100,000 for November’s presidential re-election campaign.

The individual, not the state. Right. And that means that the state must not interfere, even by so much as issuing reasonable dietary guidelines, with people who make money by selling sugar-added foods. It’s up to the individual to ignore all that advertising, have a little backbone, and just stay thin. Simple.

The Bush administration, which receives millions in funding from the sugar industry, argues there is little robust evidence to show that drinking sugary drinks or eating too much sugar is a direct cause of obesity. It particularly opposes a recommendation that just 10 per cent of people’s energy intake should come from added sugar. The US has a 25 per cent guideline.

25% added sugar – sure, that sounds about right. Could be worse. Could be 75% after all.

Too bad the spinach lobby isn’t as powerful as the sugar one. But I guess there just aren’t the bucks in spinach that there are in the sweet stuff.



Hands Off Lacan!

Jan 17th, 2004 8:46 pm

This is quite an amusing piece. Albeit irritating. So much rhetoric, so much slippery use of emotowords, so much vagueness where precision is needed – all to protect the heritage of Freud and Lacan. Why, one has to wonder. What is it about Freud that makes people who, one would think, ought to know better cling so fiercely? I suppose I could postulate some sort of psychoanalytic answer, but would that tell us anything?

“When they speak of ‘professionalising’ people whose business is human misery; when they speak of ‘evaluating’ needs and results; when they try to appoint ‘super-prefects’ of the soul, grand inquisitors of human sadness – it is hard not to agree that psychoanalysis is in the firing line,” Levy said.

That’s a translation, I assume, so perhaps it’s unfair to look too closely at the words – but I’m going to anyway. ‘People whose business is human misery.’ What does he mean, ‘business’? People who make money off human misery? Why should they be protected? Or does he mean something along the lines of experts in human misery, people who know a lot about human misery? But one can know a lot about human misery, in some sense – arguably all humans know that – without having the faintest clue how to ‘fix’ it or cure it or do anything about it at all other than hand-wring or watch or write poetry. Even quacks and charlatans can know a lot about human misery.

Critics say the absence of regulation and a growing demand for therapy of all kinds has led to a proliferation of astrologers, mystics and con-artists – and they are demanding that the public be protected by a system of recognised qualifications. But Lacan, who died in 1981, said that the “analyst’s only authority is his own,” and his followers believe the state has no business interfering in the mysteries of the id and the unconscious. In a faction-ridden climate, many psychoanalysts also see the government’s initiative as an attempt by their arch-enemies the psychiatrists – hospital-based doctors who prescribe drugs for treating mental illness – to marginalise their work.

The analyst’s only authority is his own – well that’s blunt, at least. That’s a good concise summing-up of what’s wrong with psychoanalysis. It presents no evidence, it is not peer-reviewed, it rules out falsification. It’s a form of hermeneutic, we’re often told, which is all well and good but it also claims to be therapeutic. It wants to have it both ways, in short: to charge a lot of money for its ministrations on the understanding that they are in some way helpful for human misery, but to escape oversight and regulation on the understanding that psychoanalysis is some kind of sacred mystery. A ‘marginalisation’ of their work would be a fine thing, if you ask me.



The Poetics of History 2

Jan 17th, 2004 5:48 pm

My first comment on this subject has prompted some comments that suggest a lot of further comments (I’m in a permanent state of Infinite Regress here: everything I write seems to suggest several hundred more things I could write) and subjects to look into further. Empathy; the relationship of research to teaching; other minds and solipsism; the tendency to value emotional stances like empathy over ‘cooler’ more cognitive commitments to justice or equality; and so on.

And there is also this article in the New Yorker about a book of history and a play, Thucydides’ History and Euripides’ Medea.

To describe this war in all its complexity, Thucydides had to invent a new way of writing history. In his introduction, he says he will eschew “literary charm”—mythodes, a word related to “myth”—in favor of a carefully sifted accuracy…But this desire for what we would call balanced and accurate reporting led, paradoxically, to a most distinctive literary device: the use of invented speeches and dialogues to illustrate the progress of the war and the combatants’ thinking about it. Every key moment in the war…is cast as a dialogue that, as Thucydides admits, may not be a faithful reproduction of exactly what was said in this or that legislative session or diplomatic parley but does elucidate the ideologies at play. “My method has been,” he writes, “to make the speakers say what, in my opinion, was called for by each situation.” This, more than anything, is what gives the History its unique texture: the vivid sense of an immensely complex conflict reflected, agonizingly, in hundreds of smaller conflicts, each one presenting painful choices, all leading to the great and terrible resolution.

Which is very like the way Euripides wrote his plays, to the disapproval of Nietzsche, who claimed that Socrates and Euripides rationalized and so ruined tragedy. But others like the interplay of argument, the effort to think things through, the questioning of each other’s assumptions, that many of Euripides’ plays show us. There is room for empathy – with Medea, Andromache, Iphigenia – and also for enhanced understanding of the issues, at least as one Athenian playwright saw them. And the great skeptical historian was doing much the same thing. Thucydides was a little bit of a dramatist and Euripides was a little bit of a historian.



Graduate School and its Discontents

Jan 16th, 2004 9:04 pm

Invisible Adjunct has another good comment thread going. Remember that interesting (and often symptomatic) thread about the MLA a few weeks ago? There have been interesting ones since, and now there’s an especially interesting one. Well I say that because of the last two posts (last at the moment, last when I saw the thread), 10 and 11. Number 10:

In the first year of graduate school in archaeology we spent so much time learning about post-modernist theory and how archaeology could not really tell you about the past (it could only reveal your current political views on power relationships) that by the end of the year my professors convinced me that there was no reason to continue my studies in that field. I dropped out and went to law school.

Number 11:

I also remember a huge emphasis on postmodernism when I was a doctoral student in the college of education. Yes, I enjoyed postmodern theory, but there were never any other perspectives; I had to find those on my own. For example, we never studied education from a Marxist perspective; after all, Marxism had been determined to be too “modernistic.” I guess my big gripe with postmodern theory is that it tends to lead to nihilism and a total lack of social solidarity and responsibility. It really reached the pinnacle of craziness when issues like classroom management were turned into postmodern “points of view.” For example, I remember several of us classroom teachers posing serious questions about what happened in our classrooms. We weren’t looking for “how-to” answers, but something better than the “what is disobedience, anyway?” I’m sorry, but if you were to spend time in an 8th grade classroom, I don’t think you’d have any problem with the concrete reality of negative behavior. What was super-ironic is that whenever we would be looking at politics or power relations and anyone would give down-to-earth examples of how power REALLY operated (i.e. through control of workers, surveillance, etc.) then those became modernist concerns and were open to interpretation, not social action.

Yes, from everything I hear, people in real-life 8th grade classrooms have no trouble saying what disobedience is, and why you need some of the other thing if you’re going to teach 30 or 35 children. And there’s something really enormously…ironic? Or is that too modernist. Perhaps I mean playful? Yes, no doubt that’s it. There’s something enormously ‘playful’ in the fact that postmodernist theory causes people to quit archaeology and go to law school instead. Actually what should be happening is that everyone everywhere should be dropping out of all academic programs – because those are all modernist projects too after all – and going into advertising. What could be more postmodernist than advertising? Especially now, now that everybody knows that everybody knows that everybody ‘sees through’ advertising, and ‘transforms’ it into a ‘site of resistance,’ so that advertising gets weirder and weirder, or more and more postmodern, in order to out-resist and out-transform and out-postmodernize all those people in the postmodern audience. Surely it’s the duty of all good postmodernists to provide more sites of resistance for everyone. And of course the pay is better, and you don’t risk ending up in places like Ithaca or Lubbock, and you don’t have to do all that reading.



The Poetics of History

Jan 15th, 2004 9:14 pm

There was an interesting subject under discussion at Cliopatria yesterday and this morning – history as defamiliarization, poetics and history, the difference between history and fiction. The whole subject touches on a lot of difficult, knotty questions – other minds; the reliability or otherwise of testimony, autobiography, narrative – of what people recount about their own experiences; empathy; imagination; the general and the particular, the abstract and the concrete – and so on. Meta-questions.

I wondered about the much-discussed idea that fiction can teach empathy in a way that more earth-bound, or factual, or evidence-tethered fields cannot. That novelists have a special imaginative faculty which enables them to show what it’s like to be Someone Else so compellingly that we learn to be tolerant, sympathetic, forgiving, understanding etc. in the act of reading. Cf. Martha Nussbaum in Poetic Justice for example. It seems plausible, up to a point, but…only up to a point. For one thing there are so many novels that are full of empathy for one character but none for all the others; and there are so many that have empathy for the wrong people and none for their victims (cf. Gone With the Wind); and there are so many mediocre and bad novels, and the aesthetic quality of a novel has little or nothing to do with its level of empathy-inducement.

I think there are a couple of background ideas at work here, that could do with being dragged into the light. One is that all novelists, all fiction-writers have this ability to teach empathy – that there is something about the very act of telling a story that produces character-sympathy, and that character-sympathy translates into sympathy for people in general as opposed to sympathy for one particular character. But anybody can set up as a novelist, including selfish, unreflective, egotistical people. There is no guarantee that telling a story has anything to do with empathy. And then there is a second idea, that what novelists imagine about other minds is somehow reliable. But why should that be true? Especially why should it be true of all novelists? At least, why should it be any more true than it is of the rest of us? We can all imagine what’s going on in other minds – and we can all be entirely wrong. Or not. It may be that particularly brilliant novelists are better at imagining what’s going on in other minds – at guessing the truth – but particularly brilliant novelists are a rare breed, and in any case, nobody knows for sure whether they have it right or not. We think they do, it sounds right, but we don’t know. All it is, after all, is the imagining of one novelist. Lizzy Bennet and Isabel Archer and Julien Sorel may tell us what it’s like to be someone else – or they may not. We simply don’t know.



Brought to You By

Jan 15th, 2004 7:07 pm

This is a disgusting item in the Washington Post. It sounds good at first – but then it’s meant to. And at second it doesn’t sound good at all.

The administration proposal, which is open for comment from federal agencies through Friday and could take effect in the next few months, would block the adoption of new federal regulations unless the science being used to justify them passes muster with a centralized peer review process that would be overseen by the White House Office of Management and Budget.

It’s those last seven words that give the game away – along with the word ‘centralized’ perhaps. Peer review is one thing, ‘centralized’ peer review is another, and ‘centralized’ peer review overseen by the White House Office of Management and Budget is quite, quite another. Which peers would those be, exactly? Centralized by whom? And – ‘overseen’ in what sense, using what criteria? One can guess all too easily.

But a number of scientific organizations, citizen advocacy groups and even a cadre of former government regulators see a more sinister motivation: an effort to inject White House politics into the world of science and to use the uncertainty that inevitably surrounds science as an excuse to delay new rules that could cost regulated industries millions of dollars…Under the current system, individual agencies typically invite outside experts to review the accuracy of their science and the scientific information they offer…The proposed change would usurp much of that independence. It lays out specific rules regarding who can sit on peer review panels — rules that, to critics’ dismay, explicitly discourage the participation of academic experts who have received agency grants but offer no equivalent warnings against experts with connections to industry. And it grants the executive branch final say as to whether the peer review process was acceptable.

Perfect. Disinterested academics need not apply, but industry scientists are welcome. And the executive branch, with its dazzling track record of scrupulous impartiality in scientific matters, has the final say.



Certainty No

Jan 14th, 2004 8:10 pm

The New York Times has an article by Edward Rothstein on the annual Edge question, which John Brockman poses to a large number of writers, scientists and thinkers (many of them all three at once). This year the question is ‘What’s your law?’

There is some bit of wisdom, some rule of nature, some law-like pattern, either grand or small, that you’ve noticed in the universe that might as well be named after you. Gordon Moore has one; Johannes Kepler and Michael Faraday, too. So does Murphy. Since you are so bright, you probably have at least two you can articulate. Send me two laws based on your empirical work and observations you would not mind having tagged with your name. Stick to science and to those scientific areas where you have expertise. Avoid flippancy.

Very good. But Rothstein says an odd thing in his piece.

But curiously, an aura of modesty, tentativeness and skepticism hovers over the submissions — this from a group not renowned for self-abnegation. This may, perhaps, be an admission that fundamental insights are not now to be had. But it may also be an uncertainty about science itself.

But it’s not curious at all. Far from it. Is the reporter not aware that that’s how science is done? Doesn’t he realize that one of the fundamental characteristics of science, one of its definitions, is that it’s always revisable? Offering a tentative ‘law’ implies the opposite of ‘an uncertainty about science itself’; it conveys the kind of confidence one can have in a form of inquiry that is on principle committed to changing its laws when new evidence turns up.

Amusingly enough, I linked to another article in which Colin Blakemore said exactly that only yesterday.

Science must be given back to ordinary people and the key to that is education. I say that with some trepidation, given the political incorrectness of the phrase ‘public understanding of science’ and the new mantra of dialogue and debate. It doesn’t really matter whether people know that the Earth goes round the Sun. But it does matter if they don’t know what a control experiment is, if they think that science produces absolute certainties, if they see differences of opinion among scientists as an indication that the scientific process is flawed, or if they feel robbed of the right to make ethical judgments.

It does matter if people think that science produces absolute certainties. Apparently the place to start is with journalists.



‘Aims To’

Jan 14th, 2004 2:40 am

Here it is again – that endlessly repeated untrue statement about the utility of religion.

People like Dawkins, and the Creationists for that matter, make a mistake about the purposes of science and religion. Science tries to tell us about the physical world and how it works. Religion aims at giving a meaning to the world and to our place in it. Science asks immediate questions. Religion asks ultimate questions. There is no conflict here, except when people mistakenly think that questions from one domain demand answers from the other. Science and religion, evolution and Christianity, need not conflict, but only if each knows its place in human affairs — and stays within these boundaries.

Dawkins does not make a mistake. It’s simply not true to say or imply that religion makes no attempt to tell us about the physical world and how it works. Granted, its statements are much more easily arrived at than those of science, because miracles and the supernatural are no obstacle, whereas scientific inquiry tends to avoid that kind of thing. But just because religion talks about a deity that is everywhere and omnipotent and omniscient and benevolent (large problems right there, as everyone knows) – just because it talks about an entity that there’s no evidence for, in other words, an entity that’s easy to imagine but hard to find, doesn’t mean it’s not making any claims about the physical world.

But even worse is the next bit. Religion ‘aims’ at giving a meaning – well what good is that?! We all ‘aim at’ lots of things; so what? That’s evasive language, that’s what that is. The point is, religion claims to do more than just ‘aim at’ giving a meaning, it claims to succeed, and that’s a much more ambitious claim. And then a little farther on – religion asks ultimate questions. Sigh. So what? I can do that too, so can you, so can anyone. Just because we can ask a question doesn’t mean there’s an answer to it! In fact it doesn’t mean it’s not a damn silly question. As a matter of fact that’s another thing Richard Dawkins talked about on the Start the Week I mentioned below.

And then the nonsense about each knowing its place. Well religion doesn’t know its place, so what’s the point of saying that! Religion does try to tell us how the world is, and it’s really not honest to pretend it doesn’t. There’s so much weaseling about religion around these days. Pretending it’s really exactly like poetry or music, it’s really just a feeling about the world, it’s really just hope or aspiration or wonder. If it were, I wouldn’t have a word to say against it, but it’s not, so I do.

It’s odd that this guff came from Michael Ruse. He’s not silly, at least I don’t think so, at least I read a good essay by him once. Perhaps he’s gone squishy since then.



Good Conversation

Jan 13th, 2004 11:28 pm

Start the Week is always good (well just about always), but I particularly liked last week’s, which I listened to a day or two ago. Richard Dawkins was on, explaining that (contrary to popular opinion) he’s an anti-Darwinian on moral matters. He thinks we should do our best to be different from what our genes would have us be; that, being the only species that’s capable of deciding to over-ride our genetic predispositions, we should damn well do it. Then there was Tim Hitchcock, saying some fascinating things about a change in sexual practices that happened late in the 17th century and caused a sharp rise in population. Dawkins pointed out that what Hitchcock was describing was in fact a classic example of humans acting in a way their genes would not ‘want’ them to – avoiding penetrative sex in favor of other kinds, thus lowering the birth rate. That’s one of the great things about Start the Week: the way things connect up that don’t seem to.

And then there was a fascinating bit where Dawkins asked Anthony Giddens a detailed question about chaos. He wasn’t sure he understood it properly, and he was unabashed about asking questions about it on national radio. Some people would be too vain to do that, I think. I once read something by Dawkins – I think in Unweaving the Rainbow – about a very famous scientist giving a guest lecture when Dawkins was a student. Someone in the audience pointed out that Famous Scientist was wrong about something – and FS, far from getting huffy, thanked the pointer-out enthusiastically, and (I think – if I remember correctly) said that’s the great thing about science. Everyone applauded like mad. I really love that story. (I may have told it before, but if I have it was months ago, so just pretend you don’t remember.)



A Secular Candidate? What an Idea!

Jan 13th, 2004 2:08 am

This is a heartening statement. It’s good to see something, finally, to counter the bilge about presidential candidates and religion one sees in a lot of the press.

In Campaign 2004, secularism has become a dirty word. Democrats, particularly Howard Dean, are being warned that they do not have a chance of winning the presidential election unless they adopt a posture of religious “me-tooism” in an effort to convince voters that their politics are grounded in values just as sacred as those proclaimed by President Bush.

Aren’t they though. And there aren’t nearly enough people saying what childish nonsense that is. Maybe they’re all too busy explaining why they call themselves ‘brights’ – no, I won’t believe that.

At any rate, this op-ed says something I’ve been muttering for years. Years.

Americans tend to minimize not only the secular convictions of the founders, but also the secularist contribution to later social reform movements. One of the most common misconceptions is that organized religion deserves nearly all of the credit for 19th-century abolitionism and the 20th-century civil rights movement…Abolitionists like William Lloyd Garrison, editor of The Liberator, and the Quaker Lucretia Mott, also a women’s rights crusader, denounced the many mainstream Northern religious leaders who, in the 1830’s and 40’s, refused to condemn slavery. In return, Garrison and Mott were castigated as infidels and sometimes as atheists — a common tactic used by those who do not recognize any form of faith but their own. Garrison, strongly influenced by his freethinking predecessor Thomas Paine, observed that one need only be a decent human being — not a believer in the Bible or any creed — to discern the evil of slavery.

It’s not even only Americans. I heard Ann Widdecombe, the Tory MP, say the same thing on the BBC once – that religion is a good thing because it inspired the abolitionists. Well it also shored up their opponents, so that argument is at best a wash. And as Jacoby indicates, there were far more pious opponents of abolitionism than there were pious advocates of it.

Not a scintilla of bravery is required for a candidate, whether Democratic or Republican, to take refuge in religion. But it would take genuine courage to stand up and tell voters that elected officials cannot and should not depend on divine instructions to reconcile the competing interests and passions of human beings… Today, many voters, of many religious beliefs, might well be receptive to a candidate who forthrightly declares that his vision of social justice will be determined by the “plain, physical facts of the case” on humanity’s green and fragile earth. But that would take an inspirational leader who glories in the nation’s secular heritage and is not afraid to say so.

And of course with all the candidates uniting to nag each other to declare for religion, and columns like this one all too rare – we’d better not hold our breaths while waiting for that inspirational leader.



Confirmation Bias

Jan 11th, 2004 9:17 pm

The waiting socialists have a bit more on the hijab issue and our disagreement on same. (That link goes to the right post; Marcus at Harry’s Place pointed out that the waiters in fact do have Permalinks; I just overlooked them.) One comment caused me to ponder a bit.

We won’t go over the same ground again here, as we’ve responded in the comments section attached to her post, and she’s responded to us. Guess what? She hasn’t changed her mind, and neither have we changed ours. What that might say about blogging in general we’ll leave to people better able and more willing to generalise about blogging than we are.

What caused the pondering is the ‘Guess what?’ That seems to imply that non-changing of minds is not surprising, hence that we generally don’t change our minds in the course of these discussions – but I’m not sure that’s true. It seems to me I do sometimes change my mind when I see new evidence or arguments (new to me, I mean). But I don’t change my mind every single time – I don’t develop a new set of ideas with every post I read. If I did, B&W would be a pretty chaotic thing to read, wouldn’t it!

I do go into the discussion with some fairly firm presuppositions – that is to say, with plenty of opportunities for confirmation bias. I probably pay more attention to the articles that fit my presuppositions. I have frames through which I understand things, just as we all do. So I thought I would mention some of them, by way of clarification and full disclosure (or rather, partial full disclosure). I see the hijab as a badge of inferiority, as men controlling women, as misogynist and oppressive. I am aware that there are other ways to see it, but I’m not as sharply aware of that as I am of the first view. Then, I also see the hijab as having a lot of baggage – baggage that it wouldn’t have had twenty-five years ago. Wearing it now, after the Taliban, after what’s been going on in Iran, seems to me a different thing from wearing it before that. And not only wearing it, but being around people who are wearing it. It seems to me it can be seen as a poke in the eye to secularists, feminists, women who do not want all of that, who want to escape it, in a way that it wouldn’t have to such an extent before 1979. It’s a statement, a political statement, and in my view it’s a very reactionary, even brutal one. That means I’m less sympathetic to ideas about tradition, identity and so on – that’s my bias. And then, a third frame, I’m intensely hostile to religion (partly because of the history of the past twenty-five years), so I tend to favour efforts to keep it out of the public or secular realm. I’m not very good at seeing religion as a refuge from an alien culture, as the heart of a heartless world.

But another point is a bit different. I’m not convinced that I ought to do any mind-changing here, because I haven’t actually been arguing flatly that the ban on the hijab would be an unequivocally good thing and that’s all there is to it. I’ve been arguing against the view that it would be an unequivocally bad thing and that’s all there is to it. The people I’ve been disagreeing with are the ones who deny that there is any rational or non-racist reason at all to favor a ban. But if that position were accurate, there would be no such group as ‘Ni Putes ni Soumises.’ But the group exists. That is to say, there are French women of Muslim background who do support the ban. It seems to me opponents ignore them and their reasons. Surely arguing that there are people on the other side is not something I ought to change my mind about. I’m not so much arguing for the ban as I am arguing for taking all factors into account.



The Financial Pages

Jan 11th, 2004 7:08 pm

Following on from the last N&C on the way the Bush administration listens to developers rather than to environmental scientists in its own agencies – there is a post on corruption, and the history of attempts to limit the effects of money on political culture at Cliopatria. It is highly frustrating to see the open, unembarrassed acceptance of the role of money in politics in the US, and to see how little that changes, what a non-issue it is, how easily it keeps going, how cheerily everyone accepts it. Bribery and corruption are usually considered bad things, but the fact that huge corporations give enormous wads of cash to US political campaigns and parties is, for some reason, just taken as normal. The wads of cash are called ‘donations’ instead of ‘bribes’ and that makes all the difference. But they’re not donations, they’re quid pro quos, and everyone knows it. Yet no one cares. It’s very odd, and it’s maddening.

There is a very good article by Jonathan Chait in the New Republic from last November that does something to explain the lack of outrage. The article is about bad press coverage in general, rather than about corruption, but the last section deals with both – and needless to say, they are closely entangled: the corruption survives and thrives on massive public ignorance and indifference. It seems reasonable to think that if more people were more aware of the matter, there would be a lot more pressure to do something about it. To, in fact, stop it.

Republicans now expect lobbyists to support them all the time, even on issues of ancillary concern. In return, Republicans will take unpopular positions on issues like the environment and health care that benefit those same lobbyists. Yet this enormous shift, which impacts much of the domestic agenda, has not been woven into the narrative of political journalism. That omission, too, stems from the strange conventions of Washington reporting. It’s not that journalists fail to report on business influence; it’s just that such reportage tends to get segregated. One place it lands is the lobbying beat…It’s not that the press is shilling for Wall Street fat cats. It’s that money in politics is its own, distinct beat with its own, dedicated reporter (or set of reporters).

One is tempted to mutter about want of nails and wars being lost. Such a trivial reason, for such an important matter. The way jurisdictions are carved up among reporters helps to account for why political corruption is not a front page issue.

Another enclave of superb, but underexposed, coverage about the relationship between lobbyists and policy is the financial press. The Wall Street Journal, for example, has covered the nexus between K Street and the GOP particularly well. And shortly after the 2002 elections, the Post business section ran a terrific piece observing that “it’s payback time for the distributors and other business groups whose pent-up demands for policy changes, large and small, will soon burst into public.” The reason financial reporters can be so blunt, and therefore accurate, is that they could not do their job–conveying information about which businesses are succeeding in winning legislation that will impact their bottom line–if they didn’t convey the unvarnished truth. Political reporters play by a different set of rules. If a story like that ran on page one, it would have to be filtered through the lens of “evenhandedness”–“Democrats charge that Republicans are carrying water for their donors; Republicans disagree”–even if one side were demonstrably wrong. That’s why the practice of unbiased reporting, as journalists understand it, can actually impede the truth.

Isn’t that interesting. Just read the financial pages, and all will become clear. As a matter of fact, my brother told me that about the New York Times’ financial pages many years ago. Now, if only someone could persuade the political reporters to quote their colleagues on the financial beat…maybe word would begin to get out.



Wetlands Pollute! Rivers Need Barges!

Jan 11th, 2004 1:05 am

There is a very interesting article about the Bush administration’s interference with science in the Christian Science Monitor. I was a little distracted while reading it, because I kept thinking I had posted an article on the same subject fairly recently, but not so recently that I could remember when, or what it was called, or where it was from. But luck was with me (or perhaps it was my guardian angel, or baby Jesus, or both, one on each shoulder), and I found it anyway. It’s here. It’s well worth reading both: they are related but quite different. The Monitor article treats science in general; the Grist one discusses cases where the Bush administration forced federal agencies to adopt policies developers and other industries wanted in place of scientifically-based findings, with nasty results for the Missouri river and Florida’s wetlands.

From the Monitor article:

Nevertheless, several science-policy experts argue that no presidency has been more calculating and ideological than the Bush administration in setting political parameters for science. President Bush’s blunt rejection of the Kyoto Protocol on global warming, and his decision restricting stem-cell research are only the most obvious and widely publicized examples of what has become a broader pattern across the administration.

From the Grist article:

As we’ve seen before, this administration’s M.O. is simple: If you don’t like the science, change the scientist. That same motto could have been scrawled atop a resignation notice submitted in late October by Bruce Boler, a former U.S. EPA scientist in Florida who quit in protest when the agency accepted a study concluding that wetlands can produce more pollution than they filter. “It’s a blatant reversal of traditional scientific findings that wetlands naturally purify water,” Boler told Muckraker. “Wetlands are often referred to as nature’s kidneys. Most self-respecting scientists will tell you that, and yet [private] developers and officials [at the Corps] wanted me to support their position that wetlands are, literally, a pollution source.”

Scientists who don’t obey are fired and replaced with more biddable ones, and the EPA muzzles its own employees. Pretty story.



An Argument With Too Much Left Out

Jan 9th, 2004 7:43 pm

It’s odd to discover that sometimes readers know more about what I’m doing than I do. I’d actually forgotten that I’d commented on the hijab-headscarf-veil issue all the way back in October, but Socialism in an Age of Waiting reminded me.

The issue of Muslim girls wearing, or not wearing, hijab in state schools in France has given rise to extensive comment and debate all over the blogosphere. We’d cite as the most interesting discussions so far the posts, and the comments, at Butterflies and Wheels, where Ophelia Benson has been blogging about it, on and off, since October and at Harry’s Place, where the debate was taken up in December partly in response to the news that “a government-appointed commission on secularism [had] recommended drafting a new law banning all conspicuous religious symbols from French state schools”.

Why, so I have. What a terrible memory I have to be sure. I wonder what else I’ve been blogging about that I’ve forgotten. Monetary policy? Weaving? The Crimean War?

Then SiaW link to another discussion of the hijab issue, saying it cuts through the knot – which I find odd, since the post in question leaves so much out. There is this question, for example:

There are fashions that annoy the hell out of me, but by what possible logic are headscarves more offensive than, say, big hair? Is there any way in which headscarves are more oppressive to women than mini-skirts?

Yes of course there is. What an absurd question. There is no equivalent of the Taliban or the religious police of Iran forcing women to wear mini-skirts by beating the shit out of them if they don’t. There is no real, literal, physical, violent, bone-breaking coercion of women to wear short skirts. There is that kind of coercion of women to wear the hijab or the chador or the burqa. The problem with the hijab is not that it’s ‘offensive.’ (That’s a sub-topic I want to go into some day – another branch of the translation problem – the way people hear ‘offensive’ when offense is not the issue at all and no one said it was. Odd, that.) Or that it’s ‘annoying.’ Read or talk to some women who have lived through a transition from not having to wear the nasty things to being forced to by violent packs of men. Talking about annoyance and offense just trivializes the issue, but it’s not damn well trivial.

And the rest of the post is along the same lines. It ignores far too much to be useful, it seems to me. I agree that there are problems with the ban; that it may be counter-productive, that it violates the freedom of some people, that in a sense it discriminates against Muslims. But there are also problems with the absence of the ban, as I said last month. A discussion that just blithely ignores those is a bit beside the point, I think.



Academostars Light up the Sky

Jan 9th, 2004 1:15 am

Well my questions have been answered – the ones I asked a couple of days ago, about Why is Judith Butler a superstar and who the hell thinks comp lit teachers are superstars anyway and why don’t they embarrass themselves talking that way? Well no, I didn’t ask that last question, but it’s what I was thinking.

I should have realized. Silly me. The subject is a whole field, a discipline, it has an anthology and everything. The excellent Scott McLemee, of the Chronicle of Higher Education as well as other publications, dropped a word in my ear to the effect that he wrote a few words on this subject a couple of years ago. And sure enough, he did, and very good words too. The whole thing is pretty hilarious, frankly.

“I want to debunk the usual idea that this is some kind of illicit importation [into university life] from Hollywood.” The phenomenon owes less to popular culture, he argues, than to processes taking shape within academic culture. In particular, it is a side effect of the dominance of theory within literary studies. The steady growth of literature programs stimulated what Mr. Williams terms “the theory market.” By the 1980s, thinkers who offered powerful, capacious, and stimulating models of critical analysis were becoming household names.

Household names?? Household names?!? In what households, sport? Do you get out much? I don’t get out much myself, but I get out enough to know that Stanley Fish and Gayatri Spivak are not instantly recognizable in your average American household. No, not even good old Eve or Cornel or Skip is that famous, whatever their colleagues may tell them.

But even better than that household name thing is that ‘powerful, capacious, and stimulating models of critical analysis’ bit. Oh, please. More ‘powerful, capacious, and stimulating’ than anything you will find in physics or history or sociology or philosophy or economics or psychology or cognitive science departments, for example? You know – I really, really, really don’t think so.

As Mr. Williams notes in an interview, the discussion of academostardom emerged in earnest during the 1990s — a time of transition for the humanities, during which the academic profession underwent painful restructuring, despite the overall economic boom. In “Name Recognition,” his essay for the journal’s special issue, the editor underscores how scholarly celebrity met a basic psychological need during this wrenching period. “Against the common academic anxiety of ineffectuality, especially in the humanities,” he writes, “the star system heightens the sense of the academic realm as one of influence, acclaim, and relevance.”

Ah – now I understand. It’s a kind of comfort food. Or magical thinking. ‘I am, or will be someday, or could possibly become someday maybe if I’m very lucky and very hip, influential and acclaimed and, by golly, relevant, because of my powerful, capacious, and stimulating models of critical analysis which are more powerful, capacious, and stimulating than almost anyone else’s. I can push down trees with them, I can store all of Manhattan in them, I can bring whole conferences to a frenzy with them. I am – Megacademostar!!’