Investment Tips

Jan 22nd, 2004 1:43 am | By

I’ve been reading a volume of the letters of Dwight Macdonald lately. One bit I read this morning seemed particularly appropriate for B&W. It’s from a letter in December 1937, to Freda Kirchwey, the editor of the Nation, taking that magazine to task for a number of blind spots, such as being too ‘timid and stuffy-genteel’ in its editorial attitude to the New Deal’s recent swing to the right, and particularly for being hostile to the Commission that was investigating the Moscow Trials (he doesn’t mention John Dewey but I assume that’s the Commission usually known as the Dewey Commission).

While I was at Fortune, the Nation was always to me the great symbol of honest, truthful, intelligent journalism – everything that I missed at Fortune. But it now appears that the Nation, too, has its commitments, its investments, so to speak, just like Fortune. A journalistic investment in Soviet Russia seems to me no more admirable than a like investment in the Greatness and Goodness of Big Business.

Commitments and investments. There it is. Those are great things, necessary things, but they can rip such a hole in our ability to think straight and to see what’s in front of us – and above all to tell the truth about it. The late ’30s were notoriously a rough period for that kind of thing. What with the Moscow Trials, and Trotsky, and the POUM and the anarchists in Barcelona, and what had happened during the forced collectivization – there was a staggering amount of systematic lying, denial, looking away, self-deception in those years. It poisoned the air on the left for years, decades – arguably the left still hasn’t recovered. Watch those investments.



Before After Theory

Jan 20th, 2004 9:19 pm | By

This is an interesting review by Elaine Showalter of Terry Eagleton’s new book After Theory.

In the ’80s, theory ruled, and the subject formerly known as literature was banished or demoted in the interests of philosophy and aesthetic abstraction.

Hmm. But was it really philosophy? Or was it just little bits of philosophy here and there? That’s fine, it’s no crime to know only a little about something, that’s certainly my situation about almost everything – but one has to be clear about it. One has to be careful, it seems to me, not to confuse sampling philosophy with really studying it, and one has to be equally careful not to confuse Literary Theory with philosophy, because (this is a subtle point, now, but bear with me) they are not the same thing. One does get the impression at times that Literary Theorists think they are doing something more like philosophy than they in fact are. One also often wonders, if they want to do philosophy, why they aren’t in the philosophy department. It is there, after all, all set up for the purpose; why not take advantage of the fact? It seems so perverse to amble off to a different department and then set up to do the subject of another one. People don’t enroll in French departments in order to do engineering, do they? Or engineering departments in order to do French. So why the English or Comparative Literature Department in order to do philosophy? One of life’s little mysteries.

He admits that cultural theory “has been shamefaced about morality and metaphysics, embarrassed about love, biology, religion, and revolution, largely silent about evil, reticent about death and suffering, dogmatic about essences, universals, and foundations and superficial about truth, objectivity, and disinterestedness.” As he says in a characteristically ironic understatement, those constitute a “rather large slice of human existence” to ignore. So he sets out to fill the gaps and propose fresh theoretical approaches to the Big Questions.

Well yes, it is a rather large slice, and that’s part of the problem, surely. Why should cultural (or literary, or critical) theory be expected or able to contribute to all those subjects? Isn’t that expecting an awful lot? Isn’t that expecting it to be an omniscient and universally applicable discipline? What would qualify it to take all that on? That’s another one of those little mysteries.

Eagleton wants to free cultural theory from crippling orthodoxy by challenging the relativism of postmodernist thought and arguing on behalf of absolute truth, human essences, and virtue (which includes acting politically). He engages with definitions of morality.

And thus we see the drawback of pretensions to universal reach and of confusing literary theory with philosophy – one ends up re-inventing the wheel and belaboring the obvious. If fans hadn’t taken ‘postmodernist thought’ too seriously to begin with, Eagleton wouldn’t now have to waste energy on challenging it, but they did, so he does. That’s where fashion gets you.

First, why isn’t literature, rather than theory, the best place to go for help about morality, love, evil, death, suffering, and truth, among other things? Having written a big book about tragedy, Eagleton obviously knows that on some topics, Shakespeare is a lot more relevant than Saussure. Eagleton himself is able to command an encyclopedic range of literary reference, but he takes the side of theory rather than literature, or even a position between the two. “Critics of theory sometimes complain,” he notes, “that its devotees seem to find theory more exciting than the works of art it is meant to illuminate. But sometimes it is. Freud is a lot more fascinating than Cecil Day-Lewis. Foucault’s The Order of Things is a good deal more arresting and original than the novels of Charles Kingsley.”

And then again why are literature and theory the only two choices? And what does ‘help’ mean? What does ‘relevant’ mean? Why Saussure? Why Freud, why Foucault? It’s all a muddle, frankly. One undefined term or unexamined assumption after another. Time for After After Theory.



Eat Your Sugar

Jan 18th, 2004 7:24 pm | By

This sounds familiar, doesn’t it? We’ve read articles like this before? Only a few days ago in fact? The subject does seem to keep coming up. The Bush administration and profit-making entities on the one hand, and scientific advice and knowledge on the other. Bulldozers make better habitats than rivers do; wetlands pollute; academic scientists who receive grants should be kept off federal peer review panels while scientists with ties to profit-making entities should not. Day is night, up is down, black is red. Do we begin to detect a pattern here?

The President insists fighting fat is a matter for the individual, not the state. But today The Observer reveals how he and fellow senators have received hundreds of thousands of dollars in funding from ‘Big Sugar’. One of his main fundraisers is sugar baron Jose ‘Pepe’ Fanjul, head of Florida Crystals, who has raised at least $100,000 for November’s presidential re-election campaign.

The individual, not the state. Right. And that means that the state must not interfere, even by so much as issuing reasonable dietary guidelines, with people who make money by selling sugar-added foods. It’s up to the individual to ignore all that advertising, have a little backbone, and just stay thin. Simple.

The Bush administration, which receives millions in funding from the sugar industry, argues there is little robust evidence to show that drinking sugary drinks or eating too much sugar is a direct cause of obesity. It particularly opposes a recommendation that just 10 per cent of people’s energy intake should come from added sugar. The US has a 25 per cent guideline.

25% added sugar – sure, that sounds about right. Could be worse. Could be 75% after all.

Too bad the spinach lobby isn’t as powerful as the sugar one. But I guess there just aren’t the bucks in spinach that there are in the sweet stuff.



Hands Off Lacan!

Jan 17th, 2004 8:46 pm | By

This is quite an amusing piece. Albeit irritating. So much rhetoric, so much slippery use of emotowords, so much vagueness where precision is needed – all to protect the heritage of Freud and Lacan. Why, one has to wonder. What is it about Freud that makes people who, one would think, ought to know better cling so fiercely? I suppose I could postulate some sort of psychoanalytic answer, but would that tell us anything?

“When they speak of ‘professionalising’ people whose business is human misery; when they speak of ‘evaluating’ needs and results; when they try to appoint ‘super-prefects’ of the soul, grand inquisitors of human sadness – it is too hard not to agree that psychoanalysis is in the firing line,” Levy said.

That’s a translation, I assume, so perhaps it’s unfair to look too closely at the words – but I’m going to anyway. ‘People whose business is human misery.’ What does he mean ‘business’? People who make money off human misery? Why should they be protected? Or does he mean something along the lines of experts in human misery, people who know a lot about human misery. But one can know a lot about human misery, in some sense – arguably all humans know that – without having the faintest clue how to ‘fix’ it or cure it or do anything about it at all other than hand-wring or watch or write poetry. Even quacks and charlatans can know a lot about human misery.

Critics say the absence of regulation and a growing demand for therapy of all kinds has led to a proliferation of astrologers, mystics and con-artists – and they are demanding that the public be protected by a system of recognised qualifications. But Lacan, who died in 1981, said that the “analyst’s only authority is his own,” and his followers believe the state has no business interfering in the mysteries of the id and the unconscious. In a faction-ridden climate, many psychoanalysts also see the government’s initiative as an attempt by their arch-enemies the psychiatrists – hospital-based doctors who prescribe drugs for treating mental illness – to marginalise their work.

The analyst’s only authority is his own – well that’s blunt, at least. That’s a good concise summing-up of what’s wrong with psychoanalysis. It presents no evidence, it is not peer-reviewed, it rules out falsification. It’s a form of hermeneutic, we’re often told, which is all well and good but it also claims to be therapeutic. It wants to have it both ways, in short: to charge a lot of money for its ministrations on the understanding that they are in some way helpful for human misery, but to escape oversight and regulation on the understanding that psychoanalysis is some kind of sacred mystery. A ‘marginalisation’ of their work would be a fine thing, if you ask me.



The Poetics of History 2

Jan 17th, 2004 5:48 pm | By

My first comment on this subject has prompted some comments that suggest a lot of further comments (I’m in a permanent state of Infinite Regress here: everything I write seems to suggest several hundred more things I could write) and subjects to look into further. Empathy; the relationship of research to teaching; other minds and solipsism; the tendency to value emotional stances like empathy over ‘cooler’ more cognitive commitments to justice or equality; and so on.

And there is also this article in the New Yorker about a book of history and a play, Thucydides’ History and Euripides’ Medea.

To describe this war in all its complexity, Thucydides had to invent a new way of writing history. In his introduction, he says he will eschew “literary charm”—mythodes, a word related to “myth”—in favor of a carefully sifted accuracy…But this desire for what we would call balanced and accurate reporting led, paradoxically, to a most distinctive literary device: the use of invented speeches and dialogues to illustrate the progress of the war and the combatants’ thinking about it. Every key moment in the war…is cast as a dialogue that, as Thucydides admits, may not be a faithful reproduction of exactly what was said in this or that legislative session or diplomatic parley but does elucidate the ideologies at play. “My method has been,” he writes, “to make the speakers say what, in my opinion, was called for by each situation.” This, more than anything, is what gives the History its unique texture: the vivid sense of an immensely complex conflict reflected, agonizingly, in hundreds of smaller conflicts, each one presenting painful choices, all leading to the great and terrible resolution.

Which is very like the way Euripides wrote his plays, to the disapproval of Nietzsche, who claimed that Socrates and Euripides rationalized and so ruined tragedy. But others like the interplay of argument, the effort to think things through, the questioning of each other’s assumptions, that many of Euripides’ plays show us. There is room for empathy – with Medea, Andromache, Iphigenia – and also for enhanced understanding of the issues, at least as one Athenian playwright saw them. And the great skeptical historian was doing much the same thing. Thucydides was a little bit of a dramatist and Euripides was a little bit of a historian.



Graduate School and its Discontents

Jan 16th, 2004 9:04 pm | By

Invisible Adjunct has another good comment thread going. Remember that interesting (and often symptomatic) thread about the MLA a few weeks ago? There have been interesting ones since, and now there’s an especially interesting one. Well I say that because of the last two posts (last at the moment, last when I saw the thread), 10 and 11. Number 10:

In the first year of graduate school in archaeology we spent so much time learning about post-modernist theory and how archaeology could not really tell you about the past (it could only reveal your current political views on power relationships) that by the end of the year my professors convinced me that there was no reason to continue my studies in that field. I dropped out and went to law school.

Number 11:

I also remember a huge emphasis on postmodernism when I was a doctoral student in the college of education. Yes, I enjoyed postmodern theory, but there were never any other perspectives; I had to find those on my own. For example, we never studied education from a Marxist perspective; after all, Marxism had been determined to be too “modernistic.” I guess my big gripe with postmodern theory is that it tends to lead to nihilism and a total lack of social solidarity and responsibility. It really reached the pinnacle of craziness when issues like classroom management were turned into postmodern “points of view.” For example, I remember several of us classroom teachers posing serious questions about what happened in our classrooms. We weren’t looking for “how-to” answers, but something better than the “what is disobedience, anyway?” I’m sorry, but if you were to spend time in an 8th grade classroom, I don’t think you’d have any problem with the concrete reality of negative behavior. What was super-ironic is that whenever we would be looking at politics or power relations and anyone would give down-to-earth examples of how power REALLY operated (i.e. through control of workers, surveillance, etc.) then those became modernist concerns and were open to interpretation, not social action.

Yes, from everything I hear, people in real-life 8th grade classrooms have no trouble saying what disobedience is, and why you need some of the other thing if you’re going to teach 30 or 35 children. And there’s something really enormously…ironic? Or is that too modernist. Perhaps I mean playful? Yes, no doubt that’s it. There’s something enormously ‘playful’ in the fact that postmodernist theory causes people to quit archaeology and go to law school instead. Actually what should be happening is that everyone everywhere should be dropping out of all academic programs – because those are all modernist projects too after all – and going into advertising. What could be more postmodernist than advertising? Especially now, now that everybody knows that everybody knows that everybody ‘sees through’ advertising, and ‘transforms’ it into a ‘site of resistance,’ so that advertising gets weirder and weirder, or more and more postmodern, in order to out-resist and out-transform and out-postmodernize all those people in the postmodern audience. Surely it’s the duty of all good postmodernists to provide more sites of resistance for everyone. And of course the pay is better, and you don’t risk ending up in places like Ithaca or Lubbock, and you don’t have to do all that reading.



The Poetics of History

Jan 15th, 2004 9:14 pm | By

There was an interesting subject under discussion at Cliopatria yesterday and this morning – history as defamiliarization, poetics and history, the difference between history and fiction. The whole subject touches on a lot of difficult, knotty questions – other minds; the reliability or otherwise of testimony, autobiography, narrative – of what people recount about their own experiences; empathy; imagination; the general and the particular, the abstract and the concrete – and so on. Meta-questions.

I wondered about the much-discussed idea that fiction can teach empathy in a way that more earth-bound, or factual, or evidence-tethered fields cannot. That novelists have a special imaginative faculty which enables them to show what it’s like to be Someone Else so compellingly that we learn to be tolerant, sympathetic, forgiving, understanding etc. in the act of reading. Cf. Martha Nussbaum in Poetic Justice for example. It seems plausible, up to a point, but…only up to a point. For one thing there are so many novels that are full of empathy for one character but none for all the others; and there are so many that have empathy for the wrong people and none for their victims (cf. Gone With the Wind); and there are so many mediocre and bad novels, and the aesthetic quality of a novel has little or nothing to do with its level of empathy-inducement.

I think there are a couple of background ideas at work here, that could do with being dragged into the light. One is that all novelists, all fiction-writers have this ability to teach empathy – that there is something about the very act of telling a story that produces character-sympathy, and that character-sympathy translates into sympathy for people in general as opposed to sympathy for one particular character. But anybody can set up as a novelist, including selfish, unreflective, egotistical people. There is no guarantee that telling a story has anything to do with empathy. And then there is a second idea, that what novelists imagine about other minds is somehow reliable. But why should that be true? Especially why should it be true of all novelists? At least, why should it be any more true than it is of the rest of us? We can all imagine what’s going on in other minds – and we can all be entirely wrong. Or not. It may be that particularly brilliant novelists are better at imagining what’s going on in other minds – at guessing the truth – but particularly brilliant novelists are a rare breed, and in any case, nobody knows for sure whether they have it right or not. We think they do, it sounds right, but we don’t know. All it is, after all, is the imagining of one novelist. Lizzy Bennet and Isabel Archer and Julien Sorel may tell us what it’s like to be someone else – or they may not. We simply don’t know.



Brought to You By

Jan 15th, 2004 7:07 pm | By

This is a disgusting item in the Washington Post. It sounds good at first – but then it’s meant to. And at second it doesn’t sound good at all.

The administration proposal, which is open for comment from federal agencies through Friday and could take effect in the next few months, would block the adoption of new federal regulations unless the science being used to justify them passes muster with a centralized peer review process that would be overseen by the White House Office of Management and Budget.

It’s those last seven words that give the game away – along with the word ‘centralized’ perhaps. Peer review is one thing, ‘centralized’ peer review is another, and ‘centralized’ peer review overseen by the White House Office of Management and Budget is quite, quite another. Which peers would those be, exactly? Centralized by whom? And – ‘overseen’ in what sense, using what criteria? One can guess all too easily.

But a number of scientific organizations, citizen advocacy groups and even a cadre of former government regulators see a more sinister motivation: an effort to inject White House politics into the world of science and to use the uncertainty that inevitably surrounds science as an excuse to delay new rules that could cost regulated industries millions of dollars…Under the current system, individual agencies typically invite outside experts to review the accuracy of their science and the scientific information they offer…The proposed change would usurp much of that independence. It lays out specific rules regarding who can sit on peer review panels — rules that, to critics’ dismay, explicitly discourage the participation of academic experts who have received agency grants but offer no equivalent warnings against experts with connections to industry. And it grants the executive branch final say as to whether the peer review process was acceptable.

Perfect. Disinterested academics need not apply, but industry scientists are welcome. And the executive branch, with its dazzling track record of scrupulous impartiality in scientific matters, has the final say.



Certainty No

Jan 14th, 2004 8:10 pm | By

The New York Times has an article by Edward Rothstein on the annual Edge question, which John Brockman poses to a large number of writers, scientists and thinkers (many of them all three at once). This year the question is ‘What’s your law?’

There is some bit of wisdom, some rule of nature, some law-like pattern, either grand or small, that you’ve noticed in the universe that might as well be named after you. Gordon Moore has one; Johannes Kepler and Michael Faraday, too. So does Murphy. Since you are so bright, you probably have at least two you can articulate. Send me two laws based on your empirical work and observations you would not mind having tagged with your name. Stick to science and to those scientific areas where you have expertise. Avoid flippancy.

Very good. But Rothstein says an odd thing in his piece.

But curiously, an aura of modesty, tentativeness and skepticism hovers over the submissions — this from a group not renowned for self-abnegation. This may, perhaps, be an admission that fundamental insights are not now to be had. But it may also be an uncertainty about science itself.

But it’s not curious at all. Far from it. Is the reporter not aware that that’s how science is done? Doesn’t he realize that one of the fundamental characteristics of science, one of its definitions, is that it’s always revisable? Offering a tentative ‘law’ implies the opposite of ‘an uncertainty about science itself’; it conveys the kind of confidence one can have in a form of inquiry that is on principle committed to changing its laws when new evidence turns up.

Amusingly enough, I linked to another article in which Colin Blakemore said exactly that only yesterday.

Science must be given back to ordinary people and the key to that is education. I say that with some trepidation, given the political incorrectness of the phrase ‘public understanding of science’ and the new mantra of dialogue and debate. It doesn’t really matter whether people know that the Earth goes round the Sun. But it does matter if they don’t know what a control experiment is, if they think that science produces absolute certainties, if they see differences of opinion among scientists as an indication that the scientific process is flawed, or if they feel robbed of the right to make ethical judgments.

It does matter if people think that science produces absolute certainties. Apparently the place to start is with journalists.



‘Aims To’

Jan 14th, 2004 2:40 am | By

Here it is again – that endlessly repeated untrue statement about the utility of religion.

People like Dawkins, and the Creationists for that matter, make a mistake about the purposes of science and religion. Science tries to tell us about the physical world and how it works. Religion aims at giving a meaning to the world and to our place in it. Science asks immediate questions. Religion asks ultimate questions. There is no conflict here, except when people mistakenly think that questions from one domain demand answers from the other. Science and religion, evolution and Christianity, need not conflict, but only if each knows its place in human affairs — and stays within these boundaries.

Dawkins does not make a mistake. It’s simply not true to say or imply that religion makes no attempt to tell us about the physical world and how it works. Granted, its statements are much more easily arrived at than those of science, because miracles and the supernatural are no obstacle, whereas scientific inquiry tends to like to avoid that kind of thing. But just because religion talks about a deity that is everywhere and omnipotent and omniscient and benevolent (large problems right there, as everyone knows) – just because it talks about an entity that there’s no evidence for, in other words, an entity that’s easy to imagine but hard to find, doesn’t mean it’s not making any claims about the physical world.

But even worse is the next bit. Religion ‘aims’ at giving a meaning – well what good is that?! We all ‘aim at’ lots of things; so what? That’s evasive language, that’s what that is. The point is, religion claims to do more than just ‘aim at’ giving a meaning, it claims to succeed, and that’s a much more ambitious claim. And then a little farther on – religion asks ultimate questions. Sigh. So what? I can do that too, so can you, so can anyone. Just because we can ask a question doesn’t mean there’s an answer to it! In fact it doesn’t mean it’s not a damn silly question. As a matter of fact that’s another thing Richard Dawkins talked about on the Start the Week I mentioned below.

And then the nonsense about each knowing its place. Well religion doesn’t know its place, so what’s the point of saying that! Religion does try to tell us how the world is, and it’s really not honest to pretend it doesn’t. There’s so much weaseling about religion around these days. Pretending it’s really exactly like poetry or music, it’s really just a feeling about the world, it’s really just hope or aspiration or wonder. If it were, I wouldn’t have a word to say against it, but it’s not, so I do.

It’s odd, this guff came from Michael Ruse. He’s not silly, at least I don’t think so, at least I read a good essay by him once. Perhaps he’s gone squishy since then.



Good Conversation

Jan 13th, 2004 11:28 pm | By

Start the Week is always good (well just about always), but I particularly liked last week’s, which I listened to a day or two ago. Richard Dawkins was on, explaining that (contrary to popular opinion) he’s an anti-Darwinian on moral matters. He thinks we should do our best to be different from what our genes would have us be; that, being the only species that’s capable of deciding to over-ride our genetic predispositions, we should damn well do it. Then there was Tim Hitchcock, saying some fascinating things about a change in sexual practices that happened late in the 17th century and caused a sharp rise in population. Dawkins pointed out that what Hitchcock was describing was in fact a classic example of humans acting in a way their genes would not ‘want’ them to – avoiding penetrative sex in favor of other kinds, thus lowering the birth rate. That’s one of the great things about Start the Week: the way things connect up that don’t seem to.

And then there was a fascinating bit where Dawkins asked Anthony Giddens a detailed question about chaos. He wasn’t sure he understood it properly, and he was unabashed about asking questions about it on national radio. Some people would be too vain to do that, I think. I once read something by Dawkins – I think in Unweaving the Rainbow – about a very famous scientist giving a guest lecture when Dawkins was a student. Someone in the audience pointed out that Famous Scientist was wrong about something – and FS, far from getting huffy, thanked the pointer-out enthusiastically, and (I think – if I remember correctly) said that’s the great thing about science. Everyone applauded like mad. I really love that story. (I may have told it before, but if I have it was months ago, so just pretend you don’t remember.)



A Secular Candidate? What an Idea!

Jan 13th, 2004 2:08 am | By

This is a heartening statement. It’s good to see something, finally, to counter the bilge about presidential candidates and religion one sees in a lot of the press.

In Campaign 2004, secularism has become a dirty word. Democrats, particularly Howard Dean, are being warned that they do not have a chance of winning the presidential election unless they adopt a posture of religious “me-tooism” in an effort to convince voters that their politics are grounded in values just as sacred as those proclaimed by President Bush.

Aren’t they though. And there aren’t nearly enough people saying what childish nonsense that is. Maybe they’re all too busy explaining why they call themselves ‘brights’ – no, I won’t believe that.

At any rate, this op-ed says something I’ve been muttering for years. Years.

Americans tend to minimize not only the secular convictions of the founders, but also the secularist contribution to later social reform movements. One of the most common misconceptions is that organized religion deserves nearly all of the credit for 19th-century abolitionism and the 20th-century civil rights movement…Abolitionists like William Lloyd Garrison, editor of The Liberator, and the Quaker Lucretia Mott, also a women’s rights crusader, denounced the many mainstream Northern religious leaders who, in the 1830’s and 40’s, refused to condemn slavery. In return, Garrison and Mott were castigated as infidels and sometimes as atheists — a common tactic used by those who do not recognize any form of faith but their own. Garrison, strongly influenced by his freethinking predecessor Thomas Paine, observed that one need only be a decent human being — not a believer in the Bible or any creed — to discern the evil of slavery.

It’s not even only Americans. I heard Ann Widdecombe, the Tory MP, say the same thing on the BBC once – that religion is a good thing because it inspired the abolitionists. Well it also shored up their opponents, so that argument is at best a wash. And as Jacoby indicates, there were far more pious opponents of abolitionism than there were pious advocates of it.

Not a scintilla of bravery is required for a candidate, whether Democratic or Republican, to take refuge in religion. But it would take genuine courage to stand up and tell voters that elected officials cannot and should not depend on divine instructions to reconcile the competing interests and passions of human beings… Today, many voters, of many religious beliefs, might well be receptive to a candidate who forthrightly declares that his vision of social justice will be determined by the “plain, physical facts of the case” on humanity’s green and fragile earth. But that would take an inspirational leader who glories in the nation’s secular heritage and is not afraid to say so.

And of course with all the candidates uniting to nag each other to declare for religion, and columns like this one all too rare – we’d better not hold our breaths while waiting for that inspirational leader.



Confirmation Bias

Jan 11th, 2004 9:17 pm | By

The waiting socialists have a bit more on the hijab issue and our disagreement on same. (That link goes to the right post; Marcus at Harry’s Place pointed out that the waiters in fact do have Permalinks; I just overlooked them.) One comment caused me to ponder a bit.

We won’t go over the same ground again here, as we’ve responded in the comments section attached to her post, and she’s responded to us. Guess what? She hasn’t changed her mind, and neither have we changed ours. What that might say about blogging in general we’ll leave to people better able and more willing to generalise about blogging than we are.

What caused the pondering is the ‘Guess what?’ That seems to imply that non-changing of minds is not surprising, hence that we generally don’t change our minds in the course of these discussions – but I’m not sure that’s true. It seems to me I do sometimes change my mind when I see new evidence or arguments (new to me, I mean). But I don’t change my mind every single time – I don’t develop a new set of ideas with every post I read. If I did, B&W would be a pretty chaotic thing to read, wouldn’t it!

I do go into the discussion with some fairly firm presuppositions – that is to say, with plenty of opportunities for confirmation bias. I probably pay more attention to the articles that fit my presuppositions. I have frames through which I understand things, just as we all do. So I thought I would mention some of them, by way of clarification and full disclosure (or rather, partial full disclosure). I see the hijab as a badge of inferiority, as men controlling women, as misogynist and oppressive. I am aware that there are other ways to see it, but I’m not as sharply aware of that as I am of the first view. Then, I also see the hijab as having a lot of baggage – baggage that it wouldn’t have had twenty-five years ago. Wearing it now, after the Taliban, after what’s been going on in Iran, seems to me a different thing from wearing it before that. And not only wearing it, but being around people who are wearing it. It seems to me it can be seen as a poke in the eye to secularists, feminists, women who do not want all of that, who want to escape it, in a way that it wouldn’t have to such an extent before 1979. It’s a statement, a political statement, and in my view it’s a very reactionary, even brutal one. That means I’m less sympathetic to ideas about tradition, identity and so on – that’s my bias. And then, a third frame, I’m intensely hostile to religion (partly because of the history of the past twenty-five years), so I tend to favour efforts to keep it out of the public or secular realm. I’m not very good at seeing religion as a refuge from an alien culture, as the heart of a heartless world.

But another point is a bit different. I’m not convinced that I ought to do any mind-changing here, because I haven’t actually been arguing flatly that the ban on the hijab would be an unequivocally good thing and that’s all there is to it. I’ve been arguing against the view that it would be an unequivocally bad thing and that’s all there is to it. The people I’ve been disagreeing with are the ones who deny that there is any rational or non-racist reason at all to favor a ban. But if that position were accurate, there would be no such group as ‘Ni Putes ni Soumises.’ But the group exists. That is to say, there are French women of Muslim background who do support the ban. It seems to me opponents ignore them and their reasons. Surely arguing that there are people on the other side is not something I ought to change my mind about. I’m not so much arguing for the ban as I am arguing for taking all factors into account.



The Financial Pages

Jan 11th, 2004 7:08 pm | By

Following on from the last N&C on the way the Bush administration listens to developers rather than to environmental scientists in its own agencies – there is a post on corruption, and the history of attempts to limit the effects of money on political culture at Cliopatria. It is highly frustrating to see the open, unembarrassed acceptance of the role of money in politics in the US, and to see how little that changes, what a non-issue it is, how easily it keeps going, how cheerily everyone accepts it. Bribery and corruption are usually considered bad things, but the fact that huge corporations give enormous wads of cash to US political campaigns and parties is, for some reason, just taken as normal. The wads of cash are called ‘donations’ instead of ‘bribes’ and that makes all the difference. But they’re not donations, they’re quid pro quos, and everyone knows it. Yet no one cares. It’s very odd, and it’s maddening.

There is a very good article by Jonathan Chait in the New Republic from last November that does something to explain the lack of outrage. The article is about bad press coverage in general, rather than about corruption, but the last section deals with both – and needless to say, they are closely entangled: the corruption survives and thrives on massive public ignorance and indifference. It seems reasonable to think that if more people were more aware of the matter, there would be a lot more pressure to do something about it. To, in fact, stop it.

Republicans now expect lobbyists to support them all the time, even on issues of ancillary concern. In return, Republicans will take unpopular positions on issues like the environment and health care that benefit those same lobbyists. Yet this enormous shift, which impacts much of the domestic agenda, has not been woven into the narrative of political journalism. That omission, too, stems from the strange conventions of Washington reporting. It’s not that journalists fail to report on business influence; it’s just that such reportage tends to get segregated. One place it lands is the lobbying beat…It’s not that the press is shilling for Wall Street fat cats. It’s that money in politics is its own, distinct beat with its own, dedicated reporter (or set of reporters).

One is tempted to mutter about want of nails and wars being lost. Such a trivial reason, for such an important matter. The way jurisdictions are carved up among reporters helps to account for why political corruption is not a front page issue.

Another enclave of superb, but underexposed, coverage about the relationship between lobbyists and policy is the financial press. The Wall Street Journal, for example, has covered the nexus between K Street and the GOP particularly well. And shortly after the 2002 elections, the Post business section ran a terrific piece observing that “it’s payback time for the distributors and other business groups whose pent-up demands for policy changes, large and small, will soon burst into public.” The reason financial reporters can be so blunt, and therefore accurate, is that they could not do their job–conveying information about which businesses are succeeding in winning legislation that will impact their bottom line–if they didn’t convey the unvarnished truth. Political reporters play by a different set of rules. If a story like that ran on page one, it would have to be filtered through the lens of “evenhandedness”–“Democrats charge that Republicans are carrying water for their donors; Republicans disagree”–even if one side were demonstrably wrong. That’s why the practice of unbiased reporting, as journalists understand it, can actually impede the truth.

Isn’t that interesting. Just read the financial pages, and all will become clear. As a matter of fact, my brother told me that about the New York Times’ financial pages many years ago. Now, if only someone could persuade the political reporters to quote their colleagues on the financial beat…maybe word would begin to get out.



Wetlands Pollute! Rivers Need Barges!

Jan 11th, 2004 1:05 am | By

There is a very interesting article about the Bush administration’s interference with science in the Christian Science Monitor. I was a little distracted while reading it, because I kept thinking I had posted an article on the same subject fairly recently, but not so recently that I could remember when, or what it was called, or where it was from. But luck was with me (or perhaps it was my guardian angel, or baby Jesus, or both, one on each shoulder), and I found it anyway. It’s here. It’s well worth reading both: they are related but quite different. The Monitor article treats science in general; the Grist one discusses cases where the Bush administration forced federal agencies to adopt policies developers and other industries wanted in place of scientifically-based findings, with nasty results for the Missouri river and Florida’s wetlands.

From the Monitor article:

Nevertheless, several science-policy experts argue that no presidency has been more calculating and ideological than the Bush administration in setting political parameters for science. President Bush’s blunt rejection of the Kyoto Protocol on global warming, and his decision restricting stem-cell research are only the most obvious and widely publicized examples of what has become a broader pattern across the administration.

From the Grist article:

As we’ve seen before, this administration’s M.O. is simple: If you don’t like the science, change the scientist. That same motto could have been scrawled atop a resignation notice submitted in late October by Bruce Boler, a former U.S. EPA scientist in Florida who quit in protest when the agency accepted a study concluding that wetlands can produce more pollution than they filter. “It’s a blatant reversal of traditional scientific findings that wetlands naturally purify water,” Boler told Muckraker. “Wetlands are often referred to as nature’s kidneys. Most self-respecting scientists will tell you that, and yet [private] developers and officials [at the Corps] wanted me to support their position that wetlands are, literally, a pollution source.”

Scientists who don’t obey are fired and replaced with more biddable ones, and the EPA muzzles its own employees. Pretty story.



An Argument With Too Much Left Out

Jan 9th, 2004 7:43 pm | By

It’s odd to discover that sometimes readers know more about what I’m doing than I do. I’d actually forgotten that I’d commented on the hijab-headscarf-veil issue all the way back in October, but Socialism in an Age of Waiting reminded me.

The issue of Muslim girls wearing, or not wearing, hijab in state schools in France has given rise to extensive comment and debate all over the blogosphere. We’d cite as the most interesting discussions so far the posts, and the comments, at Butterflies and Wheels, where Ophelia Benson has been blogging about it, on and off, since October and at Harry’s Place, where the debate was taken up in December partly in response to the news that “a government-appointed commission on secularism [had] recommended drafting a new law banning all conspicuous religious symbols from French state schools”.

Why so I have. What a terrible memory I have to be sure. I wonder what else I’ve been blogging about that I’ve forgotten. Monetary policy? Weaving? The Crimean War?

Then SiaW link to another discussion of the hijab issue, saying it cuts through the knot – which I find odd, since the post in question leaves so much out. There is this question, for example:

There are fashions that annoy the hell out of me, but by what possible logic are headscarves more offensive than, say, big hair? Is there any way in which headscarves are more oppressive to women than mini-skirts?

Yes of course there is. What an absurd question. There is no equivalent of the Taliban or the religious police of Iran forcing women to wear mini-skirts by beating the shit out of them if they don’t. There is no real, literal, physical, violent, bone-breaking coercion of women to wear short skirts. There is that kind of coercion of women to wear the hijab or the chador or the burqa. The problem with the hijab is not that it’s ‘offensive.’ (That’s a sub-topic I want to go into some day – another branch of the translation problem – the way people hear ‘offensive’ when offense is not the issue at all and no one said it was. Odd, that.) Or that it’s ‘annoying.’ Read or talk to some women who have lived through a transition from not having to wear the nasty things to being forced to by violent packs of men. Talking about annoyance and offense just trivializes the issue, but it’s not damn well trivial.

And the rest of the post is along the same lines. It ignores far too much to be useful, it seems to me. I agree that there are problems with the ban; that it may be counter-productive, that it violates the freedom of some people, that in a sense it discriminates against Muslims. But there are also problems with the absence of the ban, as I said last month. A discussion that just blithely ignores those is a bit beside the point, I think.



Academostars Light up the Sky

Jan 9th, 2004 1:15 am | By

Well my questions have been answered – the ones I asked a couple of days ago, about Why is Judith Butler a superstar and who the hell thinks comp lit teachers are superstars anyway and why don’t they embarrass themselves talking that way? Well no, I didn’t ask that last question, but it’s what I was thinking.

I should have realized. Silly me. The subject is a whole field, a discipline, it has an anthology and everything. The excellent Scott McLemee, of the Chronicle of Higher Education as well as other publications, dropped a word in my ear to the effect that he wrote a few words on this subject a couple of years ago. And sure enough, he did, and very good words too. The whole thing is pretty hilarious, frankly.

“I want to debunk the usual idea that this is some kind of illicit importation [into university life] from Hollywood.” The phenomenon owes less to popular culture, he argues, than to processes taking shape within academic culture. In particular, it is a side effect of the dominance of theory within literary studies. The steady growth of literature programs stimulated what Mr. Williams terms “the theory market.” By the 1980s, thinkers who offered powerful, capacious, and stimulating models of critical analysis were becoming household names.

Household names?? Household names?!? In what households, sport? Do you get out much? I don’t get out much myself, but I get out enough to know that Stanley Fish and Gayatri Spivak are not instantly recognizable in your average American household. No, not even good old Eve or Cornel or Skip is that famous, whatever their colleagues may tell them.

But even better than that household name thing is that ‘powerful, capacious, and stimulating models of critical analysis’ bit. Oh, please. More ‘powerful, capacious, and stimulating’ than anything you will find in physics or history or sociology or philosophy or economics or psychology or cognitive science departments, for example? You know – I really, really, really don’t think so.

As Mr. Williams notes in an interview, the discussion of academostardom emerged in earnest during the 1990s — a time of transition for the humanities, during which the academic profession underwent painful restructuring, despite the overall economic boom. In “Name Recognition,” his essay for the journal’s special issue, the editor underscores how scholarly celebrity met a basic psychological need during this wrenching period. “Against the common academic anxiety of ineffectuality, especially in the humanities,” he writes, “the star system heightens the sense of the academic realm as one of influence, acclaim, and relevance.”

Ah – now I understand. It’s a kind of comfort food. Or magical thinking. ‘I am, or will be someday, or could possibly become someday maybe if I’m very lucky and very hip, influential and acclaimed and, by golly, relevant, because of my powerful, capacious, and stimulating models of critical analysis which are more powerful, capacious, and stimulating than almost anyone else’s. I can push down trees with them, I can store all of Manhattan in them, I can bring whole conferences to a frenzy with them. I am – Megacademostar!!’



From Below

Jan 8th, 2004 8:54 pm | By

Well I made good on my threat, and did that In Focus. I’ll be adding a lot more links, since it’s a large subject.

I also posted again at Cliopatria, about Romila Thapar. There are more interesting comments there, from people who know far more about history and historians than I do. Timothy Burke makes this excellent point:

This is one of those junctures where the tragic confusion of some scholars in the US and England about where their sympathies should lie potentially becomes pretty dangerous if not corrected. It strikes me that Hindutva’s self-representation is actually pretty fair in one respect: it is more genuinely popular, “from-below”, and less obviously “Western” than scholarly history practiced in Indian academies (though in the end, I’d say it’s actually quite resonantly “Western” in the same way that most forms of romantic anti-modernity modernism ultimately are). For some scholars, the mere notion that something is meaningfully “from-below” accords it instant moral legitimacy, particularly if it involves non-Westerners refusing or rejecting something that can be reasonably tagged as Western. But Hindutva is systematically repellant, and any intellectual or morally conscious person anywhere in the world ought to recognize it as such.

Just so. The old ‘from-below’ trap. It’s well-meaning, it’s understandable, but – it is such a mistake.



Outrage

Jan 7th, 2004 8:27 pm | By

Well, really. I’ve probably said this before…but I’ll simply have to repeat myself then. This is one of those times I just have to shake my grizzled head and croak with the Wicked Witch of the West, ‘What a world, what a world.’

A kind and helpful reader, Chris of the blog Intelligent Life, alerted me to this horrible story in a comment on another story about gangs of religious thugs terrorizing people. There’s just no end to it, it seems.

Sanika Bapat, another post-graduate scholar merely questioned, ‘‘Why did they tear a Shivaji manuscript from this library? Are they Shivaji worshippers or patriots? They are worse than any militia. We are another Taliban now.’’ People sitting outside the library were at a loss for words. Said Dr M A Mehendale, ‘‘What can I say in the face of this destruction? Words really fail me.”

Everyone’s worst nightmare, or most people’s anyway. ‘We are another Taliban now.’ The worst kind of ignorant, narrow, aggressive, righteously-enraged, violent, destructive, red-eyed zealots breaking down the doors and smashing everything they find. Bullying, punishing, beating up, imprisoning women; destroying books and manuscripts; demolishing the Bamiyan Buddhas; smashing airplanes and heavily-populated buildings; burning down mosques; murdering publishers and translators; blackening the faces of women on billboards; threatening, threatening, threatening.

Another reader alerted me a few months ago to the matter of Romila Thapar – another historian that the Hindutva brigade doesn’t like.

While 72-year-old Thapar’s appointment was greeted with applause by serious students of history, little did anyone realise that acolytes of the Hindutva brand of politics, primarily those in the Indian diaspora, would unleash a vitriolic campaign against her built on name-calling and the disparaging of her professional qualifications. Claiming that “her appointment is a great travesty”, an online petition calling for its cancellation has, as of the last week in May, collected over 2000 signatures. Thapar, according to the petition, “is an avowed antagonist of India’s Hindu civilization. As a well-known Marxist, she represents a completely Euro-centric world view”. Protesting that she cannot “be the correct choice to represent India’s ancient history and civilization”, it states that she “completely disavows that India ever had a history”. The petitioners also aver that by “discrediting Hindu civilization” Romila Thapar and others are engaged in a “war of cultural genocide”.

At this rate, I’m going to have to put together an In Focus on Religious Outrages on Scholarship, or something. I certainly have more than enough material.

Update: I also posted about this at the group history blog Cliopatria which invited me to join them the other day. There are interesting comments from historians there.



Judith Butler Superstar

Jan 6th, 2004 8:17 pm | By

Okay, what’s the deal with Judith Butler. Why does everyone who writes about her call her a celebrity or a superstar. A superstar?? Someone who teaches gender studies at Berkeley? A superstar?

Berkeley’s Judith Butler, a superstar of gender and literary studies, drew a packed house with her analysis of Defense Secretary Donald Rumsfeld’s bad grammar and slippery use of the term “sovereignty.”

I’m not making it up, that’s from the Boston Globe, from a story on the MLA convention. Not a very affectionate or over-impressed story, either – and yet Scott Jaschik calls Butler a superstar. Well if she got married in Las Vegas and then had the marriage annulled the next day, would we hear about it? Would it take up time on the BBC World Service’s half-hour news report? If she were arrested for child molestation, would we hear about that, would the World Service consider that important enough to spend a minute or two on? I have to say, I kind of doubt it.

Well to be sure, Jaschik does call her a superstar of gender and literary studies – not just a superstar tout court. But then further questions arise. Do reporters write about superstars of nursing, superstars of postal work, superstars of meat packing? Do they write about superstars of history or Classics? I don’t think so. So what is it about Judith Butler that somehow hypnotizes people into calling her a superstar? Or is it something about her field that does that – and if so, what? And what does it mean – what does all this vocabulary of stardom and trendiness and hipness and fashion portend? Why is it catching? Why don’t people just laugh when academics are called superstars?

Even in Israel, even that far away and with other things to attend to, they are susceptible. Witness Ha’aretz.

Butler is an unusual figure in academia. On the one hand, she is a celebrity who has a community of followers and who exudes charisma. Groups of followers sometimes line up for her lectures, as though she were a rock star; and her major influence on feminism at the start of the 21st century is widely noted. On the other hand, many persons outside of feminist academia have never heard of her; nor have they come across her ideas, or been influenced by them.

Celebrity? Followers? Charisma? Rock star? Well at least Ha’aretz realizes some of the truth – ‘many persons outside of feminist academia have never heard of her.’ Yes, you could say that. Quite a few, I daresay. In fact I would venture to guess that the non-hearing of Judith Butler outside feminist academia is pretty nearly universal.

Rock star, celebrity, superstar. How do these rumours get started, I always wonder.