Feb 1st, 2004 12:24 am | By

There is an amusing post here by a blogger who is eccentric enough to read B&W. He’s just been reading a N&C from back in early January, the one about academostars – which sent him to an article by Scott McLemee in the Chronicle, which prompted some reflections on Stanley Fish’s Reaganomical views of the merits of overpaying academostars.

To be fair, Fish may have a point: his presence in an English department may draw starry-eyed grad students into the department and increase funding for more useless graduate seminars on esoteric topics that will prove of little or no use to anyone teaching at most universities. In this respect, the “material conditions” of the other professors in the department may be improved somewhat (though the graduate students and adjuncts will still be teaching the same thankless classes for the same poverty-level wages). But Fish will still be making 2-3 times what his colleagues do for far less work (most academic superstars teach one course a year, generally a graduate or senior seminar with a small enrollment). Very little of that privileged status will be trickling down to his colleagues.

It’s all very reminiscent of Robert Frank and Philip Cook’s The Winner-Take-All Society, an excellent book on the way minuscule differences in talent can make the difference between a hugely remunerative career and none at all (in professional sports, movie stardom, popular music, for example). Stardom does work that way. And often it has so much to do with – a kind of gas, really. Vapor, hot air, bubbles. Especially in the case of academostars. People become stars because people start calling them stars, and other people hear that and call them stars too, and more people do the same, and more and more. Mass hypnosis. Pretty soon it becomes unthinkable or socially unacceptable to ask ‘Why is this person a star? What’s the big deal? In what way is this one so enormously better than that one?’ It all seems to have far more to do with hype and silly showbizzy attitudes than it does with anything resembling intellectual interests or values.

False Consciousness

Jan 30th, 2004 5:52 pm | By

So here’s Nawal El Saadawi, saying the demonstrations of women against the French proposal to ban the hijab are a ‘signal example of how “false consciousness” makes women enemies of their freedom, enemies of themselves, an example of how they are used in the political game being played by the Islamic fundamentalist movement in its bid for power.’ I have noticed repeatedly that a lot of Westerners who oppose the ban have an unpleasant (to put it mildly) tendency to accuse supporters and semi-supporters of racism and colonialist ways of thinking – as if there were total unanimity among people of Muslim background. But of course there isn’t. Far from it. Of course many Muslims and people of Muslim background are strongly opposed to the ban, but there are also many who favour it. For reasons which El Saadawi makes admirably clear.

False consciousness makes women obedient instruments of their own oppression, and transmitters of this false consciousness to future generations of children, of girls and boys. It is lethal because what it does to women’s minds is not visible. Unlike physical female genital mutilation it is an invisible gender mutilation which destroys the dynamism, the capacity to understand what is happening, to react and resist, to change, to participate in making changes. It destroys the essential creativity of the human mind. It instills fear, obedience, resignation, illusions, an inability to decide or else it leads women to make decisions, to take positions, to defend values and ideas inimical to their own interests, to the health and development of their life. It makes women their own enemy, incapable of discerning friend from foe.

False consciousness is a very, very difficult notion to defend, for obvious reasons. The retort is always available, ‘How the hell do you know whose consciousness is false, that X doesn’t really believe what she says she believes, that if only she listened to you she would change her mind?’ And yet we know there is such a thing – we know it if only from our own experience. We know how easy it is to be misled, to be persuaded, to see things through a glass darkly. We know it happens. And this argument between women who cling to the veil and want other women to cling to it too, and women who want to take it off and want other women to take it off too, has been going on for many decades. One can always just shrug and mumble about ‘Their culture’ and let it go at that – but that doesn’t really get anywhere, does it, since ‘Their culture’ itself is riven with disputes over the matter. It’s as well to keep that in mind when the issue comes up.

Hazlitt Speaks His Mind

Jan 29th, 2004 6:55 pm | By

Something put me in mind of Hazlitt’s famous Letter to William Gifford this morning – so I thought I might as well give you a bit of the flavour of it. It’s a permanent, settled grievance of mine that Hazlitt is so little-known. I think he’s the single most inexplicably obscure writer in English. He ought to be at least as famous as Orwell and far more so than Lamb or Carlyle. He’s an absolutely brilliant, dazzling writer, and he’s no slouch as a thinker, either.

The letter to Gifford starts off briskly:

Sir, You have an ugly trick of saying what is not true of any one you do not like; and it will be the object of this letter to cure you of it.

There are so many people around who have that ugly trick, these days. How one wishes for a few Hazlitts to cure them of it.

You are the Government Critic, a character nicely differing from that of a government spy – the invisible link, that connects literature with the police…The distinction between truth and falsehood you make no account of: you mind only the distinction between Whig and Tory…The same set of threadbare common-places, the same second-hand assortment of abusive nick-names, the same assumption of little magisterial airs of superiority, are regularly repeated…You dictate your opinions to a party, because not one of your opinions is formed upon an honest conviction of the truth or justice of the case, but by collusion with the prejudices, caprice, interest or vanity of your employers.

And that’s just from the first couple of pages. There’s quite a lot more. It’s a little lesson in history and politics, as well as epistemology, in itself.

Update. That was silly – I should have given the link for this wonderful site that has a number of Hazlitt essays (though I don’t see the Letter to Gifford, alas). If you’ve never read a word of him, you don’t have to wait until you can get to the library or bookstore, you can start with Blue Pete.

Corruption? Yawn

Jan 28th, 2004 7:58 pm | By

Corruption in US politics is a hardy perennial issue. Reliable, sturdy, always there; something to count on in a disconcerting world. This is, of course, because nothing is ever done about it, and the people who ought to care about it mostly don’t, and the people who ought to pay a penalty for engaging in it don’t, and the people who ought to be paying attention mostly aren’t, and the people who ought to be bringing it to the attention of the people who ought to be paying attention and ought to care mostly aren’t. It’s all a bit discouraging, frankly. Or to put it another way, it’s completely disgusting and infuriating, and an outrage, and absurd, and blindingly obviously not the way things ought to be done. And yet it goes on and on and on, like the drumming rabbit with the Eveready battery (that’s an advert, for our non-US readers).

The subject came up at Crooked Timber a few days ago when Henry posted about a certain Congressional Representative:

CT extends its hearty congratulations to Congressman Billy Tauzin (R-La), who’s demonstrating his sincere attachment to free market virtues by retiring from politics and selling himself to the highest bidder. For the last couple of weeks, there’s been a bidding war between the Motion Picture Association of America (MPAA) and the Pharmaceutical Research and Manufacturers of America (PhRMA) for Tauzin’s services…The phenomenon of Congressman-turned-lobbyist is hardly a new one; but the openness and extent of the greed on display is unusual, even for Washington. A sign of the times.

The next morning I heard a brief segment on the BBC World Service about the role of what’s euphemistically (and that’s a big part of the problem, I think) called ‘campaign contributions’ in US politics. The segment was pretty good, it did make some of the points that need to be made, but it was far too polite about it all. The words ‘bribery’ and ‘corruption’ were not used. But that’s the problem. This stuff is just flat bribery, but it’s almost never called that. If it were, would the great American electorate be quite so torpid about the whole thing?

It’s very simple. It’s not hard to understand. People with financial interests give huge sums of money to political parties and campaigns, and they expect something in return, and they get it. At the very least they get what is called ‘access’ – they get to talk to Representatives or presidential staffers on the phone, while people without satchels of cash to give away have to make do with a Representative’s staffer. The ability of money to buy access is so taken for granted that it’s not even concealed behind those veils of euphemism – the beneficiaries don’t even bother to pretend that that doesn’t happen. The bizarre mantra that is apparently supposed to make all this okay is ‘Money doesn’t buy influence but it does buy access.’ Oh well that’s okay then! Great! Perfect! Rich people have access and poor people don’t, right, that’s the way to run things, that’s fair. Splendid.

Fresh Air talked to Charles Lewis of the Center for Public Integrity about this yesterday. He and a team of researchers have written a series of books, The Buying of the President 2004 (2000, 1996, etc.) It’s worth a listen, and a read. Maybe some day people will start to pay attention. I think I won’t hold my breath though.


Jan 27th, 2004 9:44 pm | By

This is funny. Hilarious, in fact. A blogger and frequent blog-commenter who is well-known for an unattractive combination of heavy sarcasm and rudeness made safe by anonymity tries another bit of heavy sarcasm that falls rather flat, and contradicts himself in the process. Compare three statements:

“High-Caste Hindu”: Irreverently Humorous or Casually Colonialist and Racist? Chun Informs, You Decide

Do you assume that Spivak calling herself this would make it any less casually colonialist or racist (if in fact that’s the proper description–about which I, as I wrote, have no opinion)?

I find your points to be cogent, but I believe you must detect a patronizing note in Inglis’s description.

Ah. You decide. I have no opinion. But on the other hand, surely you detect the patronizing note. Don’t you? Surely? Come on, you must, it’s so obvious – not that I have an opinion of course. And not that I ever strike a patronizing note myself. And not that – oh never mind.

Do I What?!

Jan 27th, 2004 8:52 pm | By

This article starts off with a pretty bizarre story.

The other day, I was reading an interview with Democratic presidential candidate Howard Dean in Newsweek when I had to stop and check that it was indeed Newsweek and not, say, Christianity Today. Yes, it was indeed Newsweek. And, after a series of questions about a variety of public policy issues, Dean was asked, out of the clear blue, the following question: “Do you see Jesus Christ as the son of God and believe in him as the route to salvation and eternal life?”

Really? Really?? I never read Newsweek, so I don’t know, but that is such a weird question that it strains credulity. I mean, was the reporter Billy Graham, or what? I wouldn’t at this point be surprised to hear that an American reporter had asked a candidate impertinent questions about religion, but that one is really off the deep end.

The real issue, though, is why this question even came up in a political magazine. Do we now have a religious test for public office—something that was explicitly rejected by the Founders of the United States of America?…But in the past few weeks, Dean has been the target of something dangerously close to a religious witch-hunt—and that should concern all of us, whatever our party affiliation or our political, religious, and moral convictions.

Indeed it should, and it does. All the more so since reading that loony question.

Dude, Where’s My Site of Hegemonic Dominance?

Jan 26th, 2004 2:04 am | By

John Holbo has a very sly post on the tireless Bad Writing subject on his blog. He read the first three issues of the PMLA – Publications of the Modern Language Association – for 2003 cover to cover, twice. (Then he had a complete blood transfusion and is well on the way to recovery – now cut that out.) And he has some thoughts.

First he quotes Judith Butler explaining why bad writing is necessary and good:

The accused then responds that “if what he says could be said in terms of ordinary language he would probably have done so in the first place.” Understanding what the critical intellectual has to say, Marcuse goes on, “presupposes the collapse and invalidation of precisely that universe of discourse and behavior into which you want to translate it.”

Then he comments:

Mutatis mutandis, understanding how Bad Writing Contests are funny may presuppose the collapse and invalidation of precisely that universe of MLA discourse and behavior that is the butt of the joke. Judith Butler fails to apprehend the crucially performative aspect of this subversive critical work – the way in which Bad Writing Contests provide a voice to those on the margins by challenging the dominant hegemony of ‘theory’ in literary studies.

Naughty Butler, failing to apprehend the performative aspect. Very droll.

Just as you give up the right to insist on hushed reverence for all things academic when you title your paper, “Dude, where’s my reliable symbolic order?”, so you give up your right to demand fair and open-minded consideration of your views when you yourself explicitly advocate adopting language which functions expressly to short-circuit critical dissent by presupposing that one’s opponents are not just wrong but basically in a state of complete intellectual collapse. Given that this is her attitude, Judith Butler was never going to listen to Denis Dutton’s criticisms of her views in any case. He is, as she says, the editor of “a small, culturally conservative academic journal”. Being a figure on the margins – whom Butler is working to silence, by collapsing any language in which he can express his views – what does Dutton have to lose by mocking, rather than arguing?

There’s a good deal in what he says. He also says a good deal more, including his own crude but effective grading system for the articles in the PMLA. OK and Trivial are two of the possible grades, which is quite sensible. And he ends with a sentence that would make a good Thought for the Day:

It is a nice question whether it is worse to perpetrate tautological emptiness with an aura of significance or egregious unsoundness/invalidity with an aura of significance.

I love nice questions, and I plan to think about that one at odd moments for some time.

Reading For Something

Jan 25th, 2004 5:06 pm | By

One thing (but not the only thing) that prompted this train of thought (or perhaps bus of rumination or minivan of woolgathering or rollerskate of idle daydreaming) was something I read a few days ago in another of Dwight Macdonald’s letters, this one from January 1946, when Macdonald was editing his own magazine Politics.

I suppose you’ve read by now Simone Weil’s article on The Iliad. The response to it has surprised me; I thought it was a great political article, dealing with the moral questions implicit in the terrible events one reads about in every day’s newspaper, which was why I played it up so prominently in the issue…Nothing I’ve printed yet seems to have made so deep an impression. The only people who didn’t understand how such an article had a place in a political journal were – and I think this is profoundly significant – all of them Marxists. To a Marxist, an analysis of human behavior from an ethical point of view is just not ‘serious’ – even smacks a little of religion.

I think the Marxists who didn’t understand must have had a fairly crude understanding of Marxism, but that’s another subject. The relevant aspect is the question of what has a place in a particular kind of journal and what doesn’t – and the fact that Macdonald was thinking about that question. I was already thinking about it when I read that – well in fact I’m always thinking about it, really. Not every second, but every day, usually several times a day. Every time I link to a News item, in fact every time I look for a News item, which in a sense is every time I read anything at all, other than perhaps package ingredients or addresses on envelopes. As Ensign Pulver was looking for marbles all day long, I’m looking for News items all day long. Though not always actively looking – sometimes I’m just reading, like a normal person, and then as I read the act of reading is transformed into the act of reading for something. Though that doesn’t quite describe it either, because I seldom do ‘just read’ any more – or I both just read and read for something. Which is interesting, in a boring sort of way – by which I mean it interests me but I realize it may not interest everyone. Actually maybe I’ve never ‘just read’, at least not exclusively. I think that’s right – but the percentage has changed.

There’s a lot to be said for reading for something. There’s also a good deal to be said for just reading, but on the whole I prefer reading for something, as long as the something I’m reading for is worth it. I thought while I was typing it that all this was an unconscionable digression that I would probably delete, but I’ve changed my mind. It is about the subject under discussion, in a strained sort of way. Why do we read, after all? Surely the way we think about that question has some connection to what kind of thing we want to read, and why, which has some connection to why Macdonald published Weil on The Iliad. She wrote it for something, he published it for something, the readers read it for something.

At any rate, I was thinking about it more than usual even before I read the Macdonald letter. It was because of posting an item about the Bush administration’s approach to science – I was thinking about the fact that that’s not academic (to put it mildly) nonsense, so it’s not strictly our subject. I decided that what did make it our subject was the element of bullshit involved. The fact that it’s not just mistaken, but the kind of mistaken rooted in prior commitments. I decided it is worth pointing out occasionally that the academic left certainly does not have a monopoly on that kind of bad thinking. I try to do that kind of thing sparingly, because otherwise B&W will just be about anything and everything; but that’s my reasoning for doing it once in a while. That’s my Iliad.

Because our subject is woolly thinking of a particular kind – thinking that’s fuzzy because it’s distorted, because it starts out from the wrong place. Because it starts not from genuine inquiry but from what Susan Haack calls pseudo-inquiry – not from a real desire to find the truth but from a desire to make a case for a pre-selected conclusion. That’s the bullshit aspect, the bogus element, the pseudo factor. B&W examines the academic manifestations of woolly thinking and pseudo-inquiry and bullshit, but it is worth offering an occasional example from other parts of the world too, I think, if only for epistemological reasons. It is part of the overall story, of the Big Pikcha as I said to my colleague the other day. It’s part of an understanding of how woolly thinking and bad moves work, to be able to recognize them in a variety of habitats, to realize that they’re not confined to one discipline or sector of the economy or political orientation. If we don’t know it when we see it, how can we resist it?

What Was the Question Again?

Jan 23rd, 2004 9:19 pm | By

I’ve been thinking a good deal about focus lately. About relevance, subject matter, connections. The competing merits of breadth and intensity, range and depth. About how to think about such things, and how to decide between them – metaquestions again. I seem to think about metaquestions a lot – but then that’s not surprising, is it; B&W is essentially about metaquestions. At least I think it is. That’s one of the metaquestions I’ve been thinking about – what is B&W about.

Not that I don’t know, or don’t think I know, or think I don’t know. I do think I know. Or at least I know I have an opinion. But there could be other opinions – and in fact there is at least one other opinion. There are people – one or two at a minimum – who think B&W could have a narrower focus. Who would even prefer it to have a narrower focus – without necessarily going so far as to say it would be better with a narrower focus. Which leads us into some more metaquestions about such things as the difference (if any – that’s a further metaquestion – whether there is a difference) between preferences and moral or aesthetic or epistemological judgments. Between likes and dislikes on the one hand, and moral or cognitive evaluations on the other. Is saying ‘That’s good’ exactly the same as saying ‘I like that’? How do we distinguish between taste and judgment? Much-vexed questions, all of them.

And in a way part of B&W territory – in my view of the matter. But I can see why other views would differ. I think it is (in a way, as I said) because ‘how do we know?’ and ‘how do we distinguish’ questions are background questions for the more specific, local questions that are our brief. But then I take a broad view, and that’s not the only view possible.

Perhaps it’s more accurate to say I take a variety of broad views. Maybe I take a different one at any given moment – no, I’d better not follow that thought up, or we’ll be here all night, and I’ll never get around to saying what I was going to say. What was that, anyway? Something about what kinds of subjects I think are relevant enough to the concerns and interests of B&W that it makes sense to link to them. Something about epistemology, and what a lot of ground it covers – how hard it is to talk about almost anything without getting into it. Along with something about the interest inherent in the way apparently unrelated things actually fit together, and the thought that what B&W lacks in focus it makes up in range and breadth and interestingness. Or at least that it does to my taste, but that doesn’t mean it does to everyone’s.

But you see the problem. Once I start talking about metaquestions I can’t stop, I start following them as they zip around the room, and pretty soon I’ve got myself tangled up in a horrible knitting-wool snarl of them. Of course that’s not true really – it’s just rhetoric. I could delete and re-write. Leaving the digressions there is part of the point (I have a point? No not really what you’d call a point, just an endlessly-deferred series of approaches to points) – the point being that the subject matter of B&W seems to me to be so large and complicated and branching that a narrower focus would simply leave most of the subject out. But then that depends on how you define the subject.

In fact even the subject of this N&C is too large and branching for one N&C, so I’ll have to do a chapter 2 later.

Investment Tips

Jan 22nd, 2004 1:43 am | By

I’ve been reading a volume of the letters of Dwight Macdonald lately. One bit I read this morning seemed particularly appropriate for B&W. It’s from a letter in December 1937, to Freda Kirchwey, the editor of the Nation, taking that magazine to task for a number of blind spots, such as being too ‘timid and stuffy-genteel’ in its editorial attitude to the New Deal’s recent swing to the right, and particularly for being hostile to the Commission that was investigating the Moscow Trials (he doesn’t mention John Dewey but I assume that’s the Commission usually known as the Dewey Commission).

While I was at Fortune, the Nation was always to me the great symbol of honest, truthful, intelligent journalism – everything that I missed at Fortune. But it now appears that the Nation, too, has its commitments, its investments, so to speak, just like Fortune. A journalistic investment in Soviet Russia seems to me no more admirable than a like investment in the Greatness and Goodness of Big Business.

Commitments and investments. There it is. Those are great things, necessary things, but they can rip such a hole in our ability to think straight and to see what’s in front of us – and above all to tell the truth about it. The late ’30s were notoriously a rough period for that kind of thing. What with the Moscow Trials, and Trotsky, and the POUM and the anarchists in Barcelona, and what had happened during the forced collectivization – there was a staggering amount of systematic lying, denial, looking away, self-deception in those years. It poisoned the air on the left for years, decades – arguably the left still hasn’t recovered. Watch those investments.

Before After Theory

Jan 20th, 2004 9:19 pm | By

This is an interesting review by Elaine Showalter of Terry Eagleton’s new book After Theory.

In the ’80s, theory ruled, and the subject formerly known as literature was banished or demoted in the interests of philosophy and aesthetic abstraction.

Hmm. But was it really philosophy? Or was it just little bits of philosophy here and there? That’s fine, it’s no crime to know only a little about something, that’s certainly my situation about almost everything – but one has to be clear about it. One has to be careful, it seems to me, not to confuse sampling philosophy with really studying it, and one has to be equally careful not to confuse Literary Theory with philosophy, because (this is a subtle point, now, but bear with me) they are not the same thing. One does get the impression at times that Literary Theorists think they are doing something more like philosophy than they in fact are. One also often wonders, if they want to do philosophy, why they aren’t in the philosophy department. It is there, after all, all set up for the purpose; why not take advantage of the fact? It seems so perverse to amble off to a different department and then set up to do the subject of another one. People don’t enroll in French departments in order to do engineering, do they? Or engineering departments in order to do French. So why the English or Comparative Literature Department in order to do philosophy? One of life’s little mysteries.

He admits that cultural theory “has been shamefaced about morality and metaphysics, embarrassed about love, biology, religion, and revolution, largely silent about evil, reticent about death and suffering, dogmatic about essences, universals, and foundations and superficial about truth, objectivity, and disinterestedness.” As he says in a characteristically ironic understatement, those constitute a “rather large slice of human existence” to ignore. So he sets out to fill the gaps and propose fresh theoretical approaches to the Big Questions.

Well yes, it is a rather large slice, and that’s part of the problem, surely. Why should cultural (or literary, or critical) theory be expected or able to contribute to all those subjects? Isn’t that expecting an awful lot? Isn’t that expecting it to be an omniscient and universally applicable discipline? What would qualify it to take all that on? That’s another one of those little mysteries.

Eagleton wants to free cultural theory from crippling orthodoxy by challenging the relativism of postmodernist thought and arguing on behalf of absolute truth, human essences, and virtue (which includes acting politically). He engages with definitions of morality.

And thus we see the drawback to pretensions of universal reach and of confusing literary theory with philosophy – one ends up re-inventing the wheel and belaboring the obvious. If fans hadn’t taken ‘postmodernist thought’ too seriously to begin with, Eagleton wouldn’t now have to waste energy on challenging it, but they did so he does. That’s where fashion gets you.

First, why isn’t literature, rather than theory, the best place to go for help about morality, love, evil, death, suffering, and truth, among other things? Having written a big book about tragedy, Eagleton obviously knows that on some topics, Shakespeare is a lot more relevant than Saussure. Eagleton himself is able to command an encyclopedic range of literary reference, but he takes the side of theory rather than literature, or even a position between the two. “Critics of theory sometimes complain,” he notes, “that its devotees seem to find theory more exciting than the works of art it is meant to illuminate. But sometimes it is. Freud is a lot more fascinating than Cecil Day-Lewis. Foucault’s The Order of Things is a good deal more arresting and original than the novels of Charles Kingsley.”

And then again why are literature and theory the only two choices? And what does ‘help’ mean? What does ‘relevant’ mean? Why Saussure? Why Freud, why Foucault? It’s all a muddle, frankly. One undefined term or unexamined assumption after another. Time for After After Theory.

Eat Your Sugar

Jan 18th, 2004 7:24 pm | By

This sounds familiar, doesn’t it? We’ve read articles like this before? Only a few days ago in fact? The subject does seem to keep coming up. The Bush administration and profit-making entities on the one hand, and scientific advice and knowledge on the other. Bulldozers make better habitats than rivers do; wetlands pollute; academic scientists who receive grants should be kept off federal peer review panels while scientists with ties to profit-making entities should not. Day is night, up is down, black is red. Do we begin to detect a pattern here?

The President insists fighting fat is a matter for the individual, not the state. But today The Observer reveals how he and fellow senators have received hundreds of thousands of dollars in funding from ‘Big Sugar’. One of his main fundraisers is sugar baron Jose ‘Pepe’ Fanjul, head of Florida Crystals, who has raised at least $100,000 for November’s presidential re-election campaign.

The individual, not the state. Right. And that means that the state must not interfere, even by so much as issuing reasonable dietary guidelines, with people who make money by selling sugar-added foods. It’s up to the individual to ignore all that advertising, have a little backbone, and just stay thin. Simple.

The Bush administration, which receives millions in funding from the sugar industry, argues there is little robust evidence to show that drinking sugary drinks or eating too much sugar is a direct cause of obesity. It particularly opposes a recommendation that just 10 per cent of people’s energy intake should come from added sugar. The US has a 25 per cent guideline.

25% added sugar – sure, that sounds about right. Could be worse. Could be 75% after all.

Too bad the spinach lobby isn’t as powerful as the sugar one. But I guess there just aren’t the bucks in spinach that there are in the sweet stuff.

Hands Off Lacan!

Jan 17th, 2004 8:46 pm | By

This is quite an amusing piece. Albeit irritating. So much rhetoric, so much slippery use of emotowords, so much vagueness where precision is needed – all to protect the heritage of Freud and Lacan. Why, one has to wonder. What is it about Freud that makes people who, one would think, ought to know better cling so fiercely? I suppose I could postulate some sort of psychoanalytic answer, but would that tell us anything?

“When they speak of ‘professionalising’ people whose business is human misery; when they speak of ‘evaluating’ needs and results; when they try to appoint ‘super-prefects’ of the soul, grand inquisitors of human sadness – it is hard not to agree that psychoanalysis is in the firing line,” Levy said.

That’s a translation, I assume, so perhaps it’s unfair to look too closely at the words – but I’m going to anyway. ‘People whose business is human misery.’ What does he mean by ‘business’? People who make money off human misery? Why should they be protected? Or does he mean something along the lines of experts in human misery, people who know a lot about human misery? But one can know a lot about human misery, in some sense – arguably all humans know that – without having the faintest clue how to ‘fix’ it or cure it or do anything about it at all other than hand-wring or watch or write poetry. Even quacks and charlatans can know a lot about human misery.

Critics say the absence of regulation and a growing demand for therapy of all kinds has led to a proliferation of astrologers, mystics and con-artists – and they are demanding that the public be protected by a system of recognised qualifications. But Lacan, who died in 1981, said that the “analyst’s only authority is his own,” and his followers believe the state has no business interfering in the mysteries of the id and the unconscious. In a faction-ridden climate, many psychoanalysts also see the government’s initiative as an attempt by their arch-enemies the psychiatrists – hospital-based doctors who prescribe drugs for treating mental illness – to marginalise their work.

The analyst’s only authority is his own – well that’s blunt, at least. That’s a good concise summing-up of what’s wrong with psychoanalysis. It presents no evidence, it is not peer-reviewed, it rules out falsification. It’s a form of hermeneutic, we’re often told, which is all well and good but it also claims to be therapeutic. It wants to have it both ways, in short: to charge a lot of money for its ministrations on the understanding that they are in some way helpful for human misery, but to escape oversight and regulation on the understanding that psychoanalysis is some kind of sacred mystery. A ‘marginalisation’ of their work would be a fine thing, if you ask me.

The Poetics of History 2

Jan 17th, 2004 5:48 pm | By

My first comment on this subject has prompted some comments that suggest a lot of further comments (I’m in a permanent state of Infinite Regress here: everything I write seems to suggest several hundred more things I could write) and subjects to look into further. Empathy; the relationship of research to teaching; other minds and solipsism; the tendency to value emotional stances like empathy over ‘cooler’ more cognitive commitments to justice or equality; and so on.

And there is also this article in the New Yorker about a book of history and a play, Thucydides’ History and Euripides’ Medea.

To describe this war in all its complexity, Thucydides had to invent a new way of writing history. In his introduction, he says he will eschew “literary charm”—mythodes, a word related to “myth”—in favor of a carefully sifted accuracy…But this desire for what we would call balanced and accurate reporting led, paradoxically, to a most distinctive literary device: the use of invented speeches and dialogues to illustrate the progress of the war and the combatants’ thinking about it. Every key moment in the war…is cast as a dialogue that, as Thucydides admits, may not be a faithful reproduction of exactly what was said in this or that legislative session or diplomatic parley but does elucidate the ideologies at play. “My method has been,” he writes, “to make the speakers say what, in my opinion, was called for by each situation.” This, more than anything, is what gives the History its unique texture: the vivid sense of an immensely complex conflict reflected, agonizingly, in hundreds of smaller conflicts, each one presenting painful choices, all leading to the great and terrible resolution.

Which is very like the way Euripides wrote his plays, to the disapproval of Nietzsche, who claimed that Socrates and Euripides rationalized and so ruined tragedy. But others like the interplay of argument, the effort to think things through, the questioning of each other’s assumptions, that many of Euripides’ plays show us. There is room for empathy – with Medea, Andromache, Iphigenia – and also for enhanced understanding of the issues, at least as one Athenian playwright saw them. And the great skeptical historian was doing much the same thing. Thucydides was a little bit of a dramatist and Euripides was a little bit of a historian.

Graduate School and its Discontents

Jan 16th, 2004 9:04 pm | By

Invisible Adjunct has another good comment thread going. Remember that interesting (and often symptomatic) thread about the MLA a few weeks ago? There have been interesting ones since, and now there’s an especially interesting one. Well I say that because of the last two posts (last at the moment, last when I saw the thread), 10 and 11. Number 10:

In the first year of graduate school in archaeology we spent so much time learning about post-modernist theory and how archaeology could not really tell you about the past (it could only reveal your current political views on power relationships) that by the end of the year my professors convinced me that there was no reason to continue my studies in that field. I dropped out and went to law school.

Number 11:

I also remember a huge emphasis on postmodernism when I was a doctoral student in the college of education. Yes, I enjoyed postmodern theory, but there were never any other perspectives; I had to find those on my own. For example, we never studied education from a Marxist perspective; after all, Marxism had been determined to be too “modernistic.” I guess my big gripe with postmodern theory is that it tends to lead to nihilism and a total lack of social solidarity and responsibility. It really reached the pinnacle of craziness when issues like classroom management were turned into postmodern “points of view.” For example, I remember several of us classroom teachers posing serious questions about what happened in our classrooms. We weren’t looking for “how-to” answers, but something better than the “what is disobedience, anyway?” I’m sorry, but if you were to spend time in an 8th grade classroom, I don’t think you’d have any problem with the concrete reality of negative behavior. What was super-ironic is that whenever we would be looking at politics or power relations and anyone would give down-to-earth examples of how power REALLY operated (i.e. through control of workers, surveillance, etc.) then those became modernist concerns and were open to interpretation, not social action.

Yes, from everything I hear, people in real-life 8th grade classrooms have no trouble saying what disobedience is, and why you need some of the other thing if you’re going to teach 30 or 35 children. And there’s something really enormously…ironic? Or is that too modernist. Perhaps I mean playful? Yes, no doubt that’s it. There’s something enormously ‘playful’ in the fact that postmodernist theory causes people to quit archaeology and go to law school instead. Actually what should be happening is that everyone everywhere should be dropping out of all academic programs – because those are all modernist projects too after all – and going into advertising. What could be more postmodernist than advertising? Especially now, now that everybody knows that everybody knows that everybody ‘sees through’ advertising, and ‘transforms’ it into a ‘site of resistance,’ so that advertising gets weirder and weirder, or more and more postmodern, in order to out-resist and out-transform and out-postmodernize all those people in the postmodern audience. Surely it’s the duty of all good postmodernists to provide more sites of resistance for everyone. And of course the pay is better, and you don’t risk ending up in places like Ithaca or Lubbock, and you don’t have to do all that reading.

The Poetics of History

Jan 15th, 2004 9:14 pm | By

There was an interesting subject under discussion at Cliopatria yesterday and this morning – history as defamiliarization, poetics and history, the difference between history and fiction. The whole subject touches on a lot of difficult, knotty questions – other minds; the reliability or otherwise of testimony, autobiography, narrative – of what people recount about their own experiences; empathy; imagination; the general and the particular, the abstract and the concrete – and so on. Meta-questions.

I wondered about the much-discussed idea that fiction can teach empathy in a way that more earth-bound, or factual, or evidence-tethered fields cannot. That novelists have a special imaginative faculty which enables them to show what it’s like to be Someone Else so compellingly that we learn to be tolerant, sympathetic, forgiving, understanding etc. in the act of reading. Cf. Martha Nussbaum in Poetic Justice for example. It seems plausible, up to a point, but…only up to a point. For one thing there are so many novels that are full of empathy for one character but none for all the others; and there are so many that have empathy for the wrong people and none for their victims (cf. Gone With the Wind); and there are so many mediocre and bad novels, and the aesthetic quality of a novel has little or nothing to do with its level of empathy-inducement.

I think there are a couple of background ideas at work here, that could do with being dragged into the light. One is that all novelists, all fiction-writers have this ability to teach empathy – that there is something about the very act of telling a story that produces character-sympathy, and that character-sympathy translates into sympathy for people in general as opposed to sympathy for one particular character. But anybody can set up as a novelist, including selfish, unreflective, egotistical people. There is no guarantee that telling a story has anything to do with empathy. And then there is a second idea, that what novelists imagine about other minds is somehow reliable. But why should that be true? Especially why should it be true of all novelists? At least, why should it be any more true than it is of the rest of us? We can all imagine what’s going on in other minds – and we can all be entirely wrong. Or not. It may be that particularly brilliant novelists are better at imagining what’s going on in other minds – at guessing the truth – but particularly brilliant novelists are a rare breed, and in any case, nobody knows for sure whether they have it right or not. We think they do, it sounds right, but we don’t know. All it is, after all, is the imagining of one novelist. Lizzy Bennet and Isabel Archer and Julien Sorel may tell us what it’s like to be someone else – or they may not. We simply don’t know.

Brought to You By

Jan 15th, 2004 7:07 pm | By

This is a disgusting item in the Washington Post. It sounds good at first – but then it’s meant to. And at second it doesn’t sound good at all.

The administration proposal, which is open for comment from federal agencies through Friday and could take effect in the next few months, would block the adoption of new federal regulations unless the science being used to justify them passes muster with a centralized peer review process that would be overseen by the White House Office of Management and Budget.

It’s those last seven words that give the game away – along with the word ‘centralized’ perhaps. Peer review is one thing, ‘centralized’ peer review is another, and ‘centralized’ peer review overseen by the White House Office of Management and Budget is quite, quite another. Which peers would those be, exactly? Centralized by whom? And – ‘overseen’ in what sense, using what criteria? One can guess all too easily.

But a number of scientific organizations, citizen advocacy groups and even a cadre of former government regulators see a more sinister motivation: an effort to inject White House politics into the world of science and to use the uncertainty that inevitably surrounds science as an excuse to delay new rules that could cost regulated industries millions of dollars…Under the current system, individual agencies typically invite outside experts to review the accuracy of their science and the scientific information they offer…The proposed change would usurp much of that independence. It lays out specific rules regarding who can sit on peer review panels — rules that, to critics’ dismay, explicitly discourage the participation of academic experts who have received agency grants but offer no equivalent warnings against experts with connections to industry. And it grants the executive branch final say as to whether the peer review process was acceptable.

Perfect. Disinterested academics need not apply, but industry scientists are welcome. And the executive branch, with its dazzling track record of scrupulous impartiality in scientific matters, has the final say.

Certainty No

Jan 14th, 2004 8:10 pm | By

The New York Times has an article by Edward Rothstein on the annual Edge question, which John Brockman poses to a large number of writers, scientists and thinkers (many of them all three at once). This year the question is ‘What’s your law?’

There is some bit of wisdom, some rule of nature, some law-like pattern, either grand or small, that you’ve noticed in the universe that might as well be named after you. Gordon Moore has one; Johannes Kepler and Michael Faraday, too. So does Murphy. Since you are so bright, you probably have at least two you can articulate. Send me two laws based on your empirical work and observations you would not mind having tagged with your name. Stick to science and to those scientific areas where you have expertise. Avoid flippancy.

Very good. But Rothstein says an odd thing in his piece.

But curiously, an aura of modesty, tentativeness and skepticism hovers over the submissions — this from a group not renowned for self-abnegation. This may, perhaps, be an admission that fundamental insights are not now to be had. But it may also be an uncertainty about science itself.

But it’s not curious at all. Far from it. Is the reporter not aware that that’s how science is done? Doesn’t he realize that one of the fundamental characteristics of science, one of its definitions, is that it’s always revisable? Offering a tentative ‘law’ implies the opposite of ‘an uncertainty about science itself’; it conveys the kind of confidence one can have in a form of inquiry that is on principle committed to changing its laws when new evidence turns up.

Amusingly enough, I linked to another article in which Colin Blakemore said exactly that only yesterday.

Science must be given back to ordinary people and the key to that is education. I say that with some trepidation, given the political incorrectness of the phrase ‘public understanding of science’ and the new mantra of dialogue and debate. It doesn’t really matter whether people know that the Earth goes round the Sun. But it does matter if they don’t know what a control experiment is, if they think that science produces absolute certainties, if they see differences of opinion among scientists as an indication that the scientific process is flawed, or if they feel robbed of the right to make ethical judgments.

It does matter if people think that science produces absolute certainties. Apparently the place to start is with journalists.

‘Aims To’

Jan 14th, 2004 2:40 am | By

Here it is again – that endlessly repeated untrue statement about the utility of religion.

People like Dawkins, and the Creationists for that matter, make a mistake about the purposes of science and religion. Science tries to tell us about the physical world and how it works. Religion aims at giving a meaning to the world and to our place in it. Science asks immediate questions. Religion asks ultimate questions. There is no conflict here, except when people mistakenly think that questions from one domain demand answers from the other. Science and religion, evolution and Christianity, need not conflict, but only if each knows its place in human affairs — and stays within these boundaries.

Dawkins does not make a mistake. It’s simply not true to say or imply that religion makes no attempt to tell us about the physical world and how it works. Granted, its statements are much more easily arrived at than those of science, because miracles and the supernatural are no obstacle, whereas scientific inquiry tends to like to avoid that kind of thing. But just because religion talks about a deity that is everywhere and omnipotent and omniscient and benevolent (large problems right there, as everyone knows) – just because it talks about an entity that there’s no evidence for, in other words, an entity that’s easy to imagine but hard to find, doesn’t mean it’s not making any claims about the physical world.

But even worse is the next bit. Religion ‘aims’ at giving a meaning – well what good is that?! We all ‘aim at’ lots of things; so what? That’s evasive language, that’s what that is. The point is, religion claims to do more than just ‘aim at’ giving a meaning, it claims to succeed, and that’s a much more ambitious claim. And then a little farther on – religion asks ultimate questions. Sigh. So what? I can do that too, so can you, so can anyone. Just because we can ask a question doesn’t mean there’s an answer to it! In fact it doesn’t mean it’s not a damn silly question. As a matter of fact that’s another thing Richard Dawkins talked about on the Start the Week I mentioned below.

And then the nonsense about each knowing its place. Well religion doesn’t know its place, so what’s the point of saying that! Religion does try to tell us how the world is, and it’s really not honest to pretend it doesn’t. There’s so much weaselling about religion around these days. Pretending it’s really exactly like poetry or music, it’s really just a feeling about the world, it’s really just hope or aspiration or wonder. If it were, I wouldn’t have a word to say against it, but it’s not, so I do.

It’s odd that this guff came from Michael Ruse. He’s not silly, at least I don’t think so, at least I read a good essay by him once. Perhaps he’s gone squishy since then.

Good Conversation

Jan 13th, 2004 11:28 pm | By

Start the Week is always good (well just about always), but I particularly liked last week’s, which I listened to a day or two ago. Richard Dawkins was on, explaining that (contrary to popular opinion) he’s an anti-Darwinian on moral matters. He thinks we should do our best to be different from what our genes would have us be; that, being the only species that’s capable of deciding to over-ride our genetic predispositions, we should damn well do it. Then there was Tim Hitchcock, saying some fascinating things about a change in sexual practices that happened late in the 17th century and caused a sharp rise in population. Dawkins pointed out that what Hitchcock was describing was in fact a classic example of humans acting in a way their genes would not ‘want’ them to – avoiding penetrative sex in favor of other kinds, thus lowering the birth rate. That’s one of the great things about Start the Week: the way things connect up that don’t seem to.

And then there was a fascinating bit where Dawkins asked Anthony Giddens a detailed question about chaos. He wasn’t sure he understood it properly, and he was unabashed about asking questions about it on national radio. Some people would be too vain to do that, I think. I once read something by Dawkins – I think in Unweaving the Rainbow – about a very famous scientist giving a guest lecture when Dawkins was a student. Someone in the audience pointed out that Famous Scientist was wrong about something – and FS, far from getting huffy, thanked the pointer-out enthusiastically, and (I think – if I remember correctly) said that’s the great thing about science. Everyone applauded like mad. I really love that story. (I may have told it before, but if I have it was months ago, so just pretend you don’t remember.)