Author: Ophelia Benson

  • BHL Has Views That Annoy

    ‘The Palestinian “victimocracy” has a tendency to hide wars that are infinitely longer and more murderous.’

  • A Theosophical Heir to the Throne

    ‘Much of this assault on contemporary rationalism flows from the prince’s rather eclectic spiritualism.’

  • There Are Limits, After All

    Okay, that does it. I’m going to have to put my foot down. (Ooh, scary.) I’m going to have to get all authoritarian and domineering – all prescriptive instead of descriptive. There’s no help for it.

    There was a discussion on Crooked Timber the other day about the odd usage whereby ‘argue that’ means the opposite of what it means. The example that caught Harry’s attention was this one: ‘Though few would argue that children should be protected from exposure to Internet pornography, COPA, the law designed to protect them has been struck down by the U.S. Supreme Court.’ You see the problem? It’s confusing, and stupidly confusing – you realize (from the context) when you get to the end of the sentence that it means the opposite of what you took it to mean when you were in the middle of it. The fool who wrote it meant ‘few would dispute or disagree that’, not ‘few would argue that’. It is stupid and bad, especially with a sentence as long as that, because you have plenty of time to think it means one thing; then, when you realize it means the opposite (because of a silly mistake by the writer), you have to go back and re-understand the sentence. What a waste of time and effort.

    Right. Few sensible people would argue – I mean disagree – that that’s a mistake worth not making, because it causes confusion. Or you would think few sensible people would disagree, but actually some would. Some would say that we can figure out the meaning from the context. Well – yes, sometimes, but at the price of extra effort, which is not normally the goal of using words, so why do it? Isn’t it better to use words to mean what they mean rather than what they don’t mean? Is there some benefit in forcing people to puzzle over the meaning of a word when it ought to be quite straightforward and clear? Rather than guessing at the meaning from context, isn’t it better to know the meaning because the right word was chosen? It seems better to me. Why write or say ‘I’m going to the North Pole tomorrow, I hope it won’t be too hot,’ and expect your readers or hearers to figure out from the context that by ‘North Pole’ you happen to mean Atlanta?

    And besides, it’s not even true that it’s always possible to figure out the meaning from the context, and in the case of this particular idiotic usage, it can be impossible, and the meaning of everything one is saying can be entirely misunderstood. This isn’t a mere nuance or shade of meaning. It’s more like standing up in court and saying ‘Guilty’ when you mean ‘Not guilty’ or saying ‘That’s true’ when you mean ‘That’s false.’ Basic. And I’ll give you an example – the example that made me say ‘That does it’ and decide to do this N&C. The example was mentioned at the bottom of that thread at CT. It’s from Matthew Yglesias.

    It’s one of the fixed-points of the American national security discourse that it would be A Very Bad Thing if Iran had nuclear weapons. And I won’t argue that it would be preferable for them not to go nuclear.

    Since I’ve prepared you with all this ranting, perhaps it won’t seem as bad as it is. But it is just hopeless. The reader has to think too much to figure out what the hell he means. Except I suppose not the reader who has become wholly used to this mistake, but then won’t that reader be confused when ‘argue’ is used to mean ‘argue’ instead of ‘dispute’? No, probably not, because the people who use argue to mean both argue and dispute have only a very hazy grasp of what anything means. Maybe to them ‘argue’ means argue, dispute, laugh, ice cream, hair pin, Ferrari, smoke, armadillo, popcorn, shoe. Whatever. But people like that don’t read B&W. So here’s my foot down. Nobody gets to use ‘argue that’ to mean dispute. Period.

  • Judy, Judy, Judy

    Here we go again. What is it about Judith Butler that makes people come over all delusional? That causes them 1) to exaggerate her fame and celebrity and stardom and name-recognition in an utterly grotesque manner and 2) to overestimate her real as opposed to apparent or fame-related importance, interest, originality, ‘insight’, profundity, and brilliance?

    Well, I suppose one answer is, shall we say, a certain lack of nous. At least on the evidence of this article in Salon that seems to be one answer. [Note: you have to click through a brief advert to read article.] For instance there is the sentence ‘Butler even made headlines in the New York Times when she won an award for “Bad Writing” — writing that was too theoretically obtuse, a trademark of postmodern critique.’ Oh dear. That freshman Comp mistake of conflating ‘abstruse’ and ‘obtuse’ and thus saying ‘obtuse’ when it’s not at all what one means. That’s embarrassing (doesn’t Salon have editors?), especially coming from someone who is in such a frenzy of excitement over Butler and her way with language. Of course it’s also pretty funny. Yes, a trademark of ‘postmodern critique’ is indeed that it is way too ‘theoretically obtuse’ but I bet you didn’t mean to admit it!

    And there’s also the fact that Nussbaum’s famous takedown of Butler was not in the Atlantic Monthly, as the article claims, but in The New Republic. (Were all the editors in the Hamptons that week, or what?) So, who knows, maybe the answer to the question in this case is just that that’s what one does for a sloppy puff piece. But all the same, the level of coercive flattery is remarkably high.

    These were the Culture Wars, and fighting on the front lines were tenured humanities professors from America’s elite universities, proponents of what has come to be known simply as Theory. Armed with the insights of postmodern philosophy, they shocked and awed through their intellectual acrobatics…

    Hmm. Wars, fighting, front lines, armed with – yeah, right. Muy macho. And insights, intellectual acrobatics that shocked and awed. In your dreams. And that sly bit about ‘what has come to be known simply as Theory’ – no it hasn’t ‘come to be known’ as that, the ‘Theorists’ themselves have done their best to force the rest of the world to think of what they do as ‘simply’ ‘Theory’, by calling it that three times in every sentence. And yet still, the only people who think that what English and comp lit teachers do is ‘Theory’ are – wait for it – English and comp lit teachers. And the writers of articles like this one.

    …author of the now classic Gender Trouble: Feminism and the Subversion of Identity, one of the defining works of queer theory…academia’s equivalent of a platinum album…her seminal work…provocative political essays…immense success…there was even a fanzine, Judy!, printed in her honor…fellow academics, who may or may not have envied her popularity…

    And then we move directly to mention of the Nussbaum article. You know, I really, really doubt that Nussbaum has the faintest shred of envy of Butler’s putative popularity. I really strongly doubt that Nussbaum would prefer to have written the books Butler wrote rather than the ones she herself wrote. I know I wouldn’t. I know if I could wave a magic wand and have written either, say, The Fragility of Goodness or Sex and Social Justice, or Butler’s Collected Works, I would not choose the latter. No verily, not even if I could be a ‘superstar’ by doing so, nor would I bother envying Butler her supposed stardom, any more than I envy Britney Spears hers.

    That sounds like just mockery but it isn’t; there’s a real point behind it. There’s something badly wrong with the kind of thinking that mixes up fame with quality – that gets in such a fever of excitement over Butler’s superstardom and popularity that it becomes quite unable to see that her actual books are not particularly good. In fact it’s just another version of the kind of thing we were looking at the other day: of groupthink and social pressure, coercion and majority opinion-mongering. (It’s especially ironic since another basic idea of the article is that ‘Theory’ and its epigones are Outsiders, radicals, embattled martyrs of thought, nonconformists.) It’s such a basic point – popular is not the same thing as good, majority opinion is not the same thing as truth. And the attempt to admire people for being hugely popular and radically nonconformist at the same time is something of a mug’s game, frankly.

  • Sucking Up to Judith Butler

    Superstar, classic, defining work, platinum album, seminal work, provocative, immense success, etc.

  • Open Democracy on Multiculturalism

    Does multiculturalism lead to cultural relativism? What about the universal standards of human rights?

  • The Fahrenheit 9/11 Files

    And now to be serious again. Or maybe not so much serious as slightly less egomaniacal. The discussion of Michael Moore’s new movie rages on. Or not really rages, perhaps, but several people are talking about it. Todd Gitlin, for example, who has some reservations –

    But now a pause for a moment of conscience. Let intellect have its due. Moore cuts plenty of corners, so how good can that be? Compelling? Useful? Moore specializes in hodgepodge. He jokes his way past the rough edges. He’s neither journalist nor documentarian, for he doesn’t set out to discover what he doesn’t already know. To patronize Michael Moore by calling him useful is to give him a pass for shoddy work, sloppy insinuations, emotional blackmail and all-around demagoguery.

    I haven’t seen ‘Fahrenheit 9/11’ so I can’t comment on that in particular – well I can, of course, and I’ve been known to comment noisily on movies I haven’t seen, but I won’t right now, is what I mean. I haven’t seen ‘Bowling for Columbine’ either. But I watched ‘TV Nation’ when it was on, and I’ve seen the earlier movies – so I certainly do know what Gitlin means. But I also know what Gitlin means later on in the article:

    So give Moore a cheer for this…because, in the thick of a rolling political emergency, he’s packing in blue-state crowds and blue-niche-of-red-state crowds and who-knows-what-color-in-purple-state crowds. Fahrenheit 9/11 opened as the highest-grossing nonfiction (some would quarrel with the label, but never mind) film of all time. Its average box office take per theatre beat out – good God – Mel Gibson’s Passion of the Christ.

    Yep. I’m in two minds, I suppose, because I think nonfiction movies ought to be actually nonfiction movies, but on the other hand – the left is so pathetic and hapless and ignored over here, it is very difficult not to rejoice that his movie is packing them in and his books are best-sellers. Very difficult indeed, so difficult that I don’t even try.

    The issue is also being discussed at Crooked Timber and Normblog and Crooked Timber again.

  • Machiavellian Monkeys

    Our brains are huge, particularly if you take into consideration the relative size of our bodies. Generally, the proportion of brain to body is pretty tight among mammals. But the human brain is seven times bigger than what you’d predict from the size of our body. Six million years ago, hominid brains were about a third the size they are today, comparable to a chimp’s. So what accounts for the big boom? It would be flattering ourselves to say that the cause was something we are proud of – our ability to talk, or our gifts with tools. Certainly, our brains show signs of being adapted for these sorts of things (consider the language gene FOXP2). But those adaptations probably were little more than tinkerings with a brain that was already expanding thanks to other factors. And one of those factors may have been tricking our fellow hominids.
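    The ‘seven times bigger than predicted’ figure comes from comparing actual brain mass to what mammalian scaling laws predict. A minimal sketch of that arithmetic is below, using Jerison’s classic encephalization-quotient formula; the round-number human body and brain masses are my own illustrative assumptions, not figures from the text.

```python
# Sketch of the encephalization quotient (EQ) behind the "seven times
# bigger than predicted" claim. The scaling law is Jerison's classic
# mammalian formula: expected brain mass = 0.12 * (body mass)^(2/3),
# in grams. The human body and brain masses are assumed round numbers.

def expected_brain_mass_g(body_mass_g, c=0.12, k=2/3):
    """Predicted brain mass (grams) for a typical mammal of this body mass."""
    return c * body_mass_g ** k

human_body_g = 65_000.0   # ~65 kg adult (assumed)
human_brain_g = 1_350.0   # ~1.35 kg brain (assumed)

eq = human_brain_g / expected_brain_mass_g(human_body_g)
print(f"encephalization quotient: {eq:.1f}")  # roughly 7, matching the text
```

    With these assumed masses the quotient comes out close to 7, which is consistent with the article’s claim; a chimpanzee run through the same formula lands nearer 2.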

    In the 1980s, some primatologists noticed that monkeys and apes – unlike other mammals – sometimes deceived members of their own species, in order to trick them out of food or sneak off for some furtive courtships. The primatologists got to thinking that deception involved some pretty sophisticated brain power. A primate needed to understand something about the mental state of other primates and have the ability to predict how a change in that mental state might change the way other primates behaved.

    The primatologists then considered the fact that humans aren’t the only primates with oversized brains. In fact, monkeys and apes, on average, have brains twice the size you’d predict for mammals of their body size. Chimpanzees and other great apes have particularly big brains, and they seemed to be particularly adept at tricking each other. What’s more, primates don’t simply have magnified brains. Instead, certain regions of the brain have expanded, such as the neocortex, the outer husk of the brain which handles abstract associations. Activity in the neocortex is exactly the sort of thinking necessary for tricking your fellow ape.

    Taking all this into consideration, the primatologists made a pretty gutsy hypothesis: that the challenges of social life – including deception – actually drive the expansion of the primate brain. Sometimes called the Machiavellian Intelligence hypothesis, it has now been put to its most rigorous test so far, and passed quite well. Richard Byrne and Nadia Corp of the University of St. Andrews in Scotland published a study today in the Proceedings of the Royal Society of London. (The link’s not up yet, but here’s a New Scientist piece.) They found that in 18 species from all the major branches of primates, the size of the neocortex predicts how much deception the species practices. Bigger brains mean more trickery. They were able to statistically rule out a number of other factors that might have created a link where none existed. And they were able to show that deception is not just a side-effect of having a big brain or something that opportunistically emerges more often in big groups. Deception is probably just a good indicator of something bigger going on here – something psychologists sometimes call “social intelligence.” Primates don’t just deceive one another; they also cooperate and form alliances and bonds, which they can keep track of for years.
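    The core of the test described above is a cross-species correlation: does relative neocortex size predict how often deception is observed? A toy sketch of that kind of calculation is below. All the species values are invented placeholders, and the published analysis also controlled for body size, group size, and shared ancestry, which this sketch omits.

```python
# Toy sketch of a Byrne & Corp-style test: does relative neocortex
# size predict observed deception rate across primate species?
# The data points below are invented placeholders for illustration;
# the real study used 18 species and phylogenetic controls.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# neocortex ratio (neocortex volume / rest of brain) vs. deception reports
neocortex_ratio = [1.2, 1.7, 2.1, 2.6, 3.0, 3.2]
deception_rate  = [0.5, 1.0, 2.0, 3.5, 5.0, 6.0]

r = pearson(neocortex_ratio, deception_rate)
print(f"correlation: {r:.2f}")  # strongly positive for this made-up data
```

    A positive correlation on its own would not rule out confounds such as overall brain size or group size, which is exactly why the study’s statistical controls matter.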

    While deception isn’t just an opportunistic result of being in big groups, big groups may well be the ultimate source of deception (and by extension big brains). That’s the hypothesis of Robin Dunbar of Liverpool, as he detailed last fall in the Annual Review of Anthropology. Deception and other sorts of social intelligence can give a primate a reproductive edge in many different ways. It can trick its way to getting more food, for example; a female chimp can ward off an infanticidal male from her kids with the help of alliances. Certain factors make this social intelligence more demanding. If primates live under threat of a lot of predators, for example, they may get huddled up into big groups. Bigger groups mean more individuals to keep track of, which means more demands on the brain. Which, in turn, may lead to a bigger brain.

    If that’s true, then the human brain may have begun to emerge as our ancestors huddled in bigger groups. It’s possible, for example, that early hominids living as bipeds in patchy forests became easier targets for leopards and other predators. Brain size increased modestly until about two million years ago. It may not have been able to grow any faster because of the diet of early hominids. They probably dined on nuts, fruits, and the occasional bit of meat, like chimpanzees do today. That may not have been enough fuel to support a really big brain; brain tissue is incredibly hungry, demanding 16 times more energy than muscle, pound for pound. It was only after hominids began making butchering tools out of stones and got a steady supply of meat from carcasses that the brain began to expand. And it was probably around this time (between 2 and 1.5 million years ago) that hominids began evolving the extraordinary powers of deception (and other sorts of social intelligence) that humans have. We don’t just learn how other people act – we develop a powerful instinct about what’s going on in their minds. (I wrote about the neuroscience behind this “mentalizing” last year in an article for Science.)

    So next time you get played, temper your anger with a little evolutionary perspective. You’ve just come face to face with a force at work in our evolution for over 50 million years.

    UPDATE 7/3/04: A skeptical reader doubted some of my statements about the brain and the energy it requires. Those who crave more information should check out Northwestern University anthropologist William Leonard’s article “Food for Thought” in Scientific American.

    Carl Zimmer is the author of several popular science books and writes frequently for magazines including The New York Times Magazine, National Geographic, Science, Newsweek, Discover, Popular Science and Natural History. His latest book is Soul Made Flesh. This article first appeared on his blog The Loom and is republished here by permission.

  • The Hubble’s Last Years?

    NASA has canceled missions to service telescope.

  • High Art v Low is a False Dichotomy

    ‘In America, even the intellectuals are anti-intellectual.’

  • Tupac Shakur not Some Sort of Byron

    John McWhorter says rap teaches ‘recreational outrage.’

  • The ‘No Ectoplasm Clause’

    Massimo Pigliucci on the neurobiology of regret.

  • Arab News Media and the ‘Blood of Martyrs’

    ‘al Qaeda has become mainstream and being part of the movement is “cool” in the eyes of young people.’

  • Moore Could be Better and Still Be Moore

    ‘He could show us that war kills and Bush is appalling, and yet be more scrupulous.’

  • Louis Menand is a Tosser, Publisher Says

    Helps to have a sense of humour, to get point of Eats, Shoots & Leaves.

  • John Sutherland Deplores Soggy Platitudes

    No one reading this blah-ridden document would guess how serious the crisis in arts funding is.

  • Carl Zimmer on a New Hominid Find

    Is human development straight or branching? How important were long legs and big brains?

  • Skull Fuels Homo erectus Debate

    Small skull could show diversity within species, or different species.

  • Sudden Doubling of Known Planet Population

    Hubble telescope has found nearly 100 new planets.

  • Big Al

    That article of Steven Waldman’s has sent me to dear old Alexis de Tocqueville, the darling percipient frog that he is. Because Waldman’s whole schtick in that article is just exactly the kind of thing Tocqueville, and, inspired by him, John Stuart Mill, had in mind. The old majority opinion trick – the old ‘We all think this so you’d better think it too or else, and never mind whether it’s true or not just shut up and think what you’re told.’ I actually don’t think Waldman is really talking about Kerry there, I think that’s just a pretext – a disguise, a mask, a beard for what he really wants to say, which is that Most Americans believe in God and so all of them ought to and they should be subject to non-stop social pressure and accusations of elitism, coastalism, intelligentsia-wannabeism, and any other kind of thought-crime we can think of if they refuse. (Waldman for instance is toying with the idea that people who refuse to believe are racists, because a lot of African-Americans are believers. He doesn’t actually say as much, but the implication is there, like a faint but bad smell.)

    So here is some wisdom from Democracy in America, Volume I chapter 15.

    The authority of a king is physical and controls the actions of men without subduing their will. But the majority possesses a power that is physical and moral at the same time, which acts upon the will as much as upon the actions and represses not only all contest, but all controversy…
    In America the majority raises formidable barriers around the liberty of opinion; within these barriers an author may write what he pleases, but woe to him if he goes beyond them. Not that he is in danger of an auto-da-fe, but he is exposed to continued obloquy and persecution…He yields at length, overcome by the daily effort which he has to make, and subsides into silence, as if he felt remorse for having spoken the truth…
    Monarchs had, so to speak, materialized oppression; the democratic republics of the present day have rendered it as entirely an affair of the mind as the will which it is intended to coerce.

    1835, this was published. Isn’t it interesting how consistent we are?

    Absolute monarchies had dishonored despotism; let us beware lest democratic republics should reinstate it and render it less odious and degrading in the eyes of the many by making it still more onerous to the few…
    …there can be no literary genius without freedom of opinion, and freedom of opinion does not exist in America. The Inquisition has never been able to prevent a vast number of anti-religious books from circulating in Spain. The empire of the majority succeeds much better in the United States, since it actually removes any wish to publish them. Unbelievers are to be met with in America, but there is no public organ of infidelity.

    Unbelievers are to be met with, that’s still true. But do they fill the newspapers and airwaves? Not on the planet I live on, they don’t. Some of our commenters seem to have found a different planet – and I hope they’re right and I’m wrong. I hope any day now the mass media will fill up with articles and comments urging Kerry to honor the separation of church and state. But, once again, I’m going to avoid breath-holding.