Author: Ophelia Benson

  • US v WHO on Sugar and Obesity

    US accused of diluting dietary advice to please sugar lobby.

  • Cot Death Trials Review

    258 cases of parents found guilty of killing a child to be reviewed.

  • Is the Self a Narrative?

    No, there are other ways to think about it, says Galen Strawson.

  • Not all Skepticism is Good Skepticism

    Carl Zimmer on how extinction skeptics get it wrong.

  • Eat Your Sugar

    This sounds familiar, doesn’t it? We’ve read articles like this before? Only a few days ago in fact? The subject does seem to keep coming up. The Bush administration and profit-making entities on the one hand, and scientific advice and knowledge on the other. Bulldozers make better habitats than rivers do; wetlands pollute; academic scientists who receive grants should be kept off federal peer review panels while scientists with ties to profit-making entities should not. Day is night, up is down, black is red. Do we begin to detect a pattern here?

    The President insists fighting fat is a matter for the individual, not the state. But today The Observer reveals how he and fellow senators have received hundreds of thousands of dollars in funding from ‘Big Sugar’. One of his main fundraisers is sugar baron Jose ‘Pepe’ Fanjul, head of Florida Crystals, who has raised at least $100,000 for November’s presidential re-election campaign.

    The individual, not the state. Right. And that means that the state must not interfere, even by so much as issuing reasonable dietary guidelines, with people who make money by selling sugar-added foods. It’s up to the individual to ignore all that advertising, have a little backbone, and just stay thin. Simple.

    The Bush administration, which receives millions in funding from the sugar industry, argues there is little robust evidence to show that drinking sugary drinks or eating too much sugar is a direct cause of obesity. It particularly opposes a recommendation that just 10 per cent of people’s energy intake should come from added sugar. The US has a 25 per cent guideline.

    25% added sugar – sure, that sounds about right. Could be worse. Could be 75% after all.

    Too bad the spinach lobby isn’t as powerful as the sugar one. But I guess there just aren’t the bucks in spinach that there are in the sweet stuff.

  • Science and Profit Collide Again

    WHO criticizes Bush administration for letting sugar lobby block efforts against obesity.

  • Jung’s Concepts Empirical not Speculative?

    Hunger for meaning not always compatible with wish to be a scientist.

  • Grayling on Jung

    The pseudo-scientific psychological theories of Freud and Jung are of little interest now.

  • Hands Off Lacan!

This is quite an amusing piece. Albeit irritating. So much rhetoric, so much slippery use of emotowords, so much vagueness where precision is needed – all to protect the heritage of Freud and Lacan. Why, one has to wonder. What is it about Freud that makes people who, one would think, ought to know better cling so fiercely? I suppose I could postulate some sort of psychoanalytic answer, but would that tell us anything?

“When they speak of ‘professionalising’ people whose business is human misery; when they speak of ‘evaluating’ needs and results; when they try to appoint ‘super-prefects’ of the soul, grand inquisitors of human sadness – it is hard not to agree that psychoanalysis is in the firing line,” Levy said.

That’s a translation, I assume, so perhaps it’s unfair to look too closely at the words – but I’m going to anyway. ‘People whose business is human misery.’ What does he mean ‘business’? People who make money off human misery? Why should they be protected? Or does he mean something along the lines of experts in human misery, people who know a lot about human misery? But one can know a lot about human misery, in some sense – arguably all humans know that – without having the faintest clue how to ‘fix’ it or cure it or do anything about it at all other than hand-wring or watch or write poetry. Even quacks and charlatans can know a lot about human misery.

    Critics say the absence of regulation and a growing demand for therapy of all kinds has led to a proliferation of astrologers, mystics and con-artists – and they are demanding that the public be protected by a system of recognised qualifications. But Lacan, who died in 1981, said that the “analyst’s only authority is his own,” and his followers believe the state has no business interfering in the mysteries of the id and the unconscious. In a faction-ridden climate, many psychoanalysts also see the government’s initiative as an attempt by their arch-enemies the psychiatrists – hospital-based doctors who prescribe drugs for treating mental illness – to marginalise their work.

    The analyst’s only authority is his own – well that’s blunt, at least. That’s a good concise summing-up of what’s wrong with psychoanalysis. It presents no evidence, it is not peer-reviewed, it rules out falsification. It’s a form of hermeneutic, we’re often told, which is all well and good but it also claims to be therapeutic. It wants to have it both ways, in short: to charge a lot of money for its ministrations on the understanding that they are in some way helpful for human misery, but to escape oversight and regulation on the understanding that psychoanalysis is some kind of sacred mystery. A ‘marginalisation’ of their work would be a fine thing, if you ask me.

  • The Poetics of History 2

    My first comment on this subject has prompted some comments that suggest a lot of further comments (I’m in a permanent state of Infinite Regress here: everything I write seems to suggest several hundred more things I could write) and subjects to look into further. Empathy; the relationship of research to teaching; other minds and solipsism; the tendency to value emotional stances like empathy over ‘cooler’ more cognitive commitments to justice or equality; and so on.

    And there is also this article in the New Yorker about a book of history and a play, Thucydides’ History and Euripides’ Medea.

    To describe this war in all its complexity, Thucydides had to invent a new way of writing history. In his introduction, he says he will eschew “literary charm”—mythodes, a word related to “myth”—in favor of a carefully sifted accuracy…But this desire for what we would call balanced and accurate reporting led, paradoxically, to a most distinctive literary device: the use of invented speeches and dialogues to illustrate the progress of the war and the combatants’ thinking about it. Every key moment in the war…is cast as a dialogue that, as Thucydides admits, may not be a faithful reproduction of exactly what was said in this or that legislative session or diplomatic parley but does elucidate the ideologies at play. “My method has been,” he writes, “to make the speakers say what, in my opinion, was called for by each situation.” This, more than anything, is what gives the History its unique texture: the vivid sense of an immensely complex conflict reflected, agonizingly, in hundreds of smaller conflicts, each one presenting painful choices, all leading to the great and terrible resolution.

    Which is very like the way Euripides wrote his plays, to the disapproval of Nietzsche, who claimed that Socrates and Euripides rationalized and so ruined tragedy. But others like the interplay of argument, the effort to think things through, the questioning of each other’s assumptions, that many of Euripides’ plays show us. There is room for empathy – with Medea, Andromache, Iphigenia – and also for enhanced understanding of the issues, at least as one Athenian playwright saw them. And the great skeptical historian was doing much the same thing. Thucydides was a little bit of a dramatist and Euripides was a little bit of a historian.

  • ‘Too Early’ for Women in Afghanistan

    ‘We are opposed to women singing and dancing,’ Supreme Court says.

  • ‘Middle-Earth is the Kingdom of Kitsch’

    Perpetual childhood, emotion on the cheap, sincere sentimentalism.

  • Disgust Quiz

    A companion exercise to Taboo. Good clean fun.

  • Graduate School and its Discontents

Invisible Adjunct has another good comment thread going. Remember that interesting (and often symptomatic) thread about the MLA a few weeks ago? There have been interesting ones since, and now there’s an especially interesting one. Well, I say that because of the last two posts (last at the moment, last when I saw the thread), 10 and 11. Number 10:

In the first year of graduate school in archaeology we spent so much time learning about post-modernist theory and how archaeology could not really tell you about the past (it could only reveal your current political views on power relationships) that by the end of the year my professors convinced me that there was no reason to continue my studies in that field. I dropped out and went to law school.

    Number 11:

I also remember a huge emphasis on postmodernism when I was a doctoral student in the college of education. Yes, I enjoyed postmodern theory, but there were never any other perspectives; I had to find those on my own. For example, we never studied education from a Marxist perspective; after all, Marxism had been determined to be too “modernistic.” I guess my big gripe with postmodern theory is that it tends to lead to nihilism and a total lack of social solidarity and responsibility. It really reached the pinnacle of craziness when issues like classroom management were turned into postmodern “points of view.” For example, I remember several of us classroom teachers posing serious questions about what happened in our classrooms. We weren’t looking for “how-to” answers, but something better than “what is disobedience, anyway?” I’m sorry, but if you were to spend time in an 8th grade classroom, I don’t think you’d have any problem with the concrete reality of negative behavior. What was super-ironic is that whenever we would be looking at politics or power relations and anyone would give down-to-earth examples of how power REALLY operated (i.e. through control of workers, surveillance, etc.) then those became modernist concerns and were open to interpretation, not social action.

    Yes, from everything I hear, people in real-life 8th grade classrooms have no trouble saying what disobedience is, and why you need some of the other thing if you’re going to teach 30 or 35 children. And there’s something really enormously…ironic? Or is that too modernist. Perhaps I mean playful? Yes, no doubt that’s it. There’s something enormously ‘playful’ in the fact that postmodernist theory causes people to quit archaeology and go to law school instead. Actually what should be happening is that everyone everywhere should be dropping out of all academic programs – because those are all modernist projects too after all – and going into advertising. What could be more postmodernist than advertising? Especially now, now that everybody knows that everybody knows that everybody ‘sees through’ advertising, and ‘transforms’ it into a ‘site of resistance,’ so that advertising gets weirder and weirder, or more and more postmodern, in order to out-resist and out-transform and out-postmodernize all those people in the postmodern audience. Surely it’s the duty of all good postmodernists to provide more sites of resistance for everyone. And of course the pay is better, and you don’t risk ending up in places like Ithaca or Lubbock, and you don’t have to do all that reading.

  • Noah’s Flood Made the Grand Canyon

    Book offering Biblical version of geology for sale in National Park shop.

  • Framing and ‘Tax Relief’

    Control the definition and the game is yours.

  • The Poetics of History

    There was an interesting subject under discussion at Cliopatria yesterday and this morning – history as defamiliarization, poetics and history, the difference between history and fiction. The whole subject touches on a lot of difficult, knotty questions – other minds; the reliability or otherwise of testimony, autobiography, narrative – of what people recount about their own experiences; empathy; imagination; the general and the particular, the abstract and the concrete – and so on. Meta-questions.

    I wondered about the much-discussed idea that fiction can teach empathy in a way that more earth-bound, or factual, or evidence-tethered fields cannot. That novelists have a special imaginative faculty which enables them to show what it’s like to be Someone Else so compellingly that we learn to be tolerant, sympathetic, forgiving, understanding etc. in the act of reading. Cf. Martha Nussbaum in Poetic Justice for example. It seems plausible, up to a point, but…only up to a point. For one thing there are so many novels that are full of empathy for one character but none for all the others; and there are so many that have empathy for the wrong people and none for their victims (cf. Gone With the Wind); and there are so many mediocre and bad novels, and the aesthetic quality of a novel has little or nothing to do with its level of empathy-inducement.

I think there are a couple of background ideas at work here that could do with being dragged into the light. One is that all novelists, all fiction-writers, have this ability to teach empathy – that there is something about the very act of telling a story that produces character-sympathy, and that character-sympathy translates into sympathy for people in general as opposed to sympathy for one particular character. But anybody can set up as a novelist, including selfish, unreflective, egotistical people. There is no guarantee that telling a story has anything to do with empathy. And then there is a second idea, that what novelists imagine about other minds is somehow reliable. But why should that be true? Especially, why should it be true of all novelists? At least, why should it be any more true than it is of the rest of us? We can all imagine what’s going on in other minds – and we can all be entirely wrong. Or not. It may be that particularly brilliant novelists are better at imagining what’s going on in other minds – at guessing the truth – but particularly brilliant novelists are a rare breed, and in any case, nobody knows for sure whether they have it right or not. We think they do, it sounds right, but we don’t know. All it is, after all, is the imagining of one novelist. Lizzy Bennet and Isabel Archer and Julien Sorel may tell us what it’s like to be someone else – or they may not. We simply don’t know.

  • Brought to You By

    This is a disgusting item in the Washington Post. It sounds good at first – but then it’s meant to. And at second it doesn’t sound good at all.

    The administration proposal, which is open for comment from federal agencies through Friday and could take effect in the next few months, would block the adoption of new federal regulations unless the science being used to justify them passes muster with a centralized peer review process that would be overseen by the White House Office of Management and Budget.

It’s those last seven words that give the game away – along with the word ‘centralized’ perhaps. Peer review is one thing, ‘centralized’ peer review is another, and ‘centralized’ peer review overseen by the White House Office of Management and Budget is quite, quite another. Which peers would those be, exactly? Centralized by whom? And – ‘overseen’ in what sense, using what criteria? One can guess all too easily.

    But a number of scientific organizations, citizen advocacy groups and even a cadre of former government regulators see a more sinister motivation: an effort to inject White House politics into the world of science and to use the uncertainty that inevitably surrounds science as an excuse to delay new rules that could cost regulated industries millions of dollars…Under the current system, individual agencies typically invite outside experts to review the accuracy of their science and the scientific information they offer…The proposed change would usurp much of that independence. It lays out specific rules regarding who can sit on peer review panels — rules that, to critics’ dismay, explicitly discourage the participation of academic experts who have received agency grants but offer no equivalent warnings against experts with connections to industry. And it grants the executive branch final say as to whether the peer review process was acceptable.

    Perfect. Disinterested academics need not apply, but industry scientists are welcome. And the executive branch, with its dazzling track record of scrupulous impartiality in scientific matters, has the final say.

  • Julian Baggini on Kilroy-Silk

    Remember Mill’s distinction between offense and harm.

  • Peer Reviewed – by Which Peers?

    Bush administration seeks to control who reviews scientific research.