Author: Ophelia Benson

  • More a Meditation Than an Argument

    Richard Sennett’s Respect ‘draws on fields normally guarded by specialists: urbanism, psychology, literature, architecture, the history of ideas’.

  • Unoriginal, and False

    Colin McGinn disagrees with Damasio’s version of the James-Lange theory of emotion.

  • Oh dear, some journalists should only write about the Spice Girls…

    It is well-known that most journalists write mostly nonsense most of the time. Happily, this is normally about things like Posh Spice, the g-spot or Iraq. But Zoe Williams clearly has greater ambitions, for she writes nonsense about sociobiology.

    This article is so bad that it is hard to know where to start when discussing it. Take this claim:

    “There are logical problems with it which it doesn’t take a degree in zoology (even from Oregon) to determine. First, it relies, as so many of these theories do, on the egregious notion that, while women’s fertility is all downhill from the moment they start enjoying The Archers, men suffer no deterioration of sperm quality till they’re one day older than Charlie Chaplin. This is a misconception so basic that it’s incredible to hear scientists still peddling it…”.

Right. But no, it doesn’t. What it relies on is the fact that women have a far greater investment in their genetic material than men do, so it pays them to get it right in a way in which it doesn’t pay men. Crudely, if a man has sex with a bad genetic risk, he can have sex with a good one the next day. If a woman has sex with a bad genetic risk, she too can have sex with a good one the next day, but if she’s already pregnant, it’s too late for her genes.

Also, there’s an irony in Williams’s comment here. The fact that she talks about the quality of older men’s sperm suggests that she accepts the logic of the argument. In other words, the quality of the older man’s sperm only becomes relevant if the argument would work were there no deterioration in quality. But, of course, then it becomes a purely empirical question. Is there enough deterioration in quality that it will prevent the gene for fancying old men from propagating…?

    The rest of the article is just as bad.

    Take this claim:

    “The fact that women have more to gain, biologically, from promiscuity, and men have more to gain from fidelity is very rarely touched upon…”.

    Here’s Helena Cronin on the same matter:

    “Give a man fifty wives, and he can have children galore; but give a woman fifty husbands – no use at all. For men quantity pays, for women quality pays. In the evolution of our species, many men didn’t breed at all whereas most women did; and some men vastly outbred others whereas women had about equal numbers of children. Thus, men’s stakes – their potential gains, potential losses – were immensely higher than women’s. So, generation after generation, down evolutionary time, natural selection favoured men with an appetite for multiple mates and a disposition to strive mightily for them. And, generation after generation, down evolutionary time, natural selection favoured women who chose prudently – for resources, protection, good genes.” [What Philosophers Think, Continuum, 2003].

    So who’s right? Helena Cronin, Co-Director of the Centre for Philosophy of Natural and Social Science at the London School of Economics, or Zoe Williams who is… errr.

  • So Far So Good…Maybe

    But what happens when the results are bad? When students mark teachers down for not being entertaining enough?

  • Born or Made?

    Is gayness in the genes or in the will, and who thinks which, and why.

  • A Bluffer’s Guide to Science Studies and the Sociology of “Knowledge”

    Ever since science became a going concern in the ancient world, people have
    asked: “What is this thing called science?” An early answer was given by Aristotle
    in his Organon, its focus being largely on the logic and methodology
    of scientific reasoning. Even if its substantive claims are no longer central,
    it inaugurated a tradition of philosophical thought about science that has had
    wide acceptance by many scientists and philosophers; in their different ways
    recent philosophers such as Carnap, Popper, Lakatos and the Bayesians are all
    within this tradition. It involves belief in, and the application of, principles
    of logic, methodology and of rationality generally; on the whole such principles
    have been instrumental in leading scientists, if not others, to hold the scientific
    beliefs they do.


    But these days this tradition has fallen out of fashion and has been replaced
    by the burgeoning fields of sociology of science, cultural studies of science,
    constructivism, postmodernism, and the like. Reasons for the change in fashion
    are several, one having to do with the political, economic and social uses of
    science, some of which, rather than enhance our lives, threaten our very existence.
    By blaming these ills on science itself, rather than, say, the uses to which
    it is put by the military, industry, commerce, governments and others, advocates
    of anti-science have placed the philosophical "ideologists" who talk
    of the rationality of science under a cloud of suspicion.


    Another reason has to do with the status claimed by science. In countering
    claims about its rational basis, attempts have been made to de-legitimate and
    "demystify" science. Here the attack on the Aristotelian tradition
    is at its most profound. Science is alleged to be no more legitimate than many
    other non-scientific practices such as those embodied in "local knowledge",
    "ethnoscience" and what the postmodernist Lyotard calls "narratives".
    To claim otherwise is to indulge in philosophical metanarratives towards which
    Lyotard invites us to be incredulous, this being his definition of the postmodern.
    Nor, it is also alleged, can science claim to give us a picture of what the
    world is like, or a picture that is better than that given in non-sciences.
    It is not that science gives us, as many philosophers claim, either an ideal
    model of the world that approximates reality, or a picture that has only truthlikeness.
    Rather science is likened to a discourse that gives no picture at all since
    it fails to represent anything. Here critics of the Aristotelian tradition join
    hands with those who have strongly empiricist and anti-realist or irrealist
    inclinations about science.


    So, what are the causes of our beliefs in matters scientific if the canons
    of rationality are to be abandoned as an unbelievable metanarrative, or if scientific
    beliefs do not represent? The Ancient Greek Presocratic philosopher Xenophanes,
    rather than Aristotle, gives us a clue. Xenophanes was a sceptic who denied
    that knowledge could be obtained by us humans; at best we merely have beliefs,
    the truth or falsity of which will remain largely unknown to us. Our beliefs
    are not a response to reality – but something else. Xenophanes illustrates his
    view in the case of belief about God, but it has wider application. He says
    of the causes of beliefs in the gods, or God: ‘Each group of men paint the shape
    of the gods in a fashion similar to themselves; the Ethiopians draw them dark
    and snub-nosed, the Thracians red-haired and blue-eyed’. And he has similar
    remarks about the gods that cows, horses and lions would draw if they had hands;
    they would, respectively, look just like cows, horses or lions.


    One of the several points being made here is that if the gods are believed
    to be dark and snub-nosed, then the cause of this belief has nothing to do with
    the gods themselves; rather its cause has to do with some feature of ourselves.
    The gods themselves are not causally involved in our representations of them;
    rather it is something about ourselves that leads us to make the representations
    we do. It is as if the gods drop out of the picture as far as the causes of
    our beliefs are concerned; something completely non-god-like plays a causal role
    in the production of belief. In so far as this something else concerns social
    aspects of ourselves, then Xenophanes is the first sociologist of "knowledge"
    or, as we should more correctly say, the first sociologist of belief.


    Subsequent sociologists have extended Xenophanes’ views on the causes of beliefs
    about God to scientific belief itself. A Xenophanes-like account of scientific
    belief has come to be adopted by a wide range of people such as Marx, Mannheim,
    contemporary sociologists of scientific belief, Foucault, Nietzsche, to mention
    a few. Surprisingly, even though they differ markedly over what they claim are
    the specific causes of belief, they all espouse the same general form of explanatory
    theory which is intended to replace explanations that appeal to scientific rationality.
    For many of them, scientific belief is not a rational response to the world;
    scientific "knowledge", as the title of David Bloor’s influential
    book has it, is nothing but social imagery.


    Karl Marx was one of the first to suggest that the sciences, along with
    ideology, forms of consciousness, religious belief and the like, are also determined,
    shaped or caused by the prevailing forces and relations of production. Marx
    proposed a two-tiered view of all social factors in his doctrine of historical
    materialism. Forces and relations of production constituted the economic foundation
    of society while anything that has to do with ideas ("forms of consciousness")
    is to be placed in the superstructure, which depends, in some unspecified way,
    on the foundation. Marx bequeathed to the sociology of belief a problem that
    it has never been able to solve, viz., what exactly the relation of dependence
    is between beliefs or ideas and their alleged foundation.


    In his programmatic pronouncements about historical materialism, Marx does
    not specifically mention science in the superstructure. But some of his other
    comments do indicate that he saw science this way, while yet other comments
    indicate that perhaps science might not fit into such a simple two-tiered model
    after all. Either the model is deficient, or science is simply a third item
    outside the two-tiered model. His followers from Engels onward were not so ambivalent;
    they see science as something "determined" by forces and relations
    of production.


    Put this way, there is an evident confusion between the very content of science,
    such as its laws or theories, and other aspects of science such as the choices
    as to which lines of research to pursue, what programmes to fund, what applications
    of theory might be the most commercially promising, and so on. A case might
    be made for forces and relations of production playing some limited role concerning
    the latter; but they play no role in determining the former, the very content
    of science. It is here that those within the Aristotelian tradition of understanding
    science would claim that methodological principles play an important role (along
    with other factors) as a cause of belief. But this is denied by those who follow
    Marx and Engels in regarding such beliefs, even in the very content of science,
    as arising from the interplay between forces and relations of production.


    As an illustration of not just the claims that Marxists might make, but also
    most sociologists of scientific "knowledge", consider Forman’s account
    of how German physicists in the Weimar period were caused to believe in physical
    acausality. In Xenophanes style, the causes have nothing to do with their purported
    objects, viz., indeterministic laws and happenings in the physical world. Rather
    it is the social milieu of the physicists of the Weimar period with its Spenglerian
    hostility to science and causality that is the cause of their beliefs. Forman’s
    story does not appeal to any forces and relations of production; so it does
    not fit Marx’s model. But as will be seen it supports other sociological stories
    about the cause of belief.


    Such sociological explanations of scientific belief can be used to expose the
    "false consciousness" of the kind of explanations offered by philosophers
    in terms of (belief in) principles of rational methodology, and to debunk them.
    Physicists in Weimar Germany were deluded if they thought that methodological
    principles played a role in bringing about their scientific beliefs. Their beliefs
    are "social imagery" caused by "socio-cultural conditions" (or
    by belief in such conditions). They are not caused by any (belief in) rational
    principles which accompany their theoretical and experimental endeavours. The
    "ideological" pretensions of rationalist philosophers, still working
    within the misleading framework bequeathed to us by the Enlightenment, are now
    exposed, and debunked, by the rival explanations of the sociologists of "knowledge".


    However one might well ask how the sociologists manage to establish their claims
    about the causes of belief if they do not accept some principles of rationality.
    What they need to show, but do not, is that the physicists’ current beliefs
    in physics, and in methodology, are causally impotent in bringing about the
    Weimar physicists’ belief in acausality; what allegedly does all the work is
    their concurrent socio-cultural circumstances, or their beliefs about these. A
    smoke screen is then raised to obscure their failure to employ causal methodology
    at some point.


    Karl Mannheim held that the two important progenitors of the sociology of “knowledge” were Marx and Nietzsche. We will turn to Nietzsche shortly.
    Mannheim himself proposed a sociology of "knowledge" in which, as
    he obscurely puts it, there is an “existential determination of knowledge”.
    Mannheim does not get much further than talk of bare relations of thought or
    knowledge to "historical-social existence". But he does importantly
    suggest that there are some areas of thought or knowledge, such as mathematics
    and science, that are independent of historical-social existence and that evolve
    according to their own “inner dialectic” or “immanent laws”. This liberality
    towards the independence of science is criticised by advocates of the Strong
    Programme in the sociology of scientific "knowledge". They claim that
    Mannheim lost his nerve in failing to extend the programme of the sociology
    of “knowledge” to science and mathematics.


    The most recent incarnation of the basic ideas of Marx and Mannheim can be
    found in the Strong Programme in the sociology of scientific "knowledge".
    Its central causality tenet tells us that all scientific belief or "knowledge"
    is to be causally explained by social, historical or cultural conditions (or
    belief in such conditions, an important ambiguity in many formulations
    of the doctrine often passed over). These operate in conjunction with non-social
    causes which, because of their general occurrence across humanity, cannot be
    used to explain variation in belief; this is something only variable social
    factors can provide. Importantly, there is no mention of (belief in) any norms
    of method in the causality tenet as a possible cause of scientific belief. These
    are ruled out as explainers not only on the basis of the naturalism espoused
    by the Strong Programme, but also on the basis of the all-important symmetry
    tenet which says that the same kind of explanation must apply to all beliefs
    regardless of their truth or falsity, or their rationality or irrationality.
    Since on their view the only viable explanations are those which appeal to socio-historical
    causes, the symmetry tenet then rules out all explanation of scientific belief
    on the basis of normative methodological principles. Such normative explanations
    are said to be an unnatural intrusion upon the causal realm in which only naturalistic
    social (and naturalistic non-social) factors can be causally efficacious in
    bringing about belief. Such restrictions are imposed by the scientism of the
    Strong Programme in which, like any other science, only naturalistic causal
    factors and causal laws are to be admitted.


    While there is some debate among sociologists of scientific “knowledge” as
    to the applicability of all the tenets of the Strong Programme, the symmetry
    tenet remains central. Explanations on the basis of rational principles of method
    are out. If such principles are admitted, then they are only accepted locally
    as what the community endorses; they have no further underlying authority or
    status. At this point advocates of the Strong Programme recruit Wittgenstein’s
    doctrine of rule following to their cause. In fact they adopt the communitarian
    interpretation of rule following in which what the community determines is the
    ultimate Court of Appeal and there is no further fact of the matter concerning
    the correctness of any rule of method. The crucial issue here is whether advocates
    of the Strong Programme deny that there is such a thing as scientific rationality
    expressed in methodological principles; or whether they think there are such
    principles but they are only locally accepted as such, and so are simply more
    grist to the mill of the causality tenet of their programme. If the latter,
    then advocates of the Strong Programme have undercut the authority of methodology
    (they believe it can be given no account) and instead of continuing the traditional
    discussion of scientific rationality they have, in effect, changed the subject
    under debate.


    Much criticism of the Strong Programme centres around the many case studies
    its advocates have given of episodes in the history of science. One example
    already mentioned is that of the physicists in Weimar Germany and the allegedly
    social causes of their beliefs in acausality in physics. As always, the alleged
    causes of belief are all socio-cultural with no role for principles of method.
    But as already noted, principles of causal methodology have to be assumed in
    order to establish any causal link between scientists’ social circumstances
    and their scientific beliefs. So appeal to methods that are not merely locally
    accepted as such cannot be avoided even within the Strong Programme if any case
    studies in support of the central causality tenet are to be established. The
    general verdict of outsiders is that causal methodology has been badly applied
    and no convincing case has been established.


    The power/knowledge doctrine of Michel Foucault bears a striking resemblance
    to the claims of Marx, Mannheim and advocates of the Strong Programme. Where
    he differs is in his claims about power and its alleged efficaciousness in bringing
    about “knowledge”. In turn, “knowledge” itself brings about further power relations;
    and so on in a spiral of successive connections. While Foucault denies that
    “power is knowledge” he never makes fully clear what connection there
    might be between power and "knowledge"; but he always assumes that
    there must be one and never thinks that there might be none. He talks of power
    producing "knowledge", or of there being no "knowledge"
    without power, without further exploring the assumed connection. But as we have
    seen, lack of clarity about connection is endemic in social studies of belief.


    Foucault has many perorations about the nature of power. Since he conceives
    it broadly as the effect that the actions of one person can have on the actions
    of another (either opening them up or closing them down but not completely),
    then power is simply everywhere, as Foucault notices. But this is more a defect
    in his implausibly broad notion of power than a new interesting discovery about
    its ubiquity. Nor does Foucault always talk about “knowledge” as being related
    to power. Power is also linked to a whole host of other items such as truth,
    discourses of truth, or simply discourses. Though many claim that Foucault restricted
    his claims about power to "knowledge" in the human sciences, there is in
    fact evidence that Foucault also intended his doctrine to apply to other sciences
    such as chemistry and mathematics. Whatever the scope of Foucault’s doctrine,
    the boundaries of a plausible sociological investigation of aspects of science
    have been burst and extended in a quite unfounded way into the sociology of
    scientific “knowledge”, that is, the very cognitive content of the sciences
    themselves. Finally Foucault does claim that his own genealogy of power is a
    better explainer of belief than, say, Marxism or Freudian analysis. But since
    he never discusses what he means by "better explanation" in a methodological
    context, or modifies the scope of his "power/knowledge" doctrine,
    this quite late appeal to methodology can have little force.


    Finally, we turn very briefly to Nietzsche. Nietzsche’s doctrine of the “will to
    power” strongly influenced Foucault’s "power/knowledge" doctrine.
    Nietzsche’s doctrine shares the same form as all the other doctrines mentioned.
    But it is broader in that the "will to power" is both a metaphysical
    and psycho-social force at work in all of nature and life, including human life.
    Understood as a primitive psychological drive within people, it is causally
    efficacious in bringing about not only our beliefs in ordinary matters and in
    morality, but also our presuppositional philosophical beliefs in logic, in the
    identity that ordinary objects have and in the existence of ordinary everyday
    objects themselves. Nietzsche’s doctrine is best illustrated in the case he
    makes about the origins of Christian moral values such as altruism, pity and
    the like. He famously claims that they are due to the resentment of those of “slave
    morality” who advocated them while overthrowing the values of “master”
    morality. Here the will to power operates as a psychological drive of resentment
    in people causing them to bring about, and maintain, certain moral beliefs.
    In allegedly uncovering the sordid origins of Christian morality in resentment,
    Nietzsche hoped to debunk it.


    Nietzsche is a master at proposing theories of the origins of our beliefs that
    rival those that are commonly accepted. In this way he hopes to unmask them,
    and then debunk them. The double unmasking/debunking move makes him the darling
    of postmodernism. This double move can be played out not only in the sphere
    of moral belief, but in any sphere of belief, such as our beliefs concerning
    logic, or truth, or our everyday framework of belief about objects. One of Nietzsche’s
    prime targets in this respect is Kant who, like a good modernist, attempted
    to give, as far as is possible, rational grounds for our ordinary beliefs and
    for morality. But one can be a critic of much of Kantian philosophy without
    accepting Nietzsche’s critique and its alternative worldview. What is important
    for our purposes is that the Nietzschean unmasking and debunking moves have
    wide application. Nietzsche extended it to a "genealogical" critique
    of truth and our pursuit of it; and it can be extended to the sciences and the
    methodological principles that many of a rational persuasion believe have been
    instrumental in the growth of scientific knowledge.


    Those of a modernist persuasion hold that principles of rationality and of
    methodology can play an important role in bringing about beliefs in science
    and elsewhere, for some of us some of the time. But this is not so for the writers
    mentioned. Marxists regard all scientific belief as a response to the forces
    and relations of production. Mannheim holds that most belief is a response to
    the conditions of social existence in which we think and believe. Advocates
    of the Strong Programme postulate a strong causal role for social, historical
    and cultural factors in bringing about all belief. Foucault sees power as the
    prime mover of all belief in the sciences. And finally Nietzsche claims that
    the will to power operates even in the sphere of belief.


    All these theorists have an in-house disagreement about what does the causal
    work, be it forces and relations of production, existential conditions, socio-historico-cultural
    factors, power or "will to power". But they all agree that, whatever
    it is, it cannot be anything rational. Rational explanations of belief are mystifications
    that need to be dragged out, unmasked and debunked. There are alternative explanations
    for why we believe what we do (though the grounds for thinking them better
    explanations remain obscure). In some cases they might be right – but not always.
    Each case needs to be determined on its merits. Moreover, one might well ask
    what kind of explanations they offer. Often their model of explanation is
    one that a rationalist could also accept; what they resist are any explanations
    that appeal to rationality or methodology. But their reasons (assuming they
    believe that holding reasons can be efficacious) for the wide scope of their
    claims are quite lame.


    What these theorists might also jib at is that their own belief in the very
    doctrines they proclaim is itself merely an instance of the social, historical,
    cultural or psycho-causal theories of belief they advocate. More often than
    not they advance what they take to be profound truths; but at the same time
    they take themselves to be unmasking and debunking the failed modernist programme,
    part of which is the unmasking and debunking of the very notion of truth they
    employ. Their own views often fall victim to arguments similar to those Plato
    first advanced against the advocates of power, rhetoric and relativism about
    truth that he encountered in his own time from Protagoras to Callicles. The
    first time these considerations were played out they might have been tragedy
    (for Plato’s opponents); but to repeat them now is only farce.

    Robert Nola’s most recent book is Rescuing Reason: A Critique of Anti-Rationalist Views of Science and Knowledge (Boston Studies in the Philosophy of Science, Vol. 230), published by Kluwer. He is a professor of philosophy at the University of Auckland.

  • Life Explained

    ALDaily on Difference Feminism, or anyway on Difference.

  • And He Wasn’t Even Cool

    Sometimes a teacher can change lives.

  • Navel Gazing Not the Answer?

    The other article I had in mind was this one by Lauren Slater in the New York Times last weekend. It’s interesting that both articles express skepticism about the value, especially the curative or therapeutic value, of the talking cure and also of the intervention of therapists after traumatic events. At last! I’ve been rolling my eyes and making sarcastic remarks for years whenever a news story informs us that a plane crashed or a crazed gunman shot up a school/fast-food joint/post office or an earthquake leveled a town, and in the next breath added that ‘counselors are on the way’. As if that helps. As if we can all heave a big sigh of relief because at least professionals will be there to deal with the trauma. As if they don’t in fact often make things worse, hounding people who want to head for a corner and curl up in the fetal position to ‘process the experience’ instead.

    And just as Tavris points out in her article, practitioners have a tendency to ignore research and cling to the ideas they’ve grown attached to. There are studies which show that people who don’t want to talk about their traumas in fact do better than people who do. George Bonanno, a researcher Slater talked to, has this to say:

    In the 1980’s, trauma became an official diagnosis, and people made their careers on it. What followed was a plethora of research on how to heal from trauma by talking it out, by facing it down. These people are not likely to believe in an alternative explanation. People’s intellectual inheritance is deeply dependent upon a certain point of view.

    Yes, and self-absorption is the American way, as Slater wittily points out.

    We believe that the human spirit is at its best when it expresses; the individualism that Tocqueville described in his book “Democracy in America” rests on the right, if not the need, to articulate your unique internal state. Repression, therefore, would be considered anti-American, antediluvian, anti-art and terribly Teutonic.

    Ah well. If enough people speak up, maybe we will eventually catch on that narcissism isn’t all that good for us after all.

  • When Therapy Isn’t

    There have been a couple of interesting articles on therapy in the past two weeks, each taking a fairly skeptical view of the healing powers of the…discipline? field? trade? What is therapy really?

    In this one in the CHE Carol Tavris elucidates the gulf between clinical psychology and therapy on the one hand, and scientific or research psychology on the other, pointing out a number of ironic and/or horrifying facts along the way. For instance there is the fact that in many of the United States it is against the law to call oneself a psychologist unless one ‘has an advanced degree in clinical psychology and a license to practice psychotherapy’ but it is entirely legal to set oneself up as a ‘therapist’ with no training at all. The results are what one would expect, which leads to another fact that shocked me profoundly: that so-called rebirthing therapy is still legal in Colorado. Silly me, I thought it had been outlawed after what happened to Candace Newmaker, who was tortured to death in a ‘rebirthing,’ suffocated in her own vomit despite her cries and pleas which went on for…hours. (All because the poor child, age ten, didn’t bond with her adoptive mother; gee, imagine that, she must be sick, she must have ‘Reactive Attachment Disorder,’ a fanciful ailment which can actually be found in the notoriously ever-expanding Diagnostic and Statistical Manual, don’t get me started on that…) But no, therapists kicked up a fuss, so the dangerous cruel quackery is still allowed.

    The two therapists convicted in Candace’s death are now serving time in prison, but efforts in Colorado to prohibit all forms of “restraint therapy” were defeated by protests from “attachment therapists” in the state and throughout the country. After Candace’s death, one member of the Colorado Mental Health Grievance Board noted with dismay that her hairdresser’s training took 1,500 hours, whereas anyone could take a two-week course and become “certified” in rebirthing. Yet the basic premise — that children can recover from trauma, insecure attachment, or other psychological problems by “reliving” their births or being subjected to punitive and coercive restraints — has no scientific validity whatsoever.

    Tavris’ point is that the therapy industry, being completely divided from psychology as a science, does a bad job of checking its own beliefs, and has a great many of them that are discredited by evidence. Therapists are not trained in skepticism and evidence-testing the way scientists are, they don’t correct for their own confirmation bias the way scientists are supposed to, they don’t worry about falsifiability. Tavris also says, however, that clinicians have a depth of insight that is rooted in their work:

    I agree that therapy often deals with issues on which science is silent: finding courage under adversity, accepting loss, making moral choices. My clinician friends constantly impress me with their deep understanding of the human condition, which is based on seeing the human condition sobbing in their offices many times a week.

It is clear that therapy and clinical psychology, and the people they treat, would benefit enormously from much closer connections with research psychologists. Yet Tavris sees no reason to think that’s going to happen. The split between science and the rest of the world does more harm than we generally notice.

  • The Therapy-Science Gap

    Therapists and clinical psychologists believe things that evidence has shown to be false, and there is danger in that.

  • Outrage Inflation

    No more of those petty little conspiracy theories, now it’s time for the big stuff.

  • Just Don’t Talk About It

    Maybe dwelling on one’s problems isn’t all that curative after all?

  • Kinds of Fundamentalism

    There is more than one kind of fundamentalism, as Terry Eagleton points out in this essay in the Guardian. Fundamentalism is not so much religious as it is textual, which means it covers a lot of ground.

    Fundamentalists are those who believe that our linguistic currency is trustworthy only if it is backed by the gold standard of the Word of Words. They see God as copperfastening human meaning. Fundamentalism means sticking strictly to the script, which in turn means being deeply fearful of the improvised, ambiguous or indeterminate…Since writing is meaning that can be handled by anybody, any time, it is always profane and promiscuous. Meaning that has been written down is bound to be unhygienic…Fundamentalism is the paranoid condition of those who do not see that roughness is not a defect of human existence, but what makes it work.

    That’s a brilliant bit of writing. Mary McCarthy examines a similar kind of attitude (one can’t quite call it thinking) in her Memories of a Catholic Girlhood, when she describes the relatives she lived with who, as it were, considered the folds of the brain to be unhygienic.

    Fundamentalism makes another appearance in this review of a biography of Irving Howe.

Howe was, by all accounts, a rigid and doctrinaire Trotskyist radical in his youth…The key that opened the door for Howe was literature…If part of his early attraction to Trotskyism had been inspired by Leon Trotsky’s own prowess as an intellectual and literary critic, in good dialectical fashion literature’s open-endedness proved to be a subverter of Howe’s Marxist dogmatism. Plunging into the life of literary criticism, Howe developed, as he himself put it, a “taste for complication, which is necessarily a threat to the political mind.”

    The open-ended and complicated, the improvised and ambiguous and indeterminate, the profane and promiscuous, the unhygienic and rough. Not always what’s needed, of course: not in math or engineering, for example. But when it comes to ‘human meaning’ and human politics, it seems a good deal safer than rigidity and paranoia and excessive hygiene.

  • Golden Rice

    Critics of GM are missing the point.

  • Literature Subverts Dogmatism

    Irving Howe’s taste for the complication and open-endedness of literature played hell with his Marxist certainties.

  • Self-fulfilling Prophecy

One of the terms the sociologist Robert Merton, who died last week, was known for was the self-fulfilling prophecy. There’s a lot of that sort of thing about. All the endless assuring each other, for instance, that rationality, secularism, skepticism, atheism are all wrong and mistaken and harmful and stupid because humans have a Deep Need for religion. We have a Longing for ‘spirituality,’ a Hunger for myth, a nostalgia for a Big Daddy to protect us. There is a god-shaped hole at the center of our consciousness and all the silly pointless time-wasting things we do are efforts to fill it. This review of Adam Sutcliffe’s Judaism and Enlightenment, for example, says as much (paraphrasing the argument of the book):

    And the prospect of a world without myth is neither possible nor desirable, Mr. Sutcliffe argues: “We need both reason and myth.” Mr. Sutcliffe thus sees his book as more than a contribution to intellectual history. It is also a philosophical argument, he says, a cautionary tale against what he calls “the seductions of rationalist absolutism.”

But is it true? Or is it just something we’ve been told so many times we’ve come to believe it? Along with other mysterious things we’re told over and over until we believe them. Watching tv in the dark is bad for your eyes, swimming immediately after eating will cause you to drown, eating that piece of cake now will Spoil your Dinner, and you mustn’t be an atheist or you’ll spend the rest of your life seeking to fill that damn god-shaped hole. This putative need for myth we hear so much about. Funny, I’ve always (from childhood) been far more aware of the opposite reaction, a feeling of impatience and exasperation when people try to assure me of the truth of obvious fictions. I resented the whole Santa Claus imposition, and from that I went on to resenting similar kinds of fraud. So what about that need then? What about the need not to be systematically lied to all the time by grownups who ought to know better? What about the truth-shaped hole?

  • Postmodernism and truth

    Here is a story you probably haven’t heard, about how a team of American researchers
    inadvertently introduced a virus into a third world country they were studying.(1)
    They were experts in their field, and they had the best intentions; they thought
    they were helping the people they were studying, but in fact they had never
    really seriously considered whether what they were doing might have ill effects.
    It had not occurred to them that a side-effect of their research might be damaging
    to the fragile ecology of the country they were studying. The virus they introduced
    had some dire effects indeed: it raised infant mortality rates, led to a general
    decline in the health and wellbeing of women and children, and, perhaps worst
    of all, indirectly undermined the only effective political force for democracy
    in the country, strengthening the hand of the traditional despot who ruled the
    nation. These American researchers had something to answer for, surely, but
    when confronted with the devastation they had wrought, their response was frustrating,
    to say the least: they still thought that what they were doing was, all things
    considered, in the interests of the people, and declared that the standards
    by which this so-called devastation was being measured were simply not appropriate.
    Their critics, they contended, were trying to impose “Western” standards in
    a cultural environment that had no use for such standards. In this strange defense
    they were warmly supported by the country’s leaders–not surprisingly–and little
    was heard–not surprisingly–from those who might have been said, by
    Western standards, to have suffered as a result of their activities.


    These researchers were not biologists intent on introducing new strains of
    rice, nor were they agri-business chemists testing new pesticides, or doctors
    trying out vaccines that couldn’t legally be tested in the U.S.A. They were
    postmodernist science critics and other multiculturalists who were arguing,
    in the course of their professional researches on the culture and traditional
    “science” of this country, that Western science was just one among many equally
    valid narratives, not to be “privileged” in its competition with native traditions
    which other researchers–biologists, chemists, doctors and others–were eager
    to supplant. The virus they introduced was not a macromolecule but a meme (a
    replicating idea): the idea that science was a “colonial” imposition, not a
    worthy substitute for the practices and beliefs that had carried the third-world
    country to its current condition. And the reason you have not heard of this
    particular incident is that I made it up, to dramatize the issue and to try
    to unsettle what seems to be current orthodoxy among the literati about
    such matters. But it is inspired by real incidents–that is to say, true reports.
    Events of just this sort have occurred in India and elsewhere, reported, movingly,
    by a number of writers, among them:


Meera Nanda, “The Epistemic Charity of the Social Constructivist Critics of
Science and Why the Third World Should Refuse the Offer,” in N. Koertge, ed.,
A House Built on Sand: Exposing Postmodernist Myths about Science, Oxford
University Press, 1998, pp. 286-311.


Reza Afshari, “An Essay on Islamic Cultural Relativism in the Discourse of
Human Rights,” Human Rights Quarterly, 16, 1994, pp. 235-76.


Susan Okin, “Is Multiculturalism Bad for Women?” Boston Review, October/November
1997, pp. 25-28.


Pervez Hoodbhoy, Islam and Science: Religious Orthodoxy and the Battle for
Rationality, London and New Jersey: Zed Books, 1991.


    My little fable is also inspired by a wonderful remark of E. O. Wilson, in
    Atlantic Monthly a few months ago: “Scientists, being held responsible
    for what they say, have not found postmodernism useful.” Actually, of course,
    we are all held responsible for what we say. The laws of libel and slander,
    for instance, exempt none of us, but most of us–including scientists in many
    or even most fields–do not typically make assertions that, independently of
    libel and slander considerations, might bring harm to others, even indirectly.
    A handy measure of this fact is the evident ridiculousness we discover in the
idea of malpractice insurance for . . . literary critics, philosophers, mathematicians,
historians, cosmologists. What on earth could a mathematician or literary critic
do, in the course of executing her professional duties, that might need the security
blanket of malpractice insurance? She might inadvertently trip a student in
the corridor, or drop a book on somebody’s head, but aside from such
    outré side-effects, our activities are paradigmatically innocuous.
    One would think. But in those fields where the stakes are higher–and more direct–there
    is a longstanding tradition of being especially cautious, and of taking particular
    responsibility for ensuring that no harm results (as explicitly honored in the
    Hippocratic Oath). Engineers, knowing that thousands of people’s safety may
    depend on the bridge they design, engage in focussed exercises with specified
    constraints designed to determine that, according to all current knowledge,
    their designs are safe and sound. Even economists–often derided for the risks
    they take with other people’s livelihoods–when they find themselves
    in positions to endorse specific economic measures considered by government
    bodies or by their private clients, are known to attempt to put a salutary strain
    on their underlying assumptions, just to be safe. They are used to asking themselves,
    and to being expected to ask themselves: “What if I’m wrong?” We others seldom
    ask ourselves this question, since we have spent our student and professional
    lives working on topics that are, according both to tradition and common sense,
    incapable of affecting any lives in ways worth worrying about. If my topic is
    whether or not Vlastos had the best interpretation of Plato’s Parmenides
    or how the wool trade affected imagery in Tudor poetry, or what the best version
    of string theory says about time, or how to recast proofs in topology in some
    new formalism, if I am wrong, dead wrong, in what I say, the only damage I am
    likely to do is to my own scholarly reputation. But when we aspire to have a
    greater impact on the “real” (as opposed to “academic”) world– and many philosophers
    do aspire to this today–we need to adopt the attitudes and habits of these
    more applied disciplines. We need to hold ourselves responsible for what we
    say, recognizing that our words, if believed, can have profound effects for
    good or ill.


    When I was a young untenured professor of philosophy, I once received a visit
    from a colleague from the Comparative Literature Department, an eminent and
    fashionable literary theorist, who wanted some help from me. I was flattered
    to be asked, and did my best to oblige, but the drift of his questions about
    various philosophical topics was strangely perplexing to me. For quite a while
    we were getting nowhere, until finally he managed to make clear to me what he
    had come for. He wanted “an epistemology,” he said. An epistemology.
    Every self-respecting literary theorist had to sport an epistemology that season,
    it seems, and without one he felt naked, so he had come to me for an epistemology
to wear–it was the very next fashion, he was sure, and he wanted the dernier cri
in epistemologies. It didn’t matter to him that it be sound, or defensible,
    or (as one might as well say) true; it just had to be new and different
    and stylish. Accessorize, my good fellow, or be overlooked at the party.


    At that moment I perceived a gulf between us that I had only dimly seen before.
    It struck me at first as simply the gulf between being serious and being frivolous.
    But that initial surge of self-righteousness on my part was, in fact, a naive
    reaction. My sense of outrage, my sense that my time had been wasted by this
    man’s bizarre project, was in its own way as unsophisticated as the reaction
    of the first-time theater-goer who leaps on the stage to protect the heroine
from the villain. “Don’t you understand?” we ask incredulously. “It’s make believe.
It’s art. It isn’t supposed to be taken literally!”
    Put in that context, perhaps this man’s quest was not so disreputable after
    all. I would not have been offended, would I, if a colleague in the Drama Department
    had come by and asked if he could borrow a few yards of my books to put on the
    shelves of the set for his production of Tom Stoppard’s play, Jumpers.
    What if anything would be wrong in outfitting this fellow with a snazzy set
    of outrageous epistemological doctrines with which he could titillate or confound
    his colleagues?


    What would be wrong would be that since this man didn’t acknowledge the gulf,
    didn’t even recognize that it existed, my acquiescence in his shopping spree
    would have contributed to the debasement of a precious commodity, the erosion
    of a valuable distinction. Many people, including both onlookers and participants,
    don’t see this gulf, or actively deny its existence, and therein lies the problem.
    The sad fact is that in some intellectual circles, inhabited by some of our
    more advanced thinkers in the arts and humanities, this attitude passes as a
    sophisticated appreciation of the futility of proof and the relativity of all
    knowledge claims. In fact this opinion, far from being sophisticated, is the
    height of sheltered naiveté, made possible only by flatfooted ignorance
    of the proven methods of scientific truth-seeking and their power. Like many
    another naif, these thinkers, reflecting on the manifest inability of their
    methods of truth-seeking to achieve stable and valuable results, innocently
    generalize from their own cases and conclude that nobody else knows how
    to discover the truth either.


Among those who contribute to this problem, I am sorry to say, is my good
friend Dick Rorty. Richard Rorty and I have been constructively disagreeing
    with each other for over a quarter of a century now. Each of us has taught the
    other a great deal, I believe, in the reciprocal process of chipping away at
    our residual points of disagreement. I can’t name a living philosopher from
    whom I have learned more. Rorty has opened up the horizons of contemporary philosophy,
    shrewdly showing us philosophers many things about how our own projects have
    grown out of the philosophical projects of the distant and recent past, while
    boldly describing and prescribing future paths for us to take. But there is
    one point over which he and I do not agree at all–not yet–and that concerns
    his attempt over the years to show that philosophers’ debates about Truth and
    Reality really do erase the gulf, really do license a slide into some form of
    relativism. In the end, Rorty tells us, it is all just “conversations,” and
    there are only political or historical or aesthetic grounds for taking one role
    or another in an ongoing conversation.


    Rorty has often tried to enlist me in his campaign, declaring that he could
    find in my own work one explosive insight or another that would help him with
    his project of destroying the illusory edifice of objectivity. One of his favorite
    passages is the one with which I ended my book Consciousness Explained
    (1991):


    It’s just a war of metaphors, you say–but metaphors are not “just” metaphors;
    metaphors are the tools of thought. No one can think about consciousness without
    them, so it is important to equip yourself with the best set of tools available.
    Look what we have built with our tools. Could you have imagined it without them?
    [p.455]


    “I wish,” Rorty says, “he had taken one step further, and had added that such
    tools are all that inquiry can ever provide, because inquiry is never ‘pure’
    in the sense of [Bernard] Williams’ ‘project of pure inquiry.’ It is always
    a matter of getting us something we want.” (“Holism, Intrinsicality, Transcendence,”
    in Dahlbom, ed., Dennett and his Critics. 1993.) But I would never take
    that step, for although metaphors are indeed irreplaceable tools of thought,
    they are not the only such tools. Microscopes and mathematics and MRI scanners
    are among the others. Yes, any inquiry is a matter of getting us something we
    want: the truth about something that matters to us, if all goes as it should.


    When philosophers argue about truth, they are arguing about how not to inflate
    the truth about truth into the Truth about Truth, some absolutistic doctrine
    that makes indefensible demands on our systems of thought. It is in this regard
    similar to debates about, say, the reality of time, or the reality of the past.
    There are some deep, sophisticated, worthy philosophical investigations into
    whether, properly speaking, the past is real. Opinion is divided, but you entirely
    misunderstand the point of these disagreements if you suppose that they undercut
    claims such as the following:


    Life first emerged on this planet more than three thousand million years ago.
    The Holocaust happened during World War II.
    Jack Ruby shot and killed Lee Harvey Oswald at 11:21 am, Dallas time, November
    24, 1963.


    These are truths about events that really happened. Their denials are falsehoods.
    No sane philosopher has ever thought otherwise, though in the heat of battle,
    they have sometimes made claims that could be so interpreted.


    Richard Rorty deserves his large and enthralled readership in the arts and
    humanities, and in the “humanistic” social sciences, but when his readers enthusiastically
    interpret him as encouraging their postmodernist skepticism about truth, they
    trundle down paths he himself has refrained from traveling. When I press him
    on these points, he concedes that there is indeed a useful concept of truth
    that survives intact after all the corrosive philosophical objections have been
    duly entered. This serviceable, modest concept of truth, Rorty acknowledges,
    has its uses: when we want to compare two maps of the countryside for reliability,
    for instance, or when the issue is whether the accused did or did not commit
    the crime as charged.


    Even Richard Rorty, then, acknowledges the gap, and the importance of the
    gap, between appearance and reality, between those theatrical exercises that
    may entertain us without pretence of truth-telling, and those that aim for,
    and often hit, the truth. He calls it a “vegetarian” concept of truth. Very
    well, then, let’s all be vegetarians about the truth. Scientists never wanted
    to go the whole hog anyway.


    So now, let’s ask about the sources or foundations of this mild, uncontroversial,
    vegetarian concept of truth.


    Right now, as I speak, billions of organisms on this planet are engaged in
    a game of hide and seek. It is not just a game for them. It is a matter of life
    and death. Getting it right, not making mistakes, has been of paramount
    importance to every living thing on this planet for more than three billion
    years, and so these organisms have evolved thousands of different ways of finding
    out about the world they live in, discriminating friends from foes, meals from
    mates, and ignoring the rest for the most part. It matters to them that they
    not be misinformed about these matters–indeed nothing matters more–but they
    don’t, as a rule, appreciate this. They are the beneficiaries of equipment exquisitely
designed to get what matters right, but when their equipment malfunctions and
    gets matters wrong, they have no resources, as a rule, for noticing this, let
    alone deploring it. They soldier on, unwittingly. The difference between how
    things seem and how things really are is just as fatal a gap for them as it
    can be for us, but they are largely oblivious to it. The recognition
    of the difference between appearance and reality is a human discovery. A few
other species–some primates, some cetaceans, maybe even some birds–show signs
    of appreciating the phenomenon of “false belief”–getting it wrong. They
    exhibit sensitivity to the errors of others, and perhaps even some sensitivity
    to their own errors as errors, but they lack the capacity for the reflection
    required to dwell on this possibility, and so they cannot use this sensitivity
    in the deliberate design of repairs or improvements of their own seeking gear
    or hiding gear. That sort of bridging of the gap between appearance and reality
    is a wrinkle that we human beings alone have mastered.


    We are the species that discovered doubt. Is there enough food laid by for
    winter? Have I miscalculated? Is my mate cheating on me? Should we have moved
    south? Is it safe to enter this cave? Other creatures are often visibly agitated
    by their own uncertainties about just such questions, but because they cannot
    actually ask themselves these questions, they cannot articulate their
    predicaments for themselves or take steps to improve their grip on the truth.
    They are stuck in a world of appearances, making the best they can of how things
    seem and seldom if ever worrying about whether how things seem is how they truly
    are.


    We alone can be wracked with doubt, and we alone have been provoked by that
    epistemic itch to seek a remedy: better truth-seeking methods. Wanting to keep
    better track of our food supplies, our territories, our families, our enemies,
    we discovered the benefits of talking it over with others, asking questions,
    passing on lore. We invented culture. Then we invented measuring, and arithmetic,
    and maps, and writing. These communicative and recording innovations come with
    a built-in ideal: truth. The point of asking questions is to find true
    answers; the point of measuring is to measure accurately; the point of
    making maps is to find your way to your destination. There may be an
    Island of the Colour-blind (allowing Oliver Sacks his usual large dose of poetic
    license), but no Island of the People Who Do Not Recognize Their Own Children.
    The Land of the Liars could exist only in philosophers’ puzzles; there are no
    traditions of False Calendar Systems for mis-recording the passage of time.
    In short, the goal of truth goes without saying, in every human culture.


    We human beings use our communicative skills not just for truth-telling, but
    also for promise-making, threatening, bargaining, story-telling, entertaining,
mystifying, inducing hypnotic trances, and just plain kidding around, but the
prince of these activities is truth-telling, and for this activity we have invented
    ever better tools. Alongside our tools for agriculture, building, warfare, and
    transportation, we have created a technology of truth: science. Try to draw
    a straight line, or a circle, “freehand.” Unless you have considerable artistic
    talent, the result will not be impressive. With a straight edge and a compass,
    on the other hand, you can practically eliminate the sources of human variability
    and get a nice clean, objective result, the same every time.


    Is the line really straight? How straight is it? In response to these questions,
    we develop ever finer tests, and then tests of the accuracy of those tests,
    and so forth, bootstrapping our way to ever greater accuracy and objectivity.
    Scientists are just as vulnerable to wishful thinking, just as likely to be
    tempted by base motives, just as venal and gullible and forgetful as the rest
    of humankind. Scientists don’t consider themselves to be saints; they don’t
    even pretend to be priests (who according to tradition are supposed to do a
    better job than the rest of us at fighting off human temptation and frailty).
    Scientists take themselves to be just as weak and fallible as anybody else,
    but recognizing those very sources of error in themselves and in the groups
    to which they belong, they have devised elaborate systems to tie their own hands,
    forcibly preventing their frailties and prejudices from infecting their results.


    It is not just the implements, the physical tools of the trade, that are designed
    to be resistant to human error. The organization of methods is also under severe
    selection pressure for improved reliability and objectivity. The classic example
    is the double blind experiment, in which, for instance, neither the human subjects
    nor the experimenters themselves are permitted to know which subjects get the
    test drug and which the placebo, so that nobody’s subliminal hankerings and
    hunches can influence the perception of the results. The statistical design
of both individual experiments and suites of experiments is then embedded in
    the larger practice of routine attempts at replication by independent investigators,
    which is further embedded in a tradition–flawed, but recognized–of publication
    of both positive and negative results.


    What inspires faith in arithmetic is the fact that hundreds of scribblers,
    working independently on the same problem, will all arrive at the same answer
    (except for those negligible few whose errors can be found and identified to
    the mutual satisfaction of all). This unrivalled objectivity is also found in
    geometry and the other branches of mathematics, which since antiquity have been
    the very model of certain knowledge set against the world of flux and controversy.
    In Plato’s early dialogue, the Meno, Socrates and the slave boy work
    out together a special case of the Pythagorean theorem. Plato’s example expresses
    the frank recognition of a standard of truth to be aspired to by all truth-seekers,
    a standard that has not only never been seriously challenged, but that has been
    tacitly accepted–indeed heavily relied upon, even in matters of life and death–by
    the most vigorous opponents of science. (Or do you know a church that keeps
    track of its flock, and their donations, without benefit of arithmetic?)


    Yes, but science almost never looks as uncontroversial, as cut-and-dried,
    as arithmetic. Indeed rival scientific factions often engage in propaganda battles
    as ferocious as anything to be found in politics, or even in religious conflict.
    The fury with which the defenders of scientific orthodoxy often defend their
    doctrines against the heretics is probably unmatched in other arenas of human
    rhetorical combat. These competitions for allegiance–and, of course, funding–are
    designed to capture attention, and being well-designed, they typically succeed.
    This has the side effect that the warfare on the cutting edge of any science
    draws attention away from the huge uncontested background, the dull metal heft
    of the axe that gives the cutting edge its power. What goes without saying,
    during these heated disagreements, is an organized, encyclopedic collection
    of agreed-upon, humdrum scientific fact.


    Robert Proctor usefully draws our attention to a distinction between neutrality
    and objectivity.(2) Geologists, he notes, know
    a lot more about oil-bearing shales than about other rocks–for the obvious
economic and political reasons–but they do know objectively about
oil-bearing shales. And much of what they learn about oil-bearing shales can be
    generalized to other, less favored rocks. We want science to be objective; we
    should not want science to be neutral. Biologists know a lot more about the
    fruit-fly, Drosophila, than they do about other insects–not because
    you can get rich off fruit flies, but because you can get knowledge out of fruit
    flies easier than you can get it out of most other species. Biologists also
    know a lot more about mosquitoes than about other insects, and here it is because
    mosquitoes are more harmful to people than other species that might be much
    easier to study. Many are the reasons for concentrating attention in science,
and they all conspire to make the paths of investigation far from neutral;
    they do not, in general, make those paths any less objective. Sometimes, to
    be sure, one bias or another leads to a violation of the canons of scientific
    method. Studying the pattern of a disease in men, for instance, while neglecting
    to gather the data on the same disease in women, is not just not neutral; it
    is bad science, as indefensible in scientific terms as it is in political terms.


    It is true that past scientific orthodoxies have themselves inspired policies
    that hindsight reveals to be seriously flawed. One can sympathize, for instance,
with Ashis Nandy, editor of the passionately anti-scientific anthology Science,
Hegemony and Violence: A Requiem for Modernity, Delhi: Oxford Univ. Press, 1988.
Having lived through Atoms for Peace, and the Green Revolution, to name
    two of the most ballyhooed scientific juggernauts that have seriously disrupted
    third world societies, he sees how “the adaptation in India of decades-old western
    technologies are advertised and purchased as great leaps forward in science,
    even when such adaptations turn entire disciplines or areas of knowledge into
    mere intellectual machines for the adaptation, replication and testing of shop-worn
    western models which have often been given up in the west itself as too dangerous
or as ecologically non-viable.” (p. 8) But we should recognize this as a political
    misuse of science, not as a fundamental flaw in science itself.


    The methods of science aren’t foolproof, but they are indefinitely perfectible.
    Just as important: there is a tradition of criticism that enforces improvement
    whenever and wherever flaws are discovered. The methods of science, like everything
    else under the sun, are themselves objects of scientific scrutiny, as method
    becomes methodology, the analysis of methods. Methodology in turn falls
    under the gaze of epistemology, the investigation of investigation itself–nothing
    is off limits to scientific questioning. The irony is that these fruits of scientific
    reflection, showing us the ineliminable smudges of imperfection, are sometimes
    used by those who are suspicious of science as their grounds for denying it
    a privileged status in the truth-seeking department–as if the institutions
    and practices they see competing with it were no worse off in these regards.
    But where are the examples of religious orthodoxy being simply abandoned in
    the face of irresistible evidence? Again and again in science, yesterday’s heresies
    have become today’s new orthodoxies. No religion exhibits that pattern in its
    history.


    1. Portions of this paper are derived from “Faith in the
    Truth,” my Amnesty Lecture, Oxford, February 17, 1997.
    2. Value-Free Science?, Harvard Univ. Press, 1991.

    This is the final draft of a paper given at the 1998 World Congress of Philosophy. Daniel Dennett’s most recent book, Freedom Evolves, has just been published by Viking Press.

  • Galen Strawson Reviews Daniel Dennett

    Dennett on the evolution of freedom.

  • Part History Part Polemic

    And marred by bad arguments, Simon Wessely says of this book about science and the chemical weapons industry.